Archive for the 'integration' Category

Converting a mailman archive to work with mod_mbox

August 08th, 2017 | Category: apache,guides,integration

Recently I was working with a friend to get mod_mbox up and running with some of the Wikimedia mailing list archives, which are on mailman. These mailing lists don’t immediately work because they’re not in the right format; however, it’s relatively easy to get them into shape for mod_mbox on Debian. Read more
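For the impatient, the bulk of the conversion is a naming exercise. Here’s a minimal sketch in Python, assuming the list’s public pipermail archives are monthly files like 2017-August.txt and that mod_mbox wants YYYYMM.mbox files in the directory it serves (the list name and paths below are placeholders):

from pathlib import Path
import shutil

# Month-name -> number map for pipermail's "YYYY-MonthName.txt" archives.
MONTHS = {m: i for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June", "July",
     "August", "September", "October", "November", "December"], start=1)}

def convert(src_dir, dest_dir):
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for txt in Path(src_dir).glob("*-*.txt"):
        year, month_name = txt.stem.split("-", 1)
        month = MONTHS.get(month_name)
        if month:
            # pipermail's monthly .txt files are already mbox-ish; mod_mbox
            # mostly just wants them named so it can index them per month.
            shutil.copy(txt, dest / f"{year}{month:02d}.mbox")
    # After this, point mod_mbox at dest_dir and run its bundled
    # mod-mbox-util to build the message indexes (check its usage output
    # for the exact flags; I'm not quoting them from memory).

convert("wikitech-l", "/var/www/mbox/wikitech-l")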

No comments

Month in review: January

January was an interesting and varied month for me. Here is a quick look back at what I got up to.
Read more

No comments

Putting Token Login to work

So a few weeks ago I released JAuthTools 1.5.4, which features Token Login. Token Login was created to fill the need for a secure token that you can use for automatic login, for example with newsletters. Today I’m going to show you how to write something simple with Token Login to handle automatic login with tokens in a unique problem case.
Read more
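If you’re wondering what automatic login with tokens looks like in principle, here’s a rough sketch of the general idea in Python (this is not the JAuthTools API; the names and layout here are my own): generate a signed, expiring token tied to a user, put it in the newsletter link, and verify it on the way back in.

import hashlib
import hmac
import time

SECRET = b"some-site-specific-secret"

def make_token(user_id, ttl=86400):
    # Token = user id + expiry time + HMAC signature over both.
    expires = int(time.time()) + ttl
    payload = f"{user_id}:{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def check_token(token):
    user_id, expires, sig = token.rsplit(":", 2)
    expected = hmac.new(SECRET, f"{user_id}:{expires}".encode(),
                        hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected) and int(expires) > time.time():
        return user_id   # token is genuine and still valid: log this user in
    return None

token = make_token("42")      # goes into the newsletter link, e.g. ?token=...
print(check_token(token))     # "42"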

14 comments

Today: 6-Nov-2008: Barcodes and Backups!

November 06th, 2008 | Category: development,integration,joomla,opensource,today

As seems to be the way a few of my mornings are starting, I played a quick, albeit one-sided, game of Dawn of War with my housemate to test that network play was working between my Mac (using CrossOver Games) and his PC (Windows XP). Suffice to say everything worked fine network-wise; my Mac still has some visual issues but I’m not too fazed about that at the moment. So that was an interesting start to the day, and when he returns from work it’ll be an interesting end as well.

Work again is fun as always. We’re working on solving various election problems and getting envelopes out. Our printers came back stating that they didn’t want to do Code 128 for the barcodes because it’s too hard, preferring Code 39 instead. As the day progressed it turned out that we’re going to use Code 128 after all, mostly for space reasons, because we’ve got that much junk going out on the envelopes. This saga has been going on for a few days; I really want to get a few proofs done before we actually get the batch printed, to ensure that everything is good, but we’ll see. Testing has also begun on the system we’re going to be using to track the election, to ensure that it can in fact handle items properly; we had a word of warning from one of the newer guys from the regions that it had issues, and that was with nowhere near the same workload. So we’re hoping we can test the system out with a few VMs emulating multiple data entry users. There have been issues with the system in question in the past (in fact the system is being replaced), so hopefully this won’t be a time when it has issues, otherwise we’re going to have large numbers of problems.
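For anyone curious why Code 128 wins on space: in its numeric mode it packs two digits into each 11-module symbol, whereas Code 39 spends roughly 13 modules per character. A back-of-envelope sketch (assuming a 2:1 wide-to-narrow ratio for Code 39 and digit-only data; these are rough module counts, not a spec):

def code39_modules(n_digits):
    # ~13 modules per character (9 elements plus an inter-character gap),
    # plus the start and stop characters.
    return (n_digits + 2) * 13

def code128c_modules(n_digits):
    symbols = (n_digits + 1) // 2           # two digits per symbol in set C
    return 11 + symbols * 11 + 11 + 13      # start + data + checksum + stop

for n in (8, 12, 20):
    print(n, "digits:", code39_modules(n), "vs", code128c_modules(n), "modules")

For a 12-digit ID that works out to roughly 182 modules versus 101, which is why the space argument won.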

The bulk of my day was spent working on the restore framework for Joomla! 1.6, or more accurately the new JDataLoad system and the JLoaderSql adapter. The data load system, as its name suggests, loads data into the Joomla! database from a data source. In this case I’m looking at SQL files; my sample data is actually from one of the 1.5 sites that I administer at work. It’s relatively small in the grand scheme of things, only a few megabytes and around 9000 queries. So far it’s been sufficient to find a few issues: one was dropping a table right before a task yield, which caused a missing table error from J! (I put in a simple patch for that: if the last query was a drop, go straight on to the next query in the hope that it’s a create), and another was a minor typo which caused some multi-line strings to be processed incorrectly when they landed on a yield boundary. But all in all it’s working well, even importing data faster than MySQL Query Browser was in my test runs (in fairness, it highlights each query as it goes). It’s now committed to trunk, and when I get a chance I’ll write up something about it and put it somewhere.
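For the curious, the idea is roughly this (a Python sketch of the concept, not the actual JDataLoad/JLoaderSql code): stream queries out of a dump, execute them in chunks, and hand control back between chunks, but never pause straight after a DROP.

import sqlite3   # stand-in for the real database driver

def split_queries(sql_text):
    # Naive split on ';' -- the real adapter also has to cope with strings
    # and multi-line values, which is where the yield-boundary bug lived.
    return [q.strip() for q in sql_text.split(";") if q.strip()]

def load(db, sql_text, chunk=100):
    done = 0
    for query in split_queries(sql_text):
        db.execute(query)
        done += 1
        # Never yield straight after a DROP: keep going so the matching
        # CREATE runs in the same slice and J! never sees a missing table.
        if done % chunk == 0 and not query.upper().startswith("DROP"):
            yield done   # let the caller report progress between chunks

db = sqlite3.connect(":memory:")
dump = ("DROP TABLE IF EXISTS jos_test; "
        "CREATE TABLE jos_test (id INTEGER); "
        "INSERT INTO jos_test VALUES (1);")
for progress in load(db, dump, chunk=1):
    print("processed", progress, "queries")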

Extra fun today came from trying to write up business cases for the projects I want to work on in the next year or so, until I have to write them up again (fun, yeah!). Initially my boss (who is great) thought of me and tried to convert the files from Excel into a more open format so that I could get at them on my Mac and Linux box. Somewhere during the conversion, however, the fields got trimmed and data was lost, so I offered to edit the document in Excel directly using our Windows-only document management system, logging in via our Citrix services. The system, OpenText’s DM, isn’t too bad for the most part and does the job well, and today I found no fault with it. Today was the day when Microsoft’s tools decided they wanted to misbehave.

Earlier in the week my boss had emailed me a doclink to the document stored in DM. A doclink is a small text file with the document number in it, which basically triggers the system to load the specified document, something that usually works quite well. However Outlook, due to various configuration changes, decided that it didn’t want to start, even though earlier in the week it was working perfectly fine. After Outlook repeatedly informed me that it wanted to recreate my profile, then that it couldn’t contact my Exchange server, and then offered to let me work offline (which consequently failed due to the lack of a profile), I ended up using Mac OS X’s built-in “Mail” application (yes, the email application is called “Mail”) to get at my email, find the document number and open the file. Usefully enough this goes through the IMAP interface on Exchange, which worked well and doesn’t suffer from some of the other issues the Outlook clients have, such as the address book caching the old Lotus Notes addresses instead of the newer Exchange ones and sending emails to the wrong place. Yay Outlook! Suffice to say I found the file and made the changes that I needed to before accosting our Exchange administrator, who had returned, to see if he could fix the issue, which he did, mind you, after some trial and error.

The last little item I looked into was building a system to version the content from Joomla! back into our document management system. The new document management officer assures me that we can do it and has even proposed a nifty way of importing the data into the system. It looks like one of the products we have, KoFax, will help us by allowing us to generate XML files which specify the documents that we’re creating and their different versions. If we can get this to work it will be really awesome, as it’ll mean that our website is in part integrating back into our document management system without hacking into the database! I’ve still got to build it and work out where we want to target the extension, but suffice to say it’s on my project list for next year.
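To make the idea concrete, here’s a sketch of generating that kind of document/version manifest with Python’s ElementTree. The element and attribute names are purely hypothetical, since I don’t have the real KoFax import schema in front of me:

import xml.etree.ElementTree as ET

def build_manifest(documents):
    # Build a batch element with one Document per article and one Version
    # per exported revision (names here are made up for illustration).
    root = ET.Element("ImportBatch")
    for doc in documents:
        d = ET.SubElement(root, "Document", name=doc["title"])
        for number, path in enumerate(doc["versions"], start=1):
            ET.SubElement(d, "Version", number=str(number), file=path)
    return ET.tostring(root, encoding="unicode")

print(build_manifest([
    {"title": "Home page",
     "versions": ["exports/home-v1.html", "exports/home-v2.html"]},
]))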

No comments

Today, 14-Jan-08: Query languages, LDAP, business intelligence and filesystems

I’m going to start regularly writing daily posts about what I did today and the things I found interesting; I’ll at least try anyway.

First up for today is a personal thing: I completed a rather largish Uni assignment today, which reminded me of all of the pains that come with C++, and to follow that I returned to working on my filesystem in C, which is just more pain. I got a quick response back, and almost full marks (96%), so I’m happy for all of the time I put in to get it done, and it’s probably far more complicated than anything else that will be submitted (it used Boost Signals and a whole heap of other things that I don’t think will be taught in the subject for a long time). But hey, that’s just Uni!

Today I finally managed to get Pentaho, some business intelligence (BI) software, to play nicely with Novell eDirectory’s LDAP interface. I must have missed the option, but Pentaho doesn’t seem to accept anonymous binding to the LDAP server, which means I need to bind as a user. Funnily enough, by default our users have less access than the anonymous account (which is actually a proxy account with full browse permissions). The solution was simple enough: we shunted our dummy Pentaho user into the same group as the anonymous proxy account and everything worked. So I’ve now got Pentaho using LDAP for authentication (yay!) and a MySQL database to get its role/group permissions. Funnily enough, when it’s all said and done, the documentation is pretty close to the mark.
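If you want to see the permission difference for yourself, something like this makes it obvious by comparing what each bind can browse (a Python sketch using the ldap3 library; the host and DNs are placeholders for our eDirectory tree):

from ldap3 import ALL, Connection, Server

server = Server("ldap.example.org", get_info=ALL)

anonymous = Connection(server, auto_bind=True)             # the proxy/anonymous bind
pentaho = Connection(server, user="cn=pentaho,ou=services,o=org",
                     password="secret", auto_bind=True)    # what Pentaho actually uses

for name, conn in (("anonymous", anonymous), ("pentaho", pentaho)):
    conn.search("o=org", "(objectClass=person)", attributes=["cn"])
    print(name, "can see", len(conn.entries), "people")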

But once I had that working I had no way to manage the groups/roles within Pentaho, so I ended up having to write some small PHP to manage them. Luckily I worked on a project a while back that I called “Joomla! Central Management for Users”, which basically connected directly to the MySQL databases of Joomla! installs and altered the users. I had originally built it with a plugin infrastructure in mind so that I could plug other stuff into it later. Starting this morning it only had a ‘connector’ for Joomla! 1.0 via MySQL and one for LDAP; now it has one for the Pentaho security tables as well. This means I can easily copy users from LDAP or Joomla! into Pentaho without too many issues, and the tool already has a debugged user interface. But wait, there’s more!
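The connector idea in miniature looks something like this (my own sketch in Python, not the actual tool; the Pentaho table and column names are assumptions): every backend exposes the same couple of calls, so copying users between systems is just one loop.

from abc import ABC, abstractmethod

class Connector(ABC):
    """Common interface every backend (Joomla!, LDAP, Pentaho) implements."""
    @abstractmethod
    def list_users(self): ...
    @abstractmethod
    def add_user(self, username, email): ...

class PentahoConnector(Connector):
    def __init__(self, db):      # db: a database handle with sqlite3-style execute()
        self.db = db
    def list_users(self):
        return self.db.execute("SELECT username, email FROM users").fetchall()
    def add_user(self, username, email):
        self.db.execute("INSERT INTO users (username, email) VALUES (?, ?)",
                        (username, email))

def copy_users(source, target):
    existing = {username for username, _ in target.list_users()}
    for username, email in source.list_users():
        if username not in existing:
            target.add_user(username, email)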

When I was originally developing the tool I wrote a query language for it. See, SQL is a great language for databases, but it’s a bit hard to apply in situations where you don’t quite need all of that power. So I wrote my own query language. It’s quite simple: it can validate simple attributes and allows for set operations within “Sites” (a site is a container for users and groups). So, for example, say I want to see all of the users who are on our web site but not in our LDAP directory:
existsin “Web Sites” and not existsin “LDAP”

Primitive, sure, but it beats writing a large SQL expression for something simple. I hope to expand on it, but it already does what it needs to do for the time being.
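For the curious, evaluating that kind of expression is just set algebra once you know which users are in which site. A small Python sketch (assuming left-to-right evaluation with no operator precedence, which may well differ from the real thing):

import shlex

def evaluate(query, sites):
    # sites: dict of site name -> set of usernames
    everyone = set().union(*sites.values())
    tokens = shlex.split(query)          # keeps quoted site names together
    result, op, negate = None, None, False
    i = 0
    while i < len(tokens):
        tok = tokens[i].lower()
        if tok in ("and", "or"):
            op = tok
        elif tok == "not":
            negate = True
        elif tok == "existsin":
            i += 1
            current = set(sites.get(tokens[i], set()))
            if negate:
                current = everyone - current
                negate = False
            if result is None:
                result = current
            elif op == "and":
                result &= current
            elif op == "or":
                result |= current
        i += 1
    return result or set()

sites = {"Web Sites": {"alice", "bob", "carol"}, "LDAP": {"alice", "carol"}}
print(evaluate('existsin "Web Sites" and not existsin "LDAP"', sites))  # {'bob'}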

So I’ve covered query languages, LDAP and BI! All I need now is the filesystem news. Today there was a whole heap of fanfare on Slashdot about the ZFS news from Apple; whilst that’s cool and all (especially since I don’t mind Apple’s UI), I personally have my own filesystem that I’ve gotten back into doing some work on. It also happens to be a Uni assignment due on Friday! So I’ll be back to working on that, and hopefully I’ll have it at a nice enough stage that I can do some lightning talks at linux.conf.au!

No comments

A look at Google Apps for Your Domain

October 08th, 2007 | Category: google,integration,technology

With Toowoomba in the middle of an amalgamation with 7 different local government authorities who share our boundaries (or in the case of some, not even that!), life is looking mighty interesting on various fronts. One of these fronts involves the IT issue of merging multiple disparate systems into one single system.

Like any medium sized organisation (Toowoomba City Council currently employs around 900 people) we have a few systems in place to handle things. We’re using Pathway for our LIS data (who lives where and if they have a dog or not style stuff), JD Edwards for financials and assets (e.g. controlling payroll), ESRI’s suite of GIS products (e.g. ArcMap and ArcSDE; working out where things are in the City) and Hummingbird’s Document Management solution to maintain our corporate documents. The challenge is to take these products, and the information stored in seven different organisations who in some places won’t share any applications, and integrate it into one. To make matters worse, some of the organisations are hours away from where we are.

So the problem I’m looking at is how we integrate email, contacts and documents for all of these people. They’re not going to have any of our standard software beyond Office, which is a problem as we use Notes and like to think that one day we’ll move to open source. So Google Apps becomes an option for this transitional period while we work out what we’re going to deploy and how we’re going to deploy it. It needs only a web browser and an internet connection, it’s relatively lightweight compared to other distributed solutions (large file transfers over weak network connections, Citrix deployments) and it’s far more responsive than either of those because technically it runs in an application native to the desktop (the web browser).

Setting it up is an interesting situation, as part of it requires ‘verification’. There are a few options to get verified:

  1. Put an HTML file on some web space (this didn’t work for me)
  2. Set up a new DNS pointer for Google to find (this also didn’t work)
  3. Just set up the DNS the way it needs to be (e.g. pointing things to ghs.google.com)

The last one ended up being the solution for me, even though it isn’t obvious when you first start off that things will work this way. Thankfully you can get things up and running without having to verify that you own the domain; setting up your DNS to point to the right place solves things anyway.
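If you want to confirm that the DNS side is actually in place while you wait for Google to notice, a quick script does the job (Python with the dnspython library; mail.example.com is a placeholder for whatever hostname you pointed at Google):

import dns.resolver   # dnspython >= 2.0

def points_at_google(host):
    try:
        answers = dns.resolver.resolve(host, "CNAME")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return False
    return any(str(rr.target).rstrip(".") == "ghs.google.com" for rr in answers)

print(points_at_google("mail.example.com"))

So what does Google give you?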

  • Mail – Their Google Mail product available on your domain, the main reason a lot of people will be deploying this solution.
  • Calendar – Their calendar solution, integrated with Mail’s address book. Interestingly enough they don’t offer the address book as a separate application, which might be handy for building corporate address books or linking into a website.
  • Pages – Google Pages, perhaps one of the lesser known products in Google’s application stable, is similar to the old Homesite and Geocities products (before they sold out and filled up with ads, and before people realised that doing everything manually was too much effort and they just wanted a template, at which point WordPress did all they needed, or Joomla! did what they wanted better). I’ve used it since the early beta and this, like its brothers, is stand-alone as well.
  • Docs and Spreadsheets – Again, the boon here is the integration with Mail’s address book, which means you can share documents (and document control) with different people within your organisation. As an administrator you can also restrict documents to within the domain or allow users to share them externally, so this isn’t any less secure than other document sharing solutions (though nothing stops users exporting a document to another format and emailing it manually anyway).
  • Chat – The final major application is Google’s XMPP-powered IM solution, which again integrates with Mail’s address book for contact list management. It is available via the standalone web interface, your start page, within the Mail application, or using a dedicated IM client such as Adium on the Mac, Google’s Talk application on Windows, or the Gajim or Pidgin Jabber clients on Linux. These chats can also be logged and are available in the Mail application as well.
  • Start – Like the customised Google home page (iGoogle), this is provided as an option for your domain as well. Again it integrates with the rest of the products like Mail, Calendar, Talk and Docs to make a very functional first page to go to (more functional than most options I’ve seen around the place). Its heavy integration puts it, in a small way, at the functionality level of something like Microsoft’s Sharepoint, however the Google solution is not customisable the way Sharepoint is; out of the box, though, it lets users see more information about their data (such as through the Docs integration).

This was just a review of the standard edition; the premier edition (at $50/user/year) offers a few more interesting features such as optional ads, policy and message recovery, resource scheduling, single sign-on and other user services (including a 10 GB mailbox). As an option to a Microsoft-powered world, some of the tools are better integrated and easier to use (collaboration and versioning are awesome in Google’s Docs product), however the simple problem is that when the network link goes down, so does your entire office productivity. Something to dwell on.

No comments