Archive for October, 2007

The segmentation of the Internet

October 29th, 2007 | Category: thoughts

More and more I see articles about how the internet is becoming disjunct: firewalled and limited by sovereign nation states to prevent the flow of information. Today I saw an article on Slashdot about Russia’s interest in creating its own internet similar to China’s (perhaps complete with firewall as well), and other parts of the world have had similar stories of governments attempting to limit or reduce the availability of information from the internet.

The fact is that it reminds me of something that scares me more than I realise: the Internet in its segmentation is similar to the ‘Internex’ of the Seaquest DSV world, a sci-fi TV show from 1993-1996. The Internex had some crossover between different countries (e.g. sovereign nations were still connected), however each nation was heavily guarded from the next, I assume to protect against cyberterrorism. The cyberterrorism threat in itself reminds me of the claimed Chinese attacks within USA government departments. It is always interesting to see where fiction leads us to reality (or reality becomes like fiction), with so many parts of sci-fi becoming more and more real. But the question is: does the rest of the world predicted in these visions of the future also come with it? Or is the restriction of this information again the start of more borders as governments realise what the Internet is really capable of delivering to its users?

It makes me wonder. Where is the world headed?

No comments

A hard drive

October 18th, 2007 | Category: university

So today I gave a business student a hard drive, a business IT student. He looked at me, he looked at the hard drive and then said “What is this?”

Some days I wonder about Business IT students and what they’re actually learning.

Obviously not what a hard drive looks like.

No comments

A virtual thought for the day

October 15th, 2007 | Category: linux,opensource,technology,virtualization

Last Friday I spent half an hour inside one of our smaller server rooms fixing our development VMWare ESX box. It’s called development because it’s a Dell box and it takes a good five minutes to get past the BIOS and SCSI controller load screens; that was the first and last Dell server I think we bought. What had happened is that the power had failed the day before, and even though we managed to take the machine down, I think the UPS serving that server room failed after the backup generator also failed, which meant the machine didn’t go down cleanly. When it came back up it was unable to find a root partition. I’ve broken enough Linux boxes to recognize the error and realize the solution is really simple: just fix up the fstab, which is all I did, and magically the box started working properly. But this led me to an interesting thought.
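For the curious, the fix amounted to pointing the root entry in /etc/fstab back at a device that actually exists. Something along these lines, with device names that are purely illustrative rather than what our box actually uses:

```
# /etc/fstab - the root entry must name the real root device
/dev/sda2   /       ext3    defaults    1 1
/dev/sda1   /boot   ext3    defaults    1 2
/dev/sda3   swap    swap    defaults    0 0
```

Boot from a rescue disk, fix the line, reboot; that was the whole repair.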

Consider an all-Microsoft shop that had never considered Linux before but wanted to virtualise their products. The best product to do this, from what I can tell, is the VMWare ESX platform, which is really a small Linux hypervisor with their own management tools on top. The aim is that the end user never has to see the Linux back end; they only see the graphical tools for Windows like Virtual Control Centre or the web based interfaces. This leads me to the funny thought that a pure MS shop that had perhaps sworn off Linux might be deploying ESX because it is the best option out there (MS hasn’t brought out their solution yet; that’s waiting on their Longhorn Server product): an entire Microsoft world virtualized under Linux.

It makes you wonder where the world is heading.

No comments

A look at Google Apps for Your Domain

October 08th, 2007 | Category: google,integration,technology

With Toowoomba in the middle of an amalgamation with 7 different local government authorities who share our boundaries (or in the case of some, not even that!), life is looking mighty interesting on various fronts. One of these fronts involves the IT issue of merging multiple disparate systems into one.

Like any medium sized organisation (Toowoomba City Council currently employs around 900 people) we have a few systems in place to handle things. We’re using Pathway for our LIS data (who lives where and if they have a dog or not style stuff), JD Edwards for financials and assets (e.g. controlling payroll), ESRI’s suite of GIS products (e.g. ArcMap and ArcSDE; working out where things are in the City) and Hummingbird’s Document Management solution to maintain our corporate documents. The challenge is to take the information stored in seven different organisations, who in some places won’t share any applications, and integrate it into one. To make matters worse, the organisations are in some cases hours away from where we are.

So the problem I’m looking at is how we integrate email, contacts and documents for all of these people. They’re not going to have any of our standard software beyond Office, which is a problem as we use Notes and like to think that one day we’ll move to open source. So Google Apps becomes an option for this transitional period while we work out what we’re going to deploy and how. It works with just a web browser and an internet connection, it’s relatively lightweight compared to other distributed solutions (large file transfers over weak network connections, Citrix deployments) and it feels far more responsive than either of those because the interface runs in an application native to the desktop (the web browser).

Setting it up is interesting because part of it requires ‘verification’. There are a few options to get verified:

  1. Put a HTML file on some web space (this didn’t work for me)
  2. Set up a new DNS pointer for Google to find (this also didn’t work)
  3. Just set up the DNS the way it needs to be (e.g. pointing things to ghs.google.com)
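For reference, the third option boils down to a handful of records in your zone; a BIND-style sketch for a hypothetical domain (ghs.google.com was the CNAME target Google documented at the time; the subdomain names and the MX host here are just examples):

```
; point service subdomains at Google Apps
mail      IN  CNAME   ghs.google.com.
calendar  IN  CNAME   ghs.google.com.
; route mail through Google's servers
@         IN  MX 10   aspmx.l.google.com.
```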

The last one ended up being the solution for me, even though it isn’t obvious when you first start that things will work this way. Thankfully you can get up and running without verifying that you own the domain; setting your DNS to point to the right place solves things anyway. So what does Google give you?

  • Mail – Their Google Mail product available on your domain, the main reason a lot of people will be deploying this solution.
  • Calendar – Their calendar solution is integrated with the mail address book. Interestingly enough they don’t offer the address book as a standalone application, which might be handy for building corporate address books or linking into a website.
  • Pages – Google Pages is perhaps one of the lesser known products in Google’s application stable; it’s similar to the old Homesite and Geocities products of old (before they sold out and filled up with ads, and people realised that doing everything manually was too much effort when all they wanted was a template, which WordPress or Joomla! did better). I’ve used it since the early beta and this, like its siblings, is standalone as well.
  • Docs and Spreadsheets – Again, the boon here is the integration with Mail’s address book application which means that you have the ability to share documents (and document control) with different people within your organisation. As an administrator you can also restrict documents to within the domain or allow users to share it externally, so this doesn’t make it less secure than other solutions for document sharing (still doesn’t stop users exporting it to another format and emailing it manually anyway).
  • Chat – The final major application is Google’s XMPP powered IM solution, which again integrates with Mail’s address book for contact list management. It is available via the standalone web interface, your start page, within the Mail application or through a dedicated IM client such as Adium on the Mac, Google’s Talk application on Windows, or the Gajim or Pidgin Jabber clients on Linux. These chats can also be logged and are available in the Mail application as well.
  • Start – Like the customised Google home page (iGoogle), this is provided as an option for your domain as well. Again it integrates with the rest of the products like Mail, Calendar, Talk and Docs to allow for a very functional first page to go to (more functional than most options I’ve seen around the place). Its heavy integration puts it, in a small way, at the functionality level of something like Microsoft’s Sharepoint, though the Google solution is not customisable like Sharepoint is; out of the box, however, it lets users see more of their data (such as through the Docs integration).

This was just a review of the standard edition; the premier edition (at $50/user/year) offers a few more interesting features such as optional ads, policy and message recovery, resource scheduling, single sign on and other user services (including a 10 GB mail box). As an option to a Microsoft powered world, some of the tools are better integrated and easier to use (collaboration and versioning are awesome in Google’s Docs product), however the simple problem is that when the network link goes down, so does your entire office productivity.

Something to dwell on.

No comments

Playing with GeoServer, Google Earth and ArcSDE

October 07th, 2007 | Category: gis,google,opensource,technology

The other day I sat down for half a day and did some research into GeoServer (http://geoserver.org/) and connecting it to our corporate GIS data store, which is an ArcSDE system. It took a surprisingly small amount of work to get GeoServer up and running and producing some simple results out of the ArcSDE system.

The goal of the exercise was to see if I could get access to the mapping information in a system that wasn’t from ESRI, the makers of ArcSDE and ArcMap. The aim isn’t to replace these products for those who use them already or might in the future, but to provide a smaller end application of the GIS data presently stored in Council’s system. For this I picked Google Earth, Google’s 3D atlas application.

The first set of sample data I used was ‘Queensland Towns’, which gave me a general view of the state at a large scale so I could roughly validate that I hadn’t made too bad a mistake. Thankfully the towns came up roughly where they were supposed to be, though I had to check with the GIS department when Google’s data didn’t quite line up with the points – it turns out their information is wrong! From here I moved onto something more fine grained: our roads data. For each road in the city we have data for things like where it starts and finishes, the speed limit along it and what sort of road it is (e.g. a major road or just a small suburban street). This put a bit of load on my machine as it generated the points for the data; the record set is far more complex (roads are split into segments to allow multiple speed limits on a single road). However, even though it took a while (and it was best to be zoomed in as close as possible to the features to minimize the amount that had to be retrieved), the road data lined up with Google’s ortho imagery almost perfectly.
The interface to retrieve information about the points of a feature wasn’t the friendliest, but I’m sure with a bit of work that could be fixed up to make it more useful. On the whole, setting everything up and getting some results took a few hours of work (it took longer to find the ArcSDE SDK and get it installed properly than anything else) and we’ve got an accessible, open result for transferring information.
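For anyone curious what the plumbing looks like: Google Earth just needs a URL that returns KML, and GeoServer’s WMS can serve that via its KML output format. Here’s a sketch in Python of building such a request; the host, workspace and layer names are placeholders, not our actual setup:

```python
from urllib.parse import urlencode

# Hypothetical GeoServer endpoint and layer name, for illustration only.
base = "http://gis.example.com/geoserver/wms"
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "topp:qld_towns",          # placeholder layer
    "bbox": "137.9,-29.2,153.6,-9.1",    # rough Queensland extent
    "width": 800,
    "height": 600,
    "srs": "EPSG:4326",
    # Ask for KML so Google Earth can consume the result directly.
    "format": "application/vnd.google-earth.kml+xml",
}
url = base + "?" + urlencode(params)
print(url)
```

Loading a URL like this as a network link in Google Earth is all it takes to overlay the layer on their imagery.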

4 comments

64-bit Hell and Eclipse

October 06th, 2007 | Category: 64bit,development,linux,mac,macosx,opensource,technology,windows

For many years now I’ve had an AMD box that was capable of running 64-bit. I wouldn’t say I’m an early adopter, it just happened to be capable of 64-bit and it didn’t bother me if it was a feature or not. At the time I tried out the 64-bit builds of Linux and Windows, found Windows woefully equipped to handle 64-bit and Linux a bit better (having all of the source code to recompile and fix things on a new word size does help things).

Fast forward to today and I have (again) an AMD Athlon X2 64-bit box on my desk and I’m running SLED 10 64-bit. To be honest I’m doing better than Helpdesk, who have a similar test box and have been trying to get 64-bit Windows XP up and running on it. They’re still hunting for drivers and keep complaining they have to go halfway across the internet to get things. I’ve only downloaded one driver, for the ATI graphics card, mostly to get dual head mode working. So I’m up and running and not really noticing any issues with applications.

Everything I’ve thrown at this box has been handled perfectly, until I decided to upgrade Eclipse. Eclipse is a strange beast and the build I have is a 32-bit build. It worked fine by default, however the Java version on my desktop is rather ancient (1.4.2, thank you SuSE), which meant some things didn’t want to work properly. I tried to upgrade to the IBM provided 1.5 release, which wanted to be 64-bit. Which is fine, until you realize that the Eclipse build has a 32-bit SWT support layer. Try again! So I ended up downloading the 32-bit Linux Java off the Sun website and installing it. That got me up and running with 1.6, and Eclipse started and almost got me to where I wanted to be. Then Eclipse hung itself. Eclipse does this from time to time, so I just let it sit there and do whatever it does, and it came good. I have a feeling it’s trying to reach the internet or some other network resource which is taking its sweet time to respond, or being blocked by a firewall somewhere.

So this brings to light an issue with any system that indulges in dynamic linking. One of the issues here was Eclipse’s SWT library being 32-bit (there are 64-bit builds, so that is fixable, though I know not how) while at one point using a 64-bit build of Java. Funnily enough this isn’t as big an issue on my platform of choice, Mac OS X. As I pointed out in a Slashdot comment, Apple has done a great job of shifting architectures for their operating system, let alone the 32-bit/64-bit transition. They’ve had to move from their original Motorola m68k powered machines to PowerPC based machines and now from PowerPC on to Intel, and they’ve used emulation both times (m68k to PPC, then PPC to Intel) to make the transition lighter, utilizing “Universal Binaries” similar to the “fat binaries” they used previously. The only other element of note is the “Classic” interface easing the transition from the nanokernel that powered Mac OS 9 and earlier to OS X’s new XNU kernel; the system is in effect emulating a Classic machine, though it isn’t complete. Most notably, Apple announced the toolchain to make the PPC to Intel switch possible ahead of time and integrated it directly into their primary developer tool, Xcode.

Perhaps this is why Apple’s transitions are so much smoother than those of either Microsoft or Linux.
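The 32-bit versus 64-bit confusion is easy enough to diagnose yourself. Here’s a little Python sketch (nothing Eclipse ships, just an illustration) that reads the EI_CLASS byte of an ELF header, which is how tools like `file` tell a 32-bit binary from a 64-bit one:

```python
# Report whether an ELF binary (a JVM, or Eclipse's SWT .so, say)
# is 32-bit or 64-bit by inspecting byte 4 (EI_CLASS) of its header.
def elf_class(path):
    with open(path, "rb") as f:
        header = f.read(5)
    if header[:4] != b"\x7fELF":
        return "not an ELF binary"
    # EI_CLASS: 1 = ELFCLASS32, 2 = ELFCLASS64
    return {1: "32-bit", 2: "64-bit"}.get(header[4], "unknown")
```

If I recall correctly, Eclipse can also be pointed at a specific JVM with its -vm launcher argument, which is how you’d keep a 32-bit SWT paired with a 32-bit Java.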

No comments

Fun with Subversion and Apache

October 03rd, 2007 | Category: apache,linux,opensource,subversion,web

I’ve been having some issues with getting my Subversion repository to play nicely. For some reason it started to randomly attempt to point to a repository called “error”, which puzzled me a bit. A mailing list entry set me up with a solution (http://svn.haxx.se/users/archive-2004-06/1312.shtml), and looking around a bit further it appears a few other people have had the issue as well. The basic problem is that somewhere the system is throwing an error, apparently a 500 Internal Server Error, though that doesn’t seem to be the base cause; I suspect it is actually being generated internally for some reason and bounced around. The reason it bounces here is that an ErrorDocument directive somewhere causes Apache to look for an error page, and since by default our errors live at /error, and “SVNParentPath” is set with its Location at the root (e.g. “”), Subversion tries to find a repository called “error” and we end up getting a strange error from it. The vexing thing is that checkouts seem to have no issue and updates are fine, but adding new files has problems.

So the solution from the above was to add a redirect to another virtual host on the server:

RedirectMatch permanent ^/error http://pasamio.com/error

Another solution is to rewrite the ErrorDocument directives to make them return strings instead of pages:

ErrorDocument 400 "400 HTTP BAD REQUEST"
ErrorDocument 401 "401 HTTP UNAUTHORIZED"
ErrorDocument 403 "403 HTTP FORBIDDEN"
ErrorDocument 404 "404 HTTP NOT FOUND"
ErrorDocument 405 "405 HTTP METHOD NOT ALLOWED"
ErrorDocument 408 "408 HTTP REQUEST TIME OUT"
ErrorDocument 410 "410 HTTP GONE"
ErrorDocument 411 "411 HTTP LENGTH REQUIRED"
ErrorDocument 412 "412 HTTP PRECONDITION FAILED"
ErrorDocument 413 "413 HTTP REQUEST ENTITY TOO LARGE"
ErrorDocument 414 "414 HTTP REQUEST URI TOO LARGE"
ErrorDocument 415 "415 HTTP UNSUPPORTED MEDIA TYPE"
ErrorDocument 500 "500 HTTP INTERNAL SERVER ERROR"
ErrorDocument 501 "501 HTTP NOT IMPLEMENTED"
ErrorDocument 502 "502 HTTP BAD GATEWAY"
ErrorDocument 503 "503 HTTP SERVICE UNAVAILABLE"
ErrorDocument 506 "506 HTTP VARIANT ALSO VARIES" 

This also fixes the problem, though in a few more lines. It is perhaps the better solution, as it wipes out the old ErrorDocument directives that were giving us trouble and returns a straight string. The simplest solution of all is of course to move the Subversion “SVNParentPath” from the root to its own path underneath the root, which is what most people seem to go for.
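For completeness, that last option is just a matter of scoping the Location block away from the root; a sketch with illustrative paths and auth details rather than my actual configuration:

```
# Serve repositories under /svn so a request for /error never
# reaches mod_dav_svn
<Location /svn>
    DAV svn
    SVNParentPath /var/svn/repos
    AuthType Basic
    AuthName "Subversion repositories"
    AuthUserFile /etc/apache2/svn.passwd
    Require valid-user
</Location>
```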

No comments

Two shocks for the day

October 03rd, 2007 | Category: today

I’m sitting in my Oracle prac at the moment and I’ve had two shocks: Vista actually fixed a problem with XP, and my Business faculty based lecturer is heavily bagging out WebCT and hoping for Moodle to come in. I’m not quite sure where all of this support for Moodle came from in the University, but it is exciting. The ITS people have been suspicious of the software compared to WebCT, and one of the reasons WebCT is being scrapped is that it costs money and the Uni is running out of money. Interesting times ahead; shame I probably won’t see it come to fruition.

No comments

Building a metadata filesystem

October 03rd, 2007 | Category: filesystems,mdfs,research

As part of my final year of studies at university I’m working on building a metadata filesystem, called “MDFS” or “MetaData FileSystem”. Creative naming, don’t you think? The project uses FUSE to create the filesystem interface and PostgreSQL to store the data in the back end. It’s in its early phases, but you can check it out at the Trac instance I’ve created: http://mdfs.pasamio.com/.
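To give a feel for the idea, here’s a toy sketch of metadata-driven lookup, where a query replaces directory traversal. The tables and names are invented for illustration (MDFS’s real schema lives in PostgreSQL; SQLite stands in here for brevity):

```python
import sqlite3

# Toy schema: files plus arbitrary key/value metadata (invented names).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE files (id INTEGER PRIMARY KEY, name TEXT, size INTEGER);
    CREATE TABLE metadata (
        file_id INTEGER REFERENCES files(id),
        key TEXT,
        value TEXT
    );
""")
conn.execute("INSERT INTO files VALUES (1, 'thesis.pdf', 4096)")
conn.execute("INSERT INTO metadata VALUES (1, 'author', 'pasamio')")

# Instead of walking directories, ask for files by attribute.
rows = conn.execute("""
    SELECT f.name
    FROM files f JOIN metadata m ON m.file_id = f.id
    WHERE m.key = 'author' AND m.value = 'pasamio'
""").fetchall()
print(rows)
```

The FUSE layer’s job is then to present the results of queries like this as ordinary directories and files.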

No comments