Saturday, 29 December 2007
Having got the Courier IMAP server up and running, and with all my mail migrated onto the dedicated mail server zone, I decided to put a webmail interface on the front.
Having previously played with a few - Squirrelmail, Horde and Roundcube - I concluded that Roundcube was the one I wanted to use. The interface is clean and simple.
I used Blastwave to install the required components - Apache, PHP and MySQL - edited httpd.conf to include PHP support, configured the database and we were off.
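For reference, wiring PHP into httpd.conf only takes a couple of directives along these lines (the module path shown is illustrative for a Blastwave /opt/csw layout rather than the exact line from my file):
LoadModule php5_module /opt/csw/apache2/libexec/libphp5.so
AddType application/x-httpd-php .php
DirectoryIndex index.php index.html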
The final step was to configure sendmail so that I could use Roundcube to send emails as well. This proved more difficult than it should have been, as sendmail was failing to resolve post.demon.co.uk (my upstream mail relay). This was fixed by specifying Demon's DNS servers in /etc/resolv.conf.
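For illustration, the zone's /etc/resolv.conf ends up looking like this (the addresses here are placeholders - substitute your ISP's actual resolvers):
nameserver 192.0.2.1
nameserver 192.0.2.2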
So now I have access to my email via Thunderbird (on Linux), Mail.app (on the Mac) and Roundcube (web browser). At some point I may investigate a command-line MUA for when I SSH into the network, but this will do for now.
Friday, 28 December 2007
Mail.app and Courier IMAP
For a long while, I've been running IMAP on my Linux machine. This provides the ability to use different front ends to get to my mail. In order to move this to my server machine, which is on 24x7, I created a dedicated Solaris Zone ("mailserver") into which I've then installed Courier IMAP. A quick test with Thunderbird on my Linux install proved this was working.
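For the record, creating the zone was the usual zonecfg/zoneadm routine. A minimal sketch - the zone path, interface and address below are illustrative, not my exact values:
zonecfg -z mailserver
zonecfg:mailserver> create
zonecfg:mailserver> set zonepath=/zones/mailserver
zonecfg:mailserver> add net
zonecfg:mailserver:net> set physical=bge0
zonecfg:mailserver:net> set address=192.168.1.10/24
zonecfg:mailserver:net> end
zonecfg:mailserver> commit
zonecfg:mailserver> exit
zoneadm -z mailserver install
zoneadm -z mailserver boot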
To configure for the Mac:
Mail.app had a problem with the sorting of mail folders. The folders below Inbox were shown separately under a "Mailserver" section and not as subfolders of the Inbox. The solution was to edit the account and clear the IMAP Path Prefix, replacing "INBOX" with nothing. This is contrary to what I've read on a number of other blogs, but works here. YMMV.
The final thing to do is to select each of the IMAP folders for Junk, Sent and Trash and use the Mailbox > Use This Mailbox For menu to designate how Mail.app should handle these types of mail.
Which now means I can access my mailstore from the Mac.
Box.net WebDAV goodness
I've had a free 1GB box.net account for a while, but haven't used it in anger beyond uploading a bunch of my how-to documents as a test.
Today I tried adding a WebDAV connection to my GNOME installation - a trivial act of specifying the correct URL, https://www.box.net/dav and then entering my box.net credentials in Nautilus. This worked pretty well so I repeated it on the Mac. This was also successful.
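For reference, there are a couple of equivalent ways in: davs:// is the scheme GNOME uses for WebDAV over HTTPS in the Nautilus location bar, and cadaver is a handy command-line client if you have it installed:
davs://www.box.net/dav
cadaver https://www.box.net/dav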
So I now have easy access - WebDAV but also the web interface - to a bunch of storage online from all my platforms. Should be good for storing those (non-private) things that I might want when I'm out and about.
Thursday, 27 December 2007
Cloud wishlist for 2008
Looking ahead, here are my Cloud-oriented wishes for 2008:
1) Gmail IMAP to come to the UK. It's still not available here!
2) Yahoo! IMAP to come online. I'd willingly pay for the Mail Plus option if it got rid of adverts and provided decent IMAP support.
3) An easy way to back up Google Docs documents. At the moment I'm reluctant to commit 100% to Google Docs as there is no easy way to back up everything. A WebDAV interface would be good, as would integration with OpenOffice (so it works a bit like SharePoint).
4) A mobile phone platform that can handle Outlook 2007 categories well. Maybe Android, maybe the next generation iPhone, perhaps even something from Palm...?
I'm sure I'll add more as I think about it...
Monday, 17 December 2007
Microsoft "innovate" again
Despite Microsoft referring to themselves as an innovative company, the view of many of us in the industry is that they rarely come up with anything new. The adage that if you want to see Microsoft's next product, you should look at what Apple was doing three years ago (a paraphrase, but you get the point) continues to hold true.
In this particular instance, I'm referring to the announcement of a UI refresh for Windows Mobile.
In contrast, Apple have redefined the mobile phone interface with their first product. I'm no Apple fanboi, but I can recognise true innovation when I see it, and Microsoft just doesn't get it. Rant over.
Saturday, 8 December 2007
Printing on the Mac
Getting printing working on the Mac wasn't the most straightforward thing. For starters, I wanted to print to my Suse 10.2 print server over CUPS. The printer is an HP Deskjet 5150, which wasn't on the list of printers that support network printing.
The first thing to do was download the ESP Ghostscript package (espgs-7.07.1.ppc.dmg), Foomatic RIP (foomatic-rip-3.43.2.15.ppc.dmg) and the HP driver (hpijs-2.7.10-UB.dmg). Once these were installed, I tried to set up an IPP printer, but this didn't print and put the printer into "Stopped" mode.
The error was unhelpfully "200 Get-Printer-Attributes client-error-not-found".
Using a combination of log watching on the Mac and Linux sides, it appeared that the URL was incorrect. On the Mac, I opened Safari and browsed to the CUPS interface (http://localhost:631). I then set up the printer so that the URL was:
ipp://192.168.192.5:631/printers/deskjet5100
The IP address is the IP of the Linux CUPS server and deskjet5100 is the name of the queue on that server. The /printers/ bit was important and the GUI didn't appear to set that up - hence no printing. Having made these changes, the print job was being sent to the correct server, but still didn't print; this time with another unhelpful error:
Print-Job client-error-document-format-not-supported.
Fortunately a quick Google pointed me in the right direction. To fix it, I edited /etc/cups/mime.convs on the Linux server and uncommented the following line:
# application/octet-stream application/vnd.cups-raw 0 -
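With the leading # removed, that becomes:
application/octet-stream application/vnd.cups-raw 0 -
(Depending on the CUPS version, the matching application/octet-stream line in /etc/cups/mime.types may need uncommenting as well.)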
A quick restart of the cups server (service cups restart) was followed by a successful test print!
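For completeness, the Mac-side queue can also be created from Terminal rather than the CUPS web interface. A sketch - the PPD path is an assumption, so point it at whatever the hpijs package actually installed:
lpadmin -p deskjet5100 -E -v ipp://192.168.192.5:631/printers/deskjet5100 -P /path/to/deskjet-hpijs.ppd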
Friday, 7 December 2007
Posted from a Mac!
My Mac Mini arrived today!
It's tiny, silent and most of all, works with the Belkin Omnicube KVM! It's actually a bit more complex than that: The keyboard is an old AT keyboard (with the old style pre-PS/2 DIN interface) which is converted to PS/2 for the KVM. The mouse is a Microsoft Optical USB mouse which is also converted to PS/2 for the KVM switch. The Omnicube is a four port PS/2 switch which is then plugged into a two port PS/2 to USB converter (that I found in Maplins) and then plugged into the Mini.
The mouse is fine. The keyboard is almost completely sorted: it finds the UK pound sign correctly, but the speech marks are in the US position (they should be above the 2 on a UK layout), the @ sign has ended up above the 2 instead, and Page Up/Down, Home and End don't seem to work.
Apart from that, it's great. Now I just need to learn Mac OS X.
Tuesday, 4 December 2007
Locking down Facebook
There has been a flurry of concern regarding the Facebook Beacon advertising system. The first warning came when some Facebook users became aware that third party websites were posting updates to their Facebook minifeed. In response, Facebook changed the privacy options to allow users to opt out, and a number of users found workarounds using the Blocksite add-on for Firefox.
Unfortunately this was insufficient and it has now been revealed that Beacon sites are sending Facebook information even when you are logged out of Facebook! Even worse, unless you are sniffing your network traffic, you wouldn't know about it.
I'm guessing this works because Facebook doesn't remove all cookies when you log out and it is using these cookies to link activities on third party sites back to Facebook.
So in order to allow access to Facebook, without letting them access the rest of my online life, I've done the following:
1) Create a dedicated Firefox profile for Facebook
2) Install Blocksite in the Facebook profile to stop Beacon working
3) Install Blocksite in the default profile to prevent access to Facebook
4) Delete Facebook cookies from the default profile
To create a new profile, configure Firefox to open the profile manager by appending -ProfileManager to the execute command.
Create the new profile (e.g., "facebook") and then create a new Firefox icon, removing the -ProfileManager and adding "-p facebook -no-remote" (without the quotes).
Change your original Firefox icon and add "-p default -no-remote" to the execute command line. You no longer need to use the -ProfileManager switch.
The -p option specifies a profile to use, and the -no-remote allows multiple profiles to be loaded concurrently, so you can have your normal browser and your Facebook browser open at the same time.
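By way of example, the three launch commands end up looking like this (shown Linux-style; on Windows the same switches go on the end of the shortcut's Target field):
firefox -ProfileManager
firefox -p facebook -no-remote
firefox -p default -no-remote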
Once you have a dedicated Facebook profile setup, use the Blocksite add-on to prevent Beacon from working.
It might also be a good idea to prevent Facebook loading in your default profile, just in case you forget where you are. Again, this can be done using the Blocksite add-on.
Another step you can take to protect your privacy is to use a Yahoo! disposable email address specifically for Facebook. This means that third parties won't be able to link you based on your email address.
It's unlikely that this will be the end of the Facebook privacy problem, but hopefully these techniques will help keep Facebook isolated from the rest of your online activity.
Friday, 23 November 2007
Living on the Cloud: A lookback
The experiment to move a lot of my data to the cloud has been very successful. I now use my Google Mail and Yahoo Mail accounts regularly, Google Docs quite often, and Google Reader has become a part of my daily routine.
I'm using Yahoo Mail's AddressGuard feature regularly and it's been brilliant at giving me a way to stop companies using my real address. To be fair, I've not had a single spam mail from either Google or Yahoo.
Google Docs has been developing nicely and I especially like the collaboration features (the ability to allow others to access and update specified documents, versioning etc). It's not a perfect product, and there are better online office apps available, but it's developing quickly and has the advantage of Google's resources behind it.
But it's Google Reader that has had the most impact. It's solved the problem of keeping RSS feeds in sync, has a speedy interface that can be keyboard driven and actually allows me to keep up to date with a large number of sites and blogs.
There is still more to be done. I'm not synchronising my contacts, tasks, notes and calendar from Exchange at work to a cloud based service at the moment, but as I have a Blackberry Pearl, I can easily access my data on the move.
For the future, I would like to look further into online collaboration for church use. Some future form of Google Docs might be the answer, or there might be a better solution out there...
Saturday, 10 November 2007
Fixes to Solaris Express on VMware Server
Having installed the VMware Tools into my newly created Solaris Express VM, I discovered that the X screen resolution was absolutely massive (something like 2000x1000 pixels). The solution was to edit /etc/X11/xorg.conf from a command line login and remove the unwanted Modeline entries. As I wanted to run my VM at 1024x768, I removed all the other Modeline entries and then exited the shell. When the desktop manager reloaded, the screen was at the desired resolution.
The second problem was that installing the VMware Tools changed the network device from pcn0 to vmxnet0, but the /etc/hostname.pcn0 file remained and a new file was not created. I renamed this to /etc/hostname.vmxnet0 and restarted networking (rebooting will also work!).
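In other words, something along these lines (followed by a reboot or a bounce of the physical network service):
mv /etc/hostname.pcn0 /etc/hostname.vmxnet0
svcadm restart svc:/network/physical:default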
Sendmail complains if it does not have a fully qualified domain name. The easiest fix is to edit /etc/hosts and add a domain name alongside your host entry.
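For example, an /etc/hosts line like the following keeps sendmail happy (name and address are placeholders):
192.168.1.20   solexp.example.com   solexp   loghost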
These are fairly minor changes that need to be made and the installation is looking pretty decent so far.
Friday, 9 November 2007
Making virtualisation a reality
My second Shuttle is now up and running as a VMware Server. The host operating system is a fairly small version of OpenSUSE 10.2 which on its own seems very nice.
So far, I have created a Solaris Express installation which I'll need for doing the sysadmin exam, and I've just finished installing CentOS 4.5, which should closely mirror our hosting provider's setup. I want to create a local development version of the MBC website.
After months of procrastinating (and getting married!), the VM project is finally kicking off!
Saturday, 3 November 2007
Google Mail to get IMAP
Some people are reporting that Gmail is introducing IMAP support. This is a real bonus for those of us that want to access our mail on the cloud while at the same time maintaining a synchronised offline copy.
My current system for handling Gmail is to download it all through POP3 (and keeping a copy on the server). Email that is sent from my Thunderbird client goes through the Gmail SMTP servers which save a copy of the mail in my sent items "folder".
While this works, it's not as good as real IMAP. It looks like Google are making some web UI improvements to Gmail as part of the upgrade and although I don't have this feature yet, it looks pretty promising.
Friday, 31 August 2007
Why is software late?
The reason I have not posted anything over the last couple of weeks is because I've been very busy working on the migration of our applications from Ingres 2.6 to the latest 2006 R2 release. We had originally intended to ship this out in late July, but we are now expecting (and hoping!) that things will be ready to ship by the end of next week.
So why are we late?
As I've reflected on it, I'm reminded that most IT projects fall into one of the following categories:
1) Release it only when it's ready (typical for many open source projects)
2) Release on time and then apply numerous patches later
3) Release late (and probably over budget)
Vista is probably the most famous project that was late, and this week Microsoft announced that Server 2008 will be delayed. Tonight I read that KDE 4.0 will be delayed by two months, and almost every government IT project is both late and over budget.
In our case, the problem has been the discovery of a number of bugs that have required patches (for the database) or changes to our code. These have been difficult to predict and require regression testing (the fact we want to simultaneously release on four platforms only adds to the delay). Now this is not intended to be a cheap excuse, but it does highlight a problem that affects all parts of the industry.
As long as projects like this are "stabs in the dark" (caused by using new and sometimes unfamiliar technology), I'm not sure how things can improve.
Wednesday, 15 August 2007
Managing Brightstor ARCserve Backup
At work, we use CA's Brightstor ARCserve Backup (BAB) to back up our network of Solaris, Linux, Windows and AIX servers. The backup is done on a Solaris 10 server running version 11.5 of the server software.
We connect to a 40 tape Plasmon library equipped with two LTO2 drives (although we only use 1 tape drive at the moment and allocate only 20 slots to the backup server). This connects to our SAN using an ATTO FibreBridge 2300 (SCSI in one end, Fibre Channel out the other).
BAB is fairly good at what it does, but the FibreBridge is a bit dodgy and occasionally crashes. Something it did last Friday night.
Like many organisations, we perform a full system backup on a Friday night, and then use differentials (level 1) backups Monday through to Thursday. Monthly tapes are then stored off site.
One issue I've been fighting with is that BAB stores its backup catalogs in an Ingres database. Each file that is backed up gets a row. We've been running the backup for about 4 months and now have 61 million rows in one of the tables.
In order to reduce the size of the database, BAB includes a "prune" option which removes the details of old jobs from the database. I configured this to run, but then noticed that a number of the text catalogs from previous jobs had not been imported into the database (BAB writes these during the backup and then loads them into the database after each session using the MergeCat utility).
So most of yesterday was spent completing the MergeCat (check /opt/CA/BrightstorARCserve/dbase/tmpcat to see what needs importing) and then running the backup last night. I have now just kicked off the dbclean to purge the jobs.
To make things a bit faster, I did the following:
By default, the Ingres configuration only had one write behind thread. This caused a bottleneck as the log buffers filled up faster than they were being written out. I've increased this to 20.
The dbclean does a bunch of "delete from astpdat where sesid = x" transactions and if the table is big, the transaction log file will fill. I'm currently running a 5GB transaction log file.
The logging system was still choking. Running logstat -statistics showed that I was getting a lot of log waits and log split waits. I increased the buffer count to 200 (from 20) and changed the block size from 4k to 8k (reformatting the transaction log file in the process).
The biggest performance gains came from dropping the indexes. There are three created by default, so every insert or delete also has to update the indexes. In order to speed things up, I dropped all indexes during the MergeCat, and recreated only ix_astpdat_1 for the dbclean (ix_astpdat_1 is keyed on the sesid, which is used during the delete - without it, the deletes would be a nightmare as astpdat is a heap table).
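As a rough sketch from the sql terminal monitor - the key column and storage structure here are from memory rather than the BAB schema documentation, so verify them against your own install:
drop index ix_astpdat_1;
create index ix_astpdat_1 on astpdat (sesid) with structure = btree;
The other two default indexes get the same drop treatment and only go back on once everything has finished.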
This is causing things to motor along now, although I still want to do some cache analysis to see whether increasing the DMF cache will improve things.
Once the dbclean completes, the only thing remaining is to modify the database (and hopefully reclaim a load of space as heap tables don't compact when rows are deleted) and then recreate the indexes.
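In Ingres terms that final step is roughly the following (a sketch; as I recall, modifying the base table also throws away its secondary indexes, hence the recreate afterwards):
modify astpdat to heap;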
Sunday, 12 August 2007
Plans for more upgrades...
Recently, the rumour sites were full of speculation that Apple were killing off the Mac Mini. This little machine offers PC users the ability to experience Mac OS X without needing to buy (and find space for) a new monitor, keyboard and mouse, and it's been on my radar for a while now.
Therefore, the news last week that Apple had upgraded the specification of the Mini was quite exciting, if only to give a bit more confidence that the machine would be around for a while longer.
Following some careful negotiations with T, I will be buying a new Mini when Leopard is released. There is no way that I would buy one now only to have to upgrade the OS in about two months! The addition of the latest iLife 08 will be good, and I'll probably pay out for iWork 08 as well. The latter is particularly interesting as the new Numbers application provides a new take on the spreadsheet.
There is some uncertainty over whether my Belkin Omnicube will work with it as this has PS/2 ports, and I'm not sure if a USB to PS/2 converter will do the job, so I might need to factor in a KVM upgrade as well.
I'm guessing I'll need to get an external hard disk for the Mini to get the benefit of Time Machine, perhaps something that will match the style of the Mini.
Unrelated to the Mac, I'm also looking at getting a large internal hard disk for my SK41G [Silver] Shuttle to host a number of virtual machines, which should now be possible thanks to the 2GB RAM upgrade I put in last week, as well as adding another 1GB to the SN95G5V3 [Black] Shuttle to bring it up to 2GB.
Finally, I'll be looking for a backup mechanism for the Silver Shuttle - maybe another external Firewire drive, but potentially a NAS disk of some sort. All exciting stuff really...
Tuesday, 7 August 2007
Backing up Vista
Bought some DVD+R discs yesterday and created the HP Recovery Discs for T's new Vista PC. This means the course of system recovery is:
1) Perform factory-setting rebuild using recovery partition
2) If hard drive has died completely, use the Recovery Discs
There is also a bootable Hardware Diagnostics boot disc which could help identify problems. Of course, this won't recover any user data, so for that, Vista has a built-in backup program.
The Backup and Restore Center is different depending on the version of Vista being used. The Home Premium version (T's machine) supports backing up user data files, but not the entire machine. The Business version can do user data files, but can also back up the entire computer in the form of an image (a la Ghost).
On the Home version, backing up was a breeze. I had attached a 160GB LaCie D2 drive over Firewire (400) which Vista automatically picked up. I pointed the program to the external drive and it prompted me to select a backup schedule. The default was to create a weekly backup every Sunday night, which I accepted. The backup kicked off, went and did its stuff, and that was that.
Very easy and impressive. Given the amount and type of data people store on a PC these days, the ability to do a backup is essential and for it to be used, the process must be painless. Vista appears to have got this right (of course, the ability to restore is also critical, but that's for another day).
On an unrelated note, my RAM upgrade has arrived for my second Shuttle, so this should see the start of the big virtualisation project soon.
Monday, 6 August 2007
New HP S3150 PC
Yesterday, T and I bought a new PC from our local PC World. A few years ago the thought of doing this would have been unthinkable, but for me personally, the days of building from scratch are over and the extra cost of getting a pre-built machine is negligible.
The model I originally specced went end of life last week, so I had to look for an alternative.
The new machine is an HP S3150 with a Core 2 Duo, 2GB RAM, 250GB hard disk, TV tuner and a 20" HP TFT widescreen monitor. It's not the most powerful gaming rig ever built, but will be perfect for T's needs, and it's got a respectable Windows Experience score.
The S3150 is designed as a media centre PC and is tiny: about 11cm wide and under 30cm high. It fits nicely at the bottom of the PC desk and is quite quiet (but not silent). As a media-oriented PC, it's got a remote control (that I haven't played with yet) and a wireless keyboard and mouse. I've traditionally not been a fan of these, but I've been pleasantly surprised by the ones supplied.
It's running Vista Home Premium and having spent about an hour removing all the pre-installed software that scatters unwanted icons on the desktop (and the obnoxious Norton Internet Security), it's now at the stage where I'm ready to do a proper system backup.
One of the tools provided allows the creation of a hardware diagnostics boot CD, which I've done but not tested. It also has the ability to generate some recovery media (to restore the PC back to its factory defaults), and I'm under the impression Vista has its own tools, but I need to investigate these further. I've read that some people have had problems creating recovery media with DVD-RW (or was it DVD+RW) so will need to check this out.
Interestingly, the 250GB disk is partitioned with a 50GB C: drive, a smaller 8GB(?) recovery partition and the remaining space unpartitioned. This would surely confuse normal, inexperienced users.
I've set up users for me (the default administrator user) and T (a normal user), but I've not installed any anti-virus software yet. I'm intrigued to see what Microsoft OneCare is like and will investigate further.
Also need to work out a regular way of backing up the system for T - probably using an external hard drive, and will need to get a small print server so that we can both share the printer.
All in all, a lot of fun to be had with it.
Friday, 3 August 2007
New Virtual Projects
I've got a RAM upgrade on order for my second Shuttle (currently running Solaris Express). This will give the box 2GB RAM and I'm intending to put a few servers on it using VMware Server.
Last night I put Fedora 7 on it and was reminded that the onboard graphics chip has a signal problem that results in a slight blur. It also eats into the main RAM which is never ideal.
So I dug out my old Matrox G450 and popped it in the server.
What an amazing contrast! Although my main graphics card is fine, the picture quality of the G450 is noticeably sharper. I'm not expecting the G450 to have enough power to run XGL for wobbly windows, but that's not the point of the server.
I'm hoping to run a couple of VMs - OpenSolaris or Solaris Express, as well as a hardened Linux install (CentOS 5 or Fedora 7 with SELinux enabled). The ability to host a couple of small Solaris Zones within a VM should be interesting as well and will be useful for revising the Solaris 10 upgrade exam.
Then it will be a case of seeing how much of my own stuff I can access from the cloud. Technologies such as iFolder, FreeNX and some web based IMAP client (Roundcube maybe) will be interesting.
Thursday, 2 August 2007
Windows Server 2008: Reinventing Unix?
Regular readers (hah!) will be aware that I am very impressed with Windows Vista. One of the reasons for this is that Windows is finally reaching the point where it can be run as a limited user and the system is more tightly enforcing the separation between system and user data (the virtualisation feature in the registry is very impressive). It also forces better practices such as not allowing users to write into the root of the C: drive.
With this in mind, I've been doing some initial reading up on Windows Server 2008, partly because the Microsoft dominance in IT means I'll have to become familiar with it at some point, but also because it looks like it might be a very interesting operating system.
Firstly, the OS can be stripped down to perform a specific role (file server, database server, web server etc) and only the required components are installed. This should result in better security (a smaller surface area to attack) and perhaps better performance (less processes running) which should mean Windows Server 2008 has the potential to be easier to manage.
Secondly, PowerShell is included by default, finally bringing a [best of breed] scripting environment to Windows. Apparently the whole operating system can be managed through the command line, which should certainly improve supporting customers over sometimes slow WAN links.
If these two features sound similar to Unix, then Henry Spencer's now classic quote "Those who don't understand UNIX are condemned to reinvent it, poorly." carries some resonance. The fact is that with Windows Server 2008, Microsoft are creating an evolution of their OS that does sound a lot more like Unix than previous versions.
And this can only be a good thing!
Monday, 30 July 2007
A second look at the Foleo
As I alluded to in a previous posting, when Palm announced the Foleo, my first response was "Palm are seriously out of ideas and this will help kill them". The Foleo was overpriced, looked like a cut down laptop with a limited feature set and no "sex" appeal like the iPhone. Based on a lot of comments following the announcement, I was not alone in my thinking.
Fast forward a few weeks and the tone of the online media is changing somewhat, and this is coinciding with my own thoughts. The thing that first got me thinking about it as a potentially useful device was seeing a video clip of it on YouTube. It was a much smaller device than I originally expected. Then I read about some of the third party apps being developed for it and realised that the potential of this little device is much larger than I first thought.
The Foleo is effectively a small, silent, instant-on, personal laptop. It can surf the web (using the built in Opera browser), handle multiple email accounts, and do documents, spreadsheets and presentations. It's about the size of a decent notepad with many more features - a bit like the old Psion Netbook PDA but with the ability to access the 'net.
So I can imagine carrying this device into meetings (both at work and church) and having access to all my data. Okay, so if I had a smartphone I could do the same, but this would be a *lot* more practical in that text entry would be potentially as quick as using my full laptop.
Beyond that, the fact that the Foleo runs Linux and has a Terminal application available from the application menu opens up possibilities far beyond the web/email/documents market. Add in an 8GB SD card and who knows what it could do.
Okay, so it doesn't have a touch screen, but with a UI designed for the keyboard (apparently all the apps make use of keyboard shortcuts), this might not be an issue.
The Foleo is starting to look like a very compelling device...
Saturday, 28 July 2007
The frustrating state of mobile computing
The new Palm Desktop did not work with Vista and refused to progress beyond the splash screen. To be honest, I've not spent a huge amount of time investigating the reasons. This means that looking for a new PDA device is still on the agenda. Of course, the traditional PDA market has almost gone, merged into mobile phones, which is where I've focused my attention.
The main players in the mobile operating system market are Windows Mobile, Symbian, Palm and Blackberry. To add to the mix, phones based on the Symbian operating system have a couple of choices of operating environments - Series 60 and UIQ.
All have their pros and cons - I've never been comfortable with the old Windows PocketPC interface. It always felt like Microsoft stuffed the Windows desktop down into a small device, whereas the Palm OS was stunningly easy to use in comparison. However, with the latest version, Windows Mobile 6, things might have changed, so I should keep an open mind. I would at least expect it to sync my Outlook data well!
Series 60 looks very powerful and is certainly well supported, with Nokia as its biggest backer. Having read up a bit on it, the two big problems are that it does not synchronise Outlook categories (which I use heavily) and does not support a touch screen (or at least most devices don't). The latter problem is manifested in the web browser, which uses a little joystick to move a cursor around the screen - no, no, no, no, no!
Conversely, UIQ, the other Symbian based platform, pushed by Sony Ericsson, is touchscreen oriented. Now this might still be an option - the new P1i smartphone does look pretty nifty, and it also comes with a pseudo-QWERTY keyboard. I don't have any experience with its ability to sync with Outlook though, so will need to investigate further.
I've not been a huge fan of early Blackberry models, partly because I wasn't keen on dragging my work around with me all the time, but also because the early UI wasn't very appealing. Some of this might be the "interesting" theme one of my colleagues has chosen. I do know that it does Outlook (or Exchange, to be more precise) very well, and the new model is very small and looks just like a regular mobile. If I can get one from work and it manages the calendar and contacts well enough, then maybe that's all I need...?
The remaining contender is Palm. Bearing in mind that this whole process started with the miserable failure of Palm to support anything but their top-of-the-line products, their inclusion here is surprising. Let me explain:
I followed the Browse and Buy Devices link on Microsoft's page and specified that I wanted both a touch screen and a QWERTY keyboard, and that the device should support wi-fi. Of the few devices returned, the Treo 750 fulfilled the criteria. It also supports 3G which is another pre-requisite (but wasn't a filter option).
Now the Treo 750 is a Windows Mobile device but it does tick all the boxes. It's only a small sideways step to look at the Treo 755p which is effectively the same model but running Palm OS which I am very familiar with.
This comes full circle to the originating problem - will the Palm sync with Outlook 2007 on Vista? My experience with the beta software has been poor, but perhaps I need to wait for the final release.
Why Palm OS? It's architecturally very poor and desperately in need of a new core (I've been hoping for this since Cobalt was announced!), but the user interface is still very, very nice compared to the competition.
I guess the first step is to see if the company will get me a small Blackberry, if only to replace the clunky, old Nokia I forget to carry with me. Then I'll check out the state of the market. Until then, I guess I'll carry on with my Sony Ericsson K800i for phone use and forgo all my personal contact data.
Side note: Although my first thought regarding the Palm Foleo was "Palm are dead - who wants this product?", it looks like a software ecosystem is starting to develop for it. It's far too expensive at present (and really needs a touch screen), but maybe there is more to this device than first appears...
The main players in the mobile operating system market are Windows Mobile, Symbian, Palm and Blackberry. To add to the mix, phones based on the Symbian operating system have a couple of choices of operating environments - Series 60 and UIQ.
All have their pros and cons - I've never been comfortable with the old Windows PocketPC interface. It always felt like Microsoft stuffed the Windows desktop down into a small device. This was in comparison with the Palm OS which was stunningly easy to use in comparison. However, with the latest version, Windows Mobile 6, things might have changed, so I should keep an open mind. I would at least expect it to sync my Outlook data well!
Series 60 looks very powerful and is certainly well supported, with Nokia as it's biggest backer. Having read up a bit on it, the two big problems are that it does not synchronise Outlook categories (which I use heavily) and does not support a touch screen (or at least most devices don't). The latter problem is manifested in the web browser that uses a little joystick to move a cursor around the screen - no, no, no, no, no!
Conversely, UIQ, the other Symbian based platform pushed by Sony Ericsson, is touchscreen oriented. Now this might still be an option - the new P1i smartphone does look pretty nifty, and it also comes with a pseudo-QWERTY keyboard. I don't have any experience with it's ability to sync with Outlook though so will need to investigate further.
I've not been a huge fan of early Blackberry models, partly because I wasn't keen on dragging my work around with me all the time, but also the early UI wasn't very appealing. Some of this might be the "interesting" theme one of my colleagues has chosen. I do know that it does Outlook (or Exchange to be mor precise) very well, and the new model is very small and looks just like a regular mobile. If I can get one from work and it manages the calendar and contacts well enough, then maybe that's all I need...?
The remaining contender is Palm. Bearing in mind this whole process started with the miserable failure of Palm to support anything but their top of the line products means their inclusion here is surprising. Let me explain:
Friday, 27 July 2007
Beyond the Palm
Having blogged about Vista quite a lot recently, I have decided to revisit the subject of cloud computing. I have still not got syncing between Vista / Outlook 2007, Palm and Yahoo! working, but Palm have released a beta of their new Palm Desktop for Vista software. I'll be giving it a try soon, but the T3 isn't on the officially supported list.
It appears that Palm are continuing to throw away their market. I've been a Palm user since the original Pilot Professional, upgrading to an M500 and then to the T3. The PIM applications have been the main reason why I've stuck with Palm (and I hate the Windows Mobile interface!). But by not supporting the T3 - a model that isn't *that* old - I'm forced to look at other options.
So my criteria are:
- Decent PIM applications
- PIM synchronisation with Outlook 2007
- Web browser - ideally something like Opera
- Some form of Internet connectivity - wireless / bluetooth / ???
- Keyboard
Synchronising with Outlook will get my cloud experiment back on track, and I like the ability to browse the internet when out and about. The keyboard is a bit contentious, but ultimately I'm faster on a keyboard than I am writing by hand, and handwriting recognition struggles to interpret my scrawls.
I'm guessing I'm looking at some form of mobile phone / smartphone. The OS News review of the Nokia E90 looks quite good, so that one goes on the list of potentials. I'm still a fan of Sony Ericsson phones, so will look at their offerings.
iPhone? Maybe, but it doesn't have a keyboard (perhaps not a showstopper depending on how good the virtual keyboard is) and I'm not sure what its PIM capabilities are.
Wednesday, 25 July 2007
Vista: One week on
I've been using Vista now for a week and to my own surprise, I keep saying good things about it! Okay, so the problem with the Palm sync hasn't gone away (although I noticed tonight that Palm have released a beta Palm Desktop for Vista - without support for my T3!), but it's a nice working environment and definitely an improvement on XP.
Tomorrow I'll be working from home, so will get to see how Vista handles the Cisco VPN client, wireless access etc.
It looks like T will therefore be getting a Vista PC - something I might investigate on Saturday with a trip to PC World.
One of the really good things about my job is the variety. One moment I'll be testing Vista, the next I'll be hacking on a new shell script:
Things have been getting busier at work as we move towards the beta release of our product on Ingres 2006. I've been writing pages of documentation detailing the migration process and spent several hours working on a script that parses a database copy script and splits it up so that tables can be copied in parallel. Hopefully I'll get that finished tomorrow.
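For the curious, the splitting can be knocked together with a few lines of awk and shell. This is only a rough sketch of the approach rather than the real script - the copy.in file name, the "^copy" marker and the sql terminal monitor invocation are assumptions about a copydb-style script and will differ locally.

    #!/bin/sh
    # Rough sketch: deal each "copy table ..." block in a copydb-style script
    # out to N chunk files, then load the chunks in parallel.
    # INFILE, DB and the marker pattern are assumptions, not the real script.
    INFILE=copy.in
    DB=mydb
    N=4

    # Round-robin the copy statements into chunk.0 .. chunk.(N-1); lines that
    # follow a copy statement stay with that statement's chunk.
    # (Use nawk rather than awk on Solaris, as the old awk lacks -v.)
    awk -v n="$N" '/^copy /{i=(i+1)%n} {print > ("chunk." i)}' "$INFILE"

    # Run the chunks concurrently via the terminal monitor and wait for all.
    for f in chunk.*
    do
        sql "$DB" < "$f" > "$f.log" 2>&1 &
    done
    wait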
Thursday, 19 July 2007
Vista: Day Two: Boo
I can't get the Palm to sync with Outlook. It's giving a strange OLERR error message which, according to various sources, isn't actually caused by Vista. There are several possible causes, but I've tried a number of them without success.
The Yahoo! sync doesn't work either, which means the cloud experiment has come crashing down in a nasty heap. There is a new version of the Palm Desktop scheduled for "the summer" so I'll try again then. No word about Intellisync for Yahoo!, so we will have to wait and see.
I tried to use Plaxo to sync to Outlook and in the process managed to duplicate my contacts and calendar entries. Fortunately I have most things categorised in Outlook so it was trivial to recover.
All in all, not a great day for progress.
Wednesday, 18 July 2007
Vista: Day One
The Vista install went very smoothly yesterday on my work laptop. It's mostly working... so far.
The Aero Glass interface is quite nice and the new default system font is very pleasant to look at. The sidebar is reproducing the gadgets I've previously played with using Yahoo! Widgets and the Google Desktop sidebar. I like the window opening and closing fade effect, but the frosted glass window frame is not very useful.
Adding a networked printer was extremely easy - clicking the Add Printer button presented a list of network printers, I selected the one I wanted and it automatically installed the correct driver. No need to scroll through long lists of drivers, although I'm disappointed that the printer selection window size cannot be changed and I have to scroll through a list in a tiny window when I have acres of screen space that could be used.
Things that aren't currently working: Palm HotSync and Yahoo! Intellisync. The Palm is supposed to work, so I'll have to do some digging.
One comment must be made about UAC. It's actually not that bad. I'm running as a non-administrative user, so UAC keeps prompting me for an administrator's password whenever it wants to perform a privileged operation. I believe that if you're already an administrator it simply pops up the infamous allow/deny box without requiring a password.
To be honest, having installed a bundle of software packages, it hasn't been too intrusive. There is no caching of the password as you find on Linux, but then Vista is likely to be a bigger target for the black hats. Of course, many users will blindly click Allow, but at least UAC is a start.
At the end of day one, it's a pretty positive experience. Onto day two...
Tuesday, 17 July 2007
The start of the great Vista experiment
T has decided that she would like a new PC and that I would have the job of sorting it out. So I worked out a list of pros and cons based on her needs to decide whether a Mac, a Linux PC or another Windows machine would be the best choice.
I'm a big fan of the Mac and hope that the rumours of the Mini's demise are greatly exaggerated, as I was looking for a small Mac to play around with. The iMac seemed like it would be the best option for her, but with a product refresh imminent, it might not be the time to buy. Furthermore, T has a requirement to work on Microsoft Office documents, and the added cost of purchasing Office 2004 pushed the cost too high (when OpenOffice.org is native on Mac OS X this might change things).
The second option was to migrate to Linux. Although I'm a happy Linux desktop user and happy with the combination of OpenOffice.org, Firefox and Thunderbird, there are some applications for which Linux isn't quite there yet (Scribus is a decent DTP program, but it's not as easy to use as Publisher...).
Which led me to investigate Vista. Now, I'm not a huge Microsoft fan and my first experience with a Vista beta was to loudly diss it as a cheap Mac OS X rip-off. But the reality is that at some point, be it at work or when helping my friends, I will have to learn it and use it.
So, this is the start of the great Vista experiment. At the moment, I'm typing the last part of this entry on my laptop which is installing a bunch of Vista patches. I'll blog my experiences - what's good, what's bad - and hopefully learn some things along the way...
Friday, 13 July 2007
Google Browser Sync - Yay!
Earlier this week I merged my personal and work bookmarks into a single file and installed the Google Browser Sync add-on for Firefox. This keeps multiple machines in sync with each other. As of tonight, I have got my "global" bookmarks syncing between my home PC, my work laptop and T's PC.
All good stuff and working very nicely. This is a real result for cloud living.
Monday, 9 July 2007
Bookmark Woes
Having successfully synced my calendar, to-do list, notes and contacts so I can access them from anywhere, I turned my attention to bookmarks as the next thing to move to the cloud.
I have a small collection of bookmarks on my home PC, as well as on my work laptop. Over the weekend I took the step of merging them together into Yahoo!'s new bookmark system. This is pretty funky....
...until you try and manage them! Although it sports a slick AJAX-driven UI, Yahoo! Bookmarks makes it surprisingly hard to drag and drop bookmarks and re-arrange them. It also requires a visit to the page to access them, and my most common bookmarks are not sorted to the top (accessing Facebook this way is slower than typing www.facebook.com into the address bar).
The Yahoo! toolbar for IE provides instant access to your bookmarks by clicking a button, but the Firefox toolbar tries to read the old bookmarks system (which wasn't very good!).
Alternatively, I could use the Google bookmarks system, but this doesn't appeal at the moment (it doesn't have the whizzy page preview that Yahoo has).
So for now, I'm opting to use the Google Browser Sync extension for Firefox that will keep a copy of the bookmarks on the cloud, while still providing the ability to manage them using the rich Firefox interface.
It's not a solved problem, but it's a good step forward.
Thursday, 5 July 2007
A plethora of blogging options
So I've just registered with Blogger (part of the Google Empire). This is in addition to my Slashdot Journal, neglected Yahoo! 360 page and my personal website. Far too many options to keep them all relevant, so I'll try and settle on the best and remove the others.
This blog is called "Living on the Cloud" and it refers to my current mini-project of moving all my "digital life" (how's that for a pretentious blog comment) from my local network to the Internet where I can [theoretically] access it from anywhere.
Check back soon for more updates (I know, you can't wait...)
Big Brother is watching you...
Had a thought recently following on from the terrorist attacks in the UK:
If we had hi-res CCTV in most public places, along with a DNA database of the entire population (including those pesky tourists ;-)), then would it dramatically cut crime?
At the moment, TV programmes like Crimewatch show low-res, grainy CCTV footage that doesn't look very useful. Hi-res will presumably arrive sooner rather than later and could help massively in identifying criminals.
Similarly, imagine how simple it would be to solve a crime by turning up, swabbing the scene for DNA and getting an instant ID.
I'm not saying that this is a good thing - it's open to misuse like most technology - and the privacy concerns are significant.
But it could also be useful in proving one's innocence. Hi-res CCTV will make misidentification more difficult, and DNA identification could similarly be used to prove that you were NOT the culprit.
Just a thought...
Wednesday, 27 June 2007
Upgrades and Downtime
The main problem of living on the cloud hit me last night when I got home to find my phone line was dead. The UK is experiencing a lot of flooding at the moment, and although the area I live in is unaffected, I'm guessing the rain has waterlogged a box somewhere and killed the line.
As always, BT claim that testing the line reports that all is okay. When pressed, they offer to send an engineer (but it will be chargeable if it's not BT's fault). After agreeing to this (I've done it enough times to know it's a delaying tactic), they immediately noticed that the problem was not just my line, but affected a significant number of lines in my area...(!).
So the "take home" from this is that life on the cloud is only as good as the connection to the cloud. Obvious stuff, but it's amazing how reliant one becomes on a permanent net connection.
Got to work this morning to read that Google Docs and Spreadsheets have been upgraded. The new interface is better and supports folders. It's still missing a way to bulk download documents for backup purposes. Will need to investigate this a bit more, but it's a good upgrade.
Tuesday, 26 June 2007
More cloud talk
I've been using Google Docs and Spreadsheets more and more. It's actually more than suitable for knocking out simple documents, and the ability to use it as a collaboration tool is very smart.
My experience with Google Reader has also been very positive. Finally a way to keep my RSS feed reading synchronised between work, home and away! And today I've managed to go another step towards living on the cloud: Yahoo! Bookmarks.
The old Yahoo! Bookmarks was a bit rubbish - too difficult to manage. So what does the new one have that improves on locally managed Bookmarks (beyond the obvious), and why not use Google Bookmarks? The killer feature of Yahoo! Bookmarks for me is the ability to save a copy of the page and view bookmarks using thumbnail images. All wrapped up in a nice AJAX interface.
So the cloud experience is going well. There are still concerns about backing up - perhaps something the Google and Yahoo APIs will address, and the concerns about privacy won't go away (but then I'm not storing personal sensitive details in the cloud).
Of interest is the ability to access data from anywhere. Google have created a Java application for my K800i phone that provides a really decent interface to Gmail. Yahoo! have similar offerings, so I'll be doing some further investigation into that soon...
Thursday, 31 May 2007
Living in the cloud
Firstly, to follow up on my last journal entry regarding Facebook - I'm hooked! It's so much fun to be able to see what your friends are up to.
As detailed in previous entries, I've started to move so that my calendar, contacts, to-do list and notes are synchronized to my Yahoo! account, and therefore online. I've also spent more time sending email from Gmail. Recently I received an email with a .doc attachment and Gmail offered to open it with Google Docs and Spreadsheets. This proved to be a very pleasant way of easily checking out a document without having to have Office or Openoffice.org installed.
Unrelated to email, at work, I was using Thunderbird for RSS feeds, but got frustrated when I couldn't access them from home. Which led me to investigate the new, improved Google Reader. It's very impressive - even down to the keyboard shortcuts that include some vi-like functionality :-) I exported my RSS feeds from Thunderbird into Google Reader and added a whole bunch more. It's a highly addictive way of browsing for news.
I realised that these small steps were moving me to storing all my non-private (confidential) data onto "the cloud" which I could then access from anywhere. There are nagging doubts about the data mining that Yahoo and Google can do, but so far they are providing genuinely useful services.
I'll update the journal with my progress, but until then, back to Facebook... ;-)
Monday, 16 April 2007
Yahoo! Mail AddressGuard and Facebook Fun
So some of my friends are on Facebook and I was feeling a bit left out. But I didn't want to open my inbox to a potential flood of spam, which is what apparently happens sometimes with these social networking sites. This prompted me to look into the Yahoo! Mail AddressGuard feature.
The basic idea is that you create a default name (jrmailgate in my case) and then you can easily set up a number of disposable addresses, so jrmailgate-facebook@ etc. If one address starts getting too much spam, or has found its way onto a bulk list, simply trash the address. It seems like a pretty elegant way to protect your private address.
To be honest, I'm not sure I "get" the idea of a social networking site. Seems like a lot of hassle, but it's all "Web 2.0 goodness" so I should probably make an effort... :-)
Incidentally, as a follow-up to my previous post, Yahoo! Mail does not have the same nifty synchronisation capabilities as Gmail, so I'm still looking for a solution to getting a *complete* backup of my Yahoo! mailbox (including sent items) without losing the ability to access it from anywhere. Ah well, at least it's sunny... :-)
Thursday, 5 April 2007
Backing up Gmail
I've started using Gmail more and more, primarily because I like the ability to read and respond to emails while away from my home machine. The only concern is that a copy of my email remains on Google's servers and there is no guarantee that this will be available.
The ideal way to fix this is to use the POP capability to download a copy of the email for archiving purposes. This is in preference to the mail forwarding option as it will also download the sent email items (the forwarding option will not enable archiving of sent mail).
Once POP access is enabled in Gmail's settings, it's a matter of configuring your email client (Thunderbird in my case) to download from Gmail's POP server. One important thing to remember is to "leave a copy of the email on the server". This will allow access to Gmail through a browser as well as in Thunderbird.
For sending, set the Thunderbird account to send via Gmail's SMTP server. This will magically save a copy of the outgoing email in the Gmail sent email list.
Finally, set up a filter rule in Thunderbird to move all email from xxx@gmail.com (where xxx is your own email address) into the Sent folder. This will keep things nice and tidy in your email client.
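For what it's worth, the same POP archive can be scripted rather than done through a desktop client. Here's a minimal sketch using fetchmail, assuming it is installed and that procmail is available for local delivery - the address, password and paths are placeholders:

    # Sketch only: poll Gmail's POP server over SSL, deliver locally via
    # procmail and leave the originals on the server ("keep"), mirroring the
    # "leave a copy of the email on the server" setting above.
    cat > ~/.fetchmailrc <<'EOF'
    poll pop.gmail.com proto pop3 port 995
      user "xxx@gmail.com" pass "your-password-here"
      ssl keep
      mda "/usr/bin/procmail -d %T"
    EOF
    chmod 600 ~/.fetchmailrc   # fetchmail refuses to run with a world-readable rc file
    fetchmail -v               # -v shows each message as it is collected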
Not sure if the same can be done with Yahoo! mail. Will investigate...
Friday, 16 March 2007
ZFS and RAID levels
As builders of Ingres systems, we tend to use RAID 1+0 (aka RAID10) for most of our systems. RAID5 is not recommended for databases due to the parity read-modify-write that is required for every write operation. Furthermore, the ability of a mirrored system to read two separate data blocks in parallel can give RAID1 a significant read performance advantage.
With the arrival of ZFS in Solaris 10, I've been reading up a little to see if this accepted wisdom changes with RAIDZ. A very interesting article (http://blogs.sun.com/roch/entry/when_to_and_not_to) basically explains that the read potential for a RAIDZ filesystem can be significantly less than that of a RAID10. However, for non-data filesystems (such as the checkpoint backup area), RAIDZ can provide some advantages. In these areas, disk I/O is less critical (the Ingres archiver writes data to the journals asynchronously to a user's query execution) and sequential write operations are handled better than random I/O, making RAIDZ ideal for checkpoints and journals, but less good for data areas.
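To make that concrete, the two layouts look something like this at the zpool level (the device names are made up for illustration):

    # Striped mirrors (RAID10-style) for the Ingres data locations, where
    # random read performance matters most.
    zpool create datapool mirror c1t0d0 c1t1d0 mirror c1t2d0 c1t3d0

    # RAIDZ for checkpoints and journals, where writes are largely sequential
    # and capacity efficiency is more useful than random IOPS.
    zpool create ckppool raidz c2t0d0 c2t1d0 c2t2d0 c2t3d0

    zfs create datapool/ingres_data
    zfs create ckppool/ingres_ckp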
Will now be putting this theory into practice...
Friday, 2 March 2007
Ingres Performance Tuning
I'm on an Ingres Performance Tuning course this week. Just completed day 4 of 5 and it's all been quite tough. There are many ways to tune a DBMS: choosing your table structures (heap, hash, isam and btree), deciding when to use secondary indexes, examining the Query Execution Plan (QEP - the way the DBMS chooses how it will execute a query), key design and application design considerations, all of which led us today to the locking and logging system - the subsystems that enforce consistency while allowing concurrency. The options and strategies are many, and an Ingres installation is extremely flexible (it needs to be when systems have hundreds of concurrent users and databases in the hundreds of gigabytes).
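To give a flavour of the sort of thing covered, here is the kind of session you might run from the terminal monitor - the database, table and column names are invented, and the choice of btree is purely illustrative:

    # Illustrative only: show query plans for the session, then restructure a
    # table and add a secondary index. All names are made up.
    sql mydb <<'EOF'
    set qep\g
    select o.order_no, o.order_date from orders o where o.customer_id = 42\g
    modify orders to btree unique on order_no\g
    create index ix_orders_cust on orders (customer_id)\g
    EOF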
The whole thing is really quite amazing and I bang my head against a wall when people automatically assume "open source database" equals only MySQL or Postgres.
I admit it - I like Ingres - I'm a fan. It's a free download and you can get it up and running (on Linux or Windows) very easily. There is a JDBC driver, a .NET connector, and PHP, Python and Perl interfaces. Go on, try something new... :-)
Wednesday, 31 January 2007
New monitor (Iiyama 22" widescreen minireview)
After working with an Iiyama 17" CRT for the last few years, I recently decided to upgrade to a TFT; something that uses a bit less power and takes up a lot less space. I've always thought highly of Iiyama as a manufacturer and decided to get their new 22" widescreen LCD.
It's a nice piece of kit.
The screen is big (obviously) and runs at a pleasant 1680x1050, which although not as high as my work laptop (1980x1600), is very comfortable to work with. It's amazing how much more desktop there is to work with... :-)
There's not a lot else for me to say about it - colours are bright, resolution good, it's got a black casing which fits in nicely with my black Shuttle next to it. All in all, I'm pleased with this purchase and hope it'll last as long as the trusty (and yellowed) CRT.
Friday, 5 January 2007
Getting Virtual
As someone who has spent the last 5+ years specifying and building servers, the shift to a virtual environment has been... interesting. From the moment I first heard about SAN technologies and, later, virtualisation, it's made a lot of sense. However, it's only in the last month or so, when I took a step back and thought things through, that the true potential hit me.
(Apologies to those who are quicker than I am; I'm sure this will cause a few people to go "duh!").
Previously, when asked to specify a server, a number of metrics would be considered: Processor load, memory requirements, disk usage and performance. These would be totted up and a server specified. The result is a server room with a large number of servers, most of them 1U, sitting idle for the majority of the time while waiting to run a single service. Of course, when it came to adding more disk, this was often a problem as the previously specified server is now full (typically a 1U server would be specified as more would be overkill... until the upgrade is required).
The implementation of a SAN therefore made a lot of sense. By consolidating all storage into a central location, disks can be added and distributed as necessary. The first use of the SAN was quite reactive and basically involved assigning a bunch of disk to a box for file serving. The second server to use the SAN basically took so much disk (24x73GB FC disks) that the effect was the same as adding the arrays directly. The next step however will be more intelligent - to effectively utilise a SAN, I think you need to provision the storage up front so that you can proactively assign it when needed.
The same applies I think to virtualisation. I've spent a couple of weeks now working on the detail of a new server room infrastructure that is virtual. Getting my head around the idea that I no longer need to be specifying specific servers, but rather capacity, has taken some mental adjustment. The result is a breakdown of a server into components that units can purchase, starting with a basic server (1xCPU, 512MB RAM, 1x36GB disk) and then scaling up as necessary.
So today, I plan to take a close look at specifying a couple of servers that are meaty enough to host a number of virtual machines. I was originally looking at the Sun Fire X4100 - a 1U server capable of supporting 2 x dual core Opteron processors. Having spent a bit of time getting to grips with the possibility of creating a large number of servers as necessary, the X4600 (capable of 8 x dual core Opterons) seems more appealing, as additional capacity can be bought at a later date.
All exciting stuff...
Wednesday, 3 January 2007
Synchronising - progress so far
Got back to work this morning and set up Yahoo! Intellisync. Now got it working synchronising my Outlook to my Yahoo! Calendar, Address Book and Notepad. All good so far.
Tried yesterday to get Evolution working with my Palm. Managed to get a backup of the Palm using pilot-xfer, and also got sync working with Kpilot, but gnome-pilot just doesn't want to work. It just sits there when I try and sync and eventually times out. Will do some more investigation...
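For reference, the pilot-xfer backup amounts to a one-liner (the /dev/pilot device path is an assumption - use whatever device your distribution exposes for the cradle):

    # Back up every database on the Palm into a local directory, then press
    # the HotSync button on the cradle when pilot-xfer starts listening.
    pilot-xfer -p /dev/pilot -b ~/palm-backup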
Monday, 1 January 2007
Synchronising my life
I've been a Palm user for years - originally starting with the Pilot Personal before upgrading to the M500 and now the Tungsten T3 which I use today. For a long time, I tried to keep my personal life and work life separately, but one day realised I was spending too much time and effort entering the same data twice.
It was also around this time that I became fully acquainted with Outlook and started sharing my calendar with colleagues, managing my time using appointments etc. Synchronising with the Palm made a lot of sense, so I abandoned my previous attempts to delineate my life and embraced a joined-up world. I effectively made Outlook the hub of my organising and started syncing the Palm with it (instead of Palm Desktop). The advantage of using Outlook in this way is that everything connects with it. I was then able to install my phone sync software (for the Sony Ericsson K800i) and ensure the contacts on my phone were up to date.
The problems I have at the moment are that my Linux box at home has nothing to do with my schedule, and there is no easy way to share my schedule with my fiancée.
Having spent a couple of days looking at the problem, I'm going to attempt the following:
1) Use Evolution instead of Thunderbird for email, and sync the Palm to it for calendar, tasks and contacts. This is primarily because the PIM component of Thunderbird (Lightning - a reworking of Sunbird into an extension) isn't ready yet and cannot do Palm syncing.
2) Use Intellisync to connect Outlook to Yahoo! Calendar and grant my fiancée read (and write..?) access to it.
This means that there is a danger of the Yahoo Calendar becoming out of date if I'm not at work, but it seems like the easiest way to do it.
This whole exercise has proven several things to me - most importantly that there is no easy way to share data that doesn't involve Outlook. It really has become the de facto schedule management software.
I've also learnt that Yahoo! Calendar does not provide an automated way of reading the calendar as an iCalendar file, and that Google Calendar does not provide any form of synchronisation beyond initial import/export of data.
If there is ever going to be a decent Open Source replacement for Microsoft Office, a worthy cross platform competitor to Outlook is needed. Hopefully Thunderbird/Lightning will be it and they'll get the Palm sync sorted.
The above seems like the most straightforward approach, but I'm also aware of Funambol - a potential server-side solution which I might get around to investigating if Plan A doesn't work.