I subscribe to the notion that the Macworld Keynote is one of the most important presentations in the IT industry. This is because Apple has a unique ability to create very smart, easy-to-use products that appeal to non-technical people, while also creating kit that geeks enjoy using (the number of MacBooks in use at the UK Unix User Group was testament to that).
Last year it was the iPhone, which, although not a perfect product (no 3G, for example), has shaken the mobile market so much that every other mobile company is having to up its game. Apple was also a significant influence in pushing the music industry to drop DRM.
The buzz at this year's presentation surrounds the MacBook Air, but in my mind the biggest news is the upgraded Apple TV. When it was released last year, the Apple TV was pitched as an iPod for the living room. This year, Steve Jobs admitted that the product hadn't met expectations, but that they were revisiting it. The big news was the introduction of movie rentals.
iTunes Movie Rentals provides a way to download movies from iTunes straight to the TV, with no computer required. A rented movie can be kept for 30 days, and once you start watching it you have 24 hours to finish it. If this works for rentals, then presumably the next step is purchases over the Internet. This could make the HD DVD vs Blu-ray conflict completely irrelevant. The number of PlayStation 3s out there will ensure that Blu-ray has some future, but perhaps the convenience of buying movies online will mean Blu-ray won't automatically replace DVD as the de facto format.
Regarding the MacBook Air: it really is thin, and the multitouch demo was very impressive. It's not something I'd be interested in getting, but I can appreciate that a lot of people will want one.
So although nothing this year topped last year's iPhone announcement, the upgraded Apple TV has the potential to be as disruptive to the media world as the iPhone has been to the mobile phone industry. Let's see how everyone else responds...
Friday, 18 January 2008
Saturday, 12 January 2008
Updating the backup process
When I upgraded from OpenSUSE 10.1 to 10.3, I took the decision to migrate away from XFS and use EXT3. Although XFS had served me well for many years, I felt the simplicity of using the "standard" Linux filesystem would make restores easier. For large filesystems with high throughput, XFS is fantastic, but for a home workstation I don't think there is much in it. If anything, EXT3 is actually more responsive while a backup is running.
In making the change, I had to modify my backup scripts to use dump (the ext2/ext3 equivalent of xfsdump) instead of xfsdump. This gave me the chance to revisit the LVM snapshot section I wrote a few years ago but could never get working reliably with the kernel supplied with 10.1 (it worked. Sometimes.).
The new script is very simple and does the following (a rough sketch of the whole thing appears after the list):
1) Takes a filesystem to dump and a dump level (0 = full, 1 = everything that has changed since the last level 0, 2 = everything that has changed since the last level 1, and so on).
2) Works out the underlying LVM logical volume the filesystem is built on.
3) Creates a snapshot of the logical volume and mounts it under /snapshot.
4) Performs a dump of /snapshot to my external (Firewire) hard disk.
5) Unmounts /snapshot and destroys the snapshot.
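To give a feel for it, the whole thing boils down to something like the sketch below. The snapshot size, mount point, device names and backup destination here are illustrative stand-ins rather than my exact values.

    #!/bin/bash
    # Sketch of the snapshot-and-dump backup described above.
    # Paths, snapshot size and destination are illustrative only.

    FS=$1            # filesystem to dump, e.g. /home
    LEVEL=${2:-0}    # 0 = full, 1 = changes since last level 0, and so on
    DEST=/media/firewire/$(basename "$FS")-level"$LEVEL".dump

    # work out which LVM logical volume the filesystem lives on
    LV=$(df -P "$FS" | awk 'NR==2 {print $1}')
    VG=$(lvs --noheadings -o vg_name "$LV" | tr -d ' ')

    # create a snapshot of the logical volume and mount it under /snapshot
    lvcreate --snapshot --size 1G --name backup_snap "$LV"
    mkdir -p /snapshot
    mount -o ro "/dev/$VG/backup_snap" /snapshot

    # dump the snapshot to the external Firewire disk
    # (-u records the dump in /etc/dumpdates so later incrementals have a baseline)
    dump -"$LEVEL" -u -f "$DEST" /snapshot

    # unmount and destroy the snapshot
    umount /snapshot
    lvremove -f "/dev/$VG/backup_snap"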
It appears to be working pretty well and provides what should be a very simple mechanism for ensuring my main workstation is backed up.
T's Vista PC is backed up to another external Firewire hard disk using the built-in Vista tools. This appears to do the job and happens automatically, so it's a fire-and-forget operation.
I'm not currently backing up the Mac. This shouldn't be a problem, because all my important files are stored on the network. It also seems that Time Machine has some limitations (no control over the backup schedule, local storage only, etc.). I'm guessing a future version of Mac OS X will fully support ZFS and will use ZFS snapshots for backups.
This leaves the "SilverShuttle", which is my VMware Server / Solaris server. This needs some form of backup, and I'm still chewing over the options. I'd ideally like a large external array with a bunch of disks managed by ZFS, but that doesn't solve my desire to keep an off-site copy of my data. Something for me to work on then...
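Just to jot down the sort of thing I have in mind, a handful of external disks pooled under ZFS on the Solaris box would look roughly like this (device names and layout are purely hypothetical):

    # pool four external disks into a raidz set called "backup"
    zpool create backup raidz c4t0d0 c4t1d0 c4t2d0 c4t3d0

    # one filesystem per machine being backed up
    zfs create backup/silvershuttle
    zfs create backup/workstation

    # sanity check
    zpool status backup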
Saturday, 5 January 2008
A solution to wireless hassles: Ethernet over mains
I have all my computers upstairs in our office (second bedroom), but the telephone enters the house downstairs in the lounge. For the last few years, I've put the ADSL router (Netgear 834G) in the corner and connected to the upstairs using a Wireless Ethernet Bridge (Netgear WGE101). This worked pretty well at first - until I started getting neighbours with wireless!
I first realised that those around me were getting wireless connections when my own LAN kept losing the router. This was quickly fixed by changing my channel to an unused one, but random connection problems persisted.
The other night I stared in bemusement as the wireless bridge claimed a 99% signal strength to the access point on my router but refused to connect. I remembered a conversation I had with a colleague a few weeks previously, in which he mentioned that he had installed Ethernet over mains power.
So after T finished watching an episode of Desperate Housewives, I flicked the switch in the fuse box labelled "sockets" and observed that all mains sockets went out - both downstairs and upstairs - indicating that I had a single ring main.
A few minutes later, everything was back (the computers never went down thanks to the UPS! :-)) and I was ordering the Devolo HomePlug Adapter "Highspeed Starter Kit". This arrived last night and consisted of two RJ45 cables and two adapters that plug into the wall socket and have an Ethernet port.
One went downstairs and I plugged the router into it, and the other went upstairs into a four-way expansion unit that was not connected to the UPS (I'm guessing the UPS filters the mains and could interfere with the Ethernet-over-mains signal, but I haven't tried it). A few seconds later, it was all working!
There is some software that can be used to set a custom encryption password for the mains side, and there is even a Linux version included on the CD, but I haven't bothered with it yet.
The units are rated at up to 85Mbit/s, which is still much faster than the 8Mbit/s bottleneck of the ADSL connection.
It was running all last night and so far, I am very impressed. The whole thing cost about £80 from Amazon and means I should no longer have to experience the woes of losing a wireless connection at home.
Tuesday, 1 January 2008
OpenSUSE 10.3 upgrade and DNS fun
Over the last couple of days I've been busy tweaking my network. I've updated my main workstation from OpenSUSE 10.1 to 10.3 (having missed 10.2 completely). The new version is pretty decent and shows the continuing refinement of Linux as a desktop operating system.
The GNOME "slab" menu is pretty decent and works well. The confusion over the different software management tools (basically YaST vs ZENworks) has been resolved with a new look and feel for the software management tool. I'm not as impressed with it as I was with the previous version, as the interface is less informative, but it does work.
Getting Compiz working (for wobbly-windows fun) was easy, although getting Compiz Fusion set up was more difficult. I experienced the unhelpful white screen when enabling XGL, but the Nvidia driver apparently does not need XGL for accelerated graphics, and the new whizzy effects are certainly pretty impressive (although the more advanced ones like Burn and Explode are a bit jerky on my old 5200FX graphics card).
With the number of virtual machines on my network increasing, I was getting tired of looking up IP addresses in /etc/hosts and decided to build myself a small DNS server in a Solaris zone. Setting up BIND is one of those things that I always think is hard, but whenever I do it, I surprise myself with how easy it is.
I opted to set up a dedicated zone because, well, the overhead is minimal and it provides a nice way of separating services. I built "dnsserver" as a sparse root zone, created /var/named and set up the zone files and /etc/named.conf. Now, to be honest, I did cheat a little by using the h2n script to generate the zone information automatically, but having done that I manually tweaked all the entries to get my A (address) and CNAME (canonical name) records set up correctly. A few more minutes spent reconfiguring my other machines to use the DNS server and my "local.zone" was up and running.
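For anyone wondering what that actually involves, a minimal named.conf plus zone file looks something like the following. The IP addresses and most of the host entries here are made-up examples rather than my real records.

    // /etc/named.conf (minimal sketch)
    options {
            directory "/var/named";
    };

    zone "local.zone" IN {
            type master;
            file "db.local.zone";
    };

    ; /var/named/db.local.zone
    $TTL 86400
    @               IN  SOA   dnsserver.local.zone. admin.local.zone. (
                              2008010101   ; serial
                              3600         ; refresh
                              900          ; retry
                              604800       ; expire
                              86400 )      ; negative cache TTL
                    IN  NS    dnsserver.local.zone.
    dnsserver       IN  A     192.168.1.10
    silvershuttle   IN  A     192.168.1.20
    media           IN  CNAME silvershuttle.local.zone.

On the Linux and Solaris machines, reconfiguring is essentially just a matter of pointing /etc/resolv.conf at the new server.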
I've also made tentative steps into the world of LDAP by installing Fedora Directory Server into a CentOS 5.1 virtual machine. Having completed the install, I tried to set up an LDAP user, but have so far failed to get this working. Something to pick up when I have more time.
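For when I come back to it: with Fedora Directory Server, adding a POSIX user generally comes down to loading an LDIF entry along these lines with ldapadd. The suffix, uid and attribute values below are placeholder examples, not what I actually tried.

    # user.ldif (illustrative example only)
    dn: uid=testuser,ou=People,dc=example,dc=com
    objectClass: top
    objectClass: person
    objectClass: organizationalPerson
    objectClass: inetOrgPerson
    objectClass: posixAccount
    cn: Test User
    sn: User
    uid: testuser
    uidNumber: 1001
    gidNumber: 100
    homeDirectory: /home/testuser
    loginShell: /bin/bash

    # load it into the directory (cn=Directory Manager is the default FDS admin account)
    ldapadd -x -H ldap://localhost -D "cn=Directory Manager" -W -f user.ldif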
I'm still working on consolidating all my documents onto a single file server, possibly with an iTunes server for sharing music. I'll update when I've done more...