Sunday, June 17, 2007

Nerd Food: Ubuntu in the Real World

After raving and ranting about Ubuntu so many times, I decided it was time to put it to the test in really demanding conditions. And there are no conditions more demanding than those set by children, particularly when they are nearing their teens. So it was that I installed Feisty on two machines and gave them to my nephews. After 24 hours, the experiment had already produced some interesting results.

The two machines in question are as follows:
  • Machine A is a 64-bit AMD with an NVidia graphics card and 512 MB of RAM (less than 4 years old);
  • Machine B is a 32-bit AMD with an ATI graphics card and 512 MB of RAM (around 6 years old).
The installation on both was pretty straightforward, with all hardware detected. Additional software was easily installed via Synaptic. After this initial ease, though, the problems started to appear.

Problem one is related to the well-known issues with ATI cards. Enabling restricted drivers doesn't seem to do anything at all (it finds none suitable for the card), and enabling desktop effects without binary drivers results in some kind of video corruption (there's a 5 centimetre area on the right side of the screen that doesn't appear to be used). Even if I had managed to get the ATI card going with binary drivers - which I doubt, as this particular card is really old and unsupported - I would still have had to set up XGL in order to use Compiz, which is really not something the average user should have to do. On the positive side, NVidia support is brilliant. All I had to do on machine A was enable the restricted driver and turn on desktop effects. Compiz was up and running in no time.

Problem two is related to Flash and 64-bit. There is no Flash plug-in available for Firefox/Epiphany on amd64, and Flash is a vital element of the browsing experience for any young kid these days. I could have gone and installed some less standard Flash support, but again, this is well beyond the call of duty for a normal user.

Problem three was DVD playback. I never quite got Totem-GStreamer playing encrypted DVDs. I'm not sure if it's me being thick and not understanding how to configure libdvdcss for GStreamer, but regardless of the underlying causes, my solution for this problem has always been to install Totem-Xine. In addition, I never quite got subtitles working with Totem-Xine, so I play all movies requiring subtitles from xine-ui (and there, for some reason, I never managed to increase the subtitle font size, but at least the subtitles are there). None of these steps makes sense to the average user.

At this point I had the machines sorted out and ready for my two eager customers. The first one was impressed with her desktop for a few moments, until she realised I hadn't installed The Sims. Now, as far as games go, she isn't much of a gamer. In fact, in the last couple of years she has only bought two PC games: The Sims and a The Sims expansion pack. So yes, this was a vital requirement for the experiment.

Problem four: the dreaded Sims. The installer worked well enough - as well as one could expect even on a Windows box - up to the point where it finished CD 1. You are then expected to insert CD 2 and continue the installation. Alarm bells started ringing at this point. You see, UNIX and Linux have a very special relationship with devices: when you mount a CD and start running applications from it, it's nigh impossible to unmount it until you close all the open files you have on that device. In my particular case I had cd'd to /media/cdrom to run wine start.exe. This basically meant I could not unmount the cdrom properly until I exited from wine, but I couldn't really exit from wine until I got CD 2 mounted. I was about to restart the whole process when it occurred to me that even if I hadn't cd'd into that directory, wine would still have start.exe as an open file (it needed to run it, of course), which meant I would not be able to unmount the device. I'm not sure about this, but logic seems to imply that wine cannot cope with installing programs that span two or more discs because of the underlying UNIX mentality. I hope some wine person will prove me wrong.
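The "device is busy" behaviour is easy to inspect, by the way. Below is a minimal stand-in for `fuser -m`, assuming a Linux /proc filesystem: it lists every process whose working directory or open file descriptors sit under a given mount point - exactly the processes that stop `umount` from succeeding.

```shell
# List PIDs holding a mount point busy (sketch; Linux /proc assumed).
holders() {
    mnt="$1"
    for pid in /proc/[0-9]*; do
        # Is the process's working directory inside the mount?
        case "$(readlink "$pid/cwd" 2>/dev/null)" in
            "$mnt"|"$mnt"/*) echo "${pid#/proc/}"; continue ;;
        esac
        # Does it have any file open inside the mount?
        for fd in "$pid"/fd/*; do
            case "$(readlink "$fd" 2>/dev/null)" in
                "$mnt"|"$mnt"/*) echo "${pid#/proc/}"; break ;;
            esac
        done
    done | sort -un
}
```

Running `holders /media/cdrom` during the installation would have shown both my shell (via its cwd) and wine (via the open start.exe) pinning the device.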

The temporary solution was to copy the contents of disc 1 to a local folder and execute start.exe from there. This worked a treat until I got to disc 2, where, for some reason, the installer refused to recognise the disc. Because the installer gives you absolutely no clue as to where it is looking for disc 2, I couldn't tell whether there was something wrong with the CD or whether it was just looking in the wrong place. After much fiddling it occurred to me that the installer was probably looking for disc 2 in the local directory. However, disc 2 had a similar structure to disc 1 (same setup directory), which meant I couldn't just copy it over disc 1. My final solution was to rename disc two's setup to setup2 while installing disc one, and then to rename setup2 back to setup when the installer asked for disc 2. This perverted experiment actually produced the expected results and the installation completed successfully.
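For what it's worth, the rename dance can be scripted. This is a sketch only - the real discs would be mounted under something like /media/cdrom, and the directory names are assumptions based on my discs:

```shell
# Stage both discs into one local folder, keeping disc 2's "setup"
# under a temporary name so it doesn't clobber disc 1's (sketch).
stage_discs() {
    disc1="$1"; disc2="$2"; target="$3"
    mkdir -p "$target"
    cp -R "$disc1/." "$target/"           # disc 1, verbatim
    cp -R "$disc2/setup" "$target/setup2" # disc 2's setup, renamed
}

# When the installer asks for disc 2, swap the directories over.
swap_to_disc2() {
    target="$1"
    mv "$target/setup" "$target/setup.disc1"
    mv "$target/setup2" "$target/setup"
}
```

With both discs staged locally, `wine start.exe` runs entirely from the copy, side-stepping the unmount problem altogether.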

However, the problems were far from over. After all that pain, the sims.exe binary simply refused to launch: it would start and then do nothing. The problem appears to be as reported in the wine DB here and here. There isn't much I can do until wine support for The Sims improves. As one would expect, this did not please my user at all.

Problem five: the crash. Just when I thought everything was quiet and everyone was happy enough, it all collapsed in front of me. The two kids were happily playing with the machine when the electricity suddenly went off. On reboot, they were stuck at the fsck screen, at which point they proceeded to randomly press keys and switch the computer on and off, probably several times, in a desperate attempt to fix it. Apparently this technique worked quite well with their previous box running Windows 95.

This, methinks, is one fundamental problem with the current Ubuntu boot. If some operation takes too long, you are thrown back into the console with huge amounts of text output. This is fine for a nerd but an absolutely scary experience for a normal user. It would be much better to have some kind of user interface with a massive warning saying "DO NOT SWITCH THE COMPUTER OFF" or something like that. Text mode just scares people off, and when it is compounded with things like "running fsck" there is absolutely no hope of survival. The end result of all this was that the machine was rendered unbootable by the time I got there. Yep, you read that right. The first time it went all the way to fsck again and froze; the second time it froze at grub, the exact same point at which it froze for the subsequent twenty reboots.

So here I am at the local Internet cafe, downloading an Ubuntu amd64 ISO image, preparing myself to re-install it yet again. There are days...


Monday, June 04, 2007

Nerd Food: Merging and Branching Procedures

It seems version control is a popular topic again, thanks to the ever courteous Linus. If you haven't seen the talk he gave at Google, do watch it as it's quite interesting. Linus, in his usual so-offensive-it's-funny style, criticises SVN to death. I have to say that I quite like SVN, perhaps because I've been forced to use ClearCase, SourceSafe, RCS and CVS for far too long. My only complaint with SVN has always been the terrible merging, something that Linus rightly criticises in his talk. The good news is that it appears the most severe problems with merging will be fixed in the next SVN release.

Linus' talk did make me more aware of distributed version control though, but I'm not entirely convinced it would work in a commercial software house. After all, we already have a hard time with branches - let alone having multiple repositories...

All this talk about version control reminded me of a set of procedures for merging and branching I once wrote. I can't take all the credit, of course, since my good friends Kevin and Chris - the ClearCase genius - fixed some mistakes and added important bits. Here is the procedure, in the hope that someone else may find it not entirely without merit. Apologies for the (lack of) indentation.

Merging and Branching

1. Trunk (HEAD/LATEST/[insert version control term here]) is always compilable and stable.

2. When a new project is started, a set of branches is created:
2.1. Integration branch: off trunk;
2.2. A branch for each developer (or whatever granularity is required - one branch per two developers, etc.): off the integration branch;
2.3. All branches should be named sensibly and follow a repository-wide convention (e.g. PRODUCT_TEAM_PROJECT or something equally meaningful).
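In SVN terms, step 2 amounts to a few svn copy operations. The sketch below is runnable against a throwaway local repository (svnadmin and file://, so no server is needed); the repository path, project name and developer names are all illustrative.

```shell
# Create the branch structure of step 2 against a scratch repo (sketch).
svnadmin create /tmp/demo-repo
REPO=file:///tmp/demo-repo
PROJECT=WIDGETS_CORE_BILLING   # PRODUCT_TEAM_PROJECT convention (2.3)

svn mkdir --parents -m "initial layout" "$REPO/trunk" "$REPO/branches"

# 2.1: integration branch, off trunk
svn copy -m "integration branch" "$REPO/trunk" "$REPO/branches/${PROJECT}_INT"

# 2.2: one development branch per developer, off integration
for dev in alice bob; do
    svn copy -m "dev branch for $dev" \
        "$REPO/branches/${PROJECT}_INT" "$REPO/branches/${PROJECT}_DEV_${dev}"
done

svn ls "$REPO/branches"
```

Because branches in SVN are cheap copies, creating one per developer costs next to nothing in repository space.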

3. Each developer works on his/her own development branch. Developers must check in at least every day (so a workstation never holds the only copy of a change), but are encouraged to do so more frequently.

4. When the developer is happy enough with his/her changes, he/she "rebases" (aka forward merges), that is:
4.1. Updates the development branch to the current state of the integration branch (this should be done as often as possible anyway);
4.2. Ensures no one else is merging to the integration branch;
4.3. Tests development with the new changes (in theory runs his/her [J|N]Unit tests; in practice, well... :-);
4.4. "Merges back", that is, updates the integration branch to the state of the development branch;
4.5. Features should be merged one at a time: if a developer is working on 5 features for a given release, he/she should merge each feature to integration separately, allowing other developers to pick up and test each change rather than one huge patch.
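The rebase / merge-back cycle of step 4 maps directly onto SVN's merge tracking (the very feature the upcoming release is meant to improve; the syntax below is that of modern SVN, 1.8 or later). Everything here - repository, branch names, files, commit messages - is illustrative, and the throwaway file:// repository makes the sketch runnable end to end.

```shell
# Demonstrate 4.1 (rebase) and 4.4 (merge back) on a scratch repo (sketch).
svnadmin create /tmp/merge-demo
REPO=file:///tmp/merge-demo
INT="$REPO/branches/PRJ_INT"
DEV="$REPO/branches/PRJ_DEV_alice"

svn mkdir --parents -m "layout" "$REPO/trunk" "$REPO/branches"
svn copy -m "integration branch" "$REPO/trunk" "$INT"
svn copy -m "alice's dev branch" "$INT" "$DEV"

# Someone else lands a change on integration...
svn checkout -q "$INT" wc-int
echo "shared fix" > wc-int/shared.txt
svn add -q wc-int/shared.txt
svn commit -q -m "shared fix" wc-int

# 4.1: rebase -- pull integration's latest changes into the dev branch.
svn checkout -q "$DEV" wc-dev
(cd wc-dev && svn merge "$INT" . && svn commit -q -m "rebase from integration")

# ...the developer finishes a feature on the now up-to-date branch...
echo "feature" > wc-dev/feature.txt
svn add -q wc-dev/feature.txt
svn commit -q -m "feature X" wc-dev

# 4.4: merge back -- bring the tested feature into integration.
(cd wc-int && svn update -q && svn merge "$DEV" . \
    && svn commit -q -m "merge back feature X from alice")

svn ls "$INT"
```

Note that the merge back is done from a fully updated integration working copy; SVN's merge tracking records what has already flowed each way, which is what makes repeating this cycle per feature (4.5) bearable.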

5. While the project is in development, the integration branch may be rebased from the trunk, but never the opposite (see small bugfixes below).

6. When the project enters development testing (feature freeze):
6.1. All developers rebase from integration and merge back to integration as described above;
6.2. Developers test the current state of the integration branch (normally this means validating the functionality they've coded). Integration branch is by now equal to dev branches;
6.3. Bugfixes are applied to development branches, and rebased/merged back to integration (iterate till right).

7. When integration branch is ready for a release to QC (UAT):
7.1. Release branch is created with unique release number, off integration branch. (i.e. release is "labeled", but this is equivalent to branching in SVN). All dev branches are locked;
7.2. Release is shipped to QC and release branch is locked;

7.3. If release passes QC, ship it. If release needs another spin:
7.3.1. Bugfix branch is created with the bug ticket number and the version number. This is off the integration branch;
7.3.2. Bugfix is made, tested in bugfix branch and rebased back to integration;
7.3.3. When all bugfix branches have been merged in, integration branch is dev tested;
7.3.4. New release branch with release number is created and tested. rinse, repeat;
7.3.5. If required, a "special" release branch is created for the final release so we can distinguish between release candidates and final release.

8. When a release passes QC and is shipped:
8.1. No one is allowed to merge back to trunk (this has to be serialized across all teams using the trunk and must be done asap after the release);
8.2. Integration branch, which at this point is identical to the final release branch, is rebased off trunk;
8.3. Integration branch is tested (QC should get involved);
8.4. Integration branch is merged back into the trunk;
8.5. At this point the release is complete.

9. Small bugfixes:
9.1. Branch off trunk with version number and bug ticket number (just an integration branch will do, no need for dev branches);
9.2. Do bugfix in integration branch;
9.3. Dev test integration branch;
9.4. Create release branch off integration branch;
9.5. QC release branch;
9.6. Ship;
9.7. Rebase / merge back to trunk.

10. Ideally, once all the merging is done, branches should be deleted IF the version control system keeps the history of the merged files. This greatly reduces clutter (i.e. you can actually find a branch when looking for it), makes the repository smaller and improves the performance of a few operations.

11. Ideally you should be running something like CruiseControl on selected branches (such as HEAD and integration).

Saturday, June 02, 2007

Nerd Food: On Ubuntu, DELL and the Playstation 3

Unlike many ubunteros, I'm not entirely pleased with the DELL "victory". I mean, I was initially, but reading the small print made me cringe uncontrollably. As a quick summary, for those not following the latest developments: DELL set up a suggestions website called IdeaStorm, which was quickly swamped with "I Want Linux" comments. At first DELL thought they wouldn't be able to pick and choose from the myriad of available Linux distros, but eventually someone upstairs concluded that Ubuntu was popular enough. As a result, DELL now has a limited range of models with Ubuntu pre-installed (a better description is available here).

While the idea is a good one in principle, the devil is in the details. These machines will not be able to play any of the codecs modern users require, and there will be no DVD support (as in, you can't play movies on encrypted DVDs). The comments are very telling of the Linux attitude towards codecs: "they are not free!", "you'd have to install them on windows anyway!", "it's not that hard!".

I've criticised this attitude in the past and will have to do so again. Just because Microsoft, the biggest desktop company in the world, can get away with things, it doesn't mean that all aspiring desktop wannabes can do it too. The Mac is gaining market share because their stuff "just works" - or at least, it's perceived as such by everyone. We're not trying to be like Microsoft; we're trying to improve on them. I'm not a strategist, but it seems obvious that DELL and Ubuntu should have talked with Fluendo before embarking on this adventure and made sure the full range of codecs was available as standard. This would have been great for all parties involved: Fluendo would have agreed to a massively discounted price, still a rather rewarding proposition due to the potential volume. Codeweavers could also have had a piece of the pie, since software such as iTunes is popular with the crowds. This would have been a more challenging proposition a) because Microsoft seems to dislike Wine quite a lot (and for all we know explicitly asked DELL not to include it) and b) because as people start installing random Windows software, the support load would grow beyond DELL's capacity.

I know Ubuntu has tried to make configuration of codecs and restricted drivers easier, but to be absolutely honest, both failed when I tried to use them. The technology does not appear to be entirely mature yet. Now, if the same happens to the new converts, they will most likely say that "Linux does not work". This was a great chance to woo new users with the beauty of Compiz (if not Beryl), GStreamer et al, but I cannot help thinking that a lot of new buyers will end up giving up on Linux because they won't get the whole "configuration" thing. And it's not because Windows is easier to configure; it's because Linux is configured differently, and the 50 or so USD you save are not enough to compensate for the time needed to learn a new way of doing things.

Which brings me neatly to my next topic. Even more important than DELL is the Playstation 3. There are over 6 million PS3s out there. Some estimate Ubuntu to be installed on 6 to 20 million computers worldwide, so adding 6 million to that number would have a major impact. And the relationship would be entirely symbiotic, since Sony managed to price the PS3 out of the console range and into the low-end PC range; it is possible to buy a DELL model, including a TFT monitor, for around the same price as a PS3 - which is, of course, monitorless. I personally wanted to get a PS3 and use it as a PC, but was not amused when I found out that much of its functionality doesn't work under Linux (including accelerated graphics and wireless, plus problems with sound - and let's not forget that Flash seems to be 32-bit x86 only at the moment). Not only that, but the entire installation process is non-trivial, meaning that only die-hard ubunteros are going to go for it.

Now, you tell me: if you were a manager at Sony, would you not have started talking to Linux vendors long before the PS3 was due to launch, to ensure Linux would be 100% compatible with your hardware? And would you not select a Linux vendor and pre-install the distro? After all, many console users are not IT savvy; they see the console as yet another "white good" in their house. If not, ask yourself: what is the point of buying a "'Computer', Not A Console", as Sony's CEO put it, if it has no decent general purpose software on it?

It's hard not to feel that we've wasted two great opportunities to fight for market and mind share.

Update: check this for some pics of the setup of a new DELL laptop.