Saturday, November 8, 2008

Linux Software Installation - Either Easy or Painful

I'm certainly not the first to have an opinion about Linux packaging and software installation. Ian Murdock, among others, has had something to say about it, and his points are well taken.

I ended up installing Ubuntu Intrepid today, tossing out Debian Etch, just so I could get a recent Anjuta through the apt packaging system. Ubuntu had version 2.24, and Debian was still at 1.x in Etch (Lenny does have the latest, but considering how many other packages get installed, I didn't want to use a distribution in a testing state).

I spent most of the waking hours of one weekend day trying to build Anjuta 2.24 from source on Debian Etch. I'm not too bad at the process...I've done it a lot, and a decade or more ago that was basically what you had to do on Linux anyhow. This particular effort was bad, though - once you're in a tarball situation, you'll find that many of your prerequisites also become tarball builds. If your distro's package for application/library A is too old, then the distro packages for most of A's dependencies are probably too old to build the latest source of A as well.
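For anyone who hasn't been through it, the cycle for each tarball looks roughly like the following. This is just a sketch - the configure flags and package names vary, and I'm assuming everything gets installed under /usr/local (libfoo is a stand-in for whatever prerequisite is next):

tar xzf libfoo-1.9.0.tar.gz
cd libfoo-1.9.0
./configure --prefix=/usr/local   # often stops here, naming the next missing or too-old prerequisite
make
sudo make install
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig   # so the next configure run finds what you just built

Then repeat for every prerequisite it complains about, and for theirs.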

I did get a good ways into it, installing gtk+, Cairo, Pango, ATK, the latest GLib, and about fifteen other prereqs - sometimes down 3 or 4 levels in the dependency tree. I had to keep very careful track on paper to remember where I was in the game of what needs what. I also had to do some patching, lots of Googling, and even some source code fixes of my own.
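Most of that bookkeeping comes down to asking pkg-config what you already have versus what the next configure script demands. Something like this, assuming the relevant .pc files are on your PKG_CONFIG_PATH (the GLib version here is just an example):

pkg-config --modversion glib-2.0
pkg-config --atleast-version=2.18 glib-2.0 && echo "new enough"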

I gave up when two things happened. One, in one particular area of my dependency tree, it sure looked like tarball X needed an installed Y, and tarball Y needed an installed X. Two, after patching the source of ORBit2 once, it still wouldn't compile, and I was already mentally rejecting the entire idea of building a CORBA ORB just so I could have a nice IDE.

Some research had already indicated that moving over to Ubuntu 8.10 would let me apt-get install Anjuta, and so it was. The total time spent downloading and burning the Ubuntu ISO, installing Ubuntu, setting up my WUSB600 wireless adapter with ndiswrapper, re-installing Java 6, NetBeans 6.1, GHC 6.8.2, J 6.0.2, and ntfs-3g, and installing Anjuta 2.24 was perhaps three hours, and could have been two if the package repository hadn't been so slow.
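The Anjuta part of that was the easy bit. Assuming the universe repository is enabled, it's just:

sudo apt-get update
sudo apt-get install anjuta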

Point being, things are pretty bad when a body is willing to change their Linux distro just to avoid building from a tarball with umpteen dependencies.

Murdock states that when you can locate a recent version of your desired application or library in your distro, you're laughing. Well, no argument from me.


A little tip for novices wanting to pass an environment variable when using sudo...you'll often want to sudo when installing from a .bin or .sh, so that the installation goes to /usr/local rather than somewhere under your $HOME:

sudo JAVA_HOME=/usr/local/jdk1.6.0_10 ./netbeans-6.1-ml-linux.sh

Another little tip. To get the .bin script for JDK on Linux to install where you want it to go, cd to the target directory before running the script. This is actually in the Sun installation documents, which all of us read of course. :-)
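In other words, something along these lines (the exact file name depends on which JDK download you have; the paths here are just examples):

cd /usr/local
sudo sh /tmp/jdk-6u10-linux-i586.bin   # unpacks into /usr/local/jdk1.6.0_10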

The obvious problem - the central problem - is dependencies. This is a problem for package systems as well as tarballs. There are two main issues here, as I see it. First off, while I'm cool with modular development, I'm not cool with Swiss Army apps that require 15 or 20 other programs and libraries. Some dependencies are clearly acceptable...others should be optional (many more than is currently the case). I'm specifically talking about absolute dependencies. And from time to time, maybe re-inventing the wheel in one or two source files is preferable to requiring someone else's library, which in turn requires four more. If you simply must use someone else's code, choose something that fits the bill but minimizes the agony for end-users.

What would be really sweet is if developers cooperated on more public APIs. Let's say that every C/C++ implementation of XSL-FO provided support for Version 1.3 of the C/C++ XSL-FO Processor API, perhaps by providing adapter classes or functions. That way a programmer who needed XSL-FO support in his application could just code to the API header files, knowing that any compliant XSL-FO processor could be chosen by the end-user.

Second, developers tend to use the latest version of everything. What Java developer do you know who develops new stuff against JDK 1.4? Not one who has a choice about it. Similarly, C/C++ developers build against the latest version of every library they need. An end-user may already have version 1.8.2 of libfoo, but be forced to install version 1.9.0 for no good reason at all...the older version may have been fine.

I recognize that asking developers to examine, build with, and test against earlier versions of required libraries is extra work, but it cuts down on the aggro for end users. In a number of cases it may get someone to actually use the program rather than abandoning the build process in disgust, or sticking with an earlier version they can install through a packaging system.

In a future related post I'll have more to say about public APIs.
