Sunday, November 16, 2008

Page Controllers and Front Controllers

There is a variety of opinion as to whether JSF uses a page controller pattern or a front controller pattern. I've personally managed to get by using the technology without knowing the answer to this burning question, but I confess to having got interested despite myself. After failing to locate a site that analyzed the matter to my satisfaction I decided to figure it out for myself.

Note that both patterns are specific controller implementation strategies within MVC. To refresh your architectural understanding, Model View Controller (MVC) is a way of managing interaction between a user of a system and the data and behaviour of the system. The Model is responsible for the data and behaviour; the Controller interprets user inputs; the View manages display of information. Both the Controller and the View depend on the Model.

For starters, here are some definitions of the Page Controller pattern:

Martin Fowler - An object that handles a request for a specific page or action on a Web site. Either the page itself or an object corresponding to it;

Microsoft - each dynamic Web page is handled by a specific controller. Possible use of controller base class.

Sun - apparently doesn't admit that such a thing exists in the J2EE world.

And here are some definitions of the Front Controller pattern:

Martin Fowler - A controller that handles all requests for a Web site;

Microsoft - single controller coordinates all of the requests that are made to the Web application. The controller itself is usually implemented in two parts: a handler and a hierarchy of commands.

Sun - The controller manages the handling of the request, including invoking security services such as authentication and authorization, delegating business processing, managing the choice of an appropriate view, handling errors, and managing the selection of content creation strategies.
Note: Sun also mentions typical ways of decomposing a complicated front controller.

Fowler mentions Application Controller in the context of front controllers: A centralized point for handling screen navigation and the flow of an application. Sun Java architects (for example, Alur, Crupi & Malks) mention the use of an application controller in conjunction with a front controller as well.

Now, what does JSF do? Well, no question but that the FacesServlet intercepts all requests. However, if you peruse the source code for a typical FacesServlet implementation class, it doesn't do much: it creates the Lifecycle and FacesContextFactory instances in init(), and then calls execute() and render() on that Lifecycle instance in the servlet's service() method, obtaining a new FacesContext instance for each request to pass to execute() and render().
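To make that concrete, here's a condensed sketch of what I mean. This is from memory rather than the RI source - for one thing, the real FacesServlet implements Servlet directly, whereas I extend HttpServlet to keep the sketch short - but the shape is right:

import java.io.IOException;

import javax.faces.FactoryFinder;
import javax.faces.context.FacesContext;
import javax.faces.context.FacesContextFactory;
import javax.faces.lifecycle.Lifecycle;
import javax.faces.lifecycle.LifecycleFactory;
import javax.servlet.ServletConfig;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A from-memory approximation of a FacesServlet implementation;
// the real thing adds error handling and a few more wrinkles.
public class SketchFacesServlet extends HttpServlet {

    private FacesContextFactory facesContextFactory;
    private Lifecycle lifecycle;

    public void init(ServletConfig servletConfig) throws ServletException {
        super.init(servletConfig);
        // Created once: the factory for per-request contexts, and the
        // lifecycle object that drives the six JSF phases.
        facesContextFactory = (FacesContextFactory)
                FactoryFinder.getFactory(FactoryFinder.FACES_CONTEXT_FACTORY);
        LifecycleFactory lifecycleFactory = (LifecycleFactory)
                FactoryFinder.getFactory(FactoryFinder.LIFECYCLE_FACTORY);
        lifecycle = lifecycleFactory.getLifecycle(LifecycleFactory.DEFAULT_LIFECYCLE);
    }

    protected void service(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // A fresh FacesContext per request...
        FacesContext context = facesContextFactory.getFacesContext(
                getServletContext(), request, response, lifecycle);
        try {
            lifecycle.execute(context); // Restore View through Invoke Application
            lifecycle.render(context);  // Render Response
        } finally {
            context.release();
        }
    }
}

That's it - everything interesting happens behind execute() and render().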

What actually does the navigation - assuming you haven't customized this - is the default NavigationHandler. Even then, it is just acting on the outcome of an action and on the rules defined in faces-config.xml. So some of your controller logic is embodied in the navigation rules and cases...the rest of it resides in the code you write in your managed beans, because that's what actually generates the outcomes acted upon by the navigation handler.
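To make the division of labour concrete, here's a sketch in which the bean, pages, and outcomes are all invented for illustration. An action method on a managed bean computes an outcome string, and a navigation rule in faces-config.xml tells the default NavigationHandler which view that outcome leads to:

// LoginBean.java - registered in faces-config.xml as "loginBean"
public class LoginBean {

    private String userName;

    public String getUserName() { return userName; }
    public void setUserName(String userName) { this.userName = userName; }

    // Referenced from a page as action="#{loginBean.login}". The returned
    // outcome string is what the NavigationHandler matches against the
    // navigation cases below.
    public String login() {
        return "admin".equals(userName) ? "success" : "failure";
    }
}

<!-- faces-config.xml (fragment) -->
<navigation-rule>
  <from-view-id>/login.jsp</from-view-id>
  <navigation-case>
    <from-outcome>success</from-outcome>
    <to-view-id>/home.jsp</to-view-id>
  </navigation-case>
  <navigation-case>
    <from-outcome>failure</from-outcome>
    <to-view-id>/login.jsp</to-view-id>
  </navigation-case>
</navigation-rule>

Whether you call that bean a page controller or a fragment of an application controller is precisely the question.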

Now, is a managed bean - specifically one that hosts methods referenced in page component action attributes - a page controller, or is it a decomposed chunk of a conceptual application controller supporting the FacesServlet front controller? There is no shortage of authors who argue that JSF follows a page controller pattern, precisely because so much of the dynamic navigation is contained in objects that often have nearly a one-to-one correspondence with pages.

Nevertheless, we're faced with one inescapable fact - each request to a JSF page goes to FacesServlet, and the flow of execution is through a stack of other JSF classes. In particular, there is no per-page controller class that the server runs for each page (which is the default ASP.NET Web Forms model). So I would classify JSF as definitely using a Front Controller - it just so happens that identifying FacesServlet as being the controller is misleading, albeit technically accurate. It's really FacesServlet plus a whack of other stuff.

As an aside, the ASP.NET MVC framework uses the Front Controller pattern. The routing subsystem is sort of like FacesServlet, and then there are a bunch of user controller classes that act on requests that are routed to them. I consider this relatively new offering from Microsoft well worth checking out.

Sunday, November 9, 2008

Ubuntu, Subversion, Apache and Blogs

The title says it all...you Googled because you've got Ubuntu, and you're trying to get Subversion working with Apache, likely Apache2. This probably also isn't the first blog or article that you stumbled across. You may already be in some state of frustration by now, particularly if - like me - it's the first time you encountered the non-standard way that Apache2 is configured on Debian and Ubuntu.

I have no intention of spelling everything out in detail; I am just going to indicate what worked for me. Also, my goal was merely to get Subversion working over HTTP, with basic password authentication...nothing fancy. If something I did conflicts with other blogs, I'll point it out and explain why. My primary objective was to get a working setup - I can tweak it later. Once you know that something works, more or less, you're 90% there.

To cut to the chase, before I discovered that Ubuntu configures Apache 2.x in a non-standard way, I messed up the installation...both in /etc/apache2 and also in /usr/local/apache2, because I'd started out by building Apache from a tarball, which is what I'm used to on other platforms. If you end up in the same fix, ruthlessly prune the above directories, and use Synaptic or another apt GUI to zap every package related to Apache2. Maybe even completely remove all Subversion-related packages (the "Mark for complete removal" approach). Start with a clean slate, in other words.

Here's what Debian/Ubuntu does for Apache2 setup. The /etc/apache2 directory is where stuff happens. The central file is apache2.conf - this looks like the httpd.conf you are familiar with. In fact there is a /etc/apache2/httpd.conf...it starts out empty, and for the purposes of this discussion you won't need it.

apache2.conf includes, among other things, all *.conf and *.load files in /etc/apache2/mods-enabled. The former have configuration directives associated with the LoadModule directives found in the corresponding *.load files. apache2.conf also pulls in VirtualHost definitions in /etc/apache2/sites-enabled. The ports.conf file defines port numbers. The user/group for Apache2 is set in the envvars file.
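From memory, the relevant lines of apache2.conf look roughly like this - your stock file will have comments and a few more directives besides, and possibly a different ordering:

# /etc/apache2/apache2.conf (excerpt)
Include /etc/apache2/ports.conf
Include /etc/apache2/mods-enabled/*.load
Include /etc/apache2/mods-enabled/*.conf
Include /etc/apache2/sites-enabled/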

If starting from scratch, just use apt-get (with sudo if necessary) to install apache2, then subversion, then libapache2-svn. If (it happened to me) you get complaints during the libapache2-svn package install about dav_svn not being found, you may discover that you have no dav_svn.conf and dav_svn.load in /etc/apache2/mods-available. A simple fix for this is just to create them yourself - both files appear below, after the install commands.
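For reference, the installs themselves (these package names are as I found them on Intrepid):

sudo apt-get install apache2
sudo apt-get install subversion
sudo apt-get install libapache2-svn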

dav_svn.conf:


<Location /test>
  DAV svn
  SVNPath /var/svn/test
  AuthType Basic
  AuthName "Subversion Repository"
  AuthUserFile /etc/apache2/dav_svn.passwd
  <LimitExcept GET PROPFIND OPTIONS REPORT>
    Require valid-user
  </LimitExcept>
</Location>


dav_svn.load:

LoadModule dav_svn_module /usr/lib/apache2/modules/mod_dav_svn.so
LoadModule authz_svn_module /usr/lib/apache2/modules/mod_authz_svn.so

Note that I've set this up for a single Subversion repository located at /var/svn/test, to be accessed with

http://<hostname>:<port>/test/

Re-install libapache2-svn if you had to create these files for the aforementioned reason. Hopefully it will succeed.
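apt-get has a --reinstall option for exactly this situation:

sudo apt-get install --reinstall libapache2-svn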

Start or restart your server with

sudo /etc/init.d/apache2 start

(substitute restart for start if it's already running).

At this stage you ought to see your "It Works!" page at http://localhost/. If so, create your Subversion repository if necessary...in any case edit /etc/apache2/mods-available/dav_svn.conf to reflect the repository's actual location and the path you want to use to access it. Also, create a user with htpasswd (note that the -c flag creates the password file, so drop it when adding subsequent users), as in

sudo htpasswd -cm /etc/apache2/dav_svn.passwd myuser

Ensure that your repository is owned (chown -R) by www-data (the user set up in envvars), and that the permissions are appropriate.
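Putting the repository steps together for the path used in dav_svn.conf above, it boils down to something like this (adjust paths to your own layout):

sudo mkdir -p /var/svn
sudo svnadmin create /var/svn/test
sudo chown -R www-data:www-data /var/svn/test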

Note: I didn't even touch anything to do with virtual hosts. My successful setup is working with the 000-default virtual host...no others are defined, and I didn't edit the configuration file associated with the default host. Some blogs suggest that you must mess around with virtual hosts to get Subversion running with Apache2 on Ubuntu/Debian - that's not the case. I'm not suggesting that you shouldn't set up a virtual host to access your repo with - fill your boots. But at this point you probably just want to do Subversion things over HTTP and worry about details later.

Another note: if you do create virtual hosts, don't forget /etc/hosts.
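For instance, if you were to define a name-based virtual host called svn.example.local (a name invented for illustration), the machine itself would want a line like this in /etc/hosts:

127.0.0.1    svn.example.local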

I also didn't mess around with SSL. Once you get this far - in my experience - it's no big deal. In any case I don't need it...my repository is a private one for testing. The ones I really manage require users to be on a private network before they ever contemplate accessing a Subversion server.

One main point I'd like to make: don't trust blogs, including this one. It's not that blog writers are trying to make your life miserable. But we forget that critical step (or all three of them) that really made the evolution succeed. We also forget that something often works in more than one way. Finally, we omit mention of things that are obvious to us, but not to others. Was the fact that Ubuntu configured Apache 2.x differently obvious to me? Hell no. And a lot of blog writers didn't say anything about that at all.

Saturday, November 8, 2008

Linux Software Installation - Either Easy or Painful

I'm certainly not the first to have an opinion about Linux packaging and software installation. Ian Murdock, among others, has had something to say about it, and his points are well taken.

I ended up installing Ubuntu Intrepid today, tossing out Debian Etch, just so I could get a recent Anjuta through the apt packaging system. Ubuntu had version 2.24, and Debian was still at 1.x in Etch (Lenny does have the latest, but considering how many other packages get installed, I didn't want to use a distribution in a testing state).

I spent most of the waking hours of one weekend day trying to build Anjuta 2.24 from source on Debian Etch. I'm not too bad at the process...I've done it a lot, and a decade or more ago that was basically what you had to do on Linux anyhow. However, this particular evolution was bad - once you're in a tarball situation, you'll find that many of your prerequisites also become tarball builds, mainly because if your distro's package for application/library A is too old, then the packages for most of A's dependencies are probably too old as well, at least as far as building the latest source for A is concerned.

I did get a good ways into it, installing gtk+, Cairo, Pango, ATK, the latest GLib, and about fifteen other prereqs - sometimes down 3 or 4 levels in the dependency tree. I had to keep very careful track on paper to remember where I was in the game of what needs what. I also had to do some patching, lots of Googling, and even some source code fixes of my own.

I gave up when two things happened. One, in one particular area of my dependency tree, it sure looked like tarball X needed an installed Y, and tarball Y needed an installed X. Two, after patching the source of ORBit2 once, it still wasn't compiling, and I was already mentally rejecting the entire idea of building a CORBA ORB just so I could have a nice IDE.

Some research had already indicated that moving over to Ubuntu 8.10 would allow me to apt-get install Anjuta, and so it was. The actual time spent in downloading and burning the Ubuntu ISO, installing Ubuntu, setting up my WUSB600 wireless adapter with ndiswrapper, re-installing Java 6, NetBeans 6.1, GHC 6.8.2, J 6.0.2, and ntfs-3g, and installing Anjuta 2.24 was perhaps three hours, and could have been reduced to two if the package repository hadn't been so slow.

Point being, things are pretty bad when a body is willing to change their Linux distro just to avoid building from a tarball with umpteen dependencies.

Murdock states that when you can locate a recent version of your desired application or library in your distro, you're laughing. Well, no argument from me.

A little tip for novices wanting to pass an environment variable when using sudo...you'll often want to sudo when installing from a .bin or .sh, so that the script goes to /usr/local rather than your $HOME:

sudo JAVA_HOME=/usr/local/jdk1.6.0_10 ./netbeans-6.1-ml-linux.sh

Another little tip. To get the .bin script for JDK on Linux to install where you want it to go, cd to the target directory before running the script. This is actually in the Sun installation documents, which all of us read of course. :-)
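By way of illustration - the installer filename below is the 6u10 one; substitute whatever you actually downloaded:

cd /usr/local
sudo sh ~/jdk-6u10-linux-i586.bin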

The obvious problem - the central problem - is dependencies. This is a problem for package systems as well as tarballs. There are two main issues here, as I see it. First off, while I'm cool with modular development, I'm not cool with Swiss Army apps that require 15 or 20 other programs and libraries. Some dependencies are clearly acceptable...others should be optional (many more than is currently the case). I'm specifically talking about absolute dependencies. And from time to time, maybe re-inventing the wheel in one or two source files is preferable to requiring someone else's library, which in turn requires four more. If you simply must use someone else's code, choose something that fits the bill but minimizes the agony for end-users.

What would be really sweet is if developers cooperated on more public APIs. Let's say that every C/C++ implementation of XSL-FO provided support for Version 1.3 of the C/C++ XSL-FO Processor API, perhaps by providing adapter classes or functions. That way a programmer who needed XSL-FO support in his application could just code to the API header files, knowing that any compliant XSL-FO processor could be chosen by the end-user.

Second, developers tend to use the latest version of everything. What Java developer do you know who develops new stuff using JDK 1.4? None who have a choice about it. Similarly, C/C++ developers use the latest version of every library they need. An end-user may have version 1.8.2 of libfoo, but be forced to install version 1.9.0, for no good reason at all...the older version may have been fine.

I recognize that asking developers to examine, build with, and test with, earlier versions of required libraries is extra work, but it cuts down on the aggro for end users. In a number of cases it may get someone to actually use the program rather than having them abandon the build process in disgust, or stick with an earlier version they can install through a packaging system.

In a future related post I'll have more to say about public APIs.