July 24, 2006

CalendarTag: A simple JSP Calendar Component

I was looking around at jsp calendar components for a client. I'd previously had a hard time finding a decent one, so I didn't hold out much hope. Luckily, this search was slightly different--I don't need any fancy features, just a simple calendar display rendered in html, with links on the day numbers. In addition, the calendar needs to be indexable, so FlatCalendarXP, which I've used before, is not an option.

I looked at the HtmlCalendarBean and the Calendar Taglib, both from ServletSuite. I expected the price to be reasonable, but didn't expect source, which was important for the client.

A bit of digging around on SourceForge found the perfect project: CalendarTag. This project looks abandoned, but the author got things into good shape before ceasing development. I'd say the project is mature, rather than abandoned. In addition, he responded to my questions (including "is this project alive?") in less than 24 hours.

I found the documentation to be very useful. In addition, you can customize the calendar's output to a very large degree. I especially like the decorator, which lets you control exactly how each day is displayed by implementing a relatively simple interface or extending an existing class.

Make no mistake. This is a simple component which just displays a calendar in html. There's no support for users or events (Update 3:24. I should have read the docs more closely and said that there's no built-in support for events or users. From within the calendar, you have access to the request and so can code up event handling and/or user specific calendars.). If you want those features, I'd look at a more complex calendaring system. Calendartag does something simple and does it well.

One tip--you'll need to have the standard-1.0.4.jar file available to your web application, as well as the calendartag-1.0-rc4.jar file; otherwise you'll see this rather fearsome exception:

org.apache.jasper.JasperException: org/apache/taglibs/standard/tag/el/core/ExpressionUtil
	at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:254)
	at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:295)
	at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:241)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:247)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:193)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:256)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardContext.invoke(StandardContext.java:2422)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:171)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:163)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.coyote.tomcat4.CoyoteAdapter.service(CoyoteAdapter.java:199)
	at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:828)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:700)
	at org.apache.tomcat.util.net.TcpWorkerThread.runIt(PoolTcpEndpoint.java:584)
	at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:683)
	at java.lang.Thread.run(Thread.java:595)

root cause

javax.servlet.ServletException: org/apache/taglibs/standard/tag/el/core/ExpressionUtil
	at org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:536)
	at org.apache.jsp.index_jsp._jspService(index_jsp.java:59)
	at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:137)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
	at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:210)
	at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:295)
	at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:241)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:247)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:193)
	at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:256)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardContext.invoke(StandardContext.java:2422)
	at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:171)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
	at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:163)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
	at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
	at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
	at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
	at org.apache.coyote.tomcat4.CoyoteAdapter.service(CoyoteAdapter.java:199)
	at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:828)
	at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:700)
	at org.apache.tomcat.util.net.TcpWorkerThread.runIt(PoolTcpEndpoint.java:584)
	at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:683)
	at java.lang.Thread.run(Thread.java:595)
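
For reference, in a standard servlet container layout both jars go into the web application's WEB-INF/lib directory (the application name here is made up):

```
webapps/mycal/WEB-INF/lib/
    calendartag-1.0-rc4.jar
    standard-1.0.4.jar
```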
Posted by moore at 12:05 PM

April 23, 2006

Jini and JavaSpaces at BJUG

I went to BJUG last week to see a presentation about Jini by someone from GigaSpaces. It was an intensely interesting presentation for a number of reasons. First off, I knew the presenter, Owen Taylor. About 6 years ago, I took a class from him, along with a few other people. The class covered BEA Weblogic and EJBs. I've attended (and given a couple of) technical presentations in my time, including some at conferences. I don't think I've ever met someone who was more energetic and practiced at conveying hard concepts than Owen Taylor. Owen! Start blogging!

Another reason it was interesting is that Brian Pontarelli, an old friend, really likes Jini and has told me about some of his experiences. I actually looked into it when Bill DeHora published his entry two classic hardbacks. I downloaded Jini and JavaSpaces (Jini is the framework; JavaSpaces is the tuple space repository) and started playing with it. The final reason that it was an interesting presentation is that JavaSpaces is something that I have never had a chance to use, and didn't foresee using in the future. By the end of the presentation, I was convinced that this concept deserved more research, if nothing else.

What follows are my scribbled notes from that meeting, along with a smattering of other comments and thoughts regarding these technologies. More information is here, however no presentation artifacts are available, unfortunately.

The problem with distributed systems is that they move data around a lot. What you really want is for the processing and the data to be at most one step apart. Stored procedures do this, but you can't change the logic easily.

Jini was originally developed for pervasive computing, but the focus of the presentation was on the enterprise applications that can be built based on that spec. This class of applications has some amazing features, including low latency, extremely high throughput and '100%' uptime capability.

For that reason, many large institutions are looking at replacing or augmenting JEE (nee J2EE) applications with JavaSpace applications. He mentioned that GigaSpaces recruited him with the notion that a laptop could run 3 million events an hour. This kind of blew his mind.

JavaSpaces is essentially the command pattern--code and data are distributed together--and is based on Linda. Orbitz uses the technology and talks about 100% uptime. Anyplace where you are batching, you can now do it in real time. The key is to keep everything in memory and use replication for persistence, rather than disk. (Eventually you want to push it to disk, for reporting and auditing purposes, but you can do that asynchronously.) Databases tend to be used as a bus between in-memory processes right now, and you can replace that with a JavaSpace.
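
To make the model concrete, here is a toy, in-memory imitation of the write/take idiom. This is not the JavaSpaces API--a real space is distributed, matches on Entry templates, and supports transactions and leases--just a sketch of the Linda-style pattern the presentation described:

```java
import java.util.LinkedList;
import java.util.Queue;

// A toy "space": producers write tasks in, workers take them out.
// Real JavaSpaces adds distribution, matching on Entry templates,
// transactions, and leases; this only shows the write/take shape.
public class ToySpace {
    private final Queue<String> tuples = new LinkedList<String>();

    // Put a tuple into the space and wake any blocked workers.
    public synchronized void write(String tuple) {
        tuples.add(tuple);
        notifyAll();
    }

    // Remove and return a tuple, blocking until one is available.
    public synchronized String take() throws InterruptedException {
        while (tuples.isEmpty()) {
            wait();
        }
        return tuples.remove();
    }

    public static void main(String[] args) throws InterruptedException {
        ToySpace space = new ToySpace();
        space.write("order-1001");
        System.out.println("worker took: " + space.take());
    }
}
```

Producers write work into the space; any number of workers block on take(), which is how batch-style jobs end up being processed in real time.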

Jini is composed of discrete objects that can run anywhere; more to the point, they don't care where they run. It also expects failure, as opposed to many other technologies that simply assume that things will run correctly. Jini is a LAN based technology, though Owen mentioned that there are ways to turn it into a WAN technology and cited several examples. I am not competent to give a general overview of Jini--please check out this tutorial for more information.

One thing that really struck me is that all of the complexity that EJB and other JEE technologies hide (clustering, transaction management, thread management, lifecycle), JavaSpaces revels in. Owen actually mentioned that JavaSpaces brings skills that JEE developers currently rarely need to use, like threading and classloading, back into the toolbox, rather than depending on a vendor. That can be a plus and a minus, right? The whole point of not trusting servlet threading to a business developer is that it allows them to focus on the business logic. The problem with much of JEE is that it hasn't done a very good job of doing this. Do you remember the 'deployer' role?

Jini has only interfaces; the named implementations are shipped around transparently. Ha ha, just like EJB remote calls are transparent. However, one very nice aspect of Jini is that when you register an implementation of an interface, you say how long the implementation is going to be available (the lease length). As a service provider, you can keep track of that lease and re-register yourself when the lease is nearly up. Of course, if the service is no longer available, for whatever reason, it is not provided to clients--there's no need for the JVM to garbage collect. The clients do need to be a bit smart about things, though.
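
The lease bookkeeping is easy to sketch in plain Java. Again, this is not the Jini API (net.jini.core.lease.Lease and its helpers do the real work); it is just the pattern: registrations expire unless renewed, so dead services age out on their own:

```java
import java.util.HashMap;
import java.util.Map;

// Schematic of lease-based registration: a registration expires unless
// the provider renews it, so crashed services simply vanish from lookups.
public class LeaseRegistry {
    private final Map<String, Long> expirations = new HashMap<String, Long>();

    // Register (or re-register) a service for leaseMillis from 'now'.
    public void register(String service, long now, long leaseMillis) {
        expirations.put(service, Long.valueOf(now + leaseMillis));
    }

    // A service is available only while its lease is unexpired.
    public boolean isAvailable(String service, long now) {
        Long expiry = expirations.get(service);
        return expiry != null && now < expiry.longValue();
    }

    public static void main(String[] args) {
        LeaseRegistry registry = new LeaseRegistry();
        registry.register("printer", 0L, 1000L);
        System.out.println(registry.isAvailable("printer", 500L));  // true
        System.out.println(registry.isAvailable("printer", 1500L)); // false
    }
}
```

A well-behaved provider re-registers shortly before its lease expires; a crashed one simply stops renewing and disappears from lookups.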

As for licensing, version 2.0 has been released under an Apache license, as opposed to the Sun Community Source License, which was the previous license. This should grow the jini.org community significantly.

Configuration of Jini is done in a Java syntax, which can be a bit confusing, since you don't compile and execute it. The names of the services (reggie, webster) are a bit cutesy. Webster is the web server which serves implementation classes, but it shouldn't be used in a production environment; use Tomcat instead.

Spring and JavaSpaces are complementary; work is in progress to integrate them, with completion expected in the next few months. GigaSpaces has scaled implementations (linearly!) to 2,000 CPUs on 500 machines....

At this point Owen began talking about various architectural patterns that could be used with Jini; he also covered some war stories. However, I didn't take any notes--you'll have to see him talk sometime.

Issues include (so my friend says) versioning. Owen mentioned that debugging isn't a strong suit. And I did some parallel computing for my senior thesis so I know that splitting up problems so they can be parallelized is not always as easy as you'd like. However, the web paradigm is actually rather suited to parallelization, since you do have the request/response model. The problem is, as it so often is, state.

Posted by moore at 01:42 PM

April 13, 2006

Eclipse impressions

I have previously espoused opinions about IDEs. But, I've heard great things about Eclipse, including this rather direct statement from a developer who I respect:

Having a solid IDE like IntelliJ or Eclipse so radically improves your productivity that I quite simply don't see how you can call yourself a professional developer without using one.

So, I thought I'd give Eclipse a try. Again. The latest version is Eclipse 3.1. This time I wasn't going to try to get by with just the free tutorials. I did some browsing on Amazon and found Eclipse Distilled. This book, while aimed at Eclipse 3.0, is eminently readable and quite informative on the Eclipse way of doing things. All the views and perspectives and projects and jargon can be a bit confusing, so I was happy to pay $35 for this guide.

After using Eclipse for a few weeks, I have some likes and dislikes:

Likes:


  • Code completion: huge. Hitting control-space and choosing a method rather than having to remember exactly what it is named is big. (Charles Petzold talks about a similar feature called IntelliSense in Visual Studio and some of the ramifications. Not sure if all of them apply to Eclipse.)

  • Integration with existing projects: while you can easily start new projects with Eclipse, I was also very impressed with how easy it was to bring an existing codebase into the system and begin using Eclipse to modify it.

  • Refactoring: again, huge. I find that I use the 'rename method' refactoring most often. The ability to just change the name of a method in one place and have it propagate allows you much more flexibility.

Dislikes:


  • Using CVS externally confuses Eclipse: I consider myself a power user of CVS. This means it's often easier for me to drop down and run commands from the prompt. This seems to confuse Eclipse, especially when I'm adding files.

  • No support for local CVS repositories: it's a known bug, with some workarounds available.

  • Memory usage: 150M of memory is used, even when it is doing nothing. Now, I realize that most new boxes are shipped with gigs and gigs of memory, but if you run Eclipse inside VMware with Oracle and Tomcat, eventually things start to get a bit crowded.

I have a few other quibbles, but those are the main ones I've run into so far.

So, ok, ok. I was wrong. Those of you who have used Eclipse or NetBeans or VisualStudio or IntelliJ or Visual SlickEdit are snoozing right now, but I've learned something important. IDEs can be very good, and when a free cross-platform IDE is paired with an external build tool, the results can be powerful indeed.

Posted by moore at 09:27 AM

March 22, 2006

Ant techniques

The JavaRanch Journal has a new newsletter out; one of the articles is an interesting look at some of the new, advanced 'enterprise' features of Ant. This is just part one of the series by Ajith Kallambella; I'll be keeping my eyes out for the next parts.

Posted by moore at 08:29 AM | Comments (0)

March 10, 2006

Exceptions in API design

Here's an old but fantastic post about API (mis)design.

To handle an exception, you have four choices, one of which is:

Log it! We bought big hard disks for those servers, let's use them! Log the exception toString() and print its stack trace, but only if you expect the exception to be thrown over 1,000,000 times each day. Alternatively, if you think that the exception would only occur rarely and that it could indicate a problem worth looking at, just print the exception class name ... since stack traces just confuse people!

Hilarious. And, it looks like it's part 4 of a 7 part series. Via Dejan Bosanac.

Posted by moore at 04:06 PM | Comments (0)

November 29, 2005

Running Tomcat on port 80

The typical java web application is fronted by a web server (usually Apache) for a number of reasons. Apache handles static content well, and also is easier to configure to listen on privileged ports (under 1024). I've written before about different options for connecting Tomcat and Apache, but there are times when all you need is a servlet engine, and installing Apache is overkill. If you don't want users to see a nonstandard port in their url (http://foo.com:8080/webapp/), then you have a couple of options.

You can run Tomcat as root. This is probably not a good idea, since anyone who can write a jsp can now execute arbitrary commands as root. I don't know how good Tomcat's security is, but in general, the fewer applications running with superuser privileges, the better.

If you share my dislike of Tomcat running as root, here's an excellent rundown of the options for running Tomcat on port 80. I went the route of jsvc. This seemed to work just fine, though every time we shut down tomcat, we would get an entry in the error log file: jsvc.exec error: Service exit with a return value of 143.

That didn't start to disturb me until I realized that the destroy methods of our servlets weren't being called. These methods cleaned up after the servlets, and it was important that they get executed. A bit of googling turned up a discussion of this very problem. The version of jsvc that ships with Tomcat 5.0.27 doesn't shut down Tomcat very nicely.

I downloaded and compiled subversion, because that's the version control system that the Jakarta Daemon project (of which jsvc is a part) uses. I then checked out the version of the source tagged daemon-1_0_1 (svn co http://svn.apache.org/repos/asf/jakarta/commons/proper/daemon/tags/daemon-1_0_1/) and rebuilt jsvc. This new version allows Tomcat to call the destroy methods of servlets, and everything seems to be happy.

Posted by moore at 10:06 AM | Comments (0)

October 26, 2005

JBoss at Work ships

JBoss at Work, by Thomas Marrs and Scott Davis, a book I technically reviewed, is shipping. Having read all of it, I'd say it's worth a look both for the technical content--the authors take the reader all the way through a standard J2EE application, pointing out all the JBoss specific configurations and gotchas--and for the slightly whimsical and easy to read style. Sometimes it was a bit repetitive, but that's not bad for a book aimed at getting folks without much experience up and running. Read the sample chapter on ear building and deployment and see if it fits with your needs.

Posted by moore at 04:15 PM | Comments (0)

October 19, 2005

Calling one servlet from another, part 2

Previously, I'd discussed calling one servlet from another. The best answer I've found is not to parse the content yourself, but to use Apache HttpClient, which does a fantastic job of dealing with http's idiosyncrasies.

Posted by moore at 09:55 AM | Comments (0)

September 22, 2005

InstallAnywhere Impressions

I helped write a java program a few months ago, a product designed to run on mail servers. Now, we had the program packaged up as a zip file (for windows) and a tarball (for unix). That wasn't my decision, but it makes sense to me--if you are deploying a program on a mail server, you should know how to unzip files and edit configuration files.

But, that's not what the client wanted. They came back recently with a few changes, and a desire to install via InstallAnywhere. I am no expert at InstallAnywhere, but the client didn't have the engineering cycles to repackage the program, so they paid me to do it. What follows is my overall impression of InstallAnywhere, and a few tips and tricks.

Overall, I like InstallAnywhere. This program makes it easy to build java program installers for a variety of different platforms (most of the unices, Macs and Windows), execute sundry actions pre and post install, and grab user input while installing. It supports both GUI and console installation procedures. In particular, the Windows installer was a snap, and I didn't have to learn the first thing about the registry--InstallAnywhere took care of that, even to the point of having the program show up on the 'Add/Remove Programs' Control Panel.

On the downside, there is a bevy of options, and the help file wasn't exactly the best. They have a free trial version, but installers built with it complain every time you run them; such installers also stop working around 10 days after you build them--and the trial version doesn't warn you about that future failure.

A few tips:

* It's possible to keep the install configuration file in CVS, except for the fact that it hardcodes paths to resources that it includes in the install file. I was able to work around that by using ant's replace task.
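
For example (the file name and token here are made up for illustration; InstallAnywhere's real project file will contain actual hardcoded paths), the workaround looks something like this Ant target:

```xml
<target name="fix-installer-paths">
    <!-- Copy the checked-in template, then swap the placeholder token
         for this machine's resource path before building the installer -->
    <copy file="installer.iap_xml.template" tofile="installer.iap_xml"
          overwrite="true"/>
    <replace file="installer.iap_xml" token="@RESOURCE_DIR@"
             value="${basedir}/resources"/>
</target>
```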

* When you start up the program (on unix), you can't kill it normally, via either ctrl-c or backgrounding it and running the kill command on the process. I believe this is because the default behavior of a launcher is to listen to the console for stdin. You can change this easily enough, but the default really should be correct.

* The installer builder doesn't default to generating a log, even though many of the default log messages point you to the install log file. You have to click a checkbox on the Project Info Pane in the Advance Installer.

* The console installer insisted that there were errors in the installation process even though the program, post install, worked fine and there were no errors written in the installer log. I think this is due to the fact that I'm using the trial version, but am not sure.

* There doesn't seem to be any way in the InstallAnywhere GUI to specify that if the DISPLAY variable is set (on unix systems), the GUI installer should run, otherwise the console installer should run. If you want, you can edit the generated install.bin installer script--search for 'FAILSAFE' and use a modern editor capable of long lines--but I couldn't figure out a way to automate this change. This is my biggest gripe, since this is a very typical demand. If you don't run install.bin -i console to specify a console installation, you get a lovely exception:

Stack Trace:
java.awt.HeadlessException:
No X11 DISPLAY variable was set, but this program performed an operation which requires it.
        at java.awt.GraphicsEnvironment.checkHeadless(GraphicsEnvironment.java:159)
        at java.awt.Window.<init>(Window.java:317)
        at java.awt.Frame.<init>(Frame.java:419)
        at java.awt.Frame.<init>(Frame.java:384)
        at javax.swing.JFrame.<init>(JFrame.java:150)
        at com.zerog.ia.installer.LifeCycleManager.f(DashoA8113)
        at com.zerog.ia.installer.LifeCycleManager.g(DashoA8113)
        at com.zerog.ia.installer.LifeCycleManager.a(DashoA8113)
        at com.zerog.ia.installer.Main.main(DashoA8113)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at com.zerog.lax.LAX.launch(DashoA8113)
        at com.zerog.lax.LAX.main(DashoA8113)
This Application has Unexpectedly Quit: Invocation of this Java Application has caused an InvocationTargetException. This application will now exit. (LAX)

Overall, I'd say if you're delivering a java based product to typical users, InstallAnywhere is a good choice. I don't know how much it costs, but the experience for the users is as seamless as you want it to be, the width and breadth of installer customization is impressive, and it didn't take too long to get up to speed.

Posted by moore at 11:59 AM | Comments (0)

August 24, 2005

Webtop customizations and a java.lang.NoClassDefFoundError

So, a few weeks back, I was working with a webtop customization (webtop is a web interface to Documentum). We were getting a weird error whenever we tried to access it. When I showed details, I saw this message:

com/documentum/web/formext/action/IActionPrecondition

After closing this window, press the Refresh or Reload button on your browser to continue.

Hide Details

Stack Trace:

java.lang.NoClassDefFoundError: com/documentum/web/formext/action/IActionPrecondition at java.lang.ClassLoader.defineClass(Ljava.lang.String;[BIILjava.security.ProtectionDomain;)Ljava.lang.Class;(Unknown Source)
at java.security.SecureClassLoader.defineClass(Ljava.lang.String;[BIILjava.security.CodeSource;)Ljava.lang.Class;(SecureClassLoader.java:123)
...

We spent the better part of the day debugging it. The IActionPrecondition class was in the WEB-INF/classes directory of the webtop web app. It looked just the same as it had before (no recent modifications). When we commented out the Precondition, we didn't see this error, but still couldn't use the customization. It happened on Unix and on Windows.

We were mystified, and ended up having to go back to a system that worked, then move forward very slowly until the system didn't work, then focus on those changes (all hail CVS).

What we found surprised the heck out of me. Basically, we have a jar file of all the TBOs and documentum utilities that we have to put in the system classpath, because it needs to be loaded when the dfc.jar is loaded. (These are the words of the documentum experts I was working with.) The webtop custom classes were inadvertently included in this jar file, and were loaded by the system classloader. But some of these classes depended on classes in the webtop web app's classes directory, loaded by that descendant classloader. Hence, the custom classes couldn't find their superclasses and were invalid. Once we moved the custom webtop classes out of the TBO jar file, everything was good as gold.

I once worked with a colleague who said, rather than calling her a senior software developer, she sometimes felt she should be called a senior classpath debugger. Indeed.

Posted by moore at 04:30 PM | Comments (0)

August 09, 2005

Lesson learned: copying bytes from an OutputStream to an InputStream in Weblogic 8

A few weeks back, I posted about calling one servlet from another, passing a multipart/form-data for submission. I ended up using an HttpURLConnection and POSTing to the second servlet. My target platform was Weblogic 8 on Solaris 8, but I was developing on Windows Server 2003; both had Weblogic running on a Java 1.4 platform.

I set both the doOutput and doInput properties of the connection to true, so I can first post the needed data and then read the response, which I can parse and pass on to the user. Here's the setup:

    URL url = new URL("http://localhost/myws/myservlet");
    URLConnection con = url.openConnection();
    HttpURLConnection conn = (HttpURLConnection)con;
    conn.setDoOutput(true);
    conn.setDoInput(true);
    conn.setRequestProperty("Content-Type", request.getHeader("Content-Type"));
    conn.setRequestProperty("Cookie", request.getHeader("Cookie"));
    BufferedInputStream bis = new BufferedInputStream(request.getInputStream());
    BufferedOutputStream bos = new BufferedOutputStream(conn.getOutputStream());

I set the Cookie property so that the user does not have to reauthenticate to the second servlet.

However, when I was reading from the first servlet's InputStream and writing to the second servlet's OutputStream, I encountered some issues. The code below ran perfectly fine in the development environment, but in the production environment, the file being transferred was corrupted. The corruption happened in a number of ways, actually. At first, the file was about twice the size. Then, after fiddling a bit and moving the bis.close() and bos.close() closer to the write/read, I was able to get the file to be smaller than the posted file--but it was a zip file and was still corrupt.

    byte[] data = new byte[1024];
    int res = bis.read(data);
    while (res != -1) {
        bos.write(data);
        res = bis.read(data);
    }
    bos.flush();
    // other stuff, like getting response code
    bis.close();
    bos.close();

You can see that I'm buffering twice here, first with the BufferedInputStream, and then with my byte array, data[]. The double buffering is unnecessary, but it isn't the real bug: bos.write(data) writes the entire 1024-byte array even when read() returned fewer bytes, so stale buffer contents get written along with the real data. bos.write(data, 0, res) would have fixed it. Since the corruption depends on how many bytes each read() call happens to return, it makes sense that the behavior differed between platforms.

Luckily, the Java Cookbook (and for that matter, the I/O section of the Java Tutorial) suggested code similar to this:

    int res;
    while ((res = bis.read()) != -1) {
        bos.write(res);
    }
    bis.close();
    bos.close();

This code works on both Windows and Solaris. I thought I'd post this so that any other folks struggling with the same problem don't endure the debugging that I did.
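
For what it's worth, a buffered copy that writes only the bytes actually read on each pass keeps the speed of a buffer without the corruption. A self-contained sketch (class and variable names are mine, not from the app above):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copy in to out, writing only the res bytes actually read each pass.
    public static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] data = new byte[1024];
        int res;
        while ((res = in.read(data)) != -1) {
            out.write(data, 0, res); // not out.write(data): the tail of the
                                     // buffer past index res - 1 is stale
        }
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        byte[] src = "a zip file's worth of bytes".getBytes();
        ByteArrayOutputStream dest = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(src), dest);
        System.out.println(dest.toByteArray().length == src.length); // true
    }
}
```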

Posted by moore at 07:06 AM | Comments (0)

July 12, 2005

Building a Full-Text Search Engine from Open Source Components

A friend and former colleague did a presentation a few months ago about "Building a Full-Text Search Engine from Open Source Components". The slides are up. From the abstract:

In addition to the many useful open source applications that are available ready-to-run, there are quite a few open source APIs out there that are just crying out to be combined in new, useful, and interesting ways. By "just" writing a few lines of code to join them together it should be possible to build a new application that has a unique set of features.

Posted by moore at 10:07 PM | Comments (0)

July 11, 2005

Calling one servlet from another

So, I'm building a RESTful web service for a client, which is going to accept a large (60 MB-ish) file and a set of parameters that are attributes of the file, using the multipart/form-data enctype. The idea is to have this service be available for external programs to post to, but also to provide a nice web interface. I built another servlet which generates the usable user interface (the UI servlet), and am now having trouble pushing the data over to the RESTful servlet. After the RESTful service is called, the UI servlet needs to ensure that any errors are understandable to the user.

It looks like there are a couple of options:

1. Use RequestDispatcher to hand the request entirely over to the service. This is easy, but it means that the service now needs to return a human readable response, or you need to insert a filter to provide one.

1a. Have the RESTful service take a parameter which indicates whether its caller is another program or a human being, and use the RequestDispatcher from the UI servlet.

1b. Have no UI servlet at all, but just have the RESTful servlet be able to generate a nice user interface (or redirect to a pleasant looking JSP) via a given parameter.

2. Use the URL and HttpURLConnection objects to have the UI servlet post to the RESTful servlet just as you'd post to any other remote resource on the internet with Java. This seems to work (I think), but requires an absolute URL and also requires a bit of I/O to push the stream of bytes from the UI servlet to the RESTful servlet.

I can't think of any other ways to solve this problem, and the only other solution that searching turned up is a no-no in modern servlet engines.
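For option 2, the fiddly part is writing the multipart/form-data body by hand before pushing it through HttpURLConnection. A minimal sketch of the body-building half (the boundary, field names, file name, and content type are all invented for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.UnsupportedEncodingException;

public class MultipartBuilder {
    // Builds a minimal multipart/form-data body with one text parameter and
    // one file part. A real client must also send the request header
    // "Content-Type: multipart/form-data; boundary=<boundary>".
    public static byte[] build(String boundary, String param, String value,
                               String fileField, String fileName, byte[] fileBytes) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            StringBuffer head = new StringBuffer();
            head.append("--").append(boundary).append("\r\n");
            head.append("Content-Disposition: form-data; name=\"").append(param).append("\"\r\n\r\n");
            head.append(value).append("\r\n");
            head.append("--").append(boundary).append("\r\n");
            head.append("Content-Disposition: form-data; name=\"").append(fileField)
                .append("\"; filename=\"").append(fileName).append("\"\r\n");
            head.append("Content-Type: application/octet-stream\r\n\r\n");
            byte[] headBytes = head.toString().getBytes("ISO-8859-1");
            out.write(headBytes, 0, headBytes.length);
            out.write(fileBytes, 0, fileBytes.length);
            byte[] tail = ("\r\n--" + boundary + "--\r\n").getBytes("ISO-8859-1");
            out.write(tail, 0, tail.length);
            return out.toByteArray();
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e.getMessage()); // ISO-8859-1 is always present
        }
    }
}
```

The resulting byte array would then be written to conn.getOutputStream() after conn.setDoOutput(true) and setting the Content-Type request header with the same boundary.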

Posted by moore at 10:18 PM | Comments (0)

June 28, 2005

Web calendars: jwebcalendar and FlatCalendarXP

Just a quick note. I just spent a few days looking at web calendars for a client. In particular, they wanted to show a twelve-month view, rather than the more typical month-at-a-time view. It's in a Java environment.

Stay away from jwebcalendar. It hasn't been developed for a few years, and even though the feature set looks great and the screen shots look gorgeous, I wasn't able to get it to run (in Tomcat 4 or Tomcat 5). After putting in some log statements, and learning that you need to specify the xsl and xml locations on the url line (like so: localhost:8080/jwebcalendar/calendar?LAYOUT=form.url&XSL=webcalendar.form.url.xsl&XSLbase=./data/webcalendar/xsl/&XML=webcalender.form.url.vm.xml&XMLbase=./data/webcalendar/xml/&XMLfilter=.xml&HTMLbase=&TITLE=PPres), I ended up seeing this error message:

Error org.apache.xmlbeans.XmlException: error: Element type "input" must be followed by either attribute specifications, ">" or "/>". org.apache.xmlbeans.XmlException: error: Element type "input" must be followed by either attribute specifications, ">" or "/>". at org.apache.xmlbeans.impl.store.Root$SaxLoader.load...

Since I'd burned enough time, I didn't follow the path any further. Major bummer, as it seemed like it would be a good fit.

After that, we looked at other calendar systems, and FlatCalendarXP seemed to fit the bill. It's payware (for commercial software) but it has an elegant API and has worked well so far. Recommended.

Posted by moore at 01:51 PM | Comments (0)

June 10, 2005

BJUG last night: breaking it down

OK, here's a grabbag of links and info regarding my talk last night (links and powerpoint available here). I was interested to know how many folks had actually worked on an internationalized application. (I18n is one of those APIs that you ignore until you need it, and then you absolutely have to have it.) Here are some rough numbers, based on a quick survey of hands. Out of a crowd of 25, about 5 people had used i18n. Of those, 3 had used it with web applications, and one with a J2SE application (I think the last person had used i18n with C). 3 of them had supported European languages and three had supported Asian languages.

I was emphasizing how important it is to pull all strings displayed to the user out into properties files, so they can be localized. A few folks in the audience mentioned that Eclipse has support for this--out of the box for Java classes, and with a plugin for JSPs. (If you are reading this and know the name of the plugin, please comment; I did some googling and couldn't find any JSP plugins that claimed an 'Extract Strings' capability.)
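To make the 'pull out the strings' point concrete, here's a minimal sketch using ResourceBundle. In a real application the bundles would be Messages.properties and Messages_fr.properties files on the classpath; the ListResourceBundle subclasses, the key, and the strings here are invented to keep the example self-contained:

```java
import java.util.ListResourceBundle;
import java.util.Locale;
import java.util.ResourceBundle;

public class I18nDemo {
    // Base bundle: the default (English) strings.
    public static class Messages extends ListResourceBundle {
        protected Object[][] getContents() {
            return new Object[][] { { "welcome", "Welcome" } };
        }
    }
    // French bundle: same keys, localized strings.
    public static class Messages_fr extends ListResourceBundle {
        protected Object[][] getContents() {
            return new Object[][] { { "welcome", "Bienvenue" } };
        }
    }

    // Look up a display string by key for the given locale, falling back to
    // the base bundle, as ResourceBundle.getBundle would for .properties files.
    public static String lookup(String key, Locale locale) {
        ResourceBundle bundle;
        if ("fr".equals(locale.getLanguage())) {
            bundle = new Messages_fr();
        } else {
            bundle = new Messages();
        }
        return bundle.getString(key);
    }
}
```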

One person asked if we encountered issues with translating plurals, like house/houses. I hadn't seen that problem with the application I worked on, but was pretty sure that the Java class libraries supported a solution. That solution is ChoiceFormat.
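A quick sketch of ChoiceFormat handling the house/houses case (the limits and wording are my own example, not from the talk):

```java
import java.text.ChoiceFormat;
import java.text.MessageFormat;

public class PluralDemo {
    // Returns a correctly pluralized sentence for the given count.
    public static String plural(int count) {
        double[] limits = {0, 1, 2};
        // The last entry contains "{0}", which MessageFormat expands
        // recursively with the actual count.
        String[] formats = {"are no houses", "is one house", "are {0} houses"};
        MessageFormat mf = new MessageFormat("There {0}.");
        mf.setFormatByArgumentIndex(0, new ChoiceFormat(limits, formats));
        return mf.format(new Object[] { new Integer(count) });
    }

    public static void main(String[] args) {
        System.out.println(plural(0)); // There are no houses.
        System.out.println(plural(1)); // There is one house.
        System.out.println(plural(5)); // There are 5 houses.
    }
}
```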

When I mentioned that we used Excel spreadsheets as our transfer format for slinging around translated strings, one of the other folks, who had spent five years supporting an internationalized Java application (he had some war stories), mentioned XLIFF, an XML variant that is used in several translation editors. If you're planning to do a lot of translation, you might want to take a look.

The talk after mine was an interesting discussion of SOA by Glen Coward. It was a rather discombobulating mix of an extremely high level look at the SOA landscape, complete with TLAs galore, and an extremely low level example of a rich client in Flex talking to two web services, one built with Axis on JBoss and the other built with .NET and deployed on Mono. The demo went poorly, as demos tend to do, but it was enough to whet my appetite to look a bit more closely at SOA.

In addition, as you'd expect from an employee of Novell, he was interested in delineating the differences between open source software and commercial software in a corporate environment, and where each made sense. His point was that in quickly changing areas, commercial software made sense, because of the productivity gains that commercial IDE support makes available. In addition, deep integration (read, anything to do with backend legacy systems) is probably going to require commercial software. He didn't specify why, but I'd guess that specialized backend integration is primarily going to continue to be commercial software for a number of reasons: it's not sexy, it requires the backend system to test (not often available for the average hacker), and there are intellectual property issues. What Glen didn't make explicit, but I will, is where that leaves open source--smack dab in the middle, gluing together all the applications. You know, like Apache and JBoss and Tomcat (among many others) already do.

As usual, I learned something at BJUG. And you can't beat free pizza.

Posted by moore at 08:19 AM | Comments (0) | TrackBack

June 09, 2005

i18n talk powerpoint available

My i18n talk, which I gave tonight, is available on this page.

Posted by moore at 09:35 PM | Comments (0)

June 08, 2005

Wikibooks

I just discovered wikibooks. They look pretty cool; it will be interesting to see if they succeed like Wikipedia or fall by the wayside like many other wikis (Bruce Eckel has some comments on this phenomenon). Of particular interest to me is the J2ME wikibook.

Posted by moore at 03:28 PM | Comments (0)

BJUG Localization talk

I'm giving a talk tomorrow (Thursday) in Boulder at BJUG at 6:00. The topic will be "Internationalization and Localization in the Real World":

This is not another rehash of the Internationalization Trail from the Java Tutorial website. Rather, Dan examines one website that supports a large number of locales and looks at how i18n and l10n were implemented in the real world. This includes the nuts and bolts of loading multi-character data, framework tools that helped, how users were associated with locales, what parts of the i18n API were not used, and issues to be aware of.

Afterwards, there will be pizza and pop, then a talk about Services Oriented Architecture. Hope to see you there.

Posted by moore at 09:33 AM | Comments (0)

May 26, 2005

Dropping the .com from package names

Dion wonders whether you need fully qualified package names in your Java code. For instance, is code that I write with this package declaration: package com.mooreds.foo; that much better than package mooreds.foo;?

Given that there are no other mooreds domains registered (from a search at Network Solutions):


mooreds.net is available.
mooreds.org is available.
mooreds.info is available.
mooreds.tv is available.
mooreds.us is available.
mooreds.cc is available.
mooreds.name is available.
mooreds.bz is available.
mooreds.co.uk is available.
mooreds.de is available.
mooreds.be is available.
mooreds.co.nz is available.
mooreds.at is available.
mooreds.com is unavailable.

I think the answer is that it doesn't matter very much right now. And the chances of it mattering in the future are slim. I'd have to write some code with the same classname as another 'mooreds' package author, and want to import that code. Improbable, but possible. And if this situation arose, I'd have to rename my class, use a different package name (after all, packages don't have to be meaningful), or use a different class.

What are the benefits of leaving off the com. declaration? Well, it saves everyone who wants to use it four characters of typing per import (at least, those who don't use auto importers). Four characters!

So, it's safe to say that package mooreds.foo; and package com.mooreds.foo; probably won't hurt anything, but given the cost benefit analysis, I can't see why anyone wouldn't use the full package declaration: package com.mooreds.foo; .

Now, if someone is using a domain they don't own, well, that's just braindead. $15 and a credit card will get you a domain name. If you can't afford that, then choose a TLD of your own creation; package lalala.mooreds.foo; won't collide with anyone who is following the spec, and has an even smaller chance of colliding with someone who isn't than just dropping the TLD.

In a different vein, I used to give an unusual name for restaurant waitlists, but often when they called out 'Archibald' I wasn't attuned to it like I was to 'Dan', and more often than not I missed my seating. Similarly, if you use a domain that someone else owns as your package name, you're looking for trouble that you don't really need. I mean, really, isn't software hard enough?

Posted by moore at 10:07 AM | Comments (0)

April 04, 2005

J2ME Article reposted

My J2ME article, previously posted and discussed at The Server Side, is now available on this website, sans TSS CSS.

Posted by moore at 09:27 PM | Comments (0)

March 03, 2005

Quartz and java job scheduling

I'm working on a standalone Java application that needs some fairly sophisticated scheduling capabilities, more than java.util.Timer can provide. Normally, I'd reach for trusty old cron, but in this case, it's a Java program that needs to run on both Unix and Windows with a minimum of fuss.

Quartz to the rescue. This open source java package lets you schedule a myriad of executable objects in many different ways. There are many different ways to use Quartz; there's a nice tutorial here, and the Quartz javadoc is pretty up to date.

All I'm using it for is a cross-platform cron replacement, but it does seem to have a large number of other features. The most useful one that I'm not using is the ability to differentiate between the activities to be scheduled (a Job) and the events, be they time or otherwise related, that should cause those activities to be executed (a Trigger). Nice orthogonality, though for my purposes it's overkill.
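The Job/Trigger split looks roughly like this with the Quartz 1.x API (the job class, group names, and cron expression are invented; an untested sketch):

```java
import org.quartz.CronTrigger;
import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.Scheduler;
import org.quartz.impl.StdSchedulerFactory;

public class NightlyReport implements Job {
    // The activity: what to do, with no knowledge of when it runs.
    public void execute(JobExecutionContext context) throws JobExecutionException {
        System.out.println("running nightly report");
    }

    public static void main(String[] args) throws Exception {
        Scheduler scheduler = new StdSchedulerFactory().getScheduler();
        JobDetail job = new JobDetail("report", "reports", NightlyReport.class);
        // The event: a cron-style trigger firing at 2:30 AM every day.
        CronTrigger trigger = new CronTrigger("reportTrigger", "reports", "0 30 2 * * ?");
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}
```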

The coolest feature that I am using is JobExecutionContext.getScheduler(), which allows any running job to access the scheduler in which it runs. Jobs can delete themselves, verify that other jobs are working, and even shut down the scheduler.

If you have to do scheduling in java, you should take a look at Quartz. (Here is a survey of job scheduling options in Java.)

Posted by moore at 03:31 PM | Comments (1)

February 28, 2005

Two Expresso Good Practices

I've been working with Expresso 5.5 for the last couple of months. Two things I've learned, though they certainly don't qualify as 'best practices':

1. Expresso provides a nice way to manipulate the model by setting certain criteria and then retrieving all the rows that match those criteria. However, this can be abused.

For example, here we look up all the Indo-European languages:

MyLangDBObject lookup = new MyLangDBObject();
lookup.setField("FAMILY", "Indo-European");
for (Iterator i = lookup.searchAndRetrieveList().iterator(); i.hasNext(); ) {
   MyLangDBObject instance = (MyLangDBObject) i.next();
   // process instance
}

(For more, see Chapter 6 of the Expresso Developer's Guide.)

This is all well and good, as long as there are not a significant number of objects that match your criteria. Because each object is retrieved from the database and plunked into an ArrayList, this method can use a tremendous amount of memory. A more memory efficient method of retrieving and processing a large number of rows is:

MyLangDBObject lookup = new MyLangDBObject();
lookup.setField("FAMILY", "Indo-European");
lookup.search();
Object[] keys = lookup.getFoundKeys();
MyLangDBObject instance = new MyLangDBObject();
for (int i = 0; i < keys.length; i++) {
   String key = keys[i].toString();
   instance.clear();
   instance.setField("ID", key);
   instance.retrieve();
   // process instance
}

The above code still builds a large array of keys, but each entry in that array is much smaller than a full object. I'm not sure how to treat objects with multi-valued keys. I just looked in the Expresso 5.5 DBObject class, and it looks like multiple keys are concatenated with '/' and returned as a single string; beware, as that's not documented anywhere and I haven't tested it.

2. When you're doing complicated filtering, DBObjects let you add a number of 'and' clauses. For example, this code finds all the dead Indo-European languages from Asia:

MyLangDBObject lookup = new MyLangDBObject();
lookup.setField("FAMILY", "Indo-European");
lookup.setField("GEOGRAPHIC_AREA", "Asia");
lookup.setField("TYPE", "dead");
for (Iterator i = lookup.searchAndRetrieveList().iterator(); i.hasNext(); ) {
   MyLangDBObject instance = (MyLangDBObject) i.next();
   // process instance
}

This approach works well for quite a number of cases. However, if you want to do anything more complicated, such as date ranges or 'or' rather than 'and' clauses, you have three options.

* You can call setCustomWhereClause(). This allows you to escape the abstraction and essentially drops you down to SQL. All well and good; this should probably be your primary means of doing more complicated filtering. (Unfortunately, in Expresso 5.5, JoinedDataObjects, an Expresso construct which joins multiple tables together and presents a unified view thereof, do not support the setCustomWhereClause method. Apparently Expresso 5.6 has added such support.) This code finds all the languages that are dead or are Indo-European:

MyLangDBObject lookup = new MyLangDBObject();
lookup.setCustomWhereClause(
   "FAMILY = \"Indo-European\" OR TYPE = \"dead\"");
for (Iterator i = lookup.searchAndRetrieveList().iterator(); i.hasNext(); ) {
   MyLangDBObject instance = (MyLangDBObject) i.next();
   // process instance
}

* You can pull back the data and filter on it in the middleware server. This is a bad idea, since you're not only using java where SQL would be better used, you're also pulling back unneeded data. However, it is an option that will always work, though it may be slow. For example, if the setCustomWhereClause did not work, you could replicate the above example via this code:

MyLangDBObject lookup = new MyLangDBObject();
for (Iterator i = lookup.searchAndRetrieveList().iterator(); i.hasNext(); ) {
   MyLangDBObject instance = (MyLangDBObject) i.next();
   if (! ("Indo-European".equals(instance.getField("FAMILY"))||
      "dead".equals(instance.getField("TYPE"))
      ) ) {
         continue;
   }
   // process instance
}

* You can create a view and point the database object at the view instead of at the underlying tables. This is probably the cleanest, fastest method for a complicated where clause with read only data, since no unneeded data is returned by the database. This works for JoinedDataObjects as well. If you are making updates, however, views may or may not work.
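For the dead-or-Indo-European example, such a view might look like the following (the table and column names are invented to match the hypothetical MyLangDBObject):

```sql
-- hypothetical table and columns matching the MyLangDBObject examples
CREATE VIEW dead_or_indoeuropean_langs AS
SELECT *
  FROM my_lang
 WHERE family = 'Indo-European'
    OR type = 'dead';
```

You'd then point the database object at dead_or_indoeuropean_langs instead of my_lang, and the filtering happens entirely in the database.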

Posted by moore at 10:46 AM | Comments (0)

February 25, 2005

Setting the content encoding for HTML message parts with Javamail

I spent an hour chasing down the solution to this issue, so I figured I'd post what worked for me. Basically, I have a multi-part message that can have different content encodings for each text part, and I want to send this message via JavaMail. Now, there's support for setting content as type 'text/plain' with a different character set, but if you want to add a part that is a different subtype of text to your message, there is no convenience method. However, this mail message had an example of how to specify HTML content and a character set:

MimeBodyPart htmltext = new MimeBodyPart();
htmltext.setContent(someDanishChars, "text/html; charset=\"ISO-8859-1\"");

(The author had some issues with this method in different app servers; it works fine for me in a stand alone java program.)
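Putting it together, here's a sketch of a complete message with both a plain and an HTML part, each declaring its own charset (the session setup, addresses, and Danish sample text are all illustrative; untested here):

```java
import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeBodyPart;
import javax.mail.internet.MimeMessage;
import javax.mail.internet.MimeMultipart;

public class HtmlMail {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("mail.smtp.host", "localhost"); // illustrative SMTP host
        Session session = Session.getInstance(props);

        MimeMessage msg = new MimeMessage(session);
        msg.setFrom(new InternetAddress("sender@example.com"));
        msg.setRecipient(Message.RecipientType.TO, new InternetAddress("rcpt@example.com"));
        msg.setSubject("test");

        // "alternative" tells the mail client to render the best part it can.
        MimeMultipart multipart = new MimeMultipart("alternative");

        MimeBodyPart plain = new MimeBodyPart();
        plain.setText("Hej verden", "ISO-8859-1"); // text/plain convenience method

        MimeBodyPart html = new MimeBodyPart();
        // No convenience method for text/html, so set the full content type.
        html.setContent("<p>Hej verden</p>", "text/html; charset=\"ISO-8859-1\"");

        multipart.addBodyPart(plain);
        multipart.addBodyPart(html);
        msg.setContent(multipart);

        Transport.send(msg);
    }
}
```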

These additional parameters to the 'Content-Type' header are laid out, for text documents, in section 4.1 of RFC 2046. Here's a collection of helpful email related RFCs. Additionally, section 5.1 of RFC 2045 outlines the way to add parameters and gives examples of the charset parameters.

Posted by moore at 01:18 PM | Comments (0)

February 22, 2005

Runtime log4j configuration

So, I've spent the last day or so trying to track down how to configure log4j at runtime (log4j 1.2.8). Now, there are some things that are easy: setting the level of the root logger is as easy as: LogManager.getRootLogger().setLevel((Level) Level.DEBUG). However, if you want to do more complicated things at runtime based on other inputs than the log4j.{properties,xml} file, things begin to get a bit kludgy. For example, I wanted to set up a set of appenders with sane defaults. Then, if values were present in a configuration file, I wanted to update those appenders with different configuration values and change the root logger's behavior.

The easiest way I could find was to manipulate the properties file, as shown below:
package test;

import org.apache.log4j.*;
import org.apache.log4j.net.SMTPAppender;
import org.apache.log4j.net.SyslogAppender;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import java.util.Properties;
import java.io.*;

public class Test {
   private Log log = LogFactory.getLog(Test.class);
   Test() {
      log.debug("test1");
      switchAppenders();
      log.debug("test2");
   }
   public static void main(String args[]) {
      Test t = new Test();
   }
   private void switchAppenders() {
      Properties props = new Properties();
      try {
           InputStream configStream = getClass().getResourceAsStream("/log4j.properties");
           props.load(configStream);
           configStream.close();
      } catch(IOException e) {
          System.out.println("Error: Cannot load configuration file");
      }
      props.setProperty("log4j.rootLogger","DEBUG, file");
      props.setProperty("log4j.appender.file.File","out.log");
      LogManager.resetConfiguration();
      PropertyConfigurator.configure(props);
     }
}

This code is executed via this command, making sure that log4j.properties is present in the classpath: java -classpath .:log4j-1.2.8.jar:commons-logging.jar test.Test

This is quite a kludge, but I couldn't find anything better out there. It has the obvious setbacks that the changes you make to the log4j configuration aren't persisted, nor can they easily happen in more than one place, and any changes to appender names break a lot of things, but at least it works.

Posted by moore at 01:33 PM | Comments (0)

February 14, 2005

JMS at the most recent BJUG

I went to BJUG last Thursday, and enjoyed the informative talk about JMS by Chris Huston. It started out as a bit of a tutorial, with the typical "here's a messaging system, here are the six types of messages, etc." When he was doing the tutorial bit, I thought it was a bit simple for a main talk, but it got better as the speaker continued. It was clear from the speaker's comments that he was deeply knowledgeable in the subject, or, if not that, at least had been enmeshed in JMS for a while. This wasn't just an "I downloaded an open source JMS server and ran through the Sun tutorial" talk, and I appreciated that.

I had a couple of take aways. One is that managing messaging with transactions is something that you're always going to want to do, but this is fraught with difficulty, since you'll then have two transactional systems. And we all know what that means; you'll have to buy this book. It also means that, in a real system, you'll never want to use local transactions, as you'll want the transactions to be managed by a global transaction service, typically your application server.

Recovery of such a transactional messaging service was touched upon. If you have two different transactional systems, and failure occurs, recovering can be a real issue. Chris recommended, if at all possible, having the JMS provider and your data layer live in the same database, as then you can use the backup tools and ensure the two systems are in a consistent state.

One of the most interesting parts of the evening was a question asked by the audience. A fellow asked what scenarios JMS was useful for, and Chris said it was typically used in two ways:

1. Clustering/failover. You can set up a large number of machines and since all they are getting is messages with no context, it's much easier to fail over to another machine. There's no state to transfer.

I've seen this in the Jetspeed 1.5 project, where messaging is used to allow clustering.

2. Handling a large amount of data while increasing the responsiveness of the system. Since messages can be placed into queues, with no need for immediate response, it's possible for a message source to create a tremendous number of messages very quickly. These messages may take quite a bit of time to process, and this rules out a synchronous solution. JMS (and messaging solutions in general) allow hysteresis.

I've seen this in a client's system, where they send out a tremendous number of emails and want to ensure they can track the status of each one. It's too slow to write the status to the database for each email, but sending a message to a queue is quick enough. On the receiving end, there's some processing and status is written to the database. The performance is acceptable and as long as the JMS provider doesn't crash or run out of memory, no messages are lost.
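A sketch of that status-queue pattern with the JMS 1.1 API (the JNDI names and message property are hypothetical; untested):

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class StatusSender {
    // Push an email-status message onto a queue instead of writing the status
    // row to the database synchronously; a consumer writes it later.
    public static void send(String emailId, String status) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/EmailStatusQueue");
        Connection conn = factory.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);
            TextMessage msg = session.createTextMessage(status);
            msg.setStringProperty("emailId", emailId);
            producer.send(msg);
        } finally {
            conn.close();
        }
    }
}
```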

The only scenario that I thought of that Chris didn't mention is one that I haven't seen. But I've heard that many legacy systems have some kind of messaging interface, and so JMS might be an easy way (again, no context required) to integrate such a system.

It was an interesting talk, and reminded me why I need to go to more BJUGs.

Posted by moore at 11:07 AM | Comments (0)

February 03, 2005

Networked J2ME Application article up at TSS

An article I wrote about Networked J2ME applications is up at TheServerSide.com. This was based on the talk I gave last year.

Posted by moore at 10:52 AM | Comments (0)

December 10, 2004

Useful tools: javap

javap lets you examine java class files and jar files in a number of ways. See this web page for more information. For me, it's an API reference. I use it in two ways:

1. When I'm coding, and I need to know the exact syntax of a method, I shell out: javap java.util.StringTokenizer. (Yes, I know that any modern IDE will do this for you without shelling out, but javap will work anywhere java is installed and with any editing tool. You trade convenience for portability.) One large catch is that inherited methods are not shown:

$ javap java.io.BufferedReader
Compiled from "BufferedReader.java"
public class java.io.BufferedReader extends java.io.Reader{
    public int read();
       throws java/io/IOException
    static {};
    public void close();
       throws java/io/IOException
    public void reset();
       throws java/io/IOException
    public boolean markSupported();
    public boolean ready();
       throws java/io/IOException
    public void mark(int);
       throws java/io/IOException
    public long skip(long);
       throws java/io/IOException
    public int read(char[],int,int);
       throws java/io/IOException
    public java.io.BufferedReader(java.io.Reader);
    public java.io.BufferedReader(java.io.Reader,int);
    public java.lang.String readLine();
       throws java/io/IOException
    java.lang.String readLine(boolean);
       throws java/io/IOException
}

Running javap on java.io.BufferedReader does not show the method read(char[]), inherited from java.io.Reader. (This example is from the J2SE 1.4 libraries.)

2. Sometimes, the javadoc is too up-to-date (or your jar files are too old) to answer questions about an API. For example, I'm working on a project with Jetspeed which depends on Turbine version 2.2. Unfortunately, this is an extremely old version of Turbine (released 16-Aug-2003), and the javadoc doesn't appear to be available. (Updated Dec 11: It looks like the Turbine 2.2 javadoc is indeed online. Whoops.) Generating the javadoc with ant is certainly a possibility, and if I found myself going time and again to verify the API of Turbine 2.2, I'd do that. But for a quick one- or two-off question about an API that no web search turns up, javap can be very handy.

In short, if you have a quick question about an API, javap can help you out.

Posted by moore at 06:45 PM | Comments (0)

November 27, 2004

Jetspeed 1 and 2

Here's an article about Jetspeed 2. I'm actually very excited about this, though I feel that compliance with the spec is overrated. Perhaps I'm not really an enterprise developer, but I haven't seen very many situations where I wanted a portlet (or an EJB) that had been developed on one portal server and deployed to another. But some of the Spring oriented features and the ability to deploy Struts application war files as portlets seem pretty neat and useful.

I'm currently working on a portal application, and we're using Jetspeed 1--version 1.5, which has extensive documentation and has actually been released. JS1 is built on the Turbine framework. I hadn't done much looking at this (other than a glance when I was looking at Torque for an OR layer). But so far, I've been really really impressed with Jetspeed and Turbine.

What has most impressed me is the configurability of these frameworks. There are a set of properties files that specify services, things like logging, persistence, localization, session validation, and authentication. The developers have done a great job of breaking these up into well defined chunks and letting me subclass services and plug in my own implementations. One example: our users do not login to the portal. Rather, they are authenticated by another application, which sets a cookie. I was able to disable Jetspeed's own authentication system (which looks to a database) and plug in mine without making any modifications to source code. With properties overriding, I didn't even have to modify the default properties files. Fantastic.

Jetspeed supports both JSP and Velocity as templating languages for the view. Velocity is used throughout the default portal, and I decided to use it as well. It's an interesting language which has a lot of the benefits I've read about in Groovy--there are no types, and methods and properties look the same. It does look a bit perlish, I'll admit--lots of $ and #--but it's been fun to learn a different view language.

JS1 also provides an easy framework for developing portlets--an MVC framework, in fact. I don't want to repeat what the relevant section of the portlet tutorial says, so I'll just mention that developing dynamic content is a breeze.

I don't want to say that Jetspeed has been a pure joy, however. There were a few days a couple of weeks ago that I wondered whether it had been a good choice at all. We have an existing base of users (hence the alternate authentication) and were noticing that the portal was loading slowly when running against a user table of around 100K users. Then I ran some tests with The Grinder and noticed it was running really slowly. Like 30 seconds to render a page with 5 portlets. Luckily, some sleuthing indicated that there was much extraneous database access going on, and when that was eliminated, performance became acceptable.

In addition, the default build process is mavenized. I had an extremely bad experience and ended up writing a simple ant build script to do what I want. It'd be nice if both were options--since lots of people come to JS1 looking to slap a portal together for cheap, rather than thrash around with a complex build process. (I'd say at least forty percent of the messages to the mailing list are build related--not a good sign.) I just went to a BJUG talk about Maven, and Thompson was persuasive, so I might give maven another chance.

All in all, though, I've been very happy with Jetspeed. I am looking forward to JS2; I wish the timing had been better so I could have used it on the current project.

Posted by moore at 01:41 PM | Comments (0) | TrackBack

November 23, 2004

Useful tools: p6spy

This entry kicks off a series of entries where I'll examine some of my favorite tools for development. Some of them will be long, some short, but all of them will highlight software I use to make my life a bit easier.

A large, large chunk of the development I do is taking data from a relational database to an HTML screen, and back again. Often there are business rules for transforming the data, or validation rules, but making sure the data is stored safely and consistently is a high priority, and that means a relational database.

However, I do much of my work in java, which means that the relational-OO impedance mismatch is a common problem. One common way to deal with it is to use an OR tool--something like OJB or JDO. These tools provide object models of your database tables, usually with some help from you. You then have the freedom to pretend like your database doesn't exist, and use these objects in your application. The OR framework takes care of the dirty work like SQL updates and caching.

Every convenience has its price, however, and OR mapping tools are no exception. The same abstraction that lets you pretend that you're simply dealing with objects means that you cannot easily examine the SQL that is generated. In addition, the way that you're using the objects may cause performance issues, because you're treating the data as objects, rather than rows.

It's much the same issue as calling methods over the network via RMI or accessing files via NFS: the abstraction is great and means that programmers don't have to think about the consequences of remote access. But the failure of the abstraction can be catastrophic, all the more so because the programmer was not expecting to have to deal with the grotty details under the abstraction (that's the whole point, right?).

OR tools do not fail often, or have many catastrophic failure modes, but they sure can be slow. With open source software, you can dig around and see how SQL is being generated, but that's tedious and time consuming. With commercial products, you don't even have that option. (Some OR tools may have their own 'Show me the SQL' switch--I haven't run into them.)

Enter p6spy. p6spy can be used in place of any JDBC driver. You point it to the real driver and it passes on any configuration or SQL calls to that driver. But p6spy logs every SQL statement passed to it and every result set passed back. (A fine non object oriented example of the Decorator pattern.)
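Setup is two small changes, as I recall: swap your JDBC driver class for p6spy's, and tell p6spy about the real driver in spy.properties on the classpath (the Oracle driver here is just an example):

```properties
# In your app's JDBC configuration, replace the real driver class:
#   driver=oracle.jdbc.driver.OracleDriver   becomes
#   driver=com.p6spy.engine.spy.P6SpyDriver
#
# Then in spy.properties, point p6spy at the real driver:
realdriver=oracle.jdbc.driver.OracleDriver
# Log every statement and result set to a file:
appender=com.p6spy.engine.logging.appender.FileLogger
logfile=spy.log
```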

It took me about 15 minutes to figure out how to use p6spy, the software is open source with decent documentation, the latest version has data source support, and it scratches an itch that most, if not all, java developers will have at some time. With p6spy, you can find out what that OR tool is doing under the covers--it's an easy way to peel back some of the abstraction if needed.

Posted by moore at 11:36 PM | Comments (0) | TrackBack

October 26, 2004

Oracle JDBC chapter online

There's a fascinating chapter of Java Programming with Oracle JDBC available online. What I find most interesting is that (as of 2001) the thin driver is good enough to use everywhere--in fact, in most of the cases examined, it outperforms the OCI (type 2) driver. In addition, it typically takes tens of repeated calls to a given PreparedStatement before using the PreparedStatement is faster than a regular Statement.

Posted by moore at 02:29 PM | Comments (0) | TrackBack

October 05, 2004

Expresso authentication and authorization

I've only briefly worked with Expresso, but I've heard good things about it. However, one 'feature' is really chapping my hide at the moment. Apparently, the only way to authenticate someone is to call the attemptLogin method on a 'Controller' object (a subclass of a Struts Action), which is protected and takes, among other things, the http request and response. There's no way I can find to just pass in a username/password and authenticate. The authorization system is not broken out either: in OO fashion, you ask an object whether a user can access it, and the object knows enough to reply.

I'm not trying to rag on the Expresso developers. After all, they are giving away a fine, full featured java web framework for free. But this just drove home to me how important it is in web applications to have the classes that talk http be nothing more than a thin translating layer around business classes. For instance, all a struts action should do is convert http forms to domain specific value objects, and then call business methods on business objects.

If this were the case in Expresso, it'd be trivial for me to leverage Expresso's existing authentication model--I'd just have to call the methods on the business object, perhaps after creating a domain specific value object. Now, however, I'll probably have to monkey around with the http request and response, decode exactly what parameters it wants, and fake those up.
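The thin-translating-layer idea can be sketched in a few lines of plain Java. All the names below are hypothetical--this is the shape of the separation, not Expresso's or Struts' actual API. The web-tier class only converts raw parameters into a value object; the business object never sees http.

```java
import java.util.HashMap;
import java.util.Map;

// Domain-specific value object: just data, no http.
class Credentials {
    final String username;
    final String password;
    Credentials(String username, String password) {
        this.username = username;
        this.password = password;
    }
}

// Business object: authentication logic lives here, reusable from any caller.
class AuthService {
    boolean authenticate(Credentials c) {
        // Stand-in check; a real implementation would hit a user store.
        return "bob".equals(c.username) && "secret".equals(c.password);
    }
}

// The web-tier class is a thin translator around the business object.
// In a real Struts action, the Map would be built from the HttpServletRequest.
public class LoginAction {
    private final AuthService auth = new AuthService();

    public boolean execute(Map<String, String> params) {
        Credentials c = new Credentials(params.get("username"), params.get("password"));
        return auth.authenticate(c);
    }

    public static void main(String[] args) {
        Map<String, String> form = new HashMap<>();
        form.put("username", "bob");
        form.put("password", "secret");
        System.out.println("authenticated: " + new LoginAction().execute(form));
    }
}
```

With this split, a batch job or web service could call AuthService directly, without faking up a request and response.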

Posted by moore at 09:02 AM | Comments (1)

Open source portal search

I've been looking at some open source portals. My client has an existing java application, written in Expresso that has some reasonably complex logic embedded in it. Additionally, it's massively internationalized, with dynamic international content coming from a database, and static content coming from a set of resource bundles. There's an existing process around updating both of these sets of data. And when we're talking internationalization, we're talking Asian character sets as well as the European character sets.

So, the criteria for the portal were:

1. Support for multi-byte character sets and easy localization.

2. Ability to integrate with Expresso's authentication and authorization systems.

3. Support for normal portal features--adding/moving/removing portlets, minimize/maximize portlets.

4. Documentation.

I looked at a fair number of portals, including jcorporate's own ePortal, eXo, Liferay, Jetspeed 1, Jetspeed 2, and Pluto (a last alternative, to be avoided if possible, is to roll our own portal-like application). First, I looked at ePortal, but that's a dead project, with no releases. Then, I installed pluto, which seemed like it would be a perfect fit to be integrated into Expresso. However, integrating pluto looked complex, and after installing it (fantastic instructions for installing pluto here), I realized that pluto did not have a layout manager that would allow for the addition, rearranging or moving of portlets.

I then battled with Jetspeed 2, which involved installing a subversion client and building from source. This looked to be pretty cool, but the sheer lack of documentation, and the fact that there have been no releases, caused me to shy off. This is no failure of Jetspeed 2--this is what projects in development are like; I think it will be a fine project when done but my client just doesn't need to be on the bleeding edge. I also took a quick look at Liferay, which seems to be a much more full featured portal application than we needed. After reading this blog on portals I decided to take a closer look at eXo. However, the documentation wasn't fantastic, and it wasn't readily apparent how to plug in authentication.

I also downloaded and installed Jetspeed 1; if you download the src distribution, you get the helpful tutorial. While Jetspeed 1 is not a standards based solution (I expect most of the portlets will be custom developed anyway), the user community is fairly active, as indicated by the mailing list, and I've found the documentation to be extensive. In addition, it meets the criterion of pluggable authentication and authorization systems.

I'm less than thrilled about having to use maven for builds. Others have said it better than I, but it's just too much for my needs. However, I was able to get an independent directory tree for my project by copying over the maven.xml, project.properties, and project.xml from the tutorial directory to an empty directory. Then I tweaked the project.* files, ran maven jetspeed:genapp, tweaked a few settings in TurbineResources.properties to make sure the localization settings were correct, and, voila, I have a working project tree, that, using the Jetspeed maven plugin, is one command away from a deployable war file.

Posted by moore at 08:25 AM | Comments (0)

August 24, 2004

OJB and object caching, pt II

Well, I was wrong when I posted that OJB rc4 didn't support caching. Because of the way the application is architected, there are two places where we grab data from the database. I know, I know, don't repeat yourself. But when you're using free JAAS modules and free O/R mapping tools, you can't be too picky.

The upshot is, when I actually look at the SQL statements for a typical two user session, I see 21 instances of a certain select statement with the (non-caching) org.apache.ojb.broker.cache.ObjectCacheEmptyImpl class, and only 6 when performing exactly the same user actions with the org.apache.ojb.broker.cache.ObjectCacheDefaultImpl class. Don't ask me why it's not a 2 to 1 ratio; I'm looking into it. (Deep are the ways of object caching.)
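Switching between the two implementations is a one-line change. A sketch of the relevant setting (the property name is from the OJB 1.0 docs as I remember them--verify against the OJB.properties in your release):

```properties
# OJB.properties
ObjectCacheClass=org.apache.ojb.broker.cache.ObjectCacheDefaultImpl
# To turn caching off entirely (every lookup hits the database), use:
#ObjectCacheClass=org.apache.ojb.broker.cache.ObjectCacheEmptyImpl
```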

Posted by moore at 06:28 PM | Comments (0) | TrackBack

August 15, 2004

Book Review: Java Transaction Processing

Since many financial institutions have standardized on it, I hear Java is the new COBOL. Whether or not this is true, if Java is to become the business language of choice, transaction support is crucial. (By 'transaction,' I mean 'allowing two or more decisions to be made under ACID constraints: atomically, consistently, (as) in isolation and durably'.) Over the last five years, the Java platform has grown by leaps and bounds, not least in this area.

Java Transaction Processing by Mark Little, Jon Maron and Greg Pavlik, explores transactions and their relationship with the Java language and libraries. Starting with basic concepts of transactions, both local and distributed, including the roles of participant and coordinator, and the idea of transaction context, the book covers much old but useful ground. Then, by covering the Java Transaction API (JTA) as well as OTS, the OMG's transaction API which is JTA's foundation, this book provides a solid understanding of the complexities of transactions for Java programmers who haven't dealt with anything more complex than a single RDBMS. I'd say these complexities could be summed up simply: failures happen; how can you deal with them reliably and quickly?

The book then goes on to examine transactions and the part they play in major J2EE APIs: Java Database Connectivity (JDBC), Java Message Service (JMS), Enterprise Java Beans (EJB) and J2EE Connector Architecture (JCA). These chapters were interesting overviews of these technologies, and would be sufficient to begin programming in them. However, they are complex, and a single chapter certainly can't do justice to any of the APIs. If you're new to them, expect to buy another book.

In the last section, the authors discuss the future of transactions, especially long running activities (the Java Activity Service) and web services. This was the most interesting section to me, but also is the most likely to age poorly. These technologies are all still under development; the basic concepts, however, seem likely to remain useful for some time. And, if you need to decide on a web service transaction API yesterday, don't build your own, read chapter 10.

There were some things I didn't like about Java Transaction Processing. Some of the editing was sloppy—periods or words missing. This wasn't too big a problem for me, since the publisher provided me a free copy for review, but if I were paying list price ($50) I'd be a bit miffed. A larger annoyance was incorrect UML and Java code snippets. Again, the meaning can be figured out from the text, but it's a bit frustrating. Finally, while the authors raise some very valid points about trusting, or not, the transaction system software provider, I felt the constant trumpeting of HP and Arjuna technologies was a bit tedious. Perhaps these companies are on the forefront of Java transactions (possible); perhaps the authors are most familiar with the products of these companies (looking at the biographies, this is likely). The warnings—find out who is writing the transaction software, which is probably at the heart of your business, and how often they've written such software before—were useful, if a bit repetitive.

That said, this book was still a good read, if a bit long (~360 pages). I think that Java Transaction Processing would be especially useful for an enterprise architect looking to leverage existing (expensive) transactional systems with more modern technology, and trying to see how Java and its myriad APIs fit into the mix. (This is what I imagine, because I'm not an enterprise architect.) I also think this book would be useful to DBAs; knowing about the Java APIs and how they deal with transactions would definitely help a DBA discuss software issues with a typical Java developer.

To me, an average Java developer, the first section of the book was the most useful. While transactions are fairly simple to explain (consider the canonical bank account example), this section illuminated complexities I'd not even thought of—optimizations, heuristic outcomes, failure recovery. These issues occur even in fairly simple setups—I'm working at a client who wants to update two databases with different views of the same information, but make sure that both are updated or neither; this seems to be a typical distributed transaction. The easiest way to deal with this is to pretend that such updates will always be successful, and then accept small discrepancies. That's fine with click-throughs—money is a different matter.

However, if you are a typical web developer, I'm not sure this book is worth the price. I would borrow it from your company's enterprise architect, as reading it will make you a better programmer (as well as giving you a sense of history—transactions have been around for a long time). But, after digesting fundamental distributed transaction concepts, I won't be referencing this book anytime soon, since the scenarios simply don't happen that often (and when they do, they're often ignored, as outlined above).

Posted by moore at 03:03 PM | Comments (1)

August 08, 2004

Decreasing the size of a midlet jar

The J2ME application I have been working on has been ready for testing for quite some time, but I didn't want to get a new AT&T phone. For J2ME, you really need a GSM phone--I don't think any of the older TDMA models support it. But the GSM network coverage doesn't match the coverage of the TDMA network--especially out west (aside: isn't that magnifying glass pretty cool?). So I put off buying a phone until my summer road tripping was done.

I've had a Nokia 6160 for almost 4 years. Even though friends mocked the size of it, it was a great phone--durable, good talk time. I thought I'd try another Nokia, and got one of the lower end GSM phones, the 6200. This supported J2ME, and weighed maybe half as much. I was all stoked to try the application on my brand new phone.

I started downloading the jad file, and got 'File Too Large' errors. A couple of searches later, I found Nokia's developer device matrix which is much more useful than the User Guide or the customer facing description of phones. Whoops. Most of the Series 40 (read: affordable) Nokia devices only supported J2ME applications which were, when jarred up, less than 64K in size.

Our application, however, was about 78K. This highlights one of the differences between J2ME and J2SE/J2EE. When coding in the latter world, I was never concerned about code size--getting the job done quickly was paramount, and if I needed to use 13 libraries which bloated the final size of my application, I did. On a cell phone, however, there's no appeal to adding memory or changing the JVM configuration to optimize memory use. If the Nokia phone only accepts jars of 64K or less, I had three options:

1. Write off the Nokia Series 40 platform. Ugh--I like Nokias, and other folks do too.

2. Do some kind of magic URL classloading. This seemed complicated and I wasn't sure how to do it.

3. Decrease the size of the jar file.

Now, the 78K jar had already been run through an obfuscator. I wasn't going to get any quick and easy gains from automated software. I posted a question on the JavaRanch J2ME forum and received some useful replies. Here's the sequence I went through:

1. Original size of the application: 79884 bytes.

2. Removal of extra, unused classes: 79881. You can see that the obfuscator did a good job of winnowing out unused classes without my help.

3. Changed all the data objects (5-6 classes), which had been written in classic J2SE style with getters and setters for their properties, to have public variables instead: 79465

4. Combined 3 of the data object classes into one generic class: 78868

5. Combined 5 networking classes into 2: 74543

6. Removed all the logging statements: 66044. (Perl to the rescue--$ perl -p -i -e 's!Log\.!//Log.!' `find . -name "*.java" -print |xargs grep -l 'Log\.'`)

7. Next, I played around with the jode obfuscator which Michael Yuan recommended. I was able to radically decrease the size of the jar file, but, unfortunately, that jar file didn't work on the phone. I also got a ton of exceptions:

java.util.NoSuchElementException
        at jode.bytecode.BytecodeInfo$1.next(BytecodeInfo.java:123)
        at jode.obfuscator.modules.LocalOptimizer.calcLocalInfo(LocalOptimizer.java:370)
        at jode.obfuscator.modules.LocalOptimizer.transformCode(LocalOptimizer.java:916)
        at jode.obfuscator.MethodIdentifier.doTransformations(MethodIdentifier.java:175)
        at jode.obfuscator.ClassIdentifier.doTransformations(ClassIdentifier.java:659)
        at jode.obfuscator.PackageIdentifier.doTransformations(PackageIdentifier.java:320)
        at jode.obfuscator.PackageIdentifier.doTransformations(PackageIdentifier.java:322)
        at jode.obfuscator.PackageIdentifier.doTransformations(PackageIdentifier.java:322)
        at jode.obfuscator.PackageIdentifier.doTransformations(PackageIdentifier.java:322)
        at jode.obfuscator.ClassBundle.doTransformations(ClassBundle.java:421)
        at jode.obfuscator.ClassBundle.run(ClassBundle.java:526)
        at jode.obfuscator.Main.main(Main.java:189)

I'm sure I just didn't use it right, but the jar file size was so close to the limit that I abandoned jode.

8. Instead, I put all the classes in one file (perl to the rescue, again) and compiled that: 64057 bytes. The jar now downloads and works on my Nokia 6200 phone.

When I have to do this again, I'll definitely focus on condensing classes, basically replacing polymorphism with if statements. After removing extraneous Strings and concatenating all your classes into one .java file (both of which are one-time wins), condensing classes is the biggest bang for your buck.
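The polymorphism-to-if-statements trade looks something like this (hypothetical names; each subclass you eliminate is one less .class file and one less constant pool in the jar):

```java
// Before: an abstract Message base class with Text, Binary and Ping
// subclasses, each overriding encode(). After: one class with a type
// field and an if/else chain -- uglier, but a single .class file.
public class CondensedMessage {
    static final int TEXT = 0, BINARY = 1, PING = 2;

    final int type;
    final String payload;

    CondensedMessage(int type, String payload) {
        this.type = type;
        this.payload = payload;
    }

    // One method with branches replaces three overridden methods.
    String encode() {
        if (type == TEXT)   return "T:" + payload;
        if (type == BINARY) return "B:" + payload.length();
        return "P";  // PING carries no payload
    }

    public static void main(String[] args) {
        System.out.println(new CondensedMessage(TEXT, "hello").encode());
    }
}
```

You give up compile-time checking that every message type handles every operation, which is exactly the kind of trade J2ME forces on you.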

Posted by moore at 09:15 AM | Comments (0) | TrackBack

July 29, 2004

JMS first impressions

I'm currently working on a project using JMS. It's my first experience with messaging, though I've read a bit about it; back in the dot.com boom, you could get the O'Reilly JMS book just for giving up demographic data (thanks Sonic!). However, this project is using JBossMQ, sending TextMessages containing XML into a number of Queues. I've integrated message sending into existing client applications. These applications come under heavy load periodically, so we wanted to make producing messages as simple as possible. The XML documents that these applications create are simply well-formed, and typically small. The consumers of these messages, on the other hand, undertake some fairly slow activities: they do some data massaging, and update or insert into multiple databases. Therefore, the consumers process messages asynchronously.

Such behavior demonstrates the strength of messaging (outlined most eloquently here): because of the decoupling between the producer of the message and the consumer, some objects can do "stuff" at a different rate than the other objects which are needed to finish handling the "stuff". The downside, of course, is the difficulty of receiving any form of confirmation (unlike typical synchronous systems, where, since the calling object blocks until return, passing back values is simple). Enterprise messaging systems, for which JMS simply provides a uniform (and somewhat limited) API, provide some guarantees--at least the producer can rest assured that the message did get to some consumer. This contract means that the producer can throw many, many messages into a queue, confident that even if it takes a long time to parse and handle each message, every one will be passed off to a consumer. Of course, if the rates of message production and consumption are vastly different, there can be problems.
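The rate-decoupling idea can be sketched in plain Java with a BlockingQueue standing in for the JMS queue. This toy gives none of JMS's delivery guarantees--it just shows a fast producer handing off to a slow consumer without waiting for it:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

// A stand-in for a JMS queue between a fast producer and a slow consumer.
public class QueueDemo {

    // Sends `count` messages; returns how many the slow consumer handled.
    public static int run(int count) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(100);
        AtomicInteger handled = new AtomicInteger();

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < count; i++) {
                    queue.take();      // blocks until a message arrives
                    Thread.sleep(20);  // simulate slow database work
                    handled.incrementAndGet();
                }
            } catch (InterruptedException ignored) { }
        });
        consumer.start();

        // The producer returns from put() immediately; it never waits
        // on the consumer's slow processing.
        for (int i = 0; i < count; i++) {
            queue.put("<event id=\"" + i + "\"/>");
        }

        consumer.join();  // only so the demo can report the final total
        return handled.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("handled " + run(5) + " messages");
    }
}
```

A real messaging server adds the parts that matter in production: persistence, acknowledgment, and survival across process restarts.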

(When a component of a typical, synchronous system fails, the caller is left to handle the wreckage. Since messaging systems interpose, when a consumer fails, the producer never finds out or has to deal with it.)

It seems like messaging systems would be great for integrating disparate systems as well--after all, message formats can be entirely arbitrary and don't have to be understood by the messaging server at all. For example, this project has a perl program that was generating quite a bit of interesting data; it would have been nice to put this into a queue for a java program to consume. Unfortunately, the options for having a non-java producer participate in a JMS system are limited:

1. Some implementations provide client APIs for other languages (I saw a posting about SonicMQ and C++). JBossMQ has none that I could find.

2. Perl can call methods on java objects. A bit scary for a production system, and not a solution for producers/consumers written in other languages.

3. You could set up a java service listening on a port that would just take what's given and send a JMS message containing that to a queue. Now you lose much of the robustness of a messaging solution, since you're dependent on this service to make sure your messages get through.

4. Cut out the java service above, and decode the format that JBossMQ is using--since it's listening on a port, and you have access to the JBossMQ source you could probably hack up a client to send a message directly. This would be a maintenance hassle and isn't portable between JMS implementations.

The perl client problem ended up going away because we used a scalable, asynchronous message delivery system--Sendmail. (I wonder whether anyone has ever slapped JMS on top of Sendmail. A quick Google search showed nothing, but it seems like a natural pairing. I'm a bit worried about the reliability of delivery, but I've a sysadmin friend who says that if you control both the beginning and endpoints of a mail system, you can guarantee delivery.) All in all, JMS seems like a clean standard manner in which to enforce separation of concerns and gain a fair amount of robustness.

Posted by moore at 10:21 PM | Comments (1)

July 22, 2004

struts module ClassCastException

If you get an exception like this:
2004-07-21 15:35:06 action: null
java.lang.ClassCastException
        at org.apache.struts.action.ActionServlet.initModulePlugIns(ActionServlet.java:1142)
        at org.apache.struts.action.ActionServlet.init(ActionServlet.java:486)
        at javax.servlet.GenericServlet.init(GenericServlet.java:256)
        at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:918)
        at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:810)
        at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:3279)
        at org.apache.catalina.core.StandardContext.start(StandardContext.java:3421)
        at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1123)
        at org.apache.catalina.core.StandardHost.start(StandardHost.java:638)
        at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1123)
        at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:343)
        at org.apache.catalina.core.StandardService.start(StandardService.java:388)
        at org.apache.catalina.core.StandardServer.start(StandardServer.java:506)
        at org.apache.catalina.startup.Catalina.start(Catalina.java:781)
        at org.apache.catalina.startup.Catalina.execute(Catalina.java:681)
        at org.apache.catalina.startup.Catalina.process(Catalina.java:179)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:324)
        at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:243)

2004-07-21 15:35:06 StandardWrapper[/yourmodule:action]: Marking servlet action as unavailable
2004-07-21 15:35:06 StandardContext[/yourmodule]: Servlet /yourmodule threw load() exception
and you're using struts with modules, make sure that all of the classes referenced in all of the module-level struts-config.xml files are in the classpath of Tomcat.
Posted by moore at 08:36 AM | Comments (0) | TrackBack

July 05, 2004

OJB and object caching

I'm working on a project with ObJectRelationalBridge 1.0RC4. This release is a year old, but it has suited our needs up to now; however, I now have a couple of gripes.

1. The caching online documentation doesn't apply to my version. Hey, RC4 isn't the latest and greatest, so that's fair enough, but it would be nice if it were clear which version the documentation applied to. Once I realized (through a Xerces exception) that this was the case, I was able to download the correct version and view that documentation locally. But what if I'd been using rc1, which doesn't appear to be downloadable anymore? Perhaps I could get documentation from CVS, but this just shows that you really should keep virgin downloads of your external dependencies (code, documentation, whatever) for future reference. Would it be overkill to spider the software's website when you decide to go with a particular version, and store that off somewhere?

2. In OJB 1.0RC4, object caching doesn't seem to work (using the ObjectCacheDefaultImpl class). There wasn't much useful information on the mailing list, but I did a bit of sleuthing on my own. P6Spy is a slick java application that decorates any JDBC driver, writing all the statements that clients are making to a log file, then passing those statements on through to the driver. Installation on tomcat was very easy, and it was enlightening to see what OJB had been up to under the covers: going back to the database to recreate a user object, even though ObjectCacheDefaultImpl has no timeout for cached objects.

I'll probably update the project to the newly released OJB 1.0.0 (don't those numbers strike fear into your heart? I suppose after 7 release candidates over more than a year, 1.0.0 should be pretty solid) and see if object caching works.

Posted by moore at 11:34 PM | Comments (0)

July 02, 2004

More on the difficulty of tuning the JVM

Here's a great overview of the JVM memory model (for all of Sun's JVMs, including the latest changes). I find it intensely interesting that he brushes over what, for me, is the most complicated part of any web application tuning--testing. He outlines a process for initially sizing the various compartments of the JVM heap (for versions 1.4 and below), and then says: 'The [resizing] process continues by "testing and tweaking" until things look good.'
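For a 1.4-era Sun JVM, that initial sizing boils down to a handful of command line switches. The values below are purely illustrative--the right numbers are exactly what the "testing and tweaking" is supposed to produce:

```shell
java -verbose:gc \
  -Xms256m -Xmx256m \
  -XX:NewSize=64m -XX:MaxNewSize=64m \
  -XX:PermSize=32m -XX:MaxPermSize=64m \
  com.example.YourApp
# -Xms/-Xmx: total heap; equal values avoid resize pauses
# -XX:NewSize/MaxNewSize: young generation, where most objects die
# -XX:PermSize/MaxPermSize: classes and interned strings
# -verbose:gc: logs each collection -- the raw data for the "testing" part
```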

Wow. Talk about waving your hands. I did some testing of a web application a few months ago, but when I presented the results, I was very clear that they were guidelines only. I didn't have the resources or ingenuity to replicate the behavior of real users on real clients. A few years ago, I was part of a project that ran aground on this same rock, costing the company I was working for plenty of money. Using software to imitate user behavior is hard. A short list of the differences between software and users:

1. Users get distracted--by popups, their kids, etc. Software, not so much.
2. Users are connecting via a variety of methods, with a wide range of quality levels--modems, broadband.
3. Users don't use applications in the way developers intended. The testing software, on the other hand, is programmed by developers, who naturally have it use the application in the way they intended.

The adaptive memory model for the next JDK (wow, now it's J2SE 5--Sun pulled another Solaris reversioning trick) that the author outlines might make the "tweaking" portion of his hand waving, err, I mean tuning, easier but leaves the "testing" as difficult as ever.

Posted by moore at 10:29 AM | Comments (0) | TrackBack

June 16, 2004

java memory management, oh my!

How much do you understand basic java? Every day I find some part of this language that I'm not aware of, or don't understand. Some days it's cool APIs (like JAI), but today it's concurrency. Now, language managed memory is a feature that's been present in the languages in which I've been programming since I started. I've looked at C and C++, but taking a job coding in those languages seems to me like a job with a long commute--both have obstacles keeping you from getting real work done. (I'm not alone in feeling this way.) But this thread of comments on Cameron Purdy's blog drove home my ignorance. However, the commenters do point out several interesting articles (in particular, this article about double checked locking was useful and made my head hurt at the same time) to alleviate that. I took a class with Tom Cargill a few years back, which included his threading module, that helped a bit.

However, all these complexities are why servlets (and EJBs) are so powerful. As long as you're careful to only use local variables, why, you shouldn't have to worry about threading at all. That's what you use the container for, right? And we all know that containers are bug free, right? And you'd never have to go back and find some isolated thread related defect that affected your code a maddeningly miniscule amount of time, right?

Posted by moore at 11:00 AM | Comments (5) | TrackBack

May 23, 2004

Denver No Fluff Just Stuff

Well, I just got done with two days of the Denver No Fluff Just Stuff conference. First off, unlike the previous NFJS conferences, this one wasn't held in the DTC. You forget how far that is from Boulder, until you drive there and back twice on a weekend.

Anyway, I thought I'd share a few choice thoughts and tidbits regarding some of the sessions I attended. These are by no means full summaries of the talks.

Mock objects--Dave Thomas

Mock objects are objects that emulate behavior of external entities that make testing difficult. (I've worked with a few Englishmen in my life, and Dave Thomas had the same acerbic sense of humor.) Dave illustrated how to choose when to implement a mock object, as opposed to using the real object. He also touched on the true difficulty of mock objects, which is figuring out how to choose which object to use in your class (factory, pass the correct object into the constructor, AOP, class loaders).
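A minimal sketch of the idea, with hypothetical names (this is the general pattern, not code from the talk): the hard-to-test external entity hides behind an interface, and the mock records calls instead of doing real work.

```java
// The external entity that makes testing hard, hidden behind an interface.
interface MailSender {
    void send(String to, String body);
}

// The mock: records what happened instead of actually sending mail.
class MockMailSender implements MailSender {
    int sent = 0;
    String lastRecipient;
    public void send(String to, String body) {
        sent++;
        lastRecipient = to;
    }
}

public class WelcomeService {
    private final MailSender sender;

    // The collaborator is passed into the constructor -- one of the
    // wiring choices (factory, constructor argument, AOP, class loaders)
    // that decides whether the real or mock object gets used.
    public WelcomeService(MailSender sender) {
        this.sender = sender;
    }

    public void welcome(String user) {
        sender.send(user, "Welcome aboard, " + user);
    }

    public static void main(String[] args) {
        MockMailSender mock = new MockMailSender();
        new WelcomeService(mock).welcome("alice@example.com");
        System.out.println("messages sent: " + mock.sent);
    }
}
```

A test then asserts against the mock's recorded state, with no mail server anywhere in sight.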

JSF (both beginning and advanced)--David Geary

JSF is the new standard for web frameworks. David compared it to Swing and Struts meeting in a particle accelerator. Thompson's fussed about tools for JSF, but I don't think they'll be needed for all JSF development, just like tools for Struts help, but aren't required. I think that the most important bit about JSF is that it really tries to treat HTML widgets as full featured GUI components, which is something that is a bit of an adjustment for me. I'm really really used to thinking of HTML interfaces as generated strings, but this higher level approach (which has been used in the rich client world for a long time) is interesting.

There was an expert panel, consisting of several of the speakers. One hot topic was whether EJB 3.0 had been hijacked by Gavin King; everyone seemed to have an opinion on that. However, the choicest statement to emerge was Bruce Tate saying Java's "type safety is an illusion" because everyone who uses a collection casts whenever they take anything out.

Herding Racehorses, Racing Sheep--Dave Thomas

This was a non-technical talk discussing how to improve programming as a profession. He referenced the Dreyfus Model of Skill Acquisition (novices learn differently from experts), and referenced Patricia Benner and her study of nurses in the 1970s, and how it was analogous to the current situation of developers. A great quote was "Training is how you give people knowledge; time is how you give people experience." He also talked about how to move up the skill ladder, and how that will make it more difficult to outsource. However, he didn't talk about how the relative dearth of novices would create a future shortage of experts, other than to acknowledge that everyone, anywhere, can move up the skill ladder and we need to prepare for that. Prepare by having a plan; this makes sense, as what you're really doing is choosing where to invest your most precious commodity--your time.

TDD in the web tier--Rick Hightower

Rick covered the basics of Test Driven Development, and seemed a bit surprised that everyone wasn't practicing it; he said it's helped his code quite a bit. He went over a few tools that make testing (not just unit testing) easier today. A key point seemed to be the differentiation between TDD and Continuous Integration; tests that run for TDD need to be fast, since you're running them multiple times a day, whereas CI tests can be slower. He also made the offhand comment that you could have JMeter proxy requests from a QA tester (in a web browser) and use Canoo (a JSP testing tool) to automate those tests. Wouldn't that be cool?--cheaper than LoadRunner, that's for sure.

Another expert panel. Someone asked "what are you folks going to be looking at in the next 6 months" and I was struck by the lack of diversity in the responses. Groovy, Hibernate, Tapestry came up again and again. Where do the new ideas come from? And where does deep knowledge come from, if everyone is running to a new cool tool every 6-12 months?

An offhand comment that someone made when we were talking about why so many apps had extraneous EJBs: "Yup, that was design by resume."

Appfuse--Rick Hightower

Appfuse is a way to kick start your Struts applications. It provides a large chunk of best practices all in one place, along with a few pages that everyone needs (user creation, user authentication). Its license is liberal enough that you can use the code in your own project. I was struck by how many times Rick mentioned ripping stuff out, but I'm sure that I would learn quite a bit by poking around it. It was also clear to me that AppFuse is great for starting new applications, but I'm not sure it's a good thing (other than a learning tool) for retrofitting best practices to existing applications. Also, Rick mentioned multiple times that he wouldn't use Struts for a new application; given that AppFuse is primarily a Struts starter kit, I was a bit confused by this statement.

GIS--Scott Davis

This was a 1,000 foot overview of (primarily java) GIS applications. There are quite a few tools out there for displaying GIS data, which has several standardized formats (both those formally blessed by a standards organization, and those informal standards that grow out of network effects). There isn't a single collection of open source data sets, but you can get a ton of GIS data from government websites. The satellite that Scott's company owns takes photos of 15GB each, 500 of them a day. Talk about storage needs. Also, anyone who wants to find out a bit more about satellite imaging would do well to read "Private eyes in the sky", an article from the May 4th 2000 edition of the Economist, which is a good overview of the business.

Again, apologies for the jerky nature of my comments above. (Hey, at least I'm not talking about tugging any unmentionables.) Hangovers are not conducive to good note taking, but even if I had been rested, I still couldn't do justice to 90 minutes of expert dialog in a paragraph on my blog. But it's well worth going to one of these conferences, especially if you're doing java web development.

Posted by moore at 10:57 PM | Comments (0) | TrackBack

April 29, 2004

What use is certification?

What good are certifications like the Sun Certified Java Programmer (SCJP) and the Microsoft Certified Systems Engineer programs? Unlike the Cisco certifications, you don't have to renew these every couple of years (at least the Java certifications--in fact, everything I mention below applies only to the Java certifications, as those are the only ones of which I have more than a passing knowledge). I am an SCJP for Java2, and I have an acquaintance who is a certified programmer for Java1.1; a Java1.1 cert isn't very useful unless you're targeting .Net, or writing applets that need to run on most every browser. Yet my acquaintance and I can continue to call ourselves 'Java Certified Programmers.' I realize that there's an upgrade exam, but I've never met a soul who's taken it, and I don't believe I'm prohibited from heading down the Java Certification path and handing Sun more money just because I am not an SCJP for the most recent version of Java. In fact, I'm studying right now for the Sun Certified Web Component Developer (SCWCD) exam and plan to take it sometime this summer. Even though these certifications may be slightly diluted by not requiring renewal, I think there are a number of reasons why they are a good thing:

1. Proof for employers.

Especially when you deal with technologies that are moving fast (granted, changes to Java have slowed down in the past few years, but it's still moving faster than, for example, C++ or SQL), employers may not have the skill set to judge your competence. In any sane environment you will probably interview with folks who are up to date on the technology--but who hasn't been screened out by HR because of a lack of appropriate keywords? Having a certification is certainly no substitute for proper experience, but it serves as a baseline that employers can trust. In addition, a certification is also a concrete example of professional development: always a good thing.

2. Breadth of understanding.

I've been doing server side Java development for web environments for 3 years now, in a variety of business domains and application servers. Now, that's not a long time in programming years, but in web years, that's a fair stint. But, studying for the SCWCD, I'm learning about some aspects of web application development that I hadn't had a chance to examine before. For example, I'm learning about writing tag libraries. (Can you believe that the latest documentation I could find on sun.com about tag libraries was written in 2000?) I was aware of tag libraries, and I'd certainly used them, the struts tags among others, but learning how to implement one has really given me an appreciation for the technology. Ditto for container managed security. Studying for a certification definitely helps increase the breadth of my Java knowledge.

3. Depth of understanding.

Another aspect is an increased depth of understanding; actually reading the JSP specification or finding out what the difference is between overriding and overloading (and how one of them cares about the type of the object, whereas the other cares only about the type of the reference) or in what order static blocks get initialized. (My all time favorite bit of know-how picked up from the SCJP was how to create anonymous arrays.) The knowledge you gain from certification isn't likely to be used all the time, but it may save you when you've got a weird bug in your code. In addition, knowing some of the methods on the core classes saves you from running to the API every time (though, whenever I'm coding, the javadoc is inevitably open). Yeah, yeah, tools can help, but knowing core methods can be quicker (and your brain will always be there, unlike your IDE).
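Those bits of trivia are easy to demonstrate in a few lines. Here's a small sketch (class and method names are mine) showing that overriding dispatches on the object's runtime type while overloading is resolved against the reference's compile-time type, plus an anonymous array thrown in for good measure:

```java
public class DispatchDemo {
    static class Animal {
        String speak() { return "generic noise"; }
    }

    static class Dog extends Animal {
        // Overriding: which speak() runs depends on the object's runtime type.
        String speak() { return "woof"; }
    }

    // Overloading: which describe() is called depends on the reference's
    // compile-time type, decided once at compile time.
    static String describe(Animal a) { return "an animal"; }
    static String describe(Dog d)    { return "a dog"; }

    public static void main(String[] args) {
        Animal pet = new Dog();
        System.out.println(pet.speak());   // "woof": overriding sees the Dog object
        System.out.println(describe(pet)); // "an animal": overloading sees the Animal reference
        // An anonymous array: created and passed without ever naming a variable.
        System.out.println(java.util.Arrays.asList(new String[] {"a", "b"}));
    }
}
```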

4. A goal can be an incentive.

Personally, I'm goal oriented, and having a certification to achieve gives me a useful framework for expenditure of effort. I know what I'm aiming for and I'm aware of the concrete series of steps to achieve that goal. I can learn quite a bit just browsing around, but for serious understanding, you can't beat a defined end point. I'd prefer it to be a real-world project, but a certification can be a useful stand in. (Yes, open source projects are good options too--but they may not cover as much ground and certainly, except for a few, are not as widely known as certifications.)

I've met plenty of fine programmers who weren't certified (just as I've met plenty of fine programmers who weren't CS majors). However, I think that certifications can be a useful complement to real world experience, giving job seekers some legitimacy while also increasing the depth and breadth of their understanding of a language or technology.

Posted by moore at 11:26 PM | Comments (0)

April 21, 2004

Inlining of final variables and recompilation

This problem has bitten me in the ass a few times, and I'd love to hear any bright ideas on how y'all avoid it.

Suppose you have an interface that defines some useful constants:

public interface foo {
 // note: fields in an interface are implicitly public, static, and final
 int five = 6;
}

and a class that uses those constants:

public class bar {
 public static void main(String[]args) {
  System.out.println("five: "+foo.five);
 }
}

All well and good, until you realize that five isn't really 6, it's 5. Whoops, change the foo java file and rebuild, right? Well, if you use javac *.java to do this (as you might, if you only have the foo and bar files), then you'll be alright.

But, if you're like the other 99% of the java development world, and you use a build tool, like ant, smart enough to look at timestamps, you'll still get 6 for the output of java bar. Ant is smart enough to look at the timestamps of .class and .java files to determine which .java files have changed since it last did a compilation. But it is too dumb to realize that the bar class has a dependency on foo, and should thus be recompiled even though bar.java is older than bar.class. (I haven't looked at the byte code, but I expect that the value of five is just inlined into the bar class because it's a final variable.) If you're using a make based build system, I believe you can use javadeps to build out the correct dependency list, but I haven't seen anything similar for ant. Another option is simply to remember to blow away your build directory anytime you change your 'constants'.
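One code-level workaround (a sketch, with names of my own invention) is to make the value something the compiler can't treat as a compile-time constant expression; then it can't be inlined into client classes, and changes take effect without recompiling them:

```java
public interface Constants {
    // A compile-time constant expression: javac may copy this value
    // directly into the bytecode of any class that references it.
    int INLINED_FIVE = 5;

    // Initialized via a method call, so it is NOT a compile-time constant;
    // client classes must read it through a field reference at runtime.
    int SAFE_FIVE = Helper.five();

    // Member classes of an interface are implicitly public and static.
    class Helper {
        static int five() { return 5; }
    }
}
```

The cost is that SAFE_FIVE can no longer be used where a true constant is required (case labels, for instance), so this is a trade-off rather than a free fix.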

I guess this is why properties files might be a better choice for this type of configuration information, because they're always read in anew at startup, and thus cannot be inlined (since they're a runtime thing). Of course, then you lose the benefits of type checking. Not sure what the correct answer is.
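For what it's worth, the properties approach looks something like this (a minimal, self-contained sketch; the file name and key are invented, and the manual Integer.parseInt is exactly where the type checking gets lost):

```java
import java.io.*;
import java.util.Properties;

public class PropsDemo {
    public static void main(String[] args) throws IOException {
        // Write a sample properties file (normally this would already exist).
        PrintWriter pw = new PrintWriter(new FileWriter("app.properties"));
        pw.println("five=5");
        pw.close();

        // Read it back at startup: the value is resolved at runtime,
        // so the compiler has nothing to inline.
        Properties props = new Properties();
        FileInputStream in = new FileInputStream("app.properties");
        props.load(in);
        in.close();

        // The downside: every value is a String until you parse it yourself.
        int five = Integer.parseInt(props.getProperty("five"));
        System.out.println("five: " + five);
    }
}
```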

Posted by moore at 12:38 AM | Comments (1) | TrackBack

April 19, 2004

Kris Thompson's review of my talk

Kris Thompson attended my BJUG kick start talk on J2ME development. I wanted to comment on his post.

1. I wouldn't say that J2ME development has scarred me. But J2ME is definitely a technology (well, a set of technologies, really) that is still being defined. This leads to pain points and opportunities, just like any other new technology. Lots of ground to be broken.

2. Caching--you can do it, but just like in any other situation, caching in J2ME introduces additional complexities. Can it be worth it, if it saves the user time and effort? Yes. Is it worth it for the application I was working on? Not yet.

3. PNG--it's easy to convert images from GIF/JPEG format to PNG. Check out the extremely obtuse JAI.create() method, and make sure you check out the archives of the jai-interest mailing list as well.

4. Re: shared actions between the MIDP and web portions of the application--I guess I wasn't very clear on this. The prime reason that we don't have shared action classes between these two portions is that, other than in one place (authentication), they don't have any feature overlap. What you can do on the web is entirely separate from what you can do with the phone (though they can influence each other, to be sure).

Anyway, thanks Kris for the kind words.

As a last note, I would reiterate what Kris mentions: "Find out which phones have the features you want/need" and perhaps add "and make sure your service provider supports those features as well." Unlike in the server side world, where everyone pretty much targets IE, J2ME clients really do have different capabilities and scouting those out is a fundamental part of J2ME development.

Posted by moore at 11:03 AM | Comments (0) | TrackBack

April 16, 2004

Software archeology

I presented this evening on J2ME for the kickstart meeting at BJUG, where Grady Booch was the featured speaker. After unknowingly knocking UML in his presence, I enjoyed a fine talk on software archeology. This discipline involves looking at larger, historical patterns of software development. Essentially, when we build software, we are building artifacts. And, just as the plans and meetings of the slave foremen who built the pyramids were not recorded, so there are aspects of present day software development that are simply lost when the project ends or the programmers die. One of Booch's projects is to capture as much of that data as possible, because these architectures are full of valuable knowledge that many folks have sweated over. It needs to happen soon because, in his words, "time is not on our side" when it comes to collecting this kind of data. Man, I could handle that kind of job.

Speaking of architecture, I stumbled on "Effective Enterprise Java" which looks to be a set of rules for enterprise java development. I really enjoy "Effective Java", by Joshua Bloch, so I hope that Ted Neward's book lives up to its name. And I certainly hope this project doesn't get stranded like "Interface Design" apparently did.

Posted by moore at 12:19 AM | Comments (2)

April 12, 2004

Is transparent access control worth unintelligible error messages?

Partly egged on by Rob and Brian, I just took a long overdue look at container managed security for web applications.

My conclusion: it's nice, but there is one major flaw that dooms the whole premise. Users expect informative error messages when they 'sign in' and there's no way to do that with container managed security.

I was using Tomcat 4.1, which is to say, I was examining the servlet 2.3 specification. (I just looked at the 2.4 specification and can see no amelioration of the above issue.) I also focused on the FORM method of authentication, as that's the most customizable. (I imagine, for an intranet app obsessed with security, client certificates would be a worthwhile avenue of investigation.) I found the servlet specs to be very helpful in this process.

With the FORM method of authentication, you can customize the appearance of your login and error pages, to some extent. This is a huge win.

I really liked the automatic access control--no checking at the beginning of every ActionForm or JSP for any specific attribute. Additionally, you can protect different URL patterns easily, and for most of the applications I write, this is enough. If you need to protect buttons on a page, you can always resort to isUserInRole.

Also, you can place the login and error pages, which should never be accessed directly, in a separate /safe directory to which you prohibit all access.
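For reference, the relevant web.xml directives from the servlet 2.3 spec look something like this (the role name, URL pattern, and page paths are invented for illustration):

```xml
<!-- Protect a URL pattern: only users in the "admin" role may access it. -->
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Admin pages</web-resource-name>
    <url-pattern>/admin/*</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>admin</role-name>
  </auth-constraint>
</security-constraint>

<!-- FORM authentication with customizable login and error pages. -->
<login-config>
  <auth-method>FORM</auth-method>
  <form-login-config>
    <form-login-page>/safe/login.jsp</form-login-page>
    <form-error-page>/safe/error.jsp</form-error-page>
  </form-login-config>
</login-config>
```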

For the times when the user is denied access to a resource, you can create a custom 403 error page, using the error-page directive in web.xml. Unfortunately, you only seem to get three attributes: javax.servlet.error.message, javax.servlet.error.request_uri and javax.servlet.error.status_code, which limits the nature of your response. (These were what Tomcat gave me--I don't think they're part of the spec.) Regardless, IE, with default settings, doesn't display any custom error messages, which makes this a rather moot point for general webapps.
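The error-page directive itself is a small fragment in web.xml (the page name here is hypothetical):

```xml
<!-- Send 403 (forbidden) responses to a custom page. -->
<error-page>
  <error-code>403</error-code>
  <location>/accessDenied.jsp</location>
</error-page>
```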

Creating a logout page is fairly easy, just call session.invalidate() (though there seem to be some non standard methods of doing it as well).

However, as mentioned above, I just don't think that users will accept the generic login error messages that you are forced to give. For instance, you can't tell the user whether they didn't enter a password or entered an incorrect one. You can't redirect them back to the login page with helpful error messages next to the offending box. These are fundamental issues with authentication--no serious webapp simply throws up its hands when a user doesn't log in correctly the *first* time.

Separate from user experience, but still related to authentication behavior, you can't 'lock out' users who've attempted to login too many times. Sure, you can keep track of how many times they've tried to login, but the authentication process is out of your hands.

Additionally, the fact that you're tied to a particular implementation for user/role definition means that writing custom authentication code that just accesses an RDBMS is actually more portable.

The answer, to the question posed in the title of this post: "is transparent access control worth unintelligible error messages?", is almost always "no." And folks accuse developers of not having any sense of user interface!

Posted by moore at 04:43 PM | Comments (2) | TrackBack

April 11, 2004

SimpleDateFormat and the 13th month

Wow. I just learned something about SimpleDateFormat, a class that I always resort to when I have to convert a String to a Date in java. Check out this bit of code:

import java.text.*;
import java.util.*;

public class foo {
 public static void main(String[] args) throws Exception {
  SimpleDateFormat sdf = new SimpleDateFormat("MMddyyyy");
  System.out.println("12012000 " + sdf.parse("12012000"));
  System.out.println("13012000 " + sdf.parse("13012000"));
  System.out.println("12322000 " + sdf.parse("12322000"));
 }
}

and the output from that code:

$ java -classpath . foo
12012000 Fri Dec 01 00:00:00 MST 2000
13012000 Mon Jan 01 00:00:00 MST 2001
12322000 Mon Jan 01 00:00:00 MST 2001

Any overflow gets rolled into the next higher unit (for lack of a better word, 'place'): the 32nd day of December becomes the 1st of Jan, and the 13th month of any year becomes Jan of the following year. This is an implementation detail, as I found no mention of it in the SimpleDateFormat javadoc, nor the DateFormat javadoc, but others have noticed this too.
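The rolling comes from the lenient parsing mode, which is on by default; if you'd rather have out-of-range fields rejected, DateFormat.setLenient(false) turns the behavior off (a quick sketch):

```java
import java.text.*;

public class StrictParse {
    public static void main(String[] args) {
        SimpleDateFormat sdf = new SimpleDateFormat("MMddyyyy");
        // Reject out-of-range fields instead of rolling them forward.
        sdf.setLenient(false);
        try {
            System.out.println(sdf.parse("13012000"));
        } catch (ParseException e) {
            // The 13th month is now a parse error, not January of next year.
            System.out.println("13012000 rejected: " + e.getMessage());
        }
    }
}
```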

Posted by moore at 08:07 AM | Comments (3) | TrackBack

April 10, 2004

jalopy now closed source

Jalopy, which I wrote about here, is now closed source. It's about $40 for a single user license. For more info, see the corporate website. I see that the open source version is still around, though there hasn't been a release since 1.02, about 18 months ago (about the same as xdoclet, actually).

I totally respect Hunsicker Marco (who is, I think, the developer, and certainly the owner of the corporate domain) and his right to earn a living. $40 certainly isn't that much (in fact, he even has a link to the old, free version on his purchase page!), but I hope that he eventually rolls the improvements into the free version, a la ESR's "Free The Future, Sell the Present" business model.

Posted by moore at 06:51 PM | Comments (0)

April 03, 2004

Scripting languages and productivity

Bruce Eckel has some things to say about different languages and productivity. One quote in particular stood out:

"I didn't have to look that up, or to even think about it [reading the contents of a file using python], because it's so natural. I always have to look up the way to open files and read lines in Java. I suppose you could argue that Java wasn't intended to do text processing and I'd agree with you, but unfortunately it seems like Java is mostly used on servers where a very common task is to process text."

I agree entirely. I come from a perl background (it's the language I cut my teeth on, which, I suppose, dates me), and unlike some, I'm unabashedly in favor of it. I've looked at python briefly, and it does seem to have perl's flexibility and agility with less ambiguity. When you have to grab a file from the filesystem (or parse a file and stuff it into a database) there's simply no comparison, and anyone who reaches for Java to solve such problems simply hasn't experienced the joy of the freedom of scripting languages.
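The contrast is easy to see from the Java side; even a minimal 'read a file line by line' (a self-contained sketch with an invented file name) takes a fair amount of ceremony compared to a scripting language's one- or two-liner:

```java
import java.io.*;

public class ReadLines {
    public static void main(String[] args) throws IOException {
        // Create a small sample file so the example is self-contained.
        PrintWriter pw = new PrintWriter(new FileWriter("sample.txt"));
        pw.println("first line");
        pw.println("second line");
        pw.close();

        // The Java equivalent of a scripting language's "open and iterate":
        // wrap the reader, loop until readLine() returns null, close in finally.
        BufferedReader reader = new BufferedReader(new FileReader("sample.txt"));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            reader.close();
        }
    }
}
```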

The problem with such free form languages arises when you start doing large scale systems. Java, for all its faults and complexity, forces choices about implementation to be done at a high level--which framework do we want to use, how do we architect this solution. Perl (note that I'm not talking about python, since I'm a python newbie), on the other hand, is more flexible, and hence allows more latitude. It requires more discipline to code OO perl, or, for that matter, readable perl, than it does to code readable java. (There are different ways to implement objects in perl--see Object Oriented Perl for more information.) By limiting some of the latitude of the developer, you gain some maintainability.

I was trying to think of trivial examples that illustrate this point, but I couldn't. Perhaps it's because I've been out of touch with perl's evolving core libraries for so long, or perhaps it's because all the perl I've ever had to maintain has been intensely idiomatic, where all the java I've had to maintain has been, though at times obtuse, fairly easy to read, but I just feel that perl is a harder language to maintain than java.

Now, how does this apply to Eckel's statements? Well, he uses python as his example--stating that you just plain can get more done with python than you can with java. It's hard to argue with that.... But the majority of code expense and lifecycle is not in the creation but the maintenance. How do the scripting languages stack up for large scale systems? My experience (which, granted, is primarily applicable to small to medium size systems) indicates that the very flexibility which allows Bruce such amazing productivity hampers further enhancements and bug fixing on the code he writes.

Posted by moore at 08:20 AM | Comments (1)

March 21, 2004

The Grinder

I did some performance testing against a web application that I helped write this weekend. I used The Grinder and was quite happy with the beta version. This lets you write scripts in Jython and uses the quite nice HTTPClient library. The Grinder, like most other performance tools, has an admin interface (the console) and a set of distributed agents that are given tasks and communicate results back via the network to the console. I used the HTTP client, but it looks like you can 'grind' anything you can talk to via java, from databases to email servers.

I did run into a few problems. I'm using cygwin on WinXP, and had some difficulties running java from the command line. The fix was to use the cygpath command, like so:

#!/bin/sh
# to start the agent
JAVA=/cygdrive/c/j2sdk1.4.2_03/bin/java
CLASSPATH=`cygpath -p -w /home/Owner/work/grinder/grinder-3.0-beta20/lib/grinder.jar:\
/home/Owner/work/grinder/grinder-3.0-beta20/lib/jakarta-oro-2.0.6.jar:\
/home/Owner/work/grinder/grinder-3.0-beta20/lib/jython.jar`
$JAVA -cp $CLASSPATH net.grinder.Grinder

The client application that I was testing doesn't use cookies (it's a J2ME application, and the MIDP spec doesn't support cookies out of the box). Or rather, it uses cookies, but only to grab the first one that the server sends, store it off, and then pass it back as a query parameter. This type of configuration isn't The Grinder's sweet spot, and I had to do a bit of hacking to make sure the appropriate cookie value was sent with the appropriate client request. It would have been nice to use contexts, but The Grinder wraps the HTTPConnection in its own class. If you are simulating use by a browser, cookies are apparently handled correctly. One gripe--there's no javadoc for the main classes available on The Grinder's website, so you have to grab the source if you want to see interactions between pieces (for example, how net.grinder.plugin.http.HTTPRequest interacts with HTTPClient.HTTPConnection).

I also mucked with some of the properties, primarily initialSleepTime. You'll want to make sure that you read about these properties--I blithely uncommented what was in the sample grinder.properties and ended up with an obscene value for grinder.sleepTimeFactor.

After all the setup, I was able to hammer our server. I discovered two useful things: an error in our logout code, which threw exceptions around 10% of the time, and an incorrectly set connection timeout between Apache and Tomcat. Changing the timeout from 0 to 1000 fixed the dreaded "SEVERE: All threads are busy, waiting. Please increase maxThreads or check the servlet status" error that I was getting. In addition to these two useful bugs, by making some assumptions about how the application will be used, I was able to gimmick up some interesting numbers about supportable users.

I like The Grinder a fair bit. It's got a nice GUI. It's still under active development. I'm a bit leery of using beta software (especially open source beta software), but a poll on the homepage convinced me to try the beta. By using this, I was also able to pick up snatches of python which is a new language to me (finally got to consult my long unused copy of Learning Python). I considered looking at JMeter, but The Grinder appears to be a bit more recently maintained. It's no LoadRunner, but then again, it doesn't pretend to be. All in all, if you're in the market for a cheap, quick performance tool, The Grinder is worth a look.

Posted by moore at 12:57 PM | Comments (11)

March 16, 2004

Miswanting and web application frameworks

For some time now, I've wanted to respond to this post by Kris Thompson, in which he predicts that "Struts will continue to lose industry acceptance as the MVC leader in the J2EE space" in 2004. I believe this is happening already; if you read the blogging community or some of the industry rags, it seems like other alternatives to Struts are being promoted (not least of which is JSF). But there are still tons of Struts applications out there being built every day. There have been over 2000 messages on the struts mailing list for the past year (granted, this number is declining--perhaps because folks are GFTFA [googling for the fcuking answer]).

This article explains why I continue to develop in struts: "A wider range of slightly inferior options, then, can make it harder to settle on one you're happy with." There is definitely a wide range of J2EE frameworks. In my case, these alternatives to struts are not inferior because of any technical shortfall, but rather because I have to learn them.

(An aside: I have programmers' optimism as much as anyone else. But after a few years in the industry, I've realized that while I feel I can learn a new technology as quickly as need be, in order to really understand the pitfalls of a framework or API, I have to "build one to throw away." I really do.)

Please don't take these statements as a whiny "I don't want to learn anything new." That's not true. But my time is finite, and the project deadlines are always creeping up. And struts works well enough for my problem set.

And that's why struts is going to be around for a while yet.

Posted by moore at 02:49 PM | Comments (0)

March 09, 2004

Long running queries in servlets

The stateless nature of the web presents some user interface issues. Not least of these is how to handle long running processes most efficiently. Do you keep the user waiting, do you poll, etc? Remember, even if everything is going dandy, normal users like to see something.

This JavaRanch article is a good explication of how to use message driven beans and asynchronous access to data in the web tier to deal with these problems. It leans a bit heavily on WebSphere, but does seem to address some of Dion's issues about there not being enough use of messaging systems. And it even throws a couple of design patterns in as well.

Posted by moore at 01:49 PM | Comments (2)

February 27, 2004

What are EJBs good for?

Dion had a good post about what EJBs are good for. I've used EJBs only seldom (and peripherally), but my understanding, from reading the literature, is that EJBs are appropriately named--that is, good for enterprise situations. In that case, what on earth are these folks thinking? They demonstrate using an EJB in a JSP. What?

Posted by moore at 08:31 PM | Comments (0)

February 23, 2004

SQL Server JDBC driver troubles

I'm responsible for a small struts application for one of my clients. The application was originally coded on Windows against a SQL Server 2000 database. When I was contracted to roll it to production, a Linux box talking to a SQL Server 7 database, I found I couldn't use the existing MS JDBC drivers, which only support SQL Server 2000. So, I went looking for SQL Server 7 JDBC drivers. There are a ton of choices out there, but most are commercial. I looked at jTDS, but that didn't work because, at the time, jTDS did not support CallableStatements, which were used extensively by this application. (Apparently, jTDS does now.)

So, I looked at a few commercial drivers, and decided that Opta2000 offered the best feature set for the price ($800 for unlimited web application connections). Then, the database was upgraded from SQL Server 7 to SQL Server 2000. Luckily, we hadn't bought the JDBC driver yet, so, hey, let's use MS JDBC drivers--they're free! Fantastic. The installation went fine (not that it was that complicated--dropping some new jars in the WEB-INF/lib directory and changing some lines in the struts-config.xml

Except, Tomcat (version 4.1.24) started behaving badly. With IE (and, to a lesser extent, with Mozilla), the pages started loading very slowly after Tomcat had been running for a while. A restart alleviated this symptom, but didn't obviously solve the problem. Initially, we thought it was the load, and some misconfiguration of tomcat (tomcat was serving images--not usually considered its strong point, though benchmarks are needed to tell the full tale), but nothing seemed to change the behavior. We tried changing how tomcat was passed requests (mod_jk, mod_proxy), but nothing seemed to work. A colleague of mine looked at when the instability started, and it correlated with the installation of the MS JDBC drivers. So, we switched back to Opta. The application returned to a stable state, and we haven't seen the problems since. (We plan to purchase the drivers now, although we may take a look at jTDS.)

Posted by moore at 05:44 PM | Comments (4)

February 13, 2004

jad

If you've never used jad then you're missing out on a great tool. Jad lets you easily decompile java class files. It may be shady legally, depending on what contracts you've signed, but it's definitely useful in debugging and understanding behavior of java applications. It couldn't be simpler to use. Just run

jad classfile.class

from the command line, and you get a java file (named classfile.java) in the same directory. The names of the variables aren't fantastic (s1, s2...) but it sure beats reading the bytecode output of javap -c.

Note, it's free for noncommercial use, but if you want to use it commercially, contact the author for terms. And if you get a chance to download it from the above tripod.com link, grab it and store it someplace else, because the page is often unavailable due to exceeding its bandwidth limits.

Posted by moore at 02:01 PM | Comments (2)

January 30, 2004

Book Review: Enterprise J2ME

I go to Java Users Groups (yes, I'm struggling to get in touch with my inner geek) once every two or three months. Sometimes there's an engaging speaker, but most of the time the fellow up front looks like he's just swallowed a hot pepper, speaks like he has a permanent stutter, and answers questions like I'm speaking Greek. (I'm not making fun; I had a hard time when I was in front of a JUG too.) Regardless of the quality of the speaker, I gain something just by watching the presentation--he points out interesting technologies and usually has a list of resources at the end that I can use for further research.

I think Michael Yuan would be a great speaker at a JUG, as he seems to have a masterful understanding of Java 2 Platform, Micro Edition (J2ME). However, the true value of his book, Enterprise J2ME, was in its introduction of new ideas and concepts, and the extensive resource listings. This book is a survey of the current state of the art in mobile java technology. Whatever your topic is, except for gaming development, you'll find some coverage here. Securing information on the device or network, XML parsing strategies, messaging architectures, and data synchronization issues are all some of the topics that Yuan covers.

My favorite chapter was Chapter 7, 'End to End Best Practices.' Here, Yuan covers some of the things he's learned in developing his own enterprise applications, and offers some solutions to five issues that differ between the J2ME world and the worlds familiar to most Java developers: J2EE and J2SE. He offers capsule solutions to the issues of "limited device hardware, slow unreliable networks, pervasive devices, ubiquitous integration [and] the impatient user." Later in the book, he explores various architectures to expand on some of these capsules.

However, the strength of this book, exposing the reader to a number of different mobile technologies, is also its weakness. JUG speakers very rarely dive into a technology to the point that I feel comfortable using it without additional research; I usually have to go home, download whatever package was presented, and play with it a bit to get a real feel for its usefulness. This book was much the same. Some of the chapters, like chapters 12 and 13, which cover issues with databases on mobile devices (CDC devices, not CLDC devices), weren't applicable to my kind of development, but you can hardly fault Yuan for that. Some of the later chapters felt like a series of 'hello world' applications for various vendors. This is especially true of chapter 12, and also of chapter 20, which is a collection of recipes for encryption on the device.

Additionally, I feel like some of the points he raised in Chapter 7 are never fully dealt with. An example of this is section 7.3.3, "Optimize for many devices." The project I'm on is struggling with this right now, but I had trouble finding any further advice on this important topic beyond this one paragraph section. However, these small issues don't take away from the overall usefulness of the book--if you are developing enterprise software, you'll learn enough from this book to make its purchase worthwhile.

However, I wouldn't buy the book if you're trying to learn J2ME. Yuan gives a small tutorial on basic J2ME development in Appendix A, but you really need an entire book to learn the various packages, processes and UI concerns of J2ME, whether or not you have previously programmed in Java. Additionally, if you're trying to program a standalone game, this book isn't going to have a lot to offer you, since Yuan doesn't spend a lot of time on UI concerns and phone compatibility issues. Some of the best practices about limited hardware may be worth reading, and if it's a networked game, you may gain from his discussion in Chapter 6, "Advanced HTTP Techniques." In general, though, I'm not sure there's enough to make it worth a game developer's while.

I bought this book because I'm working on a networked J2ME application, and it stands alone in its discussion of the complex architectural issues that such applications face. It covers more than that, and isn't perfect, but it is well worth the money, should you be facing the kind of problems I am. Indeed, I wish I had had this book months ago, as I'm sure it would have improved my current application.

Posted by moore at 09:05 AM | Comments (1)

January 11, 2004

Jalopy

I like javadoc. Heck, I like documentation. But I hate adding javadoc to my code. It's tedious, and I can never remember all the tags. I don't use an IDE, so the formatting gets to me.

After attending a presentation at BJUG about software tools, I investigated jalopy, and I liked what I found. Now, jalopy is more than just a javadoc comment inserter, but javadoc insertion was my primary use of the tool. It may be piss poor for code formatting and whatnot, but it was pretty good at inserting javadoc. I was using the ant plug-in, and the instructions were simple and straightforward. It didn't blow away any existing comments, and it didn't munge any files, once I configured it correctly. And there are, make no mistake, lots of configuration options.

Jalopy has a slick Swing interface to set all these configuration options, and you can export your configuration to an XML file which can be referenced by others. This, along with the ant integration, make it a good choice for making sure that all code checked in by a team has similar code formatting.

However, I do have a few minor quibbles with this tool.

1. The default configuration of javadoc is busted. When you run it, it javadocs methods and classes just fine, but any fields are marked with a bare "DOCUMENT ME!" when it should be wrapped in a comment: "/** DOCUMENT ME! */". This means that, with the default configuration, you can't even run the formatter twice, since jalopy itself chokes on the uncommented "DOCUMENT ME!".

2. The configuration file is not documented anywhere that I could find. I looked long and hard on the Internet, and only found one example of a jalopy configuration file here. And this is apparently just the default options exported to a file. I've put up a sample configuration file here which fixes problem #1. (This configuration is only for javadoc; it accepts all other defaults.)

3. The zip file that you download isn't in its own directory. This means that when you unassumingly unzip it, it spews all over your current directory.

None of these are show stoppers, that's for sure. If you're looking for a free, open source java code formatting tool, jalopy is worth a close look.

Posted by moore at 08:23 PM | Comments (2)

December 29, 2003

Java Tidbits

Quick, take this true/false quiz and test your Java(tm) knowledge:

1. Private members can only be accessed by the object which owns them.

2. The contents of a finally block are guaranteed to run.

Everybody knows that both of these are true, right?

Actually each of these is false. In the first case, this program shows that objects can twiddle with private members of other objects of the same class:

class PrivateMember {
   private int i;

   public PrivateMember() {
      i = 2;
   }

   public void printI() {
      System.out.println("i is: " + i);
   }

   // Note the parameter: this method reaches into *another* object's
   // private field. Access control in Java is per-class, not per-object.
   public void messWithI(PrivateMember t) {
      t.i *= 2;
   }

   public static void main(String args[]) {
      PrivateMember sub = new PrivateMember();
      PrivateMember obj = new PrivateMember();
      obj.printI();        // i is: 2
      sub.messWithI(obj);  // sub doubles obj's private i
      obj.printI();        // i is: 4
   }
}

and this program shows that finally blocks don't run if the JVM exits:

class TryCatch {
   public static void main(String args[]) {
      try {
         System.out.println("in try");
         throw new Exception();
      } catch (Exception e) {
         System.out.println("in catch");
         System.exit(0);   // the JVM halts here...
      } finally {
         // ...so this block never runs: "in finally" is never printed
         System.out.println("in finally");
      }
   }
}
Posted by moore at 08:04 PM | Comments (0)

December 27, 2003

J2ME development advice

Here is some advice for any of you thinking about doing any development for J2ME. I've written before on J2ME considerations, but the project I'm working on is a bit further on now, and I thought I'd share some hard earned knowledge.

1. Get Windows.

Make sure you have access to a Windows box running a modern version of Windows. While Sun's Wireless Tool Kit supports Linux and Solaris, and IBM's WebSphere Device Developer supports Linux, no other emulators do (nor do they work under wine).

Whether you want to support Motorola, Nokia, or NEC, you pretty much need Windows to run the emulator. And an emulator is crucial, because it allows you to develop an application rapidly. Testing on as many emulators as possible means that your application will be as tight as possible. However, when you get something that is close to finished, you'll need a real phone to test it on.

2. Get (a) real phone(s).

While emulators can tell you a lot, they certainly can't assure you that an application will run on a real phone. One project I worked on had an emulator that ran the app just fine, but the app was locking up on the real phone. It turned out that the networking code needed to run in a separate thread. There are many things that differ between an emulator and a real phone. Installation of a midlet is different (and differs between phones as well). Instead of accessing a file on disk, you have to use OTA provisioning (sure you can emulate that with the WTK, but that's just an emulation. I've run into issues with DNS that just don't show up on the emulator). Also, as mentioned above, the networking capability differs, and the connection is much slower. The amount of memory that you can use while developing on the desktop is effectively unlimited (some of the emulators let you monitor the amount used, but I don't know of any that lets you limit it), but phones have hard limits. Don't you want to know what happens when you try to use 101KB of memory on a device that only has 100KB? The limitations you face on user interface are also more real on a phone, when you can't use the keyboard to enter your username or the backspace key to fix errors. For all these reasons, you should get a phone as soon as you can.
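No emulator I know of will enforce the device's heap limit for you, but your code can at least fail gracefully when memory runs out. Here's a minimal sketch of the idea in plain J2SE (the class name and sizes are invented for illustration; in a real MIDlet you'd guard image loading and RecordStore writes the same way):

```java
public class MemoryGuard {
    /**
     * Try to allocate a buffer of the requested size, returning null
     * instead of letting OutOfMemoryError kill the application.
     */
    public static byte[] tryAllocate(int size) {
        try {
            return new byte[size];
        } catch (OutOfMemoryError e) {
            // On a phone, this is where you'd show an error screen
            // rather than letting the midlet die silently.
            return null;
        }
    }

    public static void main(String[] args) {
        // 100KB is nothing on a desktop, but it's the entire heap
        // of a minimal MIDP 1.0 device.
        byte[] buf = tryAllocate(100 * 1024);
        System.out.println("allocation succeeded: " + (buf != null));
    }
}
```

Catching an Error is normally bad form, but on a device with a 100KB heap it beats the alternative.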

3. Explore existing resources.

A couple of good books: Enterprise J2ME is a great survey book, with some very good ideas for building business applications (with large numbers of users) but not a whole lot of nuts and bolts. Wireless Java: Developing with J2ME, Second Edition is a good nuts and bolts book (it explains how to do your own Canvas manipulations, for example). Check out what else Amazon suggests for other ideas.

A couple of helpful urls: Fred Grott's weblog, MicroJava, the EnterpriseJ2ME site, Sun's site, of course, and the javaranch saloon is pretty helpful too.

The various carrier websites are useful, if only to find out what kind of phones you want to target: AT&T, Sprint, T-Mobile, Nextel. (Verizon in the USA is BREW only.)

4. Have fun.

Posted by moore at 11:24 AM | Comments (0)

November 04, 2003

Sending Binary Data Via Struts

Struts is an MVC web framework. It allows you to specify the paths through a web application in an XML configuration file, and provides some nice tag libraries for JSP manipulation. The views for the states are usually JSPs, and the controller is a servlet provided for you.

This standard setup is fine, most of the time. But I've run into situations, as have others (1,2), where you need Struts to output a binary file. In my case, I'm sending a dynamic image to a non browser client, and I basically need to write a ByteArrayOutputStream to the response output stream.
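The mechanics are trivial once you have the stream: ByteArrayOutputStream.writeTo() copies the buffered bytes to any OutputStream in one call. A quick sketch, with a second in-memory stream standing in for the servlet response (the PNG magic bytes are just for flavor):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class BinaryCopy {
    public static void main(String[] args) throws IOException {
        // The action would render the dynamic image into this buffer...
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        baos.write(new byte[] {(byte) 0x89, 'P', 'N', 'G'});

        // ...and the servlet would copy it to response.getOutputStream().
        ByteArrayOutputStream response = new ByteArrayOutputStream();
        baos.writeTo(response);

        System.out.println("copied " + response.size() + " bytes"); // copied 4 bytes
    }
}
```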

Now, you'd think I'd be able to do this with a JSP. After all, a JSP is just a servlet turned inside out, right? Well, according to the specification, you'd be right. From page 42 of JSP 1.2 spec:

----------------
The JSP container should not invoke response.getWriter() until the time when the first portion of the content is to be sent to the client. This enables a number of uses of JSP, including using JSP as a language to glue actions that deliver binary content, or reliably forwarding to a servlet, or change dynamically the content type of the response before generating content. See Chapter JSP.3.
----------------

But, according to Tomcat 4.1.24 on the Linux platform, you'd be wrong. When calling 'java.io.OutputStream rs = response.getOutputStream();' in my JSP, I get this code generated:

---------------

....snip...
      application = pageContext.getServletContext();
      config = pageContext.getServletConfig();
      session = pageContext.getSession();
      out = pageContext.getOut();
      _jspx_out = out;
java.io.OutputStream rs = response.getOutputStream();
....snip...

---------------

The JSP is taking my stream before my code has a chance. Therefore, I get a "getOutputStream() has already been called for this response" error. The weird bit is that this doesn't seem to happen on Tomcat 4.1.24 on Windows (same version of struts).

So, what do you do? You write a servlet instead. That way you have utter control over the output stream:

---------------

import javax.servlet.http.*;
import javax.servlet.*;
import java.io.*;

public class BinaryStreamServlet extends HttpServlet {

   public void service(HttpServletRequest req, HttpServletResponse res)
         throws ServletException, IOException {
      String contentType = (String) req.getAttribute("contentType");
      if (contentType == null || "".equals(contentType)) {
         contentType = "image/png"; // default
      }
      ByteArrayOutputStream baos =
            (ByteArrayOutputStream) req.getAttribute("baos");
      if (baos == null) {
         // guard against an action forgetting to set the attribute
         res.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
         return;
      }
      res.reset();
      res.setContentType(contentType);
      OutputStream sos = res.getOutputStream();
      baos.writeTo(sos);
   }
}

---------------

I set up my action classes to cache the ByteArrayOutputStream in the request, with the name "baos." I added these lines to my web.xml:

---------------

<servlet>
  <servlet-name>binaryFileServlet</servlet-name>
  <servlet-class>BinaryStreamServlet</servlet-class>
</servlet>
....snip...
<servlet-mapping>
  <servlet-name>binaryFileServlet</servlet-name>
  <url-pattern>/binaryFile</url-pattern>
</servlet-mapping>

---------------

and this to my struts-config.xml for any actions that needed to be able to send binary data:

---------------

<forward name="success" path="/binaryFile"/>

---------------

Works like a charm. Leave the JSPs to the character data, and use this simple servlet for binary delivery.

Posted by moore at 10:15 PM | Comments (11)

November 03, 2003

J2ME considerations

I'm working on a project that uses J2ME to display and interact with data from a remote server on handheld devices, specifically cellular phones. We are coding to the MIDP 1.0 specification, because that's what is prevalent right now. MIDP 2.0 will have some nice additions, and I'm looking forward to its widespread implementation. J2ME is nice because it has a lot of the same capabilities as java on the PC (J2SE). The specification states that J2ME is a strict subset of J2SE; that is, any class that appears in the J2ME spec must be identical to or a subset of the corresponding J2SE class, so the methods it does have behave just as they do in J2SE. The user interface of J2ME is also similar--there are forms, with various items that are added to them (choice groups, which are like radio buttons, and text fields are the main user input controls). In addition, a developer gets all the niceties of java, such as garbage collection.

While the transition from 'normal' web development (servlets, jsps) has been fairly painless, there have been a few hiccups. I wanted to cover some of them. This isn't a colossal project, but we do have about 60 classes in 10 packages on the client communicating to ~150 classes on the server, so it's not a stock ticker either.

Handheld devices, and particularly cell phones, are different from the browser. They are much more under the control of the folks who build the devices and install the initial software on them. If you think it's difficult to install a new JVM on your PC, try installing any kind of JVM on your cell phone. This means that the carrier (AT&T, Verizon, etc.) matters in a fundamentally different way than the OEM of your PC does. This immediately jumps out when deciding on a platform. There are two main platforms out there for cell phones: BREW and J2ME. I don't want to go into what BREW is here; Colin Fahey does a good job of covering the differences. Suffice it to say that, for both non-technical and technical reasons, we decided to pass on BREW for a while. This means that Verizon, the largest US carrier, is off limits to us, as they only support BREW. We're in the unenviable position of telling paying customers that they have to switch cell phone service if they want to use our app. Luckily, there's hope for the future. J9 is a plug-in for BREW that will allow J2ME apps to run on BREW devices. However, it's my understanding that we have to wait until BREW 2.0 is widely deployed to use this capability.

In addition to dealing with different support from different carriers (even with the same handset), developers of J2ME apps also have to deal with developing on a different platform. Only the foolish actually compile applications for a cell phone on a cell phone. Phones have slow processors, limited memory, no floating point support, etc. The MIDP 1.0 specification only requires a device to have 128KB of memory and 8KB of persistent storage! This lack of resources means that you are going to cross compile: work and compile on your PC, download and test on your device. There are a couple of different solutions for this: IBM WebSphere Device Developer [WSDD] (which is based on Eclipse) and Sun's Wireless Toolkit [WTK]. Each of these also provides an emulator, so a developer is not continually downloading to a phone. I happen to have a non-J2ME-capable phone, so an emulator was a must for me. But, as ever, there are complications. Emulators aren't perfect, as we discovered the first time we downloaded the application to a real phone. Our application makes a number of network hits to get information. This was working fine on IBM's emulator, but when downloaded to the phone, the application locked up. After some investigation, we determined that this was because the phone was asking for user permission to access the Internet, but our application wasn't well enough behaved to give the phone the system thread it needed to ask the question. Luckily, this article set us straight. The Sun emulator was less forgiving. So, what I've ended up doing is using the 'best of breed' solution for each problem. I use vi to edit the source files, WSDD to compile and build the jad (WSDD makes it much easier than WTK), and the WTK emulator to test.

Information architecture is also a different beast on mobile devices. Rather than having 1024x768, 800x600, or 640x480 pixels, a developer has a 150x150 screen (of course, this depends wildly on the phone, but none of the phones I know of get anywhere close to the PC screen). This means that each piece of information must be carefully chosen. If there's a ton of information, this also means that sometimes one has to break it up across screens. But, one should also minimize network hits and the amount of clicking around a user needs to do. Have you ever text messaged someone? Wasn't entering text tedious? So, there's this constant tension between having the application displaying useful information and minimizing the user input needed to get that information. Add to this the fact that we don't really know how folks are going to use our device, and you have a coding headache. This article has some good suggestions. Of course, Jakob Nielsen has an answer--test it! Put it in front of some actual users and see what they do. And we will. But, when coding for a handheld device, I hold it as a general principle to minimize user input. In addition, there isn't the nice MVC framework that I've grown used to. Instead of having an explicitly defined state machine, our application has a stack of previously visited screens. This works well currently, when information traversal is fairly linear (take me to the information I want!) but I'm leery of what will happen when the application grows more complex. Having each screen 'know' where it should go next doesn't seem to scale very well.
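The screen stack itself is simple to sketch. Assuming each screen is represented by some object (plain Strings stand in here; the class and method names are invented for illustration, not our actual code), 'back' just pops and redisplays:

```java
import java.util.Stack;

public class ScreenHistory {
    // Raw Stack, since CLDC/MIDP 1.0 predates generics.
    private final Stack screens = new Stack();

    /** Record a screen as the user drills down. */
    public void push(String screen) {
        screens.push(screen);
    }

    /** The screen currently displayed, or null if none. */
    public String current() {
        return screens.isEmpty() ? null : (String) screens.peek();
    }

    /** Discard the current screen; return the one to redisplay, or null at the root. */
    public String back() {
        if (screens.isEmpty()) {
            return null;
        }
        screens.pop();
        return current();
    }
}
```

This works while navigation stays linear; once screens need to jump around arbitrarily, an explicit state machine probably scales better, which is exactly the worry.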

Versioning for J2ME applications is new for me. It is similar to versioning for desktop applications, rather than web applications. I guess I'm spoiled--I'm used to being able to change the application at the server and have those changes ripple out to all the dumb clients. J2ME is different--since you're downloading a client to the user, changes to the server need to be coordinated with changes to the client. Not all changes, just those to the interface (which in our case is GET parameters and XML over HTTP). But still, when you change that interface, you have four options: leave the old server up and running, break the older apps, have the features of the new server be a strict superset of the old one, or forcibly upgrade the older clients. This is a problem we haven't faced yet, because we're still in development. But, I can see it looming on the horizon. And as far as I can tell, this problem hasn't been solved for the desktop (DLL hell, anyone?), and desktop vendors have had a heck of a lot more time and money to throw at it than I will.
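One cheap defensive measure is to have the client report its interface version on every request and let the server pick among those options explicitly. A hedged sketch (the version numbers and policy strings are invented; the real decision would live wherever the GET parameters are parsed):

```java
public class VersionGate {
    private final int minSupported;
    private final int current;

    public VersionGate(int minSupported, int current) {
        this.minSupported = minSupported;
        this.current = current;
    }

    /**
     * Decide what to do with a client reporting the given interface
     * version: refuse it, serve it the older (superset) behavior, or
     * treat it as current.
     */
    public String handle(int clientVersion) {
        if (clientVersion < minSupported) {
            return "UPGRADE_REQUIRED"; // forcibly upgrade the oldest clients
        }
        if (clientVersion < current) {
            return "OK_DEPRECATED";    // keep the old behavior as a superset
        }
        return "OK";
    }
}
```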

The last concern about J2ME that I'm going to touch on today is performance. We haven't gone through any kind of performance testing, and are still green enough not to have any experience that might guide us. So this section is just questions you should think about, rather than any kind of answers. There are several different types of performance, and they obviously aren't orthogonal: number of threads, storage, memory, network trips, and, perhaps most importantly, perceived performance. What are the limits on the number of threads on a mobile device? What is the cost of the context switches that happen? Should you keep a thread around, or dispose of it when you're done? What should you store? How often should you verify it hasn't changed on the server? MIDP 1.0 gives you the ability to store raw bytes--how do you want to abstract this for your application? What's the speed of reading from 'disk'? How much memory does your application use? Is it constant, or does it spike? How many new objects do you create? What's the effect of this on the user experience? How many network trips do you make? What is this costing your user (both in dollars and in time)? How robust is it? What happens when you don't have a crystal clear connection? How are you communicating with your server? How can you minimize your network connections? What feedback do you give a user when the device is waiting? What can you cache to speed up their experience?

J2ME is very cool, and I'm really enjoying learning more about it. As you can see from above, I've plenty to learn. There are several differences that you need to be aware of when doing mobile development in java, and I hope I've outlined some of them.

Posted by moore at 09:37 AM | Comments (0)

October 22, 2003

Book Review: Second Edition of "A Programmer's Guide to Java Certification"

I used "A Programmer's Guide to Java Certification" as a study guide for achieving my Java Certified Programmer (JCP) status two years ago, so when I had the chance to review the second edition, I jumped at it (full disclosure: the publisher sent me the second edition to review). As I expected, I was again aghast and delighted at the level of detail, the exercises and the arrangement of this fine book.

Mughal and Rasmussen do a good job of covering all the nitty gritty details that the JCP requires one to know. Whether it's the length in bits of an int, the difference between overloading and overriding, or the order in which initializer expressions are executed, this book gives one enough detail to overwhelm the novice Java programmer, as well as cause those more experienced to scratch their heads and perhaps write a small program to verify that what they read is valid. While this book lacks the discussion of I/O and the GUI found in the previous edition (due to changes in the JCP test), it has a fine set of chapters on some of the fundamental libraries and classes. My two favorite explications are the chapter on Threads (Chapter 9), where that complicated subject is treated well enough to motivate more learning while not overwhelming the reader with detail, and the String and StringBuffer section of Chapter 10. So much of the Java programming I've done has dealt with Strings, so this section, which covers the String class method by method and deals with issues of memory and performance as well as normal use, is very welcome.
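The memory and performance issues that the String chapter covers come down to immutability: every concatenation on a String allocates a new object, while a StringBuffer mutates in place. A quick illustrative sketch:

```java
public class StringDemo {
    public static void main(String[] args) {
        String s = "a";
        String before = s;
        s += "b"; // builds a brand-new String behind the scenes
        System.out.println(s == before); // false: different objects

        StringBuffer sb = new StringBuffer("a");
        StringBuffer sbBefore = sb;
        sb.append("b"); // modifies the buffer in place
        System.out.println(sb == sbBefore); // true: same object
        System.out.println(sb); // ab
    }
}
```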

The exercises were crucial to my passing the JCP, and they remain useful in this book. Grouped at the end of logical sections of chapters, they break up the text and re-iterate the lessons learned in the previous sections. The answers to these exercises are in the back of the book. Also, a full mock exam is included at the back, as well as an annotated version of the JCP exam requirements which serves as a study guide (both for the full JCP 1.4 and for the upgrade exam). Reading over the mock exam definitely let me know what areas I'd need to study if I was taking the JCP again. In short, the didactic nature of this book has not been lost.

The arrangement of this book is also useful. A fine index and the logical progression through the features of the Java language eases the onslaught of detailed information mentioned above. The extensive use of UML diagrams (especially class and sequence diagrams) was helpful as well. If one reads the book sequentially, one learns about how object references are declared (Chapter 4), then the various control structures available in Java (Chapter 5), then the basics of Object Orientation (Chapter 6), then the object life cycle (Chapter 8), in a very linear fashion. Additionally, there is extensive cross-referencing. This may not be useful to the novice programmer, but to anyone using this book as a reference, it's invaluable, because it allows Mughal and Rasmussen to provide yet more logical linking of disparate topics.

However, this book is not for everyone. I wouldn't buy it if I wanted to learn to program. While there are a few chapters that have general value (Chapter 1, Chapter 6), the emphasis is on mastering idiomatic Java, not general programming concepts. Also, as they state in the preface, this is not a complete reference book for Java. It covers only what is needed for the JCP. Finally, if one wants to know how to use Java in the real world, this isn't the book either. While most of the java programming I've done has benefited from the understanding I gained from this book, it has not resembled the coding I did for the exercises at all. This makes sense--this book teaches the fundamentals, and does not pretend to cover any of the higher level APIs and concepts that are used in everyday programming.

Posted by moore at 09:05 AM | Comments (0)
© Moore Consulting, 2003-2006