Back in early 2014 Oracle released the results of the Java EE community survey, which was created to gather feedback about what developers want to see in Java EE 8. One of the questions in the survey was whether people would like to see an MVC web framework alongside JSF in Java EE 8. The result was pretty clear: over 60% of the participants voted "Yes". As a result, JSR 371 (MVC 1.0) was started to work on the new action-based web framework for Java EE.

Fast forward to 2016. JSR 371 has been doing well. The EG released the second early draft review and got a lot of positive feedback. The JSR was moving forward even after Oracle suddenly stopped all work on the Java EE JSRs.

Recently Oracle announced the revised plan for Java EE 8 during JavaOne 2016. Surprisingly, Oracle now questions whether an action-based MVC web framework is still relevant in the cloud age and therefore proposes to drop MVC 1.0. At the same time Oracle started a new survey to get feedback about the revised plan for Java EE 8. This survey contains the following statement:

When we first proposed Java EE 8, we got feedback that an action based web UI MVC framework standard would be a good addition to Java EE. We're now questioning whether it is still important to complete the MVC API (JSR 371).

I don't understand how Oracle comes to the conclusion that MVC is not relevant any more. People wanted it in 2014 and the EG is getting great feedback for the work done so far. So what changed in the last two years?

I was wondering what people really think about JSR 371 (MVC 1.0) in 2016. So I had a look at the results of the DZone Java EE survey, which was created with the help of the Java EE Guardians. I checked all the comments referring to MVC, especially from people who considered MVC "not important". Then I created a list of representative comments and wrote down my thoughts about them:

Comment: Modern web applications use stateless REST services anyway.

It's true that JavaScript based web frameworks like AngularJS, React, Vue.js and friends are very popular today. Writing single page applications (SPAs) which run completely in the browser and get their data from REST backend services has many advantages, especially in regard to usability and scalability.

However, if you have been in the IT industry long enough, you have hopefully learned that there is no silver bullet. Never. Developers have been creating web application frameworks for decades. And guess what: nobody ever created a framework which solves all your problems. That's why there are so many of them. There is no "right" and "wrong" approach for web frameworks. There are different categories of frameworks which address different problem domains. You have to choose the right tool for your specific requirements. If your application is not highly interactive and you have to deal with SEO requirements, a server-rendered web framework may be your best choice. And in this case MVC 1.0 will be a great option.

Searching for silver bullets seems to be a common phenomenon in our industry. Some people seem to think for example that NoSQL databases like MongoDB, Redis, Cassandra and others are the silver bullet for storing data. No, they aren't. There are many use cases for which MongoDB is a great fit, but depending on your specific requirements a classic relational database may still work better. The answer is simply: it depends!

So I think it totally makes sense to provide an alternative web framework in Java EE which implements a completely different concept than JSF. Neither of the two frameworks is right or wrong. They are different. So you can use Angular + JAX-RS, classic JSF or the new MVC framework depending on your needs. Having a choice is a good thing.

BTW: Of course you can use JavaScript frameworks like AngularJS and React together with MVC. This technology stack is a great mix. You can even use MVC to render React applications on the server. Don't believe me? Just have a look at this awesome proof of concept for a React ViewEngine created by Niko Köbler.

Comment: The same can be achieved with JAX-RS

That's true. JAX-RS is a great foundation for creating web applications. That's why MVC is based on JAX-RS. Actually MVC is just a very thin layer on top of JAX-RS which provides everything you need to build a server-rendered web application with JAX-RS. So MVC doesn't reinvent the wheel but builds on top of what the Java EE platform already provides. You should have a look at the latest API. I'm sure you will be surprised how easy it is to create a web application with MVC 1.0 compared to do it manually with plain JAX-RS.
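To illustrate the difference, here is a rough sketch (class names are my own, based on the draft MVC API; not a complete application): with plain JAX-RS the resource method has to assemble the HTML response itself, while an MVC controller just fills the model and delegates rendering to a view.

```java
// Plain JAX-RS: the resource method produces the markup itself.
@Path("plain")
public class PlainResource {

  @GET
  @Produces(MediaType.TEXT_HTML)
  public String page() {
    return "<html><body>Hello World</body></html>";
  }
}

// MVC 1.0: the controller populates the model and returns a view name.
@Path("mvc")
@Controller
public class MvcController {

  @Inject
  private Models models;

  @GET
  public String page() {
    models.put("greeting", "Hello World");
    return "hello.jsp";
  }
}
```

The MVC variant stays declarative: the template decides how the model is rendered, and the controller never touches markup.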

Comment: If a new framework is added, completely remove JSF before, or it will be to complicated for a new user to choose.

Well, dropping JSF would be a REALLY bad idea. Backward compatibility is a key feature of the Java EE platform. If you invested in JSF by building applications using it, your investment will be safe. Future versions of the platform will keep supporting JSF for the foreseeable future. And that's great, unless your boss allows you to migrate your application to a new shiny web framework every month. But I guess that's not the case. ;-)

If MVC 1.0 gets included in Java EE 8, you will have the choice between two web frameworks. And that's great. Ed Burns wrote a great blog post about this topic. I highly recommend reading it. The key message is that component-oriented and action-based web frameworks are completely different concepts and both have their right to exist. Why? Remember, there is no silver bullet. You should use the framework which best fits your needs. The same is true for other technologies. If you want to process XML, you can either use low-level APIs like DOM/SAX/StAX or you can use an object mapper like JAXB. Which one to choose depends on your specific situation.

Comment: I think it missed the train and now comes too late.

Well, in my view a JSR like MVC 1.0 should always try to standardize things that are proven to work well in practice. Therefore creating a JSR to standardize well-established concepts can never be too late.

JPA is a very good example of this. Prior to EJB 3.0, persistence in Java EE was a very heavyweight approach. Most developers weren't happy with it. That's why people preferred 3rd party solutions like Hibernate. So Hibernate became a de-facto standard for ORM in Java. The JPA specification then created a standard which was heavily influenced by Hibernate. Which is a good thing. Remember that standardization should focus on technology that is battle-tested and works well in practice. JPA did exactly that, and that's why Hibernate and JPA are so similar.

The same should be done for MVC. Of course there are many MVC frameworks out there. And that's good, because today we know which concepts work best. The most popular MVC framework in the Java space is Spring MVC. That's why MVC 1.0 is heavily influenced by Spring MVC. Some people criticize that MVC 1.0 copies Spring MVC too much. But these people don't understand that adopting concepts from well-established frameworks is exactly what standardization is about. Standards should look at the best technologies out there.

Comment: Happy with Spring MVC :-)

Hey, you are using Spring MVC? That's great! Spring MVC is awesome! If you are happy with it, there is no good reason to migrate your apps to MVC 1.0. However, if you are creating a new application on the Java EE platform, using MVC 1.0, which ships with your container, may be an interesting option. Especially if you are using Java EE technologies like CDI, JAX-RS, Bean Validation and friends. In this case, pulling Spring into your project just to get the web framework wouldn't make much sense. But if you typically use the Spring framework and other components of the Spring ecosystem, you should definitely go with Spring MVC.

Comment: Too late. Enhance JAX-RS.

Back then, before JSR 371 (MVC 1.0) was started, there was quite some discussion about whether web framework support should be added to JAX-RS directly or whether a separate specification would make more sense. At that time the JAX-RS EG decided that adding web framework related concepts to the core JAX-RS API would not be a good idea. So a separate JSR was created. However, as I mentioned before, MVC 1.0 is based on JAX-RS. So I could argue that MVC 1.0 actually is an enhancement of the JAX-RS spec. It's just a separate specification bundled in a separate JAR which uses a different package name.

Final words

Thanks for reading so far. I hope you liked reading my thoughts about MVC 1.0 and Oracle's decision to drop it from Java EE 8. If you agree with me that MVC is still relevant and should be included in Java EE 8, please fill out the new Oracle Java EE survey and help convince Oracle to keep investing in MVC 1.0.

The MVC 1.0 (JSR 371) specification has just released an early draft for review. I'm very happy to see that people are interested in this JSR and are starting to have a deeper look at what MVC 1.0 will look like. At this point any kind of feedback is very valuable and welcome.

One of the questions which is asked very frequently is which view technologies MVC 1.0 will support. The answer is that MVC has built-in support for JSP and Facelets. This seems to disappoint people. Especially because JSP isn't considered to be a "modern" view technology any more.

I want to note a few things here. First, JSP isn't as bad as most people think. Sure, it has its weaknesses (like any other technology), but there are good arguments for using JSP. Every IDE, for example, supports JSP out of the box, and I think that decent tooling is something that is lacking for many of the other view technologies. One of the major problems with JSP is that it allows you to do really, really bad things like embedding Java code in the view. But if you use JSP in a reasonable way, it works very well. And if you don't like JSP, you can also choose Facelets, which has been around in the JSF world for quite some time and offers many great templating concepts.
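To make the scriptlet point concrete, here is a contrived fragment (the `names` request attribute is made up for the example, and the second variant assumes the JSTL core tag library is available):

```jsp
<%-- discouraged: Java code embedded directly in the view --%>
<ul>
  <% for (String name : (java.util.List<String>) request.getAttribute("names")) { %>
    <li><%= name %></li>
  <% } %>
</ul>

<%-- reasonable: EL and JSTL keep the view declarative --%>
<ul>
  <c:forEach var="name" items="${names}">
    <li>${name}</li>
  </c:forEach>
</ul>
```

Both render the same list, but only the first one couples the view to Java code and escapes the reach of your templating tools.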

However, I agree that there are many other great view technologies out there. And I fully understand people who prefer libraries like Thymeleaf as they offer many great features you don't get with JSP or Facelets.

One of the things I like most about MVC 1.0 is that it actually doesn't matter what view technologies it supports out of the box. Why? Because it is so easy to integrate arbitrary view technologies with MVC. You don't believe me? Ok, here is an example to convince you.

Creating a custom view engine

One of the most popular template engines for node.js is Jade. Similar to Haml in the Ruby world, it is based on the idea of using indentation to define blocks, which basically means that you don't need to manually close elements any more. There is even a Java implementation of the Jade language called jade4j. So let's have a look at the required steps to use jade4j as the view technology for your MVC 1.0 based application.

To integrate a template engine with MVC, you have to implement the ViewEngine interface which consists of only two methods. MVC 1.0 uses CDI to discover all view engine implementations. There is no need to register the implementation anywhere. Just add @ApplicationScoped to your implementation and MVC will find it.

OK, let's have a look at the code:

@ApplicationScoped
public class JadeViewEngine implements ViewEngine {

  @Inject
  private ServletContext servletContext;

  @Override
  public boolean supports(String view) {
    return view.endsWith(".jade");
  }

  @Override
  public void processView(ViewEngineContext context) throws ViewEngineException {

    try {

      String viewName = "/WEB-INF/views/" + context.getView();
      URL template = servletContext.getResource(viewName);

      String html = Jade4J.render(template, context.getModels(), true);

      context.getResponse().getWriter().write(html);

    } catch (IOException e) {
      // wrap I/O failures in the exception type declared by the interface
      throw new ViewEngineException(e);
    }

  }

}

That's all! Everything you need to integrate jade4j with MVC 1.0. That's not much code, is it? Of course this code can be (and should be) improved in regard to error handling, caching and so on. But it is a great example to demonstrate the basics.

The supports() method is used by the MVC implementation to find the view engine responsible for the view name returned by a controller method or specified with the @View annotation. In this example our view engine will process all views with the file extension .jade.

The processView() method is where all the magic happens. The purpose of this method is to render the view and write the result to the response stream. Everything you need for this is available from the ViewEngineContext. The code above uses the ServletContext to load the Jade template from the default view folder /WEB-INF/views. Jade4J provides simple static methods you can use to evaluate templates. All you need is the template and a model. After that you can write the resulting HTML page to the output stream.

Using the view engine

Now let's check if the view engine works as expected. First we will create a simple controller that uses the MVC 1.0 API:

@Path("/hello")
public class HelloController {

  @Inject
  private Models models;

  @GET
  @Controller
  public String controller() {
    models.put("name", "Christian");
    return "hello.jade";
  }

}

As you can see, there is nothing special about this controller. It looks like every other MVC 1.0 controller. The only difference is that it returns the view name hello.jade instead of something like hello.jsp. Now let's have a look at the corresponding view:

!!! 5
html
  head
    title Jade4J Demo
  body
    h1.
      Hello #{name}

I guess this looks very weird if you have never seen something like Jade or Haml before. This is a simple page that renders an h1 element containing a greeting. It uses EL-like expressions to reference values from the model. If you want to learn more about Jade, have a look at the Jade Language Reference.

So you see the Jade integration is working fine. And we had to create just a single class. Easy, isn't it? :)

Conclusion

I hope this example shows you how easy it is to integrate custom view technologies with MVC 1.0. Ozark, the reference implementation of MVC 1.0, already provides a number of extra view engine implementations for template engines like Thymeleaf, Freemarker, Velocity, Handlebars and Mustache. All of them are available from Maven Central.

I hope you enjoyed this blog post. If you want to give MVC 1.0 a try, I recommend having a look at todo-mvc, which is a small sample application I created to demonstrate what a typical application built with MVC 1.0 looks like.

Have fun!

I'm very proud to announce that I will be speaking at JavaLand 2015 in about a month. I'm really looking forward to it as I wasn't able to participate last year and (according to the feedback I got from various sources) JavaLand is an awesome conference.

The title of my talk is BYOC - Bring Your Own Container. Basically it is about the concept of shipping the container with your application, which allows you to create executable JARs that don't need to be deployed to an application server any more. This provides an improved developer experience like the one you get when building your application with Spring Boot or Dropwizard.

I will show a small open source project I created called Backset which is basically an implementation of this concept. The core idea is similar to projects like Spring Boot or Dropwizard but with one important advantage: You build your app using well known Java EE APIs like CDI, JPA, JSF, etc.

I'm also looking forward to the various other talks such as:

So, what is your excuse for not joining us at JavaLand 2015? :)

There has recently been much discussion about the missing binary downloads of JBoss AS 7.1.2.Final and 7.1.3.Final. To keep a long story short, here are just the most important facts: The latest version of AS7 available on the download page is 7.1.1.Final, which was released about a year ago. But if you have a look at the AS7 GitHub repository, you will notice tags for 7.1.2.Final and 7.1.3.Final.

To be honest, I don't fully understand the reason for this. But that's the current situation. So if you want JBoss AS 7.1.3.Final, you have to build it yourself.

It seems like most people think that the process of compiling AS7 is difficult. But it is not. It is actually pretty straightforward and only takes about 10 minutes.

The first step is of course to download the sources. Fortunately GitHub provides downloadable ZIP files for all tags.

$ wget https://github.com/jbossas/jboss-as/archive/7.1.3.Final.zip
$ unzip 7.1.3.Final.zip
$ cd jboss-as-7.1.3.Final/

Now you have to start the build by entering this command:

$ ./build.sh -DskipTests -Drelease=true install

This command requires some explanation. The build.sh file is a helper script that performs some additional checks to ensure you are using the correct Maven version. I also recommend using -DskipTests to skip the test suite, which will reduce the overall build time dramatically. You also have to set -Drelease=true to ensure the build creates the distribution archives.

The compilation took about 9 minutes on my box. After the build has completed you will find the ZIP distribution in the dist/target directory:

$ ls -1 dist/target/*.zip
dist/target/jboss-as-7.1.3.Final-src.zip
dist/target/jboss-as-7.1.3.Final.zip

That's all. Simple, isn't it? So you see there is no reason for not building AS7 yourself. :)

I love git! And therefore I also love github.com! I use GitHub very often to publish smaller or even large projects and share them with others. As I mostly use Maven to build my Java projects, I recently searched for an easy way to publish Maven artifacts via GitHub. I learned that it is in fact very easy! Interested? Read on! :-)

The basic idea of hosting Maven repositories on GitHub is to use GitHub Pages. This GitHub feature offers a simple but powerful way for creating and hosting web sites on their infrastructure. Fortunately this is all we need to create Maven repositories. I'll explain the process by example. Therefore I'll show you how I created a repository for jsf-maven-util, one of my recent spare time projects.

The first step is to create a separate clone of your GitHub repository in a directory next to your primary local repository:

$ pwd
/home/ck/workspace/jsf-maven-util
$ cd ..
$ git clone git@github.com:chkal/jsf-maven-util.git jsf-maven-util-pages
$ cd jsf-maven-util-pages

The GitHub Pages web site must be created as a branch named gh-pages in your repository. So let's create this branch and empty it. Refer to the GitHub Pages manual if you are interested in the exact meaning of these commands.

$ git symbolic-ref HEAD refs/heads/gh-pages
$ rm .git/index
$ git clean -fdx

We will place the Maven repository in a subdirectory of this new branch:

$ mkdir repository

We also want to have a pretty directory listing. Unfortunately GitHub Pages doesn't have native support for this. So we will create our own directory listing with a simple bash script.

Create a file named update-directory-index.sh in the root of the new branch (next to the repository directory). This script will walk recursively into the repository directory and create index.html files in each subdirectory. Please be careful when using this script as it overwrites all existing index.html files it finds.

#!/bin/bash

for DIR in $(find ./repository -type d); do
  (
    echo -e "<html>\n<body>\n<h1>Directory listing</h1>\n<hr/>\n<pre>"
    ls -1pa "${DIR}" | grep -v "^\./$" | grep -v "^index\.html$" |
        awk '{ printf "<a href=\"%s\">%s</a>\n",$1,$1 }'
    echo -e "</pre>\n</body>\n</html>"
  ) > "${DIR}/index.html"
done

Congratulations! Your repository is ready. Now you will have to modify the distributionManagement section of your pom.xml to let Maven deploy your artifacts to the new repository. Go back to your primary repository clone and edit your pom.xml:

<distributionManagement>
  <repository>
    <id>gh-pages</id>
    <url>file:///${basedir}/../jsf-maven-util-pages/repository/</url>
  </repository>
</distributionManagement>

Now you are ready to deploy your first artifact to the repository:

$ mvn -DperformRelease=true clean deploy

You will see that Maven copies the artifacts to your local checkout of the GitHub Pages branch. After Maven has finished you'll have to update the directory listings, commit the changes made to the repository and push them to GitHub:

$ cd ../jsf-maven-util-pages/
$ ./update-directory-index.sh
$ git add -A
$ git commit -m "Deployed my first artifact to GitHub"
$ git push origin gh-pages

Now let's check the result. Please note that the first publish may take some time to appear on the web server.

Looks great, doesn't it? :-)

If you want to use your repository in another project, just add the following repository entry to the pom.xml:

<repository>
  <id>jsf-maven-util-repo</id>
  <name>jsf-maven-util repository on GitHub</name>
  <url>http://chkal.github.com/jsf-maven-util/repository/</url>
</repository>

As you can see deploying Maven artifacts to GitHub is very simple. You can also use a similar approach to publish your Maven generated project site to GitHub. But that's a different story.... :-)

I was recently confronted with the task of displaying the version of a JSF project in its page title. As the version was already contained in the project's pom.xml and I didn't want to duplicate this information in another file, I searched for a simple way to display the Maven artifact's version in the JSF page.

As there was no easy way to do this, I created a small library for this use case and named it jsf-maven-util. The main idea of it is to supply a JSF managed bean that lazily checks for pom.properties files of Maven artifacts on the classpath. These files are created during the Maven packaging process and are stored in the META-INF/maven/ directory of the output archive.

The library is very easy to use. A bean named maven is automatically placed in the application scope of your webapp. It contains a map which you can use to get the version of an artifact by using the groupId and artifactId (colon-separated) as the key.
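Conceptually, the lookup is not magic: a pom.properties file is a plain Java properties file containing groupId, artifactId and version entries. A minimal sketch of the idea (illustrative only, not the library's actual code, which also handles classpath scanning and caching):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class MavenVersionLookup {

  // Parses the contents of a Maven pom.properties file and returns its
  // version entry. The real library would load the file from
  // META-INF/maven/<groupId>/<artifactId>/pom.properties on the classpath.
  static String versionFrom(String pomProperties) {
    Properties props = new Properties();
    try {
      props.load(new StringReader(pomProperties));
    } catch (IOException e) {
      throw new UncheckedIOException(e); // cannot happen with a StringReader
    }
    return props.getProperty("version");
  }

  public static void main(String[] args) {
    String example = "groupId=com.example.myapp\n"
        + "artifactId=myapp-webapp\n"
        + "version=1.0.2\n";
    System.out.println(versionFrom(example)); // prints 1.0.2
  }
}
```

The bean exposes the result of such a lookup as a map keyed by "groupId:artifactId", which is what makes the EL expressions below possible.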

This example shows how to display the version of a web application in its page title.

<head>
  <title>
    My Application #{maven.version['com.example.myapp:myapp-webapp']}
  </title>
</head>  

You can also display the version of any of your project's dependencies as long as it includes a pom.properties in its archive:

<p>
  powered by Weld #{maven.version['org.jboss.weld:weld-core']}  
</p>

If you are interested in using this feature in your own project, add the following repository to your pom.xml:

<repository>
  <id>jsf-maven-util-repo</id>
  <name>jsf-maven-util Repository</name>
  <url>http://chkal.github.com/jsf-maven-util/repository/</url>
</repository>

Then add the following dependency to your project:

<dependency>
  <groupId>de.chkal.jsf</groupId>
  <artifactId>jsf-maven-util</artifactId>
  <version>1.1</version>
</dependency>

I pushed the source to a GitHub repository. Let me know if you have any issues.

I recently looked for a way to integrate the Google Analytics tracking code into the project site of my current spare time project Criteria4JPA. I'm using the Maven Site Plugin to automatically build the project page because it makes the process of creating a site very easy.

After some time I realized that there seems to be no easy way to do this. Somebody on the maven-user list proposed creating a copy of the original site template and modifying it to include the necessary JavaScript in the page header. But I think that this solution is much too complicated for such a simple job. I also found an existing JIRA issue describing the problem, but it is still unresolved.

But after much more searching I discovered a very simple and elegant way to get the Google Analytics tracking code into the maven site. I found the hint on the doxia-dev mailing list. Someone talked about a mysterious <head> element that can be used in the site.xml descriptor. The JIRA issue DOXIA-150 seemed to prove the existence of this feature.

I tried it and it worked. See the site.xml file of Criteria4JPA for an example:

<?xml version="1.0" encoding="ISO-8859-1"?>
<project name="Criteria4JPA">

  <body>

    <head>
      <!-- Google Analytics - Start -->
      <script type="text/javascript">
      var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
      document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
      </script>
      <script type="text/javascript">
      try {
      var pageTracker = _gat._getTracker("UA-1234567-8");
      pageTracker._trackPageview();
      } catch(err) {}</script>      
      <!-- Google Analytics - End -->
    </head>

    <!-- more stuff -->

  </body>
</project>

As you can see adding the tracking code is very easy. Just place a <head> element inside the body and copy the Google Analytics code in there. The default JavaScript code you get from Google Analytics is already correctly escaped so you can copy and paste it into the XML descriptor without problems.

I don't know for sure which versions of Maven, Doxia and the Site Plugin are required for this but I can confirm that Maven 2.2.0 together with the Maven Site Plugin 2.0-beta-7 works.

Happy tracking... :-)

I recently tried to setup the Eclipse TPTP Profiler on my two Linux boxes (Ubuntu 7.04 "Feisty Fawn" and 8.10 "Intrepid Ibex"). I thought this would only require installing some plugins from the Ganymede update site. I learned that the installation can be more complex on Linux systems.

The most problematic part of the TPTP Profiler installation was setting up the Agent Controller. The Agent Controller is a native binary and as we all know native code does not follow the philosophy "compile once, run everywhere"! :-)

After installing the features "TPTP Tracing and Profiling Tools Project" and "TPTP Profiling for Web applications" from the Ganymede update site I restarted Eclipse and tried to profile a simple web application via "Profile as -> Profile on Server". Unfortunately this failed with:

[Error: FATAL ERROR: JPIAgent can't load ACCollector]

Basic troubleshooting procedure

To debug such problems I recommend trying to start the Agent Controller directly from the command line. This way you can easily spot problems related to the Agent Controller itself. I decided to start the Agent Controller binary ACServer instead of the startup shell script ACStart.sh, because I couldn't figure out which of the two Eclipse uses to execute the Agent Controller.

To start the Agent Controller from the command line you must find the directory of the Agent Controller bundle. For my Ganymede installation the process looks like this:

$ cd plugins/org.eclipse.tptp.platform.ac.linux_*/agent_controller/bin
$ ./ACServer
./ACServer: error while loading shared libraries: libtptpUtils.so.4: cannot open shared object file: No such file or directory

Your first try will probably fail like in this example because of a missing shared library. The library libtptpUtils resides in the lib directory of the Agent Controller. Eclipse takes care of setting the corresponding library paths, but as we are starting the Agent Controller from the command line, we have to set the path ourselves to temporarily fix the problem for our test:

$ export LD_LIBRARY_PATH=../lib

If you are lucky, the Agent Controller now starts without any problems. But on my systems the execution failed for different reasons described in the following sections.

Broken symlinks

There seems to be a problem with symlink creation during the installation of the Agent Controller bundle. This bug showed up with Ubuntu 7.04 but not with Ubuntu 8.10.

$ ./ACServer
./ACServer: error while loading shared libraries: libtptpUtils.so.4: cannot open shared object file: File too short

On my system there was a regular file named libtptpUtils.so.4 which contained only the string libtptpUtils.so.4.5.0. Instead of creating symlinks the installation process seemed to create regular files which contain the name of the referenced file.

I fixed the problem by manually removing the broken files and creating the symlinks:

$ rm libtptpUtils.so libtptpUtils.so.4
$ ln -s libtptpUtils.so.4.5.0 libtptpUtils.so.4
$ ln -s libtptpUtils.so.4 libtptpUtils.so

$ rm libxerces-c.so libxerces-c.so.26
$ ln -s libxerces-c.so.26.0 libxerces-c.so.26
$ ln -s libxerces-c.so.26 libxerces-c.so

$ rm libxerces-depdom.so libxerces-depdom.so.26
$ ln -s libxerces-depdom.so.26.0 libxerces-depdom.so.26
$ ln -s libxerces-depdom.so.26 libxerces-depdom.so

$ rm libtransportSupport.so libtransportSupport.so.4
$ ln -s libtransportSupport.so.4.5.0 libtransportSupport.so.4
$ ln -s libtransportSupport.so.4 libtransportSupport.so

Surprisingly this problem only showed up on my first try getting Eclipse and the TPTP Profiler to work. I tried to reproduce the problem with a clean Eclipse installation while working on this blog post. But the second time all symlinks were created successfully. This makes me suspect that the bug is fixed in the current TPTP releases.

Missing shared libraries

The Ubuntu 8.10 box showed another problem that didn't occur on my old Ubuntu 7.04 installation:

$ ./ACServer
./ACServer: error while loading shared libraries: libstdc++-libc6.2-2.so.3: cannot open shared object file: No such file or directory

The Agent Controller requires an old libstdc++ library that wasn't installed on Ubuntu 8.10. Fixing this problem depends on the Linux distribution and its version. For Ubuntu 7.04 the library is contained in the "libstdc++2.10-glibc2.2" package, which was already installed. Ubuntu 8.10 is lacking a package containing this file. I fixed the problem by manually downloading a package from an older Ubuntu release:

$ cd /tmp
$ wget "http://de.archive.ubuntu.com/ubuntu/pool/universe/g/gcc-2.95/libstdc++2.10-glibc2.2_2.95.4-24_i386.deb"
$ sudo dpkg -i libstdc++2.10-glibc2.2_2.95.4-24_i386.deb

Depending on your system there might be other libraries missing. The following list contains all shared library dependencies of the Agent Controller binary:

$ ldd ACServer
   linux-gate.so.1 =>  (0xffffe000)
   libtptpUtils.so.4 => ../lib/libtptpUtils.so.4 (0xb7f72000)
   libtptpLogUtils.so.4 => ../lib/libtptpLogUtils.so.4 (0xb7f65000)
   libtptpConfig.so.4 => ../lib/libtptpConfig.so.4 (0xb7f4b000)
   libprocessControlUtil.so.4 => ../lib/libprocessControlUtil.so.4 (0xb7f46000)
   libxerces-c.so.26 => ../lib/libxerces-c.so.26 (0xb7b3f000)
   libpthread.so.0 => /lib/tls/i686/cmov/libpthread.so.0 (0xb7b12000)
   libc.so.6 => /lib/tls/i686/cmov/libc.so.6 (0xb79d1000)
   libdl.so.2 => /lib/tls/i686/cmov/libdl.so.2 (0xb79cd000)
   libuuid.so.1 => /lib/libuuid.so.1 (0xb79ca000)
   libstdc++-libc6.2-2.so.3 => /usr/lib/libstdc++-libc6.2-2.so.3 (0xb7982000)
   libm.so.6 => /lib/tls/i686/cmov/libm.so.6 (0xb795a000)
   /lib/ld-linux.so.2 (0xb7f95000)

If some of these libraries are missing on your system you will have to find and install the corresponding packages. A good place to start searching for packages is the Ubuntu Packages Search, the Debian Package Search, the Fedora Package Database or other distributions' package directories.

Missing TEMP environment variable

This problem was the easiest to solve. The Agent Controller complains about the missing environment variable TEMP.

$ ./ACServer
The TEMP environment variable does not point to a valid directory.
Agent Controller will not start.

As already hinted this problem can be solved easily. Just put this line in your .bashrc file:

export TEMP=/tmp

It should be mentioned that I did not test whether Eclipse takes care of setting this variable for the Agent Controller process. But it certainly does no harm to set it manually.

Finally...

If you don't stumble across other problems the Agent Controller should now start normally. In this case you won't see any output on the console.

$ ./ACServer
<no console output>

Now you can stop the Agent Controller by hitting CTRL+C and retry profiling an application in Eclipse. On my system profiling in Eclipse now worked as expected.

Conclusion

The installation of the TPTP Profiler can be very problematic on a Linux box. I have presented solutions for the different problems I was confronted with. This short blog post should give everyone an idea of the most common problems regarding the Agent Controller and how to solve them.