The Legacy Developer’s Guide to Java 9

You can’t take advantage of Java 9’s shiny new features if your application needs to support older versions of Java… or can you? Here’s what Java 9 offers the developers of legacy Java applications.

Introduction: The Curse of the New Java Release

Every few years, a new version of Java is released. Speakers at JavaOne tout the new language constructs and APIs, and laud their benefits. Excited developers line up, eager to use the new features. It’s a rosy picture, except most developers are charged with maintaining and enhancing existing applications, not creating new ones from scratch. Most applications, particularly commercially sold applications, need to be backwards-compatible with earlier versions of Java, which won’t support these new features. And, finally, most customers and end users, particularly those in enterprises, are cautious about adopting the newly announced Java platform, preferring to wait until they’re confident that the new platform is solid.

This leads to problems when developers want to use a new feature. Do you like the idea of using default interface methods in your code? You’re out of luck if your application needs to run on Java 7 or earlier. Do you want to use the java.util.concurrent.ThreadLocalRandom class to generate pseudo-random numbers in a multi-threaded application? Too bad if your application needs to run on Java 6 as well as on Java 7, 8 and 9.
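For reference, here’s the kind of call in question — a sketch that compiles and runs on Java 7 and later, but fails on Java 6, where the class doesn’t exist:

```java
import java.util.concurrent.ThreadLocalRandom;

public class DiceRoll {
    public static void main(String[] args) {
        // current() returns a per-thread generator, avoiding the contention
        // of sharing a single java.util.Random across threads.
        int roll = ThreadLocalRandom.current().nextInt(1, 7); // uniform in 1..6
        System.out.println(roll);
    }
}
```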

The result is that legacy developers feel like kids with their noses pressed up against the window of the candy store, and they’re not allowed in. It can be disappointing and frustrating.

Is there anything in the upcoming Java 9 release that’s aimed at developers working on legacy Java applications? Is there anything that makes our lives easier while still allowing us to use the exciting new features that are coming out next year? Fortunately, the answer is yes.

By the way, this is not a complete guide to everything new in Java 9. A great guide to all the new features in Java 9 can be found here.

What We Do Now

There are ways to shoehorn new platform features into legacy applications that need to be backward-compatible. Particularly, there are ways to take advantage of new APIs. It can get a little ugly, however.

We can use late binding to attempt to access a new API when our application also needs to run on older versions of Java that don’t support that API. For example, let’s say that we want to use the java.util.stream.LongStream class that was introduced in Java 8, and in particular LongStream’s anyMatch(LongPredicate) method, but the application has to run on Java 7. We could create a helper class as follows:

import java.lang.reflect.Method;

public class LongStreamHelper {

     private static Class<?> longStreamClass;
     private static Class<?> longPredicateClass;
     private static Method anyMatchMethod;

     static {
          try {
               longStreamClass = Class.forName("java.util.stream.LongStream");
               longPredicateClass = Class.forName("java.util.function.LongPredicate");
               anyMatchMethod = longStreamClass.getMethod("anyMatch", longPredicateClass);
          } catch (ClassNotFoundException e) {
               longStreamClass = null;
               longPredicateClass = null;
               anyMatchMethod = null;
          } catch (NoSuchMethodException e) {
               longStreamClass = null;
               longPredicateClass = null;
               anyMatchMethod = null;
          }
     }

     public static boolean anyMatch(Object theLongStream, Object thePredicate)
        throws NotImplementedException {
          if (longStreamClass == null) throw new NotImplementedException();

          try {
               Boolean result
                  = (Boolean) anyMatchMethod.invoke(theLongStream, thePredicate);
               return result.booleanValue();
          } catch (Throwable e) {    // lots of potential exceptions to handle. Let’s simplify.
               throw new NotImplementedException();
          }
     }
}
There are ways to make this a little simpler, or more general, or more efficient, but you get the idea.

Now, instead of calling theLongStream.anyMatch(thePredicate) as you would in Java 8, you can call LongStreamHelper.anyMatch(theLongStream, thePredicate) in any version of Java. If you’re running on Java 8, it’ll work, but if you’re running on Java 7, it’ll throw a NotImplementedException.
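To make the moving parts concrete, here is a self-contained sketch of the same reflective lookup and invocation, runnable on Java 8 or later. (For brevity it mentions the stream types directly, which the real helper cannot do when compiled for Java 7.)

```java
import java.lang.reflect.Method;

public class ReflectiveAnyMatchDemo {
    public static void main(String[] args) throws Exception {
        // Resolve LongStream.anyMatch(LongPredicate) at run time, as the helper does.
        Class<?> longStreamClass = Class.forName("java.util.stream.LongStream");
        Class<?> longPredicateClass = Class.forName("java.util.function.LongPredicate");
        Method anyMatch = longStreamClass.getMethod("anyMatch", longPredicateClass);

        Object stream = java.util.stream.LongStream.of(2L, 4L, 7L);
        java.util.function.LongPredicate isOdd = x -> x % 2 != 0;

        // invoke() boxes the boolean result.
        Boolean result = (Boolean) anyMatch.invoke(stream, isOdd);
        System.out.println(result); // prints "true" (7 is odd)
    }
}
```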

Why is this ugly? Well, it can get extremely complicated and tedious when there are lots of APIs we want to access. (In fact, it’s tedious already, with a single API.) It’s also not type safe, since we can’t actually mention LongStream or LongPredicate in our code. Finally, it’s much less efficient, because of the overhead of the reflection and the extra try-catch blocks. So, while we can do this, it’s not much fun, and it’s error-prone if we’re not careful.

While we can access new APIs and still have our code remain backward-compatible, we can’t do this at all for new language constructs. For example, let’s say that we want to use lambdas in code that also needs to run on Java 7. We’re out of luck. The Java compiler will not let us specify a source compliance level higher than its target compliance level. So, if we set a source compliance level of 1.8 (i.e., Java 8) and a target compliance level of 1.7 (Java 7), it will not allow us to proceed.
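Concretely, the combination is rejected at compile time (the exact error text varies by JDK version; Main.java here is a hypothetical source file using lambdas):

```shell
javac -source 1.8 -target 1.7 Main.java
# error: source release 1.8 requires target release 1.8
```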

Multi-Release JAR Files to the Rescue

Until recently, there hasn’t been a good way to use the latest Java features while still allowing the application to run on earlier versions of Java that don’t support those features. Java 9 finally provides a way to do this for both new APIs and new Java language constructs: multi-release JAR files.

Multi-release JAR files look just like old-fashioned JAR files, with one crucial addition: There’s a new nook in the JAR file where you can put classes that use the latest Java 9 features. If you’re running Java 9, the JVM recognizes this nook and uses the classes in that nook, and ignores any classes of the same name in the regular part of the JAR file. If you’re running Java 8 or earlier, however, the JVM doesn’t know about this special nook and will ignore it, and only runs the classes in the regular part of the JAR file. In the future, when Java 10 comes out, there’ll be another nook specifically for classes using new Java 10 features, and so forth.

JEP 238, the Java enhancement proposal that specifies multi-release JAR files, gives a simple example. Consider a JAR file containing four classes that will work in Java 8 or earlier:

JAR root

  • A.class
  • B.class
  • C.class
  • D.class

Let’s say that Java 9 comes out, and we rewrite classes A and B to use some new Java 9-specific features. Later, Java 10 comes out and we rewrite class A again to use Java 10’s new features. At the same time, the application should still work with Java 8. The new multi-release JAR file looks like this:

JAR root

  • A.class
  • B.class
  • C.class
  • D.class
  • META-INF
    • versions
      • 9
        • A.class
        • B.class
      • 10
        • A.class
In addition to the new structure, the JAR file’s manifest contains an indication that this is a multi-release JAR.
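That marker is a single attribute in the JAR’s META-INF/MANIFEST.MF:

```
Multi-Release: true
```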

When we run this JAR file on a Java 8 JVM, it ignores the META-INF/versions section, since it doesn’t know anything about it and isn’t looking for it. Only the original classes A, B, C and D are used. When we run it using Java 9, the classes under META-INF/versions/9 are used instead of the original classes A and B, but the classes in META-INF/versions/10 are ignored. When we run it using Java 10, both META-INF/versions branches are used; in particular, the Java 10 version of A is used, as is the Java 9 version of B, and the default versions of C and D.

So, if you want to use the new Java 9 ProcessBuilder API in your application while still allowing your application to run under Java 8, just put the new versions of your classes that use ProcessBuilder in the META-INF/versions/9 section of the JAR file, while leaving old versions of the classes that don’t use ProcessBuilder in the default section of the JAR file. It’s a straightforward way to use the new features of Java 9 while maintaining backward-compatibility.

The Java 9 JDK will contain a version of the jar tool that supports creating multi-release JAR files. Other non-JDK tools will also provide support.
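For example, assuming the Java 8 classes have been compiled into build/java8 and the Java 9 rewrites into build/java9 (hypothetical paths), the layout above could be produced with the Java 9 jar tool’s --release switch, which routes everything after it into META-INF/versions/9 and sets the Multi-Release manifest attribute:

```shell
jar --create --file app.jar \
    -C build/java8 . \
    --release 9 -C build/java9 .
```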

Modules Everywhere

The Java 9 module system (also known as Project Jigsaw) is undoubtedly the biggest change in Java 9. One goal of modularization is to strengthen Java’s encapsulation mechanism so that the developer can specify which APIs are exposed to other components and can count on the JVM to enforce the encapsulation. Modularization’s encapsulation is stronger than that provided by the public/protected/private access modifiers of classes and class members. The second goal of modularization is to specify which modules are required by which other modules, and to ensure that all necessary modules are present before the application executes. In this sense, modules are stronger than the traditional classpath mechanism, since classpaths are not checked ahead of time, and errors due to missing classes only occur when the classes are actually needed, which means that an incorrect classpath might be discovered only after an application has been run for a long time, or after it has been run many times. The entire module system is large and complex, and we are not going to provide a complete discussion in this article. (There are plenty of good explanations, including the one here.) Rather, we are going to concentrate on aspects of modularization that support developers of legacy Java applications.

I’d like to preface the discussion by saying that modularization is a very good thing, and that developers should try to modularize their new code wherever possible, even if the rest of the legacy application is not (yet) modularized. Fortunately, the modularization specification makes this easy.

First, a JAR file becomes modularized (and becomes a module) when it contains a file module-info.class (compiled from module-info.java) at the JAR file root. module-info.class contains metadata specifying the name of the module, which packages are exported (i.e., made visible to the outside), which modules the current module requires, and some other information. The information in module-info.class is only visible when the JVM is looking for it, which means that modularized JAR files are treated like ordinary JAR files when running on older versions of Java (assuming the code has been compiled to target an earlier version of Java. Strictly speaking, you’d need to cheat a little and still target module-info.class to Java 9, but that’s doable). That means that you should still be able to run your modularized JAR files on Java 8 and earlier, assuming that they’re otherwise compatible with that earlier version of Java. Also note that module-info.class files can be placed, with restrictions, in the versioned areas of multi-release JAR files.
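A minimal module descriptor might look like this (the module and package names are made up for illustration):

```java
// module-info.java, at the root of the source tree
module com.example.inventory {
    requires java.sql;                  // a module this module depends on
    exports com.example.inventory.api;  // the only package visible to other modules
}
```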

In Java 9, there is both a classpath and a module path. The classpath works like it always has. If a modularized JAR file is placed in the classpath, it’s treated just like any other JAR file. This means that if you’ve modularized a JAR file, but are not ready to have your application treat it as a module for whatever reason, you can just put it in the classpath, and it will work as it always has, and your legacy code should be able to handle it just fine. Also, note that the collection of all JAR files in the classpath is considered to be part of a single unnamed module. The unnamed module is considered a regular module, but it exports everything to other modules, and it can access all other modules. This means that, if you have a Java application that’s modularized, but have some old libraries that haven’t been modularized yet (and perhaps never will be), you can just put those libraries in the classpath and everything will just work.

Java 9 contains a module path that works alongside the classpath. Using the modules in the module path, the JVM can check, both at compile time and at run time, that all the necessary modules are present, and can report an error if any are missing. As mentioned before, all JAR files in the classpath, as members of the unnamed module, are accessible to the modules in the module path and vice versa. It’s easy to migrate a JAR file from the classpath to the module path, to get the advantages of modularization. First, you add a module-info.class file to the JAR file, then move the modularized JAR file to the module path. The newly minted module can still access all the classpath JAR files that have been left behind, because they’re part of the unnamed module and everything there is accessible. It’s also possible that you might not want to modularize a JAR file, or that the JAR file belongs to someone else, so you can’t modularize it yourself. In that case, you can still put the JAR file into the module path; it becomes an automatic module. An automatic module is considered a module even though it doesn’t have a module-info.class file. The module’s name is derived from the name of the JAR file containing it, and the module can be explicitly required by other modules under that name. It automatically exports all its publicly accessible APIs, and reads (that is, requires) every other named module, as well as the unnamed module. This means that it’s possible to make an unmodularized classpath JAR file into a module with no work at all: Legacy JAR files become modules automatically, albeit without some of the information needed to determine whether all required modules are really there, and to determine what is missing.
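For example, placing a hypothetical old-widgets-1.2.jar on the module path yields an automatic module named old.widgets (the version suffix and .jar extension are dropped when the name is derived), which modularized code can then require by name:

```java
// module-info.java of a modularized consumer (names are illustrative)
module com.example.app {
    requires old.widgets; // automatic module derived from old-widgets-1.2.jar
}
```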

Not every unmodularized JAR file can be moved to the module path and made an automatic module. There is a rule that a package can only be part of one named module. So if a package is in more than one JAR file, then only one of the JAR files containing that package can be made into an automatic module – the others can be left in the classpath and remain part of the unnamed module.

The mechanism we’ve described sounds complicated, but it’s really quite simple. All it really means is that you can leave your old JAR files in the classpath or you can move them to the module path. You can modularize them or you can leave them unmodularized. Once they’re modularized, you can leave them in the classpath or put them in the module path. In most cases, everything should just work as before. Your legacy JAR files should be at home in the new module system. Of course, the more you modularize, the more dependency information can be checked, and missing modules and APIs will be detected far earlier in the development cycle, possibly saving you a lot of work.

Supplying Your Own Java Environment: The Modular JDK and Jlink

One problem with legacy Java applications is that the end user might not be using the right Java environment, and one way to guarantee that the Java application will run is to supply the Java environment with the application. Java allows the creation of a private or redistributable JRE, which may be distributed with the application. The JDK/JRE installation comes with instructions on how to create a private JRE. Typically, you take the JRE file hierarchy that’s installed with the JDK, keep the required files, and retain those optional files whose functionality your application will need. The process is a bit of a hassle: You need to maintain the installation file hierarchy, you have to be careful that you don’t leave out any files and directories that you might need, and, while it does no harm to do so, you don’t want to leave in anything that you don’t need, since it will take up unnecessary space. It’s easy to make a mistake. So why not let the JDK do the job for you? With Java 9, it’s now possible to create a self-contained environment with your application and anything it needs to run. No need to worry that the wrong Java environment is on the user’s machine, and no need to worry that you’ve created the private JRE incorrectly.

The key to creating these self-contained run-time images is the module system. Not only can you modularize your own code (or not), but the Java 9 JDK is itself now modularized. The Java class library is now a collection of modules, as are the tools of the JDK itself. The module system requires you to specify the base class modules that your code requires, and that in turn will specify the parts of the JDK that are needed. To put it all together, we use a new Java 9 tool called jlink. When you run jlink, you’ll get a file hierarchy with exactly what you’ll need to run your application — no more and no less. It’ll be much smaller than the standard JRE. Of course, it’s platform-specific (that is, specific to an operating system and machine), so if you want to create these runtime images for different platforms, you’ll need to run jlink in the context of installations on each platform for which you want an image. Also note that if you run jlink on an application in which nothing has been modularized, there won’t be enough information to narrow down the JRE, and jlink will have no choice but to package the whole JRE. Even there, you’ll get the convenience of having jlink package the JRE itself, so you don’t need to worry about correctly copying the required file hierarchy.
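A typical invocation might look like this (the module path, module name, and output directory are hypothetical):

```shell
jlink --module-path $JAVA_HOME/jmods:build/modules \
      --add-modules com.example.app \
      --launcher app=com.example.app/com.example.Main \
      --output build/app-runtime

# The trimmed image carries its own launcher and runtime:
build/app-runtime/bin/app
```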

With jlink, it becomes easy to package up your application and everything it needs to run, without worrying about getting it wrong, and only packaging that part of the runtime that’s necessary to run your application. This way, your legacy Java application has an environment on which it’s guaranteed to run.

Summing Up

One of the problems with having to maintain a legacy Java application is that you’re shut out from all the fun when a new version of Java comes along. Java 9, like its predecessors, has a bunch of great new APIs and language features, but developers remembering past experiences might assume that there’s no way to use those new features without breaking compatibility with earlier versions of Java. Java 9’s designers, to their credit, seem to have been aware of this, and they’ve worked hard to make those new features accessible to developers who have to worry about their applications supporting older versions of Java.

Multi-release JAR files allow developers to work with new Java 9 features, and segregate them in a part of the JAR file where earlier Java versions won’t see them. This makes it easy for developers to write code for Java 9 and leave the old code for Java 8 and earlier, and allow the runtime to choose the classes it can run.

Java modules allow developers to get better dependency checking by writing any new JAR files in a modular style, while leaving old code unmodularized. The system is remarkably tolerant, is designed for gradual migration, and will almost always work with legacy code that knows nothing about the module system.

The modular JDK and jlink allow users to easily create self-contained runtime images, so that an application is guaranteed to come with the Java runtime that it needs to run, and that everything that it needs is guaranteed to be there. Previously, this was an error-prone process, but with Java 9 the tools are there to make it just work.

Unlike earlier Java releases, the new features of Java 9 are ready for you to use, even if you have an older Java application and need to make sure that customers can run your application even if they’re not as eager as you are to move up to the newest Java version.

A version of this article was originally published on TechBeacon.