
Devoxx 2012: Java 8 Lambda and Parallelism, Part 1

Overview

Devoxx, the biggest vendor-independent Java conference in the world, took place in Antwerp, Belgium on 12 – 16 November. This year it was bigger than ever, reaching 3400 attendees from 40 different countries. As last year, a small group of colleagues from SAP and I were there and enjoyed it a lot.

After the impressive dance of Nao robots and the opening keynotes, more than 200 conference sessions explored a variety of different technology areas, ranging from Java SE to methodology and robotics. One of the most interesting topics for me was the evolution of the Java language and platform in JDK 8.
 
My interest was driven partly by the fact that I had already started work on Wordcounter and was finishing work on another concurrent Java library named Evictor, about which I will be blogging in a future post.

In this blog series, I would like to share somewhat more detailed summaries of the sessions on this topic that I attended. These three sessions all took place on the same day, in the same room, one after the other, and together provided three different perspectives on lambdas, parallel collections, and parallelism in general in Java 8.

In this post, I will cover the first session, with the other two coming soon.

On the road to JDK 8: Lambda, parallel libraries, and more

In the first session, Joe Darcy, a lead engineer of several projects at Oracle, introduced the key changes to the language coming in JDK 8, such as lambda expressions and default methods, summarized the implementation approach, and examined the parallel libraries and their new programming model. The slides from this session are available here.

Evolving the Java platform

Joe started by talking a bit about the context and concerns related to evolving the language.
The general evolution policy for OpenJDK is:

  • Don’t break binary compatibility
  • Avoid introducing source incompatibilities
  • Manage behavioral compatibility changes

The above list also extends to the language evolution. These rules mean that old class files will always be recognized, the cases in which currently legal code stops compiling are limited, and changes in the generated code that introduce behavioral changes are also avoided. The goals of this policy are to keep existing binaries linking and running, and to keep existing sources compiling.

This has also influenced which features were chosen to be implemented in the language itself, as well as how they were implemented. Such concerns were also in effect when adding closures to Java. Interfaces, for example, are a double-edged sword. With the language features that we have today, they cannot evolve compatibly over time. However, in reality APIs age, as people’s expectations of how they should be used evolve. Adding closures to the language results in a really different programming model, which means it would be really helpful if interfaces could be evolved compatibly. This resulted in a change affecting both the language and the VM, known as default methods.

Project Lambda

Project Lambda introduces a coordinated language, library, and VM change. In the language, there are lambda expressions and default methods. In the libraries, there are bulk operations on collections and additional support for parallelism. In the VM, besides the default methods, there are also enhancements to the invokedynamic functionality. This is the biggest change ever made to the language, bigger than other significant changes such as generics.

What is a lambda expression?

A lambda expression is an anonymous method having an argument list, a return type, and a body, and able to refer to values from the enclosing scope:

(Object o) -> o.toString()
(Person p) -> p.getName().equals(name)

Besides lambda expressions, there is also the method reference syntax:

Object::toString

The main benefit of lambdas is that they allow the programmer to treat code as data, store it in variables, and pass it to methods.
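
As a minimal illustration of this idea, the following sketch stores a predicate in a variable and passes it to a method. The printMatching helper and the class name are hypothetical, and the example uses the java.util.function.Predicate type as it ended up in the final Java 8 API:

import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;

public class CodeAsData {

    // A hypothetical helper that takes behavior as a parameter
    static void printMatching(List<String> names, Predicate<String> condition) {
        for (String name : names) {
            if (condition.test(name))
                System.out.println(name);
        }
    }

    public static void main(String[] args) {
        // The lambda is stored in a variable and passed to a method like any other value
        Predicate<String> startsWithJ = name -> name.startsWith("J");
        printMatching(Arrays.asList("Java", "Scala", "JRuby"), startsWithJ);
    }
}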

Some history

When Java was first introduced in 1995, not many languages had closures, but they are present in pretty much every major language today, even C++. For Java, it has been a long and winding road to get support for closures, until Project Lambda finally started in December 2009. The current status is that JSR 335 is in early draft review, there are binary builds available, and it is expected to become part of the mainline JDK 8 builds very soon.

Internal and external iteration

There are two ways to do iteration – internal and external. In external iteration you bring the data to the code, whereas in internal iteration you bring the code to the data. External iteration is what we have today, for example:

for (Shape s : shapes) {
    if (s.getColor() == RED)
        s.setColor(BLUE);
}

There are several limitations with this approach. One of them is that the above loop is inherently sequential, even though there is no fundamental reason it couldn’t be executed by multiple threads.
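
For comparison, parallelizing this loop by hand today means explicit thread management, for example with an ExecutorService. The following is only a rough pre-lambda sketch, assuming the shapes, RED, and BLUE from above, a thread-safe setColor, and the usual java.util.concurrent imports:

// Hand-rolled parallelism over the same loop: the intent is buried in boilerplate
ExecutorService pool = Executors.newFixedThreadPool(4);
for (final Shape s : shapes) {
    pool.submit(new Runnable() {
        public void run() {
            if (s.getColor() == RED)
                s.setColor(BLUE);
        }
    });
}
pool.shutdown();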

Re-written to use internal iteration with a lambda, the original loop would be:

shapes.forEach(s -> {
    if (s.getColor() == RED)
        s.setColor(BLUE);
});

This is not just a syntactic change, since now the library is in control of how the iteration happens. Written in this way, the code expresses much more of the what and less of the how, with the how being left to the library. The library authors are free to use parallelism, out-of-order execution, laziness, and all kinds of other techniques. This allows the library to abstract over behavior, which is a fundamentally more powerful way of doing things.

Functional Interfaces

Project Lambda avoided adding new types, instead reusing existing coding practices. Java programmers are familiar with and have long used interfaces with a single method, such as Runnable, Comparator, or ActionListener. Such interfaces are now called functional interfaces. There will also be new functional interfaces, such as Predicate and Block. A lambda expression evaluates to an instance of a functional interface, for example:

Predicate<String> isEmpty = s -> s.isEmpty();
Predicate<String> isEmptyRef = String::isEmpty;   // equivalent, using a method reference
Runnable r = () -> { System.out.println("Boo!"); };

So existing libraries are forward-compatible with lambdas, which results in an “automatic upgrade”, maintaining the significant investment in those libraries.
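
For example, a lambda can be passed directly to long-existing APIs that expect a Comparator or a Runnable, without any change to those APIs. A small, self-contained illustration:

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class AutomaticUpgrade {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("lambda", "stream", "default");

        // Collections.sort has accepted a Comparator since Java 1.2;
        // a lambda can now be passed where a Comparator instance is expected
        Collections.sort(words, (a, b) -> Integer.compare(a.length(), b.length()));

        // The same applies to Runnable and the Thread constructor
        new Thread(() -> System.out.println(words)).start();
    }
}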

Default Methods

The above example used a new method on Collection, forEach. However, adding a method to an existing interface is a no-go in Java, as it would result in an error at run time when a client calls the new method on an old implementation class that does not provide it.

A default method is an interface method that has an implementation, which is woven in by the VM at link time. In a sense, this is multiple inheritance, but there’s no reason to panic, since this is multiple inheritance of behavior, not state. The syntax looks like this:

interface Collection<T> {
    ...
    default void forEach(Block<T> action) {
        for (T t : this)
            action.apply(t);
    }
}

There are certain inheritance rules to resolve conflicts between multiple supertypes:

  • Rule 1 – prefer superclass methods to interface methods (“Class wins”)
  • Rule 2 – prefer more specific interfaces to less specific ones (“Subtype wins”)
  • Rule 3 – otherwise, act as if the method is abstract. In the case of conflicting defaults, the concrete class must provide an implementation.

In summary, conflicts are resolved by looking for a unique, most specific default-providing interface. With these rules, “diamonds” are not a problem. In the worst case, when there isn’t a unique most specific implementation of the method, the subclass must provide one, or there will be a compile error. If this implementation needs to call one of the inherited implementations, the new syntax for this is A.super.m(), as in the small sketch below.
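
As an illustration of rule 3 and the new syntax, consider two hypothetical interfaces A and B that both provide a default for the same method:

interface A {
    default String hello() { return "Hello from A"; }
}

interface B {
    default String hello() { return "Hello from B"; }
}

// Neither A nor B is more specific than the other, so C must resolve the conflict itself
class C implements A, B {
    @Override
    public String hello() {
        // Explicitly delegate to one of the inherited default implementations
        return A.super.hello();
    }
}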

The primary goal of default methods is API evolution, but they are also useful as an inheritance mechanism in their own right. Another way to benefit from them is optional methods. For example, most implementations of Iterator don’t provide a useful remove(), so it can be declared “optional” as follows:

interface Iterator<T> {
    ...
    default void remove() {
        throw new UnsupportedOperationException();
    }
}

Bulk operations on collections

Bulk operations on collections also enable a map / reduce style of programming. For example, the above code could be further decomposed by getting a stream from the shapes collection, filtering the red elements, and then iterating only over the filtered elements:

shapes.stream()
      .filter(s -> s.getColor() == RED)
      .forEach(s -> s.setColor(BLUE));

The above code corresponds even more closely to the problem statement of what you actually want to get done. There are also other useful bulk operations, such as map, into, and sum; a small sketch follows the list below. The main advantages of this programming model are:

  • More composability
  • Clarity – each stage does one thing
  • The library can use parallelism, out-of-order, laziness for performance, etc.
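
As a small sketch of such a map / reduce style pipeline, here is a sum over the red shapes. It uses the stream API as it ended up in the final Java 8 release (mapToInt and sum) rather than the exact preview names mentioned in the talk, and assumes a hypothetical getArea() method on Shape:

// Total area of all red shapes: filter, map to int, then reduce to a sum
int totalRedArea = shapes.stream()
                         .filter(s -> s.getColor() == RED)
                         .mapToInt(s -> s.getArea())
                         .sum();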

The stream is the basic new abstraction being added to the platform. It encapsulates laziness as a better alternative to “lazy” collections such as LazyList. It is a facility for obtaining a sequence of elements, whose source can be a collection, an array, or a generator function. The basic programming model with streams is that of a pipeline, such as collection-filter-map-sum or array-map-sorted-forEach. Since streams are lazy, they only compute as elements are needed, which pays off in cases like filter-map-findFirst.
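
A sketch of such a short-circuiting pipeline, using the final Java 8 API (java.util.Optional) and a hypothetical List<Person> with getAge() and getName() methods:

// Thanks to laziness, filter and map are applied only until findFirst gets a match;
// the remaining elements are never examined
Optional<String> firstAdultName = people.stream()
                                        .filter(p -> p.getAge() >= 18)
                                        .map(p -> p.getName())
                                        .findFirst();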

Another advantage of streams is that they make it possible to take advantage of fork/join parallelism, with the libraries using fork/join behind the scenes to ease programming and avoid boilerplate.
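
For example, in the final Java 8 API the earlier pipeline can be run in parallel simply by asking the collection for a parallel stream; the library then partitions the work over a fork/join pool behind the scenes (the preview builds at the time exposed this slightly differently):

// Same logic as before, but the library decides how to split and schedule the work
shapes.parallelStream()
      .filter(s -> s.getColor() == RED)
      .forEach(s -> s.setColor(BLUE));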

Implementation technique

In the last part of his talk, Joe described the advantages and disadvantages of the possible implementation techniques for lambda expressions. Different options such as inner classes and method handles were considered, but not accepted due to their shortcomings. The best solution would involve adding a level of indirection, by letting the compiler emit a declarative recipe, rather than imperative code, for creating a lambda, and then letting the runtime execute that recipe however it deems fit (and make sure it’s fast).

This sounded like a job for invokedynamic, a new invocation mode introduced in Java SE 7 for an entirely different reason: supporting dynamic languages on the JVM. It turned out that this feature is not just for dynamic languages anymore, as it provides a suitable implementation mechanism for lambdas and is also much better in terms of performance than the alternatives.

Conclusion

Project Lambda is a large, coordinated update across the Java language and platform. It enables a much more powerful programming model for collections and takes advantage of new features in the VM. You can evaluate these new features by downloading the JDK 8 build with lambda support. IDE support is also already available, in NetBeans builds with Lambda support and IntelliJ IDEA 12 EAP builds with Lambda support.

I have already gained my own experience with lambdas in Java while working on Wordcounter. As I already wrote, I am convinced that this style of programming will quickly become pervasive in Java, so if you don’t yet have experience with it, I encourage you to try it out.
 

Reference: Devoxx 2012: Java 8 Lambda and Parallelism, Part 1 from our JCG partner Stoyan Rachev at the Stoyan Rachev’s Blog blog.

Stoyan Rachev

Software developer, architect, and agile software engineering coach at SAP