

Java 8 Friday: Let’s Deprecate Those Legacy Libs

At Data Geekery, we love Java. And as we're really into jOOQ's fluent API and query DSL, we're absolutely thrilled about what Java 8 will bring to our ecosystem.

Java 8 Friday

Every Friday, we're showing you a couple of nice new tutorial-style Java 8 features, which take advantage of lambda expressions, extension methods, and other great stuff. You'll find the source code on GitHub. For the last two Fridays, we've been off for our Easter break, but now we're back with another fun article:

Let's Deprecate Those Legacy Libs

Apart from lambdas and extension methods, the JDK has also been enhanced with a lot of new library code, e.g. the Streams API and much more. This means that we can critically review our stacks and – to the great joy of Doctor Deprecator – throw out all the garbage that we no longer need. Here are a couple of examples, just to name a few:

LINQ-style libraries

There are lots of libraries that try to emulate LINQ (i.e. the LINQ-to-Collections part). We've already made our point before, because we now have the awesome Java 8 Streams API. Five years from today, no Java developer will be missing LINQ any longer, and we'll all be Streams masters with Oracle Certified Streams Developer certifications hanging on our walls. Don't get me wrong. This isn't about LINQ or Streams being better. They're pretty much the same. But since we now have Streams in the JDK, why worry about LINQ? Besides, the SQL-esque syntax for collection querying was misleading anyway. SQL itself is much more than Streams will ever be (or needs to be). So let's list a couple of LINQ-esque APIs which we'll no longer need:

LambdaJ

This was a fun attempt at emulating closures in Java through arcane and nasty tricks like ThreadLocal.
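Before looking at the individual libraries, here's the sort of LINQ-style collection query that plain Java 8 Streams now covers out of the box. This is a quick, self-contained sketch of my own; the class and sample data are made up for illustration:

```java
import java.util.Arrays;
import java.util.List;

import static java.util.stream.Collectors.toList;

public class StreamsVsLinq {
    public static void main(String[] args) {
        // Hypothetical sample data -- any list of strings works the same way
        List<String> names = Arrays.asList("Lukas", "Mario", "Julian", "Stephen", "Wagner");

        // WHERE ... SELECT ... ORDER BY, expressed as filter / map / sorted
        List<String> result = names.stream()
                .filter(n -> n.contains("a"))   // keep names containing an 'a'
                .map(String::toUpperCase)       // project each name to upper case
                .sorted()                       // order the result
                .collect(toList());

        System.out.println(result);
        // prints: [JULIAN, LUKAS, MARIO, WAGNER]
    }
}
```

No external dependency, no SQL-esque DSL: just the JDK.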
Consider the following code snippet (taken from here):

```java
// This lets you "close over" the
// System.out.println method
Closure println = closure(); {
    of(System.out).println(var(String.class));
}

// in order to use it like so:
println.apply("one");
println.each("one", "two", "three");
```

Nice idea, although that semi-colon after closure(); and before that pseudo-closure-implementation block, which is not really a closure body… all of that seems quite quirky! Now, we'll write:

```java
Consumer<String> println = System.out::println;

println.accept("one");
Stream.of("one", "two", "three").forEach(println);
```

No magic here, just plain Java 8. Let's hear it one last time for Mario Fusco and Lambdaj.

Linq4j

Apparently, this is still being developed actively… Why? Do note that the roadmap also has a LINQ-to-SQL implementation in it, including:

- Parser support. Either modify a Java parser (e.g. OpenJDK), or write a pre-processor.
- Generate Java code that includes expression trees.

Yes, we'd like to have such a parser for jOOQ as well. It would allow us to truly embed SQL in Java, similar to SQLJ, but typesafe. But if we have the Streams API, why not implement something like Streams-to-SQL? We cannot say farewell to Julian Hyde's Linq4j just yet, as he's still continuing work. But we believe that he's investing in the wrong corner.

Coolection

This is a library with a fun name, and it allows for doing things like…

```java
from(animals).where("name", eq("Lion"))
             .and("age", eq(2))
             .all();

from(animals).where("name", eq("Dog"))
             .or("age", eq(5))
             .all();
```

But why do it this way, when you can write:

```java
animals.stream()
       .filter(a -> a.name.equals("Lion") && a.age == 2)
       .collect(toList());

animals.stream()
       .filter(a -> a.name.equals("Dog") || a.age == 5)
       .collect(toList());
```

Let's hear it for Wagner Andrade. And then off to the bin.

Half of Guava

Guava has been pretty much a dump for all sorts of logic that should have been in the JDK in the first place. Take com.google.common.base.Joiner for instance.
It is used for string joining:

```java
Joiner joiner = Joiner.on("; ").skipNulls();
// ...
return joiner.join("Harry", null, "Ron", "Hermione");
```

No need any more. We can now write:

```java
Stream.of("Harry", null, "Ron", "Hermione")
      .filter(s -> s != null)
      .collect(joining("; "));
```

Note also that the skipNulls flag and all sorts of other nice-to-have utilities are no longer necessary, as the Streams API along with lambda expressions allows you to decouple the joining task from the filtering task very nicely. Convinced? No? What about:

- com.google.common.base.Optional -> java.util.Optional
- com.google.common.base.Predicate -> java.util.function.Predicate
- com.google.common.base.Supplier -> java.util.function.Supplier

And then, there's the whole set of Functional stuff that can be thrown into the bin as well: https://code.google.com/p/guava-libraries/wiki/FunctionalExplained

Of course, once you've settled on using Guava throughout your application, you won't remove its usage quickly. But on the other hand, let's hope that parts of Guava will be deprecated soon, in favour of an integration with Java 8.

JodaTime

Now, this one is a no-brainer, as the popular JodaTime library got standardised into the java.time packages. This is great news. Let's hear it for "Joda" Stephen Colebourne and his great work for JSR-310.

Apache commons-io

The java.nio packages got even better with new methods that nicely integrate with the Streams API (or not). One of the main reasons why anyone would have ever used Apache Commons IO was the fact that it was horribly tedious to read files prior to Java 7 / 8. I mean, who would've enjoyed this piece of code (from here):

```java
try (RandomAccessFile file = new RandomAccessFile(filePath, "r")) {
    byte[] bytes = new byte[size];
    file.read(bytes);
    return new String(bytes); // encoding?? ouch!
}
```

Over this one?

```java
List<String> lines = FileUtils.readLines(file);
```

But forget the latter. You can now use the new methods in java.nio.file.Files, e.g.
```java
List<String> lines = Files.readAllLines(path);
```

No need for third-party libraries any longer!

Serialisation

Throw it all out, for there is JEP 154, deprecating serialisation. Well, it wasn't accepted, but we could've surely removed about 10% of our legacy codebase.

A variety of concurrency APIs and helpers

With JEP 155, there have been a variety of improvements to concurrency APIs, e.g. to ConcurrentHashMaps (we've blogged about it before), but also the awesome LongAdders, about which you can read a nice article over at the Takipi blog. Haven't I seen a whole com.google.common.util.concurrent package over at Guava, recently? Probably not needed anymore.

JEP 154 (Serialisation) wasn't real

It was an April Fools' joke, of course…

Base64 encoders

How could this take so long?? In 2003, we had RFC 3548, specifying Base16, Base32, and Base64 data encodings, which was in fact based upon the base 64 encoding specified in RFC 1521, from 1993, or RFC 2045 from 1996, and if we're willing to dig further into the past, I'm sure we'll find earlier references to this simple idea of encoding binary data in text form. Now, in 2014, we finally have JEP 135 as a part of Java SE 8, and thus (you wouldn't believe it): java.util.Base64. Off to the trash can with all of these libraries!

- Apache Commons Codec (unless you're using some other weird encoding from that library)
- JAXB's internal Base64 encoders
- Guava, again
- JEE's javax.mail.internet.MimeUtility
- Jetty's implementation
- This weird thing here
- Or this weird thing here

… gee, it seems like everyone and their dog worked around this limitation prior to JDK 8…

More?

Provide your suggestions in the comments! We're curious to hear your thoughts (with examples!)

Conclusion

As with any major Java release, there is a lot of new stuff that we have to learn, and that allows us to remove third-party libraries. This is great, because many good concepts have been consolidated into the JDK, available on every JVM without external dependencies.
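To close the Base64 point with a concrete example, the new standard API really is as simple as it gets. A quick sketch (my own demo class, not from the original post):

```java
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        // Encode text to Base64 and back, using nothing but the JDK
        String encoded = Base64.getEncoder()
                .encodeToString("Hello, World!".getBytes());
        System.out.println(encoded);             // SGVsbG8sIFdvcmxkIQ==

        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(new String(decoded)); // Hello, World!
    }
}
```

There are also URL-safe and MIME variants via Base64.getUrlEncoder() and Base64.getMimeEncoder(), which cover most of what those third-party encoders were used for.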
Disclaimer: Not everything in this article was meant seriously. Many people have created great pieces of work in the past. They have been very useful, even if they are somewhat deprecated now. Keep innovating, guys! Want to delve more into the many new things Java 8 offers? Go have a look over at the Baeldung blog, where this excellent list of Java 8 resources is featured: http://www.baeldung.com/java8 … and stay tuned for our next Java 8 Friday blog post, next week!  Reference: Java 8 Friday: Let’s Deprecate Those Legacy Libs from our JCG partner Lukas Eder at the JAVA, SQL, AND JOOQ blog....

Java EE7 and Maven project for newbies – part 2 – defining a simple war for our application

Resuming from the first part (Part #1): we have just defined our parent pom, a special type of pom that eventually defines the libraries that our application is going to use. It also configures all the Maven tools used in order to package each module of our application. You can check out the part 1 sample code here. So, up until now, in the directory where we will be developing our application we have a single folder called sample-parent, and in this directory a pom.xml resides. Our parent pom! As we can see in the modules section, we have defined the building blocks of our application:

- sample-ear
- sample-web
- sample-services
- sample-domain

We need to create the related Maven modules and add the specific pom.xml files for each one of them.

Defining the war module

Under the sample-parent folder we create a sub-folder called sample-web and we also add a pom.xml file (some people do it on the same level).

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>gr.javapapo</groupId>
        <artifactId>sample-parent</artifactId>
        <version>0.0.1-SNAPSHOT</version>
    </parent>
    <artifactId>sample-web</artifactId>
</project>
```

But this alone is just nothing; we need to be more specific about what this pom will help us build, so we need to define the packaging type, a name for the module (for this war) and any dependencies.

...
```xml
    <artifactId>sample-web</artifactId>
    <packaging>war</packaging>
    <build>
        <finalName>${project.artifactId}</finalName>
    </build>
    <dependencies>
        <dependency>
            <groupId>javax</groupId>
            <artifactId>javaee-api</artifactId>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</project>
```

In case you are using an IDE (e.g. Eclipse) that supports Maven, it will automatically detect the changes in the content of your pom and will create for you the folders that conform with the Maven war packaging. You can of course do it on your own, but it is handy! It will create the following structure:

sample-web
└── src
    ├── main
    │   ├── java       (add your java code here)
    │   ├── webapp     (this is where WEB-INF\web.xml is placed)
    │   └── resources  (resources, like properties)
    └── test
        ├── java
        └── resources

Under the webapp subfolder I have already pre-created the \WEB-INF\web.xml file. I could skip this part because the Maven plugin can do it for us, but it is just to showcase that there are cases where you want to create it on your own, with any custom entries. If you are wondering what to 'put' in an empty Servlet 3.1 web.xml file, then have a look here, or download the code for this post. I have also added, in the java subfolder under a simple package, a very simple servlet that is going to be included in our application. Just a few lines of code. Again, you can download all the code from the related git (Bitbucket) link at the end of the post. So, we have added just a few lines to our war module pom file, and then, in case we have an IDE, the tool magically created a very specific folder layout for us. We have 'followed' this layout and added a very simple servlet java class and a small xml descriptor. What is the real point here? Well, the great thing about Maven is that some of the things that our war module needs in order to be built are already defined and configured in the 'special' parent pom. But what are these things, and how is Maven going to use them? As we have already elaborated, Maven is all about conventions.
You put the right things in the 'right' way and then it does all the work for you. So when Maven scans this war-packaging pom, it will need to:

- compile our java class, which is a servlet, and
- package everything under the sample-web folder into a war file, plus any dependencies.

Who is going to do all of these things, since we have not added anything special in our war pom (except the one dependency library)? Well, it is the configuration of our parent pom (see the previous post). The maven-compiler-plugin is going to be 'invoked' in order to compile our sources, and since we have defined that the packaging of our Maven module is 'war', the maven-war-plugin is going to be invoked to package everything for us and create the appropriate descriptors. So in a case where our application might have several war or jar modules, if we have a parent pom and we have defined the plugins and a basic configuration for them in one central place, then we do not have to re-define them in all of our war / jar poms. Only in case one of the wars or jars needs special treatment (e.g. to package something extra or have a special layout) could we re-define the plugin under the build section and overwrite or add some extra behavior. But this is not our case. We want our plugins to be defined once, with a common configuration that is going to be 'inherited' by all the modules of our application that are going to use it. Using the above hint, you can experiment and try to create the sample-services module we have 'defined' above, or wait for the third part where we will quickly cover the rest of the concrete modules. You can find the code for this post here (post2 tag).

Resources

- Part 1
- Maven war folder layout
- JavaEE 7 xml descriptors

Reference: Java EE7 and Maven project for newbies – part 2 – defining a simple war for our application from our JCG partner Paris Apostolopoulos at the Papo's log blog....
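As a head start for the experiment suggested above (creating the sample-services module yourself), a minimal pom could look like the following sketch. This is my own assumption, not the code from part 3: jar packaging is assumed for plain service classes, and part 3 may well choose ejb packaging instead.

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <!-- inherit plugin configuration and dependency versions from the parent -->
    <parent>
        <groupId>gr.javapapo</groupId>
        <artifactId>sample-parent</artifactId>
        <version>0.0.1-SNAPSHOT</version>
    </parent>
    <artifactId>sample-services</artifactId>
    <!-- assumed packaging; the actual module may use ejb instead -->
    <packaging>jar</packaging>
</project>
```

Note how, thanks to inheritance from sample-parent, nothing else needs to be configured here.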

Broken company acronyms

Acronyms exist as easy-to-remember references for extended topics, and obviously to somehow summarize them. That's great, especially when you can use them as new words to quickly express relationships and get straight to the point (mentioning, for instance, an SLA, KPI, SOA, ROI and so on). Moreover, they can also be used as a simple and basic criterion to check their real application (which indeed is the part I like the most): if you claim to apply the SRP (Single Responsibility Principle), I can then check whether there is a single responsibility or not in the concerned component; if you claim to be a PM (Project Manager), I can check it against your acronym: whether there is a project and effective management. If the applied acronym doesn't really match the particular case, then something is wrong, or at least it's an alert to investigate further. And you can actually perform that simple and basic check within companies, departments and projects (and on your own role as well, repeatedly over time) as a coherency and health control. Among others, the most common broken acronyms we can encounter are:

R&D (Research & Development): probably among the most abused ones. If research is indeed developing ideas and concepts, if developing a new product indeed involves research concerning the right technology stack, approach and design, if hence research and development look so strongly coupled, then we should clearly always use such an acronym for any development team, and as a consequence development should always be fun and challenging. But that's obviously not true, and it often happens that an R&D team should actually be called F&M, Fix & Maintenance, because that's exactly what the team may be providing daily. Research is just an activity (and normally performed by just a few members of a team) within the wider and more complex domain of development.
It might sound cool, it might look promising and appealing, but in most cases R&D would easily be a broken acronym. If you fancy a change, consider PD (Product Development) or SD (Solution Delivery) instead; wouldn't it be more honest and clear?

HR (Human Resources): often disappointing, the human resources department mainly provides administration and coordination services. They will certainly be your (starting) contact point for interviews, payroll and specific cases indeed, but they focus much more on the Resources part of the acronym, easily forgetting about the Human one. You can then meet HR assistants with a degree in Psychology or Sociology but concretely working with MS Excel, daily filling in numbers and checking budgets, amounts and allocations. RA (Resource Administration) would be much more precise for managing the human capital of a company. HR officers should focus on empowering people instead: not just organizing a company event, but actively reviewing managers' performance and their impact on teams and engagement (which also impacts the benefit and productivity of the company, by the way). If you haven't seen your HR contact for years, if you didn't even need to exchange emails, or the communication is reduced to a recurring template (about time-sheets, about reimbursements, and so on), then it's all about Resources, not much about Human indeed.

TL (Team Lead): sadly, it happens that we see leaders ending up just assigning tickets and caring about their final status, focusing on project deadlines, roadmaps and business needs. It might be because of their workload or even due to a lack of soft skills, but when there is no follow-up nor coaching, as long as there is no active team interaction nor cooperation, then that acronym is broken, because a team doesn't simply mean a pool of resources, and leadership is something more complex than task assignment.
TM (Task Manager) or RL (Resource Lead) would be more appropriate in such a case, or you may also consider the PO (Product Owner) introduced by Scrum, to a certain extent though. Of course, you can't pretend that each leader would act as a resonant leader, because that's not an easy accomplishment indeed (though not impossible either), but you can however try to keep that acronym in mind and focus on the Team as well; your subordinates will certainly appreciate it, positively impacting motivation and commitment (and, again, company benefit and productivity as well).

Conclusion

It might sound silly, but verifying whether someone or some company entity is actually acting according to its acronym immediately provides a quick and simple check of the evolution of responsibilities over time and of proper working attitudes: what somebody was supposed to do (or the initial intention of the role) versus what he/she is actually doing (any encountered gap could be a point of improvement). It should probably be the starting point of any peer review as well: remind me of your acronym, now tell me what your main activity is, how you accomplish it and in which part of your acronym you are weaker/stronger; if something sounds broken, then something should change, quickly.

Reference: Broken company acronyms from our JCG partner Antonio Di Matteo at the Refactoring Ideas blog....

Java 8 : Functional Interface Example

To support lambda expressions, Java 8 introduced functional interfaces. An interface which has a single abstract method is called a functional interface. Runnable, Comparator and Callable are some examples of functional interfaces. We can implement these functional interfaces using lambda expressions. For example:

```java
Thread t = new Thread(new Runnable() {
    public void run() {
        System.out.println("Runnable implemented by using Lambda Expression");
    }
});
```

This is the old way of creating a Thread. As Runnable has a single abstract method, we can consider it a functional interface and use a lambda expression like below:

```java
Thread t = new Thread(() -> {
    System.out.println("Runnable implemented by using Lambda Expression");
});
```

Here, instead of passing a Runnable object, we just passed a lambda expression.

Declaring our own functional interfaces

We can declare our own functional interface by defining a single abstract method in an interface:

```java
public interface FunctionalInterfaceTest {
    void display();
}

// Test class to implement the above interface
public class FunctionInterfaceTestImpl {
    public static void main(String[] args) {
        // Old way, using an anonymous inner class
        FunctionalInterfaceTest fit = new FunctionalInterfaceTest() {
            public void display() {
                System.out.println("Display from old way");
            }
        };
        fit.display(); // outputs: Display from old way

        // Using a lambda expression
        FunctionalInterfaceTest newWay = () -> { System.out.println("Display from new Lambda Expression"); };
        newWay.display(); // outputs: Display from new Lambda Expression
    }
}
```

We can annotate the interface with the @FunctionalInterface annotation, so that the compiler reports an error if it declares more than one abstract method. The annotation is optional. For example:

```java
@FunctionalInterface
public interface FunctionalInterfaceTest {
    void display();
    void anotherDisplay(); // compile error: a functional interface should have only one abstract method
}
```

Default methods

A functional interface cannot have more than one abstract method, but it can have more than one default method.
Default methods were introduced in Java 8 to add new methods to an interface without disturbing the implementing classes.

```java
interface DefaultInterfaceTest {
    void show();

    default void display() {
        System.out.println("Default method from interface can have body..!");
    }
}

public class DefaultInterfaceTestImpl implements DefaultInterfaceTest {
    public void show() {
        System.out.println("show method");
    }

    // we don't need to provide any implementation for the default method

    public static void main(String[] args) {
        DefaultInterfaceTest obj = new DefaultInterfaceTestImpl();
        obj.show();    // outputs: show method
        obj.display(); // outputs: Default method from interface can have body..!
    }
}
```

The main use of default methods is that we can add a method to an interface without forcing the implementing classes to change.

Multiple inheritance

If the same default method is present in two interfaces and one class implements both of them, then it will throw an error:

```java
// Normal interface with a show method
interface Test {
    default void show() {
        System.out.println("show from Test");
    }
}

// Another interface with the same show method
interface AnotherTest {
    default void show() {
        System.out.println("show from AnotherTest");
    }
}

// Main class implementing the above two interfaces
class Main implements Test, AnotherTest {
    // here is an ambiguity: which show method is to be inherited here?
}
```

This class won't compile because there is an ambiguity between the show() methods of the Test and AnotherTest interfaces. To resolve this, we need to override the show() method in the Main class:

```java
class Main implements Test, AnotherTest {
    public void show() {
        System.out.println("Main show method");
    }
}
```

Reference: Java 8 : Functional Interface Example from our JCG partner Ramesh Kotha at the java2practice blog....
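As an aside to the ambiguity resolution shown above: instead of providing a brand-new body, the overriding method may also delegate explicitly to one parent's default via the InterfaceName.super syntax. A small self-contained sketch:

```java
interface Test {
    default void show() {
        System.out.println("show from Test");
    }
}

interface AnotherTest {
    default void show() {
        System.out.println("show from AnotherTest");
    }
}

public class Main implements Test, AnotherTest {
    // Resolve the ambiguity by delegating to one parent's default explicitly
    @Override
    public void show() {
        Test.super.show();
    }

    public static void main(String[] args) {
        new Main().show(); // prints: show from Test
    }
}
```

This way the class still compiles against both interfaces, while picking exactly one of the inherited implementations.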

Java Tutorial Through Katas: Tennis Game (Easy)

A programming kata is an exercise which helps a programmer hone his skills through practice and repetition. This article is part of the series "Java Tutorial Through Katas". Articles are divided into easy, medium and hard.

- Fizz Buzz (Easy) – Java 7
- Berlin Clock (Easy) – Java 7 and 8
- Tennis Game (Easy) – Java 7
- Reverse Polish Notation (Medium) – Java 7 and 8

The article assumes that the reader already has experience with Java, that he is familiar with the basic usage of JUnit tests and that he knows how to run them from his favorite IDE (ours is IntelliJ IDEA). Tests that prove that the solution is correct are displayed below. The recommended way to solve this kata is to use the test-driven development approach (write the implementation for the first test, confirm that it passes and move on to the next). Once all of the tests pass, the kata can be considered solved. One possible solution is provided below the tests. Try to solve the kata by yourself first.

Tennis Game

Implement a simple tennis game.

Rules:

- Scores from zero to three points are described as "love", "fifteen", "thirty", and "forty" respectively.
- If at least three points have been scored by each side and a player has one more point than his opponent, the score of the game is "advantage" for the player in the lead.
- If at least three points have been scored by each player, and the scores are equal, the score is "deuce".
- A game is won by the first player to have won at least four points in total and at least two points more than the opponent.

[TESTS]

```java
public class GameTest {

    Player victor;
    Player sarah;
    Game game;

    @Before
    public void beforeGameTest() {
        victor = new Player("Victor");
        sarah = new Player("Sarah");
        game = new Game(victor, sarah);
    }

    @Test
    public void loveShouldBeDescriptionForScore0() {
        Game game = new Game(victor, sarah);
        assertThat(game, hasProperty("score", is("love, love")));
    }

    @Test
    public void fifteenShouldBeDescriptionForScore1() {
        sarah.winBall();
        assertThat(game, hasProperty("score", is("love, fifteen")));
    }

    @Test
    public void thirtyShouldBeDescriptionForScore2() {
        victor.winBall();
        victor.winBall();
        sarah.winBall();
        assertThat(game, hasProperty("score", is("thirty, fifteen")));
    }

    @Test
    public void fortyShouldBeDescriptionForScore3() {
        IntStream.rangeClosed(1, 3).forEach(i -> victor.winBall());
        assertThat(game, hasProperty("score", is("forty, love")));
    }

    @Test
    public void advantageShouldBeDescriptionWhenAtLeastThreePointsHaveBeenScoredByEachSideAndPlayerHasOnePointMoreThanHisOpponent() {
        IntStream.rangeClosed(1, 3).forEach(i -> victor.winBall());
        IntStream.rangeClosed(1, 4).forEach(i -> sarah.winBall());
        assertThat(game, hasProperty("score", is("advantage Sarah")));
    }

    @Test
    public void deuceShouldBeDescriptionWhenAtLeastThreePointsHaveBeenScoredByEachPlayerAndTheScoresAreEqual() {
        for (int index = 1; index <= 3; index++) {
            victor.winBall();
        }
        for (int index = 1; index <= 3; index++) {
            sarah.winBall();
        }
        assertThat(game, hasProperty("score", is("deuce")));
        victor.winBall();
        assertThat(game, hasProperty("score", is(not("deuce"))));
        sarah.winBall();
        assertThat(game, hasProperty("score", is("deuce")));
    }

    @Test
    public void gameShouldBeWonByTheFirstPlayerToHaveWonAtLeastFourPointsInTotalAndWithAtLeastTwoPointsMoreThanTheOpponent() {
        for (int index = 1; index <= 4; index++) {
            victor.winBall();
        }
        for (int index = 1; index <= 3; index++) {
            sarah.winBall();
        }
        assertThat(game, hasProperty("score", is(not("Victor won"))));
        assertThat(game, hasProperty("score", is(not("Sarah won"))));
        victor.winBall();
        assertThat(game, hasProperty("score", is("Victor won")));
    }
}
```

Test code can be found in the GitHub GameTest.java. Another set of tests (not listed above) can be found in the GitHub PlayerTest.java.

[ONE POSSIBLE SOLUTION]

```java
public class Game {

    private Player player1;
    private Player player2;

    public Game(Player player1, Player player2) {
        this.player1 = player1;
        this.player2 = player2;
    }

    public String getScore() {
        if (player1.getScore() >= 3 && player2.getScore() >= 3) {
            if (Math.abs(player2.getScore() - player1.getScore()) >= 2) {
                return getLeadPlayer().getName() + " won";
            } else if (player1.getScore() == player2.getScore()) {
                return "deuce";
            } else {
                return "advantage " + getLeadPlayer().getName();
            }
        } else {
            return player1.getScoreDescription() + ", " + player2.getScoreDescription();
        }
    }

    public Player getLeadPlayer() {
        return (player1.getScore() > player2.getScore()) ? player1 : player2;
    }
}
```

Java solution code can be found in the Game.java solution. It uses the Player class (not listed above) that can be found in the Player.java solution.

Reference: Java Tutorial Through Katas: Tennis Game (Easy) from our JCG partner Viktor Farcic at the Technology conversations blog....
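For readers following along without the repository: the Player class is not listed in the original post, but from its usage in Game a minimal sketch could look like the following. This is my own reconstruction, not Viktor's Player.java; the actual solution may differ in details (the clamp in getScoreDescription, for instance, is my assumption):

```java
public class Player {

    private static final String[] DESCRIPTIONS = {"love", "fifteen", "thirty", "forty"};

    private final String name;
    private int score;

    public Player(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    public int getScore() {
        return score;
    }

    public void winBall() {
        score++;
    }

    public String getScoreDescription() {
        // clamp at "forty"; Game only asks for a description below deuce anyway
        return DESCRIPTIONS[Math.min(score, 3)];
    }

    public static void main(String[] args) {
        Player sarah = new Player("Sarah");
        sarah.winBall();
        sarah.winBall();
        System.out.println(sarah.getName() + ": " + sarah.getScoreDescription());
        // prints: Sarah: thirty
    }
}
```

With this in place, the GameTest suite above has everything it needs to compile.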

Scala for-comprehension with concurrently running futures

Can you tell what’s the difference between the following two? If yes, then you’re great and you don’t need to read further.                 Version 1: val milkFuture = future { getMilk() } val flourFuture = future { getFlour() }for { milk <- milkFuture flour <- flourFuture } yield (milk + flour) Version 2: for { milk <- future { getMilk() } flour <- future { getFlour() } } yield (milk + flour) You are at least curious if you got here. The difference is that the two futures in version 1 (can possibly) run in parallel, but in version 2 they can not. Function getFlour() is executed only after getMilk() is completed. In the first version both futures are created before they are used in the for-comprehension. Once they exists it’s only up to execution context when they run, but nothing prevents them to be executed. I am trying not to say that they for sure run in parallel becuase that depends on many factors like thread pool size, execution time, etc. But the point is that they can run in parallel. The second version looks very similar, but the problem is that the “getFlour()” future is created only once the “getMilk()” future is already completed. Therefore the two futures can never run concurrently no matter what. Don’t forget that the for-comprehension is just a syntactic sugar for methods “map”, “flatMap” and “withFilter”. There’s no magic behind. That’s all folks. Happy futures to you.Reference: Scala for-comprehension with concurrently running futures from our JCG partner Rado Buransky at the Rado Buransky’s Blog blog....

Spring Scala based sample bean configuration

I have been using Spring Scala for a toy project for the last few days and I have to say that it is a fantastic project; it simplifies Spring configuration even further when compared to the already simple configuration purely based on Spring Java Config. Let me demonstrate this by starting with the Cake Pattern based sample here:

```scala
// =======================
// service interfaces
trait OnOffDeviceComponent {
  val onOff: OnOffDevice
  trait OnOffDevice {
    def on: Unit
    def off: Unit
  }
}

trait SensorDeviceComponent {
  val sensor: SensorDevice
  trait SensorDevice {
    def isCoffeePresent: Boolean
  }
}

// =======================
// service implementations
trait OnOffDeviceComponentImpl extends OnOffDeviceComponent {
  class Heater extends OnOffDevice {
    def on = println("heater.on")
    def off = println("heater.off")
  }
}

trait SensorDeviceComponentImpl extends SensorDeviceComponent {
  class PotSensor extends SensorDevice {
    def isCoffeePresent = true
  }
}

// =======================
// service declaring two dependencies that it wants injected
trait WarmerComponentImpl {
  this: SensorDeviceComponent with OnOffDeviceComponent =>

  class Warmer {
    def trigger = {
      if (sensor.isCoffeePresent) onOff.on
      else onOff.off
    }
  }
}

// =======================
// instantiate the services in a module
object ComponentRegistry extends
  OnOffDeviceComponentImpl with
  SensorDeviceComponentImpl with
  WarmerComponentImpl {

  val onOff = new Heater
  val sensor = new PotSensor
  val warmer = new Warmer
}

// =======================
val warmer = ComponentRegistry.warmer
warmer.trigger
```

The Cake Pattern is a pure Scala way of specifying the dependencies.
Now, if we were to specify this dependency using Spring's native Java config, but with Scala as the language, let me first start with the components that need to be wired together:

```scala
trait SensorDevice {
  def isCoffeePresent: Boolean
}

class PotSensor extends SensorDevice {
  def isCoffeePresent = true
}

trait OnOffDevice {
  def on: Unit
  def off: Unit
}

class Heater extends OnOffDevice {
  def on = println("heater.on")
  def off = println("heater.off")
}

class Warmer(s: SensorDevice, o: OnOffDevice) {
  def trigger = {
    if (s.isCoffeePresent) o.on
    else o.off
  }
}
```

and the configuration that wires these components together, with a sample that uses this configuration:

```scala
import org.springframework.context.annotation.Configuration
import org.springframework.context.annotation.Bean

@Configuration
class WarmerConfig {
  @Bean
  def heater(): OnOffDevice = new Heater

  @Bean
  def potSensor(): SensorDevice = new PotSensor

  @Bean
  def warmer() = new Warmer(potSensor(), heater())
}

import org.springframework.context.annotation.AnnotationConfigApplicationContext

val ac = new AnnotationConfigApplicationContext(classOf[WarmerConfig])

val warmer = ac.getBean("warmer", classOf[Warmer])
warmer.trigger
```

Taking this further to use the Spring Scala project to specify the dependencies, the configuration and a sample look like this:

```scala
import org.springframework.scala.context.function.FunctionalConfiguration

class WarmerConfig extends FunctionalConfiguration {
  val h = bean("heater") {
    new Heater
  }

  val p = bean("potSensor") {
    new PotSensor
  }

  bean("warmer") {
    new Warmer(p(), h())
  }
}

import org.springframework.scala.context.function.FunctionalConfigApplicationContext

val ac = FunctionalConfigApplicationContext[WarmerConfig]
val warmer = ac.getBean("warmer", classOf[Warmer])
warmer.trigger
```

The essence of the Spring Scala project, as explained in
this wiki is the “bean” method derived from the `FunctionalConfiguration` trait. This method can be called to create a bean, passing in parameters to specify, if required, the bean name, alias, scope, and a function which returns the instantiated bean. This sample hopefully gives a good appreciation of how simple Spring Java Config is, and how much simpler the Spring-Scala project makes things for Scala based projects.  Reference: Spring Scala based sample bean configuration from our JCG partner Biju Kunjummen at the all and sundry blog....
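All three variants assemble the same small object graph. As a framework-free sketch of that essence, here is the hand-wired equivalent in plain Java (the names mirror the sample above; making `on`/`off` return the messages instead of printing them is my own twist, so the wiring can be checked without capturing console output):

```java
// Plain constructor injection: what the Cake Pattern registry and
// both Spring configurations ultimately produce.
interface SensorDevice {
    boolean isCoffeePresent();
}

interface OnOffDevice {
    String on();
    String off();
}

class PotSensor implements SensorDevice {
    public boolean isCoffeePresent() { return true; }
}

class Heater implements OnOffDevice {
    public String on()  { return "heater.on"; }
    public String off() { return "heater.off"; }
}

class Warmer {
    private final SensorDevice sensor;
    private final OnOffDevice onOff;

    // Dependencies are passed in through the constructor, not looked up.
    Warmer(SensorDevice sensor, OnOffDevice onOff) {
        this.sensor = sensor;
        this.onOff = onOff;
    }

    String trigger() {
        return sensor.isCoffeePresent() ? onOff.on() : onOff.off();
    }
}

public class ManualWiring {
    public static void main(String[] args) {
        // Hand-wired "component registry".
        Warmer warmer = new Warmer(new PotSensor(), new Heater());
        System.out.println(warmer.trigger()); // prints "heater.on"
    }
}
```

What the configurations buy you over this manual version is that the container, not the caller, owns construction order, scoping and lookup by name.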

Why you should not work extra hours

There are pros and cons to working extra hours or overtime regularly; here is an attempt to list them all. Some are well known, some are taken from my experience. If you know other reasons, just comment and I’ll include them in the list.

CONS:

You are going to introduce bugs
Human concentration does not last long, and it decreases drastically if your brain doesn’t rest properly. We already introduce bugs when we are extremely focused and well rested in the morning, so it’s easy to imagine the disaster that can happen at 10 PM. Eight hours of mental work a day is more than enough for your brain.

Your changes cannot be promptly reviewed
Code review is an extremely powerful tool that is widely used by teams in order to control code quality. It works perfectly when reviews are done immediately and you can talk face to face with your team members. It works well when it’s done using tools like Gerrit, ReviewBoard and so on. It works badly if every morning there are tons of lines of code to review because of nightly or extra-hours commits.

Story estimation goes wild
One of the most painful parts of any agile team is story estimation (less in Kanban, more in Scrum). It is painful, stressful, long and tiring for all the participants. Estimations are based on story complexity and time, the time being 8 hours a day, 5 days a week. All of this estimation gets distorted, and therefore useless, when team members regularly work extra hours. The team will get frustrated.

You can create tension between team members
One thing I noticed in teams where some members work overtime is the presence of hostility and tension. Why is that? People who follow regular work hours can be worried about their career being compromised because they do not stay longer in the office. People who work late can see the others as not involved or interested enough in the job. The solution? Simple: stick to regular hours.
You create wrong expectations
Everyone knows: you give a hand and they want your arm. Work overtime regularly and your manager will soon start to expect it and count on you to stay late.

You lose a great part of your life
Yep.

PROS:

Reference: Why you should not work extra hours from our JCG partner Marco Castigliego at the Remove duplication and fix bad names blog....

Load inheritance tree into List by Spring

I noticed an interesting Spring feature. One of my colleagues used it to load a whole inheritance tree of Spring beans into a list. I missed that when I was studying the Spring docs. Let’s have this inheritance tree of Spring beans:

In the following snippet, this tree of beans is loaded into a list via constructor injection:

@Component
public class Nature {
    List<Animal> animals;

    @Autowired
    public Nature(List<Animal> animals) {
        this.animals = animals;
    }

    public void showAnimals() {
        animals.forEach(animal -> System.out.println(animal));
    }
}

The showAnimals method uses a Java 8 lambda expression to output the loaded beans to the console; you will find a lot of reading about this new Java 8 feature these days. The Spring context is loaded by this main class:

public class Main {
    public static void main(String[] args) {
        AnnotationConfigApplicationContext context =
            new AnnotationConfigApplicationContext(SpringContext.class);

        Nature nature = context.getBean(Nature.class);
        nature.showAnimals();
    }
}

Console output:

PolarBear []
Wolf []
Animal []
Grizzly []
Bear []

This feature can be handy sometimes. The source code of this short example is on GitHub. Reference: Load inheritance tree into List by Spring from our JCG partner Lubos Krnac at the Lubos Krnac Java blog....
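What Spring does here is collect every bean whose type is assignable to `Animal`, subclasses included, and inject the whole tree. A minimal plain-Java sketch of that behaviour, with the container’s work done by hand (the class bodies are illustrative):

```java
import java.util.Arrays;
import java.util.List;

// The same inheritance tree as in the article; toString mimics the
// console output shown above ("PolarBear []", "Wolf []", ...).
class Animal {
    @Override
    public String toString() { return getClass().getSimpleName() + " []"; }
}
class Bear extends Animal {}
class Grizzly extends Bear {}
class PolarBear extends Bear {}
class Wolf extends Animal {}

class Nature {
    private final List<Animal> animals;

    // With Spring, @Autowired fills this list with every bean
    // assignable to Animal; here we pass the tree in explicitly.
    Nature(List<Animal> animals) { this.animals = animals; }

    void showAnimals() { animals.forEach(System.out::println); }
}

public class InheritanceTreeDemo {
    public static void main(String[] args) {
        Nature nature = new Nature(Arrays.asList(
                new PolarBear(), new Wolf(), new Animal(), new Grizzly(), new Bear()));
        nature.showAnimals(); // prints the five lines shown above
    }
}
```

Note that Spring makes no ordering promise for such a list unless the beans are explicitly ordered, which is why the console output above is not sorted by hierarchy.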

Tests as documentation

Documentation needs to be comprehensive, always up-to-date and accessible. By comprehensive I mean that it must cover all important areas of the code as well as all functions of the application. While the importance of documentation is obvious to most, many struggle without success to keep it accurate and up-to-date. The response to “poor” documentation is often the assignment of more resources and more time. More often than not, documentation is created for the wrong reasons.

Reasons for requesting documentation
Documentation can be requested for various reasons. Teams are often asked to work on documentation for political reasons or out of sheer ignorance. Some of the wrong reasons to create documentation are:

Someone thinks that some document is related to the project’s success
Documentation justifies someone’s existence
The requester does not know any better
The requester wants reassurance that everything is OK
The process says that a document should be created

Documentation is not up-to-date
The main problem with software documentation is that most of the time it is not up-to-date. As soon as some part of the code changes, the documentation stops reflecting the actual situation. This statement applies to almost any type of documentation, with requirements and test cases being the most affected. No matter how hard we try, documentation inevitably gets outdated.

Who uses the documentation?
Depending on the audience, the type of documentation needed and its format vary considerably. Developers, testers, customers, managers and end users are probably the major profiles of potential documentation consumers.

Developers
Developers shouldn’t rely on system documentation because it is almost never up-to-date. Besides, no documentation can provide as detailed and up-to-date a description of the code as the code itself. If you want to see what some method does, take a look at the method. Not sure what some class does? Take a look at the class.
The necessity to document code is often a sign that the code itself is not well written. Using code as documentation does not exclude other types of documents. The key is to avoid duplication. If details of the system can be obtained by reading the code, other types of documentation can provide quick guidelines and a high-level overview. Non-code documentation should answer questions like what the general purpose of the system is and what technologies are used by the system. In many cases, a simple README.md is enough to provide the quick start that developers need. Sections like project description, environment setup, installation, build and packaging instructions are very helpful for newcomers. From there on, the code is the bible. Production code provides all needed details, while test code acts as the description of the intent behind the production code. Tests are executable documentation, with TDD being the most common way to create and maintain it. Assuming that some form of continuous integration is in use, if some part of the test-documentation is incorrect, it will fail and be fixed soon afterwards. CI fixes the problem of incorrect test-documentation, but it does not ensure that all functionality is documented. For that reason (among many others) test-documentation should be created in the TDD fashion. If all functionality is defined as tests before the implementation code and the execution of all tests is successful, then tests act as complete and up-to-date documentation that can be used by developers. What should we do with the rest of the team? Testers, customers, managers and other non-coders might not be able to obtain the needed information from the production and test code.

Testers
The two most common types of testing are black-box and white-box testing. This division is important since it also divides testers into those who know how to write or at least read code (white-box testing) and those who don’t (black-box testing). In some cases testers can do both types.
However, more often than not, testers do not know how to code, so documentation that is usable for developers is not usable for them. If documentation needs to be decoupled from the code, unit tests are not a good match. That is one of the reasons why BDD came into being. It can provide the documentation necessary for non-coders while still maintaining the advantages of TDD and automation.

Customers
Customers need to be able to define new functionality of the system as well as to get information about all the important aspects of the current system. That documentation should not be too technical (code is not an option) but should still always be up to date. BDD narratives and scenarios are one of the best ways to provide this type of documentation. The ability to act as acceptance criteria (written before the code), be executed frequently (preferably on every commit) and be written in natural language makes BDD stories not only always up-to-date but also usable by those who do not want to inspect the code.

Executable documentation
Documentation is an integral part of the software. As with any other part of the software, it needs to be tested often so that we’re sure it is accurate and up-to-date. The only cost-effective way to accomplish this is to have executable documentation that can be integrated into your continuous integration system. TDD as a methodology is the best way to move in this direction. On a low level, unit tests are the best fit. On the other hand, BDD provides a good way to work on a functional level while maintaining understanding accomplished using natural language.  Reference: Tests as documentation from our JCG partner Viktor Farcic at the Technology conversations blog....
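To make the “executable documentation” idea concrete, here is a minimal plain-Java sketch; the `Account` class and the scenario are illustrative, and a real project would use JUnit or a BDD tool such as Cucumber or JBehave rather than a hand-rolled runner:

```java
// Tests as documentation: the scenario name and comments read like a
// specification, and running the class verifies that the "document"
// still matches the code.
public class AccountSpec {

    static class Account {
        private int balance;
        void deposit(int amount) { balance += amount; }
        int balance() { return balance; }
    }

    // Scenario: depositing money increases the balance
    static void givenAnEmptyAccount_whenDepositing100_thenBalanceIs100() {
        Account account = new Account();   // Given an empty account
        account.deposit(100);              // When 100 is deposited
        if (account.balance() != 100) {    // Then the balance is 100
            throw new AssertionError("expected 100, got " + account.balance());
        }
    }

    public static void main(String[] args) {
        givenAnEmptyAccount_whenDepositing100_thenBalanceIs100();
        System.out.println("All scenarios pass: the documentation is up to date.");
    }
}
```

If someone later changes `deposit` without updating the scenario, the run fails, which is exactly the self-correcting property that written documentation lacks.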
Java Code Geeks and all content copyright © 2010-2014, Exelixis Media Ltd | Terms of Use | Privacy Policy
All trademarks and registered trademarks appearing on Java Code Geeks are the property of their respective owners.
Java is a trademark or registered trademark of Oracle Corporation in the United States and other countries.
Java Code Geeks is not connected to Oracle Corporation and is not sponsored by Oracle Corporation.