

Using jOOQ with Spring: CRUD

jOOQ is a library which helps us get back in control of our SQL. It can generate code from our database and lets us build typesafe database queries by using its fluent API. The earlier parts of this tutorial have taught us how we can configure the application context of our example application and generate code from our database. We are now ready to take one step forward and learn how we can create typesafe queries with jOOQ. This blog post describes how we can add CRUD operations to a simple application which manages todo entries. Let's get started.

Additional Reading:

- Using jOOQ with Spring: Configuration is the first part of this tutorial, and it describes how you can configure the application context of a Spring application which uses jOOQ. You can understand this blog post without reading the first part of this tutorial, but if you want to really use jOOQ in a Spring powered application, I recommend that you read the first part of this tutorial as well.
- Using jOOQ with Spring: Code Generation is the second part of this tutorial, and it describes how we can reverse-engineer our database and create the jOOQ query classes which represent different database tables, records, and so on. Because these classes are the building blocks of typesafe SQL queries, I recommend that you read the second part of this tutorial before reading this blog post.

Creating the Todo Class

Let's start by creating a class which contains the information of a single todo entry. This class has the following fields:

- The id field contains the id of the todo entry.
- The creationTime field contains a timestamp which describes when the todo entry was persisted for the first time.
- The description field contains the description of the todo entry.
- The modificationTime field contains a timestamp which describes when the todo entry was updated.
- The title field contains the title of the todo entry.

The name of this relatively simple class is Todo, and it follows three principles which are described in the following:

- We can create new Todo objects by using the builder pattern described in Effective Java by Joshua Bloch. If you are not familiar with this pattern, you should read an article titled Item 2: Consider a builder when faced with many constructor parameters.
- The title field is mandatory, and we cannot create a new Todo object which has either a null or empty title. If we try to create a Todo object with an invalid title, an IllegalStateException is thrown.
- This class is immutable. In other words, all its fields are declared final.

The source code of the Todo class looks as follows:

import org.apache.commons.lang3.builder.ToStringBuilder;
import org.joda.time.LocalDateTime;

import java.sql.Timestamp;

public class Todo {

    private final Long id;
    private final LocalDateTime creationTime;
    private final String description;
    private final LocalDateTime modificationTime;
    private final String title;

    private Todo(Builder builder) {
        this.id = builder.id;

        LocalDateTime creationTime = null;
        if (builder.creationTime != null) {
            creationTime = new LocalDateTime(builder.creationTime);
        }
        this.creationTime = creationTime;

        this.description = builder.description;

        LocalDateTime modificationTime = null;
        if (builder.modificationTime != null) {
            modificationTime = new LocalDateTime(builder.modificationTime);
        }
        this.modificationTime = modificationTime;

        this.title = builder.title;
    }

    public static Builder getBuilder(String title) {
        return new Builder(title);
    }

    //Getters are omitted for the sake of clarity.

    public static class Builder {

        private Long id;
        private Timestamp creationTime;
        private String description;
        private Timestamp modificationTime;
        private String title;

        public Builder(String title) {
            this.title = title;
        }

        public Builder description(String description) {
            this.description = description;
            return this;
        }

        public Builder creationTime(Timestamp creationTime) {
            this.creationTime = creationTime;
            return this;
        }

        public Builder id(Long id) {
            this.id = id;
            return this;
        }

        public Builder modificationTime(Timestamp modificationTime) {
            this.modificationTime = modificationTime;
            return this;
        }

        public Todo build() {
            Todo created = new Todo(this);

            String title = created.getTitle();

            if (title == null || title.length() == 0) {
                throw new IllegalStateException("title cannot be null or empty");
            }

            return created;
        }
    }
}

Let's find out why we need to get the current date and time, and more importantly, what is the best way to do it.

Getting the Current Date and Time

Because the creation time and modification time of each todo entry are stored in the database, we need a way to obtain the current date and time. Of course, we could simply create this information in our repository. The problem is that if we did this, we wouldn't be able to write automated tests which ensure that the creation time and the modification time are set correctly (we cannot write assertions for these fields because their values depend on the current time). That is why we need to create a separate component which is responsible for returning the current date and time.

The DateTimeService interface declares two methods which are described in the following:

- The getCurrentDateTime() method returns the current date and time as a LocalDateTime object.
- The getCurrentTimestamp() method returns the current date and time as a Timestamp object.

The source code of the DateTimeService interface looks as follows:

import org.joda.time.LocalDateTime;

import java.sql.Timestamp;

public interface DateTimeService {

    public LocalDateTime getCurrentDateTime();

    public Timestamp getCurrentTimestamp();
}

Because our application is interested in the "real" time, we have to implement this interface and create a component which returns the real date and time. We can do this by following these steps:

- Create a CurrentTimeDateTimeService class which implements the DateTimeService interface.
- Annotate the class with the @Profile annotation and set the name of the profile to 'application'. This means that the component can be registered to the Spring container only when the active Spring profile is 'application'.
- Annotate the class with the @Component annotation. This ensures that the class is found during classpath scanning.
- Implement the methods declared in the DateTimeService interface. Each method must return the current date and time.

The source code of the CurrentTimeDateTimeService looks as follows:

import org.joda.time.LocalDateTime;
import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;

import java.sql.Timestamp;

@Profile("application")
@Component
public class CurrentTimeDateTimeService implements DateTimeService {

    @Override
    public LocalDateTime getCurrentDateTime() {
        return LocalDateTime.now();
    }

    @Override
    public Timestamp getCurrentTimestamp() {
        return new Timestamp(System.currentTimeMillis());
    }
}

Let's move on and start implementing the repository layer of our example application.

Implementing the Repository Layer

First we have to create a repository interface which provides CRUD operations for todo entries. This interface declares five methods which are described in the following:

- The Todo add(Todo todoEntry) method saves a new todo entry to the database and returns the information of the saved todo entry.
- The Todo delete(Long id) method deletes a todo entry and returns the deleted todo entry.
- The List<Todo> findAll() method returns all todo entries which are found from the database.
- The Todo findById(Long id) method returns the information of a single todo entry.
- The Todo update(Todo todoEntry) method updates the information of a todo entry and returns the updated todo entry.

The source code of the TodoRepository interface looks as follows:

import java.util.List;

public interface TodoRepository {

    public Todo add(Todo todoEntry);

    public Todo delete(Long id);

    public List<Todo> findAll();

    public Todo findById(Long id);

    public Todo update(Todo todoEntry);
}

Next we have to implement the TodoRepository interface. When we do that, we must follow this rule: all database queries created by jOOQ must be executed inside a transaction. The reason for this is that our application uses the TransactionAwareDataSourceProxy class, and if we execute database queries without a transaction, jOOQ will use a different connection for each operation. This can lead to race condition bugs.

Typically the service layer acts as a transaction boundary, and each call to a jOOQ repository should be made inside a transaction. However, because programmers make mistakes too, we cannot trust that this is the case. That is why we must annotate the repository class or its methods with the @Transactional annotation.

Now that we have got that covered, we are ready to create our repository class.

Creating the Repository Class

We can create the "skeleton" of our repository class by following these steps:

- Create a JOOQTodoRepository class and implement the TodoRepository interface.
- Annotate the class with the @Repository annotation. This ensures that the class is found during the classpath scan.
- Add a DateTimeService field to the created class. As we remember, the DateTimeService interface declares the methods which are used to get the current date and time.
- Add a DSLContext field to the created class. This interface acts as an entry point to the jOOQ API, and we can build our SQL queries by using it.
- Add a public constructor to the created class and annotate the constructor with the @Autowired annotation.
This ensures that the dependencies of our repository are injected by using constructor injection.
- Add a private Todo convertQueryResultToModelObject(TodosRecord queryResult) method to the repository class. This utility method is used by the public methods of our repository class. Implement this method by following these steps:
  - Create a new Todo object by using the information of the TodosRecord object given as a method parameter.
  - Return the created object.

The relevant part of the JOOQTodoRepository class looks as follows:

import net.petrikainulainen.spring.jooq.todo.db.tables.records.TodosRecord;
import org.jooq.DSLContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DateTimeService dateTimeService;

    private final DSLContext jooq;

    @Autowired
    public JOOQTodoRepository(DateTimeService dateTimeService, DSLContext jooq) {
        this.dateTimeService = dateTimeService;
        this.jooq = jooq;
    }

    private Todo convertQueryResultToModelObject(TodosRecord queryResult) {
        return Todo.getBuilder(queryResult.getTitle())
                .creationTime(queryResult.getCreationTime())
                .description(queryResult.getDescription())
                .id(queryResult.getId())
                .modificationTime(queryResult.getModificationTime())
                .build();
    }
}

Let's move on and implement the methods which provide CRUD operations for todo entries.

Adding a New Todo Entry

The public Todo add(Todo todoEntry) method of the TodoRepository interface is used to add new todo entries to the database. We can implement this method by following these steps:

- Add a private TodosRecord createRecord(Todo todoEntry) method to the repository class and implement this method by following these steps:
  - Get the current date and time by calling the getCurrentTimestamp() method of the DateTimeService interface.
  - Create a new TodosRecord object and set its field values by using the information of the Todo object given as a method parameter.
  - Return the created TodosRecord object.
- Add the add() method to the JOOQTodoRepository class and annotate the method with the @Transactional annotation. This ensures that the INSERT statement is executed inside a read-write transaction.
- Implement the add() method by following these steps:
  - Add a new todo entry to the database by following these steps:
    - Create a new INSERT statement by calling the insertInto(Table table) method of the DSLContext interface and specify that you want to insert information into the todos table.
    - Create a new TodosRecord object by calling the createRecord() method. Pass the Todo object as a method parameter.
    - Set the inserted information by calling the set(Record record) method of the InsertSetStep interface. Pass the created TodosRecord object as a method parameter.
    - Ensure that the INSERT query returns all inserted fields by calling the returning() method of the InsertReturningStep interface.
    - Get the TodosRecord object which contains the values of all inserted fields by calling the fetchOne() method of the InsertResultStep interface.
  - Convert the TodosRecord object returned by the INSERT statement into a Todo object by calling the convertQueryResultToModelObject() method.
  - Return the created Todo object.

The relevant part of the JOOQTodoRepository class looks as follows:

import net.petrikainulainen.spring.jooq.todo.db.tables.records.TodosRecord;
import org.jooq.DSLContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import java.sql.Timestamp;

import static net.petrikainulainen.spring.jooq.todo.db.tables.Todos.TODOS;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DateTimeService dateTimeService;

    private final DSLContext jooq;

    //The constructor is omitted for the sake of clarity.

    @Transactional
    @Override
    public Todo add(Todo todoEntry) {
        TodosRecord persisted = jooq.insertInto(TODOS)
                .set(createRecord(todoEntry))
                .returning()
                .fetchOne();

        return convertQueryResultToModelObject(persisted);
    }

    private TodosRecord createRecord(Todo todoEntry) {
        Timestamp currentTime = dateTimeService.getCurrentTimestamp();

        TodosRecord record = new TodosRecord();
        record.setCreationTime(currentTime);
        record.setDescription(todoEntry.getDescription());
        record.setModificationTime(currentTime);
        record.setTitle(todoEntry.getTitle());

        return record;
    }

    private Todo convertQueryResultToModelObject(TodosRecord queryResult) {
        return Todo.getBuilder(queryResult.getTitle())
                .creationTime(queryResult.getCreationTime())
                .description(queryResult.getDescription())
                .id(queryResult.getId())
                .modificationTime(queryResult.getModificationTime())
                .build();
    }
}

The section 4.3.3. The INSERT statement of the jOOQ reference manual provides additional information about inserting data into the database.

Let's move on and find out how we can find all todo entries which are stored in the database.

Finding All Todo Entries

The public List<Todo> findAll() method of the TodoRepository interface returns all todo entries which are stored in the database.
We can implement this method by following these steps:

- Add the findAll() method to the repository class and annotate the method with the @Transactional annotation. Set the value of its readOnly attribute to true. This ensures that the SELECT statement is executed inside a read-only transaction.
- Get all todo entries from the database by following these steps:
  - Create a new SELECT statement by calling the selectFrom(Table table) method of the DSLContext interface and specify that you want to select information from the todos table.
  - Get a list of TodosRecord objects by calling the fetchInto(Class type) method of the ResultQuery interface.
- Iterate the returned list of TodosRecord objects and convert each TodosRecord object into a Todo object by calling the convertQueryResultToModelObject() method. Add each Todo object to the list of Todo objects.
- Return the List which contains the found Todo objects.

The relevant part of the JOOQTodoRepository class looks as follows:

import net.petrikainulainen.spring.jooq.todo.db.tables.records.TodosRecord;
import org.jooq.DSLContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import java.util.ArrayList;
import java.util.List;

import static net.petrikainulainen.spring.jooq.todo.db.tables.Todos.TODOS;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DSLContext jooq;

    //The constructor is omitted for the sake of clarity.

    @Transactional(readOnly = true)
    @Override
    public List<Todo> findAll() {
        List<Todo> todoEntries = new ArrayList<>();

        List<TodosRecord> queryResults = jooq.selectFrom(TODOS).fetchInto(TodosRecord.class);

        for (TodosRecord queryResult: queryResults) {
            Todo todoEntry = convertQueryResultToModelObject(queryResult);
            todoEntries.add(todoEntry);
        }

        return todoEntries;
    }

    private Todo convertQueryResultToModelObject(TodosRecord queryResult) {
        return Todo.getBuilder(queryResult.getTitle())
                .creationTime(queryResult.getCreationTime())
                .description(queryResult.getDescription())
                .id(queryResult.getId())
                .modificationTime(queryResult.getModificationTime())
                .build();
    }
}

The section 4.3.2. The SELECT Statement of the jOOQ reference manual provides more information about selecting information from the database.

Next we will find out how we can get a single todo entry from the database.

Finding a Single Todo Entry

The public Todo findById(Long id) method of the TodoRepository interface returns the information of a single todo entry. We can implement this method by following these steps:

- Add the findById() method to the repository class and annotate the method with the @Transactional annotation. Set the value of its readOnly attribute to true. This ensures that the SELECT statement is executed inside a read-only transaction.
- Get the information of a single todo entry from the database by following these steps:
  - Create a new SELECT statement by calling the selectFrom(Table table) method of the DSLContext interface and specify that you want to select information from the todos table.
  - Specify the WHERE clause of the SELECT statement by calling the where(Collection conditions) method of the SelectWhereStep interface. Ensure that the SELECT statement returns only the todo entry whose id was given as a method parameter.
  - Get the TodosRecord object by calling the fetchOne() method of the ResultQuery interface.
- If the returned TodosRecord object is null, no todo entry was found with the given id. If this is the case, throw a new TodoNotFoundException.
- Convert the TodosRecord object returned by the SELECT statement into a Todo object by calling the convertQueryResultToModelObject() method.
- Return the created Todo object.

The relevant part of the JOOQTodoRepository looks as follows:

import net.petrikainulainen.spring.jooq.todo.db.tables.records.TodosRecord;
import org.jooq.DSLContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import static net.petrikainulainen.spring.jooq.todo.db.tables.Todos.TODOS;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DSLContext jooq;

    //The constructor is omitted for the sake of clarity.

    @Transactional(readOnly = true)
    @Override
    public Todo findById(Long id) {
        TodosRecord queryResult = jooq.selectFrom(TODOS)
                .where(TODOS.ID.equal(id))
                .fetchOne();

        if (queryResult == null) {
            throw new TodoNotFoundException("No todo entry found with id: " + id);
        }

        return convertQueryResultToModelObject(queryResult);
    }

    private Todo convertQueryResultToModelObject(TodosRecord queryResult) {
        return Todo.getBuilder(queryResult.getTitle())
                .creationTime(queryResult.getCreationTime())
                .description(queryResult.getDescription())
                .id(queryResult.getId())
                .modificationTime(queryResult.getModificationTime())
                .build();
    }
}

The section 4.3.2. The SELECT Statement of the jOOQ reference manual provides more information about selecting information from the database.

Let's find out how we can delete a todo entry from the database.

Deleting a Todo Entry

The public Todo delete(Long id) method of the TodoRepository interface is used to delete a todo entry from the database. We can implement this method by following these steps:

- Add the delete() method to the repository class and annotate the method with the @Transactional annotation. This ensures that the DELETE statement is executed inside a read-write transaction.
- Implement this method by following these steps:
  - Find the deleted Todo object by calling the findById(Long id) method. Pass the id of the deleted todo entry as a method parameter.
  - Delete the todo entry from the database by following these steps:
    - Create a new DELETE statement by calling the delete(Table table) method of the DSLContext interface and specify that you want to delete information from the todos table.
    - Specify the WHERE clause of the DELETE statement by calling the where(Collection conditions) method of the DeleteWhereStep interface. Ensure that the DELETE statement deletes the todo entry whose id was given as a method parameter.
    - Execute the DELETE statement by calling the execute() method of the Query interface.
  - Return the information of the deleted todo entry.

The relevant part of the JOOQTodoRepository class looks as follows:

import net.petrikainulainen.spring.jooq.todo.db.tables.records.TodosRecord;
import org.jooq.DSLContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import static net.petrikainulainen.spring.jooq.todo.db.tables.Todos.TODOS;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DSLContext jooq;

    //The constructor is omitted for the sake of clarity.

    @Transactional
    @Override
    public Todo delete(Long id) {
        Todo deleted = findById(id);

        int deletedRecordCount = jooq.delete(TODOS)
                .where(TODOS.ID.equal(id))
                .execute();

        return deleted;
    }
}

The section 4.3.5. The DELETE Statement of the jOOQ reference manual provides additional information about deleting data from the database.

Let's move on and find out how we can update the information of an existing todo entry.

Updating an Existing Todo Entry

The public Todo update(Todo todoEntry) method of the TodoRepository interface updates the information of an existing todo entry. We can implement this method by following these steps:

- Add the update() method to the repository class and annotate the method with the @Transactional annotation. This ensures that the UPDATE statement is executed inside a read-write transaction.
- Get the current date and time by calling the getCurrentTimestamp() method of the DateTimeService interface.
- Update the information of the todo entry by following these steps:
  - Create a new UPDATE statement by calling the update(Table table) method of the DSLContext interface and specify that you want to update information found from the todos table.
  - Set the new description, modification time, and title by calling the set(Field field, T value) method of the UpdateSetStep interface.
  - Specify the WHERE clause of the UPDATE statement by calling the where(Collection conditions) method of the UpdateWhereStep interface. Ensure that the UPDATE statement updates the todo entry whose id is found from the Todo object given as a method parameter.
  - Execute the UPDATE statement by calling the execute() method of the Query interface.
- Get the information of the updated todo entry by calling the findById() method. Pass the id of the updated todo entry as a method parameter.
- Return the information of the updated todo entry.

The relevant part of the JOOQTodoRepository class looks as follows:

import org.jooq.DSLContext;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import java.sql.Timestamp;

import static net.petrikainulainen.spring.jooq.todo.db.tables.Todos.TODOS;

@Repository
public class JOOQTodoRepository implements TodoRepository {

    private final DateTimeService dateTimeService;

    private final DSLContext jooq;

    //The constructor is omitted for the sake of clarity.

    @Transactional
    @Override
    public Todo update(Todo todoEntry) {
        Timestamp currentTime = dateTimeService.getCurrentTimestamp();

        int updatedRecordCount = jooq.update(TODOS)
                .set(TODOS.DESCRIPTION, todoEntry.getDescription())
                .set(TODOS.MODIFICATION_TIME, currentTime)
                .set(TODOS.TITLE, todoEntry.getTitle())
                .where(TODOS.ID.equal(todoEntry.getId()))
                .execute();

        return findById(todoEntry.getId());
    }
}

The section 4.3.4. The UPDATE Statement of the jOOQ reference manual provides additional information about updating the information which is stored in the database. If you are using the Firebird or PostgreSQL databases, you can use the RETURNING clause in the UPDATE statement (and avoid the extra SELECT query).

That is all folks. Let's summarize what we learned from this blog post.

Summary

We have now implemented CRUD operations for todo entries. This tutorial has taught us three things:

- We learned how we can get the current date and time in a way which doesn't prevent us from writing automated tests for our example application.
- We learned how we can ensure that all database queries executed by jOOQ are executed inside a transaction.
- We learned how we can create INSERT, SELECT, DELETE, and UPDATE statements by using the jOOQ API.

The next part of this tutorial describes how we can add a search function, which supports sorting and pagination, to our example application. The example application of this blog post is available on GitHub (the frontend is not implemented yet).

Reference: Using jOOQ with Spring: CRUD from our JCG partner Petri Kainulainen at the Petri Kainulainen blog.
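The first point in the summary above — keeping timestamps testable — is easiest to see with a fixed-clock test double. The sketch below is my own minimal version, not code from the tutorial: the FixedDateTimeService name is hypothetical, and the interface is trimmed to the Timestamp method only (the tutorial's DateTimeService also returns a Joda-Time LocalDateTime).

```java
import java.sql.Timestamp;

// Trimmed-down version of the tutorial's DateTimeService idea (Timestamp only).
interface DateTimeService {
    Timestamp getCurrentTimestamp();
}

// Test double that always returns the same, known instant,
// so tests can assert on creationTime/modificationTime values.
class FixedDateTimeService implements DateTimeService {
    private final long fixedMillis;

    FixedDateTimeService(long fixedMillis) {
        this.fixedMillis = fixedMillis;
    }

    @Override
    public Timestamp getCurrentTimestamp() {
        return new Timestamp(fixedMillis);
    }
}

public class FixedClockDemo {
    public static void main(String[] args) {
        DateTimeService clock = new FixedDateTimeService(1_000_000L);
        // Any repository that receives this service now produces a
        // predictable timestamp, which a test can compare against.
        Timestamp first = clock.getCurrentTimestamp();
        Timestamp second = clock.getCurrentTimestamp();
        System.out.println(first.equals(second));
    }
}
```

In a test, this double would be injected into JOOQTodoRepository instead of CurrentTimeDateTimeService, which is exactly why the production implementation is guarded by the 'application' profile.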

ActiveMQ – Network of Brokers Explained

Objective

This 7-part blog series shares how to create a network of ActiveMQ brokers in order to achieve high availability and scalability.

Why network of brokers?

The ActiveMQ message broker is a core component of the messaging infrastructure in an enterprise. It needs to be highly available and dynamically scalable to facilitate communication between dynamic heterogeneous distributed applications which have varying capacity needs. Scaling enterprise applications on commodity hardware is all the rage nowadays, and ActiveMQ caters to that very well by being able to create a network of brokers to share the load.

Many times, applications running across geographically distributed data centers need to coordinate messages. Running message producers and consumers across geographic regions/data centers can be architected better using a network of brokers.

ActiveMQ uses transport connectors over which it communicates with message producers and consumers. However, in order to facilitate broker-to-broker communication, ActiveMQ uses network connectors. A network connector is a bridge between two brokers which allows on-demand message forwarding. In other words, if Broker B1 initiates a network connector to Broker B2, then the messages on a channel (queue/topic) on B1 get forwarded to B2 if there is at least one consumer on B2 for the same channel. If the network connector is configured to be duplex, the messages also get forwarded from B2 to B1 on demand. This is very interesting because it is now possible for brokers to communicate with each other dynamically.
In this 7-part blog series, we will look into the following topics to gain an understanding of this very powerful ActiveMQ feature:

- Network Connector Basics – Part 1
- Duplex network connectors – Part 2
- Load balancing consumers on local/remote brokers – Part 3
- Load-balancing consumers/subscribers on remote brokers:
  - Queue: Load balance remote concurrent consumers – Part 4
  - Topic: Load Balance Durable Subscriptions on Remote Brokers – Part 5
- Store/forward messages and consumer failover (how to prevent stuck messages) – Part 6
- Virtual Destinations – Part 7

To give credit where it is due, the following resources have helped me in creating this blog post series:

- Advanced Messaging with ActiveMQ by Dejan Bosanac [Slides 32-36]
- Understanding ActiveMQ Broker Networks by Jakub Korab

Prerequisites:

- ActiveMQ 5.8.0 – To create broker instances
- Apache Ant – To run the ActiveMQ sample producer and consumers for the demo

We will use multiple ActiveMQ broker instances on the same machine for ease of demonstration.

Network Connector Basics – Part 1

The following diagram shows how a network connector functions. It bridges two brokers and is used to forward messages from Broker-1 to Broker-2 on demand, if the connector is established by Broker-1 to Broker-2.

A network connector can be duplex, so messages can also be forwarded in the opposite direction — from Broker-2 to Broker-1 — once there is a consumer on Broker-1 for a channel which exists in Broker-2.
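For illustration, a duplex bridge of this kind could be declared in broker-1's configuration with a networkConnector element along these lines. This is a hedged sketch only — the connector name is arbitrary, the port assumes broker-2 listens on 61626, and duplex connectors are covered properly in Part 2:

```xml
<!-- In broker-1's activemq.xml: one bridge forwards messages in both directions -->
<networkConnectors>
    <networkConnector name="broker1-broker2-duplex"
                      uri="static:(tcp://localhost:61626)"
                      duplex="true"/>
</networkConnectors>
```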
More on this in Part 2.

Setup network connector between broker-1 and broker-2

Create two broker instances, say broker-1 and broker-2:

Ashwinis-MacBook-Pro:bin akuntamukkala$ pwd
/Users/akuntamukkala/apache-activemq-5.8.0/bin
Ashwinis-MacBook-Pro:bin akuntamukkala$ ./activemq-admin create ../bridge-demo/broker-1
Ashwinis-MacBook-Pro:bin akuntamukkala$ ./activemq-admin create ../bridge-demo/broker-2

Since we will be running both brokers on the same machine, let's configure broker-2 such that there are no port conflicts.

Edit /Users/akuntamukkala/apache-activemq-5.8.0/bridge-demo/broker-2/conf/activemq.xml:

- Change the transport connector port to 61626 from 61616.
- Change the AMQP port from 5672 to 6672 (we won't be using it for this blog).

Edit /Users/akuntamukkala/apache-activemq-5.8.0/bridge-demo/broker-2/conf/jetty.xml:

- Change the web console port to 9161 from 8161.

Configure Network Connector from broker-1 to broker-2

Add the following XML snippet to /Users/akuntamukkala/apache-activemq-5.8.0/bridge-demo/broker-1/conf/activemq.xml:

<networkConnectors>
    <networkConnector name="T:broker1->broker2"
                      uri="static:(tcp://localhost:61626)"
                      duplex="false"
                      decreaseNetworkConsumerPriority="true"
                      networkTTL="2"
                      dynamicOnly="true">
        <excludedDestinations>
            <queue physicalName=">" />
        </excludedDestinations>
    </networkConnector>
    <networkConnector name="Q:broker1->broker2"
                      uri="static:(tcp://localhost:61626)"
                      duplex="false"
                      decreaseNetworkConsumerPriority="true"
                      networkTTL="2"
                      dynamicOnly="true">
        <excludedDestinations>
            <topic physicalName=">" />
        </excludedDestinations>
    </networkConnector>
</networkConnectors>

The above XML snippet configures two network connectors: "T:broker1->broker2" (only topics, as queues are excluded) and "Q:broker1->broker2" (only queues, as topics are excluded). This allows for a nice separation between the network connectors used for topics and queues. The name can be arbitrary, although I prefer to specify [type]:[source broker]->[destination broker].
The URI attribute specifies how to connect to broker-2.

Start broker-2:

Ashwinis-MacBook-Pro:bin akuntamukkala$ pwd
/Users/akuntamukkala/apache-activemq-5.8.0/bridge-demo/broker-2/bin
Ashwinis-MacBook-Pro:bin akuntamukkala$ ./broker-2 console

Start broker-1:

Ashwinis-MacBook-Pro:bin akuntamukkala$ pwd
/Users/akuntamukkala/apache-activemq-5.8.0/bridge-demo/broker-1/bin
Ashwinis-MacBook-Pro:bin akuntamukkala$ ./broker-1 console

The logs on broker-1 show two network connectors being established with broker-2:

INFO | Establishing network connection from vm://broker-1?async=false&network=true to tcp://localhost:61626
INFO | Connector vm://broker-1 Started
INFO | Establishing network connection from vm://broker-1?async=false&network=true to tcp://localhost:61626
INFO | Network connection between vm://broker-1#24 and tcp://localhost/127.0.0.1:61626@52132(broker-2) has been established.
INFO | Network connection between vm://broker-1#26 and tcp://localhost/127.0.0.1:61626@52133(broker-2) has been established.

The web console on broker-1 at http://localhost:8161/admin/connections.jsp shows the two network connectors established to broker-2. The same page on broker-2 does not show any network connectors, since no network connectors were initiated by broker-2.

Let's see this in action

Let's produce 100 persistent messages on a queue called "foo.bar" on broker-1:

Ashwinis-MacBook-Pro:example akuntamukkala$ pwd
/Users/akuntamukkala/apache-activemq-5.8.0/example
Ashwinis-MacBook-Pro:example akuntamukkala$ ant producer -Durl=tcp://localhost:61616 -Dtopic=false -Ddurable=true -Dsubject=foo.bar -Dmax=100

The broker-1 web console at http://localhost:8161/admin/queues.jsp shows that 100 messages have been enqueued in the queue "foo.bar".

Let's start a consumer on a queue called "foo.bar" on broker-2. The important thing to note here is that the destination name "foo.bar" should match exactly.
Ashwinis-MacBook-Pro:example akuntamukkala$ ant consumer -Durl=tcp://localhost:61626 -Dtopic=false -Dsubject=foo.bar

We find that all 100 messages from broker-1’s foo.bar queue get forwarded to the consumer on broker-2’s foo.bar queue. The broker-1 admin console at http://localhost:8161/admin/queues.jsp and the broker-2 admin console at http://localhost:9161/admin/queues.jsp show that the consumer we had started has consumed all 100 messages, which were forwarded on demand from broker-1 (see the broker-2 consumer details on the foo.bar queue). The broker-1 admin console shows that all 100 messages have been dequeued [forwarded to broker-2 via the network connector]. The broker-1 consumer details on the “foo.bar” queue show that the consumer is created on demand: [name of connector]_[destination broker]_inbound_. Thus we have seen the basics of network connectors in ActiveMQ. Stay tuned for Part 2…

Reference: ActiveMQ – Network of Brokers Explained from our JCG partner Ashwini Kuntamukkala at the Ashwini Kuntamukkala – Technology Enthusiast blog....

How to do Continuous Integration with Java 8, NetBeans Platform 8, Jenkins, Jacoco and Sonar

Intro

Java 8 is here, the promised revolution is finally released, and I am sure that a lot of you have the same question in mind: “Should I use it in my project?”. Well, I had the same question for a few months, and today that I have an answer I would like to share it with you. A lot of aspects have influenced this decision, but in this post I want to focus on one in particular, which is: can I continue to do Continuous Integration with Java 8 and the NetBeans Platform? The main question was around the maturity of the tools necessary to do CI, and how easy it was to integrate them with the ant build scripts of the NetBeans Platform. Fortunately, we found that it is possible and easy to do! I would also like to thank Alberto Requena Sanchez for his contribution to this article.

The Technical Environment

Working in a project where Safety & Quality are the main drivers, CI is vital. For this reason I started with my team a “proof of concept” to show that the following technologies were ready to work together:

- Java 8, NetBeans 8.0 & Ant
- JUnit 4 & Jacoco 0.7.1
- Jenkins & Sonar 4.2

The scope of this post is to explain all the steps taken to install and set up the necessary tools to have a completely working CI server for Java 8. Note that the proof has been done on a developer machine on Windows 7, but it is easy to do the same on a Linux server. The next diagram shows at a high level the architecture that will be described in the post.

Java 8, NetBeans 8.0 & Ant

Java 8 is released, get it here, install it, study it (preferably) and start to use it! We are using the NetBeans Platform 8.0 to create a modular application. This application has a multi-layered architecture where each layer is a suite of modules, and where the final executable is just an integrated set of suites. We are using Ant to build our projects, but if you are using Maven the procedure can be even simpler, since the Sonar integration in Jenkins can be done via a plugin that uses Maven.
JUnit 4 & Jacoco 0.7.1

Naturally, we are doing unit tests, and for this reason we use JUnit 4. It is well integrated everywhere, especially in NetBeans. Jacoco is a great tool for the generation of code coverage, and since version 0.7.1 it fully supports Java 8.

Jenkins & Sonar 4.2

Jenkins is the engine of our CI server; it integrates with all the above described technologies without any issue. The tested version is 1.554. Sonar does all the quality analysis of the code. Release 4.2 has full compatibility with Java 8. Using Sonar with Ant requires a small library that contains the target to be integrated in Jenkins. If you are using Maven instead, you can just install the plugin for Maven.

Starting the puzzle

Step 1 – NetBeans

- Install Java 8 & NetBeans 8.0
- Create a module suite with several modules, several classes and several JUnit tests
- Commit the code into your source code version management server

Inside the harness of NetBeans:

- Create a folder in the harness named “jacoco-0.7.1” containing the downloaded jacoco jars
- Create a folder in the harness named “sonar-ant-task” and put the downloaded sonar ant jars inside
- Create a file in the harness named sonar-jacoco-module.xml and paste the following code inside:

<?xml version="1.0" encoding="UTF-8"?>
<project name="sonar-jacoco-module" basedir="."
xmlns:jacoco="antlib:org.jacoco.ant" xmlns:sonar="antlib:org.sonar.ant"> <description>Builds the module suite otherSuite.</description><property name="jacoco.dir" location="${nbplatform.default.harness.dir}/jacoco-0.7.1"/> <property name="result.exec.file" location="${jacoco.dir}/jacoco.exec"/> <property name="build.test.results.dir" location="build/test/unit/results"/><property file="nbproject/project.properties"/><!-- Step 1: Import JaCoCo Ant tasks --> <taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml"> <classpath path="${jacoco.dir}/jacocoant.jar"/> </taskdef><!-- Target at the level of modules --> <target name="-do-junit" depends="test-init"> <echo message="Doing testing for jacoco" /> <macrodef name="junit-impl"> <attribute name="test.type"/> <attribute name="disable.apple.ui" default="false"/> <sequential> <jacoco:coverage destfile="${build.test.results.dir}/${code.name.base}_jacoco.exec"> <junit showoutput="true" fork="true" failureproperty="tests.failed" errorproperty="tests.failed" filtertrace="${test.filter.trace}" tempdir="${build.test.@{test.type}.results.dir}" timeout="${test.timeout}"> <batchtest todir="${build.test.@{test.type}.results.dir}"> <fileset dir="${build.test.@{test.type}.classes.dir}" includes="${test.includes}" excludes="${test.excludes}"/> </batchtest> <classpath refid="test.@{test.type}.run.cp"/> <syspropertyset refid="test.@{test.type}.properties"/> <jvmarg value="${test.bootclasspath.prepend.args}"/> <jvmarg line="${test.run.args}"/> <!--needed to have tests NOT to steal focus when running, works in latest apple jdk update only.--> <sysproperty key="apple.awt.UIElement" value="@{disable.apple.ui}"/> <formatter type="brief" usefile="false"/> <formatter type="xml"/> </junit> </jacoco:coverage> <copy file="${build.test.results.dir}/${code.name.base}_jacoco.exec" todir="${suite.dir}/build/coverage"/> <!-- Copy the result of all the unit tests of all the modules into one common folder at the level of the suite, so 
that sonar could find those files to generate associated reports --> <copy todir="${suite.dir}/build/test-results"> <fileset dir="${build.test.results.dir}"> <include name="**/TEST*.xml"/> </fileset> </copy> <fail if="tests.failed" unless="continue.after.failing.tests">Some tests failed; see details above.</fail> </sequential> </macrodef> <junit-impl test.type="${run.test.type}" disable.apple.ui="${disable.apple.ui}"/> </target></project> Scope of this file is to override the do-junit task adding the jacoco coverage, and to copy the result of the unit test of each module in the build of the suite, so that sonar will find all of them together to perform its analysis. Create a file in the harness named sonar-jacoco-suite.xml and paste the following code inside <?xml version="1.0" encoding="UTF-8"?> <project name="sonar-jacoco-suite" basedir="." xmlns:jacoco="antlib:org.jacoco.ant" xmlns:sonar="antlib:org.sonar.ant"> <description>Builds the module suite otherSuite.</description><property name="jacoco.dir" location="${nbplatform.default.harness.dir}/jacoco-0.7.1"/> <property name="result.exec.file" location="build/coverage"/>    <!-- Define the SonarQube global properties (the most usual way is to pass these properties via the command line) --> <property name="sonar.jdbc.url" value="jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8" /> <property name="sonar.jdbc.username" value="sonar" /> <property name="sonar.jdbc.password" value="sonar" /> <!-- Define the SonarQube project properties --> <property name="sonar.projectKey" value="org.codehaus.sonar:example-java-ant" /> <property name="sonar.projectName" value="Simple Java Project analyzed with the SonarQube Ant Task" /> <property name="sonar.projectVersion" value="1.0" /> <property name="sonar.language" value="java" /> <!-- Load the project properties file for retrieving the modules of the suite --> <property file="nbproject/project.properties"/><!-- Using Javascript functions to build the paths 
of the data source for sonar configuration --> <script language="javascript">  <![CDATA[// getting the value modulesName = project.getProperty("modules"); modulesName = modulesName.replace(":",","); res = modulesName.split(","); srcModules = ""; binariesModules = ""; testModules = ""; //Build the paths   for (var i=0; i<res.length; i++) { srcModules += res[i]+"/src,"; binariesModules += res[i]+"/build/classes,"; testModules += res[i]+"/test,"; } //Remove the last comma srcModules = srcModules.substring(0, srcModules.length - 1); binariesModules = binariesModules.substring(0, binariesModules.length - 1); testModules = testModules.substring(0, testModules.length - 1); // store the result in a new properties project.setProperty("srcModulesPath",srcModules); project.setProperty("binariesModulesPath",binariesModules); project.setProperty("testModulesPath",testModules); ]]> </script>   <!-- Display the values -->        <property name="sonar.sources" value="${srcModulesPath}"/> <property name="sonar.binaries" value="${binariesModulesPath}" /> <property name="sonar.tests" value="${testModulesPath}" /> <!-- Define where the coverage reports are located --> <!-- Tells SonarQube to reuse existing reports for unit tests execution and coverage reports --> <property name="sonar.dynamicAnalysis" value="reuseReports" /> <!-- Tells SonarQube where the unit tests execution reports are --> <property name="sonar.junit.reportsPath" value="build/test-results" /> <!-- Tells SonarQube that the code coverage tool by unit tests is JaCoCo --> <property name="sonar.java.coveragePlugin" value="jacoco" /> <!-- Tells SonarQube where the unit tests code coverage report is --> <property name="sonar.jacoco.reportPath" value="${result.exec.file}/merged.exec" /> <!--  Step 1: Import JaCoCo Ant tasks  --> <taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml"> <classpath path="${jacoco.dir}/jacocoant.jar"/> </taskdef>     <target name="merge-coverage">         <jacoco:merge 
destfile="${result.exec.file}/merged.exec"> <fileset dir="${result.exec.file}" includes="*.exec"/> </jacoco:merge> </target><target name="sonar"> <taskdef uri="antlib:org.sonar.ant" resource="org/sonar/ant/antlib.xml"> <!-- Update the following line, or put the "sonar-ant-task-*.jar" file in your "$HOME/.ant/lib" folder --> <classpath path="${harness.dir}/sonar-ant-task-2.1/sonar-ant-task-2.1.jar" /> </taskdef><!-- Execute the SonarQube analysis --> <sonar:sonar /> </target></project> Scope of this file is to define at the level of the suite the sonar configuration and the sonar ant task. If you are using for sonar some special database or special users is here that you must change the configuration. Another task that is defined is the jacoco merge that will actually take all the generated exec for each module and merge them into one single exec in the build of the suite, to permit sonar to make its analysis. Replace the content of the build.xml of each module with this one: <description>Builds, tests, and runs the project com.infrabel.jacoco.</description> <property file="nbproject/suite.properties"/> <property file="${suite.dir}/nbproject/private/platform-private.properties"/> <property file="${user.properties.file}"/> <import file="${nbplatform.default.harness.dir}/sonar-jacoco-module.xml"/> <import file="nbproject/build-impl.xml"/> Replace the content of the build.xml of each suite with this one: <description>Builds the module suite otherSuite.</description> <property file="nbproject/private/platform-private.properties"/> <property file="${user.properties.file}"/> <import file="${nbplatform.default.harness.dir}/sonar-jacoco-suite.xml"/> <import file="nbproject/build-impl.xml"/> Step 2 – Jenkins In “Manage Jenkins -> Manage Plugins” go in the available list and install (if not already present) the following plugins:JaCoCo Mercurial or Subversion SonarIf you are behind a firewall or proxy and getting issue to configure the network settings you can always download 
and install them manually from here. In this case remember to also download the dependencies of each plugin first. In “Manage Jenkins -> Configure System” check that all plugins are correctly set up; see the following screenshots for an example (replace the folders with the ones that are right for you). Create a new free-style project, configure the version control of your preference, and in the “Build” panel add the following three “Invoke Ant” tasks. Finally, in the “Post-build Actions” panel add a new “Record Jacoco Coverage Report” configured like this one.

Step 3 – Sonar

Create a database following this script, and optionally run this query to make the connection work:

GRANT ALL PRIVILEGES ON 'sonar'.* TO 'sonar'@'localhost';

Go to the configuration file of sonar (sonar.properties) and enable the use of MySQL; the file is located in the conf folder of the installation:

# Permissions to create tables, indices and triggers
# must be granted to JDBC user.
# The schema must be created first.
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar

#----- MySQL 5.x
# Comment the embedded database and uncomment the following
# line to use MySQL
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true

In the configuration of sonar, update the Java plugin if necessary to be compatible with Java 8. If necessary, also configure your proxy in the sonar.properties file.

Done! Now everything is set up: you can go to NetBeans, do a build, commit your code, then launch the build in Jenkins, and after the build is OK, check the project in Sonar. That’s all! I hope I did not forget anything, but in case you find some errors during the process, do not hesitate to leave a comment; I will try to find the solution.

Reference: How to do Continuous Integration with Java 8, NetBeans Platform 8, Jenkins, Jacoco and Sonar from our JCG partner Marco Di Stefano at the Refactoring Ideas blog....

Seven Databases in Seven Days – Riak

In this post I am summarizing the three days of Riak, which is the second database in the Seven Databases in Seven Weeks book. This post exists mostly so that I remember some tweaks I had to make while reading this chapter, as sometimes the book wasn’t entirely correct. A good blog, which I used a little, can be found at: http://blog.wakatta.jp/blog/2011/12/09/seven-databases-in-seven-weeks-riak-day-3/ (this link points to the 3rd Riak day). I have everything pushed to GitHub as raw material: https://github.com/eyalgo/seven-dbs-in-seven-weeks

Installing

The book recommends installing from the source code itself. I needed to install Erlang as well. Besides the information in the book, the following link was most helpful: http://docs.basho.com/riak/latest/ops/building/installing/from-source/ I installed everything under /usr/local/riak/.

Start / Stop / Restart

A nice command line to start/stop/restart all the servers:

# under /usr/local/riak/riak-1.4.8/dev
for node in `ls`; do $node/bin/riak start; done
# change start to restart or stop

Port

The ports on my machine were: 10018 for dev1, 10028 for dev2, etc. The port is located in the app.config file, under the etc folder.

Day 3 Issues

Pre-commit

I kept getting a “PUT aborted by pre-commit hook” message instead of the one described in the book. I had to add the language (javascript) to the operation:

curl -i -X PUT http://localhost:10018/riak/animals -H "content-type: application/json" -d '{"props":{"precommit":[{"name":"good_score","language":"javascript"}]}}'

(see: http://blog.sacaluta.com/2012/07/riak-precommit-hook-example.html)

Running a solr query

Running the suggested query from the book (curl http://localhost:10018/solr/animals/select?wt=json&q=nickname:rin%20breed:shepherd&q.op=and) kept returning 400 – Bad Request. All I needed to do was to surround the URL with ' (apostrophes).

Inverted Index

Running the link as mentioned in the book gives a bad response: Invalid link walk query submitted.
Valid link walk query format is: ...

The correct way, as described in http://docs.basho.com/riak/latest/dev/using/2i/:

curl http://localhost:10018/buckets/animals/index/mascot_bin/butler

Conclusion

The Riak chapter gives a taste of this database. It explains more about the “tooling” of it rather than the application of it. I feel that it didn’t explain much about why someone would use it instead of something else (let’s wait for Redis). The book had errors in how to run commands, and I had to find out by myself how to fix these problems. Perhaps it’s because I’m reading the eBook (PDF on my computer and mobi on my Kindle), and the hard copy has fewer issues. The good part of this problem is that I had to drill down, read more online, and learn more from those mistakes.

Reference: Seven Databases in Seven Days – Riak from our JCG partner Eyal Golan at the Learning and Improving as a Craftsman Developer blog....

Hi there . . ! How would you rate your Java/Java EE skills?

“To know, is to know that you know nothing. That is the meaning of true knowledge.” – Socrates

This post is to provide the reader with a quick overview of the Java ecosystem and its technology stack. To be honest, there have been many revolutionary changes and additions to the Java Platform – from Java EE 7 and Java SE 8 to Java Embedded 8 …. wow! Exciting times! In the midst of all this, why did I decide to write a blog post about a rudimentary topic such as the Java platform and its related technologies? How many times have you conducted an interview and asked a candidate to provide a rough estimate/rating of their Java skill set (on a specific scale)? What kind of answers have you received? 8/10, 4/5, 6.5/10?? I am pretty surprised as to how the candidate actually managed to muster these figures in a matter of a few seconds (I really don’t think that experience matters here!). So the premise of this post is to:

- Drive home the point that “How would you rate your Java/J2EE skills?” is an unreasonable question – even though I have made the mistake of asking this on a number of occasions!
- Help you answer it!

Read on . . . . . . .

Java technology can be broadly categorized into:

- Java SE
- Java EE
- Java Embedded
- Java FX

Let’s begin . . . . .

Java Standard Edition (Java SE)

The Platform itself! The mother of all other Java related technologies, ranging from Java EE on enterprise servers to Java Embedded on resource-constrained devices. Latest version – Java SE 8 (click here for more on the new stuff in Java SE 8). Java is not just a programming language, as many people mistakenly assume. It’s a complete platform (sorry about the fact that I had to plug in the tabular content in the form of images; for some reason I can’t seem to find support for inserting tables into my WordPress blogs.
Hence I decided to write the content in Word and use their snapshots.)

Primary Components

Java Enterprise Edition (Java EE)

For developing enterprise-grade applications which are distributed, multi-tiered, scalable, robust and fault tolerant. Latest version – Java EE 7 (click here for more on the latest Java EE 7 features).

A standards-driven model:

- Java EE 7 defines a unified model for developing rich and powerful server-side solutions.
- It is composed of individual specifications which are standards in themselves. Each of these specifications is a set of interfaces/APIs which are implemented by vendors of application servers (more details here).
- There are 32 specifications which Java EE defines.

Alright then! I am guessing you have had enough of Java EE …. ! Let’s move on.

Java Embedded

The Java Embedded technologies are focused on mobile and embedded devices (RFIDs, sensors, microcontrollers, Blu-ray discs etc.) and are powered mainly by different flavours of Java ME and SE for specific device capabilities.

Java Micro Edition (Java ME) flavours:

Java ME Embedded Client
- Based on the Connected Device Configuration (CDC) – a subset of the Java SE platform for small devices like mobile phones
- Sufficient for devices having 8 MB RAM or more

Java ME Embedded
- New launch
- Based on the Connected Limited Device Configuration (CLDC) – a JVM which is optimized for really small embedded systems which have 130 KB or more memory
- Suitable for memory/resource constrained embedded devices such as sensors, wireless modules etc.
- Hailed as the platform of choice for developing applications in the Internet of Things (IoT) era
- The latest version is Java ME Embedded 8 (Early Access) – lends support for language features from Java SE 8

Java SE flavours:

Java SE Embedded
- Its JVM implementation is suitable for mid to high range embedded devices
- 32 MB or more memory is required
- Allows developers to configure their own custom JRE as per application requirements
- Latest version – Java SE Embedded 8

Java Embedded Suite
- New platform – An
enriched version of Java SE Embedded
- Adds enterprise functionality like support for the GlassFish server (yes – an application server in an embedded device!), Java DB, and REST support through a JAX-RS implementation
- Oracle Event Processing – an optional module in the Java Embedded Suite. It aims at extending real-time, event-driven processing support to embedded devices

Java FX

Java FX is leveraged to build rich client applications. It sort of completes the puzzle, so to say: it complements the Java server-side development stack and provides a comprehensive UI platform including graphics and media API support. It’s tailor-made to deliver high performance with hardware-accelerated graphics.

Ok, so.. what was the whole point of this post? To help you answer the inevitable “How would you rate your Java/J2EE skills?” Basically, this is what you can do:

- Summarize this post – it’s not going to be tough.. trust me!
- Ask the interviewer to be more specific as far as Java is concerned, given the fact that you explained the length and breadth of the Java platform!

Although this post only touched upon the various Java tech flavours, it’s quite evident how vast it is. That’s precisely why we as mortals cannot expect to attach numbers and random figures to our Java knowledge. Instead of fooling around with Java ratings, let’s just have fun with the platform and language and leverage it to build stuff which the world has not yet imagined!

Reference: Hi there . . ! How would you rate your Java/Java EE skills? from our JCG partner Abhishek Gupta at the Object Oriented.. blog....

We’re Hacking JDBC, so You Don’t Have To

We love working with JDBC

Said no one. Ever.

On a more serious note, JDBC is actually a very awesome API, if you think about it. It is probably also one of the very reasons Java has become the popular platform it is today. Before JDK 1.1, and before ODBC (and that’s a very long time ago) it was hard to imagine any platform that would standardise database access at all. Heck, SQL itself was hardly even standardised at the time, and along came Java with JDBC, a simple API with only a few items that you have to know of in every day work:

- Connection: the object that models all your DB interactions
- PreparedStatement: the object that lets you execute a statement
- ResultSet: the object that lets you fetch data from the database

That’s it!

Back to reality

That was the theory. In practice, enterprise software operating on top of JDBC quickly evolved towards this: JDBC is one of the last resorts for Java developers, where they can feel like real hackers, hacking this very stateful, very verbose, very arcane API in many ways. Pretty much everyone operating on JDBC will implement wrappers around the API to prevent at least:

- Common syntax errors
- Bind variable index mismatches
- Dynamic SQL construction
- Edge cases around the usage of LOBs
- Resource handling and closing
- Array and UDT management
- Stored procedure abstraction

… and so much more. So while everyone is doing the above infrastructure work, they’re not working on their business logic. And pretty much everyone does these things when working with JDBC. Hibernate and JPA do not have most of these problems, but they’re not SQL APIs any longer, either. Here are a couple of examples that we have been solving inside of jOOQ, so you don’t have to:

How to fetch generated keys in some databases

case DERBY:
case H2:
case MARIADB:
case MYSQL: {
    try {
        listener.executeStart(ctx);
        result = ctx.statement().executeUpdate();
        ctx.rows(result);
        listener.executeEnd(ctx);
    }

    // Yes. Not all warnings may have been consumed yet
    finally {
        consumeWarnings(ctx, listener);
    }

    // Yep. Should be as simple as this. But it isn't.
    rs = ctx.statement().getGeneratedKeys();

    try {
        List<Object> list = new ArrayList<Object>();

        // Some JDBC drivers seem to illegally return null
        // from getGeneratedKeys() sometimes
        if (rs != null) {
            while (rs.next()) {
                list.add(rs.getObject(1));
            }
        }

        // Because most JDBC drivers cannot fetch all
        // columns, only identity columns
        selectReturning(ctx.configuration(), list.toArray());
        return result;
    }
    finally {
        JDBCUtils.safeClose(rs);
    }
}

How to handle BigInteger and BigDecimal

else if (type == BigInteger.class) {

    // The SQLite JDBC driver doesn't support BigDecimals
    if (ctx.configuration().dialect() == SQLDialect.SQLITE) {
        return Convert.convert(rs.getString(index), (Class) BigInteger.class);
    }
    else {
        BigDecimal result = rs.getBigDecimal(index);
        return (T) (result == null ? null : result.toBigInteger());
    }
}
else if (type == BigDecimal.class) {

    // The SQLite JDBC driver doesn't support BigDecimals
    if (ctx.configuration().dialect() == SQLDialect.SQLITE) {
        return Convert.convert(rs.getString(index), (Class) BigDecimal.class);
    }
    else {
        return (T) rs.getBigDecimal(index);
    }
}

How to fetch all exceptions from SQL Server

switch (configuration.dialect().family()) {
    case SQLSERVER:
        consumeLoop: for (;;)
            try {
                if (!stmt.getMoreResults() && stmt.getUpdateCount() == -1)
                    break consumeLoop;
            }
            catch (SQLException e) {
                previous.setNextException(e);
                previous = e;
            }
}

Convinced? This is nasty code. And we have more examples of nasty code here, or in our source code. All of these examples show that when working with JDBC, you’ll write code that you don’t want to / shouldn’t have to write in your application. This is why… we have been hacking JDBC, so you don’t have to.

Reference: We’re Hacking JDBC, so You Don’t Have To from our JCG partner Lukas Eder at the JAVA, SQL, AND JOOQ blog....
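An aside on the snippets above: the generated-keys example leans on a null-safe close helper (it calls JDBCUtils.safeClose(rs)). The “resource handling and closing” boilerplate from the wrapper list boils down to a few lines; this is my own minimal sketch, not jOOQ’s actual implementation:

```java
// Minimal sketch of a null-safe, exception-swallowing close helper,
// the kind of wrapper the article lists under "resource handling and
// closing". Works for any JDBC resource: ResultSet, Statement and
// Connection all extend AutoCloseable since Java 7.
final class SafeClose {

    private SafeClose() {}

    static void safeClose(AutoCloseable resource) {
        if (resource != null) {
            try {
                resource.close();
            }
            catch (Exception ignore) {
                // Best-effort close: an exception thrown while closing
                // must not mask the exception already in flight.
            }
        }
    }
}
```

In modern code, try-with-resources removes much of this for the common case, but an explicit helper remains handy whenever a resource’s lifetime does not fit a single lexical block — which is exactly the situation in the getGeneratedKeys() example above.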

Conway’s Game of Life and the Flyweight Pattern

Conway’s Game of Life is fascinating, both from a functional and from a technical perspective. This may explain why it’s often used for code retreats. Code retreats are a fun way to learn. It’s amazing how working with new pairs gives you new insights virtually every time. At the last code retreat that I attended, one of my pairs suggested we use the Flyweight pattern for cells:

“A flyweight is a shared object that can be used in multiple contexts simultaneously. The flyweight acts as an independent object in each context — it’s indistinguishable from an instance of the object that is not shared.”

When the Design Patterns book (which contains the above quote) came out, I remember having many aha moments. It was so cool to see all these patterns that I had used before and finally have a name for them, so I could discuss them with my peers more efficiently! I did not have an aha moment when reading about Flyweight, however. The example in the book, sharing character objects in a text editor, seemed a bit far-fetched at the time. This example is not unlike the cells in a Game of Life grid, however, so I happily went along with my pair’s idea to explore the pattern’s applicability in this context. After the code retreat was over, I gave this pattern some more thought. (This is usually where the code retreat really starts paying off.) We actually use a potential flyweight all the time: booleans. A boolean is a class with only two instances, and those instances could easily be shared. In Java they are not: new Boolean(true) != new Boolean(true). However, the Boolean class does provide two constants, TRUE and FALSE, for the instances that you could use for sharing. That got me thinking about using enums for flyweights. Most of the time, I use enums for grouping related but mutually exclusive constants, like days of the week.
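That claim about Boolean is easy to verify: Boolean.valueOf(boolean) is documented to return the shared Boolean.TRUE/Boolean.FALSE constants, which makes Boolean a ready-made flyweight — as long as you avoid the constructor:

```java
public class BooleanFlyweight {

    public static void main(String[] args) {
        // Each constructor call creates a distinct instance...
        System.out.println(new Boolean(true) == new Boolean(true)); // false

        // ...but valueOf() hands out the two shared constants,
        // so identity comparison succeeds: a flyweight in disguise.
        System.out.println(Boolean.valueOf(true) == Boolean.TRUE);          // true
        System.out.println(Boolean.valueOf(true) == Boolean.valueOf(true)); // true
    }
}
```

(The Boolean constructor has since been deprecated — in Java 9 — for exactly this reason.)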
However, Enums in Java can define methods:

public enum Cell {

    ALIVE(true), DEAD(false);

    private final boolean alive;

    private Cell(boolean alive) {
        this.alive = alive;
    }

    public boolean isAlive() {
        return alive;
    }

    public Cell evolve(int numLiveNeighbors) {
        boolean aliveInNextGeneration = alive
            ? 2 <= numLiveNeighbors && numLiveNeighbors <= 3
            : numLiveNeighbors == 3;
        return aliveInNextGeneration ? ALIVE : DEAD;
    }
}

One of the fun parts of code retreats is that in some sessions, you will have constraints on the way you work. Such constraints force you to be more creative and think beyond the techniques you would normally use. One constraint that is interesting in this context is to not use any conditionals, like if or switch statements or ternary operators. The idea behind this constraint is to force you to replace conditionals with polymorphism, making your program more object oriented. The only way that I see to keep the current Cell enum and not use conditionals, is to introduce a map:

public enum Cell {

    ALIVE(true), DEAD(false);

    private final boolean alive;
    private static final Map<Boolean, Map<Integer, Cell>> NEXT = new HashMap<>();

    static {
        Map<Integer, Cell> dead = new HashMap<>();
        dead.put(0, DEAD);
        dead.put(1, DEAD);
        dead.put(2, DEAD);
        dead.put(3, ALIVE);
        dead.put(4, DEAD);
        dead.put(5, DEAD);
        dead.put(6, DEAD);
        dead.put(7, DEAD);
        dead.put(8, DEAD);
        dead.put(9, DEAD);
        NEXT.put(false, dead);

        Map<Integer, Cell> alive = new HashMap<>();
        alive.put(0, DEAD);
        alive.put(1, DEAD);
        alive.put(2, ALIVE);
        alive.put(3, ALIVE);
        alive.put(4, DEAD);
        alive.put(5, DEAD);
        alive.put(6, DEAD);
        alive.put(7, DEAD);
        alive.put(8, DEAD);
        alive.put(9, DEAD);
        NEXT.put(true, alive);
    }

    private Cell(boolean alive) {
        this.alive = alive;
    }

    public boolean isAlive() {
        return alive;
    }

    public Cell evolve(int numLiveNeighbors) {
        return NEXT.get(alive).get(numLiveNeighbors);
    }
}

This approach works, but is not very elegant, and it breaks down when the number of possibilities grows.
Clearly, we need a better alternative. The only way we can get rid of the conditional is by getting rid of the boolean state of the cell. That means we need to have different classes for the two instances, so that the type implicitly embodies the state. That in turn means we need a factory to hide those classes from the client:

public interface Cell {

    boolean isAlive();

    Cell evolve(int numLiveNeighbors);
}

public class CellFactory {

    private static final Map<Boolean, Cell> CELLS = new HashMap<>();

    static {
        CELLS.put(false, new DeadCell());
        CELLS.put(true, new AliveCell());
    }

    public static Cell dead() {
        return cell(false);
    }

    public static Cell alive() {
        return cell(true);
    }

    static Cell cell(boolean alive) {
        return CELLS.get(alive);
    }
}

class DeadCell implements Cell {

    @Override
    public boolean isAlive() {
        return false;
    }

    @Override
    public Cell evolve(int numLiveNeighbors) {
        return CellFactory.cell(numLiveNeighbors == 3);
    }
}

class AliveCell implements Cell {

    @Override
    public boolean isAlive() {
        return true;
    }

    @Override
    public Cell evolve(int numLiveNeighbors) {
        return CellFactory.cell(numLiveNeighbors == 2 || numLiveNeighbors == 3);
    }
}

Indeed, when you look up the Flyweight pattern, you’ll see that the proposed structure contains a flyweight factory that creates instances of concrete flyweight classes that implement a common flyweight interface. Thanks to the code retreat and my partner, I now know why.

Reference: Conway’s Game of Life and the Flyweight Pattern from our JCG partner Remon Sinnema at the Secure Software Development blog....
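The defining property of the factory version — that all clients share the same two instances — can be checked with a small self-contained program. This is my own condensation of the classes above (anonymous classes stand in for DeadCell and AliveCell), not the article’s exact code:

```java
import java.util.HashMap;
import java.util.Map;

public class FlyweightCheck {

    interface Cell {
        boolean isAlive();
        Cell evolve(int numLiveNeighbors);
    }

    static final Cell DEAD = new Cell() {
        public boolean isAlive() { return false; }
        public Cell evolve(int n) { return cell(n == 3); }
    };

    static final Cell ALIVE = new Cell() {
        public boolean isAlive() { return true; }
        public Cell evolve(int n) { return cell(n == 2 || n == 3); }
    };

    // The factory's lookup table: two flyweights shared by every client.
    static final Map<Boolean, Cell> CELLS = new HashMap<>();
    static {
        CELLS.put(false, DEAD);
        CELLS.put(true, ALIVE);
    }

    static Cell cell(boolean alive) {
        return CELLS.get(alive);
    }

    public static void main(String[] args) {
        // Identity, not just equality: the instances are shared.
        System.out.println(cell(true) == cell(true));       // true

        // Conway's rules still hold across generations.
        System.out.println(cell(true).evolve(1) == DEAD);   // true (underpopulation)
        System.out.println(cell(true).evolve(2) == ALIVE);  // true (survival)
        System.out.println(cell(true).evolve(4) == DEAD);   // true (overpopulation)
        System.out.println(cell(false).evolve(3) == ALIVE); // true (reproduction)
    }
}
```

No matter how large the grid, a Game of Life built on this factory allocates exactly two cell objects — which is the whole point of the pattern.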

Coloring Different Data Sources in IntelliJ IDEA

The database plugin in IntelliJ IDEA is a useful tool to work with data in databases. As long as we have a JDBC driver to connect to the database, we can configure a data source, and then we can run queries, inspect the contents of tables and change data with the database tool window. It is not uncommon to have multiple data sources, for example development and test environment databases, which will have the same tables. When we open the tables or run queries, we get no visual feedback that tells us to which data source a table belongs. To get that feedback we can colorize our data sources: we assign a color to a data source, and when we open a table from that data source, the tab in the editor window gets a different color than other tabs, or the data source objects get a colored background.

To add a color to a data source, we open the database tool window and right-click on a data source. We select the option Color Settings… from the popup menu.

Next a new dialog opens where we can select a color. We can pick one of the predefined colors or create a custom color. In the Appearance Settings we can also select where in IntelliJ IDEA the colored data source must appear. We click the OK button to save our settings.

We can repeat these steps for other data sources and give them different colors. Once we have added colors to our data sources, we can see the different colors, for example, in the tabs of our editor window, or when we open the data sources in the database tool window to get a list of all objects in a data source. Even when we open the dialog with recently changed files, we can see the colorized data source objects.

Sample with IntelliJ IDEA 13.1.1

Reference: Coloring Different Data Sources in IntelliJ IDEA from our JCG partner Hubert Ikkink at the JDriven blog....

Maven and Java multi-version modules

Introduction

Usually, a project has a minimum Java version requirement that applies to all of its modules. But every rule has its exceptions, as recently I stumbled on the following issue: one open source project of mine mandates Java 1.6 for most of its modules, except one requiring version 1.7. This happens when you integrate external libraries with different Java requirements than your own project. Because that one module integrates the DBCP2 framework (which requires at least Java 1.7), I need to instruct Maven to use two different Java compilers.

Environment variables

We need to define the following environment variables:

Environment Variable Name    Environment Variable Value
JAVA_HOME_6                  C:\Program Files\Java\jdk1.6.0_38
JAVA_HOME_7                  C:\Program Files\Java\jdk1.7.0_25
JAVA_HOME                    %JAVA_HOME_6%

The parent pom.xml

The parent pom.xml defines the global Java version settings:

<properties>
    <jdk.version>6</jdk.version>
    <jdk>${env.JAVA_HOME_6}</jdk>
</properties>

We need to instruct both the compiler and the test plugins to use the configured Java version.
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>${jdk.version}</source>
                <target>${jdk.version}</target>
                <showDeprecation>true</showDeprecation>
                <showWarnings>true</showWarnings>
                <executable>${jdk}/bin/javac</executable>
                <fork>true</fork>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <jvm>${jdk}/bin/java</jvm>
                <forkMode>once</forkMode>
            </configuration>
        </plugin>
    </plugins>
</build>

The specific module pom.xml

Those modules requiring a different Java version just need to override the default settings:

<properties>
    <jdk.version>7</jdk.version>
    <jdk>${env.JAVA_HOME_7}</jdk>
</properties>

And that's it: we can now build each module using its own specific minimum Java version requirement.

Reference: Maven and Java multi-version modules from our JCG partner Vlad Mihalcea at the Vlad Mihalcea’s Blog blog....
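As an illustration of why one module can end up with a higher requirement than the rest: DBCP2 is built around AutoCloseable resources, and code like the following (try-with-resources, a Java 7 language feature) simply will not compile with a 1.6 source level. This is a hypothetical sketch for illustration, not code from the project above:

```java
// Hypothetical sketch: try-with-resources (Java 7+) is the kind of
// construct a 1.6 javac rejects, forcing the module off the
// project-wide 1.6 baseline.
public class RequiresJava7 {

    static class Resource implements AutoCloseable {
        boolean closed;

        @Override
        public void close() { closed = true; }
    }

    static boolean useAndClose() {
        Resource resource = new Resource();
        // The resource is closed automatically when the block exits.
        try (Resource managed = resource) {
            // work with the resource
        }
        return resource.closed;
    }

    public static void main(String[] args) {
        System.out.println(useAndClose()); // true
    }
}
```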

New BigInteger Methods in Java 8

Attention to new features in JDK 8 has rightfully been largely focused on new language features and syntax. However, there are some nice additions to the libraries and APIs, and in this post I cover four new methods added to the BigInteger class: longValueExact(), intValueExact(), shortValueExact(), and byteValueExact().

All four of the newly introduced “xxxxxExact()” methods throw an ArithmeticException if the number contained in the BigInteger instance cannot be provided in the specified form (specified in the method’s name) without loss of information. BigInteger already had the methods intValue() and longValue() as well as the inherited (from Number) methods shortValue() and byteValue(). These methods do not throw exceptions if the BigInteger value loses information in its presentation as one of these types. Although at first glance this may seem like an advantage, it means that code using the results of these methods works with inaccurate values without any way to know that information was lost. The new “xxxxxExact” methods throw an ArithmeticException rather than pretending to provide a result that has lost significant information.

The following simple code listing demonstrates the “legacy” methods that present wrong data in types byte, short, int, and long rather than throwing an exception. The same code also demonstrates use of the new “xxxxxExact” methods that throw an exception when information is lost rather than presenting a bad representation. The output of running this code follows the code and demonstrates how the methods behave differently when the BigInteger contains a value with more information than the returned byte, short, int, or long can represent.

BigIntegerDemo.java

package dustin.examples.jdk8;

import static java.lang.System.out;

import java.math.BigInteger;

/**
 * Demonstrate the four new methods of BigInteger introduced with JDK 8.
 *
 * @author Dustin
 */
public class BigIntegerDemo {

    /**
     * Demonstrate BigInteger.byteValueExact().
     */
    private static void demonstrateBigIntegerByteValueExact() {
        final BigInteger byteMax = new BigInteger(String.valueOf(Byte.MAX_VALUE));
        out.println("Byte Max: " + byteMax.byteValue());
        out.println("Byte Max: " + byteMax.byteValueExact());
        final BigInteger bytePlus = byteMax.add(BigInteger.ONE);
        out.println("Byte Max + 1: " + bytePlus.byteValue());
        out.println("Byte Max + 1: " + bytePlus.byteValueExact());
    }

    /**
     * Demonstrate BigInteger.shortValueExact().
     */
    private static void demonstrateBigIntegerShortValueExact() {
        final BigInteger shortMax = new BigInteger(String.valueOf(Short.MAX_VALUE));
        out.println("Short Max: " + shortMax.shortValue());
        out.println("Short Max: " + shortMax.shortValueExact());
        final BigInteger shortPlus = shortMax.add(BigInteger.ONE);
        out.println("Short Max + 1: " + shortPlus.shortValue());
        out.println("Short Max + 1: " + shortPlus.shortValueExact());
    }

    /**
     * Demonstrate BigInteger.intValueExact().
     */
    private static void demonstrateBigIntegerIntValueExact() {
        final BigInteger intMax = new BigInteger(String.valueOf(Integer.MAX_VALUE));
        out.println("Int Max: " + intMax.intValue());
        out.println("Int Max: " + intMax.intValueExact());
        final BigInteger intPlus = intMax.add(BigInteger.ONE);
        out.println("Int Max + 1: " + intPlus.intValue());
        out.println("Int Max + 1: " + intPlus.intValueExact());
    }

    /**
     * Demonstrate BigInteger.longValueExact().
     */
    private static void demonstrateBigIntegerLongValueExact() {
        final BigInteger longMax = new BigInteger(String.valueOf(Long.MAX_VALUE));
        out.println("Long Max: " + longMax.longValue());
        out.println("Long Max: " + longMax.longValueExact());
        final BigInteger longPlus = longMax.add(BigInteger.ONE);
        out.println("Long Max + 1: " + longPlus.longValue());
        out.println("Long Max + 1: " + longPlus.longValueExact());
    }

    /**
     * Demonstrate BigInteger's four new methods added with JDK 8.
     *
     * @param arguments Command line arguments.
     */
    public static void main(final String[] arguments) {
        System.setErr(out); // exception stack traces to go to standard output
        try {
            demonstrateBigIntegerByteValueExact();
        } catch (Exception exception) {
            exception.printStackTrace();
        }
        try {
            demonstrateBigIntegerShortValueExact();
        } catch (Exception exception) {
            exception.printStackTrace();
        }
        try {
            demonstrateBigIntegerIntValueExact();
        } catch (Exception exception) {
            exception.printStackTrace();
        }
        try {
            demonstrateBigIntegerLongValueExact();
        } catch (Exception exception) {
            exception.printStackTrace();
        }
    }
}

The Output

Byte Max: 127
Byte Max: 127
Byte Max + 1: -128
java.lang.ArithmeticException: BigInteger out of byte range
    at java.math.BigInteger.byteValueExact(BigInteger.java:4428)
    at dustin.examples.jdk8.BigIntegerDemo.demonstrateBigIntegerByteValueExact(BigIntegerDemo.java:23)
    at dustin.examples.jdk8.BigIntegerDemo.main(BigIntegerDemo.java:75)
Short Max: 32767
Short Max: 32767
Short Max + 1: -32768
java.lang.ArithmeticException: BigInteger out of short range
    at java.math.BigInteger.shortValueExact(BigInteger.java:4407)
    at dustin.examples.jdk8.BigIntegerDemo.demonstrateBigIntegerShortValueExact(BigIntegerDemo.java:36)
    at dustin.examples.jdk8.BigIntegerDemo.main(BigIntegerDemo.java:84)
Int Max: 2147483647
Int Max: 2147483647
Int Max + 1: -2147483648
java.lang.ArithmeticException: BigInteger out of int range
    at java.math.BigInteger.intValueExact(BigInteger.java:4386)
    at dustin.examples.jdk8.BigIntegerDemo.demonstrateBigIntegerIntValueExact(BigIntegerDemo.java:49)
    at dustin.examples.jdk8.BigIntegerDemo.main(BigIntegerDemo.java:93)
Long Max: 9223372036854775807
Long Max: 9223372036854775807
Long Max + 1: -9223372036854775808
java.lang.ArithmeticException: BigInteger out of long range
    at java.math.BigInteger.longValueExact(BigInteger.java:4367)
    at dustin.examples.jdk8.BigIntegerDemo.demonstrateBigIntegerLongValueExact(BigIntegerDemo.java:62)
    at dustin.examples.jdk8.BigIntegerDemo.main(BigIntegerDemo.java:102)

As the above output demonstrates, the new BigInteger methods with “xxxxxExact” in their name will not present inaccurate representations when the returned type cannot hold the information in the BigInteger instance. Although exceptions are generally not one of our favorite things, they are almost always going to be better than getting and using wrong data without even realizing it is wrong.

Reference: New BigInteger Methods in Java 8 from our JCG partner Dustin Marx at the Inspired by Actual Events blog....
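When the out-of-range case is expected and a fallback value is acceptable, the exact methods pair naturally with a small wrapper. A hypothetical helper sketch (the name longValueOr is mine, not part of the JDK):

```java
import java.math.BigInteger;

public class SafeNarrowing {

    // Hypothetical helper: narrow exactly, or fall back to a sentinel
    // instead of silently truncating the way longValue() would.
    static long longValueOr(BigInteger value, long fallback) {
        try {
            return value.longValueExact();
        } catch (ArithmeticException outOfRange) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        BigInteger fits = BigInteger.valueOf(Long.MAX_VALUE);
        BigInteger tooBig = fits.add(BigInteger.ONE);
        System.out.println(longValueOr(fits, -1L));   // 9223372036854775807
        System.out.println(longValueOr(tooBig, -1L)); // -1
    }
}
```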
Java Code Geeks and all content copyright © 2010-2014, Exelixis Media Ltd | Terms of Use