


Use reCaptcha in a Spring MVC web application

A CAPTCHA is a program that generates and grades tests that humans can pass but computer programs cannot. One common strategy is to show the user an image containing distorted text, which the user must type into an input field. If the typed text matches the displayed text, we can be reasonably sure a human is at the computer. Captchas have several practical security applications, for example:

- Preventing spam in comment fields.
- Protecting against massive automated user registration.
- Preventing dictionary attacks.

The distorted texts are acquired as follows: physical books and newspapers are digitized. Pages are photographically scanned and then transformed into text using Optical Character Recognition (OCR). OCR is not perfect, so each word that cannot be read correctly by OCR is placed on an image and used as a CAPTCHA. Such a word is shown to a user together with another word for which the answer is already known. The user is asked to read both words; if the user solves the one for which the answer is known, the system assumes their answer for the new word is also correct. The system then gives the new image to a number of other people to determine, with higher confidence, whether the original answer was correct.

Now that you know how a captcha works, the problem is that if you wanted to use captchas on your website, you would have to implement the process described above yourself, and that is hard, tedious digitization work. For this reason there are 'captcha providers' that have done this work for us. One of these providers is reCaptcha (http://www.google.com/recaptcha). reCaptcha is a free captcha service that provides captchas ready to be used on our site. As developers we only have to embed a piece of code on the client side to show the captcha image and text area, and on the server side call a function to validate the input data.
reCaptcha provides plugins for many programming languages such as Java, PHP, Perl and more. This post will guide you through using reCaptcha in a Spring MVC web application. The application consists of a form to register a new user; the form contains a captcha to prevent a bot from launching a massive registration attack.

The first step is to open an account on the reCaptcha site (you can use your Google account or create a new one). Once you have signed in, go to My Account – Add New Site. In the domain box, enter the domain that will contain the captcha validation. For this example I entered localhost and checked 'Enable this key on all domains (global key)'. Of course, the information provided here is for testing purposes and should be different in a production environment. After you register your site, two keys are provided: a private key (XXXX) and a public key (YYYY).

Before coding, let me show the basic life-cycle of a reCAPTCHA challenge (the diagram is from the reCaptcha site).

The second step is to create a Spring MVC application. No secret here; I am only going to explain the parts involved in the reCaptcha integration. Apart from the Spring MVC dependencies, the recaptcha4j API should be added:

<dependency>
    <groupId>net.tanesha.recaptcha4j</groupId>
    <artifactId>recaptcha4j</artifactId>
    <version>0.0.7</version>
</dependency>

recaptcha4j is an API that provides a simple way to place a captcha on your Java-based website; the library wraps the reCAPTCHA API. Integrating reCaptcha into a form requires two modifications: one on the client side, to connect to the reCaptcha server and get the challenge, and one on the server side, to send the user's answer to the reCaptcha server and get back a response.

Client side: a tag file has been created to encapsulate all the logic of the reCaptcha API in a single point, so it can be reused in all JSP forms.
<%@ tag import='net.tanesha.recaptcha.ReCaptcha' %>
<%@ tag import='net.tanesha.recaptcha.ReCaptchaFactory' %>
<%@ attribute name='privateKey' required='true' rtexprvalue='false' %>
<%@ attribute name='publicKey' required='true' rtexprvalue='false' %>
<%
ReCaptcha c = ReCaptchaFactory.newReCaptcha(publicKey, privateKey, false);
out.print(c.createRecaptchaHtml(null, null));
%>

The ReCaptcha class requires the private key (XXXX) and the public key (YYYY) provided by reCaptcha in step one. The method createRecaptchaHtml(…) generates the piece of HTML that shows the challenge. And finally, a JSP page with a form and the captcha information:

<%@ taglib uri='http://java.sun.com/jsp/jstl/core' prefix='c' %>
<%@ taglib prefix='form' uri='http://www.springframework.org/tags/form' %>
<%@ taglib prefix='tags' tagdir='/WEB-INF/tags' %>
<%@ page session='false' %>
<html>
<head>
<title>Register User</title>
</head>
<body>
<h1>
<form:form id='register' modelAttribute='userInfo'>
<table>
<tr><td>Username:</td><td><form:input path='username'/></td></tr>
<tr><td>Password:</td><td><form:password path='password'/></td></tr>
<tr><td>Age:</td><td><form:input path='age'/></td></tr>
<tr><td colspan='2'><tags:captcha privateKey='XXXX' publicKey='YYYY'/></td></tr>
<tr><td colspan='2'><input id='submit' type='submit' value='Submit'/></td></tr>
</table>
</form:form>
</h1>
</body>
</html>

See that the form is generated as usual using the Spring MVC taglib, but we are also using the created tag file (<tags:captcha>) to embed the captcha into the form.

Server side: the server side is even simpler than the client side. When a captcha is created using createRecaptchaHtml, two form fields are created: recaptcha_challenge_field, which contains information about the challenge presented to the user, and recaptcha_response_field, which contains the user's answer to the challenge. Apart from these two parameters, recaptcha4j requires the remote address too.
The ServletRequest interface has a method (getRemoteAddr()) for this purpose.

@RequestMapping(value = "", method = RequestMethod.POST)
public String submitForm(@ModelAttribute("userInfo") UserInfo userInfo,
        @RequestParam("recaptcha_challenge_field") String challengeField,
        @RequestParam("recaptcha_response_field") String responseField,
        ServletRequest servletRequest) {
    String remoteAddress = servletRequest.getRemoteAddr();
    ReCaptchaResponse reCaptchaResponse = this.reCaptcha.checkAnswer(remoteAddress, challengeField, responseField);
    if (reCaptchaResponse.isValid()) {
        return "success";
    } else {
        return "home";
    }
}

The reCaptcha object is injected using Spring. It is important to note that UserInfo (the data entered by the user in the form) does not contain any information about the captcha; it only contains 'business' data. Using @RequestParam, the reCaptcha parameters are retrieved by Spring and can be passed directly to the reCaptcha object. The other important part is the isValid() method, which simply checks whether the reCaptcha site's response says the user passed the challenge. Depending on the result you should act accordingly; if the challenge is not passed, returning to the previous page is a good practice.

<bean id='recaptcha' class='net.tanesha.recaptcha.ReCaptchaImpl'>
    <property name='privateKey' value='XXXX'/>
</bean>

This bean definition simply instantiates the ReCaptchaImpl class with your private key. Using @Autowired, the bean is injected into the controller.

Step three: the last step is to check that the created form shows the captcha image and that the controller redirects you to a page depending on what you entered into the captcha text area.
Extra step: now that you have a basic notion of how to work with reCaptcha, the next step (out of the scope of this post) is, instead of showing the form again without any error message, to use BindingResult in the controller to notify the user with an error message:

if (!reCaptchaResponse.isValid()) {
    FieldError fieldError = new FieldError("userInfo", "captcha", "Please try again.");
    result.addError(fieldError);
}

The result variable is an attribute of type BindingResult passed to submitForm. Of course, the JSP should be changed with <form:errors path='captcha'/> to show the error message. Another improvement is creating a HandlerInterceptor to validate forms with captchas. For example, a ReCaptchaHandlerInterceptorAdapter would contain the reCaptcha management; its preHandle method would return true if the captcha challenge was resolved correctly by the user (allowing the mapped controller to do its work), or false, redirecting to an error page.

<mvc:interceptors>
    <mvc:interceptor>
        <mvc:mapping path='/*.form'/>
        <bean class='org.springsource.mvc.ReCaptchaHandlerInterceptorAdapter'/>
    </mvc:interceptor>
</mvc:interceptors>

With the previous handler configuration, all forms would have captcha validation. I hope you find this post useful; now you can start protecting your web forms from spam and bots. Download the Eclipse Project.

Reference: Mornië Utúlië, Believe And You Will Find Your Way (May It Be – Enya) from our JCG partner Alex Soto at the One Jar To Rule Them All blog.

Tomcat Clustering Series Part 2 : Session Affinity Load Balancer

This is the second part of the Tomcat Clustering Series. In the first part we discussed how to set up a simple load balancer, and we saw how the load balancer distributes requests to the Tomcat instances in round-robin fashion. In this post we talk about the problem that occurs with the simple load balancer when we introduce sessions into our web application, and we will see how to resolve it.

How do sessions work in Servlet/Tomcat? Before going into the problem, let's look at session management in Tomcat. If any page/servlet creates a session, Tomcat creates the session object and attaches it to its group of sessions (a HashMap-like structure). That session is identified by a session id, which is just a random number generated through a hash algorithm. Tomcat then responds to the client with a cookie header field. That cookie is a key-value pair: the key is JSESSIONID and the value is the random session id. When the response reaches the client (web browser), it stores the cookie value, overwriting any existing one. From then on, the browser attaches the cookie to every request it sends to that server.

HTTP is a stateless protocol, so the server cannot identify the client's session trivially. The server reads the request header, extracts the cookie value (the random session id), and searches through the group of sessions maintained by Tomcat to get the session of that particular client. If the client's cookie value does not match any session in the group, Tomcat creates a completely new session and sends a new cookie to the browser, which updates its stored value. The following index.jsp is deployed to all Tomcat instances.
<%@page import="java.util.ArrayList"%>
<%@page import="java.util.Date"%>
<%@page import="java.util.List"%>
<%@page contentType="text/html" pageEncoding="UTF-8"%>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>JSP Page</title>
<font size="5" color="#0000FF">Instance 1</font><hr>
<font size="5" color="#CC0000">
Session Id : <%=request.getSession().getId()%>
Is it New Session : <%=request.getSession().isNew()%>
Session Creation Date : <%=new Date(request.getSession().getCreationTime())%>
Session Access Date : <%=new Date(request.getSession().getLastAccessedTime())%>
</font>
<b>Cart List </b><hr>
<ul>
<%
String bookName = request.getParameter("bookName");
List<String> listOfBooks = (List<String>) request.getSession().getAttribute("Books");
if (listOfBooks == null) {
    listOfBooks = new ArrayList<String>();
    request.getSession().setAttribute("Books", listOfBooks);
}
if (bookName != null) {
    listOfBooks.add(bookName);
    request.getSession().setAttribute("Books", listOfBooks);
}
for (String book : listOfBooks) {
    out.println("<li>" + book + "</li>");
}
%>
</ul>
<hr>
<form action="index.jsp" method="post">
Book Name <input type="text" name="bookName"><input type="submit" value="Add to Cart">
</form>
<hr>

What is the problem with the simple load balancer? The problem appears once we deploy a web application that uses sessions. Let me describe it as a sequence:

1. The user requests a web page that uses sessions (like a shopping cart). The load balancer intercepts the request and, following round-robin, sends it to one of the Tomcat instances; suppose this time it goes to tomcat1.
2. tomcat1 creates the session and responds to the client with a cookie header. The load balancer just acts as a relay and sends the response back to the client.
3. The user requests the shopping cart page again; this time the browser sends the cookie too. The load balancer intercepts the request and, following round-robin, sends it to tomcat2.
4. tomcat2 receives the request and extracts the session id, but the id does not match any of its managed sessions, because that session exists only in tomcat1. So tomcat2 creates a new session and sends a new cookie to the client.
5. The client receives the response and updates the cookie (overwriting the old one), then requests the page once more, sending the new cookie.
6. The load balancer, again following round-robin, sends the request to tomcat3. tomcat3 cannot match the session id either (the session is in tomcat2), so it too creates a new session and sends a new cookie to the client.
7. The client updates the cookie again and sends another request, which round-robin delivers to tomcat1. The session id does not match tomcat1's sessions, because the client's id was last updated by tomcat3; even though tomcat1 holds a session created for this client, the client's session id is now wrong, so tomcat1 creates yet another new session and cookie (for more info watch the video below).
8. This sequence continues...

As a result, a new session is created on every request instead of continuing with the old one. The root cause here is the load balancer: if it redirected each request to the right instance, the problem would be fixed. But how can the load balancer know in advance which Tomcat instance serves this client, before the request is processed by a particular instance? HTTP is a stateless protocol, so HTTP itself does not help.
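The lookup-miss behaviour driving this sequence can be modelled in a few lines of plain Java. This is only a toy sketch (the class name, the id generation and the map layout are mine, not Tomcat's; real Tomcat ids come from a secure random source), but it shows why an id issued by one instance means nothing to another:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Toy model of one Tomcat instance's "group of sessions": a map from session id
// to session attributes. On a miss (e.g. an id issued by another instance) a
// brand-new session and id are created -- exactly the failure mode above.
public class SessionStore {

    private final Map<String, Map<String, Object>> sessions = new HashMap<>();

    /** Returns the existing session id, or a freshly created one if the id is unknown. */
    public String resolve(String cookieSessionId) {
        if (cookieSessionId != null && sessions.containsKey(cookieSessionId)) {
            return cookieSessionId; // same instance that issued the cookie: session found
        }
        String freshId = UUID.randomUUID().toString().replace("-", "").toUpperCase();
        sessions.put(freshId, new HashMap<String, Object>());
        return freshId; // a new JSESSIONID cookie goes back to the browser
    }

    public static void main(String[] args) {
        SessionStore tomcat1 = new SessionStore();
        SessionStore tomcat2 = new SessionStore();
        String id = tomcat1.resolve(null);                  // first request: tomcat1 creates a session
        System.out.println(id.equals(tomcat1.resolve(id))); // true: tomcat1 finds it again
        System.out.println(id.equals(tomcat2.resolve(id))); // false: tomcat2 mints a fresh id
    }
}
```

The instance that issued the id finds the session; any other instance does not and mints a fresh one, which is precisely why round-robin keeps resetting the cart.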
The other piece of information is the JSESSIONID cookie. It is useful, but it is just a random value, so we cannot make a routing decision based on it. Example:

Cookie: JSESSIONID=40025608F7B50E42DFA2785329079227

Session affinity / sticky sessions
Session affinity overrides the load-balancing algorithm by directing all requests in a session to a specific Tomcat server. So once we set up session affinity, our problem is solved. But how do we set it up, given that the session value is random? We need to generate the session value in a way that identifies which Tomcat instance generated the response.

jvmRoute
The Tomcat configuration file (server.xml) contains an <Engine> tag that has a jvmRoute attribute for this purpose. Edit the config file and update the <Engine> tag like this:

<Engine name='Catalina' defaultHost='localhost' jvmRoute='tomcat1'>

Here we set jvmRoute='tomcat1', where tomcat1 is the worker name of this Tomcat instance (check the workers.properties file from the last post). Add this attribute to every Tomcat instance's conf/server.xml file and restart the instances. Now all Tomcat instances generate session ids with the pattern <random value as before>.<jvmRoute value>, for example:

Cookie: JSESSIONID=40025608F7B50E42DFA2785329079227.tomcat1

At the end of the value we can see which Tomcat instance generated this particular session, so the load balancer can easily find out where to delegate the request; in this case it is tomcat1. So update every instance's conf/server.xml, setting the jvmRoute attribute to the appropriate worker name, and restart the instances. That fixes the problem, and the load balancer now works correctly even for session-based applications.

There is still one drawback, though. Say 5 users are accessing the website with session affinity set up: tomcat1 serves 2 users, tomcat2 serves 2 users, and tomcat3 serves 1 user. What happens if one of the instances suddenly fails?
Suppose instance 1 (tomcat1) fails: those 2 users lose their sessions. Their requests are redirected to one of the remaining Tomcat instances (tomcat2, tomcat3), so they can still access the web pages, but their previous sessions are lost. This is one of the drawbacks; still, compared to the load balancer from the last post, this one at least works for session-based web applications. In the next post we will see how to set up session replication in the load balancer.

Video: http://www.youtube.com/watch?feature=player_embedded&v=-9C2ZtdAAFY

Reference: Tomcat Clustering Series Part 2 : Session Affinity Load Balancer from our JCG partner Rama Krishnan at the Ramki Java Blog blog.

Polling an http end point using Spring Integration

It is a little non-intuitive if you want to write a Spring Integration flow which polls an http endpoint and gathers some content from it for further processing. Spring Integration provides a couple of components for integrating with an HTTP endpoint:

- Http Outbound adapter – to send messages to an http endpoint
- Http Outbound gateway – to send messages to an http endpoint and to collect the response as a message

My first instinct for polling the http endpoint was to use an Http Inbound channel adapter. The wrong assumption I made was that the adapter would be responsible for getting the information from the endpoint; what the Http Inbound components actually do is expose an HTTP endpoint and wait for requests to come in! This is why I started by saying that it was a little non-intuitive to me that, to poll a URL and collect content from it, I would actually have to use an Http Outbound gateway.

With this clarified, consider an example where I want to poll the USGS earthquake information feed available at this url – http://earthquake.usgs.gov/earthquakes/feed/geojson/all/hour. This is what my sample http outbound component looks like:

<int:channel id='quakeinfo.channel'>
    <int:queue capacity='10'/>
</int:channel>

<int:channel id='quakeinfotrigger.channel'></int:channel>

<int-http:outbound-gateway id='quakerHttpGateway'
    request-channel='quakeinfotrigger.channel'
    url='http://earthquake.usgs.gov/earthquakes/feed/geojson/all/hour'
    http-method='GET'
    expected-response-type='java.lang.String'
    charset='UTF-8'
    reply-timeout='5000'
    reply-channel='quakeinfo.channel'>
</int-http:outbound-gateway>

Here the http outbound gateway waits for messages to arrive on the quakeinfotrigger channel, sends a GET request to the 'http://earthquake.usgs.gov/earthquakes/feed/geojson/all/hour' url, and places the response json string on the 'quakeinfo.channel' channel. Testing this is easy:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration('httpgateway.xml')
public class TestHttpOutboundGateway {

    @Autowired
    @Qualifier('quakeinfo.channel')
    PollableChannel quakeinfoChannel;

    @Autowired
    @Qualifier('quakeinfotrigger.channel')
    MessageChannel quakeinfoTriggerChannel;

    @Test
    public void testHttpOutbound() {
        quakeinfoTriggerChannel.send(MessageBuilder.withPayload('').build());
        Message<?> message = quakeinfoChannel.receive();
        assertThat(message.getPayload(), is(notNullValue()));
    }
}

What I am doing here is getting a reference to the channel which triggers the outbound gateway to send a message to the http endpoint, and a reference to another channel where the response from the http endpoint is placed. I trigger the test flow by placing a dummy empty message on the trigger channel, then wait for a message to become available on the response channel and assert on its contents. This works cleanly; however, my original intent was to write a poller which would trigger polling of this endpoint once every minute or so. To do that, I essentially have to place a dummy message on the 'quakeinfotrigger.channel' channel every minute, which is easily accomplished using a Spring Integration 'poller' and a bit of Spring Expression Language:

<int:inbound-channel-adapter channel='quakeinfotrigger.channel' expression="''">
    <int:poller fixed-delay='60000'></int:poller>
</int:inbound-channel-adapter>

Here I have a Spring inbound-channel-adapter attached to a poller, with the poller triggering an empty message every minute. All this looks a little convoluted but works nicely – here is a gist with the working code.

Related links: based on a question I had posed at the Spring forum http://forum.springsource.org/showthread.php?130711-Need-help-with-polling-to-a-json-based-HTTP-service

Reference: Polling an http end point using Spring Integration from our JCG partner Biju Kunjummen at the all and sundry blog.

Chain of responsibility using Spring @Autowired List

Spring 3.1 can auto-populate a typed List, which is very handy when you want to push decoupling and cleanliness a bit further in your code. To show you how it works, I will implement a simple chain of responsibility that prints some greetings for a given User. Let's start with the (only) domain class we have, the User:

package com.marco.springchain;

public class User {

    private final String name;
    private final char gender;

    public User(String name, char gender) {
        super();
        this.name = name;
        this.gender = gender;
    }

    public String getName() {
        return name;
    }

    public char getGender() {
        return gender;
    }
}

Then we create an interface that defines the type for the command objects used in our chain:

package com.marco.springchain;

public interface Printer {
    void print(User user);
}

This is the generic class (the template) for a Printer implementation. org.springframework.core.Ordered is used to tell the AnnotationAwareOrderComparator how we want our List to be ordered. You don't need to implement the Ordered interface and override the getOrder method if you don't need your chain to have an execution order. Also notice that this abstract class returns Ordered.LOWEST_PRECEDENCE, because I want some Printer commands to just run at the end of the chain without caring about their relative order (everything will be clearer soon, I promise!).

package com.marco.springchain;

import org.springframework.core.Ordered;

public abstract class GenericPrinter implements Printer, Ordered {

    public void print(User user) {
        String prefix = "Mr";
        if (user.getGender() == 'F') {
            prefix = "Mrs";
        }
        System.out.println(getGreeting() + " " + prefix + " " + user.getName());
    }

    protected abstract String getGreeting();

    public int getOrder() {
        return Ordered.LOWEST_PRECEDENCE;
    }
}

This is our first real Printer command. I want it to have absolute precedence in the chain, hence the order is HIGHEST_PRECEDENCE.
package com.marco.springchain;

import org.springframework.core.Ordered;
import org.springframework.stereotype.Component;

@Component
public class HelloPrinter extends GenericPrinter {

    private static final String GREETING = "Hello";

    @Override
    protected String getGreeting() {
        return GREETING;
    }

    @Override
    public int getOrder() {
        return Ordered.HIGHEST_PRECEDENCE;
    }
}

WelcomePrinter, to be executed as the first command (after the high-precedence ones):

package com.marco.springchain;

import org.springframework.stereotype.Component;

@Component
public class WelcomePrinter extends GenericPrinter {

    private static final String GREETING = "Welcome to the autowired chain";

    @Override
    protected String getGreeting() {
        return GREETING;
    }

    @Override
    public int getOrder() {
        return 1;
    }
}

GoodbyePrinter, to be executed as the second command:

package com.marco.springchain;

import org.springframework.stereotype.Component;

@Component
public class GoodbyePrinter extends GenericPrinter {

    private static final String GREETING = "Goodbye";

    @Override
    protected String getGreeting() {
        return GREETING;
    }

    @Override
    public int getOrder() {
        return 2;
    }
}

These two commands need to be executed after the others, but I don't care about their specific order, so I do not override getOrder, leaving GenericPrinter to return Ordered.LOWEST_PRECEDENCE for both.

package com.marco.springchain;

import org.springframework.stereotype.Component;

@Component
public class CleaningMemoryPrinter extends GenericPrinter {

    private static final String GREETING = "Cleaning memory after";

    @Override
    protected String getGreeting() {
        return GREETING;
    }
}

package com.marco.springchain;

import org.springframework.stereotype.Component;

@Component
public class CleaningSpacePrinter extends GenericPrinter {

    private static final String GREETING = "Cleaning space after";

    @Override
    protected String getGreeting() {
        return GREETING;
    }
}

This is the chain context.
Spring will scan the package specified in the config file (see spring-config.xml); it will see the typed list (List<Printer>) and populate it with an instance of every @Component that implements the Printer type. To order the List we use AnnotationAwareOrderComparator.INSTANCE, which uses the getOrder method to re-order the List (the object with the lowest value has the highest priority, somewhat analogous to Servlet 'load-on-startup' values).

package com.marco.springchain;

import java.util.Collections;
import java.util.List;

import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.annotation.AnnotationAwareOrderComparator;
import org.springframework.stereotype.Component;

@Component
public class PrinterChain {

    @Autowired
    private List<Printer> printers;

    @PostConstruct
    public void init() {
        Collections.sort(printers, AnnotationAwareOrderComparator.INSTANCE);
    }

    public void introduceUser(User user) {
        for (Printer printer : printers) {
            printer.print(user);
        }
    }
}

The spring-config.xml in src/main/resources:
<?xml version='1.0' encoding='UTF-8'?>
<beans xmlns='http://www.springframework.org/schema/beans'
    xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
    xmlns:aop='http://www.springframework.org/schema/aop'
    xmlns:tx='http://www.springframework.org/schema/tx'
    xmlns:context='http://www.springframework.org/schema/context'
    xmlns:util='http://www.springframework.org/schema/util'
    xsi:schemaLocation='
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
        http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-2.5.xsd
        http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-2.5.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-2.5.xsd
        http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util-2.5.xsd'
    default-lazy-init='true'>

    <context:component-scan base-package='com.marco.springchain'/>
</beans>

Finally, a main class to test our chain.

package com.marco.springchain;

import org.springframework.context.support.ClassPathXmlApplicationContext;

public class MainTest {

    public static void main(String[] args) {
        ClassPathXmlApplicationContext context = new ClassPathXmlApplicationContext("spring-config.xml");
        PrinterChain printerChain = (PrinterChain) context.getBean("printerChain");
        printerChain.introduceUser(new User("Marco Castigliego", 'M'));
        printerChain.introduceUser(new User("Julie Marot", 'F'));
    }
}

OUTPUT:

Hello Mr Marco Castigliego
Welcome to the autowired chain Mr Marco Castigliego
Goodbye Mr Marco Castigliego
Cleaning space after Mr Marco Castigliego
Cleaning memory after Mr Marco Castigliego
Hello Mrs Julie Marot
Welcome to the autowired chain Mrs Julie Marot
Goodbye Mrs Julie Marot
Cleaning space after Mrs Julie Marot
Cleaning memory after Mrs Julie Marot

Hope you enjoyed the example.
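If you want to see the ordering contract without booting a Spring context, the same chain can be wired by hand. This sketch (class and method names are mine, not from the article) reproduces what PrinterChain's @PostConstruct sort does: commands expose getOrder(), the chain sorts ascending, and the lowest value runs first:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hand-wired chain of responsibility using the same lowest-value-first contract
// that Ordered/AnnotationAwareOrderComparator apply to the @Autowired List above.
public class ManualChain {

    interface Command {
        int getOrder();
        String apply(String name);
    }

    /** Sorts by getOrder() ascending and runs every command, collecting the output. */
    static String run(List<Command> commands, String name) {
        List<Command> sorted = new ArrayList<>(commands);
        sorted.sort(Comparator.comparingInt(Command::getOrder)); // lowest order first
        StringBuilder out = new StringBuilder();
        for (Command c : sorted) {
            out.append(c.apply(name)).append('\n');
        }
        return out.toString();
    }

    static Command command(final int order, final String greeting) {
        return new Command() {
            public int getOrder() { return order; }
            public String apply(String name) { return greeting + " " + name; }
        };
    }

    public static void main(String[] args) {
        List<Command> chain = new ArrayList<>();
        chain.add(command(2, "Goodbye"));                        // registered out of order
        chain.add(command(Integer.MAX_VALUE, "Cleaning after")); // LOWEST_PRECEDENCE analogue
        chain.add(command(1, "Welcome"));
        System.out.print(run(chain, "Marco")); // Welcome, Goodbye, Cleaning after -- in that order
    }
}
```

Registration order does not matter; only the order values do, which is exactly why the Spring version can leave the cleanup printers at Ordered.LOWEST_PRECEDENCE without overriding getOrder.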
Reference: Chain of responsibility using Spring @Autowired List from our JCG partner Marco Castigliego at the Remove duplication and fix bad names blog.

IBM AIX: Java process size monitoring

This article provides a quick reference guide on how to calculate the Java process size memory footprint for JVM processes running on IBM AIX 5.3+. It is a complementary post to my original article on this subject: how to monitor the Java native memory on AIX. I highly recommend that read to any individual involved in production support or development of Java applications deployed on AIX.

Why is this knowledge important? From my perspective, basic knowledge of how the OS manages the memory allocation of your JVM processes is very important. We often overlook this monitoring aspect and focus only on the Java heap itself. In my experience, most Java memory related problems are observed in the Java heap itself, such as garbage collection problems, leaks, etc. However, I'm confident that you will face situations involving native memory problems or OS memory challenges. Proper knowledge of your OS and its virtual memory management is crucial for proper root cause analysis, recommendations and solutions.

AIX memory vs. pages: as you may have seen in my earlier post, the AIX Virtual Memory Manager (VMM) is responsible for managing memory requests from the system and its applications. The physical memory is partitioned into units called pages, allocated either in physical RAM or stored on disk until needed. Each page can have a size of 4 KB (small page), 64 KB (medium page) or 16 MB (large page). Typically, for a 64-bit Java process you will see a mix of all of the above.

What about the topas command? The typical reflex when supporting applications on AIX is to run the topas command, similar to Solaris top (a sample screenshot from AIX 5.3 is omitted here). As you can see, the topas command is not very helpful for getting a clear view of memory utilization, since it does not provide the breakdown view that we need for our analysis.
It is still useful for getting a rough idea of the paging space utilization, which can quickly reveal your top 'paging space' consumer processes. The same can be achieved via the ps aux command.

An AIX command to the rescue: svmon. The AIX svmon command is by far my preferred command to deep dive into the Java process memory utilization. It is a very powerful command, similar to Solaris pmap. It allows you to monitor the current memory page allocation along with each segment, e.g. Java heap vs. native heap segments. Analyzing the svmon output will allow you to calculate the memory footprint for each page size (4 KB, 64 KB, and 16 MB). Now find below a real example which will allow you to understand how the calculation is done:

# 64-bit JVM with -Xms2048m & -Xmx2048m (2 GB Java Heap)
# Command: svmon -P <Java PID>

(the svmon output screenshot is omitted here)

As you can see, the total footprint of our Java process was found to be 2.2 GB, which is aligned with the current Java heap settings. You should be able to easily perform the same memory footprint analysis in your own AIX environment. I hope this article has helped you understand how to calculate the Java process size on AIX. Please feel free to post any comment or question.

Reference: IBM AIX: Java process size monitoring from our JCG partner Pierre-Hugues Charbonneau at the Java EE Support Patterns & Java Tutorial blog.
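Since the svmon screenshots are not reproduced here, a small worked example may help: the footprint calculation described above is just page count times page size, summed over the three page sizes. The counts below are hypothetical, not taken from the article's output:

```java
// The arithmetic behind the svmon analysis: footprint = sum over page sizes of
// (page count x page size). The counts in main are invented for illustration;
// take real ones from the `svmon -P <pid>` output of your own JVM process.
public class SvmonFootprint {

    static final long KB = 1024L;

    /** Total bytes for a mix of 4 KB, 64 KB and 16 MB pages. */
    static long footprintBytes(long smallPages, long mediumPages, long largePages) {
        return smallPages * 4 * KB        // 4 KB small pages
             + mediumPages * 64 * KB      // 64 KB medium pages
             + largePages * 16 * KB * KB; // 16 MB large pages
    }

    public static void main(String[] args) {
        // Hypothetical 64-bit JVM with a ~2 GB heap backed mostly by 16 MB large pages.
        long bytes = footprintBytes(50_000, 2_000, 128);
        System.out.printf("total footprint: %.1f GB%n", bytes / (1024.0 * KB * KB));
    }
}
```

With these made-up counts the total comes out around 2.3 GB; the point is simply that once you have the per-size page counts from svmon, the footprint is one multiply-and-add away.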

Java Annotations Tutorial with Custom Annotation

Java annotations provide information about the code and have no direct effect on the code they annotate. In this tutorial we will learn about Java annotations, how to write a custom annotation, annotation usage, and how to parse annotations using reflection. Annotations were introduced in Java 1.5 and are now heavily used in Java frameworks like Hibernate, Jersey and Spring. An annotation is metadata about the program embedded in the program itself. It can be parsed by an annotation parsing tool or by the compiler, and we can specify whether an annotation is available at compile time only or also at runtime.

Before annotations, program metadata was available through Java comments or javadoc, but annotations offer more than that: the metadata can be made available at runtime, and annotation parsers can use it to determine the process flow. For example, in a Jersey web service we add the PATH annotation with a URI string to a method, and at runtime Jersey parses it to determine the method to invoke for a given URI pattern.

Creating custom annotations in Java: creating a custom annotation is similar to writing an interface, except that the interface keyword is prefixed with the @ symbol, and we can declare methods in the annotation. Let's look at an annotation example; then we will discuss its features.

package com.journaldev.annotations;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Documented
@Target(ElementType.METHOD)
@Inherited
@Retention(RetentionPolicy.RUNTIME)
public @interface MethodInfo {
    String author() default "Pankaj";
    String date();
    int revision() default 1;
    String comments();
}

Annotation methods can't have parameters. Annotation method return types are limited to primitives, String, enums, annotations, or arrays of these. Annotation methods can have default values.
Annotations can have meta-annotations attached to them. Meta-annotations provide information about the annotation itself. There are four meta-annotations:

@Documented – indicates that elements using this annotation should be documented by Javadoc and similar tools. This type should be used to annotate the declarations of types whose annotations affect the use of annotated elements by their clients. If a type declaration is annotated with @Documented, its annotations become part of the public API of the annotated elements.
@Target – indicates the kinds of program element to which an annotation type is applicable. Some possible values are TYPE, METHOD, CONSTRUCTOR and FIELD. If the Target meta-annotation is not present, the annotation can be used on any program element.
@Inherited – indicates that an annotation type is automatically inherited. If a user queries the annotation type on a class declaration and the class declaration has no annotation of this type, then the class's superclass will automatically be queried for the annotation type. This process is repeated until an annotation of this type is found or the top of the class hierarchy (Object) is reached.
@Retention – indicates how long annotations with the annotated type are to be retained. It takes a RetentionPolicy argument whose possible values are SOURCE, CLASS and RUNTIME.

Java Built-in Annotations

Java provides three built-in annotations.

@Override – when we want to override a method of a superclass, we should use this annotation to inform the compiler that we are overriding a method. That way, if the superclass method is removed or changed, the compiler will show an error message.
@Deprecated – when we want the compiler to know that a method is deprecated, we should use this annotation. Java recommends that the Javadoc explain why the method is deprecated and what the alternative is.
@SuppressWarnings – tells the compiler to ignore specific warnings it produces, for example warnings about using raw types in generics. Its retention policy is SOURCE, so it gets discarded by the compiler.

Let's see a Java example showing the use of the built-in annotations as well as the custom annotation created in the example above.

package com.journaldev.annotations;

import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.List;

public class AnnotationExample {

    public static void main(String[] args) {
    }

    @Override
    @MethodInfo(author = "Pankaj", comments = "Main method", date = "Nov 17 2012", revision = 1)
    public String toString() {
        return "Overridden toString method";
    }

    @Deprecated
    @MethodInfo(comments = "deprecated method", date = "Nov 17 2012")
    public static void oldMethod() {
        System.out.println("old method, don't use it.");
    }

    @SuppressWarnings({ "unchecked", "deprecation" })
    @MethodInfo(author = "Pankaj", comments = "Main method", date = "Nov 17 2012", revision = 10)
    public static void genericsTest() throws FileNotFoundException {
        List l = new ArrayList();
        l.add("abc");
        oldMethod();
    }
}

I believe the example is self-explanatory and shows the use of annotations in different cases.

Java Annotations Parsing

We will use reflection to parse Java annotations from a class. Note that the annotation retention policy must be RUNTIME; otherwise its information will not be available at runtime and we won't be able to fetch any data from it.
package com.journaldev.annotations;

import java.lang.annotation.Annotation;
import java.lang.reflect.Method;

public class AnnotationParsing {

    public static void main(String[] args) {
        try {
            for (Method method : AnnotationParsing.class.getClassLoader()
                    .loadClass("com.journaldev.annotations.AnnotationExample").getMethods()) {
                // checks if MethodInfo annotation is present for the method
                if (method.isAnnotationPresent(com.journaldev.annotations.MethodInfo.class)) {
                    try {
                        // iterates over all the annotations available on the method
                        for (Annotation anno : method.getDeclaredAnnotations()) {
                            System.out.println("Annotation in Method '" + method + "' : " + anno);
                        }
                        MethodInfo methodAnno = method.getAnnotation(MethodInfo.class);
                        if (methodAnno.revision() == 1) {
                            System.out.println("Method with revision no 1 = " + method);
                        }
                    } catch (Throwable ex) {
                        ex.printStackTrace();
                    }
                }
            }
        } catch (SecurityException | ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}

Output of the above program is:

Annotation in Method 'public java.lang.String com.journaldev.annotations.AnnotationExample.toString()' : @com.journaldev.annotations.MethodInfo(author=Pankaj, revision=1, comments=Main method, date=Nov 17 2012)
Method with revision no 1 = public java.lang.String com.journaldev.annotations.AnnotationExample.toString()
Annotation in Method 'public static void com.journaldev.annotations.AnnotationExample.oldMethod()' : @java.lang.Deprecated()
Annotation in Method 'public static void com.journaldev.annotations.AnnotationExample.oldMethod()' : @com.journaldev.annotations.MethodInfo(author=Pankaj, revision=1, comments=deprecated method, date=Nov 17 2012)
Method with revision no 1 = public static void com.journaldev.annotations.AnnotationExample.oldMethod()
Annotation in Method 'public static void com.journaldev.annotations.AnnotationExample.genericsTest() throws java.io.FileNotFoundException' : @com.journaldev.annotations.MethodInfo(author=Pankaj, revision=10, comments=Main method, date=Nov 17 2012)
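The point about RetentionPolicy.RUNTIME is easy to demonstrate in isolation. The following self-contained sketch (the annotation and class names are invented for illustration, not part of the article's example) defines two annotations with different retention policies and shows that only the RUNTIME-retained one is visible through reflection:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class RetentionDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @interface VisibleAtRuntime { }

    @Retention(RetentionPolicy.CLASS)
    @interface CompiledButInvisible { }

    // both annotations are applied, but only one survives into the runtime
    @VisibleAtRuntime
    @CompiledButInvisible
    static class Annotated { }

    public static void main(String[] args) {
        System.out.println(Annotated.class.isAnnotationPresent(VisibleAtRuntime.class));      // true
        System.out.println(Annotated.class.isAnnotationPresent(CompiledButInvisible.class));  // false
    }
}
```

The CLASS-retained annotation is present in the compiled .class file but is never loaded into the VM, so reflection cannot see it.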
That's all for the Java annotation tutorial; I hope you learned something from it. Reference: Java Annotations Tutorial with Custom Annotation Example and Parsing using Reflection from our JCG partner Pankaj Kumar at the Developer Recipes blog.

Google Guava Concurrency – ListenableFuture

In my last post I covered using the Monitor class from the com.google.common.util.concurrent package in the Guava library. In this post I am going to continue my coverage of Guava concurrency utilities and discuss the ListenableFuture interface. A ListenableFuture extends the Future interface from the java.util.concurrent package by adding a method that accepts a completion listener.

ListenableFuture

A ListenableFuture behaves in exactly the same manner as a java.util.concurrent.Future, but has the method addListener(Runnable, Executor) that executes the listener in the given executor. Here is an example:

ListenableFuture<String> futureTask = executorService.submit(callableTask);
futureTask.addListener(new Runnable() {
    @Override
    public void run() {
        // work after futureTask completed
    }
}, executorService);

If the submitted task has completed by the time you add the listener, it will run immediately. Using the addListener method has a drawback in that the Runnable does not have access to the result produced by the future. For access to the result of the Future you would need to use a FutureCallback.

FutureCallback

A FutureCallback accepts the result produced by the Future and specifies onSuccess and onFailure methods. Here is an example:

class FutureCallbackImpl implements FutureCallback<String> {

    @Override
    public void onSuccess(String result) {
        // work with result
    }

    @Override
    public void onFailure(Throwable t) {
        // handle exception
    }
}

A FutureCallback is attached by using the addCallback method in the Futures class:

Futures.addCallback(futureTask, futureCallbackImpl);

At this point you may be asking how you get an instance of ListenableFuture when an ExecutorService only returns Futures. The answer is to use a ListeningExecutorService.
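Guava's real implementation aside, the mechanics of a completion listener can be sketched with only the JDK. This toy class (the name and design are my own, purely illustrative) overrides FutureTask.done(), the hook FutureTask invokes on completion, to hand listeners to their executors:

```java
import java.util.AbstractMap.SimpleImmutableEntry;
import java.util.List;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executor;
import java.util.concurrent.FutureTask;

// Toy JDK-only illustration of the listenable-future idea; this is NOT
// Guava's implementation. FutureTask calls done() when the task completes,
// which gives us a place to notify registered listeners.
public class ToyListenableFuture<V> extends FutureTask<V> {

    private final List<Map.Entry<Runnable, Executor>> listeners =
            new CopyOnWriteArrayList<>();
    private boolean completed = false;

    public ToyListenableFuture(Callable<V> callable) {
        super(callable);
    }

    // Runs the listener on the given executor once this future completes;
    // if the future is already done, the listener fires immediately.
    public synchronized void addListener(Runnable listener, Executor executor) {
        if (completed) {
            executor.execute(listener);
        } else {
            listeners.add(new SimpleImmutableEntry<>(listener, executor));
        }
    }

    @Override
    protected synchronized void done() {  // invoked by FutureTask on completion
        completed = true;
        for (Map.Entry<Runnable, Executor> e : listeners) {
            e.getValue().execute(e.getKey());
        }
    }

    public static void main(String[] args) throws Exception {
        ToyListenableFuture<String> f = new ToyListenableFuture<>(() -> "hello");
        // the Runnable has no direct access to the result, but it may call
        // f.get(), which cannot block here because f is already done
        f.addListener(() -> System.out.println("done"), Runnable::run);
        f.run();                          // run on the current thread for simplicity
        System.out.println(f.get());      // prints "hello"
    }
}
```

Note how the "run immediately if already completed" behavior described above falls out naturally from the completed flag.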
ListeningExecutorService

To get a ListeningExecutorService, simply decorate an ExecutorService instance with a call to MoreExecutors.listeningDecorator(ExecutorService), for example:

ListeningExecutorService executorService = MoreExecutors.listeningDecorator(Executors.newCachedThreadPool());

Conclusion

With the ability to add a callback, whether a Runnable or a FutureCallback that handles success and failure conditions, the ListenableFuture could be a valuable addition to your arsenal. I have created a unit test demonstrating the use of ListenableFuture, available as a gist. In my next post I am going to cover the Futures class, which contains static methods for working with futures.

Resources: Guava Project Home, ListenableFuture API, Sample Code

Reference: Google Guava Concurrency – ListenableFuture from our JCG partner Bill Bejeck at the Random Thoughts On Coding blog.

Permissions in OSGi

In a previous post, we looked at implementing a sandbox for Java applications in which we can securely run mobile code. This post looks at how to do the same in an OSGi environment.

OSGi

The OSGi specification defines a dynamic module system for Java. As such, it's a perfect candidate for implementing the kind of plugin system that would enable your application to dynamically add mobile code. Security in OSGi builds on the Java 2 security architecture that we discussed earlier, so you can re-use your knowledge about code signing, etc. OSGi goes a couple of steps further, however.

Revoking Permissions

One of the weaknesses in the Java permissions model is that you can only explicitly grant permissions, not revoke them. There are many cases where you want to allow everything except a particular special case. There is no way to do that with standard Java permissions, but luckily OSGi introduces a solution. The downside is that OSGi introduces its own syntax for specifying policies. The following example shows how to deny PackagePermission for subpackages of com.acme.secret:

DENY {
    ( ..PackagePermission "com.acme.secret.*" "import,exportonly" )
} "denyExample"

(In this and the following examples, I give the simple name of permission classes instead of the fully qualified name, and hint at that by prefixing the simple name with "..".) PackagePermission is a permission defined by OSGi for authorization of package imports and exports. Your application could use a policy like this to make sure that mobile code can't call the classes in a given package, for instance to limit direct access to the database.

Extensible Conditions on Permissions

The second improvement that OSGi brings is that the conditions under which a permission is granted can be dynamically evaluated at runtime.
The following example shows how to conditionally grant ServicePermission: ALLOW { [ ..BundleSignerCondition "* ; o=ACME" ] ( ..ServicePermission "..ManagedService" "register" ) } "conditionalExample" ServicePermission is an OSGi defined permission that restricts access to OSGi services. The condition is the part between square brackets. OSGi defines two conditions, which correspond to the signedBy and codeBase constructs in regular Java policies. You can also define your own conditions. The specification gives detailed instructions on implementing conditions, especially with regard to performance. Different Types of Permissions The final innovation that OSGi brings to the Java permissions model, is that there are different types of permissions. Bundles can specify their own permissions. This doesn’t mean that bundles can grant themselves permissions, but rather that they can specify the maximum privileges that they need to function. These permissions are called local permissions. The OSGi framework ensures that the bundle will never have more permissions than the local permissions, thus implementing the principle of least privilege. Actually, that statement is not entirely accurate. Every bundle will have certain permissions that they need to function in an OSGi environment, like being able to read the org.osgi.framework.* system properties. These permissions are called implicit permissions, since every bundle will have them, whether the permissions are explicitly granted to the bundle or not. The final type of permissions are the system permissions. These are the permissions that are granted to the bundle.The effective permissions are the set of permissions that are checked at runtime: effective = (local ∩ system) ∪ implicit Local permissions enable auditing. Before installing a bundle into your OSGi environment, you can inspect the Bundle Permission Resource in OSGI-INF/permissions.perm to see what permissions the bundle requires. 
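The effective-permissions formula is just set algebra, so it can be modelled directly with plain Java sets. In this toy sketch the permission names are invented strings, not real OSGi Permission objects:

```java
import java.util.HashSet;
import java.util.Set;

// Toy model of the OSGi effective-permission formula:
//   effective = (local ∩ system) ∪ implicit
public class EffectivePermissions {

    static Set<String> effective(Set<String> local, Set<String> system, Set<String> implicit) {
        Set<String> result = new HashSet<>(local);
        result.retainAll(system);   // local ∩ system
        result.addAll(implicit);    // ∪ implicit
        return result;
    }

    public static void main(String[] args) {
        Set<String> local = Set.of("read-config", "register-service", "open-socket");
        Set<String> system = Set.of("read-config", "register-service"); // socket never granted
        Set<String> implicit = Set.of("read-framework-properties");

        // "open-socket" is dropped because the system permissions do not grant it,
        // while the implicit permission is always present
        System.out.println(effective(local, system, implicit));
    }
}
```

The model makes the least-privilege guarantee visible: a bundle's local permissions can only shrink what the system grants, never extend it, while implicit permissions are added unconditionally.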
If you are not comfortable with granting the bundle these permissions, you can decide not to install the bundle. The point is that you can know all of this without running the bundle and without having access to its source code.

Integration into the Java Permissions Model

The OSGi framework integrates its extended permissions model into the standard Java permissions model by subclassing ProtectionDomain. Each bundle gets a BundleProtectionDomainImpl for this purpose. This approach allows OSGi to tap into the standard Java permissions model that you have come to know, so you can re-use most of your skills in this area. The only thing you'll have to re-learn is how to write policies.

Comparison of Permission Models

To put the OSGi permission model into perspective, consider the following comparison table, which uses terminology from the XACML specification:

                      Standard Java        OSGi
Effects               permit               permit, deny
Target, Condition     codeBase, signedBy   codeBase, signedBy, custom conditions
Combining Algorithms  first-applicable     first-applicable, local/system/implicit

From this table you can see that the OSGi model is quite a bit more expressive than the standard Java permission model, although not as expressive as XACML.

Reference: Permissions in OSGi from our JCG partner Remon Sinnema at the Secure Software Development blog.

Using Jasper Reports to create reports in Java

Last week I was trying to create a report using Jasper. In this post I will document some of the resources and links so that they will be useful to anyone looking for similar information. I will cover the life cycle of Jasper reports, examples and DynamicJasper. JasperReports is the world's most popular open source reporting engine. It is entirely written in Java, is able to use data coming from any kind of data source, and produces pixel-perfect documents that can be viewed, printed or exported in a variety of document formats including HTML, PDF, Excel, OpenOffice and Word.

JasperReports Life Cycle

As shown in the image, the life cycle has 3 distinct phases.

1. Designing the report. This step involves the creation of the JRXML file, which is an XML document that contains the definition of the report layout. We can use either the iReport Designer or a text editor to create it manually. Using iReport Designer, the layout is designed in a completely visual way, so you can ignore the real structure of the JRXML file. Here is a detailed tutorial on designing a report using iReport. We can also use DynamicJasper, described later in the article, to design a report.

2. Executing the report. Before executing a report, the JRXML must be compiled into a binary object called a Jasper file (*.jasper). This compilation is done for performance reasons. Jasper files are what you need to ship with your application in order to run the reports. Once the report is compiled, it is filled with data from the application. The class net.sf.jasperreports.engine.JasperFillManager provides the necessary functions to fill the data into the reports. The report execution is performed by passing a Jasper file and a data source to JasperReports.
There are plenty of types of data sources; it's possible to fill a Jasper file from an SQL query, an XML file, a CSV file, an HQL (Hibernate Query Language) query, a collection of Java Beans, etc. If you don't find a suitable data source, JasperReports is very flexible and allows you to write your own custom data source.

JasperFillManager.fillReportToFile("MasterReport.jasper", parameters, getDataSource());

This operation creates a Jasper print file (*.jrprint), which is used to either print or export the report.

3. Exporting to the desired format. Using the Jasper print file created in the previous step, we are able to export it into any format using JasperExportManager. Jasper provides various forms of export, which means that with the same input we can create multiple representations of the data. Jasper internally uses different APIs to create the documents, but this complexity is hidden by the simpler JasperExportManager.

JasperExportManager.exportReportToPdfFile("MasterReport.jrprint");

In a nutshell, the life cycle can be summarized in the below image.

References and other good articles on Jasper Reports: Life Cycle, Jasper Library Wiki, Jasper Reports Wiki, Jasper Reports in Ramki Java Blog, JasperReport – Open Source Java Reporting Framework.

Examples

I found it really hard to find a working example of a Jasper report, but it is right there inside the package shipment! Once you have downloaded the Jasper library, go to demo\samples, where you will find a lot of sample programs. Many of these need a working HSQL DB connection; to activate it, go to demo\hsqldb and start the server. Every folder has a readme.txt file which will help you understand how to run it. All the examples can be executed using ant tasks.
Here is a list of a few other sources: Samples from the Jasper Library, Java Reporting With Jasper Reports – Part 2, Jasper Reports – Example, Spring MVC 3.1 and JasperReports.

Simplify report creation using DynamicJasper

DynamicJasper (DJ) is a free open source library that hides the complexity of Jasper Reports; it helps developers save time when designing simple to medium complexity reports by generating the layout of the report elements automatically. The project homepage provides lots of examples and code snippets on how to use the library. I have been using it for some time and it is a pretty stable replacement for the JRXML file. When using DynamicJasper, the report design is coded in Java, which means that every time the report is run it is compiled, filled and exported. By using DynamicJasper we are replacing the first step in the above-mentioned Jasper life cycle. Even with DynamicJasper you need the Jasper library and other dependent files. Here are some more examples of DynamicJasper usage: the HOW TO page on DynamicJasper, Spring 3 – DynamicJasper – Hibernate Tutorial: Concatenating a DynamicReport, Spring 3 – DynamicJasper – Hibernate Tutorial: Using Plain List.

Reference: Using Jasper Reports to create reports in Java from our JCG partner Manu PK at The Object Oriented Life blog.

Parallelization of a simple use case explained

Some time ago a friend of mine asked me about the possibilities of speeding up the following process: they are generating some data in two stages, reading from a database and processing the results. Reading takes approximately 70% of the time and processing takes the remaining 30%. Unfortunately they cannot simply load the whole data into memory, so they split reading into much smaller chunks (pages) and process these pages once they are retrieved, interleaving these two stages in a loop. Here is pseudo-code of what they have so far:

public Data loadData(int page) {
    // 70% of time...
}

public void process(Data data) {
    // 30% of time...
}

for (int i = 0; i < MAX; ++i) {
    Data data = loadData(i);
    process(data);
}

His idea of improving the algorithm was to somehow start fetching the next page of data while the current page is still being processed, thus reducing the overall run time of the algorithm. He was correct, but didn't know how to put this into Java code, not being very experienced with the magnificent java.util.concurrent package. This article is targeted at such people, briefly introducing the very basic concepts of concurrent programming in Java, such as thread pools and the Future<T> type. First let's visualize the initial and desired implementations using a Gantt chart:

The second chart represents the solution we are aiming to achieve. The first observation you should make is that the second process finishes earlier, which is good. The second one is: when we are processing the first page (yellow 1), the second page is already being downloaded (green 2). When we begin processing page 2, page 3 has begun downloading. And so on. We will come back to this chart later, once we have a working implementation. Let's put this into code. Threads are the way to achieve background loading of data (green blocks). However, simply starting a thread for each green block is both slow and inconvenient. A thread pool with just a single thread is much more flexible and easier to use.
First let's wrap our call to loadData() in a Callable<Data>:

private class LoadDataTask implements Callable<Data> {

    private final int page;

    private LoadDataTask(int page) {
        this.page = page;
    }

    @Override
    public Data call() throws Exception {
        return loadData(page);
    }
}

Once we have such a class it's easy to feed the thread pool (represented by ExecutorService) and wait for a reply. Here is a full implementation:

ExecutorService executorService = Executors.newSingleThreadExecutor();
Future<Data> next = executorService.submit(new LoadDataTask(0));
for (int i = 0; i < MAX; ++i) {
    Future<Data> current = next;
    if (i + 1 < MAX) {
        next = executorService.submit(new LoadDataTask(i + 1));
    }
    Data data = current.get();  // this can block
    process(data);
}
executorService.shutdownNow();

Executors.newSingleThreadExecutor() basically creates a background thread waiting for tasks to run. We cannot use a bigger pool (with more threads) because then we would risk keeping too much data in memory before it gets processed. For the purpose of the example, assume loading a page (green blocks) takes 700 ms while processing it (yellow blocks) takes 300 ms. At the beginning we submit an initial task to load page 0 (first blue arrow pointing down), so we have to wait the full 700 ms for the first block. However, once the data is available, before we start processing it, we immediately ask for the next page. When we run the second iteration, we don't have to wait the full 700 ms again, because loading the data has already progressed by 300 ms; thus Future.get() only blocks for 400 ms. We repeat this process until we are processing the last page, where of course we don't load a next page because we have already requested all of them, hence the ugly condition inside the loop. It's easy to avoid it by returning a null object from loadData() when the page is out of bounds, but let's leave it for the clarity of the example. This approach is so common in the enterprise that dedicated support was added to both Spring and EJB.
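The loop above can be turned into a runnable, self-contained demo with simulated delays. The page count, timings and String-typed "data" here are invented purely for the demonstration; while page i is being processed, page i + 1 is already loading on the single background thread:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Self-contained sketch of the load-ahead pipeline described in the article.
public class PrefetchPipeline {

    static final int MAX = 4;

    static String loadData(int page) throws InterruptedException {
        Thread.sleep(70);                 // simulate the slow read (~70% of the time)
        return "page-" + page;
    }

    static void process(String data, List<String> sink) throws InterruptedException {
        Thread.sleep(30);                 // simulate the faster processing (~30%)
        sink.add(data);
    }

    static List<String> run() throws Exception {
        List<String> processed = new ArrayList<>();
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            Future<String> next = executor.submit(() -> loadData(0));
            for (int i = 0; i < MAX; i++) {
                Future<String> current = next;
                if (i + 1 < MAX) {
                    final int page = i + 1;
                    next = executor.submit(() -> loadData(page)); // prefetch next page
                }
                process(current.get(), processed);                // get() may block briefly
            }
        } finally {
            executor.shutdownNow();
        }
        return processed;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run());        // pages come out in order
    }
}
```

Because loading is serialized on a single-thread pool and get() preserves ordering, the output is deterministic even though loading and processing overlap in time.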
Let's use Spring as an example. The only thing we have to change is to adjust the return value of loadData() from Data to Future<Data>. Wrapping the result value with AsyncResult is required to compile:

@Async
public Future<Data> loadData(int page) {
    // ...
    return new AsyncResult<Data>(new Data(...));
}

Of course this method is part of some Spring bean (say, dao). The API is now much cleaner:

Future<Data> next = dao.loadData(0);
for (int i = 0; i < MAX; ++i) {
    Future<Data> current = next;
    if (i + 1 < MAX) {
        next = dao.loadData(i + 1);
    }
    Data data = current.get();
    processor.process(data);
}

We no longer have to use Callable or interact with thread pools. Also, bootstrapping Spring was never this simple (so don't tell me that Spring is heavyweight!):

@Configuration
@ComponentScan("com.blogspot.nurkiewicz.async")
@EnableAsync
public class Config implements AsyncConfigurer {

    @Override
    public Executor getAsyncExecutor() {
        return Executors.newSingleThreadExecutor();
    }
}

Technically getAsyncExecutor() is not required, but by default Spring will create a thread pool with 10 threads for @Async methods (and we want only one). Now simply run this somewhere in your code:

ApplicationContext context = new AnnotationConfigApplicationContext(Config.class);

Lesson learnt from this article: don't be afraid of concurrency; it's much simpler than you think, provided that you are using the built-in abstractions and understand them.

Reference: Parallelization of a simple use case explained from our JCG partner Tomasz Nurkiewicz at the Java and neighbourhood blog.
Java Code Geeks and all content copyright © 2010-2014, Exelixis Media Ltd | Terms of Use | Privacy Policy | Contact
All trademarks and registered trademarks appearing on Java Code Geeks are the property of their respective owners.
Java is a trademark or registered trademark of Oracle Corporation in the United States and other countries.
Java Code Geeks is not connected to Oracle Corporation and is not sponsored by Oracle Corporation.