
Hooking into the Jenkins (Hudson) API, Part 1

Which one – Hudson or Jenkins? Both. I started working on this little project a couple of months back using Hudson v1.395 and returned to it after the great divide happened. I took it as an opportunity to see whether there would be any significant problems should I choose to move permanently to Jenkins in the future. There were a couple of hiccups – most notably that the new CLI jar didn't work right out of the box – but overall v1.401 of Jenkins worked as expected after the switch. The good news is that the old version of the CLI jar still works, so this example actually uses a mix of code to get things done. Anyway, the software is great and there's more than enough credit to go around.

The API

Jenkins/Hudson has a handy remote API packed with information about your builds, and it supports a rich set of functionality to control them, and the server in general, remotely. It is possible to trigger builds, copy jobs, stop the server and even install plugins remotely. You have your choice of XML, JSON or Python when interacting with the APIs of the server. And, as the built-in documentation says, you can find the functionality you need on a relative path from the build server URL at "/…/api/", where the "…" portion is the object you'd like to access. Navigating to that path in a browser shows a brief documentation page, and appending the desired format as the last part of the path returns a result. For instance, to load information about the computer running a locally hosted Jenkins server, a GET request on http://localhost:8080/computer/api/json returns the result in JSON format:

{
    "busyExecutors": 0,
    "displayName": "nodes",
    "computer": [
        {
            "idle": true,
            "executors": [ { }, { } ],
            "actions": [],
            "temporarilyOffline": false,
            "loadStatistics": { },
            "displayName": "master",
            "oneOffExecutors": [],
            "manualLaunchAllowed": true,
            "offline": false,
            "launchSupported": true,
            "icon": "computer.png",
            "monitorData": {
                "hudson.node_monitors.ResponseTimeMonitor": { "average": 111 },
                "hudson.node_monitors.ClockMonitor": { "diff": 0 },
                "hudson.node_monitors.TemporarySpaceMonitor": { "size": 58392846336 },
                "hudson.node_monitors.SwapSpaceMonitor": null,
                "hudson.node_monitors.DiskSpaceMonitor": { "size": 58392846336 },
                "hudson.node_monitors.ArchitectureMonitor": "Mac OS X (x86_64)"
            },
            "offlineCause": null,
            "numExecutors": 2,
            "jnlpAgent": false
        }
    ],
    "totalExecutors": 2
}

(In the original post, the same tree is also rendered as a GraphViz diagram.)

This functionality extends out in a tree from the root of the server, and you can gate how much of the tree you load from any particular branch by supplying a 'depth' parameter on your URLs. Be careful how high you specify this variable: testing with a load depth of four against a populous, long-running build server (dozens of builds with thousands of job executions) regularly timed out for me. To give you an idea, the original post includes a very rough visualization of the domain at depth three from the root of the API.

Getting data out of the server is very simple, but the ability to remotely trigger activity on the server is more interesting. In order to trigger a build of a job named 'test', a POST on http://localhost:8080/job/test/build does the job.
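If you want to exercise those two calls without Groovy or HTTPBuilder (which the rest of this article uses), even the plain JDK HTTP classes are enough. The following is a minimal, hypothetical sketch rather than code from the project: it assumes an unsecured Jenkins/Hudson instance on localhost and a job named 'test'; a secured server would additionally need authentication (and, on newer Jenkins versions, a CSRF crumb).

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class JenkinsApiSketch {

    // Fetch JSON for any api path, e.g. "/computer/api/json", with an optional depth parameter.
    static String getJson(String rootUrl, String path, int depth) throws Exception {
        URL url = new URL(rootUrl + path + "?depth=" + depth);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line).append('\n');
            }
        }
        return body.toString();
    }

    // Trigger a build of the named job with an empty POST.
    static int triggerBuild(String rootUrl, String jobName) throws Exception {
        URL url = new URL(rootUrl + "/job/" + jobName + "/build");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.getOutputStream().close();   // empty request body
        return conn.getResponseCode();    // typically a redirect or "created" status, depending on version
    }

    public static void main(String[] args) throws Exception {
        String rootUrl = "http://localhost:8080";
        System.out.println(getJson(rootUrl, "/computer/api/json", 0));
        System.out.println("Build trigger returned HTTP " + triggerBuild(rootUrl, "test"));
    }
}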
Using the available facilities, it's pretty easy to do things like:

load a job's configuration file, modify it and create a new job by POSTing the new config.xml file
move a job from one build machine to another
build up an overview of scheduled builds

The CLI Jar

There's another way to remotely drive build servers in the CLI jar distributed along with the server. This jar provides simple facilities for executing certain commands remotely on the build server. Of note, this enables installing plugins remotely and executing a remote Groovy shell. I incorporated this functionality with a very thin wrapper around the main class exposed by the CLI jar, as shown in the next code sample.

/**
 * Drive the CLI with multiple arguments to execute.
 * Optionally accepts streams for input, output and err, all of which
 * are set by default to System unless otherwise specified.
 * @param rootUrl
 * @param args
 * @param input
 * @param output
 * @param err
 * @return
 */
def runCliCommand(String rootUrl, List<String> args, InputStream input = System.in,
                  OutputStream output = System.out, OutputStream err = System.err) {
    CLI cli = new CLI(rootUrl.toURI().toURL())
    cli.execute(args, input, output, err)
    cli.close()
}

And here's a simple test showing how you can execute a Groovy script to load information about jobs, similar to what you can do from the built-in Groovy script console on the server, which can be found for a locally installed deployment at http://localhost:8080/script.

def 'should be able to query hudson object through a groovy script'() {
    final ByteArrayOutputStream output = new ByteArrayOutputStream()

    when:
    api.runCliCommand(rootUrl, ['groovysh', 'for(item in hudson.model.Hudson.instance.items) { println("job $item.name")}'],
            System.in, output, System.err)

    then:
    println output.toString()
    output.toString().split('\n')[0].startsWith('job')
}

Here are some links to articles about the CLI, if you want to learn more:

Hudson CLI wikidoc
Jenkins CLI wikidoc
A template for PHP jobs on Jenkins
An article from Kohsuke Kawaguchi
A nice tutorial

HTTPBuilder

HTTPBuilder is my tool of choice when programming against an HTTP API nowadays. The usage is very straightforward and I was able to get away with only two methods to support reaching the entire API: one for GET and one for POST. Here's the GET method, sufficient for executing the request, parsing the JSON response, and complete with (albeit naive) error handling.

/**
 * Load info from a particular rootUrl+path, optionally specifying a 'depth' query
 * parameter (default depth = 0)
 *
 * @param rootUrl the base url to access
 * @param path the api path to append to the rootUrl
 * @param depth the depth query parameter to send to the api, defaults to 0
 * @return parsed json (as a map) or xml (as GPathResult)
 */
def get(String rootUrl, String path, int depth = 0) {
    def status
    HTTPBuilder http = new HTTPBuilder(rootUrl)
    http.handler.failure = { resp ->
        println "Unexpected failure on $rootUrl$path: ${resp.statusLine} ${resp.status}"
        status = resp.status
    }

    def info
    http.get(path: path, query: [depth: depth]) { resp, json ->
        info = json
        status = resp.status
    }
    info ?: status
}

Calling this to fetch data is a one-liner, as the only real difference is the 'path' variable used when calling the API.

private final GetRequestSupport requestSupport = new GetRequestSupport()
...
/**
 * Display the job api for a particular Hudson job.
 * @param rootUrl the url for a particular build
 * @return job info in json format
 */
def inspectJob(String rootUrl, int depth = 0) {
    requestSupport.get(rootUrl, API_JSON, depth)
}

Technically, there's nothing here that limits this to JSON only. One of the great things about HTTPBuilder is that it will happily just try to do the right thing with the response. If the data returned is in JSON format, as in these examples, it gets parsed into a JSONObject. If, on the other hand, the data is XML, it gets parsed into a Groovy GPathResult. Both of these are very easily navigable, although the syntax for navigating their object graphs differs.

What can you do with it?

My primary motivation for exploring the API of Hudson/Jenkins was to see how I could make managing multiple servers easier. At present I work daily with four build servers and another handful of slave machines, and support a variety of different version branches. This includes a mix of unit and functional test suites, as well as a continuous deployment job that regularly pushes changes to test machines matching our supported platform matrix, so unfortunately things are not quite as simple as copying a single job when branching. Creating the build infrastructure for new feature branches in an automatic, or at least semi-automatic, fashion is attractive indeed, especially since plans are in the works to expand build automation. For a recent 555 day project, I utilized the API layer to build a Grails app functioning as both a cross-server build radiator and a central facility for server management. This proof of concept is capable of connecting to multiple build servers and visualizing job data as well as specific system configuration, triggering builds, and direct linking to each of the connected servers to allow for drilling down further. The original post includes a couple of mock-ups that pretty much show the picture.

Just a pretty cool app for installing Jenkins

This is only very indirectly related, but I came across a very nice and simple Griffon app, called the Jenkins-Assembler, which simplifies preparing your build server. It presents you with a list of plugins, letting you pick and choose, and then downloads and composes them into a single deployable war.

Enough talking – where's the code???

Source code related to this article is available on github. The tests are more of an exploration of the live API than an actual test of the code in this project. They run against a local server launched using the Gradle Jetty plugin. Continue to Part 2.

Reference: Hooking into the Jenkins (Hudson) API from our JCG partner Kelly Robinson at The Kaptain on … stuff blog.

Hooking into the Jenkins (Hudson) API, Part 2

This post continues from Part 1 of the tutorial. It's been almost a year, but I finally had some time to revisit some code I wrote for interacting with the Jenkins API. I've used parts of this work to help manage a number of Jenkins build servers, mostly in terms of keeping plugins in sync and moving jobs from one machine to another. For this article I'm going to be primarily focusing on the CLI jar functionality and some of the things you can do with it. This has mostly been developed against Jenkins, but I did some light testing with Hudson and it worked there for everything I tried, so the code remains mostly agnostic as to your choice of build server.

The project structure

The code is hosted on Github, and provides a Gradle build which downloads and launches a Jenkins (or Hudson) server locally to execute tests. The server is set to use the Gradle build directory as its working directory, so it can be deleted simply by executing gradle clean. I tried it using both the Jenkins and the Hudson versions of the required libraries and, aside from some quirks between the two CLI implementations, they continue to function very much the same. If you want to try it with Hudson instead of Jenkins, pass in the command flag -Pswitch and the appropriate war and libraries will be used. The project is meant to be run with Gradle 1.0-milestone-8, and comes with a Gradle wrapper for that version. Most of the code remains the same since the original article, but there are some enhancements and changes to deal with the newer versions of Jenkins and Hudson. The library produced by this project is published as a Maven artifact, and later on I'll describe exactly how to get at it. There are also some samples included that demonstrate using that library in Gradle or Maven projects, and in Groovy scripts with Grapes. We're using Groovy 1.8.6, Gradle 1.0-milestone-8 and Maven 3.0.3 to build everything.

Getting more out of the CLI

As an alternative to the API, the CLI jar is a very capable way of interacting with the build server. In addition to a variety of built-in commands, Groovy scripts can be executed remotely, and with a little effort we can easily serialize responses in order to work with data extracted on the server. As an execution environment, the server provides a Groovysh shell and stocks it with imports for the hudson.model package. Also passed into the Binding is the instance of the Jenkins/Hudson singleton object in that package. In these examples I'm using the backwards-compatible Hudson version, since the code is intended to be runnable on either flavor of the server.

The available commands

There's a rich variety of built-in commands, all of which are implemented in the hudson.cli package. Here are the ones that are listed on the CLI page of the running application:

build: Builds a job, and optionally waits until its completion.
cancel-quiet-down: Cancel the effect of the "quiet-down" command.
clear-queue: Clears the build queue
connect-node: Reconnect to a node
copy-job: Copies a job.
create-job: Creates a new job by reading stdin as a configuration XML file.
delete-builds: Deletes build record(s).
delete-job: Deletes a job
delete-node: Deletes a node
disable-job: Disables a job
disconnect-node: Disconnects from a node
enable-job: Enables a job
get-job: Dumps the job definition XML to stdout
groovy: Executes the specified Groovy script.
groovysh: Runs an interactive groovy shell.
help: Lists all the available commands.
install-plugin: Installs a plugin either from a file, an URL, or from update center.
install-tool: Performs automatic tool installation, and print its location to stdout. Can be only called from inside a build.
keep-build: Mark the build to keep the build forever.
list-changes: Dumps the changelog for the specified build(s).
login: Saves the current credential to allow future commands to run without explicit credential information.
logout: Deletes the credential stored with the login command.
mail: Reads stdin and sends that out as an e-mail.
offline-node: Stop using a node for performing builds temporarily, until the next "online-node" command.
online-node: Resume using a node for performing builds, to cancel out the earlier "offline-node" command.
quiet-down: Quiet down Jenkins, in preparation for a restart. Don't start any builds.
reload-configuration: Discard all the loaded data in memory and reload everything from file system. Useful when you modified config files directly on disk.
restart: Restart Jenkins
safe-restart: Safely restart Jenkins
safe-shutdown: Puts Jenkins into the quiet mode, wait for existing builds to be completed, and then shut down Jenkins.
set-build-description: Sets the description of a build.
set-build-display-name: Sets the displayName of a build
set-build-result: Sets the result of the current build. Works only if invoked from within a build.
shutdown: Immediately shuts down Jenkins server
update-job: Updates the job definition XML from stdin. The opposite of the get-job command
version: Outputs the current version.
wait-node-offline: Wait for a node to become offline
wait-node-online: Wait for a node to become online
who-am-i: Reports your credential and permissions

It's not immediately apparent what arguments are required for each, but they almost universally follow a CLI pattern of printing usage details when called with no arguments. For instance, when you call the build command with no arguments, here's what you get back in the error stream:

Argument "JOB" is required

java -jar jenkins-cli.jar build args...
 Starts a build, and optionally waits for a completion.
 Aside from general scripting use, this command can be used to invoke another job from within a build of one job.
 With the -s option, this command changes the exit code based on the outcome of the build (exit code 0 indicates a success.)
 With the -c option, a build will only run if there has been an SCM change.
  JOB : Name of the job to build
  -c  : Check for SCM changes before starting the build, and if there's no change, exit without doing a build
  -p  : Specify the build parameters in the key=value format.
  -s  : Wait until the completion/abortion of the command

Getting data out of the system

All of the interaction with the remote system is handled by streams, and it's pretty easy to craft scripts that will return data in an easily parseable String format using built-in Groovy facilities. In theory, you should be able to marshal more complex objects as well, but let's keep it simple for now. Here's a Groovy script that just extracts all of the job names into a List, calling the Groovy inspect method to quote all values.
@GrabResolver(name = 'glassfish', root = 'http://maven.glassfish.org/content/groups/public/')
@GrabResolver(name = 'github', root = 'http://kellyrob99.github.com/Jenkins-api-tour/repository')
@Grab('org.kar:hudson-api:0.2-SNAPSHOT')
@GrabExclude('org.codehaus.groovy:groovy')
import org.kar.hudson.api.cli.HudsonCliApi

String rootUrl = 'http://localhost:8080'
HudsonCliApi cliApi = new HudsonCliApi()
OutputStream out = new ByteArrayOutputStream()
cliApi.runCliCommand(rootUrl, ['groovysh', 'hudson.jobNames.inspect()'], System.in, out, System.err)
List allJobs = Eval.me(cliApi.parseResponse(out.toString()))
println allJobs

Once we get the response back, we do a little housekeeping to remove some extraneous characters at the beginning of the String, and use Eval.me to transform the String into a List. Groovy provides a variety of ways of turning text into code, so if your usage scenario gets more complicated than this simple case you can use a GroovyShell with a Binding or another alternative to parse the results into something useful. This easy technique extends to Maps and other types as well, making it simple to work with data sent back from the server.

Some useful examples

Finding plugins with updates and updating all of them

Here's an example of using a Groovy script to find all of the plugins that have updates available, returning that result to the caller, and then calling the CLI 'install-plugin' command on all of them. Conveniently, this command will either install a plugin if it's not already there or update it to the latest version if already installed.

def findPluginsWithUpdates = '''
    Hudson.instance.pluginManager.plugins.inject([]) { List toUpdate, plugin ->
        if(plugin.hasUpdate()) {
            toUpdate << plugin.shortName
        }
        toUpdate
    }.inspect()
'''
OutputStream updateablePlugins = new ByteArrayOutputStream()
cliApi.runCliCommand(rootUrl, ['groovysh', findPluginsWithUpdates], System.in, updateablePlugins, System.err)

def listOfPlugins = Eval.me(cliApi.parseResponse(updateablePlugins.toString()))
listOfPlugins.each{ plugin ->
    cliApi.runCliCommand(rootUrl, ['install-plugin', plugin])
}

Install or upgrade a suite of Plugins all at once

This definitely beats using the 'Manage Plugins' UI and is idempotent, so running it more than once can only result in possibly upgrading already installed Plugins. This set of plugins might be overkill, but these are some plugins I recently surveyed for possible use.

@GrabResolver(name='glassfish', root='http://maven.glassfish.org/content/groups/public/')
@GrabResolver(name='github', root='http://kellyrob99.github.com/Jenkins-api-tour/repository')
@Grab('org.kar:hudson-api:0.2-SNAPSHOT')
@GrabExclude('org.codehaus.groovy:groovy')
import static java.net.HttpURLConnection.*
import org.kar.hudson.api.*
import org.kar.hudson.api.cli.HudsonCliApi

String rootUrl = 'http://localhost:8080'
HudsonCliApi cliApi = new HudsonCliApi()

['groovy', 'gradle', 'chucknorris', 'greenballs', 'github', 'analysis-core', 'analysis-collector', 'cobertura',
 'project-stats-plugin', 'audit-trail', 'view-job-filters', 'disk-usage', 'global-build-stats', 'radiatorviewplugin',
 'violations', 'build-pipeline-plugin', 'monitoring', 'dashboard-view', 'iphoneview', 'jenkinswalldisplay'].each{ plugin ->
    cliApi.runCliCommand(rootUrl, ['install-plugin', plugin])
}

// Restart a node, required for newly installed plugins to be made available.
cliApi.runCliCommand(rootUrl, 'safe-restart')

Finding all failed builds and triggering them

It's not all that uncommon that a network problem or infrastructure event can cause a host of builds to fail all at once. Once the problem is solved, this script can be useful for verifying that the builds are all in working order.

@GrabResolver(name = 'glassfish', root = 'http://maven.glassfish.org/content/groups/public/')
@GrabResolver(name = 'github', root = 'http://kellyrob99.github.com/Jenkins-api-tour/repository')
@Grab('org.kar:hudson-api:0.2-SNAPSHOT')
@GrabExclude('org.codehaus.groovy:groovy')
import org.kar.hudson.api.cli.HudsonCliApi

String rootUrl = 'http://localhost:8080'
HudsonCliApi cliApi = new HudsonCliApi()
OutputStream out = new ByteArrayOutputStream()
def script = '''hudson.items.findAll{ job ->
    job.isBuildable() && job.lastBuild && job.lastBuild.result == Result.FAILURE
}.collect{it.name}.inspect()
'''
cliApi.runCliCommand(rootUrl, ['groovysh', script], System.in, out, System.err)
List failedJobs = Eval.me(cliApi.parseResponse(out.toString()))
failedJobs.each{ job ->
    cliApi.runCliCommand(rootUrl, ['build', job])
}

Open an interactive Groovy shell

If you really want to poke at the server, you can launch an interactive shell to inspect state and execute commands. The System.in stream is bound and responses from the server are immediately echoed back.

@GrabResolver(name = 'glassfish', root = 'http://maven.glassfish.org/content/groups/public/')
@GrabResolver(name = 'github', root = 'http://kellyrob99.github.com/Jenkins-api-tour/repository')
@Grab('org.kar:hudson-api:0.2-SNAPSHOT')
@GrabExclude('org.codehaus.groovy:groovy')
import org.kar.hudson.api.cli.HudsonCliApi

/**
 * Open an interactive Groovy shell that imports the hudson.model.* classes and exposes
 * a 'hudson' and/or 'jenkins' object in the Binding which is an instance of hudson.model.Hudson
 */
HudsonCliApi cliApi = new HudsonCliApi()
String rootUrl = args ? args[0] : 'http://localhost:8080'
cliApi.runCliCommand(rootUrl, 'groovysh')

Updates to the project

A lot has happened in the last year and all of the project dependencies needed an update. In particular, there have been some very nice improvements to Groovy, Gradle and Spock. Most notably, Gradle has come a VERY long way since version 0.9.2. The JSON support added in Groovy 1.8 comes in handy as well. Spock required a small tweak for rendering dynamic content in test reports when using @Unroll, but that's a small price to pay for features like the 'old' method and Chained Stubbing. Essentially, in response to changes in Groovy 1.8+, a Spock @Unroll annotation needs to change from:

@Unroll('querying of #rootUrl should match #xmlResponse')

to a Closure-encapsulated GString expression:

@Unroll({"querying of $rootUrl should match $xmlResponse"})

It sounds like the syntax is still in flux, and I'm glad I found this discussion of the problem online.

Hosting a Maven repository on Github

Perhaps you noticed from the previous script examples that we're referencing a published library to get at the HudsonCliApi class. I read an interesting article last week which describes how to use the built-in Github Pages for publishing a Maven repository. While this isn't nearly as capable as a repository manager like Nexus or Artifactory, it's totally sufficient for making some binaries available to most common build tools in a standard fashion. Simply publish the binaries along with the associated poms in the standard Maven repo layout and you're off to the races!
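For reference, the "standard Maven repo layout" mentioned above is simply a directory tree derived from the groupId, artifactId and version of each artifact. Using the hudson-api coordinates from the scripts above purely as an illustration, the published files end up at paths like:

<groupId with dots as directories>/<artifactId>/<version>/<artifactId>-<version>.jar
<groupId with dots as directories>/<artifactId>/<version>/<artifactId>-<version>.pom

org/kar/hudson-api/0.2-SNAPSHOT/hudson-api-0.2-SNAPSHOT.jar
org/kar/hudson-api/0.2-SNAPSHOT/hudson-api-0.2-SNAPSHOT.pom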
Each dependency management system has its quirks (I'm looking at you, Ivy!) but they're pretty easy to work around, so here are examples for Gradle, Maven and Groovy Grapes using the library produced by this project's code. Note that some of the required dependencies for Jenkins/Hudson aren't available in the Maven central repository, so we're getting them from the Glassfish repo.

Gradle

Pretty straightforward; this works with the latest version of Gradle and assumes that you are using the Groovy plugin.

repositories {
    mavenCentral()
    maven { url 'http://maven.glassfish.org/content/groups/public/' }
    maven { url 'http://kellyrob99.github.com/Jenkins-api-tour/repository' }
}

dependencies {
    groovy "org.codehaus.groovy:groovy-all:${versions.groovy}"
    compile 'org.kar:hudson-api:0.2-SNAPSHOT'
}

Maven

Essentially the same content in XML; in this case it's assumed that you're using the GMaven plugin.

<repositories>
    <repository>
        <id>glassfish</id>
        <name>glassfish</name>
        <url>http://maven.glassfish.org/content/groups/public/</url>
    </repository>
    <repository>
        <id>github</id>
        <name>Jenkins-api-tour maven repo on github</name>
        <url>http://kellyrob99.github.com/Jenkins-api-tour/repository</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>org.codehaus.groovy</groupId>
        <artifactId>groovy-all</artifactId>
        <version>${groovy.version}</version>
    </dependency>
    <dependency>
        <groupId>org.kar</groupId>
        <artifactId>hudson-api</artifactId>
        <version>0.2-SNAPSHOT</version>
    </dependency>
</dependencies>

Grapes

In this case there seems to be a problem resolving some transitive dependency for an older version of Groovy, which is why there's an explicit exclude for it.

@GrabResolver(name='glassfish', root='http://maven.glassfish.org/content/groups/public/')
@GrabResolver(name='github', root='http://kellyrob99.github.com/Jenkins-api-tour/repository')
@Grab('org.kar:hudson-api:0.2-SNAPSHOT')
@GrabExclude('org.codehaus.groovy:groovy')

Links

The Github Jenkins-api-tour project page
Maven repositories on Github
Scriptler example Groovy scripts
Jenkins CLI documentation

Related posts:

Hooking into the Jenkins (Hudson) API
Five Cool Things You Can Do With Groovy Scripts
A Grails App Demoing the StackExchange API

Reference: Hooking into the Jenkins (Hudson) API, Part 2 from our JCG partner Kelly Robinson at The Kaptain on … stuff blog.

Learn A Different Language – Advice From A JUG Leader

The cry of "Java is Dead" has been heard for many years now, yet Java still continues to be among the most used languages/ecosystems. I am not here to declare that Java is dead (it isn't and won't be anytime soon). My opinion, if you haven't already heard: Java developers, it's time to learn something else.

First, a little background as basis for my opinions: I founded the Philadelphia Area Java Users' Group in March 2000, and for the past 12 years I have served as 'JUGmaster'. Professionally, I have been a technology recruiter focused on (you guessed it) helping Philadelphia area software companies to hire Java talent since early 1999. I started a new recruiting firm in January that is not focused on Java, and I'm taking searches for mostly Java, Python, Ruby, Scala, Clojure, and mobile talent. This was a natural progression for me, as a portion of my candidate network had already transitioned to other technologies.

I launched Philly JUG based on a recommendation from a candidate, who learned that the old group was dormant. Philly JUG grew from 30 to over 1300 members and we have been recognized twice by Sun as a Top JUG worldwide. This JUG is non-commercial (no product demos, no sales or recruiting activity directed to the group), entirely sponsor-funded, and I have had great success in attracting top Java minds to present for us.

The early signs

After several years of 100% Java-specific presentations at our meetings, I started to notice that an element of the membership requested topics that were not specifically Java EE or SE. I served as the sole judge of what content was appropriate (with requested input from some members), and I allowed the group to stray a bit from our standard fare. First was Practical JRuby back in '06, but since that was 'still Java' there was no controversy. Groovy and Grails in '08 wasn't going to raise any eyebrows either. Then in '09, we had consecutive non-Java meetings – Scala for Jarheads followed by Clojure and the Robot Apocalypse (exact dates for said apocalypse have been redacted). Obviously there is commonality with the JVM, but it was becoming readily apparent that some members of the group were less interested in simply hearing about JSP, EJB, Java ME or whatever the Java vendor universe might be promoting at the time.

I noticed that the members who sought these other topics and attended these alternative meetings were my unofficial advisory committee over the years – the members I called first to ask opinions about topics. These people were the thought leadership of the group. Many of them were early adopters of Java as well.

It was apparent that many of the better Java engineers I knew were choosing to broaden their horizons with new languages, which prompted me to write "Become a Better Java Programmer – Learn Something Else". That '09 article served to demonstrate that by learning another language, you should become a better overall engineer and your Java skills should improve just based on some new approaches. Today I go a step farther in my advice for the Java community, and simply say 'Learn Something Else'.

To be clear, the reason I make this suggestion is not because I feel Java as a language is going to die off, or that all companies will stop using Java in the near future. Java will obviously be around for many years to come, and the JVM itself will certainly continue to be a valued resource for developers.
The reason I advise you to learn something else is that I strongly believe that the marketability of developers who only code in Java will diminish noticeably in the next few years, and the relevance and adoption of Java in new projects will decline. Known Java experts who are at the top few percent probably won't see decreased demand, but the vast majority of the Java talent pool undoubtedly will.

The writing on the wall

I think at this point the writing on the wall is getting a bit too obvious to ignore, and you have two forces acting concurrently. First, there is a tangible groundswell of support for other languages. A month doesn't seem to go by that we don't hear about a new language being released, or read that a company transitioned from Java to another option. Much of this innovation is by former Java enthusiasts, who are often taking the best elements of Java and adding features that were often desired by the Java community but couldn't get through the process for inclusion. Java has been lauded for its stability, and the price Java pays for that stability is slowed innovation.

The second contributing factor is that Java has simply lost much of its luster and magic over the past few years. The Sun acquisition was a major factor, as Oracle is viewed as entirely profit-driven, 'big corporate', and less focused on community-building than Sun was with Java. The Java community, in turn, is naturally less interested in helping to improve Java under Oracle. Giving away code or time to Oracle is like 'working for the man' to the Java community. Oracle deciding to run JavaOne alongside Oracle OpenWorld may have been an omen. Failures such as JavaFX and the inability to keep up with feature demand have not helped either.

My suggestion to learn something else is also rooted in simple economic principles. I have seen the demand for engineers with certain skills (Ruby, and dare I say JavaScript, are good examples) increasing quickly and dramatically, and the low supply of talent in these markets makes it an opportune time to make a move. It reminds me of the late 90's when you could earn six figures if you could spell J-A-V-A. Some companies are now even willing to teach good Java pros a new language on the job – what is better than getting paid to learn? The gap in supply and demand for Java was severe years ago, but it seems the supply has caught up recently. Java development also seems to be a skill that, in my experience, is shipped offshore a bit more than some other languages.

Still don't see it? Remember those early Java adopters, the thought leaders I mentioned? Many of them are still around Java, but they aren't writing Java code anymore. They have come to appreciate the features of some of these other offerings, and are either bored or frustrated with Java. As this set of converts continues to use and evangelize alternative languages in production, they will influence more junior developers who I expect will follow their lead. The flow of Java developers to other languages will continue to grow, and there is still time to take advantage of the supply shortage in alternative language markets.

Java will never die. However, the relevance and influence of Java tomorrow is certainly questionable, the marketability of 'pure' Java developers will decline, and the market for talent in alternative languages is too strong for proactive career-minded talent to ignore.

Reference: Advice From A JUG Leader – Learn A Different Language from our JCG partner Dave Fecak at the Job Tips For Geeks blog.

10 Object Oriented Design principles for the Java programmer

Object Oriented Design Principles are the core of OOP programming, but I have seen most Java programmers chasing design patterns like the Singleton pattern, Decorator pattern or Observer pattern while not paying enough attention to object oriented analysis and design or following these design principles. I have regularly seen Java programmers and developers of various experience levels who either have not heard about these OOP and SOLID design principles, or simply don't know what benefits a particular design principle offers or how to apply it in coding.

The bottom line is to always strive for a highly cohesive and loosely coupled solution, code or design, and looking at open source code from Apache and Sun provides good examples of Java design principles and how they should be used in Java coding. The Java Development Kit follows several design principles, such as the Factory pattern in the BorderFactory class and the Singleton pattern in the Runtime class. If you are interested in learning more from Java code, read Effective Java by Joshua Bloch, a gem by the guy who wrote the Java API. Other personal favorites of mine on object oriented design are Head First Design Patterns by Kathy Sierra and others, and Head First Object Oriented Analysis and Design.

Though the best way of learning design principles or patterns is through real world examples and understanding the consequences of violating them, the subject of this article is to introduce object oriented design principles to Java programmers who are either not exposed to them yet or are in the learning phase. I personally think each of these design principles needs an article to explain it clearly, and I will definitely try to do that here, but for now just get yourself ready for a quick bike ride through design principle town :)

Object oriented design principle 1 – DRY (Don't repeat yourself)

As the name suggests, DRY (don't repeat yourself) means don't write duplicate code; instead use abstraction to abstract common things in one place. If you use a hardcoded value more than once, consider making it a public final constant; if you have a block of code in more than two places, consider making it a separate method. The benefit of this SOLID design principle is in maintenance. It's worth noting not to abuse it: the duplication is about functionality, not just code. If you use common code to validate an OrderID and an SSN, it doesn't mean they are the same or that they will remain the same in the future. By using common code for two different functionalities you couple them closely forever, and when your OrderID changes its format, your SSN validation code will break. So be aware of such coupling and just don't combine things which use similar code but are not related.

Object oriented design principle 2 – Encapsulate what varies

Only one thing is constant in the software field, and that is "change", so encapsulate the code you expect or suspect to change in the future. The benefit of this OOP design principle is that properly encapsulated code is easy to test and maintain. If you are coding in Java, then follow the principle of making variables and methods private by default and increasing access step by step, e.g. from private to protected, and not public. Several design patterns in Java use encapsulation; the Factory design pattern is one example of encapsulation, which encapsulates the object creation code and provides the flexibility to introduce new products later with no impact on existing code.
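As a small illustration of encapsulating what varies, here is a hypothetical Java sketch (the PaymentProcessor names are invented for this example, not taken from any library): the object creation code is hidden behind a simple factory, so a new implementation can be introduced later without touching client code.

// Hypothetical example: the varying part (which processor implementation is used)
// is encapsulated behind a factory, so callers never reference concrete classes.
interface PaymentProcessor {
    void pay(double amount);
}

class CardProcessor implements PaymentProcessor {
    public void pay(double amount) { System.out.println("Card payment of " + amount); }
}

class WalletProcessor implements PaymentProcessor {
    public void pay(double amount) { System.out.println("Wallet payment of " + amount); }
}

class PaymentProcessorFactory {
    // Object creation lives in one place; adding a new processor type
    // does not impact any of the existing client code.
    static PaymentProcessor forType(String type) {
        if ("wallet".equals(type)) {
            return new WalletProcessor();
        }
        return new CardProcessor();
    }
}

public class FactoryDemo {
    public static void main(String[] args) {
        PaymentProcessor processor = PaymentProcessorFactory.forType("wallet");
        processor.pay(42.50);
    }
}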
Object oriented design principle 3 – Open Closed principle

Classes, methods or functions should be open for extension (new functionality) and closed for modification. This is another beautiful object oriented design principle which prevents someone from changing already tried and tested code. Ideally, if you are only adding new functionality, then only that new code should need to be tested, and that's the goal of the Open Closed design principle.

Object oriented design principle 4 – Single Responsibility Principle (SRP)

There should not be more than one reason for a class to change; in other words, a class should always handle a single functionality. If you put more than one functionality in one class in Java, it introduces coupling between the two, and even if you change only one of them there is a chance you broke the coupled functionality, which requires another round of testing to avoid any surprises in the production environment.

Object oriented design principle 5 – Dependency Injection or Inversion principle

Don't ask for a dependency; it will be provided to you by the framework. This has been very well implemented in the Spring framework. The beauty of this design principle is that any class which is injected by a DI framework is easy to test with mock objects and easier to maintain, because the object creation code is centralized in the framework and client code is not littered with it. There are multiple ways to implement dependency injection, such as bytecode instrumentation, which some AOP (Aspect Oriented Programming) frameworks like AspectJ use, or proxies, as used in Spring.

Object oriented design principle 6 – Favour Composition over Inheritance

Always favour composition over inheritance if possible. Some of you may argue with this, but I have found that composition is a lot more flexible than inheritance. Composition allows changing the behaviour of a class at runtime by setting properties, and by using interfaces to compose a class we use polymorphism, which provides the flexibility to replace an implementation with a better one at any time. Even Effective Java advises favouring composition over inheritance. An illustrative sketch follows below.
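To make the composition principle concrete, here is a short, hypothetical Java sketch (class names invented for this example): the sorting behaviour is supplied through an interface and composed into the class, so it can be swapped at runtime instead of being fixed at compile time through inheritance.

// Hypothetical example of favouring composition: the sorting behaviour is a
// component that can be replaced at runtime, rather than a superclass method
// fixed through inheritance.
interface SortStrategy {
    void sort(int[] data);
}

class QuickSort implements SortStrategy {
    public void sort(int[] data) { java.util.Arrays.sort(data); } // stand-in for a hand-rolled quicksort
}

class BubbleSort implements SortStrategy {
    public void sort(int[] data) {
        for (int i = 0; i < data.length; i++) {
            for (int j = 0; j < data.length - 1 - i; j++) {
                if (data[j] > data[j + 1]) {
                    int tmp = data[j]; data[j] = data[j + 1]; data[j + 1] = tmp;
                }
            }
        }
    }
}

class Sorter {
    private SortStrategy strategy;               // composed, not inherited

    Sorter(SortStrategy strategy) { this.strategy = strategy; }

    void setStrategy(SortStrategy strategy) {    // behaviour can change at runtime
        this.strategy = strategy;
    }

    void sort(int[] data) { strategy.sort(data); }
}

public class CompositionDemo {
    public static void main(String[] args) {
        int[] data = {3, 1, 2};
        Sorter sorter = new Sorter(new BubbleSort());
        sorter.sort(data);
        sorter.setStrategy(new QuickSort());     // swap the behaviour without a new subclass
        sorter.sort(data);
        System.out.println(java.util.Arrays.toString(data));
    }
}

Because Sorter is written against the SortStrategy interface rather than any concrete class, the same sketch also hints at principle 9 below.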
Object oriented design principle 7 – Liskov Substitution Principle (LSP)

According to the Liskov Substitution Principle, subtypes must be substitutable for their supertype, i.e. methods or functions which use the superclass type must be able to work with objects of the subclass without any issue. LSP is closely related to the Single Responsibility Principle and the Interface Segregation Principle. If a class has more functionality, then a subclass might not support some of that functionality, and that violates LSP. In order to follow the LSP design principle, a derived class or subclass must enhance functionality, not reduce it.

Object oriented design principle 8 – Interface Segregation principle (ISP)

The Interface Segregation Principle states that a client should not implement an interface if it doesn't use it. This happens mostly when one interface contains more than one functionality and the client only needs one functionality and not the others. Interface design is a tricky job, because once you release your interface you cannot change it without breaking all implementations. Another benefit of this design principle in Java is that, since an interface has the disadvantage that all its methods must be implemented before any class can use it, having a single functionality means fewer methods to implement.

Object oriented design principle 9 – Programming for Interface not implementation

Always program for the interface and not for the implementation; this will lead to flexible code which can work with any new implementation of the interface. So use the interface type for variables, return types of methods and argument types of methods in Java. This has been advised by many Java programmers, including in Effective Java and the Head First Design Patterns book.

Object oriented design principle 10 – Delegation principle

Don't do all the work yourself; delegate it to the respective class. A classic example of the delegation design principle is the equals() and hashCode() methods in Java. In order to compare two objects for equality, we ask the class itself to do the comparison instead of the client class doing that check. The benefit of this design principle is no duplication of code and it is pretty easy to modify behaviour.

All these object oriented design principles help you write flexible and better code by striving for high cohesion and low coupling. Theory is the first step, but what is most important is to develop the ability to recognize when to apply these design principles, and to find out whether we are violating any of them and compromising the flexibility of the code. But again, as nothing is perfect in this world, don't always try to solve problems with design patterns and design principles; they are mostly for large enterprise projects which have longer maintenance cycles.

Reference: 10 Object Oriented Design principles Java programmer should know from our JCG partner Javin Paul at the Javarevisited blog.

Vaadin App on Google App Engine in 5 Minutes

In this tutorial you'll learn how to create your very first Vaadin web application, how to run it on a local App Engine development server and how to deploy it to the Google App Engine infrastructure. And all of that in about 5 to 10 minutes. Yes, if you have the necessary prerequisites installed, you'll be up and running straight away, thanks to the power of Maven.

This tutorial is in the form of a nicely formatted, 4-page quick reference card. You can download it straight away; no sign-ups required. The card will guide you through the process of:

Setting up your environment.
How to run a Vaadin application on the Google App Engine development server.
How to deploy that application.
How to start customizing the Powered by Reindeer templates.

Get started with Vaadin and Google App Engine right now.

Reference: Tutorial: a Vaadin Application on Google App Engine in 5 Minutes from our JCG partner Peter Backx at the Streamhead blog.

Modelling Is Everything

I'm often asked, "What is the best way to learn about building high-performance systems?" There are many perfectly valid answers to this question, but there is one thing that stands out for me above everything else, and that is modelling. Modelling what you need to implement is the most important and effective step in the process. I'd go further and say this principle applies to any development, and the rest is just typing.

Domain Driven Design (DDD) advocates modelling the domain and expressing this model in code as fundamental to the successful delivery and ongoing maintenance of software. I wholeheartedly agree with this. How often do we see code that is an approximation of the problem domain? Code that exhibits behaviour which approximates to what is required via inappropriate abstractions and mappings which just about cope. Those mappings between what is in the code and the real domain are only contained in the developers' heads, and this is just not good enough. When requiring high performance, code for parts of the system often has to model what is happening with the CPU, memory, storage sub-systems, or network sub-systems. When we have imperfect abstractions on top of these domains, performance can be very adversely affected. The goal of my "Mechanical Sympathy" blog is to peek at what is under the hood so we can improve our abstractions.

What is a Model?

A model does not need to be the result of a 3-year exercise producing UML. It can be, and often is best as, people communicating via various means including speech, drawings, illustrations, metaphors, analogies, etc., to build a mental model for shared understanding. If an accurate and distilled understanding can be reached, then this model can be turned into code with great results.

Infrastructure Domain Models

If developers writing a concurrent framework do not have a good model of how a typical cache sub-system works, i.e. it uses message passing to exchange cache lines, then the framework is unlikely to perform well or be correct. If their code drives the cache sub-system with mechanical sympathy and understanding, it is less likely to have bugs and more likely to perform well. It is much easier to predict performance from a sound model when coming from an understanding of the infrastructure of the underlying platform and its published abilities. For example, if you know how many packets per second a network sub-system can handle, and the size of its transfer unit, then it is easy to extrapolate expected bandwidth. With this model-based understanding we can test our code for expectations with confidence.

I've fixed many performance issues whereby a framework treated a storage sub-system as stream-based when it is really a block-based model. If you update part of a file on disk, the block to be updated must be read, the changes applied, and the results written back. Now if you know the system is block based and you know the boundaries of the blocks, you can write whole blocks back without incurring the read, modify, write-back cycle, replacing these actions with a single write. This applies even when appending to a file, as the last block is likely to have been partially written previously.
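To make the block-oriented access pattern concrete, here is a small, hypothetical Java sketch (not from the original post) that overwrites a region of a file one whole, aligned block at a time instead of issuing small unaligned writes. The 4096-byte block size is an assumption, and plain Java still goes through the page cache, so this only illustrates the shape of the access pattern rather than guaranteeing the kernel never performs a read-modify-write.

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class BlockAlignedWriter {
    static final int BLOCK_SIZE = 4096; // assumed block size of the underlying device

    // Replace one block wholesale: position on a block boundary and write the full
    // block in a single call, instead of a small write inside the block.
    static void overwriteBlock(Path file, long blockIndex, byte[] blockContent) throws IOException {
        if (blockContent.length != BLOCK_SIZE) {
            throw new IllegalArgumentException("must supply a whole block");
        }
        try (FileChannel channel = FileChannel.open(file, StandardOpenOption.WRITE, StandardOpenOption.CREATE)) {
            channel.write(ByteBuffer.wrap(blockContent), blockIndex * BLOCK_SIZE);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] block = new byte[BLOCK_SIZE];              // fill with the new content for the block
        overwriteBlock(Paths.get("data.bin"), 3, block);  // replaces the 4th block in one write
    }
}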
Business Domain Models

The same thinking should be applied to the models we construct for the business domain. If a business process is modelled accurately, then the software will not surprise its end users. When we draw up a model, it is important to describe the relationships in terms of cardinality and the characteristics by which they will be traversed. This understanding will guide the selection of data structures to those best suited for implementing the relationships. I often see people use a list for a relationship which is mostly searched by key; for this case a map could be more appropriate. Are the entities at the other end of a relationship ordered? A tree or skiplist implementation may then be a better option.

Identity

Identity of entities in a model is so important. All models have to be entered in some way, and this normally starts with an entity from which to walk. That entity could be "Customer" by customer ID, but could equally be "DiskBlock" by filename and offset in an infrastructure domain. The identity of each entity in the system needs to be clear so the model can be accessed efficiently. If for each interaction with a model we waste precious cycles trying to find our entity as a starting point, then other optimisations can become almost irrelevant. Make identity explicit in your model and, if necessary, index entities by their identity so you can efficiently enter the model for each interaction.

Refine as we learn

It is also important to keep refining a model as we learn. If the model grows as a series of extensions without refining and distilling, then we end up with a spaghetti mess that is very difficult to manage when trying to achieve predictable performance. Never mind how difficult it is to maintain and support. Every day we learn new things. Reflect this in the model and keep it up to date.

Implement no more, but also no less, than what is needed!

The fastest code is code that just does what is needed and no more. Perform the instructions to complete the task and no more. Really fast code is normally not a weird mess of bit-shifting and compiler tricks. It is best to start with something clean and elegant. Then measure to see if you are within performance targets. So often this will be sufficient. Sometimes performance will be a surprise. You then need to apply science to test and measure before jumping to conclusions. A profiler will often tell you where the time is being taken. Once the basic modelling mistakes and assumptions have been corrected, it usually takes just a little mechanical sympathy to reach the performance goal. Unused code is waste. Try not to create it. If you happen to create some, then remove it from your codebase as soon as you notice it.

Conclusion

When non-functional requirements, such as performance and availability, are critical to success, I've found the most important thing is to get the model correct for the domain at all levels. That is, take the principles of DDD and make sure your code is an appropriate reflection of each domain. Be that the domain of business applications, or the domain of interactions with infrastructure, I've found modelling is everything.

Reference: Modelling Is Everything from our JCG partner Martin Thompson at the Mechanical Sympathy blog.

Introducing Spring Integration

In this article we introduce Spring Integration. If you have not worked with Spring Integration before, it might help to brush up on Enterprise Integration Patterns by Gregor Hohpe. I will also recommend this excellent introductory article by Josh Long.

Context setting

In a nutshell, Enterprise Integration Patterns is all about how to get two applications (possibly on different technology stacks, different machines, different networks) to talk to each other, in order to provide a single business functionality. The challenge is how to ensure that this communication remains transparent to the business user, yet reliable and easy for applications. Messaging is one of the patterns. Using this pattern, applications can talk to each other frequently, immediately, reliably, and asynchronously, using customizable formats. Applications talk to each other by sending data (called Messages) over virtual pipes (called Channels). This is an overly simplistic introduction to the concept, but hopefully enough to make sense of the rest of the article. Spring Integration is not an implementation of any of the patterns, but it supports these patterns, primarily Messaging.

The rest of this article is pretty hands on and is an extension of the series on Spring 3. The earlier articles of this series were:

Hello World with Spring 3 MVC
Handling Forms with Spring 3 MVC
Unit testing and Logging with Spring 3
Handling Form Validation with Spring 3 MVC

Without further ado, let's get started.

Bare bones Spring Integration example

At the time of writing this article the latest version of Spring is 3.1.2.RELEASE. However, the latest version of Spring Integration is 2.1.3.RELEASE, as found in Maven Central. I was slightly – and in retrospect, illogically – taken aback that Spring and Spring Integration should have different latest versions, but, hey, that's how it is. This means our pom.xml should have an addition now (if you are wondering where that came from, you need to follow through, at least on a very high level, the Spring 3 series that I have mentioned earlier in the article).

File: /pom.xml

<!-- Spring integration -->
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-core</artifactId>
    <version>2.1.3.RELEASE</version>
</dependency>

This one dependency in the pom now allows my application to send messages over channels. Notice that now we are referring to messages and channels in the realm of Spring Integration, which is not necessarily exactly the same as the concepts referred to earlier in this article in the realm of Enterprise Integration Patterns. It is probably worth having a quick look at the Spring Integration Reference Manual at this point. However, if you are just getting started with Spring Integration, you are perhaps better off following this article for the moment. I would recommend you get your hands dirty before returning to the reference manual, which is very good but also very exhaustive and hence could be overwhelming for a beginner.

To keep things simple, and since I generally try to take a test-first approach (wherever possible), let us try to write some unit tests to create a message, send it over a channel and then receive it. I have blogged here about how to use JUnit and Logback in Spring 3 applications. Continuing with the same principle, and assuming that we are going to write a HelloWorldTest.java, let's set up the Spring configuration for the test.
File: \src\test\resources\org\academy\integration\HelloWorldTest-context.xml

<?xml version='1.0' encoding='UTF-8'?>
<beans xmlns='http://www.springframework.org/schema/beans'
    xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
    xmlns:p='http://www.springframework.org/schema/p'
    xmlns:int='http://www.springframework.org/schema/integration'
    xsi:schemaLocation='http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/integration
        http://www.springframework.org/schema/integration/spring-integration-2.1.xsd'>

    <int:channel id='inputChannel'></int:channel>

    <int:channel id='outputChannel'>
        <int:queue capacity='10' />
    </int:channel>

    <int:service-activator input-channel='inputChannel'
        output-channel='outputChannel' ref='helloService' method='greet' />

    <bean id='helloService' class='org.academy.integration.HelloWorld' />

</beans>

So, what did we just do? We have asked Spring Integration to create an 'inputChannel' to send messages to, and an 'outputChannel' to read messages from. We have also configured all messages on 'inputChannel' to be handed over to a 'helloService'. This 'helloService' is an instance of the org.academy.integration.HelloWorld class, which should be equipped to do something to the message. After that we have also configured the output of the 'helloService', i.e. the modified message in this case, to be handed over to the 'outputChannel'. Simple, isn't it? Frankly, when I first worked with Spring Integration a few years ago, I found this all a bit confusing. It does not make much sense until you see it working. So, let's keep going. Let's add our business critical HelloWorld class.

File: /src/main/java/org/academy/integration/HelloWorld.java

package org.academy.integration;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HelloWorld {
    private final static Logger logger = LoggerFactory.getLogger(HelloWorld.class);

    public String greet(String name) {
        logger.debug("Greeting {}", name);
        return "Hello " + name;
    }
}

As you can see, given a 'name' it returns 'Hello {name}'. Now, let's add the unit test to actually put this in action.

File: /src/test/java/org/academy/integration/HelloWorldTest.java

package org.academy.integration;

import static org.junit.Assert.*;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.integration.MessageChannel;
import org.springframework.integration.core.PollableChannel;
import org.springframework.integration.message.GenericMessage;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class HelloWorldTest {
    private final static Logger logger = LoggerFactory.getLogger(HelloWorldTest.class);

    @Autowired
    @Qualifier("inputChannel")
    MessageChannel inputChannel;

    @Autowired
    @Qualifier("outputChannel")
    PollableChannel outputChannel;

    @Test
    public void test() {
        inputChannel.send(new GenericMessage<String>("World"));
        assertEquals(outputChannel.receive().getPayload(), "Hello World");
        logger.debug("Checked basic Hello World with Spring Integration");
    }
}

Although not mandatory, I find it easier to use the following logback setting. Feel free to use it, if you fancy.
File: /src/main/resources/logback.xml

<?xml version='1.0' encoding='UTF-8'?>
<configuration>
    <appender name='CONSOLE' class='ch.qos.logback.core.ConsoleAppender'>
        <encoder>
            <pattern>%d %5p | %t | %-55logger{55} | %m %n</pattern>
        </encoder>
    </appender>

    <logger name='org.springframework'>
        <level value='ERROR' />
        <!-- level value='INFO' /> -->
        <!-- level value='DEBUG' /> -->
    </logger>

    <root>
        <level value='DEBUG' />
        <appender-ref ref='CONSOLE' />
    </root>
</configuration>

Now, simply type 'mvn -e clean install' (or use the m2e plugin) and you should be able to run the unit test and confirm that, given the string 'World', the HelloWorld service indeed returns 'Hello World' over the entire arrangement of channels and messages. Again, something optional that I highly recommend is to run 'mvn -e clean install site'. This – assuming you have correctly configured some code coverage tool (cobertura in my case) – will give you a nice HTML report showing the code coverage. In this case it would be 100%. I have blogged a series on code quality which deals with this subject in more detail, but to cut a long story short, it is very important for me to ensure that whatever coding practice / framework I use and recommend complies with some basic code quality standards. Being able to unit test and measure that is one such fundamental check that I do. Needless to say, Spring in general (including Spring Integration) passes that check with flying colours.

Conclusion

That's it for this article. In the next article, we will see how to insulate the application code from the Spring Integration specific code that we have in our current JUnit test, i.e. inputChannel.send(…) etc. Till then, happy coding.

Suggested further reading …

Here are the links to earlier articles in this series:

Hello World with Spring 3 MVC
Handling Forms with Spring 3 MVC
Unit testing and Logging with Spring 3
Handling Form Validation with Spring 3 MVC

These are excellent materials that I can recommend:

Getting started with Spring Integration
Sample codes with Spring Integration
Spring Integration – Session 1 – Hello World
Spring Integration – Session 2 – More Hello Worlds

Continue to Spring Integration with Gateways.

Reference: Introducing Spring Integration from our JCG partner Partho at the Tech for Enterprise blog.

Spring Integration with Gateways

This is the second article in the series on Spring Integration. It builds on top of the first article, where we introduced Spring Integration.

Context setting

In the first article, we created a simple Java application where:
A message was sent over a channel.
It was intercepted by a service, i.e. a POJO, and modified.
It was then sent over a different channel.
The modified message was read from that channel and displayed.

However, in doing this – keeping in mind that we were merely introducing the concepts there – we wrote some Spring specific code in our application, i.e. the test classes. In this article we will take care of that and make our application code as insulated from the Spring Integration API as possible. This is done by what Spring Integration calls gateways. Gateways exist for the sole purpose of abstracting messaging related 'plumbing' code away from 'business' code. The business logic might really not care whether a piece of functionality is achieved by sending a message over a channel or by making a SOAP call. This abstraction – though logical and desirable – has not been very practical, till now. It is probably worth having a quick look at the Spring Integration Reference Manual at this point. However, if you are just getting started with Spring Integration, you are perhaps better off following this article for the moment. I would recommend you get your hands dirty before returning to the reference manual, which is very good but also very exhaustive and hence could be overwhelming for a beginner. The gateway could be a POJO with annotations (which is convenient but in my mind defeats the whole purpose) or with XML configuration (which can very quickly turn into a nightmare in any decent sized application if left unchecked). At the end of the day it is really your choice, but I like to go the XML route. The configuration options for both styles are detailed in this section of the reference manual.

Spring Integration with Gateways

So, let's create another test with a gateway thrown in for our HelloWorld service (refer to the first article of this series for more context). Let's start with the Spring configuration for the test.

File: src/test/resources/org/academy/integration/HelloWorld1Test-context.xml

<?xml version='1.0' encoding='UTF-8'?>
<beans xmlns='http://www.springframework.org/schema/beans'
    xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
    xmlns:p='http://www.springframework.org/schema/p'
    xmlns:int='http://www.springframework.org/schema/integration'
    xsi:schemaLocation='http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration-2.1.xsd'>

    <int:channel id='inputChannel'></int:channel>

    <int:channel id='outputChannel'>
        <int:queue capacity='10' />
    </int:channel>

    <int:service-activator input-channel='inputChannel'
        output-channel='outputChannel' ref='helloService' method='greet' />

    <bean id='helloService' class='org.academy.integration.HelloWorld' />

    <int:gateway service-interface='org.academy.integration.Greetings'
        default-request-channel='inputChannel' default-reply-channel='outputChannel'></int:gateway>

</beans>

In this case, all that is different is that we have added a gateway. This is an interface called org.academy.integration.Greetings. It interacts with both 'inputChannel' and 'outputChannel', to send and read messages respectively. Let's write the interface.
File: /src/main/java/org/academy/integration/Greetings.java

package org.academy.integration;

public interface Greetings {
    public void send(String message);

    public String receive();
}

And then we add the implementation of this interface. Wait. There is no implementation. And we do not need any implementation. Spring uses something called GatewayProxyFactoryBean to inject some basic code into this gateway, which allows it to read the simple string based message without us needing to do anything at all. That's right. Nothing at all. Note – you will need to add more code for most of your production scenarios, assuming you are not using the Spring Integration framework to just push strings around. So, don't get used to free lunches. But, while it is here, let's dig in. Now, let's write a new test class using the gateway (and not interact with the channels and messages at all).

File: /src/test/java/org/academy/integration/HelloWorld1Test.java

package org.academy.integration;

import static org.junit.Assert.*;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class HelloWorld1Test {

    private final static Logger logger = LoggerFactory.getLogger(HelloWorld1Test.class);

    @Autowired
    Greetings greetings;

    @Test
    public void test() {
        greetings.send("World");
        assertEquals(greetings.receive(), "Hello World");
        logger.debug("Spring Integration with gateways.");
    }
}

Our test class is much cleaner now. It does not know about channels, or messages, or anything related to Spring Integration at all. It only knows about a greetings instance – to which it gave some data through the .send() method, and from which it got the modified data back through the .receive() method. Hence, the business logic is oblivious of the plumbing logic, making for much cleaner code. Now, simply type 'mvn -e clean install' (or use the m2e plugin) and you should be able to run the unit test and confirm that, given the string 'World', the HelloWorld service indeed returns 'Hello World' over the entire arrangement of channels and messages. Again, something optional but highly recommended is to run 'mvn -e clean install site'. This – assuming you have correctly configured some code coverage tool (Cobertura in my case) – will give you a nice HTML report showing the code coverage. In this case it would be 100%. I have blogged a series on code quality which deals with this subject in more detail, but to cut a long story short, it is very important for me to ensure that whatever coding practice / framework I use and recommend complies with some basic code quality standards. Being able to unit test and measure that is one such fundamental check that I do. Needless to say, Spring in general (including Spring Integration) passes that check with flying colours.

Conclusion

That's it for this article. Happy coding.
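A quick postscript on the annotation style mentioned earlier (the one I chose not to use): based on my reading of the Spring Integration 2.x annotation support – so treat this as an illustrative sketch rather than code from this article – a per-method @Gateway annotation lets the interface name its own request channel, while the <int:gateway service-interface=.../> element is still required to create the proxy.

package org.academy.integration;

import org.springframework.integration.annotation.Gateway;

public interface Greetings {

    // Illustrative only: routes send() explicitly to the request channel,
    // overriding the default-request-channel set on the <int:gateway> element
    @Gateway(requestChannel = "inputChannel")
    public void send(String message);

    // Left to the default-reply-channel configured on the <int:gateway> element
    public String receive();
}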
Suggested further reading …

Here are the links to earlier articles in this series:
Hello World with Spring 3 MVC
Handling Forms with Spring 3 MVC
Unit testing and Logging with Spring 3
Handling Form Validation with Spring 3 MVC
Introducing Spring Integration

These are excellent materials that I can recommend:
Getting started with Spring Integration
Sample codes with Spring Integration
Spring Integration – Session 1 – Hello World
Spring Integration – Session 2 – More Hello Worlds

Reference: Spring Integration with Gateways from our JCG partner Partho at the Tech for Enterprise blog....

Spring security 3 Ajax login – accessing protected resources

I have seen some blog posts about Spring Security 3 Ajax login; however, I could not find any that tackle how to invoke an Ajax based login when a protected resource is accessed via Ajax by an anonymous user.

The problem – The web application allows anonymous access to certain parts, while other parts are protected resources which require the user to log in. When an anonymous user accesses a protected resource (via HTTP GET / POST), Spring Security automatically invokes the login page and, after a successful authentication, redirects to the required resource/page. However, if the protected resource is accessed via Ajax, the login page will not appear correctly (it will be rendered inside part of the page). The 302 code (redirect to the login page) will not function as expected in an Ajax call. Please note that this is NOT the same as initiating an Ajax login screen (e.g. when the user presses the login button and a popup with user/password fields is shown). So – how can we have Spring Security 3 handle access to protected resources both with "regular" HTTP POST (FORM based authentication) AND with Ajax calls, including a redirect to the required resource after successful authentication?

This blog post therefore has two protection layers/parts:
1. Spring Security 3 standard FORM based authentication.
2. Configuring/extending Spring Security 3 and the app to also support Ajax access to protected resources.

Regarding part 1 – there are many references about the issue, so no need to elaborate. Part 2 requires the following:
1. Configure Spring Security 3 to enable Ajax based login.
2. Configure client Ajax calls to protected resources to handle a request for authentication.
3. Re-execute the original function after a successful login, to simulate the automatic invocation of the user's original request (as happens in the FORM based login).

The diagram below describes the detailed flow and should help in following the client/server communication.

Handling protected resource access via Ajax

Let's discuss the diagram: The flow starts with an anonymous user's Ajax request to a protected resource (1). In this case the user wants to add an item to the shopping cart. The addItem method is a protected resource, secured via Spring Security (e.g. with a @PreAuthorize rule requiring SOME_ROLE) (2). This causes the Spring Security filter (3) to send the login FORM with HTTP code 302 (i.e. a redirect to that page). Now, since this is an Ajax call, it will not handle the redirect well, so here comes the part that takes the login FORM, puts it aside, and invokes an Ajax based login instead (4): the client Ajax method (which invoked the Ajax addItem call) checks whether the reply is the form based login page or any other reply. If it is the FORM based login page, it will open a modal dialog (5) that will try to log in via Ajax. Spring will handle the Ajax login authentication (6) and return an appropriate message to the client. If the message indicates success, the client will re-execute the original function which tried to access the protected resource (addItem in our example).

Let us see how it all fits together in our code.

Steps #1, #4 – Client side code which accesses the protected resource and checks if a login is required:

//JavaScript method - Ajax call to protected resource (#1 in flow diagram)
function addItem(itemId) {
  $.ajax({
    url: '/my_url/order/addItem',
    type: 'POST',
    data: ({orderItemId : itemId,...}),
    success: function(data) {
      //construct a callback string if user is not logged in.
      var cllbck = 'addItem('+itemId +')';
      //Client check if login required
      //(#4 in flow diagram)
      if (verifyAuthentication(data,cllbck)){
        // in here => access to protected resource was ok
        // show message to user, "item has been added..."
      }
    }
  });
}

Steps #2, #3 – are a regular Spring Security configuration. There are plenty of resources out there.

Step #4 – Client checks if login is required:

function verifyAuthentication(data, cllBackString){
  //naive check - I put a string in the login form, so I check for its existence
  if (isNaN(data) && (data.indexOf("login_hidden_for_ajax")!= -1)){
    //if we got here then data is a login form => login required
    //set callback in ajax login form hidden input
    $("#my_callback").val(cllBackString);
    //show ajax login
    //Get the window height and width
    var winH = $(window).height();
    var winW = $(window).width();
    //Set the popup window to center
    $("#ajaxLogin").css('top', winH/2-$("#ajaxLogin").height()/2);
    $("#ajaxLogin").css('left', winW/2-$("#ajaxLogin").width()/2);
    $("#ajaxLogin").fadeIn(2000);
    return false;
  }
  // data is not a login form => return true to continue with function processing
  return true;
}

Steps #5, #7 – the Ajax login FORM utilizes the following Ajax login function:

function ajaxLogin(form, suffix){
  var my_callback = form.my_callback.value; // The original function which accessed the protected resource
  var user_pass = form.j_ajax_password.value;
  var user_name = form.j_ajax_username.value;

  //Ajax login - we send credentials to j_spring_security_check (as in form based login)
  $.ajax({
    url: "/myContextURL/j_spring_security_check",
    data: { j_username: user_name , j_password: user_pass },
    type: "POST",
    beforeSend: function (xhr) {
      xhr.setRequestHeader("X-Ajax-call", "true");
    },
    success: function(result) {
      //if login succeeded, hide the login modal and
      //re-execute the function which called the protected resource
      //(#7 in the diagram flow)
      if (result == "ok") {
        $("#ajax_login_error_"+ suffix).html("");
        $('#ajaxLogin').hide();
        if (my_callback!=null && my_callback!='undefined' && my_callback!=''){
          eval(my_callback.replace(/_/g,'"'));
        }
        return true;
      } else {
        $("#ajax_login_error_"+ suffix).html('<span class="alert display_b clear_b centeralign">Bad user/password</span>');
        return false;
      }
    },
    error: function(XMLHttpRequest, textStatus, errorThrown){
      $("#ajax_login_error_"+ suffix).html("Bad user/password");
      return false;
    }
  });
}

We need to set up Spring to support the Ajax login (#6). Set the Spring Security XML configuration:

<beans:beans xmlns:beans="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://www.springframework.org/schema/security"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
        http://www.springframework.org/schema/security http://www.springframework.org/schema/security/spring-security-3.0.3.xsd">

    <http auto-config="false" use-expressions="true">
        <intercept-url pattern="/admin**" access="hasRole('ROLE_ADMIN')" />
        <intercept-url pattern="/**" filters="none" />
        <intercept-url pattern="/signin/**" access="permitAll" />
        <form-login login-page="/common/authentication/login"
            authentication-failure-handler-ref="ajaxAuthenticationFailureHandler"
            authentication-success-handler-ref="ajaxAuthenticationSuccessHandler" />
        <logout invalidate-session="true" logout-success-url="/common/authentication/logout" />
        <custom-filter before="LOGOUT_FILTER" ref="logoutFilter" />
    </http>
    ...
</beans:beans>

Define a handler for login success:

@Component("ajaxAuthenticationSuccessHandler")
public class AjaxAuthenticationSuccessHandler extends SimpleUrlAuthenticationSuccessHandler {

    public AjaxAuthenticationSuccessHandler() {
    }

    @Override
    public void onAuthenticationSuccess(HttpServletRequest request, HttpServletResponse response,
            Authentication authentication) throws IOException, ServletException {
        HttpSession session = request.getSession();
        DefaultSavedRequest defaultSavedRequest = (DefaultSavedRequest) session.getAttribute(WebAttributes.SAVED_REQUEST);
        //check if login originated from an ajax call
        if ("true".equals(request.getHeader("X-Ajax-call"))) {
            try {
                response.getWriter().print("ok"); //return "ok" string
                response.getWriter().flush();
            } catch (IOException e) {
                //handle exception...
            }
        } else {
            setAlwaysUseDefaultTargetUrl(false);
            ...
        }
    }
}

Define a handler for login failure – the same as the success handler, but the string returned is "not-ok". I know some of the code here is not best practice, so I would like to hear what you think. Please post a comment if you can see a way to improve the process or make it more generic.

Acknowledgment: The diagram was done via gliffy – an online diagram tool.

Reference: Spring security 3 Ajax login – accessing protected resources from our JCG partner Gal Levinsky at Gal Levinsky's blog....
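The failure handler is only described above, not shown; a minimal sketch of what it might look like, mirroring the success handler (the class name and the fallback to super are my assumptions, not code from the original post), is:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.security.core.AuthenticationException;
import org.springframework.security.web.authentication.SimpleUrlAuthenticationFailureHandler;
import org.springframework.stereotype.Component;

@Component("ajaxAuthenticationFailureHandler")
public class AjaxAuthenticationFailureHandler extends SimpleUrlAuthenticationFailureHandler {

    @Override
    public void onAuthenticationFailure(HttpServletRequest request, HttpServletResponse response,
            AuthenticationException exception) throws IOException, ServletException {
        // For Ajax logins, return the "not-ok" marker the client script checks for
        if ("true".equals(request.getHeader("X-Ajax-call"))) {
            response.getWriter().print("not-ok");
            response.getWriter().flush();
        } else {
            // Assumed fallback: standard redirect behaviour for FORM based logins
            super.onAuthenticationFailure(request, response, exception);
        }
    }
}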

Five useful ways to sorting in java

A rapid overview of Java sorting.

Normal sort of a list:

private static List VEGETABLES = Arrays.asList("apple", "cocumbers", "blackberry");
Collections.sort(VEGETABLES);

output: apple, blackberry, cocumbers

Reverse sorting:

private static List VEGETABLES = Arrays.asList("apple", "cocumbers", "blackberry");
Collections.sort(VEGETABLES, Collections.reverseOrder());

output: cocumbers, blackberry, apple

With a custom comparator:

private class StringComparator implements Comparator {
    public int compare(Object o1, Object o2) {
        String so1 = (String) o1;
        String so2 = (String) o2;
        return so1.compareTo(so2);
    }
}

private static List VEGETABLES = Arrays.asList("apple", "cocumbers", "blackberry");
Collections.sort(VEGETABLES, new StringComparator());

output: apple, blackberry, cocumbers

Elements sorting (Comparable):

private class Element implements Comparable<Element> {
    private String name;
    private Double atomicMass;

    @Override
    public String toString() {
        final StringBuilder sb = new StringBuilder();
        sb.append("Element");
        sb.append("{name='").append(name).append('\'');
        sb.append(", atomicMass=").append(atomicMass);
        sb.append('}');
        return sb.toString();
    }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public Double getAtomicMass() { return atomicMass; }
    public void setAtomicMass(Double atomicMass) { this.atomicMass = atomicMass; }

    public Element(String name, String mass, double atomicMass) {
        this.name = name;
        this.atomicMass = atomicMass;
    }

    public int compareTo(Element o) {
        return this.getAtomicMass().compareTo(o.getAtomicMass());
    }
}

ArrayList<Element> elements = new ArrayList<Element>();
elements.add(new Element("Hydrogen", "H", 1.00794)); // Hydrogen 1.00794 amu Atomic Mass
elements.add(new Element("Iron", "Fe", 55.845));
elements.add(new Element("Lithium", "Li", 6.941));
elements.add(new Element("Lead", "Pb", 207.2));
elements.add(new Element("Magnesium", "Mg", 24.305));
Collections.sort(elements); // Sort by atomic mass

output:
Element{name='Hydrogen', atomicMass=1.00794}
Element{name='Lithium', atomicMass=6.941}
Element{name='Magnesium', atomicMass=24.305}
Element{name='Iron', atomicMass=55.845}
Element{name='Lead', atomicMass=207.2}

Chronological sorting:

SimpleDateFormat formatter = new SimpleDateFormat("MMMM dd, yyyy", Locale.US);
try {
    ArrayList<Date> holidays = new ArrayList<Date>();
    holidays.add(formatter.parse("May 31, 2010")); // Memorial Day
    holidays.add(formatter.parse("July 4, 2010")); // Independence Day
    holidays.add(formatter.parse("February 15, 2010")); // Presidents Day
    holidays.add(formatter.parse("September 6, 2010")); // Labor Day
    holidays.add(formatter.parse("December 24, 2010")); // federal employees extra day off for Christmas
    holidays.add(formatter.parse("July 5, 2010")); // federal employees extra day off for July 4th
    holidays.add(formatter.parse("January 18, 2010")); // Martin Luther King Day
    holidays.add(formatter.parse("November 25, 2010")); // Thanksgiving Day
    holidays.add(formatter.parse("October 11, 2010")); // Columbus Day
    holidays.add(formatter.parse("December 25, 2010")); // Christmas Day
    holidays.add(formatter.parse("January 1, 2010")); // New Year's Day
    Collections.sort(holidays); // Native sort for Date is chronological
} catch (ParseException e) {
    e.printStackTrace();
}

output: sorted:[Fri Jan 01 00:00:00 CET 2010, Mon Jan 18 00:00:00 CET 2010, Mon Feb 15 00:00:00 CET 2010, Mon May 31 00:00:00 CEST 2010, Sun Jul 04 00:00:00 CEST 2010, Mon Jul 05 00:00:00 CEST 2010, Mon Sep 06 00:00:00 CEST 2010, Mon Oct 11 00:00:00 CEST 2010, Thu Nov 25 00:00:00 CET 2010, Fri Dec 24 00:00:00 CET 2010, Sat Dec 25 00:00:00 CET 2010]

You can view the complete simple class below:

package com.tommyalf.personal.sorting;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.*;

/**
 * Created by IntelliJ IDEA.
 * User: tommyalf
 * Date: 1-dic-2010
 * Time: 22.40.49
 */
public class SortDemo {

    private static List VEGETABLES = Arrays.asList("apple", "cocumbers", "blackberry");

    public static void main(String args[]) {
        SortDemo sd = new SortDemo();
        sd.normalSort();
        sd.reverseSort();
        sd.stringComparator();
        sd.elementsSort();
        sd.chronologicalSort();
    }

    private void chronologicalSort() {
        SimpleDateFormat formatter = new SimpleDateFormat("MMMM dd, yyyy", Locale.US);
        try {
            ArrayList<Date> holidays = new ArrayList<Date>();
            holidays.add(formatter.parse("May 31, 2010")); // Memorial Day
            holidays.add(formatter.parse("July 4, 2010")); // Independence Day
            holidays.add(formatter.parse("February 15, 2010")); // Presidents Day
            holidays.add(formatter.parse("September 6, 2010")); // Labor Day
            holidays.add(formatter.parse("December 24, 2010")); // federal employees extra day off for Christmas
            holidays.add(formatter.parse("July 5, 2010")); // federal employees extra day off for July 4th
            holidays.add(formatter.parse("January 18, 2010")); // Martin Luther King Day
            holidays.add(formatter.parse("November 25, 2010")); // Thanksgiving Day
            holidays.add(formatter.parse("October 11, 2010")); // Columbus Day
            holidays.add(formatter.parse("December 25, 2010")); // Christmas Day
            holidays.add(formatter.parse("January 1, 2010")); // New Year's Day
            System.out.println("before sort:" + holidays);
            Collections.sort(holidays); // Native sort for Date is chronological
            System.out.println("sorted:" + holidays);
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }

    private void elementsSort() {
        ArrayList<Element> elements = new ArrayList<Element>();
        elements.add(new Element("Hydrogen", "H", 1.00794)); // Hydrogen 1.00794 amu Atomic Mass
        elements.add(new Element("Iron", "Fe", 55.845));
        elements.add(new Element("Lithium", "Li", 6.941));
        elements.add(new Element("Lead", "Pb", 207.2));
        elements.add(new Element("Magnesium", "Mg", 24.305));
        Collections.sort(elements); // Sort by atomic mass
        System.out.print("Elements sort by atomicMass value:");
        for ( Element e : elements ) {
            System.out.println(e);
        }
    }

    private void stringComparator() {
        Collections.sort(VEGETABLES, new StringComparator());
        System.out.print("StringComparator:");
        printList(VEGETABLES);
    }

    private void reverseSort() {
        Collections.sort(VEGETABLES, Collections.reverseOrder());
        System.out.print("ReverseSort:");
        printList(VEGETABLES);
    }

    private void normalSort() {
        Collections.sort(VEGETABLES);
        System.out.print("NormalSort:");
        printList(VEGETABLES);
    }

    private void printList(List vegetables) {
        for (int i = 0, n = vegetables.size(); i < n; i++) {
            if (i != 0) {
                System.out.print(", ");
            }
            System.out.print(VEGETABLES.get(i));
        }
        System.out.println();
    }

    private class StringComparator implements Comparator {
        public int compare(Object o1, Object o2) {
            String so1 = (String) o1;
            String so2 = (String) o2;
            return so1.compareTo(so2);
        }
    }

    private class Element implements Comparable<Element> {
        private String name;
        private Double atomicMass;

        @Override
        public String toString() {
            final StringBuilder sb = new StringBuilder();
            sb.append("Element");
            sb.append("{name='").append(name).append('\'');
            sb.append(", atomicMass=").append(atomicMass);
            sb.append('}');
            return sb.toString();
        }

        public String getName() { return name; }

        public void setName(String name) { this.name = name; }

        public Double getAtomicMass() { return atomicMass; }

        public void setAtomicMass(Double atomicMass) { this.atomicMass = atomicMass; }

        public Element(String name, String mass, double atomicMass) {
            this.name = name;
            this.atomicMass = atomicMass;
        }

        public int compareTo(Element o) {
            return this.getAtomicMass().compareTo(o.getAtomicMass());
        }
    }
}

Reference: Five useful ways to sorting in java from our JCG partner Tommy Alf at the Tommy Alf blog....
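A small editorial aside, not part of the original post: the StringComparator above uses the raw Comparator type and therefore needs casts. A generic variation avoids the casts and sorts exactly the same way:

import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class GenericComparatorDemo {

    // Variation on the article's StringComparator: same logic, but type-safe and cast-free
    private static class StringComparator implements Comparator<String> {
        public int compare(String so1, String so2) {
            return so1.compareTo(so2);
        }
    }

    public static void main(String[] args) {
        List<String> vegetables = Arrays.asList("apple", "cocumbers", "blackberry");
        Collections.sort(vegetables, new StringComparator());
        System.out.println(vegetables); // [apple, blackberry, cocumbers]
    }
}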