
Google AppEngine: Task Queues API

Task Queues (com.google.appengine.api.taskqueue)

With Task Queues a user can initiate a request to have the application perform work outside of that request; they are a powerful tool for background work. You can organize work into small, discrete units (tasks). The application then inserts these tasks into one or more queues based on the queue's configuration, and they are processed in FIFO order. Here's a diagram I took from a Google IO presentation which illustrates, at a high level, task insertion into the queue.

Queue Configuration

1. Push Queues (default): A push queue processes tasks based on the processing rate configured in the queue definition (see below). App Engine automatically manages the lifetime of these queues (creation, deletion, etc.) and adjusts the processing capacity to match your configuration and processing volume. They can only be used within App Engine (internal to your app).

2. Pull Queues: Allow a task consumer to lease tasks at a specific time within a specific timeframe. They are accessible internally as well as externally through the Task Queue REST API. In this scenario, however, GAE does not manage the lifecycle and processing rate of queues automatically; it is up to the developer to do it. A backend also has access to these queues.

Tasks

Tasks represent a unit of work performed by the application. Tasks are idempotent, i.e., they are unique in a queue and, according to the Google documentation, cannot be invoked more than once simultaneously (unless some weird internal error condition happens). Tasks are instances of the TaskOptions class and consist of a URL and a payload, which can be a simple string, a binary object (byte[]), or an instance of a DeferredTask. A DeferredTask is basically a Runnable. This allows you to chain tasks together. Our team had to do this in order to simulate long-running tasks when GAE's max execution limit was 30 seconds.
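The chaining idea can be sketched in plain Java without the App Engine API. All names below are hypothetical: on App Engine the Runnable would be a DeferredTask added to a real queue and run by the queue's worker, not drained by a local loop. Each unit of work enqueues its own follow-up before returning, so no single task runs longer than one chunk.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical plain-Java sketch of chaining tasks to split a long-running job.
public class TaskChainSketch {

    private final Queue<Runnable> queue = new ArrayDeque<>();
    final StringBuilder log = new StringBuilder();

    void enqueue(Runnable task) {
        queue.offer(task);
    }

    // Drain in FIFO order, like a push queue's worker would.
    void drain() {
        Runnable task;
        while ((task = queue.poll()) != null) {
            task.run();
        }
    }

    // One chunk of a long job; it enqueues the next chunk before returning.
    Runnable step(final int n, final int lastStep) {
        return () -> {
            log.append("step").append(n).append(';');
            if (n < lastStep) {
                enqueue(step(n + 1, lastStep));
            }
        };
    }

    public static void main(String[] args) {
        TaskChainSketch sketch = new TaskChainSketch();
        sketch.enqueue(sketch.step(1, 3));
        sketch.drain();
        System.out.println(sketch.log); // step1;step2;step3;
    }
}
```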
Presently, a task must finish executing and send an HTTP response value between 200 and 299 within 10 minutes of the original request. This deadline is separate from user requests, which have a 60-second deadline. Furthermore, tasks use token buckets to control the rate of task execution. Each time a task is invoked, a token is used. This leasing model (acquire a token) is typical of brokering or message-passing systems, and it allows users to control the rate of execution of these tasks (see below on configuring queues). Lastly, a very important feature of the Task Queue API is automatic retry of tasks. You can configure this with the RetryOptions parameter when creating the TaskOptions object.

Tasks within a Transaction

Tasks can be enqueued as part of a datastore transaction. Insertion (not execution) is guaranteed if the transaction commits successfully. The only caveats are that transactional tasks cannot have user-defined names and that there is a maximum of 5 insertions into task queues in a single transaction.

Configuration

Queues are configured via queue.xml. If it is omitted, the default queue with default configuration is used. Since Pull Queues are for more advanced needs, they must be configured explicitly (there is no default pull queue). An application's queue configuration applies to all versions of the app. You can override this behavior for push queues using the target parameter in queue.xml, in case you want different versions of your app (different sites) with different queue processing configurations. Here are some of the things you are allowed to configure (the documentation is more extensive):

• bucket-size: how fast the queue is processed when many tasks are in the queue and the rate is high (push only). (Warning: the development server ignores this value)
• max-concurrent-requests: maximum number of tasks that can be executed at any given time in the specified queue (push only).
• mode: whether it's push or pull.
• name: the queue name.
• rate: how often tasks are processed on this queue (s=seconds, m=minutes, h=hours, d=days). If 0, the queue is considered paused. (Warning: the development server ignores this value)
• target: target a task to a specific backend or application version.

<queue-entries>
  <!-- Set the number of max concurrent requests to 10 -->
  <queue>
    <name>optimize-queue</name>
    <rate>20/s</rate>
    <bucket-size>40</bucket-size>
    <max-concurrent-requests>10</max-concurrent-requests>
  </queue>
</queue-entries>

Sample Code

This is a very straightforward example. As I said before, task queues are basically a URL handler. In this servlet, the GET will handle enqueueing a task. The task will POST to this same servlet and execute the doPost() method carrying out the task. In this case, it's just a simple counter. Notice the counter is a volatile field. If you access this servlet with a GET request, it will enqueue another task, so you will see the counter being incremented by both tasks.

public class TaskQInfo extends HttpServlet {

    private static final Logger log = Logger.getLogger(TaskQInfo.class.getName());
    private static volatile int TASK_COUNTER = 0;

    // Executed by user menu click
    public void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        // Build a task using the TaskOptions builder pattern from above
        Queue queue = QueueFactory.getDefaultQueue();
        queue.add(withUrl("/taskq_demo").method(TaskOptions.Method.POST));

        resp.getWriter().println("Task has been added to the default queue...");
        resp.getWriter().println("Refresh this page to add another count task");
    }

    // Executed by TaskQueue
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // This is the body of the task
        for (int i = 0; i < 1000; i++) {
            log.info("Processing: " + req.getHeader("X-AppEngine-TaskName")
                    + "-" + TASK_COUNTER++);
            try {
                // Sleep for a second (if the rate is set to 1/s this will allow at
                // most 1 more task to be processed)
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                // ignore
            }
        }
    }
}

Task queues allow you to achieve some level of concurrency in your application by invoking background processes on demand. For very lengthy tasks, you might want to take a look at App Engine backends, which are basically special App Engine instances with no request time limit.

Reference: Google AppEngine: Task Queues API from our JCG partner Luis Atencio at the Reflective Thought blog.
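As an aside, the token-bucket model mentioned earlier can be sketched generically in a few lines. This is not App Engine's actual implementation, just an illustration of the idea: tokens refill at a fixed rate up to the bucket capacity, and each task invocation consumes one token (capacity and refill rate correspond roughly to bucket-size and rate in queue.xml).

```java
// Generic token-bucket sketch (not the App Engine implementation).
public class TokenBucket {

    private final int capacity;        // like queue.xml <bucket-size>
    private final double refillPerMs;  // like queue.xml <rate>, converted to per-ms
    private double tokens;
    private long lastRefill;

    public TokenBucket(int capacity, double tokensPerSecond, long nowMs) {
        this.capacity = capacity;
        this.refillPerMs = tokensPerSecond / 1000.0;
        this.tokens = capacity;  // start full
        this.lastRefill = nowMs;
    }

    // Returns true if a task may run now (a token was consumed).
    public synchronized boolean tryAcquire(long nowMs) {
        // Refill based on elapsed time, capped at the bucket capacity.
        tokens = Math.min(capacity, tokens + (nowMs - lastRefill) * refillPerMs);
        lastRefill = nowMs;
        if (tokens >= 1.0) {
            tokens -= 1.0;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Capacity 2, refill 1 token/second: two immediate runs, then throttled.
        TokenBucket bucket = new TokenBucket(2, 1.0, 0);
        System.out.println(bucket.tryAcquire(0));    // true
        System.out.println(bucket.tryAcquire(0));    // true
        System.out.println(bucket.tryAcquire(0));    // false
        System.out.println(bucket.tryAcquire(1000)); // true (one token refilled)
    }
}
```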

Why Developers Never Use State Machines

A few months ago I saw a great little blog post about state machines on the Shopify blog. The message was that state machines are great and developers should use them more – given my recent experiences with state machines at CrowdHired, I could certainly agree with that. But it got me thinking: how many times in my developer career have I actually used a state machine (either a separate library or even a hand-rolled abstraction)? The answer is zero times – which surprised the hell out of me, since state machines really are very useful. So I decided to engage in a bit of introspection and figure out why we tend to manage our "state" and "status" fields in an ad-hoc fashion rather than doing what is clearly called for.

We Don't Need One Until We Do

The problem is that you almost never create an object fully formed with all the behaviour it is ever going to need; rather, you build it up over time. The same is true for the "states" that a state machine candidate object can be in. So, early on you don't feel like your objects' state machine behaviour is complex enough to warrant a "full-blown" state machine (YAGNI and all that jazz), but later on – when it IS complex enough – you feel like you've invested too much time/effort to replace it with something that has equivalent functionality. It's a bit of a catch-22. It's overkill, and by the time it's not, it's too late.

A State Machine Is A Fluffy Bunny (Not Particularly Threatening)

Those of us who went through computer science degrees remember state machines from our computing theory subjects, and the memories are often not fond ones. There are complex diagrams and math notation, determinism and non-determinism, Moore and Mealy, as well as acronyms galore (DFA, NFA, GNFA etc.). We come to believe that state machines are more complex than they actually are, and it is therefore nothing but pragmatism that makes us consider a "full-blown" state machine overkill.
But most state machines you're likely to need in your day-to-day development have nothing in common with their computing theory counterparts (except the … errr … theory). You have states, which are strings, and events, which are methods that cause transitions from one state to another – that's pretty much it (at least for the state_machine gem in Ruby). The point is, even if you have two states, a state machine is not overkill; it might be easier than rolling an ad-hoc solution, as long as you have a good library to lean on.

Even A Good Tool Is Not A Good Tool

I would hazard a guess that there are decent state machine libraries for most languages that you can use (the aforementioned state_machine for Ruby is just one example). But even a fluffy bunny has a learning curve (I am stretching the metaphor well past breaking point here). That wouldn't be such an issue if you were solving a problem, but all you're likely doing is replacing an existing solution, since we tend to turn to a state machine library after the fact (our ad-hoc solution is working right now). Just like with everything that has "potential future benefits", the immediate value is very hard to justify, even to yourself (unless you've had experience with it before). The slight learning curve only tips the scale further towards the "we can live without it" side. It doesn't matter how good a tool is if you never give it a chance. It is really difficult to appreciate (until you've gone through it) how much better life can be if you do give a good state machine library a chance. When we finally bit the bullet at CrowdHired and rejigged some of our core objects to use the state_machine gem, the difference was immediately apparent.

Firstly, the learning curve was minor. I did spend a few hours going through the source and documentation, but after that I had a good idea of what could and couldn't be done (I might do an in-depth look at the state_machine gem at some point).
The integration itself was almost painless, but moving all the code around to be in line with the new state machine was a big pain. In hindsight, had we done this when our objects only had a couple of states, it would have been a breeze.

We're now able to easily introduce more states to give our users extra information as well as allow us to track things at a finer grain. Before, it was YAGNI cause it was a pain; now we find that we "are gonna need it" after all, cause it's so easy. Our return values from state transitions are now 100% consistent (true/false). Before, we were returning objects, arrays of objects, nil, or true/false depending on who was writing it and when. We're now able to keep an audit trail of our state transitions simply by dropping in state_machine-audit_trail (see that Shopify post); before, it was too hard to hook it in everywhere, so we had nothing. We removed a bunch of code and improved our codebase – always worthy goals as far as I am concerned.

My gut feel is that most people who read that Shopify post agreed with it in spirit, but did nothing about it (that's kinda how it was with me). We seem to shy away from state machines due to a misunderstanding of their complexity and/or an inability to quantify the benefits. But there is less complexity than you would think and more benefits than you would expect, as long as you don't try to retrofit a state machine after the fact. So next time you have an object that even hints at having a "status" field, just chuck a state machine in there; you'll be glad you did. I guarantee it or your money back :).

Reference: Why Developers Never Use State Machines from our JCG partner Alan Skorkin at the Skorks blog.
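To make the "states plus transition events" point concrete outside of Ruby, here is a minimal hand-rolled sketch in Java. The Order domain and all method names are hypothetical; a library like the state_machine gem gives you this plus callbacks, guards, and persistence for free, but the core really is this small.

```java
// Hypothetical example: a tiny hand-rolled state machine for an order.
public class OrderStateMachine {

    public enum State { PENDING, PAID, SHIPPED, CANCELLED }

    private State current = State.PENDING;

    // Each "event" is just a method that checks the current state and either
    // performs the transition (returning true) or refuses (returning false).
    // Note the consistent true/false return values the article recommends.
    public synchronized boolean pay()  { return transition(State.PENDING, State.PAID); }
    public synchronized boolean ship() { return transition(State.PAID, State.SHIPPED); }

    public synchronized boolean cancel() {
        if (current == State.SHIPPED) {
            return false; // too late to cancel once shipped
        }
        current = State.CANCELLED;
        return true;
    }

    private boolean transition(State from, State to) {
        if (current != from) {
            return false;
        }
        current = to;
        return true;
    }

    public State state() {
        return current;
    }

    public static void main(String[] args) {
        OrderStateMachine order = new OrderStateMachine();
        System.out.println(order.pay());    // true
        System.out.println(order.ship());   // true
        System.out.println(order.cancel()); // false: already shipped
        System.out.println(order.state());  // SHIPPED
    }
}
```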

JBox2D and JavaFX: Events and forces

In yesterday's samples you saw how you can create a simple world and display it with WorldView, and how to provide custom Renderers. Now we're going to add some user input. We'll create a control that behaves like a flipper in a pinball machine. To do that we'll create a Joint. In JBox2D, Joints are used to constrain bodies to the world or to each other. We'll create a static circular Body that will serve as the axis for our flipper, and bind a Box to it via a RevoluteJoint. To simplify the code, we'll first define a JointBuilder base class and a RevoluteJointBuilder:

public abstract class JointBuilder<K extends JointBuilder<K, T>, T extends JointDef> {

    protected World world;
    protected T jointDef;

    protected JointBuilder(World world, T jointDef) {
        this.world = world;
        this.jointDef = jointDef;
    }

    public K bodyA(Body a) {
        jointDef.bodyA = a;
        return (K) this;
    }

    public K bodyB(Body b) {
        jointDef.bodyB = b;
        return (K) this;
    }

    public K userData(Object userData) {
        jointDef.userData = userData;
        return (K) this;
    }

    public K type(JointType type) {
        jointDef.type = type;
        return (K) this;
    }

    public K collideConnected(boolean coco) {
        jointDef.collideConnected = coco;
        return (K) this;
    }

    public Joint build() {
        return world.createJoint(jointDef);
    }
}

And here's the RevoluteJointBuilder:

public class RevoluteJointBuilder extends JointBuilder<RevoluteJointBuilder, RevoluteJointDef> {

    public RevoluteJointBuilder(World world, Body a, Body b, Vec2 anchor) {
        super(world, new RevoluteJointDef());
        jointDef.initialize(a, b, anchor);
    }

    public RevoluteJointBuilder enableLimit(boolean enable) {
        jointDef.enableLimit = enable;
        return this;
    }

    public RevoluteJointBuilder enableMotor(boolean motor) {
        jointDef.enableMotor = motor;
        return this;
    }

    public RevoluteJointBuilder localAnchorA(Vec2 localAnchorA) {
        jointDef.localAnchorA = localAnchorA;
        return this;
    }

    public RevoluteJointBuilder localAnchorB(Vec2 localAnchorB) {
        jointDef.localAnchorB = localAnchorB;
        return this;
    }

    public RevoluteJointBuilder lowerAngle(float lowerAngle) {
        jointDef.lowerAngle = lowerAngle;
        return this;
    }

    public RevoluteJointBuilder maxMotorTorque(float maxMotorTorque) {
        jointDef.maxMotorTorque = maxMotorTorque;
        return this;
    }

    public RevoluteJointBuilder motorSpeed(float motorSpeed) {
        jointDef.motorSpeed = motorSpeed;
        return this;
    }

    public RevoluteJointBuilder referenceAngle(float referenceAngle) {
        jointDef.referenceAngle = referenceAngle;
        return this;
    }

    public RevoluteJointBuilder upperAngle(float upperAngle) {
        jointDef.upperAngle = upperAngle;
        return this;
    }
}

Now we can modify our HelloWorld example like this:

public class HelloWorld extends Application {

    public static void main(String[] args) {
        Application.launch(args);
    }

    @Override
    public void start(Stage primaryStage) {
        World world = new World(new Vec2(0, -2f), true);
        primaryStage.setTitle("Hello World!");
        NodeManager.addCircleProvider(new MyNodeProvider());

        new CircleBuilder(world).userData("ball").position(0.1f, 4)
                .type(BodyType.DYNAMIC).restitution(1).density(2)
                .radius(.15f).friction(.3f).build();

        final Body flipperBody = new BoxBuilder(world).position(0, 2)
                .type(BodyType.DYNAMIC).halfHeight(.02f).halfWidth(.2f)
                .density(2).friction(0).userData("flipper").build();

        Vec2 axis = flipperBody.getWorldCenter().add(new Vec2(.21f, 0));
        Body axisBody = new CircleBuilder(world).position(axis)
                .type(BodyType.STATIC).build();

        new RevoluteJointBuilder(world, flipperBody, axisBody, axis)
                .upperAngle(.6f).lowerAngle(-.6f)
                .enableMotor(true).enableLimit(true)
                .maxMotorTorque(10f).motorSpeed(0f).build();

        Scene scene = new Scene(new WorldView(world, 200, 400, 50), 500, 600);

        // ground
        new BoxBuilder(world).position(0, -1f).halfHeight(1).halfWidth(5).build();

        primaryStage.setScene(scene);
        primaryStage.show();
    }
}

This will display our scene, and you'll see how the Joint prevents the dynamic Box from falling to the ground and how it constrains its movement. The next step is to allow the user to control it. For this we'll apply a force when the user presses a key.
Add this after the instantiation of the scene:

scene.setOnKeyPressed(new EventHandler<KeyEvent>() {
    @Override
    public void handle(KeyEvent ke) {
        if (ke.getCode() == KeyCode.LEFT) {
            flipperBody.applyTorque(-15f);
        }
    }
});

scene.setOnKeyReleased(new EventHandler<KeyEvent>() {
    @Override
    public void handle(KeyEvent ke) {
        if (ke.getCode() == KeyCode.LEFT) {
            flipperBody.applyTorque(15f);
        }
    }
});

That's it for now. In the next parts of this tutorial we'll do a bit more custom rendering and create some nice custom Nodes.

Reference: Events and forces with JBox2D and JavaFX from our JCG partner Toni Epple at the Eppleton blog.

Using Groovy scriptlets inside a *.docx document

Introduction

One of my recent projects required automated generation of contracts for customers. A contract is a legal document of about 10 pages in length. One contract form can be applied to many customers, so the document is a template with customer info put in certain places. In this article I am going to show you how I solved this problem.

Requirements

This is the initial version of the formalized requirements:

- Specified data must be placed in marked places of a complex DOC/DOCX file.

The requirements were subsequently refined and expanded:

- Specified data must be placed in marked places of a complex DOCX file.
- Output markup must be scriptlet-like: ${}, <%%>, <%=%>.
- Output data may be not only strings but also hashes and objects. Field access must be an option.
- The output language must be brief and script-friendly: Groovy, JavaScript.
- A possibility to display a list of objects in a table, each cell displaying a field.

Background

It turned out that the existing products in the field (I'm talking about the Java world) do not fit the initial requirements. A brief overview of the products:

Jasper Reports

Jasper Reports uses *.jrxml files as templates. The template file, in combination with input data (an SQL result set or a Map of params), is given to a processor which produces any of these formats: PDF, XML, HTML, CSV, XLS, RTF, TXT. Did not fit in:

- It's not WYSIWYG, even with the help of iReport, a visual tool to create jrxml templates.
- The JasperReports API must be learned well to create and style a complex template.
- JR does not output in a suitable format. PDF might be okay, but the ability to hand-edit is preferable.

Docx4j

Docx4j is a Java library for creating and manipulating Microsoft Open XML (Word docx, Powerpoint pptx, and Excel xlsx) files. Did not fit in:

- There is no case meeting my requirements in the docx4j documentation. A brief note about the XMLUtils.unmarshallFromTemplate functionality is present, but it only does the simplest substitutions.
- Repeating output is done with prepared XML sources and XPath (link).

Apache POI

Apache POI is a Java tool for creating and manipulating parts of *.doc, *.ppt, *.xls documents. A major use of the Apache POI API is for text extraction applications such as web spiders, index builders, and content management systems. Did not fit in:

- Does not have any options that meet my requirements.

Word Content Control Toolkit

Word Content Control Toolkit is a stand-alone, light-weight tool that opens any Word Open XML document and lists all of the content controls inside of it. After I developed my own solution with scriptlets, I heard of a solution based on a combination of this tool and XSLT transformations. It may work for somebody, but I did not bother digging because it simply takes fewer steps to use my solution directly.

Solution of the problem

It was fun!

1. Document text content is stored as an Open XML file inside a zip archive. The traditional JDK 6 zipper does not support an explicit encoding parameter; that is, a broken docx file may be produced using this zipper. I had to use the Groovy wrapper AntBuilder for zipping, which does have an encoding parameter.

2. Any text you enter in MS Word may be "arbitrarily" broken into parts wrapped with XML. So I had to solve the problem of cleaning up the padding generated from the template XML. I used regular expressions for this task. I did not try XSLT or anything else because I thought RegEx would be faster.

3. I decided to use Groovy as the scripting language because of its simplicity, its Java nature, and its built-in template processor. I found an interesting issue related to the processor: it turned out that even in a small 10-sheet document one can easily run into a restriction on the length of a string between two scriptlets.
I had to substitute the text between each pair of scriptlets with a UUID string, run the Groovy template processor on the modified text, and finally switch those UUID placeholders back to the initial text fragments. After overcoming these difficulties, I tried out the project in real life. It turned out well! I created a project website and published it. Project address: snowindy.github.com/scriptlet4docx/

Code example

HashMap<String, Object> params = new HashMap<String, Object>();
params.put("name", "John");
params.put("sirname", "Smith");

DocxTemplater docxTemplater = new DocxTemplater(new File("path_to_docx_template/template.docx"));
docxTemplater.process(new File("path_to_result_docx/result.docx"), params);

Scriptlet types explanation

${ data } – equivalent to out.print(data).

<%= data %> – equivalent to out.print(data).

<% any_code %> – evaluates the contained code; no output applied. May be used for divided conditions:

<% if (cond) { %>
This text block will be printed in case of "cond == true"
<% } else { %>
This text block will be printed otherwise.
<% } %>

$[ @listVar.field ] – this is a custom Scriptlet4docx scriptlet type designed to output a collection of objects to docx tables. It must be used inside a table cell. Say we have a list of person objects, each with two fields, 'name' and 'address', and we want to output them to a two-column table:

- Create a binding with key 'personList' referencing that collection.
- Create a two-column table inside the template docx document: two columns, one row.
- $[@person.name] goes to the first column cell; $[@person.address] goes to the second.
- Voila, the whole collection will be printed to the table.

Live template example

You can check all the mentioned scriptlets' usage in a demonstration template.

Project future

If I have actually developed a new approach to processing docx templates, it would be nice to popularize it.
Project TODOs:

- Preprocessed template caching
- Scriptlet support in lists
- Streaming API

Reference: Using Groovy scriptlets inside a *.docx document from our W4G partner Eugene Polyhaev.
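The UUID-placeholder trick from the solution section above can be sketched in a few lines of plain Java. This is a hypothetical helper, not the actual scriptlet4docx code; it assumes scriptlets delimited by <% ... %> and that the template engine leaves the UUID keys untouched.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: hide the plain text between scriptlets behind UUID keys
// before template processing, then restore it afterwards.
public class PlaceholderSwap {

    // Matches the plain text between a closing "%>" and the next opening "<%".
    private static final Pattern BETWEEN = Pattern.compile("%>(.*?)<%", Pattern.DOTALL);

    public static String hide(String template, Map<String, String> stash) {
        Matcher m = BETWEEN.matcher(template);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String key = UUID.randomUUID().toString();
            stash.put(key, m.group(1)); // remember the original fragment
            m.appendReplacement(sb, Matcher.quoteReplacement("%>" + key + "<%"));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static String restore(String processed, Map<String, String> stash) {
        for (Map.Entry<String, String> e : stash.entrySet()) {
            processed = processed.replace(e.getKey(), e.getValue());
        }
        return processed;
    }

    public static void main(String[] args) {
        Map<String, String> stash = new HashMap<>();
        String template = "<% a %>some long text<% b %>";
        String hidden = PlaceholderSwap.hide(template, stash);
        // The long run of plain text is now a short UUID key,
        // so the engine's string-length limit is never hit.
        System.out.println(PlaceholderSwap.restore(hidden, stash).equals(template));
    }
}
```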

Stupid Design Decisions (Part I)

Maybe you know the joke where a young software engineer goes into a bar, puts a green frog on top of the bar counter, and the frog says: "Kiss me, I'm an enchanted princess." The barkeeper is fascinated and recommends the software engineer kiss the frog, but he just replies: "I have no time for a girlfriend, and a talking frog is cool!" We love cool things, difficult technical puzzles, and sometimes challenging bugs. But we should be careful with our motivation when we make technical decisions. I have seen, and will see, a lot of stupid decisions from engineers and/or managers. The following list of three decision anti-patterns is neither complete nor scientifically verifiable, but I think you may recognize some of them in your organization.

1) The Swarm-Foolishness-Design-Decision

Often, a group of humans is no more intelligent than each single team member. I know that this is a provocative statement, but it's a matter of fact that team dynamics can be a killer of intelligent decisions. If a group of people discusses a problem and everybody's input is heard, it is very likely that the person who is most sure that he/she knows the right solution will win the team's acceptance. People who are convinced that their opinion is the correct solution are so dominant that even people who know better become uncertain and simply back away from their opinion. The only way to avoid this pitfall is to believe nothing. Try to find a proof, a sample, some measurement, program spike solutions, and/or find reliable case studies. Software engineering is a discipline that should be founded on knowledge, not on belief.

2) The Manager-Design-Decision

Unfortunately, relatively often the people who are perfectly sure that they have the right solution are managers. A lot of organizations promote people who are self-confident and good at solving problems. Over time these managers lose their technical competence; they just stop being up-to-date.
Some are still convinced that they know everything better than their team members. This can be a great problem in cases where the manager is fooled into believing that a manager is automatically a good designer. In the case that your manager permanently overrules the team and makes stupid design decisions, maybe it's a good idea to find a better manager. Or you could print this article, use a yellow text marker to highlight the right paragraph, and drop it on the desk of your boss. At least you will have some fun, and there is a little chance that he/she will improve.

3) The Big-Mac-Menu-Design-Decision

A Big Mac is a product with a consistent standard of quality in space and time over the past 40 years: just small variations in nutritional values between countries [1] and almost no variation within a country. But this doesn't say anything about taste and/or healthiness. Large organizations tend to have a (big) enterprise architecture department which produces a kind of (big) standard architecture. Independent of the selected technology, a standard architecture usually has a lot of overhead (because it should fit every project in the company) and/or it is outdated before being published for the first time. It can be dangerous to use a standard architecture in inappropriate circumstances, and it is difficult to decide for alternative solutions. Please try to courageously raise your voice if the standard architecture doesn't fit your task. The enterprise architecture department should work for us, not vice versa.

Reference: Stupid Design Decisions – 'I have no time for a girlfriend and a talking frog is cool!' from our JCG partner Markus Sprunck at the Software Engineering Candies blog.

Custom JSF validator for required fields

JSF components implementing the EditableValueHolder interface have two attributes, 'required' and 'requiredMessage' – a flag indicating that the user is required to input/select a non-empty value, and a text for the validation message. We can use that, but it's not flexible enough: we can't parameterize the message directly in the view (Facelets or JSP), and we have to do extra work for proper message customization. What about a custom validator attached to any required field? We will write one. First we need to register such a validator in a tag library.

<?xml version='1.0'?>
<facelet-taglib version='2.0' ... >
  <namespace>http://ip.client/ip-jsftoolkit/validator</namespace>
  <tag>
    <tag-name>requiredFieldValidator</tag-name>
    <validator>
      <validator-id>ip.client.jsftoolkit.RequiredFieldValidator</validator-id>
    </validator>
    <attribute>
      <description>Resource bundle name for the required message</description>
      <name>bundle</name>
      <required>false</required>
      <type>java.lang.String</type>
    </attribute>
    <attribute>
      <description>Key of the required message in the resource bundle</description>
      <name>key</name>
      <required>false</required>
      <type>java.lang.String</type>
    </attribute>
    <attribute>
      <description>Label string for the required message</description>
      <name>label</name>
      <required>false</required>
      <type>java.lang.String</type>
    </attribute>
  </tag>
</facelet-taglib>

We defined three attributes in order to achieve high flexibility. A simple usage would be:

<h:outputLabel for="myInput" value="#{text['myinput']}"/>
<h:inputText id="myInput" value="...">
  <jtv:requiredFieldValidator label="#{text['myinput']}"/>
</h:inputText>

The validator class itself is not difficult. Depending on the 'key' parameter (key of the required message) and the 'label' parameter (text of the corresponding label), there are four cases for how the message gets acquired.

/**
 * Validator for required fields.
 */
@FacesValidator(value = RequiredFieldValidator.VALIDATOR_ID)
public class RequiredFieldValidator implements Validator {

    /** validator id */
    public static final String VALIDATOR_ID = "ip.client.jsftoolkit.RequiredFieldValidator";

    /** default bundle name */
    public static final String DEFAULT_BUNDLE_NAME = "ip.client.jsftoolkit.validator.message";

    private String bundle;
    private String key;
    private String label;

    @Override
    public void validate(FacesContext facesContext, UIComponent component, Object value)
            throws ValidatorException {
        if (!UIInput.isEmpty(value)) {
            return;
        }

        String message;
        String bundleName;

        if (bundle == null) {
            bundleName = DEFAULT_BUNDLE_NAME;
        } else {
            bundleName = bundle;
        }

        if (key == null && label == null) {
            message = MessageUtils.getMessageText(
                MessageUtils.getResourceBundle(facesContext, bundleName),
                "jsftoolkit.validator.emptyMandatoryField.1");
        } else if (key == null && label != null) {
            message = MessageUtils.getMessageText(
                MessageUtils.getResourceBundle(facesContext, bundleName),
                "jsftoolkit.validator.emptyMandatoryField.2", label);
        } else if (key != null && label == null) {
            message = MessageUtils.getMessageText(
                MessageUtils.getResourceBundle(facesContext, bundleName), key);
        } else {
            message = MessageUtils.getMessageText(
                MessageUtils.getResourceBundle(facesContext, bundleName), key, label);
        }

        throw new ValidatorException(new FacesMessage(FacesMessage.SEVERITY_WARN,
                message, StringUtils.EMPTY));
    }

    // getter / setter ...
}

MessageUtils is a utility class to get the ResourceBundle and the message text. We also need two texts in the resource bundle (property file):

jsftoolkit.validator.emptyMandatoryField.1=Some required field is not filled in.
jsftoolkit.validator.emptyMandatoryField.2=The required field '{0}' is not filled in.

and the following context parameter in web.xml:

<context-param>
  <param-name>javax.faces.VALIDATE_EMPTY_FIELDS</param-name>
  <param-value>true</param-value>
</context-param>

This solution is not ideal because we need to define the label text (like #{text['myinput']}) twice and to attach the validator to each field to be validated. A better, generic validator for multiple fields will be presented in the next post. Stay tuned!

Reference: Custom JSF validator for required fields from our JCG partner Oleg Varaksin at the Thoughts on software development blog.

Akka STM – Playing PingPong with STM Refs and Agents

PingPong is a classic example where two players (or threads) access a shared resource – the PingPong table – and pass the ball (a state variable) between each other. With any shared resource, unless we synchronize the access, the threads can run into a potential deadlock situation.

The PingPong algorithm is very simple:

if my turn {
    update whose turn is next
    ping/pong - log the hit
    notify other threads
} else {
    wait for notification
}

Let's take an example and see how this works! Here is our Player class, which implements Runnable and takes in the access to the shared resource and a message:

public class Player implements Runnable {

    PingPong myTable;    // table where they play
    String myOpponent;

    public Player(String opponent, PingPong table) {
        myTable = table;
        myOpponent = opponent;
    }

    public void run() {
        while (myTable.hit(myOpponent))
            ;
    }
}

Second, we see the PingPong table class, which has a synchronized method hit() where a check is made whether it is my turn or not. If it is my turn, log the ping and update the shared variable with the opponent's name.

public class PingPong {

    // state variable identifying whose turn it is
    private String whoseTurn = null;

    public synchronized boolean hit(String opponent) {
        String x = Thread.currentThread().getName();
        if (whoseTurn == null) {
            // first hit: claim the turn
            whoseTurn = x;
            return true;
        }
        if (whoseTurn.equals("DONE")) {
            return false; // game is over, quit the thread
        }
        if (opponent.equals("DONE")) {
            // end the game and wake up any waiting player
            whoseTurn = opponent;
            notifyAll();
            return false;
        }
        if (x.compareTo(whoseTurn) == 0) {
            System.out.println("PING! (" + x + ")");
            whoseTurn = opponent;
            notifyAll();
        } else {
            try {
                wait(2500);
            } catch (InterruptedException e) {
            }
        }
        return true; // keep playing
    }
}

Next, we start the game and get the players going!

public class Game {

    public static void main(String args[]) {
        PingPong table = new PingPong();

        Thread alice = new Thread(new Player("bob", table));
        Thread bob = new Thread(new Player("alice", table));

        alice.setName("alice");
        bob.setName("bob");

        alice.start(); // alice starts playing
        bob.start();   // bob starts playing

        try {
            // wait 5 seconds
            Thread.sleep(5000);
        } catch (InterruptedException e) {
        }

        table.hit("DONE"); // cause the players to quit their threads

        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
        }
    }
}

That's all; we have our PingPong game running. In this case, we saw how the synchronized method hit() allows only one thread at a time to access the shared resource – whoseTurn.

Akka STM provides two constructs, Refs and Agents. Refs (Transactional References) provide coordinated, synchronous access to multiple identities. Agents provide uncoordinated, asynchronous access to a single identity.

Refs

In our case, since the shared state variable is a single identity, usage of Refs is overkill, but we will still go ahead and see how they are used.

public class PingPong {

    // updates to Ref.View are synchronous
    Ref.View<String> whoseTurn;

    public PingPong(Ref.View<String> player) {
        whoseTurn = player;
    }

    public boolean hit(final String opponent) {
        final String x = Thread.currentThread().getName();
        if (x.compareTo(whoseTurn.get()) == 0) {
            System.out.println("PING! (" + x + ")");
            whoseTurn.set(opponent);
        } else {
            try {
                wait(2500);
            } catch (Exception e) {
            }
        }
        return true; // keep playing
    }
}

The key points here are the following:

- The synchronized keyword is missing.
- The state variable is defined as a Ref:

// updates to Ref.View are synchronous
Ref.View<String> whoseTurn;

- Calls to update the Ref are coordinated and synchronous:

whoseTurn.set(opponent);

So, when we use the Ref to hold the state, access to the Ref is automatically synchronized in a transaction.

Agents

Since Agents provide uncoordinated asynchronous access, using Agents for state manipulation means that we need to wait until all the updates have been applied to the Agent. Agents provide non-blocking access for gets.
public class PingPong {
    Agent<String> whoseTurn;

    public PingPong(Agent<String> player) {
        whoseTurn = player;
    }

    public boolean hit(final String opponent) {
        final String x = Thread.currentThread().getName();
        // wait till all the queued messages are processed, so that we read
        // the correct value – updates to Agents are asynchronous
        String result = whoseTurn.await(new Timeout(5, SECONDS));
        if (x.compareTo(result) == 0) {
            System.out.println("PING! (" + x + ")");
            whoseTurn.send(opponent);
        } else {
            try {
                Thread.sleep(100);  // not our turn yet; poll again shortly
            } catch (Exception e) {
            }
        }
        return true;                // keep playing
    }
}

The keys here are the following:

- The synchronized keyword is missing.
- The state variable is defined as an Agent:
  Agent<String> whoseTurn;
- Reads wait for pending updates, as updates to an Agent are asynchronous:
  String result = whoseTurn.await(new Timeout(5, SECONDS));
- Calls that update the Agent are uncoordinated and asynchronous:
  whoseTurn.send(opponent);

All the code referred to in these examples is available at https://github.com/write2munish/Akka-Essentials/tree/master/AkkaSTMExample/src/main/java/org/akka/essentials/stm/pingpong with

Example 1 – normal thread-based synchronization
Example 2 – usage of Refs for synchronization
Example 3 – usage of Agents for synchronization

Reference: Playing PingPong with STM – Refs and Agents from our JCG partner Munish K Gupta at the Akka Essentials blog....
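For comparison with the lock-based and STM versions above, the same turn hand-off can also be sketched with plain JDK atomics: a single compareAndSet on the turn variable checks whose turn it is and passes the ball in one atomic step. This is only an illustrative sketch – the CasPingPong class and its names are ours, not part of Akka or the article's example repository.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;

public class CasPingPong {
    // Whose turn it is; the compareAndSet succeeds only for the named player.
    static final AtomicReference<String> turn = new AtomicReference<>("alice");
    // Total number of successful hits across both players.
    static final AtomicInteger rallies = new AtomicInteger();

    // Atomically verify it is 'me's turn and hand the ball over to 'next'.
    static boolean hit(String me, String next) {
        if (turn.compareAndSet(me, next)) {
            rallies.incrementAndGet();
            return true;
        }
        return false; // not my turn; the caller spins and retries
    }

    public static void main(String[] args) throws InterruptedException {
        Thread alice = new Thread(() -> { for (int i = 0; i < 50; ) if (hit("alice", "bob")) i++; });
        Thread bob   = new Thread(() -> { for (int i = 0; i < 50; ) if (hit("bob", "alice")) i++; });
        alice.start(); bob.start();
        alice.join(); bob.join();
        System.out.println("rallies = " + rallies.get()); // prints: rallies = 100
    }
}
```

Because a hit succeeds only when the turn variable holds the caller's name, the two threads alternate strictly without any lock or wait/notify pair; the trade-off is that a waiting player busy-spins instead of blocking.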
spring-logo

REST CXF for Spring JPA2 backend

In this demo, we will generate a REST/CXF application with a Spring/JPA2 backend. This demo presents the REST-CXF track of minuteproject. The model from demo 2 remains the same. The enrichment stays the same, but the tracks change. What is added are 2 layers:

a DAO layer with Spring integration on top of JPA2
a REST-CXF layer with JAX-RS annotations

The JPA2 entities are annotated with JAXB annotations. All this is done to provide a CRUD interface on top of the model entities.

Configuration

Here is the configuration TRANXY-JPA2-Spring-REST-CXF.xml

<!DOCTYPE root> <generator-config xmlns="http://minuteproject.sf.net/xsd/mp-config" xmlns:xs="http://www.w3.org/2001/XMLSchema-instance" xs:noNamespaceSchemaLocation="../config/mp-config.xsd"> <configuration> <conventions> <target-convention type="enable-updatable-code-feature" /> </conventions> <model name="tranxy" version="1.0" package-root="net.sf.mp.demo"> <data-model> <driver name="mysql" version="5.1.16" groupId="mysql" artifactId="mysql-connector-java"></driver> <dataSource> <driverClassName>org.gjt.mm.mysql.Driver</driverClassName> <url>jdbc:mysql://127.0.0.1:3306/tranxy</url> <username>root</username> <password>mysql</password> </dataSource> <primaryKeyPolicy oneGlobal="false" > <primaryKeyPolicyPattern name="autoincrementPattern"></primaryKeyPolicyPattern> </primaryKeyPolicy> </data-model> <business-model> <generation-condition> <condition type="exclude" startsWith="QUARTZ"></condition> </generation-condition> <business-package default="tranxy"> <condition type="package" startsWith="trans" result="translation"></condition> </business-package> <enrichment> <conventions> <!-- manipulate the structure and entities BEFORE manipulating the entities --> <column-naming-convention type="apply-strip-column-name-suffix" pattern-to-strip="ID" /> <reference-naming-convention type="apply-referenced-alias-when-no-ambiguity" is-to-plurialize="true" /> </conventions> <entity name="language_x_translator"> <field name="language_id"
linkReferenceAlias="translating_language" /> <field name="user_id" linkReferenceAlias="translator" /> </entity> <entity name="LANGUAGE_X_SPEAKER"> <field name="LANGUAGE_ID" linkToTargetEntity="LANGUAGE" linkToTargetField="IDLANGUAGE" linkReferenceAlias="spoken_language" /> <field name="user_id" linkReferenceAlias="speaker" /> </entity> <entity name="APPLICATION"> <field name="TYPE"> <property tag="checkconstraint" alias="application_type"> <property name="OPENSOURCE"/> <property name="COPYRIGHT" /> </property> </field> </entity> </enrichment> </business-model> </model> <targets> <target refname="REST-CXF-BSLA" name="default" fileName="mp-template-config-REST-CXF-Spring.xml" outputdir-root="../../DEV/latvianjug/tranxy/rest" templatedir-root="../../template/framework/cxf"> </target><target refname="BackendOnBsla" name="default" fileName="mp-template-config-JPA2-bsla.xml" outputdir-root="../../DEV/latvianjug/tranxy/bsla" templatedir-root="../../template/framework/bsla"> <property name="add-cache-implementation" value="ehcache"></property> </target> <target refname="JPA2" fileName="mp-template-config-JPA2.xml" outputdir-root="../../DEV/latvianjug/tranxy/jpa" templatedir-root="../../template/framework/jpa"> <property name="add-querydsl" value="2.1.2"></property> <property name="add-jpa2-implementation" value="hibernate"></property> <property name="add-cache-implementation" value="ehcache"></property> <property name="add-domain-specific-method" value="true"></property> <property name="add-xmlbinding" value="true"></property> <property name="add-xml-format" value="lowercase-hyphen"></property> </target> <target refname="MavenMaster" name="maven" fileName="mp-template-config-maven.xml" outputdir-root="../../DEV/latvianjug/tranxy" templatedir-root="../../template/framework/maven"> </target><target refname="CACHE-LIB" fileName="mp-template-config-CACHE-LIB.xml" templatedir-root="../../template/framework/cache"> </target> <target refname="LIB" 
fileName="mp-template-config-bsla-LIB-features.xml" templatedir-root="../../template/framework/bsla"> </target><target refname="REST-LIB" fileName="mp-template-config-REST-LIB.xml" templatedir-root="../../template/framework/rest"> </target> <target refname="SPRING-LIB" fileName="mp-template-config-SPRING-LIB.xml" templatedir-root="../../template/framework/spring"> </target></targets> </configuration> </generator-config>

Todo explanations

Generation

Set TRANXY-JPA2-Spring-REST-CXF.xml in /mywork/config
Run >model-generation.cmd TRANXY-JPA2-Spring-REST-CXF.xml
The output goes in /dev/latvianjug/tranxy

Resulting artefacts

A maven project structure with 3 modules:
JPA2 layer
Spring DAO layer
CXF layer

The JPA2 layer has been visited in Demo 1 and Demo 2.

Spring DAO layer

It consists of transactional services, one for each entity, and a CRUD DAO layer on top of JPA2. This layer is called BSLA (Basic Spring Layer Architecture). Two interfaces and their implementations are generated for each entity. Example for the Translation entity DAO interfaces:

/** * Copyright (c) minuteproject, minuteproject@gmail.com * All rights reserved. * * Licensed under the Apache License, Version 2.0 (the "License") * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License.
* * More information on minuteproject: * twitter @minuteproject * wiki http://minuteproject.wikispaces.com * blog http://minuteproject.blogspot.net * */ /** * template reference : * - name : BslaDaoInterfaceUML * - file name : BslaDaoInterfaceUML.vm */ package net.sf.mp.demo.tranxy.dao.face.translation;import net.sf.mp.demo.tranxy.domain.translation.Translation; import java.util.List; import net.sf.minuteProject.architecture.bsla.bean.criteria.PaginationCriteria; import net.sf.minuteProject.architecture.bsla.dao.face.DataAccessObject;/** * * <p>Title: TranslationDao</p> * * <p>Description: Interface of a Data access object dealing with Translation * persistence. It offers a set of methods which allow for saving, * deleting and searching translation objects</p> * */ public interface TranslationDao extends DataAccessObject {/** * Inserts a Translation entity * @param Translation translation */ public void insertTranslation(Translation translation) ; /** * Inserts a list of Translation entity * @param List<Translation> translations */ public void insertTranslations(List<Translation> translations) ; /** * Updates a Translation entity * @param Translation translation */ public Translation updateTranslation(Translation translation) ;/** * Updates a Translation entity with only the attributes set into Translation. * The primary keys are to be set for this method to operate. * This is a performance friendly feature, which removes the ubiquitous full load and full update when an * update is to be done. * Remark: The primary keys cannot be updated by this method, nor can attributes be set to null.
* @param Translation translation */ public int updateNotNullOnlyTranslation(Translation translation) ; public int updateNotNullOnlyPrototypeTranslation(Translation translation, Translation prototypeCriteria); /** * Saves a Translation entity * @param Translation translation */ public void saveTranslation(Translation translation); /** * Deletes a Translation entity * @param Translation translation */ public void deleteTranslation(Translation translation) ; /** * Loads the Translation entity which is related to an instance of * Translation * @param Long id * @return Translation The Translation entity public Translation loadTranslation(Long id); */ /** * Loads the Translation entity which is related to an instance of * Translation * @param java.lang.Long Id * @return Translation The Translation entity */ public Translation loadTranslation(java.lang.Long id);/** * Loads a list of Translation entity * @param List<java.lang.Long> ids * @return List<Translation> The Translation entity */ public List<Translation> loadTranslationListByTranslation (List<Translation> translations); /** * Loads a list of Translation entity * @param List<java.lang.Long> ids * @return List<Translation> The Translation entity */ public List<Translation> loadTranslationListById(List<java.lang.Long> ids); /** * Loads the Translation entity which is related to an instance of * Translation and its dependent one to many objects * @param Long id * @return Translation The Translation entity */ public Translation loadFullFirstLevelTranslation(java.lang.Long id); /** * Loads the Translation entity which is related to an instance of * Translation * @param Translation translation * @return Translation The Translation entity */ public Translation loadFullFirstLevelTranslation(Translation translation); /** * Loads the Translation entity which is related to an instance of * Translation and its dependent objects one to many * @param Long id * @return Translation The Translation entity */ public Translation 
loadFullTranslation(Long id) ;/** * Searches a list of Translation entity based on a Translation containing Translation matching criteria * @param Translation translation * @return List<Translation> */ public List<Translation> searchPrototypeTranslation(Translation translation) ; /** * Searches a list of Translation entity based on a list of Translation containing Translation matching criteria * @param List<Translation> translations * @return List<Translation> */ public List<Translation> searchPrototypeTranslation(List<Translation> translations) ; /** * Searches a list of Translation entity * @param Translation translation * @return List */ public List<Translation> searchPrototypeTranslation(Translation translationPositive, Translation translationNegative) ; /** * Load a paginated list of Translation entity dependent of pagination criteria * @param PaginationCriteria paginationCriteria * @return List */ public List<Translation> loadPaginatedTranslation (Translation translation, PaginationCriteria paginationCriteria) ; }/** * Copyright (c) minuteproject, minuteproject@gmail.com * All rights reserved. * * Licensed under the Apache License, Version 2.0 (the "License") * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. 
* * More information on minuteproject: * twitter @minuteproject * wiki http://minuteproject.wikispaces.com * blog http://minuteproject.blogspot.net * */ /** * template reference : * - name : BslaDaoInterfaceExtendedUML * - file name : BslaDaoInterfaceKFUML.vm */ package net.sf.mp.demo.tranxy.dao.face.translation;import net.sf.mp.demo.tranxy.domain.translation.Translation; import java.util.List; import net.sf.minuteProject.architecture.filter.data.Criteria; import net.sf.minuteProject.architecture.bsla.dao.face.DataAccessObject;/** * * <p>Title: TranslationExtDao</p> * * <p>Description: Interface of a Data access object dealing with TranslationExtDao * persistence. It offers extended DAO functionalities</p> * */ public interface TranslationExtDao extends DataAccessObject { /** * Inserts a Translation entity with cascade of its children * @param Translation translation */ public void insertTranslationWithCascade(Translation translation) ; /** * Inserts a list of Translation entity with cascade of its children * @param List<Translation> translations */ public void insertTranslationsWithCascade(List<Translation> translations) ; /** * lookup Translation entity Translation, criteria and max result number */ public List<Translation> lookupTranslation(Translation translation, Criteria criteria, Integer numberOfResult); public Integer updateNotNullOnlyTranslation (Translation translation, Criteria criteria);/** * Affect the first translation retrieved corresponding to the translation criteria. * Blank criteria are mapped to null. * If no criteria is found, null is returned. */ public Translation affectTranslation (Translation translation); public Translation affectTranslationUseCache (Translation translation); /** * Assign the first translation retrieved corresponding to the translation criteria. * Blank criteria are mapped to null. * If no criteria is found, null is returned. * If there is no translation corresponding in the database. 
Then translation is inserted and returned with its primary key(s). */ public Translation assignTranslation (Translation translation);/** * Assign the first translation retrieved corresponding to the mask criteria. * Blank criteria are mapped to null. * If no criteria is found, null is returned. * If there is no translation corresponding in the database. * Then translation is inserted and returned with its primary key(s). * Mask servers usually to set unique keys or the semantic reference */ public Translation assignTranslation (Translation translation, Translation mask); public Translation assignTranslationUseCache (Translation translation); /** * return the first Translation entity found */ public Translation getFirstTranslation (Translation translation); /** * checks if the Translation entity exists */ public boolean existsTranslation (Translation translation); public boolean existsTranslationWhereConditionsAre (Translation translation);/** * partial load enables to specify the fields you want to load explicitly */ public List<Translation> partialLoadTranslation(Translation translation, Translation positiveTranslation, Translation negativeTranslation);/** * partial load with parent entities * variation (list, first, distinct decorator) * variation2 (with cache) */ public List<Translation> partialLoadWithParentTranslation(Translation translation, Translation positiveTranslation, Translation negativeTranslation);public List<Translation> partialLoadWithParentTranslationUseCache(Translation translation, Translation positiveTranslation, Translation negativeTranslation, Boolean useCache);public List<Translation> partialLoadWithParentTranslationUseCacheOnResult(Translation translation, Translation positiveTranslation, Translation negativeTranslation, Boolean useCache);/** * variation first */ public Translation partialLoadWithParentFirstTranslation(Translation translationWhat, Translation positiveTranslation, Translation negativeTranslation); public Translation 
partialLoadWithParentFirstTranslationUseCache(Translation translationWhat, Translation positiveTranslation, Translation negativeTranslation, Boolean useCache);public Translation partialLoadWithParentFirstTranslationUseCacheOnResult(Translation translationWhat, Translation positiveTranslation, Translation negativeTranslation, Boolean useCache);/** * variation distinct */ public List<Translation> getDistinctTranslation(Translation translationWhat, Translation positiveTranslation, Translation negativeTranslation);// public List partialLoadWithParentForBean(Object bean, Translation translation, Translation positiveTranslation, Translation negativeTranslation);/** * search on prototype with cache */ public List<Translation> searchPrototypeWithCacheTranslation (Translation translation); /** * Searches a list of distinct Translation entity based on a Translation mask and a list of Translation containing Translation matching criteria * @param Translation translation * @param List<Translation> translations * @return List<Translation> */ public List<Translation> searchDistinctPrototypeTranslation(Translation translationMask, List<Translation> translations) ;public List<Translation> countDistinct (Translation whatMask, Translation whereEqCriteria); public Long count (Translation whereEqCriteria); public List<Translation> loadGraph(Translation graphMaskWhat, List<Translation> whereMask); public List<Translation> loadGraphFromParentKey (Translation graphMaskWhat, List<Translation> parents); /** * generic to move after in superclass */ public List<Object[]> getSQLQueryResult(String query); }DAO implementations TranslationJPAImpl and TranslationJPAExtImpl (code not copied). In the future Generic DAO will be used for cross-entity redundant aspects. 
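The updateNotNullOnly* methods in the interfaces above apply only the attributes actually set on the passed entity. The core idea can be sketched in plain Java with reflection – NotNullCopier and its toy Translation below are hypothetical stand-ins for illustration, not minuteproject's generated code:

```java
import java.lang.reflect.Field;

public class NotNullCopier {
    // Toy entity standing in for a generated Translation (illustrative only).
    static class Translation {
        Long id;
        String translation;
        Integer language;
    }

    // Copy only the non-null fields of src onto dest; returns how many fields were applied.
    static <T> int copyNotNull(T src, T dest) {
        int copied = 0;
        for (Field f : src.getClass().getDeclaredFields()) {
            f.setAccessible(true);
            try {
                Object v = f.get(src);
                if (v != null) {
                    f.set(dest, v);
                    copied++;
                }
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return copied;
    }

    public static void main(String[] args) {
        Translation stored = new Translation();
        stored.id = 1L;
        stored.translation = "nom";
        stored.language = 1;

        Translation patch = new Translation();
        patch.translation = "nombre"; // the only attribute set on the prototype

        int n = copyNotNull(patch, stored);
        System.out.println(n + " field(s) updated, id=" + stored.id); // prints: 1 field(s) updated, id=1
    }
}
```

Applying the "only non-null fields" rule this way avoids the full load followed by a full update that the interface comment warns about; here it is simply demonstrated on an in-memory object.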
Adaptation to Spring 3.x will be performed (i.e. no more JPASupport extension, but EntityManager injection). Meanwhile, the code above works fine with Spring 2.5+.

Spring configurations

spring-config-Tranxy-BE-main.xml

<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd"><beans><!-- Dao JPA --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/spring-config-JPA-Tranxy-dao.xml"/><!--MP-MANAGED-UPDATABLE-BEGINNING-DISABLE @JPAtranxyFactory-tranxy@--> <!-- hibernate config to put in a separate config file--> <bean id="JPAtranxyFactory" autowire="byName" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"> <!-- all connection information is retrieved from the persistence file--> <!-- <property name="dataSource" ref="..."/> <property name="persistenceUnitName" value="..."/> --> <property name="persistenceXmlLocation" value="classpath:META-INF/persistence.xml" /> </bean> <!--MP-MANAGED-UPDATABLE-ENDING--> <!-- Database --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/spring-config-Tranxy-database.xml"/></beans>

spring-config-Tranxy-database.xml

<?xml version="1.0" encoding="UTF-8"?><beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:jndi="http://www.springframework.org/schema/jee" xmlns:tx="http://www.springframework.org/schema/tx" xmlns:aop="http://www.springframework.org/schema/aop" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-3.0.xsd http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.0.xsd http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-3.0.xsd"><bean id="placeHolderConfig"
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"> <property name="location"><value>classpath:net/sf/mp/demo/tranxy/factory/spring/spring-config-Tranxy.properties</value></property> </bean> <bean id="tranxyTransactionManager" class="org.springframework.orm.jpa.JpaTransactionManager"> <property name="entityManagerFactory" ref="JPAtranxyFactory"/> </bean><!-- to get the entity manager --> <tx:annotation-driven transaction-manager="tranxyTransactionManager"/> </beans>

spring-config-Tranxy.properties

jdbc.tranxy.driverClassName=org.gjt.mm.mysql.Driver
jdbc.tranxy.url=jdbc:mysql://127.0.0.1:3306/tranxy
jdbc.tranxy.username=root
jdbc.tranxy.password=mysql
jdbc.tranxy.jndi=jdbc/tranxy
hibernate.dialect=org.hibernate.dialect.MySQLDialect

spring-config-JPA-Tranxy-dao.xml

<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd"><beans><!-- Import Dao definitions for business components --><!-- tranxy --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/tranxy/dao-JPA-Tranxy.xml"/> <!-- translation --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/translation/dao-JPA-Translation.xml"/><!-- Import Ext Dao definitions for business components --> <!-- tranxy extended dao --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/tranxy/dao-ext-JPA-Tranxy.xml"/> <!-- translation extended dao --> <import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/translation/dao-ext-JPA-Translation.xml"/></beans>

dao-JPA-Translation.xml

<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd"> <beans><bean id="translationDao" class="net.sf.mp.demo.tranxy.dao.impl.jpa.translation.TranslationJPAImpl" singleton="false" > <property name="entityManagerFactory"><ref bean="JPAtranxyFactory"/></property> </bean> <bean id="translationKeyDao"
class="net.sf.mp.demo.tranxy.dao.impl.jpa.translation.TranslationKeyJPAImpl" singleton="false" > <property name="entityManagerFactory"><ref bean="JPAtranxyFactory"/></property> </bean> <bean id="translationRequestDao" class="net.sf.mp.demo.tranxy.dao.impl.jpa.translation.TranslationRequestJPAImpl" singleton="false" > <property name="entityManagerFactory"><ref bean="JPAtranxyFactory"/></property> </bean></beans>

It is the same for the dao-ext-JPA-Translation.xml, dao-ext-JPA-Tranxy.xml and dao-JPA-Tranxy.xml files. But wait a minute… how can I unit test? You need two other artifacts before writing your own test. One is persistence.xml… Again? Yes, with an embedded connection pool, because the one shipped with the build of your JPA2 layer may refer to a JNDI DataSource (in case the property environment is set to remote). Since it is under /src/test/resources/META-INF, it will override the one in the JPA2 package. Two is an adapter that extends AbstractTransactionalJUnit4SpringContextTests; it is generated in /src/test/java

package net.sf.mp.demo.tranxy.dao.face;import javax.sql.DataSource;import org.apache.commons.lang.StringUtils; import org.junit.runner.RunWith; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.annotation.Qualifier; import org.springframework.jdbc.core.simple.SimpleJdbcTemplate; import org.springframework.test.context.ContextConfiguration; import org.springframework.test.context.junit4.AbstractTransactionalJUnit4SpringContextTests; import org.springframework.test.context.junit4.SpringJUnit4ClassRunner; import org.springframework.test.context.transaction.TransactionConfiguration; import org.springframework.transaction.annotation.Transactional;@RunWith(SpringJUnit4ClassRunner.class) @ContextConfiguration(locations={ "classpath:net/sf/mp/demo/tranxy/factory/spring/spring-config-Tranxy-BE-main.xml" }) @TransactionConfiguration(transactionManager = "tranxyTransactionManager") @Transactional public class
AdapterTranxyTestDao extends AbstractTransactionalJUnit4SpringContextTests {@Override @Autowired public void setDataSource(@Qualifier(value = "tranxyDataSource") DataSource dataSource) { this.simpleJdbcTemplate = new SimpleJdbcTemplate(dataSource); } ...

CXF layer

Each entity has a REST resource artifact with JAX-RS annotations to enable CRUD access. Example with Translation:

/** * Copyright (c) minuteproject, minuteproject@gmail.com * All rights reserved. * * Licensed under the Apache License, Version 2.0 (the "License") * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. * * More information on minuteproject: * twitter @minuteproject * wiki http://minuteproject.wikispaces.com * blog http://minuteproject.blogspot.net * */ /** * template reference : * - name : CXFSpringEntityResource * - file name : CXFSpringEntityResource.vm */ package net.sf.mp.demo.tranxy.rest.translation;import java.util.Date; import java.util.List; import java.util.ArrayList; import java.io.*; import java.sql.*;import javax.servlet.http.*;import org.springframework.beans.factory.annotation.Autowired; import org.springframework.beans.factory.annotation.Qualifier; import org.springframework.stereotype.Service; import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Transactional;import javax.ws.rs.Path; import javax.ws.rs.PathParam; import javax.ws.rs.FormParam; import javax.ws.rs.Consumes; import javax.ws.rs.POST; import javax.ws.rs.DELETE; import javax.ws.rs.GET; import javax.ws.rs.PUT; import javax.ws.rs.Produces; import
javax.ws.rs.core.Context; import javax.ws.rs.core.MediaType; import javax.ws.rs.core.Request; import javax.ws.rs.core.Response; import javax.ws.rs.core.UriInfo; import javax.xml.bind.JAXBElement;import net.sf.mp.demo.tranxy.dao.face.translation.TranslationDao; import net.sf.mp.demo.tranxy.dao.face.translation.TranslationExtDao; import net.sf.mp.demo.tranxy.domain.translation.Translation;/** * * <p>Title: TranslationResource</p> * * <p>Description: remote interface for TranslationResource service </p> * */ @Path ("/rest/xml/translations") @Produces ({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) @Consumes ({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) @Service @Transactional public class TranslationResource {@Autowired @Qualifier("translationDao") TranslationDao translationDao; @Autowired @Qualifier("translationExtDao") TranslationExtDao translationExtDao;//MP-MANAGED-UPDATABLE-BEGINNING-DISABLE @FIND_ALL-translation@ @GET @Produces ({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) public List<Translation> findAll () { List<Translation> r = new ArrayList<Translation>(); List<Translation> l = translationDao.searchPrototypeTranslation(new Translation()); for (Translation translation : l) { r.add(translation.flat()); } return r; } //MP-MANAGED-UPDATABLE-ENDING//MP-MANAGED-UPDATABLE-BEGINNING-DISABLE @FIND_BY_ID-translation@ @GET @Path("{id}") @Produces ({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) public Translation findById (@PathParam ("id") java.lang.Long id) { Translation _translation = new Translation (); _translation.setId(id); _translation = translationExtDao.getFirstTranslation(_translation); if (_translation!=null) return _translation.flat(); return new Translation (); } //MP-MANAGED-UPDATABLE-ENDING@DELETE @Path("{id}") public void delete (@PathParam ("id") Long id) { Translation translation = new Translation (); translation.setId(id); translationDao.deleteTranslation(translation); }@POST @Produces 
({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) @Consumes(MediaType.APPLICATION_FORM_URLENCODED) public Translation create ( @FormParam("id") Long id, @FormParam("translation") String translation, @FormParam("language") Integer language, @FormParam("key") Long key, @FormParam("isFinal") Short isFinal, @FormParam("dateFinalization") Date dateFinalization, @FormParam("translator") Long translator, @Context HttpServletResponse servletResponse ) throws IOException { Translation _translation = new Translation ( id, translation, language, key, isFinal, dateFinalization, translator); return save(_translation); }@PUT @Consumes({MediaType.APPLICATION_XML, MediaType.APPLICATION_JSON}) public Translation save(JAXBElement<Translation> jaxbTranslation) { Translation translation = jaxbTranslation.getValue(); if (translation.getId()!=null) return translationDao.updateTranslation(translation); return save(translation); }public Translation save (Translation translation) { translationDao.saveTranslation(translation); return translation; }}And two files for the web application and spring in /src/main/resources/webapp/WEB-INF Web.xml <?xml version="1.0" encoding="UTF-8"?> <web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"><display-name>tranxy CXF REST</display-name> <description>tranxy CXF REST access</description><context-param> <param-name>contextConfigLocation</param-name> <param-value>/WEB-INF/application-context.xml</param-value> </context-param><listener> <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class> </listener><servlet> <servlet-name>CXFServlet</servlet-name> <servlet-class>org.apache.cxf.transport.servlet.CXFServlet</servlet-class> <load-on-startup>1</load-on-startup> </servlet><servlet-mapping> <servlet-name>CXFServlet</servlet-name> 
<url-pattern>/*</url-pattern> </servlet-mapping></web-app> application-context.xml <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:context="http://www.springframework.org/schema/context" xmlns:tx="http://www.springframework.org/schema/tx" xmlns:jaxrs="http://cxf.apache.org/jaxrs" xsi:schemaLocation="http://www.springframework.org/schema/beanshttp://www.springframework.org/schema/beans/spring-beans-3.0.xsdhttp://www.springframework.org/schema/contexthttp://www.springframework.org/schema/context/spring-context-3.0.xsdhttp://www.springframework.org/schema/txhttp://www.springframework.org/schema/tx/spring-tx-3.0.xsdhttp://cxf.apache.org/jaxrshttp://cxf.apache.org/schemas/jaxrs.xsd"><import resource="classpath:META-INF/cxf/cxf.xml" /> <import resource="classpath:META-INF/cxf/cxf-extension-jaxrs-binding.xml" /> <import resource="classpath:META-INF/cxf/cxf-servlet.xml" /> <context:component-scan base-package="net.sf.mp.demo.tranxy.rest"/><import resource="classpath:net/sf/mp/demo/tranxy/factory/spring/spring-config-Tranxy-BE-main.xml"/> <jaxrs:server id="restContainer" address="/"> <jaxrs:serviceBeans> <!-- tranxy --> <ref bean="applicationResource"/> <ref bean="languageResource"/> <ref bean="userResource"/> <!-- translation --> <ref bean="translationResource"/> <ref bean="translationKeyResource"/> <ref bean="translationRequestResource"/> <!-- statements --> </jaxrs:serviceBeans> </jaxrs:server></beans> Package, deployment and test Package Before building the package there is a dependency shipped with minuteproject mp-bsla.x.y.jar to install. In /target/mp-bsla/ Run script: maven-install.cmd/sh  Build: >mvn clean package The result is tranxyRestCxfApp.war in /rest/target Deployment Start tomcat Drop tranxyRestCxfApp.war in /webapps There is an embedded connection pool, so no configuration is needed on tomcat. 
Test

USE `tranxy`;

DELETE FROM application_x_key;
DELETE FROM translation;
DELETE FROM language_x_translator;
DELETE FROM language_x_speaker;
DELETE FROM request_key;
DELETE FROM application;
DELETE FROM translation_key;
DELETE FROM user;
DELETE FROM language;

INSERT INTO application (idapplication, name, description, type) VALUES (-1, 'Porphyry', 'OS application holding environment app', 'OPENSOURCE');
INSERT INTO application (idapplication, name, description, type) VALUES (-2, 'Minuteproject', 'Minuteproject app', 'OPENSOURCE');

INSERT INTO user (iduser, first_name, last_name, email) VALUES (-1, 'test', 'lastName', 'test@test.me');
INSERT INTO user (iduser, first_name, last_name, email) VALUES (-2, 'test2', 'lastName2', 'test2@test.me');

INSERT INTO language (idlanguage, code, description, locale) VALUES (-1, 'FR', 'France', 'fr');
INSERT INTO language (idlanguage, code, description, locale) VALUES (-2, 'ES', 'Spanish', 'es');
INSERT INTO language (idlanguage, code, description, locale) VALUES (-3, 'EN', 'English', 'en');

INSERT INTO language_x_translator (language_id, user_id) VALUES (-1, -1);
INSERT INTO language_x_translator (language_id, user_id) VALUES (-2, -1);
INSERT INTO language_x_speaker (language_id, user_id) VALUES (-1, -1);
INSERT INTO language_x_speaker (language_id, user_id) VALUES (-2, -1);
INSERT INTO language_x_translator (language_id, user_id) VALUES (-1, -2);
INSERT INTO language_x_translator (language_id, user_id) VALUES (-2, -2);
INSERT INTO language_x_translator (language_id, user_id) VALUES (-3, -2);

INSERT INTO translation_key (id, key_name, description) VALUES (-1, 'msg.user.name', 'user name');
INSERT INTO translation (id, translation, language_id, key_id, is_final, date_finalization, translator_id) VALUES (-1, 'nom', -1, -1, 1, '2012-04-04', -1);
INSERT INTO translation (id, translation, language_id, key_id, is_final, date_finalization, translator_id) VALUES (-2, 'apellido', -1, -2, 1, CURDATE(), -1);

Now enter
http://localhost:8080/tranxyRestCxfApp/rest/xml/languages to get all the languages. This is the result.

Now enter http://localhost:8080/tranxyRestCxfApp/rest/xml/users/-1 to get the first user. This is the result.

Conclusion

This article showed how to quickly get a CRUD REST interface on top of your DB model. Of course, you may not need CRUD for all entities, and you may need more coarse-grained functions to manipulate your model. The next article will show how Statement Driven Development gets us closer to the use case.

Reference: RigaJUG – demo – REST CXF from our JCG partner Florian Adler at the minuteproject blog....

Dynamic Property Management in Spring

Static and dynamic properties are very important for both operational management and changing the behavior of the system at the production level. In particular, dynamic properties reduce service interruptions. This article shows how to manage dynamic properties in Spring applications by using Quartz. The Multi-Job Scheduling Service by using Spring and Quartz article is suggested for Spring and Quartz integration. Let us look at dynamic property management in Spring.

Used Technologies:

JDK 1.6.0_31
Spring 3.1.1
Quartz 1.8.5
Maven 3.0.2

STEP 1 : CREATE MAVEN PROJECT

A Maven project is created as follows. (It can be created by using Maven or an IDE plug-in.)

STEP 2 : LIBRARIES

Spring dependencies are added to Maven's pom.xml.

<properties>
    <spring.version>3.1.1.RELEASE</spring.version>
</properties>

<dependencies>
    <!-- Spring 3 dependencies -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context-support</artifactId>
        <version>${spring.version}</version>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-tx</artifactId>
        <version>${spring.version}</version>
    </dependency>

    <!-- Quartz dependency -->
    <dependency>
        <groupId>org.quartz-scheduler</groupId>
        <artifactId>quartz</artifactId>
        <version>1.8.5</version>
    </dependency>

    <!-- Log4j dependency -->
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.16</version>
    </dependency>
</dependencies>

STEP 3 : CREATE DynamicPropertiesFile.properties

DynamicPropertiesFile covers the dynamic properties of the application.

# This property defines message content
# Possible values = Text.
# Default value : Welcome
Message_Content = Welcome Visitor

# This property defines minimum visitor count
# Possible values = positive integer. Default value : 1
Minimum_Visitor_Count = 1

# This property defines maximum visitor count
# Possible values = positive integer. Default value : 10
Maximum_Visitor_Count = 10

STEP 4 : CREATE applicationContext.xml

The application context is created as follows:

<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

    <!-- Beans Declaration -->
    <!-- Core Dynamic Properties Bean Declaration -->
    <bean id="CoreDynamicPropertiesBean" class="org.springframework.beans.factory.config.PropertiesFactoryBean" scope="prototype">
        <property name="location" value="classpath:DynamicPropertiesFile.properties" />
    </bean>

    <!-- Dynamic Properties Map Declaration -->
    <bean id="DynamicPropertiesMap" class="java.util.HashMap"/>

    <!-- Dynamic Properties File Reader Task Declaration -->
    <bean id="DynamicPropertiesFileReaderTask" class="com.otv.dynamic.properties.task.DynamicPropertiesFileReaderTask">
        <property name="dynamicPropertiesMap" ref="DynamicPropertiesMap"/>
    </bean>
    <!-- End of Beans Declaration -->

    <!-- Scheduler Configuration -->
    <!-- Job Detail -->
    <bean id="DynamicPropertiesFileReaderTaskJobDetail" class="org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean">
        <property name="targetObject" ref="DynamicPropertiesFileReaderTask" />
        <property name="targetMethod" value="start" />
    </bean>

    <!-- Simple Trigger -->
    <bean id="DynamicPropertiesFileReaderTaskTrigger" class="org.springframework.scheduling.quartz.SimpleTriggerBean">
        <property name="jobDetail" ref="DynamicPropertiesFileReaderTaskJobDetail" />
        <property name="repeatInterval" value="60000" />
        <property name="startDelay" value="0" />
    </bean>

    <bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
        <property name="jobDetails">
            <list>
                <ref bean="DynamicPropertiesFileReaderTaskJobDetail" />
            </list>
        </property>
        <property name="triggers">
            <list>
                <ref bean="DynamicPropertiesFileReaderTaskTrigger" />
            </list>
        </property>
    </bean>
    <!-- End of Scheduler Configuration -->
</beans>

STEP 5 : CREATE SystemConstants CLASS

A new SystemConstants class is created. This class covers all system constants.

package com.otv.common;

/**
 * System Constants
 *
 * @author onlinetechvision.com
 * @since 26 May 2012
 * @version 1.0.0
 */
public class SystemConstants {

    //Names of Dynamic Properties...
    public static final String DYNAMIC_PROPERTY_MESSAGE_CONTENT = "Message_Content";
    public static final String DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT = "Minimum_Visitor_Count";
    public static final String DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT = "Maximum_Visitor_Count";

    //Default Values of Dynamic Properties...
    public static final String DYNAMIC_PROPERTY_MESSAGE_CONTENT_DEFAULT_VALUE = "Welcome";
    public static final String DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE = "1";
    public static final String DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE = "10";

    public static final String BEAN_NAME_CORE_DYNAMIC_PROPERTIES_BEAN = "CoreDynamicPropertiesBean";

    public static final String APPLICATION_CONTEXT_FILE_NAME = "applicationContext.xml";
}

STEP 6 : CREATE DynamicPropertiesFileReaderTask CLASS

The DynamicPropertiesFileReaderTask class is created. This class is managed by Quartz. It reads all dynamic properties from DynamicPropertiesFile by invoking the start method every minute. The reading period can be changed via applicationContext.xml. Please note that CoreDynamicPropertiesBean's scope is singleton by default; since it must return new values of the dynamic properties at runtime, its scope should be set to prototype.
Otherwise, new values cannot be received.

package com.otv.dynamic.properties.task;

import java.util.HashMap;
import java.util.Properties;

import org.apache.log4j.Logger;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.BeanFactoryAware;

import com.otv.common.SystemConstants;

/**
 * Dynamic Properties File Reader Task
 *
 * @author onlinetechvision.com
 * @since 26 May 2012
 * @version 1.0.0
 */
public class DynamicPropertiesFileReaderTask implements BeanFactoryAware {

    private static Logger logger = Logger.getLogger(DynamicPropertiesFileReaderTask.class);
    private Properties coreDynamicPropertiesBean;
    private HashMap<String, String> dynamicPropertiesMap;
    private BeanFactory beanFactory;

    /**
     * Starts reading the dynamic properties
     */
    public void start() {
        setCoreDynamicPropertiesBean(createCoreDynamicPropertiesBeanInstance());

        logger.info("**** Dynamic Properties File Reader Task is being started... ****");
        readConfiguration();
        logger.info("**** Dynamic Properties File Reader Task is stopped... ****");
    }

    /**
     * Reads all the dynamic properties
     */
    private void readConfiguration() {
        readMessageContent();
        readMinimumVisitorCount();
        readMaximumVisitorCount();
    }

    /**
     * Reads Message_Content dynamic property
     */
    private void readMessageContent() {
        String messageContent = getCoreDynamicPropertiesBean()
                .getProperty(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT,
                        SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT_DEFAULT_VALUE);

        if (messageContent.equals("")) {
            getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT,
                    SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT_DEFAULT_VALUE);

            logger.error(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT
                    + " value is not found so its default value is set. Default value : "
                    + SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT_DEFAULT_VALUE);
        } else {
            messageContent = messageContent.trim();
            getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT, messageContent);
            logger.info(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT + " : "
                    + getDynamicPropertiesMap().get(SystemConstants.DYNAMIC_PROPERTY_MESSAGE_CONTENT));
        }
    }

    /**
     * Reads Minimum_Visitor_Count dynamic property
     */
    private void readMinimumVisitorCount() {
        String minimumVisitorCount = getCoreDynamicPropertiesBean()
                .getProperty(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT,
                        SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE).trim();

        try {
            if (Integer.parseInt(minimumVisitorCount) > 0) {
                getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT, minimumVisitorCount);

                logger.info(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT + " : "
                        + getDynamicPropertiesMap().get(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT));
            } else {
                getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT,
                        SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE);

                logger.error("Invalid " + SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT
                        + " value encountered. Must be greater than 0. Its default value is set. Default value : "
                        + SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE);
            }
        } catch (NumberFormatException nfe) {
            // store the default value so the map matches the logged fallback
            getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT,
                    SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE);

            logger.error("Invalid " + SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT
                    + " value encountered. Must be numeric!", nfe);

            logger.warn(SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT
                    + " default value is set. Default value : "
                    + SystemConstants.DYNAMIC_PROPERTY_MINIMUM_VISITOR_COUNT_DEFAULT_VALUE);
        }
    }

    /**
     * Reads Maximum_Visitor_Count dynamic property
     */
    private void readMaximumVisitorCount() {
        String maximumVisitorCount = getCoreDynamicPropertiesBean()
                .getProperty(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT,
                        SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE).trim();

        try {
            if (Integer.parseInt(maximumVisitorCount) > 0) {
                getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT, maximumVisitorCount);

                logger.info(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT + " : "
                        + getDynamicPropertiesMap().get(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT));
            } else {
                getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT,
                        SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE);

                logger.error("Invalid " + SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT
                        + " value encountered. Must be greater than 0. Its default value is set. Default value : "
                        + SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE);
            }
        } catch (NumberFormatException nfe) {
            // store the default value so the map matches the logged fallback
            getDynamicPropertiesMap().put(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT,
                    SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE);

            logger.error("Invalid " + SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT
                    + " value encountered. Must be numeric!", nfe);

            logger.warn(SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT
                    + " default value is set. Default value : "
                    + SystemConstants.DYNAMIC_PROPERTY_MAXIMUM_VISITOR_COUNT_DEFAULT_VALUE);
        }
    }

    /**
     * Gets CoreDynamicPropertiesBean
     *
     * @return Properties coreDynamicPropertiesBean
     */
    public Properties getCoreDynamicPropertiesBean() {
        return coreDynamicPropertiesBean;
    }

    /**
     * Sets CoreDynamicPropertiesBean
     *
     * @param coreDynamicPropertiesBean
     */
    public void setCoreDynamicPropertiesBean(Properties coreDynamicPropertiesBean) {
        this.coreDynamicPropertiesBean = coreDynamicPropertiesBean;
    }

    /**
     * Gets DynamicPropertiesMap
     *
     * @return HashMap dynamicPropertiesMap
     */
    public HashMap<String, String> getDynamicPropertiesMap() {
        return dynamicPropertiesMap;
    }

    /**
     * Sets DynamicPropertiesMap
     *
     * @param dynamicPropertiesMap
     */
    public void setDynamicPropertiesMap(HashMap<String, String> dynamicPropertiesMap) {
        this.dynamicPropertiesMap = dynamicPropertiesMap;
    }

    /**
     * Gets a new instance of CoreDynamicPropertiesBean
     *
     * @return Properties CoreDynamicPropertiesBean
     */
    public Properties createCoreDynamicPropertiesBeanInstance() {
        return (Properties) this.beanFactory.getBean(SystemConstants.BEAN_NAME_CORE_DYNAMIC_PROPERTIES_BEAN);
    }

    /**
     * Sets BeanFactory
     *
     * @param beanFactory
     */
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }
}

STEP 7 : CREATE Application CLASS

The Application class starts the project.
package com.otv.starter;

import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.otv.common.SystemConstants;

/**
 * Application Starter Class
 *
 * @author onlinetechvision.com
 * @since 26 May 2012
 * @version 1.0.0
 */
public class Application {

    /**
     * Main method of the Application
     */
    public static void main(String[] args) {
        new ClassPathXmlApplicationContext(SystemConstants.APPLICATION_CONTEXT_FILE_NAME);
    }
}

STEP 8 : RUN PROJECT

When the Application class is run, the following console logs are shown:

26.05.2012 17:25:09 INFO (DefaultLifecycleProcessor.java:334) - Starting beans in phase 2147483647
26.05.2012 17:25:09 INFO (SchedulerFactoryBean.java:648) - Starting Quartz Scheduler now
26.05.2012 17:25:09 INFO (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [DynamicPropertiesFile.properties]
26.05.2012 17:25:09 INFO (DynamicPropertiesFileReaderTask.java:36) - **** Dynamic Properties File Reader Task is being started... ****
26.05.2012 17:25:09 INFO (DynamicPropertiesFileReaderTask.java:63) - Message_Content : Welcome Visitor
26.05.2012 17:25:09 INFO (DynamicPropertiesFileReaderTask.java:76) - Minimum_Visitor_Count : 1
26.05.2012 17:25:09 INFO (DynamicPropertiesFileReaderTask.java:96) - Maximum_Visitor_Count : 10
26.05.2012 17:25:09 INFO (DynamicPropertiesFileReaderTask.java:38) - **** Dynamic Properties File Reader Task is stopped... ****

After DynamicPropertiesFile.properties is edited at runtime, the next scheduled run picks up the new values:

26.05.2012 17:26:09 INFO (PropertiesLoaderSupport.java:177) - Loading properties file from class path resource [DynamicPropertiesFile.properties]
26.05.2012 17:26:09 INFO (DynamicPropertiesFileReaderTask.java:36) - **** Dynamic Properties File Reader Task is being started... ****
26.05.2012 17:26:09 INFO (DynamicPropertiesFileReaderTask.java:63) - Message_Content : Welcome Visitor, Bruce!
26.05.2012 17:26:09 INFO (DynamicPropertiesFileReaderTask.java:76) - Minimum_Visitor_Count : 2
26.05.2012 17:26:09 INFO (DynamicPropertiesFileReaderTask.java:96) - Maximum_Visitor_Count : 20
26.05.2012 17:26:09 INFO (DynamicPropertiesFileReaderTask.java:38) - **** Dynamic Properties File Reader Task is stopped... ****

STEP 9 : DOWNLOAD

OTV_SpringDynamicPropertyManagement

REFERENCES:

Spring Framework Reference 3.x

Reference: Dynamic Property Management in Spring from our JCG partner Eren Avsarogullari at the Online Technology Vision blog....
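The article's two key ingredients, a fresh Properties instance on every read (the effect the prototype-scoped CoreDynamicPropertiesBean achieves) and validate-with-default fallback, can be sketched in plain Java without Spring or Quartz. This is a minimal illustration; the class and method names below are illustrative, not part of the article's code:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

/**
 * Minimal sketch of the article's pattern: re-read the property source on
 * every refresh and fall back to a default when a value is missing or invalid.
 */
public class DynamicPropertiesSketch {

    /** Loads a fresh Properties instance, like the prototype-scoped bean does. */
    static Properties load(Reader source) throws IOException {
        Properties props = new Properties();
        props.load(source);
        return props;
    }

    /** Returns the value if it parses as a positive integer, otherwise the default. */
    static String positiveIntOrDefault(Properties props, String key, String defaultValue) {
        String value = props.getProperty(key, defaultValue).trim();
        try {
            return Integer.parseInt(value) > 0 ? value : defaultValue;
        } catch (NumberFormatException nfe) {
            return defaultValue;
        }
    }

    public static void main(String[] args) throws IOException {
        Properties props = load(new StringReader(
                "Minimum_Visitor_Count = 2\nMaximum_Visitor_Count = oops\n"));
        // A valid positive integer is kept as-is.
        System.out.println(positiveIntOrDefault(props, "Minimum_Visitor_Count", "1"));  // prints 2
        // A non-numeric value falls back to the default, as in the article's task.
        System.out.println(positiveIntOrDefault(props, "Maximum_Visitor_Count", "10")); // prints 10
    }
}
```

The Quartz SimpleTrigger in STEP 4 simply invokes such a read every 60 seconds; because a new Properties instance is created per run, edits to the file are picked up on the next cycle.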

Embedding Ops members in Dev teams

For about 2 months I was sitting with a dev team while we worked through how to build a new service that would be continuously deployed. I wanted to share my experiences here because I’ve read both positive and negative opinions about doing this and I’m not sure there’s a single right answer. It was certainly an interesting experiment and one I may repeat in the future, with some modifications. This started when I began attending this team’s daily stand-up. My goal was to get more involved with a single team in dev to get a better idea of how the process worked in general. This team was one of our two Infrastructure teams which focus on scalability, stability & performance-enhancing changes. Initially this team was creating a new Web Services API service which I wrote a little bit about here. Eventually that service was set aside and the team moved on to a new Authorization and Authentication service. For this new service the decision was made to use Continuous Deployment. We were already doing fully automated deploys at least once per week but there was a bit of a jump to giving the developers the tools they needed to deploy every commit, including monitoring & deployment automation changes. I had also noticed, leading up to this, that the few times I had sat over with the team I was immediately more involved in discussions – they asked me questions (because I was there) and I had the option of attending planning sessions. There was literally a 20-foot difference between my own desk & the desk I sat at “with them” but it made a world of difference. As such, I talked to my management about sitting with that team all the time and they agreed to try it. Now, this team is a bit unique. The team is constructed of a handful of developers working on the code but it is also the home of the Build & Release guy as well as our Sysadmin who manages the testing infrastructure.
Sitting with this team gave me an opportunity to not only be involved in the development of this new service but to also become more involved in the Build & Release process, getting familiar with the day-to-day problems that are dealt with as well as pairing with folks to work on our puppet configurations which are shared between dev & prod. This team structure, along with me, also made them uniquely suited to tackle the Continuous Deployment problem (at least for this service) completely within a single team. As part of the Continuous Deployment implementation we wanted to make it as easy as possible for developers to get access to the metrics they needed. We already had Splunk for log access but our monitoring system required Ops involvement to manage new metrics. So as part of this new service we also had to perform a spike on new metric collection/trending systems – we looked at Ganglia & Graphite. We weren’t trying to tackle alerting – we just made it a requirement that any system we select be able to expose metrics to Nagios. I worked with the developers to test out a variety of ways for our application to push metrics into each of these systems while also evaluating each system for good Operational fit (ease of management, performance, scalability, etc.). Throughout this process there were also a lot of questions about how to perform deployments. How many previous builds do we keep? When and how do we rollback? What is our criteria for calling a deployment successful? How do we make sure it fails in test before it fails in production? What do we have to build into the service to allow rolling deploys to not interrupt service? The list goes on – these are all things that you should think about with any service but when the Developers are building the deployment tools they become very aware of all of this – it was awesome.
After about 45 days we had the monitoring system selected & running in production and test, we had deployments going to our testing systems and we were just starting to deploy into production. We now had to start our dark launch, sending traffic from our production system to the new service without impacting production traffic so we can see how this backend service performs, whether it is responding correctly to production traffic & generally get a better understanding of behavior with prod traffic. Today this service is still operating dark as we tweak and tune a variety of things to make sure it’s ready for production – again, it’s awesome. 60 days in things started winding down. We had been dark launched for a few weeks and largely the developers had access to everything they needed – they could look at graphs, logs, if they needed new metrics they just added them to the code and they showed up in monitoring as soon as they deployed. We got deploy lines added onto the graphs so we could correlate deployments with trends on the graph – more awesome. However my work was winding down, there were fewer and fewer Operational questions coming up and I was starting to move back toward working on other Ops projects. As I looked back on the last 60 days working with this team I realized the same 20 feet that kept me from being involved with the development team had now kept me from being involved with the Ops team. I was really conflicted but it felt like the healthy thing to do would be to move back over into Ops now that the work was winding down. I immediately realized the impact it had as people made comments “wow, you’re back!”… seriously folks, I was 20 feet away! You shot me with nerf darts! So now I’ve been back over in Ops for a few weeks and there has actually been a change – I’m still much more involved with that Dev team than I was from the start. 
They still include me in planning & they come to me when there are Operational questions or issues that come up around the service. However, that 20 feet is there again and I can’t hear all the conversations and I know there are questions that would get asked if someone didn’t have to stand up and walk over. Our Dev teams tend to do a lot of pairing and as a result aren’t often on IM and email responses are usually delayed – pairing certainly cuts down on the email checking. Was I happy I did it? Absolutely. Would I do it again? I think I would – but I would constrain it and set expectations. I think the physical proximity to the team helped a lot to move quickly and toss ideas around while the service was being developed and decisions were being made but it did have an impact on my relationship with the Ops team that I wish I could have avoided. I think continuing to move back and forth – spending time with the Ops team would be helpful. I actually did spend my on-call weeks (every 4th week) in Ops instead of sitting with the Dev team, but I would try to find some time during the 3 weeks in-between to be over there too, it was just too much absence. All that said, I think overall the company and the service is better for the way this turned out and for me personally it was a super insightful experience that I wish every Ops person could try sometimes. Reference: Embedding Ops members in Dev teams – my recent experience from our JCG partner Aaron Nichols at the Operation Bootstrap blog. ...
Java Code Geeks and all content copyright © 2010-2014, Exelixis Media Ltd | Terms of Use | Privacy Policy
All trademarks and registered trademarks appearing on Java Code Geeks are the property of their respective owners.
Java is a trademark or registered trademark of Oracle Corporation in the United States and other countries.
Java Code Geeks is not connected to Oracle Corporation and is not sponsored by Oracle Corporation.