
Integrating JavaFX 2.0 with Swing and SWT

One of the improvements in JavaFX 2.0 is greater ease of interoperability with Swing and SWT. Several online resources document how this is done, including Integrating JavaFX into Swing Applications and SWT Interop. However, in a nice example of effective class-level Javadoc documentation, the respective JavaFX classes javafx.embed.swing.JFXPanel and javafx.embed.swt.FXCanvas each provide a simple code sample showing how to use the class to embed JavaFX into Swing or SWT code. In this post, I build upon the code samples provided in these classes' Javadoc documentation to demonstrate JavaFX integration with Swing and SWT.

Both JFXPanel and FXCanvas allow a JavaFX Scene to be set on their instance. The instance of Scene (based on my Simple JavaFX 2.0 Text Example post) used in this post's examples is provided by the method shown in the next JavaFX-specific code example.

Method Providing a JavaFX Scene for Integration

package dustin.examples;

import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.effect.*;
import javafx.scene.paint.Color;
import javafx.scene.text.Font;
import javafx.scene.text.FontWeight;
import javafx.scene.text.Text;

/**
 * Simple class intended to be used by two examples of integrating JavaFX with
 * Swing and with SWT. Provides the single method {@code createTextScene()} to
 * be used by the classes that are examples of integrating Swing with JavaFX
 * and SWT with JavaFX.
 *
 * @author Dustin
 */
public class TextIntegrationSceneCreator
{
   /**
    * Provides an instance of Scene with JavaFX text examples.
    *
    * @return Instance of Scene with text examples.
    */
   public static Scene createTextScene()
   {
      final Group rootGroup = new Group();
      final Scene scene = new Scene(rootGroup, 800, 400, Color.BEIGE);

      final Text text1 = new Text(25, 25, "(2007) JavaFX based on F3");
      text1.setFill(Color.CHOCOLATE);
      text1.setFont(Font.font(java.awt.Font.SERIF, 25));
      rootGroup.getChildren().add(text1);

      final Text text2 = new Text(25, 50, "(2010) JavaFX Script Deprecated");
      text2.setFill(Color.DARKBLUE);
      text2.setFont(Font.font(java.awt.Font.SANS_SERIF, 30));
      rootGroup.getChildren().add(text2);

      final Text text3 = new Text(25, 75, "(2011) JavaFX to be Open Sourced!");
      text3.setFill(Color.TEAL);
      text3.setFont(Font.font(java.awt.Font.MONOSPACED, 35));
      rootGroup.getChildren().add(text3);

      final Text text4 = new Text(25, 125, "(2011) JavaFX to be Standardized");
      text4.setFill(Color.CRIMSON);
      text4.setFont(Font.font(java.awt.Font.DIALOG, 40));
      final Effect glow = new Glow(1.0);
      text4.setEffect(glow);
      rootGroup.getChildren().add(text4);

      final Text text5 = new Text(25, 175, "(Now) Time for JavaFX 2.0!");
      text5.setFill(Color.DARKVIOLET);
      text5.setFont(Font.font(java.awt.Font.SERIF, FontWeight.EXTRA_BOLD, 45));
      final Light.Distant light = new Light.Distant();
      light.setAzimuth(-135.0);
      final Lighting lighting = new Lighting();
      lighting.setLight(light);
      lighting.setSurfaceScale(9.0);
      text5.setEffect(lighting);
      rootGroup.getChildren().add(text5);

      final Text text6 = new Text(25, 225, "JavaFX News at JavaOne!");
      text6.setFill(Color.DARKGREEN);
      text6.setBlendMode(BlendMode.COLOR_BURN);
      text6.setFont(Font.font(java.awt.Font.DIALOG_INPUT, FontWeight.THIN, 45));
      final Reflection reflection = new Reflection();
      reflection.setFraction(1.0);
      text6.setEffect(reflection);
      rootGroup.getChildren().add(text6);

      return scene;
   }
}

A JavaFX Scene can be integrated into Swing code via the JavaFX class JFXPanel and its setScene(Scene) method. This is demonstrated in the next code listing, which gets the particular Scene instance from the method in the previous code listing.
JavaFX/Swing Integration with JFXPanel

package dustin.examples;

import javafx.application.Platform;
import javafx.embed.swing.JFXPanel;
import javafx.scene.Scene;

import javax.swing.JFrame;
import javax.swing.SwingUtilities;

/**
 * Simple class demonstrating interoperability between Swing and JavaFX. This
 * class is adapted from the example provided in the Javadoc documentation for
 * {@code javafx.embed.swing.JFXPanel}.
 */
public class SwingJavaFxInteroperabilityDemo
{
   private static void initAndShowGUI()
   {
      // This method is invoked on the Swing thread
      final JFrame frame = new JFrame("JavaFX / Swing Integrated");
      final JFXPanel fxPanel = new JFXPanel();
      frame.add(fxPanel);
      frame.setVisible(true);

      Platform.runLater(new Runnable()
      {
         @Override
         public void run()
         {
            initFX(fxPanel);
         }
      });
   }

   private static void initFX(JFXPanel fxPanel)
   {
      // This method is invoked on the JavaFX thread
      final Scene scene = TextIntegrationSceneCreator.createTextScene();
      fxPanel.setScene(scene);
   }

   public static void main(String[] arguments)
   {
      SwingUtilities.invokeLater(new Runnable()
      {
         @Override
         public void run()
         {
            initAndShowGUI();
         }
      });
   }
}

The output of running this simple Java Swing application with an embedded JavaFX Scene is shown next.

Integrating SWT with JavaFX is arguably even easier and is demonstrated in the next code listing. As with the Swing integration example, the main approach is to call FXCanvas's setScene(Scene) method.

JavaFX/SWT Integration with FXCanvas

package dustin.examples;

import javafx.embed.swt.FXCanvas;
import javafx.scene.Scene;

import org.eclipse.swt.SWT;
import org.eclipse.swt.layout.FillLayout;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

/**
 * Simple class demonstrating interoperability between SWT and JavaFX. This
 * class is based on the example provided in the Javadoc documentation for
 * {@code javafx.embed.swt.FXCanvas}.
 *
 * @author Dustin
 */
public class SwtJavaFxInteroperabilityDemo
{
   public static void main(String[] arguments)
   {
      final Display display = new Display();
      final Shell shell = new Shell(display);
      shell.setText("JavaFX / SWT Integration");
      shell.setLayout(new FillLayout());
      final FXCanvas canvas = new FXCanvas(shell, SWT.NONE);
      final Scene scene = TextIntegrationSceneCreator.createTextScene();
      canvas.setScene(scene);
      shell.open();
      while (!shell.isDisposed())
      {
         if (!display.readAndDispatch())
         {
            display.sleep();
         }
      }
      display.dispose();
   }
}

The next screen snapshot shows what this simple SWT application with embedded JavaFX looks like.

The code listings for Swing integration with JavaFX and for SWT integration with JavaFX shown above are only slightly adapted from the Javadoc documentation for the JavaFX classes JFXPanel (Swing) and FXCanvas (SWT). It is nice that these classes provide these examples in their documentation, and it is really nice that integration has become so much easier. For more thorough coverage of JavaFX/Swing integration, see Integrating JavaFX into Swing Applications.

Reference: Integrating JavaFX 2.0 with Swing and SWT from our JCG partner Dustin Marx at the Inspired by Actual Events blog

Related Articles:
Migrating from JavaFX 1.3 to JavaFX 2.0
JavaFX 2.0 beta sample application and after thoughts
JavaOne is Rebuilding Momentum
Sometimes in Java, One Layout Manager Is Not Enough...

Using a memory mapped file for a huge matrix

Overview

Matrices can be really large, sometimes larger than you can hold in one array. You can extend the maximum size by having multiple arrays, but this can make your heap size really large and inefficient. An alternative is to use a wrapper over a memory mapped file. The advantage of memory mapped files is that they have very little impact on the heap and can be swapped in and out by the OS fairly transparently.

A huge matrix

This code supports large matrices of double. It partitions the file into 1 GB mappings, as Java doesn't support mappings of 2 GB or more at a time (a pet hate of mine ;).

import sun.misc.Cleaner;
import sun.nio.ch.DirectBuffer;

import java.io.Closeable;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.List;

public class LargeDoubleMatrix implements Closeable {
    private static final int MAPPING_SIZE = 1 << 30;
    private final RandomAccessFile raf;
    private final int width;
    private final int height;
    private final List<MappedByteBuffer> mappings = new ArrayList<MappedByteBuffer>();

    public LargeDoubleMatrix(String filename, int width, int height) throws IOException {
        this.raf = new RandomAccessFile(filename, "rw");
        try {
            this.width = width;
            this.height = height;
            long size = 8L * width * height;
            for (long offset = 0; offset < size; offset += MAPPING_SIZE) {
                long size2 = Math.min(size - offset, MAPPING_SIZE);
                mappings.add(raf.getChannel().map(FileChannel.MapMode.READ_WRITE, offset, size2));
            }
        } catch (IOException e) {
            raf.close();
            throw e;
        }
    }

    protected long position(int x, int y) {
        return (long) y * width + x;
    }

    public int width() {
        return width;
    }

    public int height() {
        return height;
    }

    public double get(int x, int y) {
        assert x >= 0 && x < width;
        assert y >= 0 && y < height;
        long p = position(x, y) * 8;
        int mapN = (int) (p / MAPPING_SIZE);
        int offN = (int) (p % MAPPING_SIZE);
        return mappings.get(mapN).getDouble(offN);
    }

    public void set(int x, int y, double d) {
        assert x >= 0 && x < width;
        assert y >= 0 && y < height;
        long p = position(x, y) * 8;
        int mapN = (int) (p / MAPPING_SIZE);
        int offN = (int) (p % MAPPING_SIZE);
        mappings.get(mapN).putDouble(offN, d);
    }

    public void close() throws IOException {
        for (MappedByteBuffer mapping : mappings)
            clean(mapping);
        raf.close();
    }

    private void clean(MappedByteBuffer mapping) {
        if (mapping == null) return;
        Cleaner cleaner = ((DirectBuffer) mapping).cleaner();
        if (cleaner != null) cleaner.clean();
    }
}

import org.junit.Test;

import java.io.IOException;

import static org.junit.Assert.assertEquals;

public class LargeDoubleMatrixTest {
    @Test
    public void getSetMatrix() throws IOException {
        long start = System.nanoTime();
        final long used0 = usedMemory();
        LargeDoubleMatrix matrix = new LargeDoubleMatrix("ldm.test", 1000 * 1000, 1000 * 1000);
        for (int i = 0; i < matrix.width(); i++)
            matrix.set(i, i, i);
        for (int i = 0; i < matrix.width(); i++)
            assertEquals(i, matrix.get(i, i), 0.0);
        long time = System.nanoTime() - start;
        final long used = usedMemory() - used0;
        if (used == 0)
            System.err.println("You need to use -XX:-UseTLAB to see small changes in memory usage.");
        System.out.printf("Setting the diagonal took %,d ms, Heap used is %,d KB%n",
                time / 1000 / 1000, used / 1024);
        matrix.close();
    }

    private long usedMemory() {
        return Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
    }
}

The test above writes to each of the diagonal values of a one million by one million matrix, which is far too large to create on the heap.

Setting the diagonal took 314,819 ms, Heap used is 2,025 KB

$ ls -l ldm.test
-rw-rw-r-- 1 peter peter 8000000000000 2011-12-30 12:42 ldm.test
$ du -s ldm.test
4010600 ldm.test

That's 8,000,000,000,000 bytes or ~7.3 TB in virtual memory, in a Java process! This works because it only allocates or pages in the pages which you use. So while the file size is almost 8 TB, the actual disk space and memory used is 4 GB. With a more modest 100K * 100K matrix, you see something like the following.
It's still an 80 GB matrix, which uses trivial heap space. ;)

Setting the diagonal took 110 ms, Heap used is 71 KB

$ ls -l ldm.test
-rw-rw-r-- 1 peter peter 80000000000 2011-12-30 12:49 ldm.test
$ du -s ldm.test
400000 ldm.test

Reference: Using a memory mapped file for a huge matrix from our JCG partner Peter Lawrey at the Vanilla Java blog

Related Articles:
How to get C like performance in Java
Low GC in Java: Use primitives instead of wrappers
Recycling objects to improve performance
Quick tips for improving Java apps performance
Java Secret: Loading and unloading static fields
High performance JPA with GlassFish and Coherence – Part 1...
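As a footnote to the matrix article above, the mapN/offN index arithmetic at its core can be sketched in isolation. The snippet below (in Scala, runnable on the same JVM; the locate helper is my own name for illustration, not part of the article's class) reproduces the computation used by get and set:

```scala
// Sketch of the index arithmetic used by LargeDoubleMatrix.get/set.
// MAPPING_SIZE matches the article's value; `locate` is a hypothetical helper.
object MatrixIndexDemo {
  val MappingSize: Long = 1L << 30 // 1 GB per mapping, as in the article

  // For element (x, y) of a `width`-wide matrix of doubles, return
  // (index of the 1 GB mapping, byte offset within that mapping).
  def locate(x: Int, y: Int, width: Int): (Int, Int) = {
    val p = (y.toLong * width + x) * 8 // absolute byte position of the double
    ((p / MappingSize).toInt, (p % MappingSize).toInt)
  }

  def main(args: Array[String]): Unit = {
    assert(locate(0, 0, 1000000) == (0, 0)) // first element: mapping 0, offset 0
    assert(locate(1, 0, 1000000) == (0, 8)) // the next double is 8 bytes along
    // With a row width of 2^27 doubles, row 1 starts exactly at mapping 1.
    assert(locate(0, 1, 1 << 27) == (1, 0))
    println("ok")
  }
}
```

Because the absolute position is computed in a long before being split, the scheme works even when the matrix is far larger than any single buffer can address.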

First steps into Scala

For over a full year now, I've been looking into Scala. I have heard many people talk about it passionately and it just got me interested. A lot of big companies are also investing in this new language, so I figured I had to check it out. In my 4 years of Java programming, I've learned the Java EE stack (Spring and Java EE) and I must say, not much has changed since then. We got the long-awaited release of Java 7 (with lots of features missing) and in 2009 Sun released Java EE 6. Java EE 6 was a cool release and I have blogged about it a few times, but it is not much more than further abstraction of the same concepts. Spring hasn't really been moving forward at all. When you compare the release that was used when I started programming professionally with the one we use in production, it is largely still the same version. There is a 3.0 release but, to be honest, it is not very compelling.

So after hearing people like Dick Wall talk about Scala for a long time, I decided to pick it up. The first thing I tried was the Scala Koans, a project on GitHub that helps you learn the language by correcting failing tests. At the time, the Scala Koans project was still at a very early stage; I did not get the koans to work and gave up rather quickly. My second attempt at Scala came when I had some spare time and wrote a simple application which parsed some XML. This worked, but in the end it only took me 5 minutes and, looking back at it now, I did not use any language feature that added value. The only thing I did was write Java in Scala. In the meantime I joined the Belgian Scala Enthusiasts and I'm following the mailing list, but still, I couldn't really write a true Scala application. I didn't even get the feeling I was hitting it off. At Devoxx 2011 I was determined to go and see the Scala talks and things that have to do with Scala. I ended up seeing 2 talks about the Play! framework, 3 talks about Scala and a talk about Akka.
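The XML-parsing exercise mentioned above is a good example of where idiomatic Scala quickly diverges from "Java written in Scala". A minimal sketch follows; the document content and element names are invented for illustration, and it assumes the scala.xml library that shipped with Scala at the time (a separate scala-xml module in newer Scala versions):

```scala
// A sketch of idiomatic Scala XML handling; the XML document and the
// element/attribute names here are made up for illustration.
object XmlDemo {
  def main(args: Array[String]): Unit = {
    val doc = scala.xml.XML.loadString(
      """<talks>
        |  <talk track="scala">Play! 2.0</talk>
        |  <talk track="scala">Akka</talk>
        |  <talk track="web">Coffeescript</talk>
        |</talks>""".stripMargin)

    // XPath-like projection plus a filter on an attribute: no explicit
    // DOM traversal or loops, unlike the typical Java approach.
    val scalaTalks = (doc \ "talk")
      .filter(t => (t \ "@track").text == "scala")
      .map(_.text)

    assert(scalaTalks == Seq("Play! 2.0", "Akka"))
    println(scalaTalks.mkString(", "))
  }
}
```

The `\` projection and collection combinators are exactly the kind of language features that a "Java in Scala" first attempt tends to miss.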
I also talked to the guys at MongoDB and Typesafe. Devoxx 2011 was a real eye-opener for me to get started with Scala, for several reasons. First of all there were the talks. The talk about Play! 2.0 showed me how to build a web application with Scala. It also demonstrated what a cool framework Play! is. The Akka talk showed me how to create super-scalable and decoupled applications. Akka is written in Scala and integrated with the Play! framework (version 2.0), which was a plus for me. The talk from Matt Raible didn't really show me anything technical. His talk was about some technologies he wanted to learn (Scala, Play, CoffeeScript, {Less}, Scalate and Jade). He wanted to speak at Devoxx so badly that he simply submitted a talk. At the time he submitted it, he didn't know any of the technologies he was going to talk about. He even waited to start learning them until his talk got accepted. Only after the approval did he start learning and blogging about these technologies, and he then showed at Devoxx what he had built. His talk was about the same thing I had been trying to do for over a year but never pushed through. It might sound corny, but this talk was the real boost for what I'm doing now.

On the other hand, I had 2 interesting chats in the downstairs hall. One was with a MongoDB guy. He showed me the API he had built for connecting to MongoDB from Scala. Unfortunately, I did not get his name... The other was with Henrik Engström, a developer who works at Typesafe (a company founded by Martin Odersky, the creator of the Scala language; next to Scala they also house the Play! framework and Akka). We just talked about how you can use Scala in web applications. When I got back from Devoxx, I literally went straight home and downloaded the Scala runtime, the Typesafe Stack (at the time this was only Scala, Akka and the Scala IDE) and the beta of the Play! framework.
I have a small project in my head that I've been thinking about for some time now, and I have started implementing it. Based upon the 3 Scala examples that ship with the Play! 2.0 beta, I'm learning the language bit by bit. But there were a lot of language features I didn't really grasp. I tried to look into the Scala docs but got even more confused. I knew that Typesafe offers a free book, 'Scala for the Impatient', so I decided to check it out. I downloaded the book and started reading it, and things cleared up immensely. I now understand why people say Scala isn't complex, it just looks that way. Well, it's true. It also explains the weird things you see in the Scala API documentation. Now I'm working with Play! 2.0 and Scala. Once I get the hang of it, I'll also try integrating Akka, and I'll probably deploy the application on Heroku and see what it can do. I'm going to try to keep documenting my steps into Scala, Play! and Akka. I'll see where it takes me.

Reference: First steps into Scala from our JCG partner Jelle Victoor at the Styled Ideas blog

Related Articles:
Yes, Virginia, Scala is hard
Scala use is less good than Java use for at least half of all Java projects
Scala's version fragility make the Enterprise argument near impossible
How Scala changed the way I think about my Java Code
Fun with function composition in Scala
Scala Tutorial – SBT, scalabha, packages, build systems...

Technology Related Classic Mistakes

In my last blog I looked at Product Related Classic Mistakes from Rapid Development: Taming Wild Software Schedules by Steve McConnell, which, although it has now been around for at least 10 years and times have changed, is still as relevant today as when it was written. As Steve's book states, classic mistakes are classic mistakes because they're mistakes that are made so often and by so many people. They have predictably bad results and, when you know them, they stick out like a sore thumb. The idea behind listing them here is that, once you know them, you can spot them and hopefully do something to remedy their effect.

Classic mistakes can be divided into four types:

People Related Mistakes
Process Related Mistakes
Product Related Mistakes
Technology Related Mistakes

Today's blog takes a quick look at the fourth of Steve's categories of mistakes: Technology Related Mistakes, which include:

Silver Bullet Syndrome
Overestimated Savings From New Tools or Methods
Switching Tools in the Middle of a Project
Lack of Automated Source Control

Silver Bullet Syndrome
Don't expect the use of a new technology or development tool to solve all your scheduling problems.

Overestimated Savings From New Tools or Methods
Use of new technologies and practices can increase development time as their learning curves are climbed. New practices have new risks that you only discover by using them. Teams (or organisations) seldom improve their productivity in leaps and bounds; aim for steady, slow progress. Also overestimated are the savings that arise from code re-use. Code re-use is a very effective approach; however, wishful thinking comes into play and the savings are usually not as high as expected.

Switching Tools in the Middle of a Project
Try not to upgrade your compiler, operating system etc. in the middle of a project. Apart from the fact that the installation takes time, the associated learning curve, rework and inevitable mistakes often cancel out the benefits of the new tool.
Lack of Automated Source Control
This was written over 10 years ago, but it still, amazingly, happens, even though there are companies who will provide this service for you via the Internet. Failure to use automated source control exposes your project to needless risk. Use source control to the full.

Reference: Technology Related Classic Mistakes from our JCG partner Roger Hughes at the Captain Debug's Blog.

Related Articles:
People Related Classic Mistakes
Process Related Classic Mistakes
Product Related Classic Mistakes
2011: The State of Software Security and Quality
Technical debt & the Boiling Frog...

Personal gains from contributing to Open Source

Many may find it difficult to understand why certain people spend a lot of their spare time producing stuff without being paid and then give it away for free. Is this altruism on the edge of stupidity, or are there personal benefits to be gained from participating in such activities? The act of charity and the joy of programming play a part but may not be the ultimate goal. The motives for participation are subjective, but it seems that many do it to boost their professional work in one way or another.

Contribution
Schools benefit greatly from reduced costs, and many students would not have had the opportunity to get a computer science degree without the wealth of information and experience found in open source. Many corporations certainly also benefit from open source. Yes, some people actually develop feelings of wanting to give something back. Maybe not trying to make a difference, but simply showing a token of gratitude to a community providing such a strong foundation for learning and education to anyone in society.

Appreciation
Programmers want others to use their stuff. We are social beings and it feels good to hear someone express their appreciation for your work. Appreciation motivates the will to understand different points of view, reduces insecurity and allows you to put others before yourself. Collaboration and social interaction create a feeling of belonging, and coding for a community can make this activity even more energizing and enjoyable. Corporate companies sometimes have a tendency to give managers most of the props, which can be disappointing and demoralizing indeed. Reading emails of gratitude and receiving help from others can feel refreshing, especially for those who have been working under less gratifying conditions.

Self-education
This is your chance to work on the projects and problems that excite and inspire you the most, a strong motivator for doing your best and reaching creative heights. It may seem scary to know that your work will be reviewed and criticized publicly.
But this is a tool for improving your skills and strengthening your attitude and habits towards quality. You will not code sloppily knowing that your work will be accessible to anyone. The larger projects that have survived for years and continue to evolve often have great leadership, organization and development guidelines. Technical skill is just one of the many things to observe and absorb. There is also a chance that you will join a team and learn from people that are many levels better than yourself.

Reputation
Open sourcing will build a public resume that is accessible to anyone. It looks good to have worked on open source projects, especially famous ones. Meritocracy has a tendency to arise, so offering bug corrections, improvements and ideas will earn you recognition from your peers/users and enhance your reputation. But keep in mind that quality is key. People do not want to spend time on contributions that ignore guiding principles just because the contributor was too lazy to read them. Such a relationship can be quite stimulating compared with the typical interaction of trying to impress your manager, whose interest usually lies in delivering on time. Transparency also feeds honest and humble communication, since nobody can hide bad or selfish decisions. Strong disagreements that might otherwise end in rudeness and cruelty behind closed doors are likely to be discussed more calmly when the participants know that others observe.

Control
Most people wish for the freedom to control their lives. It can be incredibly frustrating to work on a project with budget constraints where software is rushed into an unmanageable mess. Reorganization and outsourcing can also seed feelings of disappointment and helplessness. With open source you are no longer a victim of such circumstances. You are free to implement and improve the features you think matter, while users help with finding relevance and setting priorities.

Reuse
Most programmers develop an urge not to repeat themselves throughout their careers.
Producing open source software gives you the freedom to truly reuse your efforts when changing jobs (or starting your own company) and to share them with anyone. These intentions stimulate thinking in broader perspectives and designs that are cooperative, flexible and adaptable to different environments in order to maximize opportunities for reuse. Keeping users loyal often means maintaining version compatibility and upgradability. Having to deal with all this complexity will make you a better programmer. And this is the right thing to do. Newton would have been proud to see this tradition of code-sharing and reuse. Reinventing wheels is a terrible waste of time and human skill. Many view patents as the direct opposite: a threat that prevents reuse and slows programmers down. Patents also encourage a culture where people build barriers instead of helping each other. It is understandable that patents make the open source community frown.

Conclusion
Open source is a lot about a community of freedom and sharing, and it is not hard to see why open source developers often are highly respected. Participation will introduce you to a community of incredibly talented, like-minded and caring people who may help improve your skills beyond imagination. Unexpected and exciting job opportunities may indeed arise, maybe at a company that will give you the good fortune to produce open source software and get paid at the same time.

Reference: Personal gains from contributing to Open Source from our JCG partner Kristoffer Sjögren at the deephacks blog.

Related Articles:
Significant Software Development Developments of 2011
The Pragmatic Programmer – Review / Summary Notes
Writing Code that Doesn't Suck
How to Get Unstuck
Are frameworks making developers dumb?
The top 9+7 things every programmer or architect should know...

Scala for 2012? Deciding Whether to Invest In a Programming Language

I have found it both interesting and rewarding to learn a new programming language or major framework on a roughly yearly basis. If forced to self-identify with any single programming language, it would be Java. However, over the years I've used C and C++ fairly extensively and have used and learned enough to be dangerous about several other languages, including shell scripting languages, Perl, JavaScript, Pascal, C#, Ruby, JRuby, Groovy, PHP, and Python. Of the latter group (Ruby, JRuby, Groovy, PHP, and Python), Groovy has had the most practical benefit for me, but I have learned valuable idioms, best practices, and different ways of thinking from using the other languages.

In some years, I've not been as quick to learn a new language because I've been learning a major framework for a language I already know, or because a familiar language has undergone significant changes. For example, Struts and the Spring Framework each dominated my time as I learned them. JavaFX has similarly dominated my interest in recent weeks. When not working with a new language, I tend to focus on libraries and frameworks of the languages I am comfortable with. I have spent more time on Guava this year, for example.

Learning a new language does provide many benefits. However, these benefits don't come for free. There is always an opportunity cost associated with learning anything new. If a programming language is particularly different from what one is used to, this opportunity cost can be great. The opportunity cost can manifest as many different things. It might be lower productivity than could be had using a known language. It might be missing out on learning a new framework, library, or approach in the more familiar language. It might be having to settle for fewer features, or to choose different ones that better fit the new language.
The opportunity cost may be as simple as not being able to do other things one would want to do, and might have time to do, if using a familiar language. Because there are so many potential opportunity costs associated with learning a new programming language, I try to be careful about which ones I invest my time in. I typically have a compelling reason for learning a new language. Compelling reasons might include specific advantages of a language (such as PHP for many Web 2.0-centric projects) or widespread use and "employability" of that language. Other reasons might be to learn new techniques that can be adapted to more familiar languages. Perhaps the most compelling reason I've had for learning a new language has been to read and maintain code or scripts that are handed to me and that I am assigned responsibility for.

For several years now, I've been somewhat curious about Scala, but I have not yet committed myself to learning and using it because other languages, frameworks, and tools have grabbed my attention. Typically, I've had some motivation that has made those tools, languages, or frameworks seem most worth my investment of time and energy. For example, the need for a nice scripting language that meshes well with my Java development environment led me to Groovy. My attendance at JavaOne 2010 and JavaOne 2011, coupled with my interest in a modern GUI technology, has led to my interest in JavaFX. I spent time with Python after being put in a position where I needed to read and modify Python scripts.

I recently welcomed the opportunity to pose some questions to Scala creator Martin Odersky related to what is in Scala that might motivate me (and others) to invest time and energy into learning it. As I articulated the questions I had for Martin, I realized that these are really the things I informally look at when investigating a new language.
I typically spend an hour or two finding a language's highest-level motivations first and only invest more time in that language if it seems to be a good potential fit for me. Martin has agreed to my posting my questions and his answers, and they are shown next (I have added hyperlinks).

Question: What is the most compelling/motivating reason or reasons that one might want to invest time in learning Scala as opposed to continuing use of Java (mostly for applications in my case) + Groovy (mostly for development environment scripting in my case)? For example, Groovy appealed to me at a high level as a way to script with libraries, idioms, and syntax that I was comfortable with from Java application development experience.

There are actually quite a few different facets of Scala that individuals attach to for different reasons. Some are attracted by the succinct syntax and resulting productivity. Others gravitate to the stability of the JVM and the runtime performance of a compiled language (versus an interpreted language, like Groovy), or the ability of a sophisticated type system to help programmers avoid errors that would otherwise crop up at runtime. Others find the functional style of programming to be a more natural way to reason about their application logic. One of the strongest attractors, from a practical point of view, is that Scala (and the rest of the Typesafe Stack, including Akka and Play) is designed to provide better tools to address the dual challenges of parallel and concurrent programming. With the advent of mainstream multicore/manycore hardware, and the increasing scale of applications that developers are charged to build, many industry developers are looking for higher-level abstractions than threads and locks for building at this next scale. Many find that the functional style, immutable state, the actor concurrency model, and other concepts at the heart of Scala make it simpler to build parallel and concurrent applications.
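Martin's point about higher-level abstractions than threads and locks can be made concrete with a tiny sketch. The example below is my own illustration, not from the interview: it uses the parallel collections of that era's Scala (2.9 through 2.12; later versions moved `.par` into the separate scala-parallel-collections module), where a one-character change parallelizes a bulk operation with no explicit thread management:

```scala
// Sketch of Scala-2.9-style parallel collections: calling `.par` on a
// sequential collection yields one whose bulk operations (map, sum, ...)
// are executed across the available cores.
object ParDemo {
  def main(args: Array[String]): Unit = {
    val xs = 1 to 1000

    val sequential = xs.map(x => x * x).sum       // runs on one thread
    val parallel   = xs.par.map(x => x * x).sum   // same result, computed in parallel

    // The result is deterministic because map/sum are order-independent here.
    assert(sequential == parallel)
    println(parallel)
  }
}
```

The appeal is exactly what the interview describes: the programmer states *what* to compute, and the library decides how to spread the work over cores, instead of hand-rolling threads and locks.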
Question: What are Scala's biggest strengths, advantages, and innovative features?

At a high level, Scala seeks to be a pragmatic language that scales from the smallest scripts to the largest distributed systems. One major thread of innovation in Scala is its unique blend of object-oriented Java with functional programming concepts. Scala's libraries build on this foundation to provide outstanding support for concurrency and parallelism, for example through the actor programming model and the built-in parallel collections introduced in Scala 2.9. Scala's expressive type system and syntax help developers build more reliable code and greatly increase extensibility, especially for library developers and those building domain-specific languages (DSLs). Finally, it's important not to overlook the fact that Scala is deeply integrated with Java, supporting blended Scala/Java projects and allowing developers to apply their skills and investments in Java immediately when they start working with Scala.

Question: What are Scala's biggest weaknesses, disadvantages, and plans for improvement?

One of the challenges for a relatively young language like Scala is the maturity of tools. In particular, the Scala IDE for Eclipse has had its rough edges in the past; this is one of the reasons that Typesafe, as the leading commercial contributor to Scala, has invested substantial resources in overhauling the IDE with version 2.0 (just released in December 2011). Another challenge for adoption is that Scala does introduce, with functional programming, a new mode of thinking about programs, which takes some time to learn. It makes the transition gentle, because one can start by writing Scala code like more concise Java code. But as Scala's native library ecosystem grows, chances are that newcomers to the language will come across some of its more foreign features before they have developed a good understanding.
To avoid culture shock, we need to develop a set of best practices and good tutorials that help the transition. “Programming in Scala“, which I have co-authored, is a comprehensive tutorial of the object/functional style. Cay Horstmann‘s “Scala for the Impatient“, available as a free preview on the Typesafe site, is a pragmatic, fast-paced introduction. Question: What situations/scenarios/use cases is Scala best and worst suited for? As described above, Scala and its frameworks like Akka and Play really shine for building systems that need to scale on multiple fronts — across cores, across machines in a cloud environment, and across large software teams. Traditionally, one area where Scala or other JVM-hosted languages would not be considered well suited would be lower-level systems programming. But interestingly, we see evidence of forward-looking systems developers increasingly embracing managed runtime languages like Scala because they face fundamental challenges in building reliable systems for the era of multicore hardware and distributed deployments. Martin’s responses validate some of my own conclusions about Scala from reading posts by Scala enthusiasts and even some of the detractors. In terms of motivation, I have a difficult time believing that learning Scala primarily as a scripting language will be very motivating because I’m pretty happy with Groovy for scripting. However, for development of applications, I tend to use Java and not Groovy and I wonder if it’s in that area where I’d be most likely to benefit from learning and using Scala. Once I determine that a language is worth investing in, the next step is deciding how to best learn it. Reading about it is a necessity, but using it is what really helps me learn it and also what helps me identify the things I don’t like about it. The trick is to come up with a somewhat realistic example that is easy enough to implement, but interesting enough to prove out some concepts. 
A “Hello World” is okay to get one’s feet wet, but doesn’t really test how the language fares for a developer’s specific needs. My favorite initial examples are ones that actually provide benefit in addition to being a mechanism for learning. For example, when I was learning Groovy, I developed several scripts early on that were helpful to me as scripts in their own right, regardless of the language they were written in. In those cases, I gained familiarity with Groovy while also receiving other utilitarian benefits. The JavaWorld article Learn Scala with Specs2 Spring describes how a Java developer who uses the Spring Framework can use the author’s company’s Specs2 Spring for integration testing and also benefit from “an efficient and safe way to learn the patterns of object-functional programming with Scala.” The entire premise of this article is exactly the kind of thing I like to do when learning a new language: combine legitimate benefit with learning of the new language. One other thing to think about when trying out a new language is to ensure that one is trying it out in the correct situations. This was easy for me with Groovy: I tried out Groovy first in situations in which I wanted the power of the JVM or the scope of the JDK, but wanted a scripting-friendly language. A developer can quickly decide a language is “not good” simply because the situation in which the language is used is not a great fit for that language. Related to this, another issue I try to keep in mind when learning a new language is that it’s not fair to compare a language I know well and have spent thousands of hours with to a language that I’ve spent a few hours with. Unless I encounter some real deal-breakers early in the process, I try not to let “little problems” or things I don’t like about the new language prevent me from giving it a real chance. An excellent recent post on this is Rob Pike’s Esmerelda’s Imagination.
All that being stated, there are times I run into a true deal-breaker that makes me realize I should not invest any more time in a particular language because it doesn’t fit my needs. That doesn’t necessarily mean there’s anything wrong with the language, but simply that it doesn’t fit my needs well. An example of this would be using Java for a real-time system in Java’s early days. I think I’m almost ready to commit to spending more time with Scala. I’m not the type to make new-year resolutions, but it just so happens that it seems like the right time to give Scala a closer look. If I really do start to invest more time in Scala, my plan is to first re-read Bruce Eckel’s Scala: The Static Language that Feels Dynamic and then read the A1 chapters of Scala for the Impatient, trying out and adapting examples. If I’m still interested in Scala after that, I can invest more at that time. Do I still have reservations about spending time on Scala? Of course. One of my biggest concerns is best articulated by someone who actually seems to have tried out Scala. Cédric Beust states, “In my experience with Scala, it’s hard not to like the language in the first week and it’s hard to still be in love with it after reading the 700+ pages of a book about it.” On the other hand, Casper Bang articulates well why I think maybe I should spend time with Scala despite any other obvious motivations: “So I guess my point is, even if I do find Scala hyperbolish and biting over a bit too much; the majority of identifiable alpha-geeks that I track, are moving this way and as a practicing professional, I can not afford to ignore this.” The post Offbeat: Scala by the end of 2011 – No Drama but Frustration is Growing and the feedback comments related to that post are insightful and seem to reiterate some of the issues that Martin pointed out that Scala must deal with.
In particular, when I look at the issue most likely to deter me from spending time on Scala, it is the risk that Scala may never take hold in mainstream development. If that turns out to be the case, then the primary advantage of learning Scala would be to change my way of thinking about things, and that’s not always necessarily worth the opportunity cost and other costs. This post and the feedback comments contain multiple sides of the same issue and are another reminder that I probably need to do more with Scala to decide for myself how I feel about it. My plan as of right now is to invest significant time and effort into learning the basics of Scala and applying them to some “realistic” examples. I even have plans to blog on what I learn. But I have had these types of plans before and been distracted by some other shiny thing that has come my way. I think this time will be different, but I should know for certain by the end of 2012. Reference: Scala for 2012? Deciding Whether to Invest In a Programming Language from our JCG partner Dustin Marx at the Inspired by Actual Events blog. Related Articles: Selecting a new programming language to learn Yes, Virginia, Scala is hard Scala’s version fragility make the Enterprise argument near impossible Scala use is less good than Java use for at least half of all Java projects Things Every Programmer Should Know...

Git DVCS – Getting started

Git is a distributed revision control system, where every working directory is a full-fledged repository with complete history and full revision tracking capabilities. Git is categorized as a DVCS (Distributed Version Control System) because it is not dependent on a central server. So the academic way of working with Git is pushing/pulling data from/to each developer's repository. This works in small teams or in highly distributed development (open source projects where people are working around the world), but mid-size teams or companies that require a central repository because of infrastructure/workflow processes — a Continuous Integration system, QA checks before delivering, environment backups, external manual audits — may feel that a traditional SCM is desirable. But this claim is far from reality; Git is still your VCS. How about creating a theoretical central repository? I say theoretical because in Git there is no central repository at a technical level. This repository acts as central only by convention. I, as in many other posts, call this repository origin. A Git remote repository is a repository without a working directory, composed only of the .git project directory and nothing else. Nvie has created a nice schema of this topology. See that each developer pulls from and pushes to origin, but may also exchange data with other peers. For example, if two or more developers are working on a new feature, they can push changes between themselves before pushing a stable version to the origin repository. Git is not tied to any particular transmission protocol: it supports transmitting changes via USB stick, email, etc., or in the traditional way over HTTP, FTP, SSH, etc. So although Git has broken the typical SCM hub architecture into a peer-to-peer structure, we can still create (by convention) a central repository for uploading stable code.
And let me write it again: “This central repo is just another node in the peer-to-peer network, not THE REPOSITORY”. What I am going to explain is how to install and configure this “central repo” on an Ubuntu server. We can say that Git only takes care of repository management and leaves transport operations to lower layers. A typical transport configuration for these central repos is the SSH protocol, so let's install and configure an SSH server (if you have already installed one, skip to the next step).

Install the SSH server:

$ sudo apt-get install openssh-server

After it is installed, try:

$ ssh <username>@<servername>

Configure the SSH server: in /etc/ssh/sshd_config, configure it to only use SSH protocol 2:

Protocol 2

The next step is to install Git (you can skip this step if you have already installed it). Install Git (the git package, not git-core):

$ sudo apt-get install git

Then execute the git command to check that it has been installed correctly. The next step is creating a bare repository for the project. By convention, bare repository directories end with .git.

Creating a bare repository from an existing repository:

$ git clone --bare my_project my_project.git

This command transforms /my_project/.git into my_project.git.

Creating a new bare repository: if you are starting a new project, you can initialize it directly as a bare repository using:

$ mkdir my_project.git
$ cd my_project.git
$ git init --bare

Now the whole structure is created and ready to be transferred. In case the initial project was started on a developer computer, you should copy this directory (using scp, for example) to origin. Then execute the next command:

$ git init --bare --shared

This command will properly add group read/write permissions. And now it is time to clone the created repository to the developer computer. I assume that the developer already has an account on the server (for connecting using ssh).
So go to the developer computer (or open another terminal) and type the next command:

$ git clone <username>@<servername>:/<directories>/my_project.git

If the user has read permissions on the my_project.git directory, the repository will be downloaded to the local computer. Write permissions are required for checking in changes. And now I suppose you are thinking that it was easy creating a remote repository, but now another problem arises. If your company is small, you can manually create a new user on the server for each developer; that should be easy to manage. But if your company is bigger, then managing all those users is hard. You must create an account for each one and, more importantly, they will have access to the server shell using ssh (not only for uploading code) or ftp, and this fact implies a security problem: you should take care of what a user can and cannot do in their shell. So, having arrived at this point, one can set up accounts for everyone, which is straightforward but can be cumbersome. Another way is using LDAP or any other centralized system, but that is beyond the scope of this post. A second method is to create an account called “git” on the server, ask every user who will have access to send their SSH public key, and add each key to the .ssh/authorized_keys file of the “git” user. I am sure this approach sounds familiar (the GitHub way?). So let’s explain this way. First of all, each user should send you their public key (they can find the *.pub file in their .ssh directory), or simply create a new one using the ssh-keygen command. See this tutorial for learning how to generate both keys: http://github.com/guides/providing-your-ssh-key. Setting up the Git server with user public keys: the first step is to create a git user with a .ssh directory.
#from server
$ sudo adduser git
$ su git
$ cd
$ mkdir .ssh

The next step is to create the authorized_keys file where all public keys will be stored. For example:

#from server
$ cat id_dsa.user1.pub >> ~/.ssh/authorized_keys
$ cat id_dsa.user2.pub >> ~/.ssh/authorized_keys

And now each developer with a public key published in authorized_keys and the private key in their own .ssh directory has access to the repository. Let's try it: open another terminal (this would be the developer machine in a real scenario) and try to clone the existing repo from the server:

#from developer computer
$ git clone git@<servername>:<directories>/my_project.git

After the repository is cloned to the developer computer, modifications can be made and pushed. And now you can say, “OK, I don't have to create one account for each developer, but I still have a problem with security”: each developer still has access to the shell. Yes, that is true, but you can easily restrict the “git” user to doing only Git activities with a limited shell called git-shell. The next step is specifying git-shell instead of bash for the git user in /etc/passwd:

$ sudo vim /etc/passwd

and change

git:x:1000:1000::/home/git:/bin/sh

to

git:x:1000:1000::/home/git:/usr/bin/git-shell

Now your server is secured: only Git operations are allowed using the “git” account, for users that have sent their SSH public key. You have your central remote repository configured and ready to be used; at this point you may consider installing Git tools like gitweb, gitosis or gitolite, but they are off-topic in this post. I hope you have found this post useful. Reference: Git DVCS – Getting started from our JCG partner Alex Soto at the One Jar To Rule Them All blog. Related Articles: Services, practices & tools that should exist in any software development house, part 1 Dealing with technical debt Diminishing Returns in software development and maintenance This comes BEFORE your business logic!
When Inheriting a Codebase, there are more questions than answers… Java Tools: Source Code Optimization and Analysis...

Best Of The Week – 2011 – W53

Hello guys, time for the “Best Of The Week” links for the week that just passed. Here are some links that drew Java Code Geeks’ attention:

* 11 Things every Software Developer should be doing in 2012: A great list of things that developers should be doing in the year to come. Perfect for new year’s resolutions :-). Also check out Things Every Programmer Should Know.

* Practical Garbage Collection, part 1 – Introduction: Must-read article for an introduction to Garbage Collection. Enough said.

* Android SDK: Build a Mall Finder App – Mapview & Location: A very helpful tutorial that shows how to use both the integrated Google Maps functionality and the location-based capabilities of Android. Also check out Android Google Maps Tutorial and Android Location Based Services Application.

* Smelly Communication: How the Suits should assign tasks to Geeks: This article discusses the communication problems between the “Suits” (Marketing, Sales, Creative folks) and the “Geeks” (Devs, Ops and Infra folks) which stem from a fundamental gap in understanding between translating the language of business needs into the language of technical requirements.

* Provisioning of Java web applications using Chef, VirtualBox and Vagrant: This DevOps tutorial describes how to set up Chef (configuration management), VirtualBox (virtualization) and Vagrant in order to deploy a very simple web application to a Tomcat instance in a dynamic topology of virtualized servers.

* Java Servlet 3.0 Tutorial: WebServlet Annotations with NetBeans 7, Jetty and Maven: A quick tutorial on how to implement a 3.0 Servlet using WebServlet annotations and how to deploy it on Jetty using Maven and NetBeans. Also check out Servlet 3.0 Async Processing for Tenfold Increase in Server Throughput and JAX–WS with Spring and Maven Tutorial.

* 10 Indispensable NOC Tools for Linux and BSD: As the title suggests, some Linux/BSD-based tools that will help you with NOC administration.
Tools like Nagios, Zenoss, CloudPassage, Htop, Xen and others are suggested.

* OSGi: An Introduction: A nice introductory article to OSGi; it demonstrates how to use the framework in order to build modular systems. Eclipse and the Equinox platform are used for this example. Also check out OSGi – Simple Hello World with services and OSGi Using Maven with Equinox.

* Integrating Lucene with HBase: A very interesting article that describes how to integrate Lucene (search library) with HBase (NoSQL data storage) in order to build a highly scalable search implementation. For this, a memory-based backend is used as an in-memory cache and a mechanism for synchronizing this cache with the HBase backend is implemented.

* The Rise of Application Analytics: A New Game Demands New Rules: This article discusses the emerging application analytics market, which offers unprecedented insight into how software is being used in the real world and gives multiple stakeholders a reliable way to measure and manage development investments. Application analytics correlates behavior with business results and extends beyond browser clicks on a web page.

That’s all for this week. Stay tuned for more, here at Java Code Geeks. Cheers, Ilias Related Articles: Best Of The Week – 2011 – W52 Best Of The Week – 2011 – W51 Best Of The Week – 2011 – W50 Best Of The Week – 2011 – W49 Best Of The Week – 2011 – W48 Best Of The Week – 2011 – W47 Best Of The Week – 2011 – W46 Best Of The Week – 2011 – W45 Best Of The Week – 2011 – W44 Best Of The Week – 2011 – W43...

What I Learnt about JavaFX Today

In case you haven’t heard, JavaFX 2 is the new desktop / web / client framework for Java. It’s had a considerable overhaul since JavaFX 1 (which was frankly not that impressive). Out has gone the custom scripting language; instead you write it using standard Java plus an XML-based language for the actual UI presentation. So today, a friend and I got together at one of our places to teach ourselves a bit of JavaFX. Here’s what we learned, starting with some of the yak-shaving we had to do:

First of all, install the JavaFX developer preview – get it here. You have to unzip it and place the resulting directory somewhere sensible, chown’d to root. I put it in /usr/local/javafx-sdk2.1.0-beta/. Next, you’ll want an IDE to go with that. NetBeans is the IDE which is the most advanced and usable with JavaFX 2; you want NetBeans 7.1 RC2. To get this to install on a Mac, you need JavaForMacOSX10.7.dmg – no lower version of official Apple Java will do, and an OpenJDK build won’t work either (even if it’s the correct version or higher). Once it’s installed, NetBeans will work fine with other JREs (I was mostly running it against the Java 7 Developer Preview). To start new JavaFX projects, you need to tell NetBeans where to find JavaFX. For this, you need to create a new JavaSE platform profile and add the JavaFX dependencies in manually.

Once it was installed, we started working with JavaFX properly. Our project for the day was to try to replicate some of Victor Grazi’s concurrency animations in JavaFX – both to teach ourselves the JavaFX technology and to create some teaching tools as outputs. JavaFX uses Application as the main class to subclass; the API docs are here. If you’ve done any Flex development, JavaFX will seem very natural.
For example: the FXML file provides the UI and layout. The top-level FXML element has an fx:controller attribute, which defines the Controller for this View. FXML elements are bound to members of the controller class which have been annotated with the @FXML annotation; the fx:id property is used to define the name of the member that is being bound to the FXML element. Binding also occurs to methods: for example, buttons use an onAction handler, like this: onAction="#isFutureDone". The #methodName syntax is used to say which method should be called when the button is pressed. From this it’s very easy to get started with building up a basic application. Some things that we found:

The UI thread can be quite easy to tie up. Don’t ever call a blocking method directly from the Controller object, as triggering this code path on the UI thread will cause the display to hang. Be careful of exception swallowing. If you have a method in an object which is updating a UI element, but which is not annotated with @FXML, then you seem to need to call requestLayout() on the UI element after updating it. We’re not sure we got to the bottom of why – please enlighten us if you know why. The framework seems to use custom classloading to transform the FXML file into a “scene graph” of objects, seemingly a bit like how Spring does it.

On the whole, we were quite impressed with our short hack session. The APIs seem clean, and the overall design of the framework seems sound. There were a few stability issues, but this is bleeding-edge tech on a Mac – both the JDK and the JavaFX runtime are Developer Previews. We’ll definitely be back to do some more with JavaFX, and look forward to seeing it mature and become a fully-supported OSS framework for client development in Java. Reference: What I Learnt about JavaFX Today from our JCG partner Martijn Verburg at the Java 7 Developer Blog.
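The warning above about tying up the UI thread is not JavaFX-specific, and the remedy has the same shape in any toolkit. Here is a rough, framework-free sketch of the pattern using plain java.util.concurrent (no JavaFX classes; the class and method names are ours, purely for illustration — in real JavaFX code you would hand the result back to the UI thread, e.g. via Platform.runLater):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundWorkSketch {

    // Run a blocking call on a worker pool so the caller (the "UI thread"
    // in this sketch) stays free to keep processing events.
    static Future<String> fetchInBackground(ExecutorService pool,
                                            Callable<String> blockingCall) {
        return pool.submit(blockingCall);
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();

        Future<String> result = fetchInBackground(pool, () -> {
            Thread.sleep(100); // stand-in for a slow service call
            return "done";
        });

        // A real UI thread would keep painting here; we only block at the
        // very end to show the demo's result.
        System.out.println(result.get());
        pool.shutdown();
    }
}
```

The point is simply that the event handler submits the work and returns immediately; only the completion callback touches the UI again.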
Related Articles :Migrating from JavaFX 1.3 to JavaFX 2.0 JavaFX 2.0 beta sample application and after thoughts JavaOne is Rebuilding Momentum Sometimes in Java, One Layout Manager Is Not Enough...

Simplifying RESTful Search

Overview

The REST architectural pattern is based around two basic principles:

Resources as URLs: a resource is something like an entity or a noun in modelling lingo. Anything on the web is identified as a resource, and each unique resource is identified by a unique URL.

Operations as HTTP methods: REST leverages existing HTTP methods, particularly GET, PUT, POST, and DELETE, which map to a resource’s read, update, create, and delete operations respectively.

Any action performed by a client over HTTP contains a URL and an HTTP method. The URL represents the resource and the HTTP method represents the action which needs to be performed over the resource. Being a broad architectural style, REST is open to different interpretations. The ambiguity is exacerbated by the fact that there aren’t nearly enough HTTP methods to support common operations. One of the most common examples is the lack of a ‘search’ method: search is one of the most extensively used features across applications, yet there has been no standard for implementing it. Because of this, different people tend to design search in different ways. Given that REST aims to unify service architecture, any ambiguity must be seen as weakening the argument for REST. Further in this document, we discuss how search over REST can be simplified. We are not aiming at developing standards for RESTful search, but we shall discuss how this problem can be approached.

Search Requirements

Search, being one of the most used features, tends to support a similar feature set across different web applications.
Below is a list of some common constituents of search features:

Search based on one or more criteria at a time: search red-colored cars of type hatchback – color=red && type=hatchback

Relational and conditional operator support: search red or black cars with mileage greater than 10 – color=red|black && mileage > 10

Wild card search: search cars from a manufacturer whose name starts with M – company=M*

Pagination: list all cars but fetch 100 results at a time – upperLimit=200 && lowerLimit=101

Range searches: get all the cars launched between 2000 and 2010 – launch year between (2000, 2010)

When we support search with such features, the search interface design itself becomes complex, and when implemented in a REST framework, meeting all these requirements (while still conforming to REST!) is challenging. Coming back to the basic REST principles, we are left with the following two questions:

Which HTTP method should be used for “search”?

How do we create an effective resource URL for search? (Query parameters versus embedded URLs; modelling filter criteria.)

HTTP Method Selection: Effectively, REST categorizes operations by their nature and associates well-defined semantics with these categories. The idempotent operations are GET, PUT and DELETE (GET for read-only, PUT for update, DELETE for remove), while the POST method is used for non-idempotent procedures like create. By definition, search is a read-only operation which requests a collection of resources filtered based on some criteria, so the GET HTTP method is an obvious choice. However, with GET we are constrained with respect to URL size if we add complex criteria to the URL.

URL Representation: Let’s discuss this using an example: a user wishes to search for four-doored sedan cars of blue color; what should the resource URL for this request look like?
The two URLs below are syntactically different but semantically the same:

/cars/?color=blue&type=sedan&doors=4
/cars/color:blue/type:sedan/doors:4

Both of the above URLs conform to the RESTful way of representing a resource query, but they are represented differently. The first one uses URL query criteria to add filtering details, while the second goes by an embedded-URL approach. The embedded-URL approach is more readable and can take advantage of the native caching mechanisms that exist on the web server for HTTP traffic. But this approach requires the user to provide parameters in a specific order; wrong parameter positions will cause an error or unwanted behaviour. The two below look the same but may not give you the same results:

/cars/color:red/type:sedan
/cars/type:sedan/color:red

Also, since there is no standardization for embedding criteria, people tend to devise their own ways of representation. So we prefer the query-criteria approach over the embedded-URL approach, though the representation is a bit more complex and lacks readability.

Modelling Filter Criteria: A search-results page is fundamentally RESTful even though its URL identifies a query. The URL should be able to incorporate SQL-like elements: while SQL is meant to filter data fetched from relational tables, the new modelling language should be able to filter data from a hierarchical set of resources. Such a language helps in devising a mechanism to communicate complex search requirements over URLs. Further in this section, two such styles are discussed in detail.

Feed Item Query Language (FIQL): The Feed Item Query Language (FIQL, pronounced “fickle”) is a simple but flexible, URI-friendly syntax for expressing filters across the entries in a syndicated feed. These filter expressions can be mapped to any RESTful service and can help in modelling complex filters.
Below are some samples of such web URLs against their respective SQL statements:

select * from actors where firstname='PENELOPE' and lastname='GUINESS'
    /actors?_s=firstname==PENELOPE;lastname==GUINESS

select * from actors where lastname like 'PEN%'
    /actors?_s=lastname==PEN*

select * from films where filmid=1 and rentalduration <> 0
    /films?_s=filmid==1;rentalduration!=0

select * from films where filmid >= 995
    /films?_s=filmid=ge=995

select * from films where releasedate <= '2005-05-27'
    /film?_s=releasedate=le=2005-05-27T00:00:00.000%2B00:00

Resource Query Language (RQL): Resource Query Language (RQL) defines a syntactically simple query language for querying and retrieving resources. RQL is designed to be URI-friendly, particularly as the query component of a URI, and highly extensible. RQL is a superset of HTML’s URL encoding of form values, and a superset of the Feed Item Query Language (FIQL). RQL basically consists of a set of nestable named operators, each of which has a set of arguments and operates on a collection of resources.

Case study: Apache CXF advanced search features. To support advanced search capabilities, Apache CXF has included FIQL support in its JAX-RS implementation since the 2.3.0 release. With this feature, users can express complex search expressions in a URI. Below is a detailed note on how to use this feature. To work with FIQL queries, a SearchContext needs to be injected into application code and used to retrieve a SearchCondition representing the current FIQL query. This SearchCondition can be used in a number of ways for finding the matching data.
@Path("books")
public class Books {
    private Map<Long, Book> books;

    @Context
    private SearchContext searchContext;

    @GET
    public List<Book> getBooks() {
        // SearchCondition can be used to build a list of matching beans:
        // iterate over all the values in the books map and return a
        // collection of matching beans
        SearchCondition<Book> sc = searchContext.getCondition(Book.class);
        List<Book> found = sc.findAll(books.values());
        return found;
    }
}

SearchCondition can also be used to get at all the search requirements (originally expressed in FIQL) and do some manual comparison against the local data. For example, SearchCondition provides a utility toSQL(String tableName, String... columnNames) method which internally introspects all the search expressions constituting the current query and converts them into an SQL expression:

// find all conditions with names starting with 'ami'
// and levels greater than 10:
// ?_s=name==ami*;level=gt=10
SearchCondition<Book> sc = searchContext.getCondition(Book.class);
assertEquals("SELECT * FROM table WHERE name LIKE 'ami%' AND level > '10'",
             sc.toSQL("table"));

Conclusion: Data querying is a critical component of most applications. With the advance of rich client-driven Ajax applications and document-oriented databases, new querying techniques are needed; these techniques must be simple but extensible, designed to work within URIs, and able to query for collections of resources. The NoSQL movement is opening the way for a more modular approach to databases, separating modelling, validation, and querying concerns from storage concerns, but we need new querying approaches to match this more modern architectural design.   Reference: Guava’s Strings Class from our JCG partner Dustin Marx at the Inspired by Actual Events blog. ...
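To make the FIQL-to-SQL mapping in the table above concrete outside of any framework, here is a toy translator for a small subset of the grammar (==, !=, =gt=, =ge=, =le=, with ';' as AND and '*' as a wildcard). This is purely illustrative and is not CXF code; real implementations must also handle parsing, escaping, and parameterized SQL:

```java
import java.util.ArrayList;
import java.util.List;

public class FiqlSketch {

    // Translate a tiny FIQL subset into an SQL-like string.
    static String toSql(String fiql) {
        List<String> clauses = new ArrayList<>();
        for (String part : fiql.split(";")) {          // ';' means AND
            String clause;
            if (part.contains("=gt=")) {
                String[] kv = part.split("=gt=");
                clause = kv[0] + " > '" + kv[1] + "'";
            } else if (part.contains("=ge=")) {
                String[] kv = part.split("=ge=");
                clause = kv[0] + " >= '" + kv[1] + "'";
            } else if (part.contains("=le=")) {
                String[] kv = part.split("=le=");
                clause = kv[0] + " <= '" + kv[1] + "'";
            } else if (part.contains("!=")) {
                String[] kv = part.split("!=");
                clause = kv[0] + " <> '" + kv[1] + "'";
            } else {                                   // "==", '*' -> LIKE '%'
                String[] kv = part.split("==");
                clause = kv[1].contains("*")
                        ? kv[0] + " LIKE '" + kv[1].replace("*", "%") + "'"
                        : kv[0] + " = '" + kv[1] + "'";
            }
            clauses.add(clause);
        }
        return "SELECT * FROM table WHERE " + String.join(" AND ", clauses);
    }

    public static void main(String[] args) {
        // Same query as the CXF toSQL example above.
        System.out.println(toSql("name==ami*;level=gt=10"));
        // prints: SELECT * FROM table WHERE name LIKE 'ami%' AND level > '10'
    }
}
```

Even this toy version shows why FIQL fits in a URI so well: the whole expression is a single query-parameter value built from URL-safe tokens.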
Java Code Geeks and all content copyright © 2010-2014, Exelixis Media Ltd | Terms of Use | Privacy Policy
All trademarks and registered trademarks appearing on Java Code Geeks are the property of their respective owners.
Java is a trademark or registered trademark of Oracle Corporation in the United States and other countries.
Java Code Geeks is not connected to Oracle Corporation and is not sponsored by Oracle Corporation.