Fast, Predictable & Highly-Available @ 1 TB/Node

The world is pushing huge amounts of data at applications every second, from mobile devices, the web, and all kinds of gadgets. More and more applications have to deal with this data, and to preserve performance they need fast access to the data tier.
RAM prices have crumbled over the past few years, and hardware with a terabyte of RAM has become much more affordable. OK, we have the hardware, now what? We generally use virtualization to carve it into smaller virtual machines to meet an application's scale-out requirements, because running a Java application with a terabyte of heap is impractical: JVM garbage collection would slaughter it right away. Have you ever wondered how long a single full garbage collection of a terabyte heap would take? It can pause an application for hours, making it unusable.
BigMemory is the key to accessing terabytes of data with millisecond latency, and with no disks, RAID configurations, or databases to maintain.
BigMemory = Big Data + In-memory  
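The idea behind this is to keep the bulk of the data outside the garbage-collected Java heap, in native (off-heap) memory, so that full-GC pause times no longer grow with the size of the data. The following minimal sketch illustrates that principle only, using a plain direct ByteBuffer; it is not how BigMemory itself is implemented or used, and the buffer size and payload are just placeholders.

import java.nio.ByteBuffer;

public class OffHeapSketch {
    public static void main(String[] args) {
        // 1 GB allocated outside the Java heap (capped by -XX:MaxDirectMemorySize).
        // The garbage collector never scans this memory; only the small
        // ByteBuffer wrapper object lives on the heap.
        ByteBuffer offHeap = ByteBuffer.allocateDirect(1024 * 1024 * 1024);

        byte[] record = "hello off-heap".getBytes();
        offHeap.put(record);                       // write bytes off-heap
        offHeap.flip();

        byte[] readBack = new byte[record.length];
        offHeap.get(readBack);                     // read them back
        System.out.println(new String(readBack));
    }
}

BigMemory takes care of this kind of bookkeeping (serialization, allocation, eviction) for you, behind the standard Ehcache API.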

BigMemory can utilize your hardware down to the last byte of RAM, storing up to a terabyte of data in a single Java process.
BigMemory provides fast, predictable, and highly-available access to data at 1 terabyte per node.
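To make that concrete, here is a minimal sketch of how an off-heap store of this kind is typically configured programmatically with Ehcache and BigMemory. The cache name, sizes, and the stored key/value are illustrative assumptions; exact builder methods such as overflowToOffHeap and maxBytesLocalOffHeap may vary across Ehcache/BigMemory versions, and the JVM must be started with -XX:MaxDirectMemorySize at least as large as the off-heap tier, with the BigMemory jar and license on the classpath.

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;
import net.sf.ehcache.config.CacheConfiguration;
import net.sf.ehcache.config.Configuration;
import net.sf.ehcache.config.MemoryUnit;

public class BigMemoryExample {
    public static void main(String[] args) {
        // A small hot set stays on-heap; the bulk of the data is pushed
        // off-heap, where the garbage collector never visits it.
        Configuration config = new Configuration()
                .cache(new CacheConfiguration()
                        .name("bigMemCache")
                        .maxEntriesLocalHeap(10000)                          // on-heap tier
                        .overflowToOffHeap(true)
                        .maxBytesLocalOffHeap(960, MemoryUnit.GIGABYTES));   // off-heap tier

        CacheManager cacheManager = CacheManager.create(config);
        Cache cache = cacheManager.getCache("bigMemCache");

        cache.put(new Element("user:42", "a serialized value"));
        Element hit = cache.get("user:42");
        System.out.println(hit.getObjectValue());

        cacheManager.shutdown();
    }
}

The same tiering can also be declared in ehcache.xml; the 960 GB figure simply mirrors the per-box allocation used in the test below.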

The following test uses two boxes, each with a terabyte of RAM. Leaving enough room for the OS, we were able to allocate 2 x 960 GB of BigMemory, for a total of more than 1.8 TB of data, without high latencies or a sprawling scale-out architecture: just the hardware, used as it is.
Test results: 23K read-only transactions per second with 20 ms latency.
The graphs below show test throughput and periodic latency over time.
[Read-only periodic throughput graph]
[Read-only periodic latency graph]

Reference: Fast, Predictable & Highly-Available @ 1 TB/Node from our JCG partner Himadri Singh at the Billions & Terabytes blog.

