
Java heap space, native heap and memory problems

Recently, I was discussing with a friend why a Java process uses more memory than the maximum heap size we set when starting it.

All Java objects that your code creates are allocated inside the Java heap space, whose maximum size is defined by the -Xmx option. But a Java process consists of many memory spaces, not only the Java heap. A few of the spaces that make up a Java process are the following (the sketch after the list shows how to observe the difference between the heap limit and the full process size):

  • Loaded libraries (including jar and class files)
  • Control structures for the Java heap
  • Thread stacks
  • Generated (JIT-compiled) code
  • User native memory (malloc'ed in JNI code)
  • …and more
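
To see the difference in practice, here is a minimal sketch (assuming a Linux system, where the kernel exposes per-process memory via /proc/self/status) that prints the configured maximum heap next to the resident size of the whole process; the second number is normally larger than the first:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ProcessVsHeap {
        public static void main(String[] args) throws IOException {
            // Maximum heap as configured by -Xmx (or the JVM default).
            long maxHeap = Runtime.getRuntime().maxMemory();
            System.out.println("Max Java heap: " + (maxHeap / (1024 * 1024)) + " MB");

            // Resident size of the whole process, as reported by the Linux kernel.
            // This includes the heap plus thread stacks, JIT code, C-heap, etc.
            for (String line : Files.readAllLines(Paths.get("/proc/self/status"))) {
                if (line.startsWith("VmRSS")) {
                    System.out.println("Process RSS:   " + line.replace("VmRSS:", "").trim());
                }
            }
        }
    }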

On a 32-bit architecture, the total size of a process's address space cannot exceed 4 GB. So, a 32-bit Java process consists of many spaces (Java heap, native memory (C-heap) and others), and their combined size cannot exceed 4 GB.

Assume that on a 32-bit production system you have been running a Java application server with -Xmx1536m (Java heap set to 1.5 GB) for a long time, with many applications deployed. At some point, the customer wants to deploy more applications on the same application server. The system operators understand that, as the server will have to process more requests, it will also need to create more objects and do more processing. So, as a future-proof solution, they decide to increase the maximum heap of the Java process to 2 GB.
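
A rough address-space budget makes the trade-off visible (a simplified sketch that assumes the full 4 GB is usable by the process; the real user-space limit is lower and OS-dependent, as the comments below point out):

    4.0 GB address space − 1.5 GB Java heap ≈ 2.5 GB left for native memory, thread stacks, JIT code, ...
    4.0 GB address space − 2.0 GB Java heap ≈ 2.0 GB left for native memory, thread stacks, JIT code, ...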

OK, it looks like a good approach, but what actually happened on this production application server? (This is a real case.) The application server crashed with an OutOfMemoryError! Can you think of the possible causes?

My first thought was that 2 GB was not enough for all these applications under this load. Unfortunately, the problem was something else. What do you think now? I will help you a little.

java.lang.OutOfMemoryError: requested 55106896 bytes for Chunk::new.

The real cause was that the already deployed (old) applications needed a very large amount of native (C-heap) memory. (Chunk::new is an allocation made in native memory by the HotSpot JIT compiler, so this error refers to the C-heap, not the Java heap.) Before the operators increased the heap size (from 1.5 GB to 2 GB), they had not monitored how much native memory the old applications required. The side effect of their action was to automatically decrease the maximum native memory available to the Java process (from about 2.5 GB to 2 GB). As the old applications were already using almost all of that native memory, the change crashed the server!
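
Native memory pressure of this kind is easy to reproduce with direct buffers, which are allocated in native memory rather than in the Java heap. This is a minimal sketch for illustration only (the server in this story exhausted native memory through JNI and JIT allocations, not through code like this):

    import java.nio.ByteBuffer;
    import java.util.ArrayList;
    import java.util.List;

    public class NativeMemoryPressure {
        public static void main(String[] args) {
            List<ByteBuffer> buffers = new ArrayList<>();
            // Each direct buffer takes 64 MB of native memory, outside -Xmx.
            // Run with e.g. -Xmx64m: the Java heap stays tiny while the
            // process size keeps growing, until the direct-memory limit or
            // (on 32-bit) the process address space is exhausted.
            while (true) {
                buffers.add(ByteBuffer.allocateDirect(64 * 1024 * 1024));
                System.out.println("Direct memory held: " + buffers.size() * 64 + " MB");
            }
        }
    }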

The only acceptable solution in this case was to avoid increasing the maximum heap size, deploy the new applications, and live with less throughput. It is not a perfect solution, but it was the only viable one for this case (as our Java process had to remain 32-bit).

Especially on 32-bit systems, be aware of how much native memory your Java process requires before you increase the Java heap size. If you are in a situation where these two spaces conflict, then the solution may not be easy. If you cannot change your code to overcome the situation, then the most common solution is to move to a 64-bit system, where the maximum process size limit is much larger.
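
As an aside, on HotSpot JVMs from JDK 8 onwards (newer than the setup described above) the JVM's own native allocations can be inspected with Native Memory Tracking; a sketch, where server.jar stands in for your application:

    java -XX:NativeMemoryTracking=summary -jar server.jar
    jcmd <pid> VM.native_memory summary

Note that this tracks the JVM's internal allocations (JIT code, class metadata, GC structures, thread stacks); memory malloc'ed directly by JNI code still has to be observed at the OS level, for example through the resident size of the process.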

There are four major things to remember:

  • The maximum size limit of a process
  • A Java process does not consist only of the Java heap
  • The size of the native (C-heap) memory of a Java process cannot be configured explicitly, as is possible for the Java heap space
  • The amount of Java heap space and native (C-heap) memory an application requires is defined only by the application itself; there is no standard ratio between these two spaces


Reference: Java heap space, native heap and memory problems from our JCG partner Adrianos Dadis at the Java, Integration and the virtues of source blog.

Adrianos Dadis

Adrianos works as a senior software engineer in the telecom business domain. He is particularly interested in enterprise integration, multi-tier architecture and middleware services. He mainly works with Weblogic, JBoss, Java EE, Spring, Drools, Oracle SOA Suite and various ESBs.
5 Comments
Radek
11 years ago

Hi, sorry for my comment if I didn't understand your post correctly, but from my point of view the reason for the OutOfMemoryError is the -Xmx parameter. -Xmx sets the maximum size of the heap space, but you only get it if you have enough free contiguous physical memory; otherwise you get less. So in numbers, 32-bit Windows lets you allocate no more than 850 MB for heap space, and Linux about 1.2 GB.

Adrianos Dadis
11 years ago
Reply to  Radek

Hi Radek,

I am not sure how you concluded that on Linux you can get only 1.2 GB for the heap. I can assure you that we run many production servers with a 32-bit JVM, and all of them use 1536 MB for heap space. I believe you are confusing this with some concept related to process properties in Linux.
As for MS Windows, I believe the same, as I have run Java processes with more than 1 GB of heap space.

Adrianos Dadis.

SoftMAS | Software Development

Hi, great article; it is very simple to understand and very useful. Thanks for publishing.

James
10 years ago

4 GB is the size of addressable memory with 32 bits, but user processes don't get all of it on most operating systems; the kernel uses some too. On Solaris you'll get the whole lot, on Linux your process gets 3 GB, and normally on Windows you only get 2 GB.

agnidaju
9 years ago

Some information sounds vague; however, you have included a lot of good content. I am wondering if you could make one short video covering CPU/memory/heap-dump profiling and analysis from start to end. You have not mentioned suspension time. When do you take a heap dump, and how do you interpret a memory leak (including stack, array, class, block, thread, etc.)?
I know asking questions is always easy; however, I am waiting to get some clarity on the topics I mentioned above.
Thank you,
Agni Daju
