The Artima Developer Community

Agile Buzz Forum
Only getting 340 Megs of usable heap out of 2 Gigs in Java - why the limit?

Mark Levison

Posts: 877
Nickname: mlevison
Registered: Jan, 2003

Mark Levison is an agile software developer who writes Notes from a Tool User.
Only getting 340 Megs of usable heap out of 2 Gigs in Java - why the limit? Posted: Jul 6, 2007 3:17 PM

This post originated from an RSS feed registered with Agile Buzz by Mark Levison.
Original Post: Only getting 340 Megs of usable heap out of 2 Gigs in Java - why the limit?
Feed Title: Notes from a Tool User
Feed URL: http://feeds.feedburner.com/NotesFromAToolUser
Feed Description: Thoughts about photography, software development, reading, food, wine and the world around us.

The challenge: a colleague is trying to write a memory-intensive Java application. He needs to allocate large chunks of memory, and nothing seems to work. Here are his notes:

I'm trying to do some memory-intensive work in Java and am running into limits far sooner than I would expect.

First off, I would like to access as much of the 2GB process limit as I can. It would appear that 1.4GB is the largest heap that can be requested with the -Xmx switch. So be it.

However, we only appear to get use of a small chunk of that, maxing out around 340MB. Who takes that space? How can we get more of it?
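One way to see where the ceiling actually sits is to ask the JVM for its own view of the heap. A minimal sketch using the standard java.lang.Runtime calls (all available since well before 1.4); the class name is made up for illustration:

```java
// HeapProbe.java -- print the JVM's view of its heap limits.
// Run with, e.g.:  java -Xmx1400m HeapProbe
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();     // ceiling implied by -Xmx
        long total = rt.totalMemory(); // heap committed so far
        long free = rt.freeMemory();   // unused space within the committed heap
        System.out.println("max   = " + (max / (1024 * 1024)) + " MB");
        System.out.println("total = " + (total / (1024 * 1024)) + " MB");
        System.out.println("free  = " + (free / (1024 * 1024)) + " MB");
    }
}
```

Comparing max against what the process actually lets you allocate is a quick way to tell whether the limit is the -Xmx setting itself or something the collector is reserving beneath it.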

Some examples of the problem: I allocate an array (one- or three-dimensional) of 50 million ints and another array of 10 million doubles. The raw space is 280MB: 200MB for the ints and 80MB for the doubles. I see the process virtual memory grow to about 290MB, an acceptable overhead. So far so good. Then I do some simple operations involving instantiations of small numbers of small objects. Almost immediately, and fairly frequently, the garbage collector runs (I use -verbose:gc to see when) and the virtual memory of the process jumps to about 460MB. It seems to allocate a fixed fraction of the space consumed on the heap for use during garbage collection.
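A scaled-down sketch of that experiment follows. The array sizes are reduced by a factor of ten so it runs in a default heap; scale the constants back up and run with -Xmx and -verbose:gc to reproduce the original figures. The class name and constants are illustrative, not from the original post:

```java
// BigArrays.java -- allocate large primitive arrays, then churn small
// short-lived objects and watch the collector with:
//     java -verbose:gc BigArrays
public class BigArrays {
    static final int INTS = 5 * 1000 * 1000;    // original experiment: 50 million
    static final int DOUBLES = 1 * 1000 * 1000; // original experiment: 10 million

    public static void main(String[] args) {
        int[] ints = new int[INTS];             // 4 bytes per element
        double[] doubles = new double[DOUBLES]; // 8 bytes per element
        long raw = 4L * INTS + 8L * DOUBLES;
        System.out.println("raw array space = " + (raw / (1024 * 1024)) + " MB");

        // Simple operations on small, short-lived objects: this is the
        // churn that triggered the frequent collections described above.
        long sum = 0;
        for (int i = 0; i < 100000; i++) {
            Integer boxed = new Integer(i); // deliberately allocates a new object
            sum += boxed.intValue();
        }
        ints[0] = (int) sum;
        doubles[0] = sum;
        System.out.println("sum = " + sum);
    }
}
```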


As another example, I have an application that accumulates values into large int and double arrays (similar to the above). By far the greatest fraction of the memory I'm using is in those arrays; the program creates, then drops, references to small objects whose primitive contents are used to fill the big arrays. The process crashes out of memory -- exceeding 1.4GB in VM size -- when I've built arrays totalling about 340MB.


This project is running under Java 1.4, due to limits imposed by our clients. However, it would be nice to know whether things get better in later versions.

So the question for you, my reader: is there a way to work around this? Who is using all the extra memory?
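One avenue worth trying, given the observation above that the collector seems to reserve a fixed fraction of the heap for its own use: the 1.4-era collectors copy live objects between spaces, so they hold back room proportional to the heap. A hedged sketch of flags to experiment with (all of these options exist in JDK 1.4, but the values shown are guesses, not figures verified against this workload):

```shell
# Pin the heap at its maximum so the VM doesn't hold growth headroom,
# and shrink the young generation so less space is reserved for copying.
java -Xms1400m -Xmx1400m \
     -XX:NewRatio=8 \
     -verbose:gc MyApp
```

Whether this recovers much of the missing space depends on the collector in use; -verbose:gc output before and after is the way to judge.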

