The challenge: a colleague is trying to write a memory-intensive application. He's trying to allocate large chunks of memory, and nothing seems to work. Here are his notes:
I'm trying to do some memory-intensive work in Java and am running into limits far sooner than I would expect.
First off, I would like to access as much of the 2 GB process limit as I can. It would appear that 1.4 GB is the largest heap that can be requested with the -Xmx switch. So be it.
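As a quick sanity check (a minimal sketch of my own, not the colleague's code; the class name is hypothetical), you can ask the JVM what ceiling it actually granted:

    // HeapCeiling.java -- a hypothetical diagnostic, assuming a 1.4-era JVM.
    // Run with, e.g.: java -Xmx1400m HeapCeiling
    public class HeapCeiling {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024L * 1024L;
            // maxMemory() reflects the -Xmx ceiling; totalMemory() is what
            // the JVM has actually reserved from the OS so far.
            System.out.println("max heap:   " + (rt.maxMemory() / mb) + " MB");
            System.out.println("total heap: " + (rt.totalMemory() / mb) + " MB");
            System.out.println("free heap:  " + (rt.freeMemory() / mb) + " MB");
        }
    }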
However, we only appear to get use of a small chunk of that, maxing out around 340 MB. Who takes that space? How can I get more of it?
Some examples of the problem: I allocate an array (1- or 3-dimensional) of 50 million ints and another array of 10 million doubles. The raw space is 280 MB -- 200 MB for the ints and 80 MB for the doubles. I see the process virtual memory grow to about 290 MB, an acceptable overhead. So far so good. Then I do some simple operations involving instantiations of small numbers of small objects. Almost immediately, and fairly frequently, the garbage collector runs (I use -verbose:gc to see when) and the virtual memory of the process jumps to about 460 MB. It seems to allocate a fixed fraction of the space consumed on the heap for use during garbage collection.
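For the record, here is a small reproduction of that scenario (my own sketch; the array sizes come from the notes above, but the small-object churn loop is an assumption about what the real code does):

    // BigArrays.java -- hypothetical repro; watch it with -verbose:gc
    // and the process size in Task Manager / top.
    public class BigArrays {
        public static void main(String[] args) {
            int[] ints = new int[50 * 1000 * 1000];          // ~200 MB (4 bytes each)
            double[] doubles = new double[10 * 1000 * 1000]; // ~80 MB (8 bytes each)

            // Churn short-lived small objects so the collector kicks in.
            // (Java 1.4: no autoboxing, hence the explicit Integer.)
            for (int i = 0; i < ints.length; i++) {
                Integer boxed = new Integer(i % 1000);
                ints[i] = boxed.intValue();
            }
            for (int i = 0; i < doubles.length; i++) {
                doubles[i] = i * 0.5;
            }
            System.out.println("done: " + ints[0] + ", " + doubles[doubles.length - 1]);
        }
    }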
As another example, I have an application which accumulates values into large int and double arrays (similar to the above). By far the greatest fraction of the memory I'm using is in those arrays; the program is creating, then dropping references to, small objects whose primitive contents are used to fill the big arrays. The process crashes out of memory -- it exceeds 1.4 GB in VM size -- when I've built arrays that total about 340 MB.
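One way to see where the ceiling sits is to probe it directly (again my own diagnostic sketch, not the application code):

    // HeapProbe.java -- hypothetical probe: allocate ~10 MB int chunks
    // until the heap gives out, to see how much of -Xmx big arrays can use.
    import java.util.ArrayList;
    import java.util.List;

    public class HeapProbe {
        public static void main(String[] args) {
            List chunks = new ArrayList();  // raw type: Java 1.4 has no generics
            int mb = 0;
            try {
                while (true) {
                    chunks.add(new int[10 * 1024 * 1024 / 4]);  // ~10 MB of ints
                    mb += 10;
                }
            } catch (OutOfMemoryError e) {
                chunks = null;  // release the arrays so we can report safely
                System.out.println("heap gave out after ~" + mb + " MB of arrays");
            }
        }
    }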
This project is running under Java 1.4, due to limits imposed by our clients. However, it would be nice to know whether things get better in future versions.
So the question for you, my reader: is there a way to work around this? Who uses all the extra memory?