
java.lang.OutOfMemoryError on 3.5 MB XML file with 2 GB max VM

Posted: Mon Aug 16, 2004 10:16 pm
by esoniat
This really isn't unreasonable (and Emacs can handle it ;-)).
The file is 3.5 MB. I have 1 GB of RAM and 2 GB of swap, and I give
the VM 2 GB with -Xmx2000m -Xms512m.

Really seems like it should fit.
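For what it's worth, a quick way to check whether the -Xmx setting is actually reaching the VM is to print the heap limit from Java itself. This is a generic sketch (the class name is made up, and it says nothing about oXygen's own launcher, which may override your flags):

```java
// HeapCheck.java -- prints the maximum heap the running JVM will use.
// Run with, e.g.:  java -Xmx2000m HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024L * 1024L)) + " MB");
    }
}
```

If this prints far less than 2000 MB, the flag never made it to the VM that oXygen runs in.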

Posted: Wed Aug 18, 2004 11:11 am
by george

It would help if you could make your file available so we can reproduce the problem.
We test oXygen with similarly large files and it works without problems with the maximum memory set to 256 MB. Take for instance ot.xml from ..., which is ~3.5 MB.

Best Regards,

Posted: Fri Oct 01, 2004 11:35 pm
by fastjack
hi, let me add something...

I also experience Out of Memory exceptions.

My situation is this: I have a master XML file (DocBook) which includes 10 other DocBook files using XInclude.
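For readers unfamiliar with the setup, a DocBook master file using XInclude looks roughly like this (the file names are made up for illustration):

```xml
<?xml version="1.0"?>
<book xmlns:xi="http://www.w3.org/2001/XInclude">
  <title>Example Manual</title>
  <!-- each chapter lives in its own file and is pulled in here -->
  <xi:include href="chapter1.xml"/>
  <xi:include href="chapter2.xml"/>
  <!-- ... eight more includes ... -->
</book>
```

The point being: the serialized content is tiny; it's the editor's per-file overhead that adds up.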

When I have all the files open, the oXygen process already eats about 180 MB of RAM, which seems like quite a lot. The XML itself is not huge; all the files together take 89 KB on disk.

When I then try to transform it to PDF, I get these exceptions. The workaround, of course, is to close all open files, restart oXygen, and open only the main file (in which case oXygen takes only 70 MB). Then the transformation runs.

I have 512 MB of RAM, and oXygen really pushes the limits here :). (Together with a browser, since Firefox also eats 60 MB nowadays, plus IM software and a mail client, Thunderbird...)

So, in the end... having had that experience with 89 KB of XML...

...maybe you could rethink why the application needs so much extra RAM for every open file, and why it needs so much RAM in the first place. The task doesn't seem overly complex, so maybe there is a little waste here and there... :-) Using oXygen inside Eclipse makes the situation even worse, because Eclipse already eats a lot of RAM and CPU power itself, so I get Out of Memory exceptions far earlier inside Eclipse than with oXygen running in standalone mode.

Another symptom of the RAM hunger: Windows pages unused applications out to disk to free RAM for the applications in use. Meaning: if I minimize oXygen and forget about it for a while, it takes ages to bring it back, because Windows has to reload so many pages from disk at once.

Yes, "buy more RAM" will help. I know. Maybe later.

Granted: with just one or two open files, everything runs quite smoothly.


Posted: Sat Oct 02, 2004 7:37 pm
by catterall
Some time ago I posted a message about this problem and got no reply. Briefly: on Mac OS X 10.2.8 or 10.3.5, with 1.25 GB of memory, 1.0 GB allocated to oXygen, on G4 server hardware, I get the same "out of memory" during transformations. The problem occurs ONLY after I have been working on several files for some time. Shutting oXygen down and restarting clears the memory problem, and the transformation then goes OK.

This looks to me like a classic memory leak: somewhere, some process is using memory and not releasing it correctly. Killing the overall process finally releases the memory, and we start again the next time.
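One crude way to watch for this kind of leak from inside a Java program is to sample the used heap before and after a forced collection. This is a generic sketch (the class name is invented, and it does not hook into oXygen's internals); if the "after GC" number keeps climbing across repeated edit/transform cycles, something is retaining memory:

```java
// MemSample.java -- samples used heap before and after requesting a GC.
public class MemSample {
    // Used heap in megabytes: total allocated minus free within it.
    static long usedMb() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) / (1024L * 1024L);
    }

    public static void main(String[] args) {
        System.out.println("Used before GC: " + usedMb() + " MB");
        System.gc(); // only a hint to the VM, but usually triggers a full collection
        System.out.println("Used after GC:  " + usedMb() + " MB");
    }
}
```

If "after GC" stays high even with all documents closed, the memory is being held by live references, which is consistent with a leak rather than ordinary working-set growth.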

My final file (after all the XIncludes) comes to about 1.1 MB. Even if I run a separate copy transformation to resolve the XIncludes as a first step, I can still get the problem on an FO transformation of the big file.

I don't believe it is a file-size problem at all; it's a memory leak.

On what hardware/operating systems do others see this problem?