
RE: [xsl] coping with huge xml-saxon


Subject: RE: [xsl] coping with huge xml-saxon
From: "Passin, Tom" <tpassin@xxxxxxxxxxxx>
Date: Mon, 16 Jun 2003 11:42:26 -0400

[XSL Chatr]
> I have a problem which I can't figure out how to get around. I have an
> XML file that is as big as 600 MB and everything fails!!!
> I am using Saxon and running this under a Win2K machine. It reports
> running out of memory, and I tried increasing the memory available to
> the JVM with the JVM parameter, but apparently we cannot increase the
> memory beyond a certain limit. I would appreciate it if someone can
> let me know how we can solve this.

The solution will have to involve some other approach.  An XSLT
processor may need to refer to any part of the XML document from any
other part, so it has to keep the entire source document available.
An in-memory tree can easily be ten times larger than the source file,
or even more, so there is no way you are going to make this work by
loading the whole document.

In theory, if the input file and the transformation were such that only
small, adjacent portions of the file needed to refer to each other, it
would be possible to stream the document through without keeping it all
in memory, but that capability is not available in practice.
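While XSLT itself can't do this, the streaming idea is available at a lower level through a SAX parse, which sees each element once as an event and then discards it.  Here is a minimal sketch; the element name "record" and the flat document shape are assumptions for illustration, not anything from the original question:

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class StreamCount {
    // Count <record> elements by streaming SAX events.  No tree is ever
    // built, so memory use stays flat no matter how large the input is.
    public static int countRecords(String xml) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new InputSource(new StringReader(xml)), new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                // Handle one record here, then forget it.
                if (qName.equals("record")) count[0]++;
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        System.out.println(countRecords(
            "<log><record>a</record><record>b</record></log>"));
    }
}
```

The trade-off is that you give up XPath's ability to look backwards and forwards in the document; whatever state you need, you carry along yourself.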

Often, documents of this size are log files that can easily be broken
up and processed piece by piece.  You should check whether that is
possible in your case.  Even if you can break up the data, XSLT may
not be the right tool for the job.
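If the document really is a flat sequence of independent records, the same SAX approach can do the splitting: copy out each record as its own small document, which can then be transformed one at a time.  A rough sketch, again assuming a hypothetical <record> element containing only text:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class LogSplitter {
    // Split a flat log document into one small XML string per record.
    // In a real run you would write each piece to its own file instead
    // of collecting them in a list.
    public static List<String> split(String xml) throws Exception {
        final List<String> pieces = new ArrayList<>();
        final StringBuilder current = new StringBuilder();
        final boolean[] inRecord = {false};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(new InputSource(new StringReader(xml)), new DefaultHandler() {
            @Override
            public void startElement(String u, String l, String q, Attributes a) {
                if (q.equals("record")) { inRecord[0] = true; current.setLength(0); }
            }
            @Override
            public void characters(char[] ch, int start, int len) {
                if (inRecord[0]) current.append(ch, start, len);
            }
            @Override
            public void endElement(String u, String l, String q) {
                if (q.equals("record")) {
                    pieces.add("<record>" + current + "</record>");
                    inRecord[0] = false;
                }
            }
        });
        return pieces;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(split("<log><record>a</record><record>b</record></log>"));
    }
}
```

Each piece is small enough to feed to Saxon individually, or to process with plain Java if XSLT turns out to be unnecessary.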

Cheers,

Tom P

 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list


