[XSL-LIST Mailing List Archive Home]
RE: [xsl] Transforming large XML docs in small amounts of memory
Subject: RE: [xsl] Transforming large XML docs in small amounts of memory
From: "Michael Kay" <mike@xxxxxxxxxxxx>
Date: Wed, 2 May 2007 09:53:39 +0100
> >> I think that transforming 150Mb of data in 400Mb of RAM would be a
> >> sensible target (is this sensible?)
> >
> > That's ambitious. To achieve that, you're going to have to do
> > something that condenses the input document before transformation.
>
> What would you say was a reasonable target? I expect it will
> be dependent on many factors.

I reckon a factor of 5x is usually achievable - so 750Mb. But it does depend. The biggest variable is the proportion of the 150Mb that's taken up with long element names - sometimes, e.g. in FpML, this is a huge proportion of the total, in which case you can do much better than 5x.

Michael Kay
http://www.saxonica.com/
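[A minimal sketch, not from the original post, of how one might measure this expansion factor. It builds a small document in the FpML style (highly repetitive, long element names - the names and structure here are invented for illustration), then compares the file size on disk against the peak heap allocated while parsing it into an in-memory tree with Python's xml.etree.ElementTree. Because parsers typically intern tag names, memory per element does not grow with tag length, which is why documents dominated by long element names expand by less than the raw byte count suggests.]

```python
# Sketch: compare on-disk XML size with peak memory used to build the tree.
# The element names below are invented, chosen only to be long and repetitive.
import os
import tempfile
import tracemalloc
import xml.etree.ElementTree as ET

# Build a repetitive document dominated by long element names.
rows = "".join(
    "<notionalAmountSchedule><currencyReference>USD</currencyReference>"
    "</notionalAmountSchedule>"
    for _ in range(5000)
)
xml_text = "<portfolio>" + rows + "</portfolio>"

with tempfile.NamedTemporaryFile(mode="w", suffix=".xml", delete=False) as f:
    f.write(xml_text)
    path = f.name

file_size = os.path.getsize(path)

# Measure allocations made while parsing the whole document into a tree,
# as a conventional (non-streaming) transformation would.
tracemalloc.start()
tree = ET.parse(path)
_, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

ratio = peak / file_size
print(f"file: {file_size} bytes, peak parse memory: {peak} bytes, ~{ratio:.1f}x")
os.remove(path)
```

[The exact ratio depends heavily on the parser and tree model; the point is only that the in-memory representation is a multiple of the raw size, and that the multiple shrinks as interned long names make up more of the file.]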