[XSL-LIST Mailing List Archive Home]
Re: [xsl] Splitting file into N-sized chunks
Subject: Re: [xsl] Splitting file into N-sized chunks
From: Stefan Krause <stf@xxxxxxxx>
Date: Mon, 10 Aug 2009 02:07:17 +0200
Michael Kay wrote:
> I suspect that level of accuracy isn't needed. A heuristic that says 500Kb
> of serialized XHTML = 250K characters in text nodes is probably quite
> adequate for the purpose.

Indeed, I've produced .epubs with thousands of minimal-size chunks (less than 5 kBytes each), and they work well. (Smaller chunks speed up page turning, and the end of a chunk forces a page break.)

The real challenge was keeping these chunks in sync with the EPUB's .opf and .ncx files, where every chunk must be registered, and handling internal links (consider <a href="#something">...</a>) and other references such as footnotes.

I'm afraid there is no simple answer to how to split a file for EPUB, because it depends on the input and affects other parts of the workflow.

Stefan
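To make the heuristic concrete: a minimal Python sketch (not from the thread; the list's own tool would be XSLT, e.g. xsl:for-each-group) that greedily packs sibling block elements into chunks whose accumulated text-node length stays under a character budget, in the spirit of the "250K characters in text nodes" rule of thumb. The function names and the 25-character test budget are my own illustration, not anything from the posts.

```python
# Sketch, assuming the input is already parsed into a flat sequence of
# sibling block elements (e.g. the children of an XHTML <body>).
import xml.etree.ElementTree as ET


def text_length(elem):
    """Total character count of all text nodes under elem."""
    return sum(len(t) for t in elem.itertext())


def split_into_chunks(blocks, max_chars=250_000):
    """Greedily pack blocks into chunks whose text length stays
    under max_chars each; a single oversized block still gets its
    own chunk rather than being split further."""
    chunks, current, size = [], [], 0
    for block in blocks:
        n = text_length(block)
        if current and size + n > max_chars:
            chunks.append(current)
            current, size = [], 0
        current.append(block)
        size += n
    if current:
        chunks.append(current)
    return chunks
```

Note that this only covers the easy part; as the post says, the hard part is registering each resulting chunk in the .opf manifest and .ncx navigation, and rewriting `href="#something"` links whose target ended up in a different chunk file.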