Will XML eat the web?

Michael.Kay at icl.com
Fri Jan 29 13:41:27 GMT 1999

> From: Matthew Sergeant (EML) [mailto:Matthew.Sergeant at eml.ericsson.se]
> Server based XML processing
> XML processing is a
> resource hog. There's not really much you can do about it. 
> Sure, you can use DCOM or Corba to distribute processing your XML across 
> several servers - that's throwing hardware at the problem - not always the
> solution. You can use a persistent parsed structure like a DOM maintained 
> in memory, but for some applications such as a rapidly changing XML
> this isn't always feasible (or is it?). Currently our web based XML 
> system processes about 5 files per second (very subjective figures)...

How big are your XML files? My experience is that provided you split the
data up into "page-sized chunks" and only parse the data the user wants to
see, you can get much higher performance than this. Also, I've found that an
XML-to-HTML conversion that works in a serial pass using a SAX parser (with
SAXON, of course) is faster than anything that involves using a DOM or
de-serialising Java objects, and its cost is negligible compared with the cost
of reading the XML from a relational database or interpreting an ASP script.
But of course, what's true with 2Kb XML files may not be true with 200Kb.
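To make the serial-pass idea concrete, here is a minimal sketch using the JDK's standard SAX API rather than SAXON itself: a ContentHandler that emits HTML as events arrive, so nothing is ever built in memory. The element names (title, para) and their HTML mappings are hypothetical, just to show the shape of the approach.

```java
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;
import java.io.StringReader;

public class XmlToHtml extends DefaultHandler {
    private final StringBuilder html = new StringBuilder();

    // Hypothetical mapping: <title> becomes <h1>, <para> becomes <p>.
    @Override
    public void startElement(String uri, String local, String qName, Attributes atts) {
        if (qName.equals("title")) html.append("<h1>");
        else if (qName.equals("para")) html.append("<p>");
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if (qName.equals("title")) html.append("</h1>");
        else if (qName.equals("para")) html.append("</p>");
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        // Text nodes are copied straight through to the output.
        html.append(ch, start, length);
    }

    public String getHtml() {
        return html.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<doc><title>Hello</title><para>World</para></doc>";
        XmlToHtml handler = new XmlToHtml();
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new InputSource(new StringReader(xml)), handler);
        System.out.println(handler.getHtml());
        // prints: <h1>Hello</h1><p>World</p>
    }
}
```

The point is that the whole conversion happens in one pass over the input stream: no tree is built, so memory stays flat however large the document gets.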

Mike Kay

xml-dev: A list for W3C XML Developers. To post, mailto:xml-dev at ic.ac.uk
Archived as: http://www.lists.ic.ac.uk/hypermail/xml-dev/
To (un)subscribe, send mailto:majordomo at ic.ac.uk the following message:
(un)subscribe xml-dev
To subscribe to the digests, send mailto:majordomo at ic.ac.uk the following message:
subscribe xml-dev-digest
List coordinator, Henry Rzepa (mailto:rzepa at ic.ac.uk)
