Will XML eat the web?
Tim McCune
timm at channelpoint.com
Thu Jan 28 17:58:51 GMT 1999
Whenever you run into processing bottlenecks, cache and distribute. The
most obvious thing to cache is the result of applying the XSL to the
XML, for as long as you can. Caching DOMs also helps. If your DOM is
very dynamic, you can also exercise more fine-grained control by caching
individual elements, or caching the objects that create these elements.
You can also cache stylesheets. We did all of this on the Java Lobby,
and it gave us more of a speed boost than anything else.
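As a concrete illustration, here is a minimal Java sketch of that layering,
using the JAXP javax.xml.transform API (which postdates this thread -
substitute your processor's equivalent). The TransformCache class and its
unbounded maps are hypothetical; a real cache would bound its size and
invalidate entries when the underlying files change:

    import java.io.File;
    import java.io.StringWriter;
    import java.util.HashMap;
    import java.util.Map;
    import javax.xml.transform.Templates;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class TransformCache {
        private final TransformerFactory factory = TransformerFactory.newInstance();
        // Compiled stylesheets are expensive to build and reusable, so keep them.
        private final Map<String, Templates> stylesheets = new HashMap<String, Templates>();
        // Finished HTML for an (xml, xsl) pair -- the biggest win of all.
        private final Map<String, String> results = new HashMap<String, String>();

        public synchronized String transform(File xml, File xsl) throws Exception {
            String key = xml.getPath() + "|" + xsl.getPath();
            String html = results.get(key);
            if (html != null) {
                return html;            // cache hit: no parse, no transform
            }
            Templates compiled = stylesheets.get(xsl.getPath());
            if (compiled == null) {
                compiled = factory.newTemplates(new StreamSource(xsl));
                stylesheets.put(xsl.getPath(), compiled);
            }
            StringWriter out = new StringWriter();
            compiled.newTransformer().transform(new StreamSource(xml), new StreamResult(out));
            html = out.toString();
            results.put(key, html);     // cache the rendered page itself
            return html;
        }
    }

The result cache is where the big win comes from: a hit skips parsing and
transformation entirely. The stylesheet cache still pays for the transform,
but not for recompiling the XSL on every request.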
As far as bottlenecks on the browser go, I once worked on a project
where a tiny applet ran when the user first entered the site and
measured the client's resources, such as CPU speed. The result was
stored in the client's session so that the server could then shift
processing to the client (sending it XML/XSL) or keep it on the server
(sending the client HTML), whichever gave the client the fastest
response. This is a perfect fit with XML/XSL processing.
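In servlet terms, the dispatch looks roughly like the sketch below. The
"benchmarkScore" session attribute, the threshold, and catalog.xsl are all
invented names for illustration, not what that project actually used:

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    public class AdaptiveServlet extends HttpServlet {
        // Hypothetical cutoff: clients scoring above this in the benchmark
        // applet are assumed fast enough to apply the XSL themselves.
        private static final int FAST_CLIENT_SCORE = 100;

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            HttpSession session = req.getSession(true);
            Integer score = (Integer) session.getAttribute("benchmarkScore");

            if (score != null && score.intValue() >= FAST_CLIENT_SCORE) {
                // Fast client: send raw XML plus a stylesheet reference
                // and let the browser run the transform.
                resp.setContentType("text/xml");
                PrintWriter out = resp.getWriter();
                out.println("<?xml version=\"1.0\"?>");
                out.println("<?xml-stylesheet type=\"text/xsl\" href=\"catalog.xsl\"?>");
                out.println("<catalog/>");
            } else {
                // Slow or unmeasured client: transform on the server, send HTML.
                resp.setContentType("text/html");
                resp.getWriter().println(renderOnServer());
            }
        }

        private String renderOnServer() {
            // Placeholder for the server-side XSL transform (see cache above).
            return "<html><body>server-rendered page</body></html>";
        }
    }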
-----Original Message-----
From: owner-xml-dev at ic.ac.uk [mailto:owner-xml-dev at ic.ac.uk]On Behalf Of
Matthew Sergeant (EML)
Sent: Thursday, January 28, 1999 10:46 AM
To: 'xml-dev at ic.ac.uk'
Subject: Will XML eat the web?
XML is potentially a web "killer application" in more ways than one.
Let's examine two scenarios - server-based XML processing and
client-based XML processing.
Server based XML processing
Here we process XML on the server to produce HTML. This is where the
majority of XML processing will occur for web applications, since
reliance on the 5.0 web browsers is going to be low for a long time.
XML processing is a resource hog, and there's not really much you can
do about it. Sure, you can use DCOM or CORBA to distribute your XML
processing across several servers, but that's throwing hardware at the
problem, which isn't always the best solution. You can use a persistent
parsed structure like a DOM maintained in memory, but for some
applications, such as a rapidly changing XML database, this isn't
always feasible (or is it?). Currently our web-based XML system
processes about 5 files per second (a very rough figure), and it's at
max CPU (it's only a PII 266). This is using expat. Not a good
situation, since I could probably build a much faster application using
an RDBMS, but I'm looking to the future, when I can send the raw XML to
the client.
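One answer to that "or is it?" is to keep the DOM live and mutate it in
place, serializing only when a client actually asks for the document. A
minimal Java/JAXP sketch of the idea - the product/id/price names are
invented, and this API is newer than the one under discussion:

    import java.io.File;
    import java.io.StringWriter;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class LiveDom {
        private final Document doc;

        public LiveDom(File xml) throws Exception {
            // Pay the parse cost once, at startup.
            doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(xml);
        }

        // Apply updates to the in-memory tree instead of reparsing the file.
        public synchronized void setPrice(String productId, String price) {
            NodeList products = doc.getElementsByTagName("product");
            for (int i = 0; i < products.getLength(); i++) {
                Element p = (Element) products.item(i);
                if (productId.equals(p.getAttribute("id"))) {
                    p.setAttribute("price", price);
                }
            }
        }

        // Serialize (identity transform) only on demand.
        public synchronized String serialize() throws Exception {
            StringWriter out = new StringWriter();
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        }
    }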
Processing XML on the client
This is a much better option, but an option that doesn't always exist.
However, if I'm sending XML to the client, for example a large database
of products, wouldn't the client machine then get bogged down
processing the XML? (I don't know - we haven't got that far yet.)
Anyway, I'd like to hear people's comments on solving this potential
issue, and whether they think choosing XML for the web was a good
choice at this stage in browser development.
(please note: I'm a big fan of XML - it has huge potential, and I would
appreciate any help I can get in making this application faster)
Matt.
--
http://come.to/fastnet
Perl on Win32, PerlScript, ASP, Database, XML
GCS(GAT) d+ s:+ a-- C++ UL++>UL+++$ P++++$ E- W+++ N++ w--@$ O- M-- !V
!PS !PE Y+ PGP- t+ 5 R tv+ X++ b+ DI++ D G-- e++ h--->z+++ R+++
xml-dev: A list for W3C XML Developers. To post, mailto:xml-dev at ic.ac.uk
Archived as: http://www.lists.ic.ac.uk/hypermail/xml-dev/
To (un)subscribe, mailto:majordomo at ic.ac.uk the following message;
(un)subscribe xml-dev
To subscribe to the digests, mailto:majordomo at ic.ac.uk the following message;
subscribe xml-dev-digest
List coordinator, Henry Rzepa (mailto:rzepa at ic.ac.uk)