EDITORIAL

Peter Murray-Rust peter at ursus.demon.co.uk
Tue Jan 25 14:32:15 GMT 2000


At 10:17 PM 1/21/00 -0500, Thomas B. Passin wrote:
>
>The thing about the writing is this:  the Recs must communicate clearly and,
>we hope :-), unambiguously to would-be users.  Think of all the arguments on
>this list that occur because neither the letter nor the intent of a Rec is
>clear.  There are two common ways to improve on written material.  One is
>peer review.  You should choose some reviewers that are expert in the field.
>The second is mass review, sort of like beta testing.  Discussion groups
>could do this.
>
>Trouble with peer review is, who could you get, especially without pay, who
>isn't already involved? 

[comments below]

>Trouble with discussion groups is, a lot of the
>people responding are either not knowledgeable enough or don't read the
>material closely enough. Still, where there is widespread misunderstanding,
>the material probably needs rewriting.  This mechanism is already being
>used, although I don't know how much it matters to the working groups.

I would like to think that this list acts somewhere between peer review and
a discussion group, particularly on technical issues. In the early days it
was the main forum for public analysis of the specs, particularly since XML
was not widely appreciated.

>
>The IETF won't bless an RFC until there are two independent implementations.
>I think this is the right kind of approach.  Surely, Tim B's Lark
>implementation must have been a great help to the XML committee (I'm
>guessing, I don't really know anything about its role).

It was, along with Norbert Mikula's NXP (later DXP) and James Clark's XP -
I think in that order. They were important in simplifying (sic) the
specification of such things as parameter entities. I also endorse the need
for reference implementations - this is one reason why I would like a
"browser" platform on which to hang early implementations of technical
subjects.

>There's nothing
>like trying to build to a spec to uncover its lack of clarity.

Absolutely agreed - and this was really the main motivation in setting up
this list.

Peer Review
-----------

I write (perhaps) from a rather academic perspective, but it may be
interesting to comment on the role of "publications", "peer review", etc.
in the context of scholarly publishing. I write this partly because the
time has come when HenryR and I have to justify our existence to our
university employers - we are assessed on "research excellence" as
measured by scholarly publications in "journals", weighted by their impact
factor. "Nature" and "Science" score very highly in the STM
(scientific/technical/medical) area, whereas I suspect the (?now defunct?)
World Wide Web Journal scores low. The impact factor is a measure of how
many citations a journal receives. Funding, and therefore jobs, depends on
the number of "high-impact" publications. We academics are now preparing
our justification to keep on living.
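
[For readers who have escaped this particular metric: the usual ISI
definition is, as I understand it, roughly

    impact factor for year Y =
        citations received in Y to articles published in Y-1 and Y-2
        -------------------------------------------------------------
              number of articles published in Y-1 and Y-2

so a journal that published 200 papers in 1998-99 and attracted 600
citations to them during 2000 would score 3.0 for 2000. The figures are
invented purely for illustration.]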

This is not an absolute measure of excellence, but the algorithm is easy
to apply and has replaced the practice of weighing paper publications.
However, it is strongly biased in favour of the status quo, and publication
in a new "journal" is likely to be regarded as equivalent to professional
suicide. I am wondering, therefore, how we measure the "value" of
publication in the communal electronic market-place.


Because of my involvement in XML I get asked to talk on scientific
publications and the e-revolution, and I try to envisage the role of the
new technology in changing the publication marketplace. The fundamentals
are that high quality still involves a lot of money (conventional
scientific publishers reckon that 70% of their costs relate not to paper
but to editing, peer review, and so on). On the other hand, we can set up
a forum - like XML-DEV - where the production costs are marginal and the
"editorial policy" is not expensive in cash terms. In XML-DEV we have
something that has many of the qualities of a "scientific publication".

Apart from the vicious circle of publishing to gain funding, STM
publications have the following roles:
	- communication to the community
	- establishing priority of ideas or expressions
	- opening one's work to peer review
	- building a sense of community
	- formally depositing re-usable material (data, code, etc.) in the public domain
	- acting as a historical record

In all those areas XML-DEV functions as well as (and often better than)
traditional methods. XML-DEV is not unique; there are many successful
models: RFCs, other specialist mailgroups, W3C Notes, and possibly even
newsgroups. [I distinguish XML-DEV from a newsgroup mainly because it has a
different membership and attitude, springing from its history.] Is there
formal recognition of work done by individuals or organisations? For
example, HenryR and I submitted an RFC some years ago (on using MIME for
chemistry). There was a huge amount of (mainly) valuable peer review, but
in the end the community didn't take it further. In my own set of values I
would regard the authorship of an RFC as more valuable than many of my
traditional chemical publications.

The formal STM publication market is, in fact, arbitrary. Although it
originated with learned societies and individuals, commercial publishers
moved in because they thought they could make money - and did. [Their
required growth rates are now crippling the academic library system, and
most librarians are trapped in a cycle of chopping subscriptions and
appeasement. Many academics are now declining to publish in commercial
journals.] But the technology we are developing here gives us some
opportunity to change this from the bottom up.

I therefore see XML-DEV as one instance of a wide range of "new
publication types". I would like to see these rooted in trusted
organisations, and that is one of the primary reasons for transferring to
OASIS. XML-DEV has - effectively - achieved the status of a high-impact
STM journal: it is widely read (ca. 2000 readers, and rising, and probably
re-distributed within firewalls), and the recent MODERATION discussion has
suggested it works well for members. In other words, it is worth their
"paying" (in time, not money) to read XML-DEV. It is cited in a variety of
electronic and paper sources. This is one reason why Henry and I are keen
to preserve the archives, because otherwise we lose some of our e-history.

I think we have an opportunity here. Is there a role for (say) XML-DEV
whitepapers? SAX, XSchema, DDML, and SML could fall into this category.
They don't necessarily have to be "successful" - in the sense of being
adopted - but they do have to be seen as competent and innovative.

So, to sum up: apart from the formal academic process, I feel like the
editor of a journal who has been pro-active in sponsoring high-quality
publications which meet the criteria set out above. [Obviously, if it led
to a formal impact factor as well I would be even happier, because it
helps keep my job!] Are there ways forward worth developing as part of the
move to OASIS?

	P.

