Rhetoric, Epistemology and the 'Net: The Ethics Of Web Publishing

Marc Demarest
marc@noumenal.com

November 1996

   
          

We booksellers, if we are faithful to our task, are trying to destroy, and are helping to destroy, all kinds of confusion, and are aiding our great Taskmaster to reduce the world into order, and beauty, and harmony…

Daniel Macmillan, Memoir of Daniel Macmillan (1882)

Zeta interdimensional spacecraft will beam up people as an evacuation of the Earth. Everything from our Etheric body up will be collected (our physical body being left behind) and we will use the hybrid Zeta-human physical bodies as a 'shell' to adjust and transform to beings of Light until the new dimension is fully prepared.

Posting to an Internet list service (1996)

          

How Do We Know?

Information wants to be free. And information, being free, is also dangerous, destabilizing, and potentially deadly.

Pierre Salinger's recent attempt to use documents published on the Internet to bolster his claims that TWA Flight 800 was downed by one or more US missiles highlights one of the fundamental problems with the Web as a community of self-publishers: it is no longer possible for anyone, no matter what their level of sophistication or education, to believe what they read - to know, in fundamentally actionable ways, that they can act on the basis of information and not do substantial harm to themselves or others.

I would argue that this loss of epistemological surety is due, in large part, to the fact that the 'Net, and particularly the Web, truncates an older system of publication that - whatever its limitations - acted as a significantly valuable safeguard against the publication of "information" that was, in one form or another, toxic to the reader: productive of, as Daniel Macmillan suggested, "confusion". The people operating inside the conventional publishing system - whatever their politics or mercenary motives - also abide, in the main, by codes of conduct and moral standards that, in aggregate, protect readers from toxic information by suppressing it prior to publication.

The Web doesn't do that: toxic information is scattered all over the Web, in obvious and not-so-obvious forms. Our challenge, if we want to see the Web grow and fulfill its communal self-publication potential, is to

  • understand the real social value that the traditional publication value chain provides, and

  • reproduce its beneficial effects in the Web environment without carrying over, in that effort, the undesirable aspects of that publishing system.

We also have to understand, in rock-bottom fundamental ways, what new kinds of "reading skills" the Web requires of us and future generations, and begin to teach the hermeneutic and rhetorical skills required to produce proficient, critical Web readers.

A Thumbnail History Of The Publishing System

If we think about it for a while, it becomes apparent that the World Wide Web represents a radical truncation of the value chain that stands between an author -- a person with content to publish and the desire to publish it -- and that author's audience.

As a rough historical framework, we might think about the evolution of this writer-reader relationship as follows:

  • Up until approximately the end of the 1700s, the author had, between herself and her audience, only the printer - the person who prepared from manuscript the public text of the writer. The author and printer entered into various kinds of financial arrangements to deliver a book to the public, and the author was responsible, in many cases, for doing whatever marketing work was required (often, merely soliciting friends and colleagues to use or purchase the book) to defray the costs of printing and make the printer and author a profit. The technology employed by the printer changed and became more sophisticated - from the scriptorium to the printing press, as it were - but the fundamental writer-printer-reader value chain did not change much, as a structure, from Aristotle to William Blake.

  • During the 1830s, the publisher - as a conflation of printer and publicist - entered the value chain. Although some authors (for example, Dickens) were also publishers, most authors worked directly with their publishers (see, for example, Charlotte Bronte's relationship with George Smith of Smith, Elder) to prepare public works that were then sold directly to the public, and indirectly through large, opinion-making distributors like circulating libraries and later book clubs of various sorts, whose decision to ignore a newly-published work could virtually guarantee its immediate and silent death in the market. Within the publisher's organization could be found publisher's readers (often authors themselves) who vetted manuscripts for content, made publish/no-publish decisions and who, over time, became editors, exercising some measure of control over the matter and expression of texts handled by the publisher. The advent of the publisher was a response and a goad to mass market reading, which first appeared after 1830.

  • In the early-to-mid 1880s (it is hard to say precisely when), the agent (having formerly served, quite often, as the publisher's reader) appeared in the value chain, mediating between the author and publisher, and breaking the link between the two. Thomas Wolfe, and to a lesser extent Faulkner, enjoyed very 19th century relationships with their publishing houses, but by the end of the Second World War, agents were a common feature of the landscape, providing authors with advice on marketability and access to an increasingly distant, complex and mercantile publishing network, and providing that publishing network with some guaranteed level of "pre-processing": in short, a filtration mechanism that enabled publishers to focus more on production and outbound marketing and less on the inputs to their production process.

  • At about the same time, the power of distribution intermediaries like circulating libraries and book clubs began to decline: no longer could a distribution intermediary's decision to refrain from distributing a particular book sink that book in the market unilaterally. Book store chains were, however, rising at this time, replacing older point-of-sale networks like sole proprietorships, and to some extent exercising a function similar to that of the 19th century circulating library. And the review trade - the hundreds of periodicals devoted to vetting published work - was on the rise, creating in the form of organs like the New York Times Book Review a powerful post-processing function that interposed itself between publisher and reader.

The history of the publishing value chain is, then, a history of the distancing of writer from reader, and a history of an increasingly elaborate set of filters and stabilizing mechanisms in the chain between reader and writer, most of which served primarily economic functions and operated in one way or another on the saleability of the publisher's product: the book.

This network had other, perhaps less intentional effects. A thumbnail critique of the traditional publishing value chain -- the way books get to the shelves of bookstores, or articles into the pages of newspapers or magazines -- might look something like this:

  • exclusion of would-be authors: the ratio of successful authors (people who succeed in having their content published through the traditional publishing value chain) to would-be authors (people with content to publish and the desire to publish it) was and is very small. The preponderance of minority groups of whatever sort among the 'excluded' category was and is unacceptably high.

  • neutered content: authors whose content is accepted into the value chain for publication inevitably experience a "loss of fidelity" as the content moves from the author's original creation to final form. Often, that loss of fidelity is silent - the reader sees no trace of the hands and minds that operated on the author's original to produce the final product.

  • exclusion of classes of content: because the system is essentially commercial, many valid and valuable kinds of content are never admitted to the value chain because these kinds of content will not "sell"

  • economic thresholds for participation: because the system is essentially commercial, its products are priced, and therefore immediately made inaccessible to large segments of any population

  • owned information: because the system is essentially commercial, information published through it can never become the common property of readers, to do with as they see fit

  • no recourse, no alternatives: if rejected by the value chain, a would-be author has little or no recourse and few if any alternatives (although one would do well to look into the history of fringe publishing houses and the vanity press before pushing this argument too far).

The Web Versus Traditional Publishing

Seen in the light of historical schema and critiques like this, the Web is both a radical return to a point in the value chain evolution before the advent of the printer, and a superior system of publication.

In some very significant ways, the Web permits every author to act as her own printer, and the Web publishing infrastructure can be subjected to scrutiny and control, as was the printer in the 1700s and early 1800s, only by focusing on the point of presence (POP) provider used by the author/printer. The Web's claim to superiority, when compared to the traditional publication value chain, seems to be that the Web does not discriminate: that publication is a fundamental democratic right, that the traditional publication value chain excludes more voices and views than it admits to public discourse (there is after all a strong correlation, in any large social context, between one's ability to publish to an audience and one's ability to make one's self heard to any effect), that the Web allows anyone with a command of a language (English, mostly) and the ability to use a few relatively simple tools (a browser, HTML) to self-publish: to be heard in a forum that is used by 20 million people or so worldwide.

All of this is undoubtedly good, from a political perspective, at least at first glance. It's clear to most people what we gain when we begin to contemplate the Web as the basis for ubiquitous world-wide self-publication.

The questions I want to raise are:

  1. when we truncate the traditional publishing value chain as radically as we have done with the Web, what do we lose?

  2. if we lose something in this truncation, are there ways to reinstate what is lost without reinstating the obvious problems with the traditional publishing value chain?

The Value Of The Traditional Publishing Value Chain

If the traditional publishing value chain has, over the last two hundred years or so, censored and suppressed information that ought to have been made available to the public, and participated in various ways in the publication of misinformation and disinformation, that value system has also worked to stabilize and make reliable most of what constitutes "knowledge" in the West today.

Although we like to say, "don't believe everything you read," the fact is that we do believe - that is to say, we are willing to act on - much of what we read. An article in Mother Earth News about the efficacy of a particular herb in curing headaches, for example, leads a substantial percentage of the readership of that publication to try the herb, without much conscious thought about whether the herb will, in addition to curing headaches, cause brain damage or hair loss. Because the publishing system that produces Mother Earth News has both vetting functions (editors) and legal status, its readers can make use of the information it provides with some degree of certainty that the information in the article is:

  • accurate: the herb will do what the article suggests

  • "complete": the herb will not also cause brain damage or hair loss.

Cases when these rules are found not to apply - as was the case, for example, several years ago, with a 60 Minutes story on Washington apples - are (a) widely reported, and (b) swiftly remedied.

From a theoretical perspective, the traditional publishing system answers, implicitly, some very fundamental questions for the reader about what we might call quanta of information. Quanta of information, in our society, draw boundaries around themselves, assert implicitly or explicitly their completeness or self-sufficiency, and can be transported as such. Newspaper articles, books, journal articles, Web pages, mail messages, news postings, documents are all quanta.

About these quanta, the traditional publishing value chain provides the reader with several kinds of warrantees:

  • a rhetorical warrantee: the traditional publishing value chain warrants the author as someone authorized (as such) to speak on the topic, and warrants the topic as worthy of being spoken about

  • an epistemological warrantee: the traditional publishing value chain warrants the information as (more or less) complete and (modulo the publishing entity's own biases and blindnesses) true. In other words, the traditional publishing system, in the large, warrants against disinformation: knowing falsehood.

  • a warrantee of redress: when a publishing action within the traditional value chain produces misinformation - unknowing falsehood - the publishing value chain provides legal means to redress the error, including laws of libel and notions of clear and present danger, as well as various kinds of community standards with legal underpinning.

We don't need to go any further than the critiques of Noam Chomsky and others to understand that what is true of the traditional publishing system in the large is not true in the small. Newspapers have blatant political biases and exercise sociopolitical agendas in their editorial work; magazines more than occasionally collaborate with governments and corporations to 'spin' particular kinds of information to particular ends, and publishers are sometimes the willing agents of people and organizations with behavioral designs on readers. None of these critiques, however, can shake the fundamental epistemological tenet of Western culture: if it's in a book, it must be good and true, and if it turns out to be false, I can do something about it.

The Shortcomings Of The Web Publishing System

About a Web page, or a posting on an Internet list service, the reader cannot say the same thing.

First of all, the rhetorical warrantee provided by the traditional publishing value chain is completely undone by the Web. Anyone - anyone, that is, with access to the Web - can say anything she likes about any topic without any credentials check. One need look no further than the fantastical conspiracy theory communities thriving on the Web to be convinced that people who know virtually nothing about the history of the planet are hacking, dismembering and reconstituting the world-historical record with impunity, and for an eager - if smallish - audience. Consider, for example, this excerpt from a recent list service posting:




Posted by: ray@strategicsw.com
Posted to: SNETNEWS@XBN.SHORE.NET
Posted on: Wednesday, February 28, 1996 4:31 PM

              What is the New World Order?

1700's - Illuminati (Adam Weishaupt-Founder, Jesuit Priest and Freemason).
         Illuminati name translated to "bearers of the light" - Lumen
         derived from Lucifer, ancient "angel of light" spoken of in the
         old testament.

1800's - FreeMason/Illuminati Organizations: Rothschilds/Jacob Schiff.
         Nathan Rothschild vows to kill Czar of Russia and his family.

1900's - Illuminati: Rothschilds/Cune, Loeb & Co. (Jacob Schiff)/
         Rockefellers

1913   - Federal Reserve Act put into law by Rockefellers on Dec 24 1913.
         Only 3 congressmen were present as it was Christmas Eve.

1913   - 16th Amendment (IRS TAXES) added to Constitution.

1917   - Czar of Russia is killed by Bolshevik revolutionaries. Lenin,
         Trotsky and Stalin are financially backed by Jacob Schiff with
         20M in Gold (paid by Rothschilds/Illuminati).

1920   - League of Nations proposed by Woodrow Wilson.

1921   - COUNCIL ON FOREIGN RELATIONS (CFR) created by Rockefellers/
         New York, deemed as Illuminati Organization in US.

1921   - Royal Institute for International Affairs created by Rothschilds/
         London.

1929   - Rothschild/Rockefellers/Carnegie/Morgan (CFR) created stock
         market crash, worldwide depression ensues.

1933   - President Roosevelt (CFR) declares US. bankrupt - Signs over US.
         monetary power to world bankers (Rothschilds/Rockefellers -
         Illuminati)

1939   - Hitler Invades Poland - Financial backing by Rothschilds/Warburgs/
         Krupps.

1939   - Rothschild companies financially back both Hitler and Stalin for
         World War II.

1941   - US. enters World War II (planned by Rothschild/Schiff/Rockefeller/
         Roosevelt.

Rhetorically, this quantum presents itself as historical fact, when it is really an egregiously improbable (mis)interpretation, and quite certainly dangerous to employ as fact in most normal social contexts [1].

Secondly, the epistemological guarantee provided by the traditional publishing value chain is completely undone by the Web. Any quantum of information found on the Web has to be assumed, by a careful reader, to be both incomplete (missing information of a fundamental sort) and inaccurate (that is to say, subject to independent verification).

Finally, the Web offers its readers no real warrantee of redress. If I find an Internet information quantum (like the one quoted at the beginning of this essay) that suggests we are about to be taken into the heavens by a benevolent alien race to find spiritual fulfillment, I cannot seek redress. I cannot have that Web page decommissioned - that is to say, have that content removed from circulation - unless it libels me personally. I cannot enter into the community from which the quantum originated and refute it, in part because part of the fundamental rhetoric of that community is that anyone offering contradiction to such tripe is by definition an "asset": an agent of the conspiracy to hide, in this case, the imminent arrival of the Zeta-Reticulans. And even if I could refute the quantum within the community, I have no ability to force the author to retract publicly his publication.

The Product Of Web Publishing: Gems, Junk and Poison

Like the traditional publishing value chain, the Web publishing system has to date yielded a relatively few superlative works, and a sea of crap: badly-written, unimportant, meandering, blathering, forgettable junk. That is the nature, as far as I can see, of publishing systems generically, and not something remediable.

However, the Web produces in abundance what the traditional publishing value chain produced only rarely: toxic information, quanta that if believed and acted on will produce real material harm to the reader, the reader's community or society at large.

And when the Web produces this toxic knowledge, it cannot (as the traditional publishing value chain nearly always does) clean itself up; in fact, the Web is structured so as to spread, rather than contain, the toxicity.

Who Censors Whom?

These shortcomings of the Web have been used, and will continue to be used, as grounds for centralized censorship of Web content. That is not what I want to argue for, if by censorship one means the centralized vetting of Web content by some organization.

What I want to argue for is:

  • a new model of information in the wake of what some people are calling the death of the book: in the wake of the loss of the warrantees provided by the traditional, book-based, publishing system.

  • a model for ethical publication on the part of writers, which co-opts and extends conventional notions of "reliable" publication (the inclusion of citations [2], for example) to a formal and explicit exposure, on the part of Web authors, of their motives in producing Web documents.

  • a new model of reading, framed by that new model of information, that creates active, critical readers for a brilliant, shifty, contextual, and polluted worldwide self-publication system.

Together, these will produce a kind of censorship: readers who are capable of censoring writers by dismissing or discounting Web quanta that, on the basis of critical examination, are not "information" but either misinformation, disinformation or decontextualized information that, if acted on, produce undesirable consequences for the reader or others. This distributed, reader-based censorship is the only kind of censorship I can see that is in keeping with the fundamental philosophy of the Web.

A New Model Of Information

First of all, we need to clarify the terms information, misinformation and disinformation.

  • Information is any quantum published on the Web: any information that represents itself as self-contained, complete, accurate and actionable to a reader. Information says about itself: I am justified, true, believable, and actionable.

  • Misinformation is any quantum that unintentionally produces error, confusion or undesirable side effects. For example, the two examples of toxic information I quoted above - the conspiracy theorist and the Zeta Reticulan - are both probably examples of misinformation inasmuch as the authors appear to believe that the quanta they provide are complete, accurate and actionable.

  • Disinformation talks to intent: it is misinformation with a motive to misinform. Whether a quantum is misinformation or disinformation is not a matter of opinion - Noam Chomsky no doubt believes that the Washington Post is an organ of disinformation - but a matter of demonstrable fact: the author/publisher must be shown, by evidence, to have knowingly perpetrated falsehood.

Secondly, we need to revisit the issue of context. Most Web pages and Internet postings are decontextualized: their authors do not fully describe the context in which their comments should be situated (including pointers to contrary or mediating statements), and the wanton habits of forwarding and reposting exacerbate this decontextualization by allowing forwarders to "re-author" texts by clipping and chopping them, for the most part silently. Where we have, consciously or otherwise, assumed that traditional publishing systems produced contextualized information, we have to assume, as a state of nature, that all 'Net-based publications are decontextualized, and we have to provide mechanisms within the Web itself for readers to apply context to pages and postings: to do what hypertext theorists have called annealing.

Part of this contextualization model is a distinction between narrow and broad contexts. The narrow context -- who is speaking, speaking about what, speaking to whom - can often be reconstructed by a well-educated, well-equipped reader (see below), but the broad context -- what discourse is this speech-act a part of, and what is the history of this discourse - can almost never be reconstructed by anyone (often including the author). As suggested below, the Web publisher has the responsibility for explicitly exposing the narrow context of any published quantum, and we need to build into the Web a mechanism for allowing readers to expose, to the extent they are able, the broader context of which any quantum is a part.

Thirdly, we need to revisit the fundamental question of veracity: can this quantum be true under any probable set of circumstances, and can it be acted on? This is somewhat beyond conventional notions of epistemology, in which justified, true belief is what constitutes knowledge. Because the Web is behavioral - because people come to the Web looking for information so as to be able to behave in a particular way - we have to ask not just "Is this justified, true belief?" but "Can I act on the basis of this quantum, and to what effect?"

That discussion will lead us quickly to the discovery of a continuum between benign misinformation and toxic misinformation or disinformation: a qualitative estimation of the effect a quantum of information may have, in or out of context, on a reader/auditor not able to evaluate the narrow or broad context of the quantum, or the veracity of the quantum itself.

Out of this discussion about what information, in being free, has become, we would come, I think, to several inescapable conclusions:

  1. all information available on the Web, far from being assumed accurate, complete and benign, must be considered partial, inaccurate, potentially damaging misinformation until proven otherwise

  2. the role of the author in clarifying explicitly the rhetorical and epistemological status of her publications returns to the problem set. In the absence of agents, publishers, libraries and bookstores, only the author can begin to provide the warrantees once provided by the publishing system. Authors on the Web, therefore, incur moral obligations to their readers and the Web community by choosing to publish themselves.

  3. only the reader, in the silence of her own intellect, can end the suspension of belief, prove to herself that a particular quantum is not misinformation or disinformation, and act. Readers have obligations to both authors and other readers to validate or invalidate, to rate the content and quality of Web-based materials and to expose publicly misinformation and disinformation.

A New Model Of Ethical Publishing

To optimize their readers' abilities to factor and evaluate published material, Web authors should always:

  1. identify themselves in their electronic and real-world incarnations. An e-mail address is not proof of anything, and does not provide an effective means of redress for publishing toxic information. Any publisher who posts a Web page or list service posting and is unwilling to identify herself in the "real world" with a name, an address and affiliations (corporate or organizational) should be considered a knowing junk-monger, regardless of the "sensitivity" of the content being published.

  2. link any information published to their "home page", which should contain enough biographical detail about the author to make a reader aware of the likely biases and blindnesses of the author.

  3. never act in reportage mode, passing on unsubstantiated information, even when the information is explicitly marked as unsubstantiated. If the source of information will not reveal itself in a medium where readers have no effective warrantees (as they do when newspapers publish crap), it ought not to be admitted to public discourse, period. Similarly, people should avoid re-authoring: the duplication of Web content in multiple locations. Pointers are designed to permit references rather than duplication, and should be used to minimize the spread of intellectual pestilence.

  4. make explicit distinction in all published materials between matters of fact (which can be cited or otherwise backed up by inspectable evidence), matters of interpretation (which are the author's construction on fact) and matters of opinion (which have strong rhetorical value and no epistemological value). This suggests the use of introductions and abstracts to identify the rhetorical status of a posting or page ("this is my opinion of…" or "this is my interpretation of…") as well as strong use of conventional citation systems like those of the MLA and APA.

  5. immediately remedy any errors in matters of fact identified by any reader, and publicly highlight those remedies, through appropriate mechanisms (repostings, or explicitly highlighted modifications to Web content).

  6. voluntarily withdraw from circulation any material which has been determined to be false, undemonstrable, hazardous or improperly marked (as with the silent presentation of opinion as fact - what we might call the Zeta-Reticulan fallacy).

None of these guidelines will prevent the spread of toxic information so prevalent on the 'Net, so I would in addition suggest that some non-profit organization design and install on the Web a consumer safety system in which Web publishers and 'Net posters voluntarily register their acts of publication, and submit their materials to evaluation by their readers, and within which any reader can retrieve peer-reader rating/evaluation information on any page or posting.

A New Model Of Reading

This new model of reading strikes me as critical to future social welfare. Although most adult readers, stumbling upon the examples of toxic information I have noted above, would be immediately able to spot it for what it is - drivel - I am convinced, by watching my son and other younger readers navigate the Web, that we are raising a generation of children so poor in critical, analytical, hermeneutic and historical skills that we will be faced, in a decade or so, with millions of readers who are not able to differentiate between the veracity of a NASA finding about life on Mars, and the phantasies of the Zeta-Reticulan fans.

To remedy this situation, I suggest we need to do at least the following:

  • reinstall formal education in rhetoric as part of mandatory education curricula. People with no rhetorical training are unable to ask the basic questions every Web reader needs to ask: who is speaking? What right do they have to speak? What is their motive for speaking? What do they want from me?

  • install or reinstall basic epistemology and hermeneutics as a part of mandatory educational curricula. People using the Web need to be able to ask basic questions like "How do I know this is true?" and "What might this mean to me?" in order to use the Web; today, I would argue, most people need to deploy epistemological and hermeneutic tools routinely, and cannot do so.

  • install basic information theory as a part of mandatory educational curricula. The idea that people could graduate from college in 1996 having had no exposure to Claude Shannon, C.S. Peirce, or Marshall McLuhan is, in my view, frightening. All are of more practical importance to living well in a wired world than Sartre, Aristotle or Freud.

  • teach within the Web, using Web-based materials in the curriculum to demonstrate both insight and error. The rampant explosion, for example, of electronic literary texts of dubious provenance [3] offers teachers of literature the opportunity to expose the complexities of primary bibliography and the notion of the book as form. Similarly, the vast archives of cold fusion information on the Web offers physicists the chance to expose their students to the formation of discourse around an area of scientific controversy, and the Tesla archives scattered across e-space offer historians and their students a fertile archeological ground for understanding how the historical canon works and forms itself.

  • teach the Web, as a dissemination/publishing mechanism, as an informational system, as a cultural phenomenon. Expose both its superiorities and its disadvantages. Use Web-based publication within the curricula to produce responsible Web authors.

Ultimately, the Web succeeds as the publishing system for the next several generations only if it is used by readers who understand how the Web operates, what it provides, and what it cannot provide, and who further understand their obligations as Web readers (and Web authors).

Conclusions

These remarks are incomplete, and, frankly, motivated by fear. That people believe in Zeta Reticulans does not cause me heartache (or amusement); that a respected journalist and former Presidential press secretary doesn't know how to use, and not use, information published on the Internet frightens me to death. My objective in publishing this is simple: to begin a discussion, to spark thought and hopefully additional work in these areas.

People will no doubt argue that I have made use of edge-case examples; that UFOs and Illuminati conspiracies do not reflect the state of knowledge on the Web today. For readers, I offer a list of sites that I believe contain toxic information, and leave it to those readers to judge whether or not this problem is confined to the fringes of Web discourse.

People will also argue that I have made a difficult problem - how do we know what is true? -- too easy. That may be true, since I am concerned not so much with truth as with what happens when untutored readers ingest junk and act on it. It is certainly easy to make this sort of problem infinitely more difficult by playing epistemological games - how, for example, do I know that the person who published the tripe on Zeta Reticulans isn't telling the truth? How do I know the Rothschilds don't control the world monetary system? How do I know the Holocaust really happened? How do I know that Darwin is right? How do I know that sticking a coat hanger in one's ear is a bad thing to do?

The practical test is a simple one: if we can act on a published quantum of information without damage to ourselves, others and our social groups, then it is admissible to discourse. If we cannot act on it, the quantum is benign (and pointless). If acting on it harms us - whether that harm is intellectual confusion or death - or harms others, the quantum is toxic information, and ought to be expunged from public discourse.

I challenge anyone to demonstrate that a country in which Zeta Reticulan-facilitated exodus is a matter of fact, acted on by the citizenry, can play a meaningful productive role in the world, produce healthy, well-adjusted citizens, conduct meaningful scientific or social endeavors, or do anything other than degenerate into a society of glossolalian dingbats.

In closing, something from Nietzsche, to light the way:

    A further step in the psychology of conviction, of 'belief'. I suggested long ago that convictions might be more dangerous enemies of truth than lies. This time I should like to pose the decisive question: is there any difference whatever between a lie and a conviction? - All the world believes there is, but what does all the world not believe! - Every conviction has its history, its preliminary forms, its tentative shapes, its blunders: it becomes a conviction after not being one for a long time, after hardly being one for an even longer time. What? could the lie not be among these embryonic forms of conviction? - Sometimes it requires merely a change in persons: in the son that becomes conviction which in the father was still a lie. - I call a lie: wanting not to see something one does see, wanting not to see something as one sees it: whether the lie takes place before witnesses or without witnesses is of no consequence. The most common lie is the lie one tells to oneself; lying to others is relatively the exception.


NOTES

[1] It is interesting to note that the conspiracy theory communities have always had access to publishing systems. Western Islands Press and several other publishing operations linked to the John Birch Society have for years turned out reams of spurious junk on all sorts of historical and contemporary conspiracies. What differentiates the Web from Western Islands is not its publication capability, but its distribution reach: no self-respecting bookstore chain carried Western Islands' material (most of it was sold, as far as I can tell, through catalogs and direct solicitation), but, nowadays, the same conspiracy theorists who lingered over None Dare Call It Conspiracy in a cheap, tatty paperback read by at most 100,000 human beings now publish their own psychoerotic nonsense to a potential readership of 20,000,000 or more.

[2] Citations continue to be the best mechanism, on the Web, to judge the veracity of postings and pages. Scurrilous tripe nearly always has no citations beyond the unnamed "informed sources" that are the sine qua non of bullshit.

[3] These e-texts are an example of toxicity operating at a silent level. The way a text becomes an e-text is determined by copyright laws. People interested in, say, putting the work of Joseph Conrad on the Web will be looking for versions of his texts no longer subject to copyright restriction. The fact that the sources of these texts - say, the first edition of his first novel, Almayer's Folly - are hopelessly polluted, when compared to the manuscript or to later, authorially-edited editions, by bad printing, silent editorial emendation, wholesale omission, and other kinds of errors is almost never pointed out by the e-text creator. In fact, it is very difficult to find e-texts that even explicitly cite their origin, or discuss the textual variants associated with the e-text. Those variants are often significant, as is the case with, say, Shakespeare or Thomas Wolfe. Admitting bad literature to the electronic canon pollutes the canon, and can materially mislead readers.


Last updated on 06-22-97 by Marc Demarest (marc@noumenal.com)

The authoritative source of this document is http://www.noumenal.com/marc/toxic.html