Ciolek, T.M. 2005. From Private Ink to Public Bytes: the epistemological effects of Internet publishing on scholarship in the social sciences and humanities. In: Wasilewski, Jerzy S. and Anna Zadrozynska (Eds.). 2005. Horyzonty Antropologii Kultury [Horizons of Anthropology of Culture], Warszawa: DiG, Instytut Etnologii i Antropologii Kulturowej UW, ISBN 83-7181-370-6, pp. 33-49.
www.ciolek.com/PAPERS/ink-and-bytes2005.html

From Private Ink to Public Bytes: the epistemological effects of Internet publishing on scholarship in the social sciences and humanities

by
Dr T. Matthew Ciolek,
Research School of Pacific and Asian Studies,
Australian National University, Canberra ACT 0200, Australia
tmciolek@coombs.anu.edu.au

Document created: 1 Feb 2005. Last revised: 19 Dec 2005.

0. Abstract

The Internet is found to be more than just a congenial catalyst for intellectual activities. In fact, the *Zeitgeist* of the pre-WWW Internet (1969-1994) supported work based on a rapid succession of highly visible shifts in conceptual frameworks and practical conclusions. However, since the mass adoption of the WWW as the dominant online tool in 1994, the Internet has become a far tamer medium. While it does not prevent major paradigm-shifts, it clearly favours small-scale and un-publicised (i.e. private) adjustments and modifications to one's published thinking and data. The epistemological consequences of this transformation are explored.

1. Private Ink vs. Public Bytes

There are two distinct allusions in this paper's title: "From Private Ink to Public Bytes...". First of all, there is the contrast between "ink" and "computer" technologies. Secondly, there is acknowledgement of the complex nature of public and private communications and their effect - explicit, as well as tacit - on what we do as social scientists.

When I use the word "ink", I refer, of course, to a certain technique of recording and disseminating information. And when I refer to "private" ink, I mean the social status of that information technology. Letters, drawings, and other written documents, whether marked on paper with a pencil, typewriter, or brush, are accessible to just one or two people at a time. They constitute an essentially small-scale, private level of communication.

On the other hand, "public" ink is also possible. For the last several hundred years (in Europe at least) people have had ready access to print - a technology which can cheaply multiply and reproduce information. Print has created vast readerships of countless newspapers and books, posters and leaflets, textbooks and journal articles. In all these cases we can talk, I believe, about ink deployed in a large scale, public fashion. There information reaches out to an audience situated beyond the confines of face-to-face interaction, beyond gatherings assembled in a study, lecture hall, or classroom.

The next conceptual step is also easy to grasp. Whoever uses a standalone desktop computer or a hand-held device can be said to manage information in the form of digital signals, and to do so within a relatively small, controllable and therefore private sphere. However, as soon as such a device is networked, the potential for a public exchange of bytes is immediately created, with all its associated options for information storage, indexing, retrieval, and dissemination across the entire network.

It would seem therefore that the history of a culture or of a group of civilisations could easily be divided into neatly parcelled-out technological stages. Societies could be said to have moved from slow, inefficient times - ones which were characterised by a circulation of information in the form of private ink - into times marked by the reign of public ink. Moreover, one could postulate that in the last thirty years or so we have made a new transition, one from the realm of isolated and restricted private bytes into the universe of freely available and freely moving public bytes.

All such typologies and periodisations are seductively elegant and simple. They imply a never-ending march of technological progress, and they offer us the prospect of an enthusiastic and fascinating future. However, closer inspection reveals that such a future also implies certain epistemological complications.

2. The Internet Revolution

The Internet is widely regarded as the largest and most important social and cognitive development of the last 500 years; that is, since the time when Johannes Gutenberg started printing books in Europe (Harnad 1991, Dewar 1998). Throughout Europe at the time when Gutenberg started his operations in 1455, there existed approximately 30,000 books in the form of handwritten scrolls and codices. Fifty years later, the number of books in European countries had grown to approximately 9 million *printed* volumes. This is a truly dramatic and remarkable intellectual development. Even when expressed in purely quantitative terms, these changes represent roughly 30,000% growth in the number of publicly accessible documents over the space of 50 years, or an average of 600% per annum.
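
For readers who wish to check these round figures, the short sketch below (in Python, taking the two counts cited above at face value) reproduces the overall and the average annual growth rates, and adds the compounded equivalent for comparison.

    # Back-of-the-envelope check of the Gutenberg-era growth figures cited above.
    # The two counts are the round estimates quoted in the text, not independent data.
    books_1455 = 30_000          # handwritten books in Europe, ca. 1455
    books_1505 = 9_000_000       # printed volumes some fifty years later
    years = 50

    overall = (books_1505 - books_1455) / books_1455 * 100            # ~29,900%, i.e. roughly 30,000%
    simple_average = overall / years                                   # ~600% per annum, non-compounded
    compound = ((books_1505 / books_1455) ** (1 / years) - 1) * 100    # ~12% per annum if compounded

    print(round(overall), round(simple_average), round(compound, 1))   # 29900 598 12.1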

There is no doubt that a parallel social and intellectual revolution is happening today. It started in California, in late October 1969, when two hitherto individually operating computers exchanged data (and commands) with each other for the first time. Thirty-five years later, in early 2004, there were more than 233.1 million networked computers world-wide (ISC Internet Software Consortium 2004). Many thousands of new machines were being added to this global archipelago daily; new hosts were appearing, in fact, almost every minute.

Additional developments are also afoot. The 1969 communication link established between these first two computers brought about the online interaction of a handful, perhaps a dozen or so, individuals. Since then the number of people who interact with each other online has grown exponentially. Three decades later, in September 2002, there were over 605.6 million people who were regular users of the Net (Nua 2002).

This is the most energetic technical revolution ever recorded. We are led to ask what consequences will flow from our use of this large scale global network for the exchange of information. The two motivations behind the Internet's birth were nation-wide efficiency and assured continuity (in the face of a possible nuclear attack) in the use of a handful of costly computers. In the earliest days of the Net its pioneering infrastructure, in the form of the ARPANET network, was supposed to link a series of high-powered computers housed far apart from one another. The intention was to enable a researcher working, say, in the state of Utah in the USA to connect to a more efficient, or a less heavily used, computer in California. Likewise, a researcher with specialist computing needs in Michigan would be able to take advantage of a machine located elsewhere in the country. It was only a few years later, once the country-wide communications backbone was established, that other developments could take place. These proved to be momentous developments, as they afforded the prospect of permanent and public audiences: the introduction of email (December 1972), online discussion groups (bulletin boards - 1978; Usenet news - 1979), mailing lists (listservs - 1981), and tools for the placement of electronic documents on the Net (UUCP messaging and file transfers - 1977; FTP archives - 1985). Surprisingly, these additional developments were not deliberately planned; they were created more or less spontaneously, in a series of incremental instalments (Zakon 2005; Ciolek 1999, 2003).

As we know, a lot of other things have happened with the Internet since then. So, let us look at the unplanned, unanticipated and unintended consequences of all those inventions, since they constitute such a dominant part of our recent social and economic history.

It is often the unintended consequences of an event that constitute the most promising area of research for a social scientist (Popper 1969). For instance, if we reflect again on the introduction of the European printing press, we note that originally the technology was meant to produce copies of the Bible in a manner which was quicker, more accurate and less expensive than handwriting or blockprints. But we also find several unanticipated developments. We find, for example, that print eventually brought about the consolidation and standardisation of distinct national languages in Europe (Eisenstein 1983). Language groups which happened to have easy access to printing presses were able to thrive and consolidate. And once that became possible, they could quickly arrive at a standard vocabulary and rules of grammar. On the other hand, those separate linguistic groups which did not have print's cheap and prolific power of transforming privately created information into the public domain tended to dissolve into dialects within the more print-savvy and therefore dominant cultures.

Unintended consequences of a supposedly neutral technical invention also followed the advent of railways. Sometime around 1825, an English engineer called Stephenson began practical experiments with a mechanical cart rolling on a pair of metal tracks and powered by a steam engine. Initially this was a purely local and technical transport solution. Barely fifteen years later, by 1840, many independently funded and constructed railway lines had linked up to form a country-wide transportation system. Simultaneously, uniform London time (GMT) started to be adopted across the whole of England. GMT became the single, standard time for all the hitherto separate places now spanned by an ever-growing railway network. Until the coming of the railways, every town and every village had its clocks and timepieces set according to the sun's zenith, or some other local convention. The railways, however, unified and coordinated clocks across train stations, towns, entire provinces, countries and continents. The process was soon underway in mainland Europe, and then in Asia, Africa and the Americas. The United States and Canada adopted standard railway time in 1883. A year later, in 1884, the whole world was brought into a unified temporal framework of 24 hourly time-zones and a zero (Greenwich) meridian. The International Date Line, all-important to global maritime communications, was also delineated and codified.

So, today I want to ask about the unintended consequences of the Internet. More specifically, I want to ask whether the Internet impacts on the way we think about scholarship. Does the Internet change the way we think about the practice of being a researcher? In my experience, it is a question which remains largely unexamined.

3. Two "Flavours" of Knowledge

Science is a tradition of thought. It implies the use of objective, precise and replicable methods of research, and logic and honesty in bringing together and interpreting observations. If we reflect on the structure of that tradition of thought, it is possible to discern a certain intellectual divergence. It appears that there are two co-existing philosophies, or "flavours", of knowledge (e.g. Magee 1973:18-22). They can loosely be labelled "transformative knowledge" and "cumulative knowledge".

The starting point for transformative knowledge is the realisation that there exists an intellectual or observational problem worthy of our attention. Such a problem exists simply because we realise that what we know does not quite explain the way the world is observed to behave. The realisation leads to an intellectual tension, a cognitive discomfort, a "mental itch". If this occurs, the researcher becomes infused with a burning question: how does the world *really* work? What is, really, that thing in our field of inquiry that so annoyingly eludes the mind's grasp, eludes our understanding?

Naturally, the concept of transformative knowledge is intimately linked to the name of the Austrian-born philosopher Karl Raimund Popper (1902-1994) and to his book "The Logic of Scientific Discovery". Popper was the first scholar to insist that the road to objective knowledge always starts with an initial formulation of the research problem. The next stage is a temporary theory, or an explanation of the intriguing phenomenon. The act of constructing a temporary explanation inevitably leads to the third stage: critical discussion of the problem, as well as of our observations made so far and our tentative explanatory account of them, plus consideration of alternative views and explanations. Following such a critical discussion comes step four, namely the revision and re-definition of the research problem. The initial difficulty which launched us on the path of our investigations is re-stated and re-evaluated, cast and re-cast - for as long as we are sustained by curiosity and intellectual honesty.

Thus, according to the tradition of transformative knowledge we are always dealing with an ever-growing body of well articulated, and deeply understood problems, as well as their "intellectual ecologies". This is so, because scientific problems do not occur in isolation. They always occur in the exacting and hard-nosed context of other scientific problems, as well as in the unforgiving context of all previous efforts to resolve them (Popper 1994:101-102).

Cumulative knowledge represents a markedly different style of thinking. The initial question and the instant, albeit provisional, explanation are not accorded the key role they enjoy within the framework of transformative knowledge. They are implicitly acknowledged but are not swiftly acted upon. Instead, observations - precise, trustworthy, thoughtful and ample observations - take precedence.

There are many reasons for the existence of scholarship infused with the cumulative knowledge flavour. Sometimes systematic observations are embarked on because we need to address an urgent practical issue. In such cases theorising before solid evidence is collected is considered pointless. Sometimes "theory-free" observations are commenced as a deliberate respite from the prevailing conceptualisations. In such cases they represent a temporary stage in the pursuit of a different and refreshing intellectual stance. Sometimes, finally, they are embarked on (as is the case with electronic "data mining", or archaeological excavations) as part of a grand fishing expedition, in the course of which sources of information are systematically trawled in search of some meaningful though unpredictable correlations between measurements.

One way or another, cumulative knowledge initially involves carefully conducted and precisely recorded observations which, secondly, lead to an accumulation of data. The volumes of collected data may vary. Sometimes one has only a few points on a scattergram, sometimes one may be lucky and have many data points. Demographers, for example, tend to deal with large volumes of numerical data. On the other hand, historians and archaeologists often have to rely on less plentiful, or even scanty, factual evidence.

Third, following the observation and data recording phases, comes the analysis of data. Inevitably, if we look at our materials long enough, sooner or later, we are able to find some aspect of our data that we can report and comment on.

Fourth, from this analysis we start building models - simplified and generalised representations of reality. Some models are mathematical and involve the design of an equation or a matrix, or a computer simulation. Other models can be graphic, such as charts, plans and maps. Still others rely heavily on verbal formulations, whether in the form of chronological narratives, or in the form of systematic accounts and descriptions.

After that is accomplished, we can return to step one. More observations are carried out, and more data are collected and analysed so that those representations, their models, can be further refined, reordered and strengthened. And - if we are successful - converted into a semi-permanent theory, a "model of models", a highly generalised account of the studied fragment of reality. This all means that according to that cumulative school of thought, human knowledge can be defined as a growing body of replicable methods and general models, and a body of dependable, trustworthy data.

As we can see, there is a strong contrast between these two epistemologies. This is illustrated by the summary comparison below:

Transformative knowledge - begins with a well-articulated research problem; proceeds through a temporary theory, energetic critical discussion, and the revision and re-definition of the problem; its end product is an ever-growing body of deeply understood problems together with their "intellectual ecologies".

Cumulative knowledge - begins with precise, trustworthy and ample observations; proceeds through the accumulation of data, their analysis, and the building of models; treats the initial question and its provisional explanation as secondary; its end product is a growing body of replicable methods, general models, and dependable, trustworthy data.

Naturally, other major differences can also be found. As we know, life is unpredictable and whatever we find out about it is never enough. So, what happens when we are confronted by new evidence? What happens if a particular body of scientific opinions finds itself in some ways inadequate? Well, the difference in response and strategy between the two intellectual traditions distinguished above is quite dramatic.

Transformative knowledge, when it finds itself inadequate, strives to radically re-assess and reform its inner logic. It decisively supplants all earlier conceptualisation and meta-conceptualisation with a brand-new and temporary meta-meta-conceptualisation, born from the old. In a way, the whole intellectual process resembles a sequence in which a caterpillar hatches from an egg, the caterpillar becomes a pupa (chrysalis), and a butterfly is born from the pupa. Each stage necessarily builds on but is radically different from the one which preceded it.

In similarly challenging circumstances cumulative knowledge resembles (and here no unfavourable judgement is implied, as I am looking only for a handy metaphor) the behaviour of an amoeba when it is prodded by an electrical discharge, or maybe exposed to some acid. In those cases the body of cumulative knowledge usually changes its "shape" (i.e. its overall area of investigations, methodology, language of description, the scope of its definitions), while simultaneously trying to protect and preserve as much as possible of its accumulated body of replicable methods and models, and - of course - the accumulated body of its dependable, trustworthy data (Magee 1973:24-25). Within the tradition of cumulative knowledge erroneous theories are seldom formally overthrown; instead, they are silently shunned in favour of other, more promising accounts. Cumulative scholarship is not fond of dramatic cognitive revolutions. It definitely favours gentle, gradual shifts in what, why, and how research is done, and in what is currently known for sure, or even known at all. However, just like transformative knowledge, cumulative knowledge is a well developed and well seasoned adaptive system.

Before we think about the Internet in the context of these two approaches to science, we must ask ourselves how these two approaches to knowledge relate to each other. A number of answers are possible.

4. The Relationship Between the Two Flavours of Knowledge

On one hand we could say that transformative and cumulative knowledge represent two distinct and consecutive stages in the history of science. The first stage, which started in the early 17th century, was initiated by the work of Francis Bacon (1561-1626). Bacon's seminal book "Novum Organum" (= the New Tool, the New Range of Equipment), published in 1620, gave rise to Western experimental science, and to its distinctively empirical and cumulative philosophy of knowledge. That epistemological tradition flourished in the 19th and early 20th century. Testimony to it lies in the tremendous achievements in chemistry, physics, geology and zoology, as well as in the parallel progress in (proto)sociology, and in the great leaps made in history, archaeology, classical studies, and linguistics.

That period in the history of science was brought to an end in 1935 when Popper, then a young logician, published his ground-breaking book "Logik der Forschung". Popper's work was subsequently published in English, in 1959, as "The Logic of Scientific Discovery".

From such an account it would be possible to conclude that the cumulative school of thought which reigned for about 300 years (from 1620 to 1935) was superseded by the newer tradition, that of transformative knowledge, and that the more recent epistemology has been our intellectual guide for the last 70 years or so.

However, two other interpretations of the relationship between cumulative and transformative knowledge are also valid.

Some forty years ago, Thomas S. Kuhn (1922-1996), both a physicist and philosopher of science, wrote a book on the transformations that physics and astronomy experienced during times of major scientific revolutions. In that book, Kuhn (1962) argued that occasional periods of dramatic, fast-paced scientific breakthroughs (i.e. times when one set of theoretical frameworks is suddenly replaced with a new one) are separated by lengthy periods of relatively stable, normal, puzzle-solving science (Kuhn 1970:4-11) during which scientific research progresses routinely and cumulatively. Kuhn seems to describe the real-life practice of academics, their daily behaviour; while Popper is normative, for he charts an idealised etiquette for all scholarly activities (Williams 1970:50).

Or perhaps, as some scholars suggest (e.g. Watkins 1970:32-33, Magee 1973:41), we might wish to conclude that the cumulative and transformative types of knowledge in fact constitute two inseparable - because indispensable and complementary - aspects of a single, continuous intellectual process. This would mean that in order to be effective researchers, we need to take equal advantage of both epistemologies, both strategies of thought.

Personally, of the three interpretations, it is the last that I like the most. It is an interpretation which proposes the intimate co-existence and fruitful co-operation between great diligence and creative imagination.

Still, one of the two parties in this philosophical relationship, the transformative approach to research, is clearly the more remarkable, for a number of reasons. Mysteriously, with transformative knowledge, mistakes and errors cease to be our enemies. Instead, they are our friends, because we can learn from them. Regular and cheerful encounters with our blunders and shortcomings are actually indispensable to the success of our long-term work. Once our mistakes are recognised and well-analysed, they help us do our work better.

As a result, we no longer need to attach ourselves passionately to any particular point of view. Rather, we attach ourselves to the endless search for truth. At any given moment, a particular point of view serves only as a transient approximation of that much prized "correspondence to the facts" (Popper 1994:111), while retaining an acknowledgment of tradition and grounding in the past.

Finally, transformative knowledge brings us civil courage and selflessness, as it encourages energetic public discussion of our intermediate solutions to recognised shortcomings in our work.

A transformative knowledge approach therefore imposes certain responsibilities on scholars. True, transformative scholarship is quite happy to swiftly adopt one point of view over another (provided that the resultant informational content is actually increased, and that it can be objectively tested for its validity). Yet it also strongly promotes the tradition and continuity of our intellectual endeavours. Progress is strongly predicated upon the uninterrupted collective memory of all the past problems that have been grappled with, successfully or otherwise. At any point in our research and thinking we are absolutely dependent on the full intellectual background of the problem currently at hand, its entire history. This places an obligation on scholars to keep a complete and candid record of all previous theoretical meanderings, temporary solutions, and interim ideas.

So, now we can return to our inquiry concerning the Internet. How do our daily uses of the Internet fit into the picture? How do they impact on our daily epistemological conduct as social scientists?

5. The Beliefs and Practices of the Developers of Online Infrastructures

The success of the Internet as a global network for the production and exchange of electronic information is self-evident. The numbers are eloquent. In May 2002 there were over 222,000 LISTSERV lists (L-Soft International 2002); 100,000 e-mail newsletters emanating from 70,000 individual publishers (Topica Email Publisher 2002); and over 350,000 USENET groups (Google 2002). Also, in late 2004 there were over 56.9 million Web servers (Zakon 2005) housing at least 15 billion electronic documents in hypertext format. This volume of data surpasses the entire holdings of the US Library of Congress, the largest book library in the history of the world. All those resources, and those added since, are readily accessible *anywhere* and *anytime* by *anyone* with basic literacy and a connection to the Net.

Remarkably, the growth and proliferation of Internet-based information resources is not the intentional outcome of the activities of any single organisation, nor of any group working with a common plan. This gigantic development happened perfectly spontaneously, and virtually against the laws of any ordinary logic (see Reid 1997, Gillies and Cailliau 2000). Nonetheless, if we look at the years between 1969 and today, we can discern five elements without which it is unlikely that the Internet would be as ubiquitous and usable as it is today.

First of all, there has been a philosophy of "open source" programs and software. Both the algorithms and their implementation - in the form of code written in a particular programming language - were fully in the public domain.

Secondly, a piecemeal, collective approach has traditionally been taken to programming and to resolving engineering problems. The "bazaar" approach (Gabriel 1996, Raymond 1997-2000, Cavalier 1997-1998, Ditlea 1999) recommends that a minor but promising product be released, and that this release be followed by a series of quickly-paced revisions and adjustments. Public contributions are welcomed, acknowledged and encouraged. Moreover, any problems are publicly communicated and descriptions of them are recorded, i.e. documented and archived for future public reference. In addition, as many past versions of solutions as possible are permanently stored and kept in full public view. This bazaar approach to design was found to be a far more effective philosophy than the use of structured, methodical, once-and-for-all approaches - sometimes compared to the building of an intellectual "cathedral".

The third ingredient of the Internet's success consists of the tightly-run and intensively used communication loops, in the form of moderated specialist mailing lists and discussion groups, which enable people to brainstorm freely and safely, and to exchange new ideas at great speed.

Fourthly, participants in all those public discussions dealing with ways to improve a particular piece of software are rewarded (with prestige and renown) for the substance - that is, practical content as well as the intellectual fertility of their contributions - rather than for the fashionable phrasing of their communications. What really matters is the ability to make a clear, substantive point, and it is never a question of eloquence, or of public relations value.

Finally, the ultimate success or failure of a current iteration of software or a piece of engineering is to be judged by looking at the formal integrity of the solution; that is, in terms of correct performance under all circumstances, as well as in terms of its superior usability.

This innovative five-fold value system underpinning the spontaneous emergence of a global networking infrastructure - its hardware, applications and formal protocols - was found to work brilliantly in almost all computing and networking contexts. Its fruits are many. One consequence is the immense popularity and widespread use of the UNIX operating system, which was first introduced in 1969. Email, developed in late 1972, is another (Hafner & Lyon 1996:191-206). A further example is offered by the victory of TCP/IP (a 1978 invention) over the very thoughtful, very logical, and for a while very promising Open Systems Interconnection (OSI) Protocol, a specification which had the administrative backing of a number of standards bodies (Hafner & Lyon 1996:252).

Similarly, the bazaar approach also underpins the process by which Gopher subsumed the navigational functions of the FTP Archie software in 1991, only to have its incipient hypertext capabilities supplanted in turn by the newly established - invented in 1991, widely adopted in 1994 - World Wide Web (Ciolek 1999). And, of course, there is the case of the exponential growth of WWW technology itself (Shirky 1998), or, more recently, of the runaway success of the LINUX operating system.

All the above lead us to a general conclusion: the dominant strategies which were used in the creation of the global network infrastructures from 1969 and the times of ARPANET and NSFnet until now are strongly infused with the epistemology of transformative knowledge.

But this is not the end of the story. Further and complementary developments can be seen to emerge. Once the online infrastructure was in place, the next wave of online users became visible very quickly. The hardworking pioneers and developers were joined by growing ranks of electronic publishers - people who were busy crafting and installing thousands upon thousands of online documents, including more serious and academically-minded works (see Harnad 1991, Harnad 1996, Anonymous 2002a, Anonymous 2002b). However, most of those e-publishers seemed to take the Internet's hard-earned brilliance for granted, and by doing so they unwittingly overlooked its ethos of transformative knowledge.

6. The Three Key Qualities of the World Wide Web

Electronic publishers' preferred tool was, and continues to be, the World Wide Web. Why so? Because the WWW confers on anyone interested in online publishing three magnificent and unprecedented advantages (Ciolek 1999).

First of all, there is the great freedom and richness with which information resources can be built and interlinked by means of hyperlinks. Second, the Web provides a very simple method for structuring and formatting text, as well as for combining it with images and sounds. Third, and probably most important, the Web is the first technology in the history of humanity to give so many ordinary people the possibility of self-publishing for a mass audience. This vital capability was not achievable with any of the earlier electronic tools such as remotely accessed databases, anonymous FTP archives, Wide-Area Information Servers (WAIS), or Gophers.

The Web *is* the turning point in the history of people's relationship with the Internet. Until the advent of the Web, all electronic publishing projects, all activities - however large or small - always had to be implemented and authorised by technicians in charge of a machine with FTP, WAIS or Gopher servers. However, the arrival of the WWW completely changed that situation. The Web allows literally anybody with an account on a machine with WWW server software to release their electronic information for world-wide public access and inspection - at any time and in any volume. What is more, this is possible with ease, speed, and at a low cost. All of these qualities can be understood to be outcomes of the bazaar manner of the WWW's inception.

Given its advantages, it is almost inevitable that the WWW should have become the most popular and most heavily used online publishing tool in all parts of the world. Also, since about 1994 it has been used very extensively to move scholarly information from the realm of private ink (in the form of small-scale exchanges of printed or hand-written matter) to that of public bytes (in the form of country-wide and transcontinental exchanges of digital materials).

The large scale process of bringing scholarly information into the public realm took place in a particularly idiosyncratic manner. The open source approach and the open document philosophy which were embraced so widely and earnestly by the infrastructure builders have also found followers amongst electronic publishers. But that highly effective and disciplined value system has been deployed with a special twist.

7. The Beliefs and Practices of E-Publishers

There are three large-scale surprises in the way the WWW's openness and visibility have been used.

Firstly, the publication of online commentaries and discussions has far exceeded the online provision of the full data-sets on which such commentaries and conclusions rest. In other words, information about social science data collection techniques, methodology and data analyses is placed in full public view, but the data themselves are published only in a highly selective and aggregated manner.

I have been working online since August 1991. In that time, I have encountered very few people who publish their research online and, at the same time, hyperlink it to companion electronic documents displaying all the cited data, especially data listed at the level at which they were originally collected. This is strange, to say the least. Despite the ubiquity of electronic storage, despite its real and obvious potential for easy world-wide access to information, and despite the remarkable cheapness of such electronic data storage, there is a definite lack of interest on the part of social science and humanities scholars in releasing the complete contents of their data sets. Instead, the prevailing norm is to offer only summary cross-tabulations and aggregated listings of select materials.

This would suggest that the old logic of a paper publication has been carried over into the very different environment of the electronic world. Authors still summarise, abstract and compact their evidence, even though informational real estate (measured in number of published pages) is no longer constrained by the economics of printed folios.
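
The alternative which cheap electronic storage makes possible can be sketched in a few lines: the complete raw records are released alongside the customary aggregate, and the summary page is hyperlinked to them. The sketch below is purely illustrative (in Python; the field and file names are invented).

    # Hypothetical illustration: publish the full raw records as well as the aggregated
    # cross-tabulation, and let the summary link back to the complete data set.
    import csv
    from collections import Counter

    raw_records = [   # stand-in for data "at the level at which they were originally collected"
        {"respondent": 1, "region": "north", "answer": "yes"},
        {"respondent": 2, "region": "south", "answer": "no"},
        {"respondent": 3, "region": "north", "answer": "no"},
    ]

    # 1. Release the complete raw data set.
    with open("survey-raw.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["respondent", "region", "answer"])
        writer.writeheader()
        writer.writerows(raw_records)

    # 2. Release the customary aggregate, hyperlinked to the raw file.
    counts = Counter((r["region"], r["answer"]) for r in raw_records)
    with open("survey-summary.html", "w") as f:
        f.write('<p>Aggregated counts (full data: <a href="survey-raw.csv">survey-raw.csv</a>)</p>\n<ul>\n')
        for (region, answer), n in sorted(counts.items()):
            f.write(f"<li>{region} / {answer}: {n}</li>\n")
        f.write("</ul>\n")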

The second notable development is that, despite unruly but very promising beginnings in the use of Usenet groups and email lists, the majority of online communication in the social sciences is now used increasingly often for reporting alone, and not for more reflective and discursive exchanges (see also Table 2: The developmental stages of electronic and networked Asian Studies, in Ciolek 2003).

The movement away from the discursive mode of public email exchanges appears to be growing. I can see it clearly in the exchanges posted to the ten or so Asian studies and humanities mailing lists I subscribe to. Over the last seven or eight years the frequency of critical discussions of people's work has diminished steadily. There seems to be a sentiment among those using a large-scale electronic forum that an attempt at such a candid assessment is, actually, an uncivilised or even hostile, i.e. *ad hominem*, initiative. This is so even if the only things that get dissected and criticised are the shortcomings in that person's published work.

As a consequence, candid and substantive comments on a piece of electronic work are communicated privately, via personal email or during small scale face-to-face meetings. In other words, the tradition of an energetic public critique of one's work so happily and fruitfully engaged in by those who developed the Net, is now just as energetically shunned by those who use it for scholarly publication.

The third major tendency which can be observed among electronic publishers is their abandonment of the original tenets of the bazaar (i.e. piecemeal re-structuring) approach. These days, after a major electronic paper is launched, a series of minor adjustments to the document takes place. Such corrections are implemented in a very special, and startling, way. The person who receives useful feedback from their colleagues returns to their online document, corrects the document in light of the received remarks, and then... republishes the amended work at its original online address! That is, the new version of the document is almost invariably used to completely obliterate its former iteration. Paradoxically, in the world of infinite electronic writing space, palimpsests are created on a regular basis.

Among the tens of thousands of online documents constituting academic work, there are only a handful of disciplined sequences of versions numbered 1.0, 1.1, 1.2, and so on, of the same scholarly analysis. Online audiences always encounter only the very latest incarnations of research papers. There is simply no custom and no expectation that a full sequence of the major developmental stages of an idea or methodology will be retained. We seem to be oblivious to the fact that such a practice annihilates the vast potential contained in the genealogy of errors, logical blunders, and problems encountered. As a consequence, conclusions and arguments present themselves as invariably correct and eternally true.
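
By way of contrast, retaining such a genealogy of versions requires very little machinery. The sketch below is a hypothetical illustration (in Python; all file names are invented): it archives the copy currently online under a numbered name, notes it in a public version index, and only then releases the revised text at the original address.

    # A hypothetical sketch of "revise without obliterating": the superseded copy of a
    # paper is archived under a numbered name and logged in a public version index
    # before the revised text replaces it at the original address.
    from pathlib import Path
    from datetime import date
    import shutil

    def publish_revision(live_path: str, new_text: str, old_version: str) -> None:
        live = Path(live_path)
        if live.exists():
            # keep the superseded iteration intact, e.g. "paper-v1.2.html"
            archive = live.with_name(f"{live.stem}-v{old_version}{live.suffix}")
            shutil.copy2(live, archive)
            index = live.with_name(f"{live.stem}-versions.txt")
            with index.open("a") as f:   # append to the public genealogy of versions
                f.write(f"v{old_version}\t{date.today()}\t{archive.name}\n")
        live.write_text(new_text)        # only now is the live copy replaced

    # e.g. publish_revision("paper.html", revised_html, old_version="1.2")

In this way every superseded iteration remains in full public view, and the developmental stages of an argument can still be cited and compared.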

A dear price is paid for this practice of silent overwriting.

8. Unforeseen Epistemological Consequences of the Internet Revolution

If corrected versions of documents are used to overwrite (replace and destroy) their previous iterations, and if such a practice is widespread and un-acknowledged, then the phenomenon of online self-publishing made possible by the WWW has a number of deep long-term consequences. First of all, scholars' practical reactions to uncovered errors in their works are irretrievably relegated to the private, and thus secretive and emotionally charged domain. Secondly, onlookers are excluded from the process and do not have the opportunity to learn from the body of known and identified mistakes. Thirdly, despite the normative exhortations of Popper, errors start being regarded once again as shameful and unmentionable. They become something which should swiftly be rectified, and the act of their correction is re-cast as a private and embarrassing activity to be covered up and consigned to an Orwellian "memory-hole."

These developments can only mean one thing: our current uses of the Internet hinder and discourage the public critical analysis of online publications by our peers. These developments *immunise* our online work against the spirit of transformative inquiry.

The situation is invidious. As social scientists we are exposed to the real risk of being seduced by this miraculous, "instant publications, instant updates" WWW technology, much to the detriment of our discipline. Should an error in our work be found, we are seduced into correcting it secretively, thus killing its provenance and destroying both the delicate embryology (Harnad 1990) of the investigated problem and its wider context. Increasingly often we are inclined to use the Web in such a way that we appear to be eternally correct. By our participation in that process we tacitly admit that - in fact - we do not care any longer whether we are any closer to the truth, as long as at each instant of our intellectual career we manage to appear error-free.

If this trend in online scholarship becomes established, then the radical transformations in our ways of thought will correspondingly diminish. Instead, in such an intellectual climate, scholarship would be dominated by the mere accretion of findings, and by their minuscule, infinitesimal modifications.

Thus, we are forced to conclude that our earlier description of the swift and triumphant transition "from private ink to public bytes" in fact carries a mixed blessing.

During the first 25 years of the Internet's history (1969-1994) the manner in which the Arpanet/Internet was *built* promoted and created vast new potential for transformative knowledge. But, since the mass adoption of the WWW as the dominant online scholarly technology in 1994, the way the Internet is *used* actively reinforces cumulative knowledge. It is a most startling discovery.

Perhaps, in the near future we will be able to find further unintended consequences of our private and public behaviours in cyberspace. And, perhaps, if some of us are diligent and selfless enough, then we may publish candid logs of our assumptions, investigations and conclusions, complete with the record of all our hesitations, successes and disasters. Why? Because justice must be done, and must be seen to be done. Likewise, we must keep any revolutions and revisions in our approximation of truth wholly unambiguous, and completely transparent.

For by now it should be perfectly obvious to us, the researchers, that the tree of knowledge blossoms best when it is rooted not only in a stratum of publicly formulated conjectures and refutations, but also in the rich manure of those conceptual tools and data which we, when they no longer help us to find the truth, freely discard in *full public view*.

9. Acknowledgements

This paper is a revised version of my keynote address "From Private Ink to Public Bytes: a hidden impact of the Internet revolution on Social Sciences and Humanities " [www.ciolek.com/PAPERS/ink-and-bytes2002.html] given at the "Internet and Society 2002" Conference, National Tsing Hua University, Hsinchu, Taiwan, 31 May-1 June 2002. My thanks are due to the University for inviting me to speak at that Conference. Also, I am most grateful to Penny Ramsay, Olaf Ciolek, and Monika Ciolek for their unexpected and illuminating comments on earlier drafts.

10. References





Copyright (c) 2005 by T.Matthew Ciolek. All rights reserved.
