Thursday, March 27, 2008

Beyond knowledge production: Wikipedia as cognosphere

(disclaimer: this blog posting is intended to be read similarly to a wiki article; it is a work in progress. I must turn this blog post into a Master's thesis that I can defend by August 2008)

As with most literature on web 2.0 technology, words like "open source", "crowdsourcing" and "mass collaboration" are conceptually committed to industry and organizational management needs as well as product improvements. In a web 2.0 world that works right, communities of software engineers pool thought in order to debug problems and accelerate the pace of innovation; likewise, information workers in any venue can reap rewards from the way these technologies facilitate the distribution of labor. Because web 2.0 technologies are seen as superior organizational tools, authors and critics tend to understand their function and impact through the narrow lens of producerism. Thus, you will find these tools evaluated for their ability to somehow improve the accuracy, profitability or usefulness of some research project (i.e. product refinement). It is no surprise, then, that an emphasis in the arena of Wikipedia research concentrates on questions of information credibility and usability.1

In a different paradigm of inquiry, however, these same technologies exist as something more than the internet's version of the product assembly line. The overlooked process is one where internet tools work like a bridge allowing different parts of the world to see and speak to each other. Wikipedia is to globalization what the corpus callosum is to the left and right brain hemispheres: a constant coordinator of disparate information that will, in turn, be sent out for higher-level processing.

Synthesis of separated elements, especially knowledge synthesis, is the production motif relevant to today's online projects in a collaborative age. This should be contrasted with the closed-door intellectual exchanges between the information-working elites of yesteryear. The comparison serves to examine the effects of dialogic thinking within online epistemic communities. With web 2.0 technologies, the argument goes, an online space will at least ground its production process in a wider, more pluralistic human perspective.

[expand on the idea of knowledge production 2.0 here. key words: refinement. symbiosis. dialectics. exlectics]


Writing Dump


Digitization movements have become the necessary first step in the massive hauling of textual artefacts left by our predecessors into a new theatre. Accelerating the movement is Google, as it continues to physically capture untold terabytes of printed knowledge that existed prior to the birth of its digital empire. A sea of old, dusty books awaits a new existence as potential nodes of a larger network. Metadata such as tags, RSS feeds, comments, and social bookmarks offer the potential to breathe life into texts that had no way of circulating and attracting the same kind of attention in the physical world. While it is still unclear exactly how raw printed materials will be digested by mass internet communities, one thing is clear: fresh online output now grows from an informational base that is deeper and richer than ever before.

For Wikipedia, the exemplar of this new paradigm of content production, the job from the outset was to import a pre-constructed universe of knowledge, in piecemeal fashion, into its collaborative encyclopedic format. Out of all the web 2.0 movements, it is currently the only one significantly supplanting, interrupting, or competing head-to-head with texts (books, reports, magazines, etc.) that are in the same business of distilling meaningful information about our known world. A less invasive version of Wikipedia might have limited the masses to commenting in the margins of authoritative texts. Instead, Wikipedia gave internet audiences prime textual real estate. This "encyclopedia" of our newly connected world also became the newest technology granting the public unprecedented managerial powers, instantly demoting authoritative sources to a function that was at most supportive or marginal.

Along with this newfound power, Wikipedians assumed the burden of culling through a disorderly and contradictory repository of pre-existing texts, from which they would collaboratively assemble realities of all kinds. One resulting consequence of collaborative production has been the explosion of knowledge articles covering everything from the mundane to the absurd. With Wikipedia viewing itself as a project to become the world's largest and best encyclopedia, activity and the deliberations backing it appear purposeful and consensual, even with the full understanding that editors are often weighing their ideas against each other as is done in the scientific process (agonistic reasoning).



, much the way that the SETI@home search for extraterrestrial intelligence had users at home share some of their computers' processing power to crunch numbers.

Residing in the most socially sensitive areas in Wikipedia does one find

... could be summarized to the point of discernibility.

Writing Dump (please disregard everything below)

Observation of changes in the shape and form of knowledge, supplemented by a history log that tracks every change made by anybody to an article, clues us into the key structures of knowledge that are vulnerable to re-negotiation. This thesis understands Wikipedia through a double lens which sees content being assembled as it is also torn down and re-structured.
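For readers curious how such a history log can actually be consulted, MediaWiki exposes each article's full revision history through its public web API. The sketch below is illustrative only: the endpoint and query parameters follow MediaWiki's documented revision query, while the helper functions and the hand-made sample payload are my own assumptions for demonstration, not part of the thesis.

```python
"""Sketch: reading an article's revision history via the MediaWiki API."""
import json
from urllib.request import urlopen
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def fetch_revisions(title, limit=50):
    """Query recent revisions (timestamp, user, edit summary) for one article."""
    params = urlencode({
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvlimit": limit,
        "rvprop": "timestamp|user|comment",
    })
    with urlopen(f"{API}?{params}") as resp:
        return json.load(resp)

def extract_revisions(payload):
    """Flatten the API's nested response into (timestamp, user, comment) tuples."""
    revs = []
    for page in payload["query"]["pages"].values():
        for r in page.get("revisions", []):
            # "comment" can be absent when an edit summary was suppressed
            revs.append((r["timestamp"], r["user"], r.get("comment", "")))
    return revs

# A tiny hand-made payload in the API's response shape, so the flattening
# step can be checked without any network access:
sample = {"query": {"pages": {"123": {"title": "Example", "revisions": [
    {"timestamp": "2008-01-05T12:00:00Z", "user": "EditorA", "comment": "copyedit"},
    {"timestamp": "2008-01-04T09:30:00Z", "user": "EditorB", "comment": "add source"},
]}}}}

print(extract_revisions(sample))
```

Calling `fetch_revisions` with an article title of interest against the live API returns the same nested shape that `extract_revisions` flattens here, giving exactly the "who changed what, when" record the double lens above relies on.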


This thesis sets out to observe areas of Wikipedia where distinct and novel textual outputs arise out of dialogic interaction. By moving beyond the mainstream analytic framework of Wikipedia products, I no longer concern myself with evaluation criteria such as degrees of "good" or "bad". With subtler evaluation criteria, at least an accounting can be made for the various textual shapes and colorings that result from diverse Wikipedians converging in globally central spaces of interaction. This means looking into the internal composition and content of the "product" itself (the encyclopedic article), bearing in mind that the empty spaces where various editors input thought are akin to a modern information-scape that is fluid and ever-changing. Behind every article is a collaborative work space, called the "talk page," where, in correspondence with the editable article page itself, different systems of representation cohabitate, coalesce, blend or antagonize.

A Wikipedia article could be closed, "certified", rendered into a usable product. But that would just put a moratorium on underlying, dynamic processes that could play on indefinitely as more and more participants gain access to the Internet. These are the lesser understood consequences of a globally far-reaching dialogic interaction. It is a space that combines a multitude of culturally-sealed ideologies, discourses, grammars and concepts -- forcing encounters and collisions of an unprecedented scale.

This research arises out of a gap in the basic understanding of Wikipedia's knowledge production process. A slew of questions presses before the process itself has even been explicated: Is Wikipedia more than just an encyclopedia? If so, what is it? How is global "knowledge" changing because of Wikipedia? How is human understanding being affected by dialogic interactions in Wikipedia? And finally, if we are to look at products, have we properly understood why and how usable texts are born and evolve in this space? By focusing on socially controversial encyclopedic topics, I concentrate on areas of Wikipedia where diverse global inputs are most likely to compete and exchange. In doing so, perhaps a better measure can be found for understanding how Wikipedia fares as a tool that meets the intellectual needs of the 21st century. Before embarking on an explication of knowledge production in Wikipedia, I first offer a historical sketch of the pragmatic function encyclopedias served for the cognitive needs of societies that increasingly came into contact with each other.

The history of cognitive spheres and knowledge games


The encyclopedia has always done much more than simply serve as society's source for reference information. Ulterior agendas returned with a vengeance at the time of the Enlightenment, when elite men of letters strove to end the Church's stranglehold over truth in profound ways -- men such as Denis Diderot and Jean le Rond d'Alembert, who drastically restructured the shape, and therefore the rhetoric, of the knowledge body. For one, they alphabetized all articles from A to Z, in a nod to the spirit of empirical rationalism. By ordering arbitrarily along the alphabet, the encyclopedists dispensed with a metaphysical ordering of the universe.

As it usually went, knowledge systems were owned, controlled and operated by those in power. Oftentimes the most critical periods of transition from one major social or ideological system to the next turned on how well certain ideological groups were able to interfere with a reigning knowledge system from within.

A system's method of survival entails the placement of gatekeepers who "certify" knowledge. Dictionaries and encyclopedias were among the textual artefacts that helped to build a reality of record, preserving legitimized knowledge by pruning the linguistic and conceptual change arising from grassroots or extra-national forces. By controlling the epistemological means of production, elites could hope to increase their hegemonic powers, orienting and steering action, thought and behavior in the social realm.

With almost every encyclopedia project, one can find an individual or community with a system of thinking to promote, one that offered a way for societies to brand a set of abstract concepts and relations. The underlying power inherent in the task of mapping social realities is too great to ignore, and while many encyclopedists fit the historical label of the sincerely "curious" intellectual quite well, it is another thing altogether to dismiss the powers associated with information condensation, systematic excision, context stripping, and the fixing of dynamic phenomena into codified, digestible tracts. To systematize knowledge meant then what it means today: the ability to superimpose a privileged conceptual map over what is a much denser and more dynamic field of cognizable possibilities.

Encyclopedias are fitting to study as culturally-sealed systems of top-down thinking since the mode of knowledge production has always been historically centralized and exclusivist, its efforts usually attributed to elite textual communities or kings with political and ideological agendas. Encyclopedias could, at the very least, be seen as perfectly emblematic of a particular culture's official understanding of reality.

It is at this point that I would like to offer the metaphor of a gel capsule to explain the mobility and interaction of disparate ideas in the age of printed knowledge. Medicine in this capsular form is composed of an admixture of pharmacologically active granular agents held together by a gel encasing. The casing ensures that the pharmacological contents are delivered to their destination with no chance of cross-contamination. In the context of the history of mass communications, certain technologies did to ideas and knowledge what gel capsules have done to pharmaceutical ingredients: ensure the controlled diffusion of pre-formulated content.

Encyclopedias, in the context of this metaphor, are the ultimate gel capsule, packing a full admixture within the sturdiest gel encasing. Books, pamphlets, handbills, plays, and social spaces of deliberation and gossip, of course, function similarly in that they encase processed content that is eventually diffused. The more hands are able to meddle in the "admixture" -- meaning, the greater the access individuals have in determining the outcome of the knowledge -- the less that particular medium resembles a gel capsule. In this sense books are efficient "capsules", since the content within a book is assumed to be sufficiently settled to justify its closing page and hard covers. It is the same with most print materials, whose printing, for economic and customary reasons, is tantamount to closure of the case. The fact that a book's case can be reopened when two or more people gather at a coffee shop to weigh its ideas proves that its "capsularity" is not absolute.

Encyclopedias are more insidious than other print literature because they deal in the business of first assumptions and primary concepts that are already naturalized in language and discourse, making them more difficult to excavate for the purposes of critical inquiry.

This is to be contrasted with, say, the Tree of Cracow, the famous chestnut tree where numerous Parisians went to circulate gossip and news related to the tumultuous events leading to the French Revolution. This culture of oral communication could diffuse information as well, even if its effect was that singular knowledges, say those emanating from the king or the pope, would then refract into a whirlwind of hearsay. At the same time, an actual space that allowed for deliberation was a space that facilitated listening and dialogue, and allowed for the interpretation and re-processing of disparate information. Just imagine the people at the Tree of Cracow, opening and tampering with the gel capsule only to spill all its contents on the ground. The Tree of Cracow did not operate in isolation, however. An explosion of books, pamphlets and newspapers in the last months of 1789 supplemented the grassroots rumblings, each medium doing its own job in propping up the spaces of thought and deliberation that would in turn bring the Old Regime and the Church to capitulation.


[transition needed here]

It is also a given that a multiplicity of textual knowledge systems had to co-exist or compete with each other. In the case of Europe, it is clear that for the most part knowledge flowed freely between capitals and countries, usually travelling along trade routes. Universal methodologies, grounded in the epistemological standards of the day, meant that any philosophe from Madrid to Moscow could travel to Paris to collaborate in the processing of information into knowledge. Differences between knowledge systems, in Europe at least, could be attributed less to geographic difference than to differences in school of thought: Humanists would compete with Scholastics, and Enlightenment thinkers with the clergy. The intense hierarchy needed to achieve universal (European) knowledge implied that its processing would be by nature borderless (among what could be imagined as communities of reasonable men across Europe).

Yet cultural boundedness reveals itself better in instances of comparing knowledge systems separated by extreme geographic distance. European capitals functioned as epistemic centers that imported concepts brought in by voyagers to the far east, assimilating and adapting new concepts so that they could be absorbed into the larger knowledge body.


With particular reference tools serving as the semantic bedrock of particular localities, the globe, up until the age of the Internet, played host to a constellation of encyclopedias, each one projecting its own concept map of reality; each one colored uniquely enough to exhibit obvious disparities in the way nations chose to internally structure and semantically delimit representations of reality. In brief, the rise of national discourses, made artificially intact and complete, brought about the opportunity for conflicting encounters between massive meaning-systems, with cosmopolitan and border cities serving as the most likely agora or field where intellectual, discursive and linguistic currents would cross.

Tuesday, January 29, 2008

Wikipedia and the Fragmented Mirror of Nature

Unlike many previous attempts at capturing some truth about the world, Wikipedia has elected not to impose a prejudicial barrier barring non-elites from joining the process of creating knowledge fit for an encyclopedia. Despite this drastic novelty, Wikipedia presents itself as nothing more than a traditional encyclopedic project, made to create a repository of verifiable reference knowledge for the betterment of global civil society. While the sphere of those who can contribute has changed in profound ways, Wikipedia operates on a fundamental principle that, in the end, no matter how many people or views inform the knowledge-creation process, there rests only one version of reality for everyone to attain.

Various Wikipedia writing guidelines suggest that particular viewpoints are limited versions of something much larger, a knowledge transcendent to all biases and blindspots, one that could embody knowledge from all perceivable angles and instantiations:

"...we can agree to present each of the significant views fairly and not assert any one of them as correct. That is what makes an article 'unbiased' or 'neutral' in the sense presented here. To write from a neutral point of view, one presents controversial views without asserting them...Disputes are characterized in Wikipedia; they are not re-enacted."
Many sociologists of knowledge have referred to this attitude as "the view from nowhere". For Wikipedia, this means an aspiration to divest the most recent version of an article, as much as possible, from any one angle or perspective on a represented reality. This is not to say, however, that Wikipedia's designers think it is actually possible to achieve neutrality:
"If there is anything possibly contentious about the policy along these lines, it is the implication that it is possible to describe disputes in such a way that all the major participants will agree that their views are presented sympathetically and comprehensively. Whether this is possible is an empirical question, not a philosophical one."
The author(s) of this guideline seem to be suggesting that neutrality is something that is, if not achievable, at least potentially workable, as if perfection stood at the end of a linear progression from blind, to aware, and finally, to all-seeing.

So when a backlog of unreconciled writing grows and polarizes more and more Wikipedian users, the guideline's authors assume that people simply aren't yet ready or mature enough for the mission toward which all Wikipedian collaboration aims, the holy grail of the "featured article": an article deemed by a review committee to have met certain writing criteria. Neutrality is one of those criteria, yet not a single featured article related to government or politics, of the kind worth fighting over, has ever made it to the prestigious list.

Empirically speaking, then, Wikipedia is not patching up the great ideological fissures that divide up the world's ideologues, and neutral writing strategies are failing to guide ideological antagonists towards a common place: one which includes, synthesizes, integrates, accommodates and is sensitive to the social situatedness of knowledge artifacts.

In this first section, my goal is to explicate the philosophical incompatibility that exists between Wikipedia's strategy for prescribing writing styles that effect a sense of total awareness (i.e. journalistic notions of objectivity) and an encyclopedic space that is structurally designed to produce monadic representations of reality.

Encyclopedias: structurally inhospitable environments for objective writing


There are many reasons why news media outlets, the original purveyors of disinterested description, are continually able to produce so-called "objective" written accounts of reality. Although this is quickly changing, a news report's purported truth doesn't disintegrate under the prolonged scrutiny of one hundred critical voices the way it can inside Wikipedia. To look at it quantitatively, the less heterogeneous and populous the editorial environment, the less time it takes for a textual product to pass the vetting process. A news artefact can assert its own objectivity in the absence of dissenting views from individuals occupying elevated positions in the contemporary public sphere (this will change to the extent that bloggers continue to acquire attention and respect). In short, the fewer consciousnesses inhabiting the same space, the less likely a particular impression of reality will encounter its challenge.

But more significantly, the content delivered through an encyclopedia article symbolizes something different than the knowledge claims carried in periodicals. The symbolic difference boils down to the fact that periodicals are "snap shots" of reality whereas encyclopedias are the exact opposite -- they are supposed to withstand the test of universal consensus accumulated over time.

This constraint relates back to the historical function encyclopedias had as tools of reference and introductory learning. Whereas encyclopedias attempt to cover the "aboutness" of a particular thing or phenomenon, a news report concentrates on immediate events and, by virtue of its sharp focus, need not address related or contextual information in much depth. Burdened with the ambitious task of integrating and structuring information into a holistic corpus of "human understanding", the information carried within an encyclopedia is defined just as much by its relationship to other phenomena. In other words, while it may be possible to understand what something "is" by reading a news account, it is not until someone reads about it in an encyclopedia that they can get a sense of what something "is not". Naturally, this adds a further burden to the task of encyclopedic representation, since an integrated, structured, comprehensive picture of the world is harder to achieve than fleeting snapshots and news reports that relate less clearly to adjacent phenomena.

Yet a greater reason exists for why an all-encompassing, transcendent reality remains elusive to the encyclopedic project. The culprit lies in the encyclopedia's ambitious attempt to consolidate reality by cataloguing it, taming it -- reducing it from a vast, fluid and multi-perspectival phenomenon to a distilled product, formatted to package information in a way that is topical, segmented, thematic, interlinked, chronological, linear, discursively coherent, and consistent in tone and style.


---

What makes Wikipedia the encyclopedia of its age is the almost militant desire to force the conceptual coherence of knowledge products on a global stage. If the Internet cut its globalized audience some slack by allowing ideas to co-exist in a loosely hyperlinked galaxy of documents, then Wikipedia asked of everyone the unthinkable: mass collaboration under claustrophobic conditions.



---




Encyclopedic initiatives point to an attempt by a group of individuals to corral a sea of information into a manageable textual body: an article. By tackling concepts and phenomena that mean many things to many people, the encyclopedia is responsible for capturing, via description, the polyvalency of its subject. In practice this might involve creating an article on the history of political violence related to the Israeli-Palestinian conflict. The encyclopedist, like the Wikipedist, would believe it possible to create a definitive account of this topic, no matter how volatile or centrifugal the social forces may be that threaten to unravel the body of text into a thousand different ideological strands.

Lying underneath every encyclopedic operation is the act of filtering out information for a distilled knowledge product. There are many methods of arriving at an information-condensed account, be it through excision, elision, grafting, blending and other such acts of reduction.

The second operation is to introduce a structure and order to knowledge.











---

In Wikipedia this difficulty is demonstrated as competing editors disagree over how to arrange and organize certain facts in relation to others. How facts end up getting arranged will, in turn, affect the way they are perceived in terms of significance and importance. This is not to say that in journalism the structuring of information is trivial. To the contrary, a newspaper's pyramidal structure, with its headline and lead paragraph, can do much to determine the significance of a story's various facts. The issue is simply more pronounced and problematic in Wikipedia, where various communities will attempt to manipulate knowledge outcomes through the way articles are structured and named.

Circumventing Neutrality: loopholes at all levels

Loopholes in the encyclopedic structure

1. Incompatible ontology-knowledge categorization schemes
2. Proliferated/redundant nomenclature (titles/headers) and information (article body)
3. Proportional uncertainty of empirical content
4. Narrative options
5. Discursive chains of significance

to be continued...

Saturday, January 5, 2008

Global outcomes for politically volatile knowledge



"In the multiplicity of writing, everything is to be disentangled..."
-- Roland Barthes, 1977, The Death of the Author



Pictured above: a visual scheme of the epistemic threading that forms a Wikipedia knowledge product. Technologies such as Wikipedia that open up to globally diverse communities of knowledge producers can simultaneously achieve different textual forms out of the same knowledge body:

Refined Knowledge


Interlocked Knowledge


Fuzzy Knowledge


As a many-to-many, global network of Wikipedians adds volume and dialogical rigor to the processes involved in representing the realities we consume as knowledge, it will become increasingly unsatisfying to read what provincial knowledge producers (monologuers) have to claim about socially and politically shared realities. Knowledges of mutual interest to multiple communities will increasingly be held to new standards and production processes in a public domain of kaleidoscopically diverse thinkers.

Much discussion has brewed from a desire to harness the "intelligence of masses" and to cross-pollinate our intellectual products with a wide perspectival range, but as of yet there has been scant research observing the state of change affecting bodies of text currently being edited, blended, stitched, dissected or massaged on a daily basis by a new corps of global producers. As products fit for consumption, that the quality and accuracy of Wikipedia's knowledge products have fared so well relative to the Encyclopedia Britannica is a promising sign on which to build.

Yet much of what has been said about the textual products yielded by this wide-reaching collaboration has been negative, citing tasteless prose that lacks the distinctness and authority of an individual's voice. Others have observed that parts of Wikipedia knowledge tend to be factualist, presentist and unprofessional, lacking overall the synthesizing ability and literary flair of accounts written by well-regarded scholars. But why do these collaborative texts turn out the way they do? So far, no compelling explanations have been offered.

This essay proposes that there are three fundamental actions that will determine the form, style and character of politically volatile knowledge in a collision space of culturally divergent producers.

The main premise of my argument is that it does not suffice to raise issue with surface observations about the form and style of Wikipedia's textual products without discussing the underlying processes at the same time; yet surface observation is all most commentators and researchers have thus far managed. It would be akin to seeing the colorful clash of sediment and rock in the Grand Canyon with a tour guide who won't explain the science behind the colors and shapes.


Process #1: refining epistemology

To refine means, above many other things, to selectively reduce, to narrow. It is a concept intimately tied to the scientific process, which gathers multiple competing theorems for the end purpose of eliminating all but the strongest of them. It is an impulse that runs deep within the tradition of Western scholarly inquiry, dating back to the days of Aristotle. Formal deductive logic, dialectics and agonistic reasoning are among the many named processes sharing this same spirit.

Refining actions are the plainest and most evident processes observable in any Wikipedia entry. The MediaWiki software, which enables Wikipedians to edit any part of the text in question, makes it easy for users to isolate particular segments of text, essentially creating the laboratory-like conditions of the observer-analyst. In addition, because any contribution is vulnerable to change by anyone else, individuals must rely on communicated persuasion through reasoning to preserve contributions they believe should stay. If by definition knowledge is to be refined, then there is a will to identify candidate contributions for deletion, usually arrived at by first comparing two or more competing contributions and weighing their relative strengths and weaknesses vis-à-vis each other. The contribution deemed superior rises; the inferior is replaced.
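The comparison of competing contributions that a refining action depends on can be made concrete with an ordinary diff. The following is only a sketch, not a claim about Wikipedia's internals: the two sentences are invented examples, and Python's standard `difflib` stands in for MediaWiki's own diff view. It shows how an observer-analyst isolates exactly which segments one version replaced in another.

```python
"""Sketch: observing a 'refining action' as a word-level diff between two
competing versions of the same passage (sentences are invented examples)."""
import difflib

older = "The plow was invented in China and spread to Europe by traders."
newer = "The plow was developed independently in several regions of Eurasia."

# SequenceMatcher isolates exactly which spans one contributor replaced,
# deleted, inserted, or kept -- the unit of analysis for a refining action.
a, b = older.split(), newer.split()
matcher = difflib.SequenceMatcher(None, a, b)
for op, a0, a1, b0, b1 in matcher.get_opcodes():
    if op != "equal":
        print(op, a[a0:a1], "->", b[b0:b1])
```

The unchanged prefix ("The plow was") is what both contributors tacitly agree on; everything flagged as a replacement is the candidate material that must survive persuasion and reasoning or be banished.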

It is not just strictly empirical/factual content that is easily refined in Wikipedia. Words, ideas, narratives and ontological categories of knowledge become fair game. Wikipedians are constantly reviewing contributions that mutually exclude each other and deciding as a group to banish that candidate contribution with the weaker justification.

In this sense, knowledge refinement fits perfectly into a collective intelligence theory which posits that given enough problem solvers, all imperfections can eventually be fixed ("given enough eyeballs, all bugs are shallow"). As of late 2007, we have no studies to suggest what percentage of Wikipedians feel that their primary motivation for editing is to distill knowledge to purity through a collective-intelligence vetting process.


process #2: Interlocking epistemology

If interlocked knowledge can be proven to be a primary method of knowledge production, then this would do much to explain all the robust user activity surrounding representations of political and social realities, or what I will refer to from here on as the politically volatile. After all, once a full description of an apolitical entry (say, the history of the medieval garden plow) has been constructed, editors move elsewhere and the article stabilizes. Much more labor and intellectual energy must be spent in the deliberative spaces behind sectors of Wikipedia devoted to social knowledge. How much space Wikipedians decide to allocate within G.W. Bush's biography to his alleged cocaine abuse is very much the kind of question that is difficult to resolve with a refining action, since different cultures will subscribe to their own preferred allotments of irreverent presidential knowledge.

Thus, the idea behind interlocked knowledge is that no global consensus for social knowledge can be achieved if it does not attempt to integrate, in some fashion, the multiplicity of fragmented representational value systems. The key premise under this paradigm of knowledge construction is that true social knowledge is a product of intellectual negotiation via the interweaving of idea-concepts into Wikipedia knowledge products. This method is an expression, essentially, of the social constructionist theory of knowledge. A Wikipedian working under this modus operandi is not necessarily averse to knowledge-refining actions. A social constructionist Wikipedian has no pre-constructed notion of what the eventual knowledge product must look like. Rather, this type of user believes that the quality of the knowledge will reflect the processes that gave rise to it (i.e. determined by variables such as the cultural make-up of a particular entry's user demographics and the structure of the knowledge production space). Perhaps Wikipedia is an experimental space, like a crystal ball, where individuals hope to test their limited views against an exalted form of consciousness: in this case, a body of thinkers synergized by the effects of collective co-construction.

It is much more likely, however, that politically volatile knowledge attracts intellectual antagonists who attempt to inflect knowledge products with a particular world view. The consequence of multiple agents exerting force on a text is an inadvertent change to the textual product over time. In cognitive semantics, a creative conceptual blend caused by divergent, oftentimes clashing inputs is called a "double-scope blend."

In Wikipedia, the metaphor of interlocked threading differs from that of a refined thread, since one strand does not suppress the other. There is no pointed end to the interlocked thread, no refined point of truth at the tip. What we have with interlocked knowledge is a simple conceptual blend of two or more separate strands of thinking, interwoven for better or worse in a textual bind.


process #3: Fuzzy Knowledge

Another outcome entirely for massively authored writing projects could be the end of genealogically traceable encyclopedic representations. Whereas with interlocked knowledge the origins of the knowledge are somewhat traceable to a few users' blended inputs over time, with fuzzy knowledge the many users' inputs are so vast, amalgamated and heteroglossic that the visible seamlines of difference within the text are smoothed out, no longer appearing as a patchwork of variegated segments.


At a later stage in this paper, I will demonstrate how massively authored texts can yield information-rich yet shapeless bodies of text. Facts are orphaned from their parent narratives. Knowledge synthesis soon becomes impossible, as there is little social agreement over how to embody and codify raw information into a coherent structure we can call knowledge. With little refinement or blending at work, uncontrolled fuzziness can lead to an unabated information glut. The information, no matter its vastness, is observed, even reviewed and commented on, but no mechanism exists to digest or distribute it into its proper ontological resting place. When we have fuzzy knowledge, in essence, there is no floodgate in place to prevent a sea of voices from washing away the narrative structure.

Disorganization can exist at the foundational level as well. Take, for instance, an age-old epistemological method used to organize raw information for the purposes of meaning construction: the knowledge topic. Magnets of raw information, topics appear as headers, organized alphabetically from A to Z. In traditional knowledge spaces, facts tend to obey a few non-contradictory journeys toward the support of larger categories. Since it has been argued that categories are essentially arguments in themselves, what happens when, in Wikipedia, we see no limit to how many categories and topics can be created? If two people cannot agree on what a fact means within the context of one article, the dissenting editor will simply design a friendlier atmosphere for the disputed fact by reframing the host topic. Readers under these kinds of fuzzy knowledge conditions may experience a read that is confusing, pointless or incoherent.

Given certain textual features of politically volatile articles in Wikipedia that are analyzed later in this study, we must ponder whether wiki collaboration is the technical realization of Roland Barthes's vision of a text that is decentred and liberated from particular voices and other situated collective consciousnesses. What may look like a long, amorphous and inelegant biography to historian Roy Rosenzweig may actually be a text freed from the tyranny of authorial narrative.

In a strongly worded criticism of Wikipedia's textual quality, critic Jaron Lanier writes: "reading a Wikipedia entry is like reading the bible closely. There are faint traces of the voices of various anonymous authors and editors, though it is impossible to be sure". Lanier, preoccupied by the threat to mono-authored literature, seems to fixate on an aesthetic critique, never once pondering how semantic forms in the text may have shifted or mutated as a result of a many-to-many encounter.


Wikipedia's investment in objective encyclopedic representations

To believe that semantic changes are afoot as a result of a globalized encounter between diverse knowledge producers is to be skeptical of Wikipedia's core belief in universally objective representations of reality, achievable through its Neutral Point of View writing policy.

to be continued...