When Sequence Was Still a Promise
There are concepts that arrive too early for their own historical recognition. They appear first as technical curiosities, are briefly discussed in specialist circles, and then disappear beneath more visible technological changes. Only much later does it become clear that what once looked like a marginal formal innovation was, in fact, a symptom of a deeper reorganisation of culture. Hypertext belongs to that category.
For many years, it was treated chiefly as a property of digital writing: a textual structure composed of segments connected by links, associated with early networked literature, experimental interfaces, and the first generations of the web. Such a definition remains technically correct, but culturally insufficient. Hypertext is not merely a textual technique. It is a way of organising discourse under conditions in which linear order no longer guarantees cognitive adequacy.
The significance of hypertext begins precisely where technology ceases to be the main subject. What matters is not the visible link itself, but the logic it introduces:
- discontinuity without chaos,
- plurality without complete dissolution,
- movement through meaning without the obligation of a singular sequence.
Long before digital platforms normalised this logic, culture had already begun to drift toward forms of knowledge and experience that resisted closure. The printed book disciplined thought through a recognisable architecture:
- beginning,
- development,
- conclusion.
It did not merely contain content; it imposed temporal trust. To read meant to accept sequence as a method of understanding. Even when modern literature complicated chronology, interrupted narration, or fragmented voice, the material form of the codex preserved an implicit promise that order existed somewhere, even if deferred.
Yet modernity itself steadily undermined that confidence. The expansion of archives, the multiplication of disciplines, the acceleration of publication, and the increasing density of historical self-awareness gradually produced a paradox: culture was generating more interpretive material than any single line of reading could absorb.
The First Machines of Associative Thought
This problem was diagnosed remarkably early. In 1945, Vannevar Bush proposed a machine he called Memex: a device that would allow scholars not merely to store documents, but to create associative trails through them. His vision emerged not from speculative futurism but from a practical recognition that knowledge had already exceeded inherited methods of access. The scholar of the future, Bush suggested, would need to think through association rather than hierarchy.
What Bush anticipated was not simply the internet. He anticipated informational overload as a permanent civilisational condition.
The crucial aspect of the Memex idea lies in its departure from archival order. Libraries had always relied on classification: categories, shelves, indexes, stable retrieval systems. Bush proposed something closer to thought itself - a movement by connection, by recurrence, by relevance that emerges in the act of reading rather than preceding it. Knowledge, in this sense, becomes navigable not because it is simplified, but because it is linked.
Several decades later, Ted Nelson gave this intuition its lasting name: hypertext. His broader project, Project Xanadu, remains one of the most ambitious unrealised visions in the history of digital culture. It imagined a universal textual environment in which documents would remain permanently connected, every quotation would preserve its source, every link would function bidirectionally, and fragments could appear simultaneously in multiple contexts without losing authorship.
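Two of Xanadu's core commitments can be made concrete in a few lines of code. The sketch below is purely illustrative (all class and method names are invented for this essay, not drawn from any Xanadu implementation): every link is recorded in both directions, and a quotation is a live reference to a span of its source rather than a detached copy.

```python
# Toy model of two Xanadu ideas: bidirectional links and transclusion.
# Hypothetical names throughout; a sketch, not a real Xanadu API.

class Document:
    def __init__(self, doc_id, text):
        self.doc_id = doc_id
        self.text = text
        self.links_out = set()  # documents this one points to
        self.links_in = set()   # documents that point here (the "back" direction)

class Repository:
    def __init__(self):
        self.docs = {}

    def add(self, doc):
        self.docs[doc.doc_id] = doc

    def link(self, src_id, dst_id):
        # Every link is recorded at both ends, so a reader at the
        # destination can always see who cites it.
        self.docs[src_id].links_out.add(dst_id)
        self.docs[dst_id].links_in.add(src_id)

    def transclude(self, doc_id, start, end):
        # A quotation keeps a reference to its source span:
        # provenance travels with the fragment.
        source = self.docs[doc_id]
        return {"source": doc_id, "span": (start, end),
                "text": source.text[start:end]}

repo = Repository()
repo.add(Document("bush-1945", "As We May Think proposes associative trails."))
repo.add(Document("essay", "The memex anticipated linked knowledge."))
repo.link("essay", "bush-1945")

quote = repo.transclude("bush-1945", 0, 15)
print(quote["text"], "<-", quote["source"])
```

The contrast with the actual web is the point: HTML links are one-way and quotations are copies, which is exactly why references break and provenance is lost.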
Text Beyond Linear Closure
Much of what Nelson proposed still exceeds the architecture of the contemporary web. The web popularised linking but abandoned many of the philosophical safeguards Xanadu considered essential. Links became fragile, references disappeared, texts detached from origin, and digital writing developed under conditions of increasing impermanence. What appears today as ordinary instability - broken citations, vanished pages, uncertain provenance - was precisely what Nelson had hoped to prevent.
For this reason, Project Xanadu now reads less like a failed technological utopia than like an unfinished argument against the fragility of digital memory. If Bush and Nelson belong to the technical genealogy of hypertext, its cultural genealogy reaches further into literary and philosophical territory. Hypertext did not emerge against literature but through literature’s own long dissatisfaction with linear authority.
Long before digital systems made linking operational, writers had already begun constructing works that resisted straightforward sequence. Certain modern texts demanded discontinuous reading, recursive return, movement between fragments, annotations, appendices, and internal crossings that destabilised the ordinary temporal flow of reading. In such works the page ceased to behave as a stable unit of progression.
This is why hypertext should not be confused with a digital novelty. It is better understood as a formal answer to a broader condition: the growing inability of inherited narrative structures to contain expanding cultural complexity.
When George Landow later connected hypertext with post-structuralist thought, the argument proved unusually persuasive because the convergence had already been prepared intellectually. The works of Roland Barthes, Jacques Derrida, and Michel Foucault had long challenged the assumption that texts possess singular centres, stable borders, or final interpretive authority.
Hypertext appeared to materialise what theory had already suspected: that meaning does not proceed in a straight line but emerges through crossings, interruptions, echoes, and deferred relations.
For this reason the hyperlink itself should never be mistaken for the essence of hypertext. The link is only its visible mechanism. The deeper transformation lies elsewhere: in the recognition that texts exist increasingly not as isolated objects but as relational environments.
A printed book may refer outward while remaining materially self-contained. Hypertext externalises that relation. It makes adjacency explicit. It turns reference into movement. The reader therefore changes position.
The Reader as Navigator
Under linear reading, interpretation unfolds within a path largely determined in advance. Under hypertextual reading, interpretation becomes inseparable from navigation. The reader does not merely follow argument but repeatedly decides what constitutes continuity.
This transformation first became culturally visible in early digital literary experiments such as afternoon, a story by Michael Joyce, created in Storyspace. The importance of such works does not lie only in formal novelty. Their deeper significance lies in exposing that narrative itself could no longer assume stable sequence without remainder.
In these texts, repetition acquires a new role. Fragments recur under altered contextual conditions. Meaning shifts not because the words change, but because the path leading toward them changes. A lexia read first appears differently when encountered later through another route. This reveals something fundamental: sequence is never neutral. It is itself an interpretive force.
What hypertext exposed in literature gradually became ordinary elsewhere.
The early internet functioned as a laboratory in which this logic became socially visible. Websites, archives, forums, linked essays, unstable paths of reading - all these gave technical expression to a mode of cognition already forming beneath late twentieth-century informational expansion.
Yet even there, hypertext was often misunderstood as merely a feature of interface design. In reality, it signalled a broader cultural mutation: knowledge no longer arrived as ordered succession but as simultaneous adjacency. The reader became less a receiver than a navigator.
Today this appears almost banal because the logic has been absorbed so deeply into ordinary digital life that it often disappears from notice. Opening ten browser tabs, suspending one text to verify another, moving from article to archive to commentary to reference - this is no longer experienced as experimental reading. It is simply contemporary cognition.
Hypertext became invisible precisely because it became normal.
Its discreet charm lies in this disappearance. Unlike earlier media revolutions, hypertext does not declare itself dramatically. It works by quietly altering the structure of intellectual expectation.
One no longer assumes that understanding requires uninterrupted continuity. One expects interruption, branching, provisional return. This has consequences beyond literature.
The encyclopaedic form itself changed under hypertextual conditions. Wikipedia demonstrated at planetary scale that knowledge could remain usable while distributed across endless internal crossings. Authority no longer depended solely on closure. It depended on transparency of connection, revision history, and visible relationality.
That shift also introduced new fragilities: unstable authorship, contested trust, endless revision. But these are not accidental defects. They belong to the same structural logic.
Hypertext as an Early Diagnosis
Hypertext did not create uncertainty. It made uncertainty legible. This is why hypertext should be understood not as a short technological phase preceding platforms, but as one of the cultural forms through which late modernity learned to inhabit informational excess. It answered neither by restoring order nor by surrendering to fragmentation entirely. Instead, it offered a provisional grammar for moving through multiplicity.
This grammar remains unfinished. Much of contemporary digital culture has inherited hypertext while concealing its visible form. Platform feeds increasingly remove explicit links while preserving relational logic beneath recommendation systems, ranking mechanisms, and predictive pathways. What once required deliberate navigation is now often pre-structured invisibly.
But this only confirms the larger point: hypertext was never reducible to clickable text. It named a cultural reorganisation already underway.
Perhaps this is why its earliest theoretical formulations still retain force. They belonged to a moment when the problem was visible because solutions were not yet normalised.
We now live inside many of the conditions hypertext first tried to describe:
- informational disproportion,
- unstable textual authority,
- proliferating contexts,
- interrupted continuity.
The old linear confidence has not vanished entirely. Books remain indispensable precisely because they preserve one of the few spaces where sequence can still be inhabited deliberately.
But outside that protected form, culture increasingly behaves otherwise. It expands associatively, loops unexpectedly, remembers unevenly, and often refuses to begin or end where we expect.
In that sense, hypertext was never simply about computers. It was an early name for the moment culture began to resemble its own excess.
Hypertextual Sketches is a micro-series of essays on hypertext, the post-modern condition of culture, semiotics, and non-linear ways of describing how meaning circulates when continuity breaks down. Original research essays were written between 1997 and 2000, in Prague, Krakow, and Leipzig, when the internet was still experimental, but its logic was already reshaping how we read, write, and think. Larger portions of this work were actually