Designing Time-Resilient Assets — Why a 1995 TeX Source Still Compiles in 2026


A LaTeX source file written in 1995 still compiles to PDF in 2026 with a single lualatex-ja command. The fact may sound technically unremarkable. It becomes oddly heavy, however, when juxtaposed with another: for most knowledge workers, 30-year-old intellectual assets are no longer accessible.

Most of us lose our work on a far shorter timescale. Five years and a Word document opens with garbled layout. Ten years and a Notion notebook is locked behind an account that no longer exists. Fifteen years and the SaaS service has shut down, taking the materials with it. Twenty years and the MO disk has no reader. Against that ecology, a one-line command rebuilding a paper from three decades ago is not a triviality.

This article articulates the design principles that keep intellectual assets durable across long time horizons. The central claim is simple ── building assets that don’t rot is a matter of treating “will this still open in 30 years?” as a criterion at the moment of writing. The criteria themselves are not new; they descend from the UNIX philosophy. What is new, in an era of proprietary formats and SaaS dominance, is that holding to these classical choices consistently has become an active selection rather than a default.

How Assets Rot

Several distinct routes lead to assets becoming inaccessible.

Generational drift in proprietary formats: Word .doc files from the 1990s are only partially compatible with .docx. AppleWorks documents are stranded because the format was never carried forward. Ichitaro, Lotus 1-2-3, QuarkXPress ── each, when its era of dominance passed, left its files readable but no longer editable.

SaaS platform dependency: Notion, Evernote, Google Docs, Dropbox Paper ── access ends the moment a terms-of-service change, a pricing revision, a service shutdown, or an account suspension occurs. Export functions exist in most cases but typically degrade structure and metadata.

Font discontinuation: Commercial fonts require continuing licences; specialty fonts vanish when their foundry dissolves. PDFs without embedded fonts re-render incorrectly, and pagination breaks.

Editor generation gaps: 32-bit to 64-bit, PowerPC to Intel to Apple Silicon, Windows 7 to 11 ── each transition makes “reproducing the editing environment of a particular moment” progressively harder. Emulators are partial workarounds, never full ones.

Physical media lifespan: Floppies, MO, CD-R, DVD-R, external HDDs ── each fails at the 5–20 year mark, often including the reader hardware itself. Cloud storage seems to solve this but substitutes the SaaS dependency problem above.

Recurring authentication cost: Adobe Creative Cloud, Microsoft 365, and various commercial licences revoke editing rights the moment payment lapses. The assets remain, but the means to edit them does not.

These compound, and the result is what could be characterised as the standard trajectory of contemporary intellectual labour ── personally accumulated work becomes substantially inaccessible within a decade or two.

The Structure of Durable Assets

Asked the other way around ── what does survive time? ── the answer comes into focus. Six design principles can be drawn out:

  1. Plain text first ── UTF-8 text opens in any editor across Unix-like systems and Windows, is grep-able, diff-able under any version control, and Git-trackable. It has been the single most stable format for over five decades, and the safest bet to remain viable for the next thirty.

  2. Open formats ── LaTeX (Knuth 1978[3]; Lamport 1986[4]), Markdown (Gruber 2004[5]), JSON, PNG, PDF/A (ISO 19005[6]). These have published specifications and multiple independent implementations. Their continuity does not depend on the fortunes of any single vendor.

  3. Source separated from artefact ── treat the text source as canonical and PDF / HTML / EPUB outputs as regenerable derivatives. As long as the source remains plain text, output rendering can be redone whenever the rendering engine evolves.

  4. Distributed storage via Git ── distributed version control depends on no single platform. GitHub, GitLab, a self-hosted remote, a local disk: any one of these can hold a complete history, and git clone replicates the whole tree elsewhere.

  5. XDG standards ── the XDG Base Directory Specification[7] standardises where configuration ($XDG_CONFIG_HOME), data ($XDG_DATA_HOME), and cache files live across operating systems. This dramatically lowers migration cost when machines change.

  6. Minimised dependencies ── home-grown scripts are composed from POSIX shell (stable for 50+ years[8]) and standard UNIX tools (grep, awk, sed, find) that have remained essentially unchanged since the 1970s.
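Principles 1 and 5 can be sketched together with nothing beyond POSIX tools. The file contents and variable names below are illustrative assumptions, not material from the article:

```shell
# Minimal sketch: XDG fallback for locating configuration, and a
# plain-text search with tools essentially unchanged since the 1970s.

# XDG Base Directory fallback: honour the variable if set, else default.
config_dir="${XDG_CONFIG_HOME:-$HOME/.config}"
echo "config under: $config_dir"

# Plain text written today stays grep-able with standard UNIX tools.
notes=$(mktemp)
printf '%s\n' '% thesis.tex, 1995' '\documentclass{article}' > "$notes"
matches=$(grep -c 'documentclass' "$notes")
echo "matches: $matches"
rm -f "$notes"
```

The fallback idiom `${VAR:-default}` is itself POSIX shell, so the script carries no dependency beyond the standard it names.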

None of these are new design principles. The UNIX philosophy provided this answer in the 1970s[2]. What is new is that, in an era dominated by SaaS and proprietary formats, holding to these classical choices consistently has become a deliberate posture rather than a default one.

Orchestrator Stance ── Not Writing Generates Durability

Among the design choices required for durable assets, the most often missed is deciding not to write in the first place.

Writing your own code, your own format, your own tool means you are responsible for maintaining it for thirty years. This is essentially impossible. The author retires, loses interest, runs out of time, falls behind technically ── and the assets rot.

Eric Raymond’s reuse principle (1999)[9] holds that great programmers know what to rewrite (and reuse). Read along the time axis, this proposition extends naturally to durability: write only what you can maintain over thirty years; for everything else, ride on the durable infrastructure that others have maintained for that long. TeX and LaTeX (Knuth’s documented commitment to backward compatibility is famous), POSIX (50+ years of stability), Git (immutable history since 2005) ── these are 30-year-class “good programmers’ code” already in place.

The reuse principle, often discussed as a matter of technical efficiency, also operates as a durability principle when read along the time axis. Choosing not to write is, simultaneously, choosing not to be on the hook to maintain what you wrote.

Thirty Years of Empirical Evidence

The argument here has been verified by the writer’s own practice across a thirty-year window. A 1995 undergraduate thesis, a 1999 master’s thesis, and the bulk of subsequent research notes, conference materials, and manuscripts remain in plain text and open formats ── and remain compilable, editable, and searchable today.

What matters is that no dedicated preservation effort has been involved. Three things only were decided at the time:

  • Write in plain text (TeX / plain UTF-8)
  • Choose open formats (LaTeX, later Markdown)
  • Distribute backups (local disks plus multiple remotes; later, Git)

Thirty-year compilability emerged as a side effect of these three choices, not as a dedicated archival project. This is more accurately described as avoiding short-term-easy choices that rot quickly than as long-term planning. Proprietary tools demand fresh learning at every fad cycle and impose export costs to move assets out. Plain text and open formats erase those learning and migration costs across decades. Long-term durability emerges as the side effect of avoiding short-term ease.
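The third choice ── distributed backups ── can be sketched with nothing beyond Git itself. The file, remote names, and paths below are illustrative assumptions, not the author’s actual setup:

```shell
# Sketch: one plain-text source, its full history in Git, and several
# independent copies, any one of which holds the complete history.
set -e
work=$(mktemp -d)
cd "$work"
git init -q thesis && cd thesis
printf '%s\n' '\documentclass{article}' > thesis.tex
git add thesis.tex
git -c user.name=me -c user.email=me@example.org commit -qm 'thesis, plain text'

# A bare clone on another disk is already a complete, independent copy:
git clone -q --bare . ../thesis-backup.git

# Hypothetical additional remotes (not executed here):
#   git remote add origin  git@github.com:me/thesis.git
#   git remote add mirror  user@nas:/srv/git/thesis.git
git --git-dir=../thesis-backup.git log --oneline
```

Because every clone carries the whole history, losing any single location ── including the hosting platform ── loses nothing.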

Common Objections

“Aren’t specialty tools just better?” ── Yes, in the short term. Photoshop, Illustrator, Final Cut Pro provide the best editing experience available at any given moment. The trade is lock-in. The pragmatic resolution is to separate “30-year assets” from “3-year assets” ── avoid proprietary tools for the former, embrace them for the latter.

“Why not just keep the PDF?” ── PDFs are sufficient for reading but not for editing. PDF is a regenerable derivative, not a source. Without the source, one cannot add a citation, reorder chapters, fix a typo, or translate the work. Readable and re-writable are distinct preservation requirements.

“Isn’t the cloud safer?” ── Physical reliability, yes, but physical reliability is not the same as continued ownership. Cloud providers can change terms, raise prices, terminate services, freeze accounts. Long-term durability presupposes a complete copy in the writer’s own hands. Git’s distributed model embodies this exactly ── any single location can be lost without losing the whole.


Re-read alongside the broader dlab corpus, this 30-year claim coheres with several adjacent arguments — and is, at bottom, the time-axis version of the stance declared in Relative, Not Deterministic[1]: do not fix on a particular editing tool (a mediator); fix the open-format boundary and let rendering engines come and go relatively. Leaving an asset is also preparing for the future reader’s Validation ── leaving room, in an open format, for that reader (the 30-years-later you) to articulate retroactively “what it was that I wanted to record”.

The interface principle from 記録を標準化する (Standardising Records) operates not only between persons but across time: the writer’s future self is, effectively, another person, and assets must satisfy interoperability with that future agent ── honouring standard formats also makes Verification across time (detecting whether a past asset and a present engine agree) possible. The thought-and-tool framework from 書くことは考えることか (Writing as Thinking) and 思考と作業を分離する (Separating Thought from Labour) extends along the time axis to demand that the mediator itself remain stable enough to function as a long-lived cognitive extension. Fukuoka’s dynamic equilibrium from 世界は分けてもわからない ── originally an account of biological systems’ constant turnover ── applies by analogy to intellectual assets ── content survives through the flow of formats. Plaintext source persists as rendering engines (TeX → LaTeX → LuaLaTeX → whatever next) come and go.

There is also an anti-rentier dimension ── which is the problem of mediation itself. When the mediator (the editing platform) moves under you ── terms revised, prices raised, service terminated ── your access to your own intellectual assets moves at someone else’s convenience. Whether to pay platform rent for thirty years, or to choose open formats up front and pay nothing afterward, is the most concrete instantiation of keeping the mediation under your own control: an anti-rentier stance in intellectual work.

Designing time-resilient assets, in the end, is writing a letter to one’s future self in an open format. Today’s writer is the sender; the recipient is some unknown reader in 2056. The post office that mediates between them is not a platform but a long-running service that text and the UNIX philosophy have operated since the 1970s ── and that has not yet shut down.

References


  1. For the operational definitions and the theoretical treatment of mediation / différance / the V&V asymmetry, see the footnote in the foundational essay Relative, Not Deterministic, and the author’s Zenodo preprint series (Letter version DOI: 10.5281/zenodo.20096463). 

  2. McIlroy, M. D., E. N. Pinson, and B. A. Tague. “UNIX Time-Sharing System: Foreword.” The Bell System Technical Journal, vol. 57, no. 6, July–August 1978, pp. 1899–1904. 

  3. Knuth, Donald E. The TeXbook. Addison-Wesley, 1984. Knuth’s own treatment of TeX’s design and his commitment to backward compatibility. 

  4. Lamport, Leslie. LaTeX: A Document Preparation System. Addison-Wesley, 1986. 

  5. Gruber, John. “Markdown.” Daring Fireball, 2004. https://daringfireball.net/projects/markdown/ 

  6. ISO 19005-1:2005 Document management — Electronic document file format for long-term preservation — Part 1: Use of PDF 1.4 (PDF/A-1). International Organization for Standardization, 2005. 

  7. freedesktop.org. “XDG Base Directory Specification.” https://specifications.freedesktop.org/basedir-spec/ 

  8. ISO/IEC 9945:2009 (POSIX.1-2008) Information technology — Portable Operating System Interface (POSIX®) Base Specifications. International Organization for Standardization, 2009. 

  9. Raymond, Eric S. The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. O’Reilly, 1999, Lesson 5.