gekko

quality online informatics since 1994

A solemn man in 1950s attire stands in a vintage airport terminal holding a sticker-covered suitcase, suggesting memory, travel, and data in transit.

Memory by Round-Trip Time

When John Carmack recently mused about storing model weights not in DRAM but effectively “in flight” inside long fiber-optic loops, I felt the peculiar satisfaction of seeing a childhood thought experiment return wearing a proper engineer’s clothes.

His version is, naturally, better. It has numbers, materials, bandwidth, realistic constraints, and the kind of sober physicality that separates “advanced systems idea” from “adolescent daydream entertained while staring out of a window.” He points out that 256 Tb/s over 200 km of single-mode fiber corresponds to a surprisingly large amount of data physically present in transit at any given time. From there comes the delightful suggestion that one might stream deterministic neural-network weights continuously into cache from a recycling fiber loop, a modern descendant of old delay-line memories. It is odd, elegant, and exactly the sort of idea that sounds absurd for ten seconds and then becomes annoyingly plausible.
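The figure is easy to sanity-check: the data resident in a fiber is just bandwidth multiplied by transit time. A minimal sketch, assuming a typical group index of about 1.47 for standard single-mode fiber (an assumption on my part, not a number from Carmack's post):

```python
# Back-of-envelope check: how much data is physically "in flight"
# inside a fiber at a given bandwidth and length.

C = 299_792_458       # speed of light in vacuum, m/s
GROUP_INDEX = 1.47    # assumed group index of single-mode fiber

def bits_in_flight(bandwidth_bps: float, fiber_length_m: float) -> float:
    """Data resident in the fiber = bandwidth * one-way transit time."""
    velocity = C / GROUP_INDEX               # signal speed inside the fiber
    transit_time = fiber_length_m / velocity # seconds spent in transit
    return bandwidth_bps * transit_time

bits = bits_in_flight(256e12, 200e3)  # 256 Tb/s over 200 km
print(f"{bits / 8 / 1e9:.1f} GB in transit")
```

On these assumptions the loop holds roughly 31 GB at any instant: a small but decidedly non-trivial memory made of nothing but glass and delay.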

What amused me was how close this is in spirit to something I used to imagine as a teenager: data storage by laser beam and distance.

The premise was simple enough. If information is encoded in light and sent outward, then for as long as that light has not yet returned, it is, in some sense, still “there.” Not stored in a chip, not written to a magnetic surface, not trapped in a transistor gate, but stretched out as a physical process in space. One could imagine transmitting data toward a reflector placed at some chosen distance. The signal would travel out, bounce back, and return after a delay determined by how far away the reflector was. The storage duration would not be determined by cell leakage or refresh cycles, but by geometry. Memory by round-trip time.

A nearby reflector could function as a sort of cache: fast by astronomical standards, intolerable by any sane computer architect’s standards, but conceptually tidy. A more distant reflector could provide longer retention. If one wished to be grandiose, one could picture an entire hierarchy of storage arranged not by media type but by celestial distance. L1 cache remained mercifully terrestrial. L2 perhaps a mirror on a mountain. Longer-term storage somewhere embarrassingly far away. Backup policies would become orbital mechanics.
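The "retention time" of each tier in this daydream is nothing more than a round-trip light delay, which makes the hierarchy trivially computable. A toy sketch, with illustrative distances of my own choosing rather than anything from the original fantasy:

```python
# Storage-by-distance: retention time of a reflector-based store
# is simply the round-trip delay of light to the reflector and back.

C = 299_792_458  # speed of light in vacuum, m/s

def retention_seconds(distance_m: float) -> float:
    """How long one burst stays 'stored' per bounce: 2 * distance / c."""
    return 2 * distance_m / C

tiers = {
    "mirror on a mountain": 50e3,       # 50 km
    "geostationary orbit":  35_786e3,   # ~35,786 km altitude
    "the Moon":             384_400e3,  # mean Earth-Moon distance
}

for name, d in tiers.items():
    print(f"{name}: {retention_seconds(d) * 1000:.1f} ms per round trip")
```

A lunar bounce buys you about two and a half seconds per circulation, which is an eternity for light and a catastrophe for a cache.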

At that age, this did not strike me as especially ridiculous. On the contrary, it felt like the kind of idea that must surely have been considered by serious people, perhaps in secret laboratories, perhaps with long tables, slide rules, and ashtrays. It seemed entirely possible that the future of computing might involve not denser chips, but larger loops. If wires could carry bits, why should the universe not do the same? Space, after all, is mostly empty, which from an information-theoretic perspective can easily be misread by an enthusiastic teenager as “available infrastructure.”

Of course, the practical objections arrive immediately and in great numbers. Reflection is imperfect. Alignment is fragile. Targets move. Atmospheres interfere. Planets rotate. Clouds are notoriously unsympathetic to high-speed storage systems. Cosmic latency is not merely high; it is philosophically high. Any architecture depending on a signal returning from a useful fraction of the solar system would demand a level of patience not normally associated with interactive computing. One missed reflection and your file system would not so much corrupt as drift into metaphysics.

Still, the kinship with Carmack’s fiber-loop speculation is obvious enough. In both cases, memory is reimagined not as a static object but as a controlled delay. The stored thing is not sitting somewhere quietly; it is moving. It persists because the system is arranged so that it keeps circulating until needed. This is, in a way, a very old idea. Delay-line memory did something similar with acoustic pulses in mercury. Data survived not because it occupied a stable resting place, but because it remained in motion inside a medium whose timing was known. The machine periodically listened for the returning pattern and sent it on its way again.
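The recirculation scheme those mercury lines used can be sketched in a few lines: model the line as a fixed-length queue, and on each clock tick read the emerging bit and immediately re-inject it. The class and values below are purely illustrative, not a model of any particular machine:

```python
# Minimal sketch of delay-line memory: bits persist only because they
# recirculate. The "line" is a fixed-length FIFO; each tick, the oldest
# bit emerges, is regenerated, and is fed back into the line.

from collections import deque

class DelayLine:
    def __init__(self, data: list[int], capacity: int):
        # Pad to capacity: the line's length sets the circulation delay.
        self.line = deque(data + [0] * (capacity - len(data)),
                          maxlen=capacity)

    def tick(self) -> int:
        """One clock: read the emerging bit and re-inject it."""
        bit = self.line.popleft()
        self.line.append(bit)  # regeneration: the bit survives by moving
        return bit

mem = DelayLine([1, 0, 1, 1], capacity=8)
first_pass = [mem.tick() for _ in range(8)]   # one full circulation
second_pass = [mem.tick() for _ in range(8)]  # pattern returns intact
assert first_pass == second_pass == [1, 0, 1, 1, 0, 0, 0, 0]
```

Nothing in the structure holds the bits still; they are readable only at the moment they come around, which is precisely the property the fiber loop inherits.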

There is something deeply pleasing about this class of idea because it strips memory of its apparent solidity. We casually speak of data “at rest,” but in many systems that is already an abstraction bordering on fiction. Charges leak. Cells refresh. Caches evict. Files are read into buffers, copied across buses, staged, prefetched, reconstructed. Modern computation is full of temporary states pretending to be furniture. The fiber-loop notion merely makes the lie more explicit. Nothing is really resting. Some data just has better transit arrangements than other data.

The comedy lies in the contrast between physical absurdity and conceptual cleanliness. A bundle of fiber on a rack is already a little strange if you insist on viewing it as memory instead of transport. A laser bouncing bits off distant objects is stranger still. Yet both force the same question: why do we insist that storage must be local, static, and embodied in the familiar forms we inherited from previous constraints? If bandwidth grows along one curve and traditional memory economics along another, architectures once dismissed as theatrical may acquire the irritating dignity of feasibility.

I do not expect to see planetary cache hierarchies in production. It seems unlikely that future accelerator vendors will publish comparison charts listing throughput, power draw, and “mean lunar round-trip retention.” Even so, I retain some affection for the general shape of the idea. It reminds me that many supposedly wild thoughts are only respectable ideas with insufficient market conditions.

Carmack’s version, needless to say, lives in the world of actual engineering tradeoffs. Mine lived somewhere between science fiction, schoolboy arrogance, and a poor appreciation of latency. But they do share a family resemblance. Both begin with the suspicion that perhaps memory need not be a cupboard full of bits. Perhaps it can be a path. Perhaps to store information is merely to launch it into a carefully designed journey and make sure it comes back in time to matter.

That may not be how computers are usually explained, but it has a certain elegance. Memory, in this view, is not where data is. It is how long you can keep it from disappearing.



© 2026 gekko