In the year 400, Augustine of Hippo sat down to write about time and immediately ran into trouble. “What then is time?” he asked in Book XI of the Confessions. “If no one asks me, I know. If I wish to explain it to one who asks, I know not.”
Sixteen hundred years later, neuroscientists hit the same wall, only from the opposite direction. They had been studying memory for decades, mapping which brain regions light up when people recall the past, when three independent labs simultaneously discovered something nobody expected: the brain uses the same neural machinery for remembering yesterday and imagining tomorrow.
Science magazine named it one of the top ten breakthroughs of 2007. The discovery didn’t just change how we think about memory. It changed how we think about thinking itself.
The Man Who Lost Tomorrow
On a day in October 1981, Kent Cochrane rode his motorcycle off an exit ramp and into the history of neuroscience. He was thirty years old. He arrived at the hospital in the grip of epileptic seizures, unconscious. Surgeons removed a subdural hematoma from the left side of his brain. He spent a month in intensive care.
When he emerged, something was missing. Not his intelligence: he could tell you the difference between stalactites and stalagmites, explain how to change a car tire, navigate the streets of his Toronto neighborhood. His general knowledge, what psychologists call semantic memory, remained largely intact.
What was gone was everything personal. Every birthday, every conversation, every morning he had ever woken up. Kent Cochrane, designated Patient K.C. in the literature, had lost his episodic memory entirely: the system that stores not just facts but the experience of living through events, the texture of your own life unfolding in time.
The neuroscientist Endel Tulving spent years studying K.C. and made a discovery that would take decades to fully appreciate. When Tulving asked K.C. what he would be doing tomorrow, K.C. could not answer. He described his mental state when thinking about the future as “blank.” The same blankness he felt when trying to think about his personal past.
This was not an inability to reason about the future in abstract terms. K.C. could tell you that winter follows autumn. He knew that people generally eat breakfast in the morning. What he could not do was place himself there. He could not construct a scene in which he, Kent Cochrane, walked into a specific kitchen on a specific morning and reached for a specific coffee cup.
Losing the past had also erased the future.
Post-mortem examination in 2020 confirmed what the behavioral evidence had long suggested: K.C.’s hippocampi were devastated. The left had lost 83% of its volume. The right, 91%.
The Network That Builds Both Directions
The breakthrough of 2007 arrived from three laboratories working independently, each publishing within months of the others.
Karl Szpunar and colleagues at Washington University in St. Louis asked volunteers to remember specific past events and to imagine specific future events while lying inside an fMRI scanner. The overlap was striking: the same regions in the frontopolar cortex, medial temporal lobe, and posterior cingulate lit up in both directions. Remembering and imagining were, at the level of brain activation, nearly indistinguishable.
Daniel Schacter and Donna Rose Addis at Harvard found the same pattern with a twist. The left hippocampus activated for both past and future scene construction. But future events additionally recruited the right hippocampus and right frontopolar cortex. Imagining something new required all the machinery of remembering, plus additional resources for creative assembly.
Demis Hassabis and Eleanor Maguire at University College London took the most direct approach. They tested five patients with bilateral hippocampal damage and asked them to imagine entirely new scenes: lying on a beach, standing in a museum. The patients’ imagined scenes were fragmented, incoherent, lacking spatial structure. They could not build a coherent space in which to place the imagined events. The hippocampus, it turned out, was not just a memory bank. It was a scene constructor. Without it, neither past nor future could be assembled into anything resembling an experience.
These three studies converged on a single conclusion: episodic memory exists not primarily to record the past, but to provide raw material for building the future. Schacter and Addis called this the constructive episodic simulation hypothesis. The brain does not replay memories like recordings. It disassembles past experiences into components (locations, people, objects, emotions) and recombines them into simulations of things that have not yet happened.
This also explains one of memory’s most frustrating features: its unreliability. The same recombinative machinery that allows you to imagine a conversation you haven’t had yet is what occasionally makes you “remember” conversations differently than they actually occurred. Memory errors are not a system flaw. They are the price of admission for the ability to think about the future at all.
The brain network responsible for all of this, the default mode network, was discovered almost by accident. In 2001, Marcus Raichle and colleagues noticed that certain brain regions were more active when people were resting than when performing attention-demanding tasks. The brain, it turned out, does not go quiet when you stop paying attention to the outside world. It starts building: replaying, projecting, simulating, spinning narratives about who you were and who you might become.
The Unreliability That Makes Us Human
If memory were a faithful recording, we would be excellent witnesses and terrible planners. The fact that memory is constructive, that we rebuild past events from fragments rather than replaying stored tapes, is precisely what gives us access to the future.
Frederic Bartlett demonstrated this in 1932 with his “War of the Ghosts” experiment. He asked English students to read an unfamiliar Native American folk tale and then recall it over several weeks. The stories they produced were shorter, more logically coherent, and culturally “corrected.” Elements that didn’t fit their existing frameworks were quietly replaced with more familiar ones. Bartlett concluded that memory is an active process of construction guided by what he called “effort after meaning”: we do not retrieve the past so much as reassemble it to make sense.
Elizabeth Loftus pushed this further. In her 1974 car crash study, the single word used to describe an accident (“smashed” versus “contacted”) significantly changed participants’ estimates of the vehicles’ speed. In 1995, her “Lost in the Mall” study demonstrated that 25% of participants could be led to construct vivid, detailed memories of a childhood event that never happened. A 2023 preregistered replication with five times the original sample size found the rate had risen to 35%.
None of this means memory is useless. It means memory evolved for a different purpose than we assumed. It is not an archive. It is a workshop. And the same workshop that occasionally produces a false memory of being lost in a mall is the one that lets you plan your next vacation, rehearse a difficult conversation, or imagine what your life might look like in five years.
Tulving understood this before most. In 1972, he had proposed the foundational distinction between episodic memory (your personal timeline) and semantic memory (general knowledge). By 1985, he had linked episodic memory to a specific form of consciousness he called autonoetic: self-knowing awareness, the ability to recognize your own experiences as belonging to your past. In 2002, he coined a new term for the temporal dimension: chronesthesia, the conscious awareness of subjective time that allows mental time travel.
K.C. had lost his autonoetic consciousness. He could know facts but could not know himself across time. And without that, both directions of the timeline collapsed into the same blank.
The Jay That Planned Tomorrow’s Breakfast
In 1998, Nicola Clayton and Anthony Dickinson at Cambridge reported something that complicated the “uniquely human” story. Western scrub jays, given the chance to cache both perishable wax worms and non-perishable peanuts, remembered not just what they had hidden and where, but when. If enough time had passed for the worms to have spoiled, the jays skipped those caches and went straight to the peanuts.
This was what-where-when memory, the behavioral criteria for episodic-like recall, demonstrated in a bird.
Clayton’s team was careful with the terminology. They called it “episodic-like” memory, because the subjective experience, the autonoetic consciousness that Tulving considered essential, cannot be verified in a non-human animal. The jay acts as if it remembers. Whether it experiences remembering is another question.
In 2007, the same lab showed that jays could do something even more striking. Birds learned that they would be hungry the next morning in one particular compartment of their enclosure. Given the opportunity to cache food the evening before, they preferentially stored it in that compartment, the one where they would need it. They planned for a future need they were not currently experiencing. A separate study by Correia, Dickinson, and Clayton showed that even when fully sated on one type of food, jays cached that food in locations where it would be unavailable later, overriding their current motivational state in favor of an anticipated future one.
Thomas Suddendorf and Michael Corballis, who had elaborated Tulving’s mental time travel concept into a full evolutionary theory in their 1997 and 2007 papers, argued there was “as yet no convincing evidence for mental time travel in nonhuman animals.” Their position: what the jays demonstrate might be sophisticated associative learning rather than genuine subjective projection into the future.
The debate continues. Jonathon Crystal’s laboratory at Indiana University has produced evidence that rats can represent anticipated future events and use episodic memory to retrieve information about events that were not known to be important at the time of encoding. The question is no longer whether animals show future-oriented behavior. The question is whether the subjective time-traveling experience, the mental movie of yourself-in-the-future, is necessary for the behavior, or whether the behavior can emerge from simpler mechanisms.
It is, in its own way, a question about consciousness. And consciousness, as Augustine knew, is where all questions about time eventually lead.
When the Machine Breaks
If mental time travel is a single system that serves both memory and imagination, then damage to that system should affect both. It does.
Depression does not merely make people sad about the past. It flattens the entire temporal landscape. In 1986, J. Mark G. Williams and Kate Broadbent discovered that patients who had recently attempted suicide, when asked to recall specific personal memories in response to cue words, produced generalized, vague responses instead. Not “the afternoon my daughter took her first steps” but “times I felt happy.” This overgeneralized autobiographical memory extends to the future: depressed patients generate equally vague and generic images of what lies ahead. The mental time machine still runs, but it produces only fog.
Williams later developed the CaR-FA-X model (capture and rumination, functional avoidance, impaired executive control) to explain why: depressive rumination captures attention at a general, abstract level, preventing the retrieval process from drilling down to specific episodes. At the same time, staying general may serve as functional avoidance, a way to sidestep the emotional pain of specific memories. The cost is that the same avoidance mechanism blocks access to specific future scenarios, leaving depressed individuals unable to envision concrete positive futures to work toward.
PTSD is the opposite problem. Where depression produces fog, PTSD produces involuntary, vivid, overwhelmingly specific mental time travel. Flashbacks are not ordinary memories. They lack temporal context. The defining feature is the collapse of the time frame: the traumatic event is not remembered as something that happened in the past. It is experienced as happening now. The brain’s temporal tagging system has failed. The past breaks through into the present with all its sensory intensity intact, and the sufferer cannot place it back where it belongs.
The neuroscience confirms this: PTSD flashbacks show increased activity in the amygdala and insula (emotional and bodily arousal) and decreased activity in the ventromedial prefrontal cortex (the region responsible for contextualizing and controlling emotional responses). The brakes are off. The time machine runs without a steering wheel.
Anxiety represents yet another mode of dysfunction: the machine runs too hot in one direction. Worry is, at its core, future simulation stuck in a loop. The system that evolved to anticipate threats and plan responses becomes fixated on negative scenarios, generating them faster than the conscious mind can evaluate and discard them. Daniel Grupe and Jack Nitschke described this in a 2013 review in Nature Reviews Neuroscience: anxiety involves alterations in how the brain handles uncertainty, amplifying potential threats and undermining the capacity to update predictions based on actual outcomes.
There is a striking parallel here. The same default mode network that enables the adaptive capacity to imagine the future, that made the 2007 breakthrough possible, is the network that runs too loosely in depression (producing fog), too rigidly in PTSD (breaking temporal context), and too anxiously in generalized anxiety (spinning worst-case simulations). Mental time travel is one of our most powerful cognitive tools. Its failure modes are some of our most devastating mental health conditions.
The connection extends to the placebo effect, where the brain’s capacity to simulate expected outcomes appears to generate real physiological changes. Expectation, it turns out, is not passive. It actively shapes the body’s response.
The Oldest Question in the World
Neuroscientists arrived at the overlap between memory and imagination in 2007. Philosophers had been circling the same insight for millennia.
Augustine’s analysis in Confessions Book XI remains astonishing. He argued that speaking of “three times” (past, present, and future) is inaccurate. There are instead three presents: the present of things past (memoria, memory), the present of things present (contuitus, direct attention), and the present of things future (expectatio, expectation). All three exist only in the mind. Time, for Augustine, is a distentio animi (a stretching of the soul): the mind pulled simultaneously toward memory and expectation while trying to hold onto attention.
The Latin word distentio carries connotations of disease, distortion, anxiety. Augustine was not describing a serene contemplation of time’s flow. He was describing a soul torn between what it remembers and what it fears, stretched painfully across the very temporality that defines its existence.
Henri Bergson, writing in 1896, drew a distinction that anticipated Tulving’s episodic/semantic split by seventy-six years. In Matter and Memory, Bergson identified two fundamentally different types of memory: habit memory (automatic, bodily, utilitarian, repeating past actions without recognizing them as past) and pure memory (contemplative, spiritual, representing the past and recognizing it as past). The first maps onto what we now call procedural memory. The second maps onto episodic memory. Bergson further argued that the brain does not store memories like files in a cabinet. It filters them. The past survives in its entirety; the brain’s role is to restrict what we access to what is useful for current action.
The term “specious present” (our felt experience of “now” as a duration with breadth rather than a mathematical point) was not coined by William James. It came from E. Robert Kelly, a Boston cigar manufacturer who wrote a single book of philosophy in 1882 under the pen name E.R. Clay. James adopted the concept in his 1890 Principles of Psychology, describing the specious present as “a saddle-back, with a certain breadth of its own on which we sit perched, and from which we look in two directions into time.” James estimated its duration at roughly two to twelve seconds. Modern psychophysics largely agrees: our experienced “now” spans a window of several seconds.
And twenty years before the fMRI studies confirmed it, D.H. Ingvar, a pioneer of functional neuroimaging, published a paper in 1985 titled “Memory of the Future.” He had discovered that damage to the prefrontal cortex produced what he called a “loss of future,” an inability to plan or organize behavior across time. His finding was largely overlooked until the 2007 convergence vindicated it.
There is a broader pattern here. The Corpus Hermeticum, composed in the second to third centuries CE, presents a hierarchy in Tractate XI: God makes Aeon (Eternity), Aeon makes Cosmos, Cosmos makes Time, and Time makes Becoming. This is not a creation sequence. It is an ontological map: eternity and time are different modes of existence, and the spiritual path involves ascending from the flux of Becoming, through Time, toward the timelessness of Aeon. The Persians had their own version: in Zurvanism, Zurvan (Infinite Time) is the supreme deity from which all other forces, including good and evil, emerge. Time is not merely the stage on which events unfold. It is the generative power from which existence itself proceeds.
The Cultures That Reversed the Arrow
Not every human civilization imagines time the way speakers of English, German, or French do.
The Aymara people of Bolivia, Peru, and Chile organize time in the opposite spatial direction from most of the world’s cultures. In Aymara, nayra means both “eye/front” and “past.” Qhipa means both “back/behind” and “future.” The logic: the past is what you have seen, so it lies in front of you, visible. The future is what you have not yet seen, so it lies behind you, out of sight. In a 2006 study published in Cognitive Science, Rafael Nunez filmed approximately twenty hours of conversation with thirty Aymara adults and confirmed the linguistic evidence with gesture: speakers pointed forward when discussing the past and swept their hands backward when discussing the future. The pattern was strongest among elderly speakers with limited Spanish.
This is not a quirk of language. It challenges what had been assumed to be a cognitive universal: that humans place the future ahead and the past behind.
The Hopi language became the center of an even fiercer debate. Benjamin Lee Whorf, working in the 1930s and 1940s (published posthumously in 1956), claimed that Hopi has “no words, grammatical forms, constructions or expressions that refer directly to what we call ‘time.’” His conclusion was extreme: Hopi speakers literally perceive time differently. In 1983, the German-American linguist Ekkehart Malotki published a nearly 700-page refutation, Hopi Time, documenting hundreds of Hopi temporal expressions. Whorf was wrong that Hopi lacks time concepts. But Malotki discovered that Hopi grammaticalizes tense using a future/non-future distinction (rather than English’s past/non-past), which means the linguistic emphasis falls in a genuinely different place.
The most radical temporal framework may belong to Aboriginal Australians. The Dreaming (Tjukurpa in Western Desert languages, Jukurrpa in Warlpiri) is not, as popular accounts sometimes suggest, a “creation myth” located in the distant past. The anthropologist W.E.H. Stanner, who introduced the English term “The Dreaming” in 1953, was explicit: “One cannot fix The Dreaming in time; it was, and is, everywhen.” The ancestral beings who shaped the land are not historical figures. Their actions remain causally active in the present. The landscape is a living record. Ceremony does not commemorate the Dreaming; it participates in it. The Warlpiri dictionary defines Jukurrpa as “not conceived as being located in an historical past but as an eternal process.”
This is not a primitive failure to understand linear time. It is a different metaphysics, one in which the “origin” is not behind us but is the continuously operating ground of existence. Nagarjuna, the second-century Buddhist philosopher, arrived at a parallel conclusion through pure dialectic. In Chapter 19 of the Mulamadhyamakakarika, he systematically demonstrated that past, present, and future are all sunya, empty of inherent existence. If the present and future depend on the past, he argued, then they would need to already exist in the past, which contradicts their being present and future. Time, like all phenomena, arises through dependent origination and has no independent, self-existing nature.
The Orphic mysteries of ancient Greece offered yet another model: the soul carries knowledge from before birth, and the task of the living is to remember what was known outside of time. Plato formalized this as anamnesis in the Meno and Phaedo, where learning is not acquisition but recollection of eternal Forms. Memory, in this framework, does not point backward along a timeline. It points upward, out of time altogether.
Dissolving the Clock
If the default mode network is what constructs our experience of time, then suppressing it should dissolve that experience. This is exactly what happens.
In 2011, Judson Brewer and colleagues published an fMRI study in PNAS showing that experienced meditators display significantly reduced activity in the main nodes of the default mode network (the medial prefrontal cortex and posterior cingulate cortex) across all meditation types studied. The same network that builds the past-future narrative goes quiet. Meditators consistently report that deep practice involves a loss of temporal experience: the sense of “before” and “after” dissolves, leaving only an undifferentiated present. The Buddhist doctrine of ksanikavada (momentariness), which holds that every phenomenon exists for only a single instant before giving way to the next, describes from the inside what the brain scans show from the outside.
Psychedelics achieve a similar result through a different mechanism. A 2007 study by Marc Wittmann and colleagues found that psilocybin selectively disrupts temporal processing of intervals longer than two to three seconds while leaving shorter intervals intact. The serotonin 5-HT2A receptors, which psilocybin targets, appear to be specifically involved in the processing of longer durations, the timescales at which narrative and sequence operate. Robin Carhart-Harris’s entropic brain hypothesis proposes that psychedelics disintegrate the default mode network’s organized activity, replacing the brain’s normal hierarchical structure with a flatter, more entropic pattern. The subjective correlates: ego dissolution, timelessness, the merging of self and world.
Rick Strassman’s DMT research, published in 2001, documented a recurring theme among participants receiving high-dose intravenous DMT: a sense of complete timelessness, often accompanied by encounters with seemingly autonomous entities in spaces that felt more real than ordinary reality. Aldous Huxley, after taking mescaline in May 1953 under the supervision of psychiatrist Humphry Osmond, noted the same temporal dissolution: “Along with indifference to space, there went an even more complete indifference to time.”
The near-death experience offers perhaps the most dramatic example. The AWARE study, led by Sam Parnia and published in Resuscitation in 2014, examined 2,060 cardiac arrest cases across fifteen hospitals. Of the survivors who could be interviewed, 39% reported some form of awareness during clinical death. The AWARE-II study in 2023 added real-time EEG monitoring and found physiological markers compatible with consciousness even during cardiac arrest. The researchers hypothesized that disinhibition in the dying brain may “open access to new dimensions of reality, including lucid recall of all stored memories from early childhood to death.”
This is, almost word for word, what Bergson proposed in 1896: the past survives in its entirety, and the brain’s job is to filter, not store. Remove the filter, whether through meditation, psychedelics, or the dying process itself, and the archive opens.
The ouroboros, the serpent devouring its own tail, encoded this insight in symbol long before anyone could scan a brain. The earliest known examples appear in the tomb of Tutankhamun, around 1323 BCE. The Chrysopoeia of Cleopatra the Alchemist inscribed within the serpent’s circle the words Hen to Pan: “The All is One.” Time consuming itself. The end feeding the beginning. The distinction between memory and prophecy collapsing into a single ring.
Two Readings
The materialist reading is clean and comprehensive. The default mode network constructs our subjective experience of time by linking memory to imagination. Damage the hippocampus and both collapse. Suppress the DMN with meditation or psychedelics and temporal experience dissolves. Augustine, Bergson, and the Buddhists were describing, in prescientific language, what neuroscience now measures with fMRI scanners. The “specious present” is a product of neural integration windows. The “threefold present” is the DMN cycling between memory, attention, and prospection. The Aymara and the Hopi demonstrate that these neural processes are shaped by culture and language. There is nothing mysterious here, only a brain doing what brains do.
The other reading notices the gaps.
It notices that K.C.’s “blank,” the same experiential quality for past and future, implies that temporal direction is constructed rather than inherent, that the brain assembles time from scratch each moment rather than riding a pre-existing arrow. It notices that the 2007 discovery, that memory and imagination share the same neural substrate, does not explain why they should. Evolution could have built separate systems. It built one system that does both, as if the distinction between what has happened and what might happen is not fundamental but surface-level.
It notices that the life review reported by cardiac arrest survivors (a panoramic, often emotionally organized survey of an entire life, occurring during a period of no measurable heartbeat) does not reduce comfortably to “dying neurons misfiring.” It occurs in a structured, meaningful form, consistently across cultures, with physiological correlates that the AWARE-II researchers described as “compatible with consciousness.”
It notices that the convergence between Augustine’s distentio animi and the default mode network, between Bergson’s filter theory and the disinhibition hypothesis, between Buddhist ksanikavada and DMN suppression, is either a series of lucky guesses by prescientific thinkers, or evidence that careful introspection can reveal the architecture of consciousness from the inside.
And it notices that the ouroboros was drawn 3,300 years ago by people who had never heard of the hippocampus but who understood something about time that took neuroscience until 2007 to confirm: the past and the future are made of the same material.
We present the evidence. The reader thinks.