ESSENTIALLY HUMAN. 

Memory, imagination and creativity.

When I was eight years old, I had three ambitions. The first was to be a ballerina – more a matter of wishful thinking than creative passion, given I wanted to be tall and slim, I was keen to wear a tutu, and I longed to be borne aloft by men who resembled fairy-tale princes. That ambition soon disappeared, snuffed out by common sense and a maturing aesthetic. The second ambition was to be a novelist. For as long as I could remember, novels had provided a refuge from the crush and uncertainty of my child’s life – a very necessary refuge, given that childhood and I were not an easy fit. Filled with anxiety that I was getting the child role wrong, and ever alert to any tools that might help me navigate those perilous years, I studied other children in order to learn the ropes. The price of such vigilance was inexhaustible exhaustion. I was watchful during the day, and the nights were spent analysing the previous day’s mistakes and planning protective strategies for the next. The most reliable respite I had from this anxiety was not sleep (with so much to do, I was a poor sleeper) but novels. I would slip into other lives and other places and, relieved of my deficient yet demanding self, I would feel stimulated, invigorated, engaged, even happy – although, like Samuel Beckett, I discovered that happiness was not as lively as I would have thought. Fiction supported me through the seemingly endless years of childhood, and from a young age I was determined to become a novelist.

That I had decided to be a writer of novels could be explained in terms of the pleasure principle, self-preservation and escape. But I couldn’t escape all the time so what I needed was a reliable means of holding life in place. This need shaped my third ambition, namely, to have a perfect memory, the sort of memory that could learn poems by heart, that could reel off a page of numbers briefly glimpsed, that never forgot a person, place, event or story. If I could remember everything that had ever happened to me in the past, I would better manage the troubles that plagued my present, and gain a grip on the uncertain future.

My concept of memory in those long-ago days was akin to memorising. And I did so love memorising – poems, psalms, one year the entire geography book (I disliked geography but memorising the book as I walked around a deserted tennis court had a pleasing, hypnotic effect). In one respect, memorising was similar to reading in that both activities, by capturing my entire attention, removed me from the trials of childhood and my burdensome self. But that was the only similarity. Reading invigorated me, it sparked my imagination; I would travel to times and places not my own, and mix with people not to be found anywhere in suburban Melbourne. In contrast, my childhood memorising filled the time with a vaguely narcotic calm – like doing Lego or jigsaws or painting by numbers.[1]

To be a novelist and to have a perfect memory: two linked ambitions to help steer me through the bewildering world in which I found myself, and through primary and secondary school they sustained me well. It was during my first year at university that I realised my dual ambitions were incompatible. I read The Mind of a Mnemonist, written by the great Russian neuropsychologist A.R. Luria, a book detailing the extraordinary memory of a man called ‘S’. This was followed a couple of months later by Borges’s Labyrinths, most especially the short story ‘Funes the Memorious’. Funes, an ordinary man in most respects, remembered everything; he could neither select nor forget. Beset by details, filled with details, Funes’s imagination was stifled. He was a man incapable of ideas or insight.

‘S’ and Funes demonstrated extraordinary feats of memory: remembering pages of numbers, lists of nonsense syllables, poems in foreign languages, even the foreign language itself. Funes, for example, learned Latin in a week by memorising the Latin of Pliny the Elder as written in Pliny’s Naturalis Historia. But in both cases this sort of memorising blocked the free-ranging imagination so essential to the novelist; it was a memory confined to particulars – a type of memory that excludes reflection and reason, inspiration and illumination.

Novelists require a versatile, athletic imagination, one that can leap and loiter in the uncertainties of an emerging narrative. The novelist’s imagination is an explorer and thrill-seeker. Perfect memory of the sort to which my childhood self had aspired is in service to safety and certainty; it is memory without imagination, it is memory at odds with metaphor, it is a memory as discrete and as immovable as a picket fence.

The memory that fuels human life and endeavour is far more creative than this. If I remember an event, say the break-up of an affair, my recollection is not identical to the actual break-up. It is overlaid with other experiences, other understandings, conversations, desires. Memory relies upon the recruitment of an entire cognitive landscape – experience, study, conversations, other memories. Just as Magritte’s painting of a pipe is not a pipe – Ceci n’est pas une pipe – so too my memory of the break-up is not the break-up, nor is any memory the event it recalls. Memories are created, and what they recall is re-created differently with each recollection.

This notion of memory is fluid and often illuminating. It is also unpredictable: we cannot determine beforehand what knowledge and experience will be pulled into play during recollection, nor the ideas that will emerge from the mix. And it is unreliable, too. Just as a photo is not the whole story – there is, as Barthes wrote, a ‘blind field’ beyond the frame – so too with memory. All memory involves partial forgetting, but what is forgotten is not erased; indeed, it is a common occurrence that, triggered by an experience in the present, a formerly forgotten element rises to consciousness. In addition to being unpredictable and unreliable, memory is not neutral. We select what is to be remembered, and depending on current circumstances, what is selected this year will be different from what will be selected next year or in ten years’ time. Memory is in service to the present far more than to the past event it purports to preserve. Far from being solid and immutable, this memory we trust is both delicate and infinitely corruptible; it is also free-ranging and expansive, linking with a swill of experience, desire, emotion, understanding, doubts, values and attitudes. This notion of memory – fertile and changing, creative and illuminating – is dependent on the imagination and in turn constructs it.[2]

We draw on memory to make sense of ourselves as we trek through daily life. It is memory that fuels imaginings of the future, and memory that helps plan for it. It is memory that feeds the hopes and desires that drive us to seek out this person and abruptly turn from another. It is memory that seeds great art, composes memorable music, writes heart-stopping poetry and life-changing novels. A life without imaginative, creative memory has been described by great neuroscientists like Luria: it is a life confined to an ever-present where nothing much changes and art is indistinguishable from dust. A life without imaginative, creative memory would be, for most of us, unbearable.

And yet it seems we are heading straight into that darkness.

Memory, imagination and creativity require a few simple conditions: uninterrupted time, solitude, no external distractions and the ability to sustain attention and focus.

In the digital age, for many, perhaps most, people, simple factual memory – recall of a name, a book or a film – has been rendered vestigial by web searches. Stop, I say, when someone reaches for their phone because they can’t remember the name of a book, an actor, a film. Stop, I say, give your mind time, give it space, the name will come to you. Mostly the person does not stop, does not wait. What would be the point? they imply, with a nod at their phone. Indeed, we have adapted so quickly to having information at our fingertips that when we want to know something, we want to know it now, whether it be the closest bagel shop (with pictures to show whether the bagels are authentic), or a video clip of the admirable Jonas Kaufmann singing Andrea Chénier (there is one); whatever the nature of the information, we expect to have it immediately.

Skills that once were ubiquitous have been lost. It would be a rare millennial or post-millennial who could read a map, but neither would they regard this as a loss. Why bother, when they have GPS? And simple mathematical calculations – multiplication, division, percentages – would be beyond many people. But again, why bother, when you always have a calculator in reach?

The history of human progress is also the history of lost skills. Few people these days would know how to rub sticks together to produce fire. Few people would know how to build a fire in a grate; quite a few would not know what a grate is. And who would know how to read a sundial these days? But the loss of fire-making, of telling the time by the sun, of reading a map or doing simple maths does not diminish our humanness: it does not fundamentally alter what it is to be human.

Creativity is in a different sphere: creativity is essentially human. In our switched-on, digital world, sustained attention on a single idea, problem or issue has been lost to the dings and clicks of social media, news feeds, email, texts. Turn off notifications and that little screen still flashes. Solitude has been shoved aside by the need for constant connection and approbation. Checking a phone every few minutes is akin to laboratory rats pressing a lever for food, knowing that eventually they will be rewarded. So, too, with social media.

The mobile phone is the metaphorical heartbeat of the modern individual. We sleep with it, run with it, eat with it. It accompanies us to the bathroom; it’s within reach while we have sex. It is a common sight: two people seated together in a café, each occupied with their phone, the only communication occurring when one shares their screen with the other. With Wi-Fi blanketing the earth, even in wilderness landscapes the digital heart beats strongly. There is no repose. There is no time to mull over your own thoughts, to reach into the troughs of forgetting; there is no time for the imagination to flourish.

Multi-tasking has inflamed the situation. Attention spans have not shortened in the digital age, but the nature of attention has changed. Juggling several tasks simultaneously is counter-productive to the singular attention – the obsessiveness – that creative work requires. With solitude and privacy under assault, and sustained attention of the sort the imagination requires fast going the way of the typewriter and the tape cassette, the conditions for creative work are being steadily eroded.

Creative skills are surprisingly frail. As a writer, I know that when I return to my desk after just a few weeks away, the words don’t come so easily, the ideas are sluggish or, worse, non-existent. It takes a few days for the creative skills to muscle up again. The imagination, if neglected, readily becomes dormant; if it is not nurtured, it can actually decay. Muscles that are unused become stiff and painful and thereby draw attention to themselves. Unfortunately, the imagination goes far more quietly.

Running alongside this assault on human creative processes are the vast and increasingly complex advances in generative artificial intelligence and artificial general intelligence. Generative AI is designed to produce new material – text, images, code – from patterns learned in its training data, while artificial general intelligence would be human-level intelligence, able to do everything the human mind can do – and more.

An early display of machine creativity came in a 2016 Go match between Lee Sedol, the world champion at the time, and the computer program AlphaGo. Demis Hassabis, the scientist who produced the early work linking the neurological substrates of imagination and memory, co-founded DeepMind, the company that created AlphaGo.[3]

There were good reasons to choose Go to test the limits of AlphaGo. There had been many AI-versus-human chess tournaments, but Go is far more complex than chess and, for seasoned players, creativity and intuition play a role. Many players speak about the game in quasi-spiritual terms. This spiritual dimension, combined with the role of intuition, underscores Go as a distinctly human-grounded game.

AlphaGo demonstrated a level of self-learning in the match with Lee Sedol that surprised even its creators. One particular move, move 37 in the second game, was described by one Go master as ‘not a human move’ but ‘beautiful, so beautiful’. The move had not been drawn from any stored repertoire of human games; rather, the machine had created it.

Sedol lost the match 4–1 and, not long after, he retired from the professional circuit. Since then, other Go-playing programs have been developed that are even more capable than AlphaGo.

Generative AI is already with us, and artificial general intelligence, with its open-ended creative skills, is fast developing. There are daily warnings of the dangers of AI, including some from the founders of the field.[4] There are worries over AI-written student essays and AI-generated novels, but there are far more serious concerns over the possibility of AI surpassing human intelligence and subjugating us humans.

Whatever the future brings, it has already begun, and running in parallel has been the erosion of the very skills needed to maintain control of our human future. In our eagerness to embrace all the digital world has to offer, we have neglected to take stock of what we have complacently relinquished. Human imaginative work runs the risk of dying out, and with artificial general intelligence, creativity will become the province of the machines that human imagination and ingenuity produced. In short, we are creating the conditions of our own demise. AI isn’t taking over human creative endeavour; we have already ceded the territory.


[1] This childhood memorising is very different from the memorising of my adult years when, consciously and fully alert, I have worked to absorb poems and quotable quotes into the fabric of mind. As George Steiner memorably wrote: ‘To learn by heart is to afford the text or music an in-dwelling clarity and life-force… What is committed to memory and susceptible of recall constitutes the ballast of the self.’ (George Steiner, Real Presences, pp. 9–10).

[2] In recent years, the correlation between memory and the imagination has been given a neuroscientific basis. Demis Hassabis, neuroscientist, all-round genius, and winner of the 2024 Nobel Prize in Chemistry, published a paper in 2007 in which he linked memory and the imagination. He revealed that similar brain activity occurred during tasks employing memory of the past and tasks requiring an imagining of the future, and that both were grounded in the hippocampus. Since then, much further work has been done on the neurological connections between memory and the imagination.

[3] An excellent account of the match is the documentary AlphaGo, directed by Greg Kohs and available on YouTube. Benjamin Labatut, in his brilliant, unorthodox novel The MANIAC, devotes the last chapters to the game of Go and, in particular, this match between Lee Sedol and AlphaGo.

[4] In his book Homo Deus: A Brief History of Tomorrow, Yuval Noah Harari analyses recent advances in AI and what they may mean for the future of humankind. The book was published in 2016; developments (some would prefer the term ‘progress’) since then are cause for far greater concern.

5 thoughts on “ESSENTIALLY HUMAN”

  1. Lisa Hill

    Hello Andrea, thanks for sharing your thoughts about this.

    I’ve had some conversations about what is lost when using ChatGPT to write, but I feel like a voice crying in the wilderness.

    As I write this in this moment, my brain is processing what you have written so that I don’t repeat what you have already said; it is sorting out what I want to write, and it’s anticipating responses to it so that I can forestall any objections. It’s also remembering how to spell and construct sentences and paragraphs. It’s dealing with typos that AI underlines in red for me (it would offer me replacement options to right-click if I needed them, but they are typos, not spelling mistakes, and I’m good at spelling). My busy brain is re-reading what I’ve written to edit it if necessary, but it also knows what I’m going to write and often what I think even before I find the words to express it. And all the time it’s doing all those things, it’s forming and retrieving and re-wiring neuro-connections in specific parts of my brain.

    So what happens to our brains if we stop doing this? I guess we lose all those connections that are never formed or retrieved, and if we use AI often enough we may even lose the ability to do it at all. The arrival of print impacted on oral culture so that over time people lost the ability to relate long epics and stories like Beowulf; perhaps AI will impact on writing in a similar way. I don’t like the idea of young people using AI to cheat on their essays, or the demise of professional writing, but worse than that is the neurological impact on the brain, especially the developing brains of young people. That’s uncharted territory.

    1. Andrea Goldsmith (post author)

      Thank you so much, Lisa, I hear your cry. What you describe are the complex (and quite beautiful) workings of an active mind – workings that, to my understanding, AI cannot do. AI can imitate, regurgitate and rearrange what already exists (and often makes mistakes when doing so) but it cannot reason and create in the way you describe. People like the seriously wonderful James Gleick explain why. (In the New York Review of 24/07/25, there’s a terrific article by Gleick, reviewing what sounds like an excellent book THE AI CON by Emily Bender and Alex Hanna.)

      My concern is that technology CAPTURES us, seduces us and deceives us too, and that we are relinquishing our human capacities without protest – mindlessly. And I agree with you that thinking, reasoning, indeed all cognitive processes, if not used, become sclerotic.

      But, I confess I felt a good deal better after reading the Gleick article, and I hope that positive trend will continue when I read the Bender and Hanna. And yet, and yet I cannot help but worry about the unfettered power of the tech bros and what they are unleashing on a compliant world. On us.

  2. Constantine Danilevsky

    Hi, Andrea. Wonderful essay, and I totally agree with the idea that memory plays a fundamental role in human consciousness, thinking and creativity. But if we are to compare and contrast the human mind with AI, I have the feeling that the view that “AI can imitate, regurgitate and rearrange what already exists … but it cannot reason and create” might be a bit dated. I’d like to draw your attention to an interesting article in the New Yorker (https://www.newyorker.com/magazine/2025/11/10/the-case-that-ai-is-thinking). I won’t regurgitate it here – suffice it to say that it points out surprising similarities between the workings of AI models and certain theories of the human brain.

    1. Andrea Goldsmith (post author)

      Thanks Constantine, I really appreciate your response. I have now read the New Yorker article and have been pondering what you have written, and Lisa Hill too. What interests me most is creativity and the imagination, and that both contribute significantly to memory. This is not problem solving. Indeed, the problem in the New Yorker article which was solved by ChatGPT was no different, in my mind, to the trouble-shooting section at the back of new appliance booklets (now obsolete, given that everything is online). It was a problem with a single solution – no creativity nor imagination required.
      And metaphor. I can’t see AI coming up with ‘the foul rag-and-bone shop of the heart’; not even ‘the bad-smelling secondhand shop of longing/desire’. Metaphor is creative. And Rothko before Rothko: could AI create that?
      Rather than go on here, I have decided to write another short piece focussed on creative memory. It will come shortly, while my brain (or mind?) is buzzing.
      But again, thank you both.

