Good science fiction asks more questions than it can answer on its own, walking a fine line between teasing the reader and frustrating them. It is thought-provoking in the truest sense, an author’s earnest attempt to infect their reader with an inner conversation that continues well after the last page.
Author Ted Chiang does this really well.
I started reading his short story collection Exhalation this week and was blown away by the questions he asks as well as the deviousness with which he dodges complete answers to them. Over and over he sets up unique worlds, hints at several interesting ramifications of those worlds, and finds some logical conclusion that is just substantive enough that no one feels cheated when they're left alone to confront those big ideas themselves. Of all the stories in Exhalation I was most captivated by how well "The Truth of Fact, the Truth of Feeling" does this. Instead of taking an interesting twist or leaving the story open-ended, in this story Chiang offers two parallel storylines, each with half-answers to the same questions.
I want to explore these ideas and use them as a jumping-off point to ramble on about my relationship to the subject of History.
In the first story, a tribal adolescent gains literacy and must confront the costs and benefits of the written word in the context of his oral society. In the second, the protagonist is a reporter who attempts to reconcile his shattered identity after a Black Mirror-esque memory-retrieval gadget exposes a horrific memory about himself that he had subconsciously whitewashed.
In contrasting these stories, Chiang probes the many gray areas of what truth means.
As someone who writes regularly, I found the first storyline to be especially poignant. Rarely if ever do we take the time to ponder how radical writing is. Writing is literally a telepathic transmission of thoughts and ideas from one brain to another across time and space. From this perspective, writing is a cognitive aid to our brains, an analog-cyborg mechanism for storage and retrieval imbued with a kind of intrinsic infallibility. It is a sophisticated transcription of the bioelectrochemical state of our brains into coherent subvocalized thought, then into the pressure waves corresponding to those spoken words, and finally into an abstract system of glyphs representing those sounds.
Writing takes on an even wilder perspective in the context of a journal. Journals can be seen as curated snapshots of one's consciousness at a fixed point in time. The key word being curated. Personally I don't do a great job of writing down the mundane details of my life: what I cooked for dinner, what I read on Wikipedia. Instead I record the things that I think are important. Journaling is the only artificial, infallible memory I have. How does the selectivity of my journaling shape my overall memories? How does shaping my overall memories affect my overall vision of self? What about on a collective level– does an oral history offer more benefits to a society because its fallibility is an advantage?
The main storyline of “The Truth of Fact, the Truth of Feeling” explores this further through the idea of a vast, searchable repository of life footage which renders our organic memory obsolete in favor of an infallible digital memory. It sounds dystopian. Or does it?
The parable of the story would suggest that a fallible memory affords protections on an individual level (repressing an old memory that contradicts the current Self) and causes harms on a collective level (Holocaust deniers, forgetting one's transgressions). Is it also the case that the reverses are true? An infallible memory would stand in the way of cohesive group narratives reflecting the current identity.
My father is a historian, and the running joke is that whenever he's asked to weigh in on current events he'll say, "I don't know; ask me again in ten to fifteen years."
It’s not the most satisfying answer for a curious youth, but the older I get the more I appreciate my father’s familiar reply.
With the world in the throes of a global pandemic, public trust in the US government at home and abroad near historic lows, and widespread social and economic unrest, you might say we are living in historic times. Yet even my old man, a historian with a particular interest in race, politics, and sports in the American South, remains as steadfast as ever against the urge to speculate on the meaning of the present.
But I take comfort in hearing him repeat that old familiar “I don’t know yet.” Even though our current culture is increasingly skeptical of facts and expert academic opinion, it feels good to be reminded that the experts are out there ready to scrutinize key primary sources, piece-together the bigger pictures, and objectively narrate the whole story.
But if you think about it, counting back 25 years from today only puts us in the year 1995. You know, the year Microsoft unveiled Internet Explorer as paid DLC. If your average historian really needs a quarter century to digest their primary sources, they've barely even hit the digital ones yet.
What changes when they do?
Historians, biographers, and other scholars of the past largely structure their research around the collection, categorization, and interpretation of primary sources. There is a certain authority in transcriptions, handwritten letters, newspapers, and other paper trails, but what does one make of the malleability of digital primary sources? I suppose forgeries, lies, and revisionist history have always been a part of the human experience, but there is something to be said for the relative ease with which digital records can be manipulated.
Pictures can be photoshopped, videos can be deep-faked, text message screenshots can be generated, tweets can be removed, and emails can be hacked.
Is history more intimate now that we can read up-to-the-minute tweets and posts, or does that superficiality make it less personal? In other words, has the connected, instantaneous world created a signal-to-noise problem for those reconstructing the larger narratives that surround people and places in the modern world? Is the so-called digital divide between those with and without internet access at home creating a bias in how history is understood, told, and preserved?
Perhaps the more interesting questions lie not in how the famous, important, or powerful are portrayed in the history books, but in how we understand the life of an ordinary person posthumously in the digital age. Previously the letters, notes, and other paperwork of a lifetime formed a factual portrait of that life. If you wanted a better understanding of someone's life, you could root around through familiar stockpiles– census information, meeting notes, newspaper clippings on microfilm, etc. Even though in theory all of that is accessible over the internet, surely it must be much harder to get ahold of given the fragmentation of information storage in the digital age. Sure, an overwhelming amount of detail is floating around about each and every one of us through online purchases, personal data collection, tracking, social media posts, etc. But where is that data? Physically? It must be a nightmare to submit requests on all those different platforms.
And, for that matter, is the person we are while we are tweeting an accurate reflection of our authentic selves? Who among us posts the pictures and thoughts from the drudgery of the 99% of our lives rather than the glamour of our 1%, while we are on fancy vacations, posing in our favorite clothes, or otherwise skillfully culling reality from our digital representations?
I have faith in the historians and biographers of the future to be incredibly savvy in these areas, though obviously the vast majority of us will not be great enough to warrant the scrutiny of a trained researcher. Woe to the prematurely dead millennial who leaves their legacy in an Instagram account their parents can't figure out how to operate.
As a generally quiet and private person I find myself dwelling on this subject quite a bit. What will the world make of me if I bite the dust prematurely, given that I don't have a Twitter? I deleted my Twitter because I didn't think my tweets would age well; they never really reflected an authentic version of myself to begin with. I just picked song lyrics, funny thoughts, and other musings I thought would be inoffensive and popular enough to spread so I would get some more of those sweet, sweet internet points. That wasn't me– it was a scattershot collection of the most popular and palatable facets of me, fit for widespread dispersal. So I deleted it and haven't looked back.
I don't think everyone should delete their Twitter, but it works for me. Personally I want to be remembered locally by the impressions I made on the people around me in my daily life, and more broadly by my writing. To me, deleting my Twitter solved a signal-to-noise problem. Like spending a lifetime avoiding swear words so that friends and family understand you're really serious whenever you use them.
Another point is my reliance on paper– a paper journal and paper books. I have long flirted with the idea of clearing out my library of paper books because they're a pain in the ass to move. I have this fascination with minimalism and disaster preparedness that attracts me to the idea of a single ePaper tablet for books and writings. But the sticking point for me has always been that digital is impermanent and immaterial. Paper records exist and take up space. Moving my books forces me to trim the fat periodically and take notes. The same goes for journaling– I like the authenticity of paper and ink.
Should I die prematurely, I like knowing that some piece of me will live on to influence the lives of others still in the game. Reading to me has always been a kind of cheat code for life. Understand enough of the insights learned from people who took the time to make their own mistakes and you have cheated time. I will not live forever, but if I distill the output of my consciousness after at least 24 years of reading, writing, thinking, and learning, then I will not have lived in vain even if I were to meet a grisly end tomorrow.