How to Preserve Cultural Memory in the Digital Age

June 15, 2016 8:56 PM AEST | Updated July 15, 2016 12:54
Humans are a fortunate species. We are not the strongest or the fastest. We don't have the biggest brains or live the longest. Yet we dominate the planet. From cuneiform to computer chip, our memory technologies give us a unique survival advantage: knowledge.

But that knowledge is not secure in the digital age. We are moving from an information economy of relative scarcity to one of abundance, and we have yet to build an infrastructure that can manage titanic masses of data at scale. The high cost of publishing books and making films once forced us to ask what we could afford to save. Now anyone with an internet connection can write blogs and post home movies to YouTube, and we must decide what we can afford to lose.

Awash in so much data, it is hard to know which have long-term value and which we can ignore. Unfortunately, we must decide to save or lose in real time, because data are ephemeral -- the average webpage lasts about 100 days. Five-thousand-year-old cuneiform tablets can still be read with nothing but the naked eye and a command of ancient Semitic languages. But the data on our smartphones? Only machines write that code, and only machines can read it. Instead of managing knowledge by managing physical objects, we must now master machines, code and power supplies.
How do we do that? By reimagining memory for the digital age. Both findings from the science of memory and lessons from our long history of invention reveal that data storage is not memory. Good memory requires selecting what is important and forgetting the rest.

Each innovation in memory technology is met with fear, skepticism and excitement. Some people praise a new memory technology as disruptive and creative, while others condemn it as disruptive and destructive. Take Socrates, who famously predicted that the invention of writing would lead to ignorance and ultimately to the death of memory itself. Writing "will produce forgetfulness," he warned, and people will "seem to know many things, when they are largely ignorant and hard to get along with." In other words, all they will learn is where to look things up.

In one sense, Socrates got it wrong. It was the societies that adopted writing that progressed. As print natives, we don't even think of books as information technology, let alone as threats to true knowledge. With the invention of visual recording in the 1830s and audio recording in the 1870s, there was an exponential increase in the reach of knowledge and a rise in the autonomy of the individual to choose what to know or not to know. We redefined knowledge as an action verb -- progress -- and governments assumed new responsibilities to ensure access to it for all. Besides, it is only because of writing that we know who Socrates was and what he said.
The Ganjnameh, a cuneiform inscription in Iran, dated to the 6th or 5th century BCE. (DeAgostini/Getty Images)

In another sense, Socrates was right to raise the moral hazards of outsourcing memory. When we post personal photos on Facebook, whose data is that? Do we really control information about ourselves -- content that we create and share on "free" platforms like Gmail, YouTube and the Apple Cloud? In 30 years, the first generation of digital natives will reach maturity, join the workforce and have families. What knowledge from today, let alone from 300 years ago, will be there for them?

Nature endows each creature, from amoeba to zebra, with the ability to learn and remember. Memory is the primary mechanism for adapting to a changing environment. A well-rested brain readily distinguishes what is vital from what is trivial or distracting. It filters for value, relying on our emotions to respond quickly and instinctively ignoring the rest. When we update our mental model of the world, we forget what we no longer need.

We rely on culture -- our collective memory -- to carry most of the vital knowledge that our technologically intensive world requires. In periods of great instability, such as the present moment, we instinctively tap into the strategic reserve of humanity's knowledge to understand the present and anticipate the future. It is no coincidence that during this U.S. presidential campaign, as political pundits and the commentariat express surprise at the rise of Donald Trump, there is a boom in public interest in the early Republic. Learning about Alexander Hamilton to engage in counterfactual thinking about what might have been, or looking back to identify analogous insurgents such as Andrew Jackson or Huey Long -- all testify to the power of the past to help us through periods of uncertainty.
So what do we do now? There are three urgent tasks.

The first is to rescue the past -- the full legacy of human knowledge built up over millennia, now living in libraries, archives, museums and people's homes. Today -- and certainly 20 years from now -- the first place people look for information, and often the last, is online. It is imperative that we make written, audiovisual and cartographic documents as discoverable and available online as possible. Just by scanning, we can gain vast amounts of new knowledge from old sources. For example, thousands of ships' logs recorded at sea over the last three centuries become an incomparably rich database of the oceans' flora and fauna, currents and winds -- the atmospheric conditions from which scientists reconstruct a history of Earth's dynamic systems and improve our projections of future climates.

Second, it is imperative that we collect the digital present. We know the value of some data, such as government records, genome databases, burial sites of nuclear waste and medical records. But the value of the vast cultural output now circulating online may be unknowable for 40 years, until the first generation of digital natives matures and looks back on its own past.
A girl holds a candle during the commemoration of the Tiananmen Square victims at a vigil in Hong Kong on June 4. (Albert Bonsfills/Anadolu Agency/Getty Images)

We do know that we commonly fail to recognize the value of the everyday. We have lost at least 80 percent of all silent films. Few contemporaries saw any long-term cultural value in what was mere entertainment, and recovering the silver from nitrate film was more expedient than trying to preserve such combustible material. Who knows what valuable data current technologies could extract from these films, if only they had survived.

So what are we to make of the millions of tweets and Instagram posts broadcast to followers each week? Or the billions of Google search queries? It is easy to dismiss their potential significance if we think of assessing each one individually. But taken at scale, as a database that we can mine for trends and patterns, the Twitter feed of 2011 yields valuable eyewitness accounts of the Arab Spring, for example. (Twitter has a partnership with the Library of Congress to preserve tweets.) And as Microsoft reported just this week, data scientists can detect evidence of people suffering from cancer before they even know it, based on what they search for.

The third and possibly most important task is to ensure that we do not lapse into a knowledge monoculture. Just as environments benefit from biodiversity, especially during periods of rapid ecological change, so too does a culture. This means more voices must be captured and transmitted into the future, not fewer. Authoritarian regimes can exert old-fashioned mind control by blocking access to the internet and robbing citizens of opportunities to speak beyond their own borders. But in "open societies" that rely on market capitalism, access to information is effectively, if not explicitly, in the hands of commercial behemoths that control the markets for search (Google), social media (Facebook), entertainment (Apple) and consumer goods (Amazon). We must demand greater transparency and accountability from them.
In the meantime, individuals can begin to practice good data management themselves, ensuring local and remote backups of their important data. This does not mean throwing thousands of family photos into a commercial cloud service and hoping for the best. One option is to add to the collective digital memory by uploading sites to the Internet Archive, a nonprofit digital library that preserves large parts of the web.

In this age of faith in technological progress, we are susceptible to temporal chauvinism. We believe that our knowledge -- that we ourselves -- are superior to what came before, mistaking our material well-being for intellectual and spiritual superiority. But the more voices we capture and transmit into the future, the larger our collective memory bank grows, and the larger the mental toolkit vital to our destiny as problem-solvers.
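For readers who want to act on this advice, the Internet Archive's Wayback Machine exposes a public "Save Page Now" endpoint: requesting `https://web.archive.org/save/<url>` asks the Archive to snapshot that page. A minimal Python sketch is below; the endpoint is real, but the function names and User-Agent string are illustrative choices, not an official client:

```python
# Minimal sketch: ask the Internet Archive's Wayback Machine to snapshot a
# page via its public "Save Page Now" endpoint. The endpoint is real; the
# helper names and User-Agent string here are illustrative assumptions.
import urllib.request

SAVE_ENDPOINT = "https://web.archive.org/save/"

def snapshot_request_url(page_url: str) -> str:
    """Build the Save Page Now request URL for page_url."""
    return SAVE_ENDPOINT + page_url

def save_page(page_url: str, timeout: float = 60.0) -> str:
    """Request a snapshot of page_url; return the URL of the archived copy.

    The service redirects the request to the snapshot it creates, so the
    final URL of the response points into the web.archive.org archive.
    """
    req = urllib.request.Request(
        snapshot_request_url(page_url),
        headers={"User-Agent": "personal-archiver/0.1"},  # identify the client
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.geturl()  # final (post-redirect) URL of the snapshot
```

Calling `save_page("https://example.com")` performs a network request and returns a `web.archive.org` URL for the freshly archived copy; this complements, rather than replaces, keeping local and remote backups of your own files.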
Wood engraving of Captain James Cook sighting the Glasshouse Mountains in Australia in 1770. (Andrew Garran, 1886. Photo by The Print Collector/Getty Images)

The library of Alexandria, celebrated today as the knowledge factory of the ancient world, was not destroyed by war. The library died when it came under the jurisdiction of Christian rulers and then the Islamic caliphate. Neither regime had any use for pagan learning, and as each imposed its own knowledge monoculture on the world, Alexandria's library was allowed to disintegrate. Yet without the recovery of classical learning in the Renaissance, there would be no modern world today -- no memory of republican self-rule, no model for the unfettered pursuit of curiosity.

We can develop the skills to manage digital memory and take responsibility for it so that it outlives us. We can leave to our children what was deeded to us: the choice to decide for ourselves what is of value. Whether we do so is now in our hands.