Mind’s masochism: a brief history of why we insist on having too much on our minds

Posted in Productivity

According to science, the storage capacity of our brain remains a matter of some debate. Let’s ignore for a moment that the complexity of our brain negates a direct conversion from brains to bytes, and rather focus on how estimates still range from a terabyte (1,000 gigabytes) to the petabyte range (1,000 terabytes) – either of which seems to indicate that the brain offers storage space in abundance.

Why, then, does almost everyone complain about having too much on their minds?

Part of the reason is that, for a very long time, our particular branch on the tree of life has made do with the brain as our sole medium for storing information. Our innate ability to record events through sensory input is perhaps not as unique or supreme as once thought, but it has certainly served us well enough to instill a sense of blind trust in our brains.

In the 21st century, however, our daily intake of information is no longer a trickle; it’s a veritable torrent. Anyone who has ever lost a set of keys knows the following to be true: our brains offer neither the precision nor the level of detail that we desire in matters ranging from the mundane to the ones that… well, matter.

Where the brain – for now – reigns supreme, however, is at the task of combining and processing existing information in order to create something entirely new. And as humans, we harnessed this – our intellect – to invent the concept of external memory.

From brains to bits and bytes

It all began with storytelling, which gave us the first inklings of a collective memory. Alas, storytelling is an inherently degradable form of storing and retrieving information, and serves mostly to illustrate how history becomes legend, legend becomes myth, and how myth finally succumbs to the mists of time.

Some 40,000 years ago, we progressed to cave paintings. A precursor to writing, cave paintings are fragile, breathtaking links to the minds of humans removed from us by tens of millennia, but they hold little potential as tools for augmenting memory. Writing, however, appears to have been invented in excess of 2,500 years ago, heralding the historic age and the first truly reliable means of augmenting human memory.

Whether on clay tablets, papyrus or paper, the concept of writing allowed humankind to do that which it had never done before: record information in a non-degradable way, with as much detail as one could muster. Whether for personal or collective use, the concept of lossless recording prevents having to rediscover ideas, instructions and information in general, allowing us to allocate resources far more efficiently.

Seeing as how writing proved such a boon, it’s hardly surprising that we immediately set upon improving it. A brief triple jump through the history of all things writing takes us from parchment and quill through the printing press and typewriters to the preferred writing tool of the modern age: the computer. Or smartphones and tablets, were you to ask a proper masochist.

Yet if we step back and look at the big picture, writing has only existed for a few thousand years, whereas the aforementioned lump of fatty tissue has been our go-to organ for storing and retrieving information for a few hundred millennia. Is it reasonable, then, to assume that in the course of a few thousand years, we have evolved past the point of blind trust in our brains?


Information, instinct and intellect were sitting in a tree

Ironically, it took the intellect of our brain to realize that its capacity for storing – and retrieving – information reliably had been surpassed. Some 2,500 years ago, I’m dead certain someone sat in front of a clay tablet (I rather like to think of it as the clayPad) and performed the ancient equivalent of a facepalm, having just realized the woeful inadequacy of the brain compared with what had just been invented.

Yet somehow, two and a half millennia later, here we are with too much on our minds. At a cursory glance, it may be tempting to blame the torrent of information into which we submerge ourselves on a daily basis, but the fault lies elsewhere. It’s not with our intellect, either, as I have yet to find someone in the habit of writing things down who would willingly return to mind-only information management.

Rather, it’s a combination of information quantity and instinct plasticity – or rather, the lack of the latter. From infancy, we use our brains for information management, resulting in one of the most intrinsic habits known to mankind: internal memory first, external memory second… if at all.

Reversing this habit is a central tenet in any system that promises increased productivity, as our working memory – RAM, if you will – suffers violently from what has become the normal operating speed of the early 21st century. For some people, the reversal takes mere weeks. For others, it may take months.

But for everyone, it’s worth it. Just imagine yourself at 8 years old, and you’ll know what it feels like.

It feels like you can do anything.