An early modern guide to information overload

We live in an era of abundance and distraction, of shrinking attention spans, ‘content shock’, and of seemingly endless quantity over elusively finite quality. Thanks to the internet, we have never had access to so much information, and yet somehow, we appear to be less collectively well-read, less well-informed and more intellectually febrile, fragmented, and jittery. The statistics are depressing – Americans now touch their smartphones an average of 2,617 times a day. As digital databases expand, attention spans shrink. The average time spent on a webpage is 15 seconds; reading an article, approximately one minute. We’re also witnessing a slump in general reading – Americans now claim to read, on average, 12.6 books a year, two to three fewer than they did between 2001 and 2016 – and this decline is most pronounced among university-age individuals. Part of this is due to the ease and ubiquity of continuous digital distraction, but also, one might contend, due to sheer, overwhelming, awe-inspiring – and ultimately terrifying – abundance.

We all struggle to collect and process the ever-growing mounds of information at our disposal. What should we privilege or prioritise during our limited time on Earth? What should we merely skim, and what should we read in depth? Do I really need to read all of Proust or Dostoevsky to be a truly educated, well-rounded individual, as I repeatedly – and vainly – remind myself at the start of each summer holiday?

Perhaps we can find some measure of solace in recognising that this is hardly a new phenomenon. Writing in a state of cranky isolation from his cottage in mid-19th-century Massachusetts, Henry David Thoreau lamented that he had never got around to reading Plato: ‘His Dialogues, which contain what was immortal in him, lie on the next shelf, and yet I never read him.’ Then comes that all-too-familiar self-flagellation, as the misanthropic New Englander griped: ‘We should be as good as the worthies of antiquity, but partly by first knowing how good they were. We are a race of tit-men [runts], and soar but little higher in our intellectual flights than the column of the daily paper.’

When it comes to history, which periods should we endeavour to truly grasp and why? After all, history keeps on expanding – not only in temporal scope, but also historiographically, through the accumulated sedimentation of different generations of historical or philological analysis, and even geographically, as we become ever more aware of alternative historical traditions outside the bounds of the so-called West. Should an American ruthlessly prioritise the American Revolution over antiquity? Conversely, can one truly understand the history of the American Revolution and the thought processes of the founding fathers without delving into the writings of Sallust, Livy, Plutarch or Cicero? Are the most recent periods of our history always the most relevant? Not necessarily. As the medieval historian and French Resistance hero Marc Bloch once noted, when reaching for a deeper understanding of complex phenomena, linear chronological prioritisations don’t always make the most sense:

‘What would one think of the geophysicist who, satisfied with having computed their remoteness to a fraction of an inch, would then conclude that the influence of the moon upon the earth is far greater than that of the sun? Neither in outer space, nor in time, can the potency of a force be measured by the single dimension of distance.’

Latent anxieties around informational abundance and complexity – which may appear exacerbated by technological advances – are not new, and were particularly prevalent during the so-called ‘information explosion’ of the early modern era. Most importantly, the historian’s endeavour should be viewed not solely in terms of rich empirical content, but also in terms of intellectual processes. Indeed, the process of historical inquiry provides a form of calisthenics for the mind, one which strengthens its capability to process information, boosts its ability to detect shifting patterns amid tangled skeins of concurrent events, and more generally helps it rise above the crushing mass of everyday phenomena. In so doing, it enhances what the ancients would have termed prudence – practical wisdom in the classical Aristotelian sense – what we would now perhaps refer to as intuitive intelligence, or the capacity for sound judgment.

The early modern period was a time of enormous political, intellectual, and diplomatic upheaval. The increased sophistication of the early modern state, its growing centralisation, the heightened intricacy of its bureaucratic apparatus – complete with teetering mounds of paperwork and endless reams of epistolary exchanges – all required chronically overworked rulers to find safe and effective ways to delegate authority, administer newly sprawling domains and implement their increasingly far-reaching reforms. Institutionally, this led to the mushrooming of small governing councils composed of tight cadres of ministers and royal counsellors across Europe, and to the rise of the figure of the secretary – the discreet and dedicated public servant at the heart of the new ‘letterocracy’, whose dry tendrils extended across the chancelleries, ministries, and embassies of the continent. Paperwork had become, in the words of historian Paul M Dover, ‘the demon of early modern statecraft’.

Kings, popes, and doges all found themselves gasping for air under a deluge of memoranda and correspondence. Philip II of Spain was frequently driven to despair by ‘these devils, my papers’, with up to 16,000 separate petitions sent to his desk over the course of a single year. Similarly, in France, Cardinal Richelieu, behind his wintry and steely exterior, was often a neurotic mess, a ‘bundle of nervous energy’ drowning in work and suffering debilitating migraines. In the face of this avalanche of correspondence, new bureaucratic solutions were devised, as were new methods of communication. Some of the more fascinating documents from the period could be described as early versions of shared online documents – manuscripts or memoranda in which the sovereign would write in one margin and the advisor in the other, the two communicating back and forth through these jointly annotated briefs. Counsellors and ministers would sometimes draft short daily memoranda for their rulers, compiling key intelligence findings, point-by-point summaries, and select excerpts of diplomatic dispatches. These were, in essence, early versions of the President’s Daily Brief (PDB), which officially came into being during Lyndon Johnson’s administration in the 1960s.

Within this densely saturated information environment, the role of the newly empowered secretary, ambassador, or counsellor was not only to filter, distil, and interpret incoming torrents of data, but also to provide clear and actionable guidance to overwhelmed rulers. This required, noted commentators at the time, a distinct set of skills: clarity of style and expression, the ability to be detail-oriented without losing sight of the big picture, and the capacity to combine careful reflection with decisiveness.

In 1940 Winston Churchill authored a now well-known memorandum touting the virtues of ‘short crisp paragraphs’ in government communications and calling for an end to bloated, unnecessary verbiage. Many 16th- and 17th-century writers argued along similar lines, and even more forcefully. The Spanish Jesuit Baltasar Gracián was characteristically direct in his popular Pocket Oracle and Art of Prudence (1647), berating his fellow court apparatchiks in the following terms: ‘Don’t be tedious. Brevity flatters and opens more doors: it gains in courtesy what it loses in precision. What’s good, if brief, is twice as good.’

Meanwhile, writers such as the Florentine historian and statesman Francesco Guicciardini warned against getting bogged down in details, cautioning that trying to absorb too much information could exert something of a paralysing effect. In 1530 he wrote: ‘At times, I have seen a man who knows only the general facts of the case judge well, whereas the same person will judge poorly when he has heard the details.’ With the advent of the printing press and the rediscovery of long-lost works from antiquity, historically inclined counsellors and legislators now also faced a daunting new problem – overwhelmingly abundant source material. Some of this cognitive burden was newly off-loaded in the form of florilegia, compendia, and encyclopaedias. It was time, noted Samuel Johnson, for scholars to acknowledge that perfect omniscience was impossible, and that knowledge was ultimately of two kinds: we either knew a subject ourselves, or – more realistically – we knew where we could find information about it.

It was during the early modern period that some of the most recognisable methods of academic practice came into being, such as the footnote – devised primarily as a means of assisting 17th-century bookworms to burrow through increasingly dense layers of scholarship.

Figures including Johnson and Francis Bacon also spoke of the necessity of adopting new, more differentiated reading strategies. Bacon, for example, wrote in 1597: ‘Some books are to be tasted, others to be swallowed, and some few to be chewed and digested; that is, some books are to be read only in parts; others to be read, but not curiously; and some few to be read wholly, and with diligence and attention.’ This metaphor of reading as a form of intellectual digestion, or careful rumination, had become exceedingly commonplace since the late Middle Ages, and was liberally employed by figures ranging from Petrarch and Montaigne to Bacon and William Drake.

The study of history was considered an essential prerequisite for sound statecraft. Not only did it serve a moral function – encouraging legacy-obsessed legislators to act virtuously – it also allowed for the vicarious acquisition of experiences extending far beyond the fleeting span of mortal life. At the same time, writers started to warn of history’s potential – through its sheer bloated mass – to confound, daze, and overwhelm. ‘How much is the sight of a man’s mind distracted by experience and history?’ wondered Bacon. When it came to the study of the rise or fall of great nations, it was perhaps not wise, he said, to ‘look too long upon these turning wheels of vicissitude, lest we become giddy’. To be overly fascinated by history, Philip Sidney argued in his influential Defence of Poesy (1595), was to run the risk of being ‘captivated to the truth of a foolish world’, mired in grubby, unedifying sequences of events rather than engaging in more elevated forms of philosophical or poetical reflection.

To which thinkers could – and did – respond that it was not so much the dull cataloguing of events themselves that mattered, but rather how one chose to make sense of them. Central to the process of historical inquiry is the notion of ‘discernment’, a word that first entered common usage in the late 1500s. What does it mean, precisely? A modern dictionary definition might say ‘keenness of intellectual perception, insight, acuteness of judgment’. More prosaically, one might simply suggest it is the ability to engage in critical thinking. Its etymology provides some clues – it derives from the Latin dis-cernere, meaning to separate, divide, or, more accurately, sift apart. A true student of history, Montaigne wrote in his Essays, should ‘pass everything through a sieve and lodge nothing in his head on mere authority or trust’. Beyond a sometimes-overzealous quest for immediate parallels, history can be almost equally useful in highlighting moments of rupture and discontinuity. For, as Juan Luis Vives rightly observed in On Education (c.1531):

‘Even a knowledge of that which has been changed is useful; whether you recall something of the past to guide you in what would be useful in your own case, or whether you apply something, which formerly was managed in such and such a way, and so adapt the same or a similar method, to your own actions, as the case may fit.’

To sift through history with ease also requires a certain mental agility, if only to carefully skirt the more common pitfalls or pathologies of the discipline. The French historian Emmanuel Le Roy Ladurie mused that all too often his academic colleagues appeared to be ‘either truffle hunters, their noses buried in the details, or parachutists, hanging high in the air and looking for general patterns in the countryside far below them’. Arguably, this issue has only become more acute since Le Roy Ladurie made this observation in the 1980s. The historian-cum-policymaker must avoid stumbling into such artificial, academically imposed binaries, for it is precisely the telescopic quality of historical work – its ability to zoom in and out of the factual weeds while keeping more panoramic societal or geopolitical vistas in sight – that renders it so uniquely valuable to policymakers.

The best historians tend to be the best sifters, sorters, and processors – honing a manner of approaching the world that may, at first glance, appear rooted in relatively unconscious thought processes, but in reality flows from a much deeper intellectual predisposition. As Herbert Butterfield once observed, a profound understanding of the past can often prove salutary, uncorking an almost alchemical process within formerly leaden psyches:

‘It seems true… that many of the errors which spring from a little history are often corrected as people go on to study more and more history… A little history may make people mentally rigid. Only if we go on learning more and more of it… will it correct its own deficiencies gradually and help us reach the required elasticity of mind.’

Malleability, adjustability, and a certain wilfully acquired intellectual limberness – these are the aspects of the cognitive process inherent to historical examination that are most relevant to our era, along with the associated ability to detect what the Greek historian Polybius termed the ‘interweaving of events’: that is, the ability to scry the riotous flow of world affairs in the hope of spotting, amid its eddying whirls, underlying currents of cause and effect.

This intuitive sifting and sorting ability, accreted over years of study, allows one to identify chains of causation more rapidly and fluidly, to draw attention more easily to the interconnections between theatres, actions, and events, and to more seamlessly analyse geopolitical developments horizontally as well as vertically. Neuroscientists might refer to this as the ability to engage in associative processes across multiple planes simultaneously. And, indeed, the brain region most consistently engaged during analogical or relational reasoning is the left frontal lobe – the same part of the brain long postulated to play a key role in creative innovation. It’s perhaps for all of these reasons that the Harvard historian John Clive once openly wondered whether ‘historians, especially those dealing with abstract entities like groups and classes and movements, have to possess a special metaphorical capacity, a plastic or tactile imagination that can detect shapes or configurations where others less gifted see only jumble and confusion’.

Applied historians should, therefore, perhaps be somewhat more optimistic about the prospects of their chosen field. Yes, history departments may be closing, and yes, tenured job prospects may be dwindling, but somewhat paradoxically, in our information-drenched, distraction-filled and data-drowned world, the historian’s hard-earned mental navigational skills may actually prove to be among those most useful for harried, cognitively overwhelmed policymakers.


