History: the key to decoding big data

The academic discipline is invaluable in detecting and debunking myths about the past and future, say Jo Guldi and David Armitage. Big data has frequently been used to suggest that we are locked into our history, our path dependent on larger structures that were in place before we arrived. Historians once told arching stories of scale. From Gibbon, Mommsen and Fustel de Coulanges on the rise and fall of empires to Macaulay and Michelet on the making of modern nations and Mumford and Schlesinger on modern cities, historians dealt with long-term visions of the past spanning centuries or even millennia.

Nearly 40 years ago, however, this stopped. From about 1975, many (if not most) historians began conducting their studies on much shorter timescales, usually between five and 50 years. This compression can be illustrated bluntly by the average number of years covered by doctoral dissertations in history in the US. In 1900, the period was about 75 years; by 1975, it was closer to 30. Command of archives, total control of a ballooning historiography and an imperative to reconstruct and analyse in ever-finer detail had become the hallmarks of historical professionalism, and grand narratives became increasingly frowned upon.

Read the full article at the Times Higher Education: http://www.timeshighereducation.co.uk/features/history-the-key-to-decoding-big-data/2016026.article
