Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make

Tag: history (page 1 of 2)

History of technology, key to the future

One of the truisms about futures is that insights can come from all kinds of unusual places and unexpected corners of the world. This morning I ran across an illustration of this principle in blog form: an article about a set of 1931 predictions about 2011, via Abnormal Use: An Unreasonably Dangerous Products Liability Blog. Because of course when you think "the history of futures," the next thing that comes to mind is "products liability blogs that interview people on the latest developments in torts."

But to the predictions:

1931 was a long time ago, and few who live today can claim to remember it all too well…. It was a far different time culturally, socially, politically. The issue: What did the great minds of 1931 predict the rapidly approaching 2011 would be like?

There is actually an answer to that question.

Way back on September 13, 1931, The New York Times, founded in 1851, decided to celebrate its 80th anniversary by asking a few of the day's visionaries about their predictions of 2011 – 80 years in their future. Those assembled were big names for 1931: physician and Mayo Clinic co-founder W. J. Mayo, famed industrialist Henry Ford, anatomist and anthropologist Arthur Keith, physicist and Nobel laureate Arthur Compton, chemist Willis R. Whitney, physicist and Nobel laureate Robert Millikan, physicist and chemist Michael Pupin, and sociologist William F. Ogburn.

The most interesting piece, to my mind, is Ogburn's. Of course he got some stuff wrong, but the broad outlines of his vision were pretty spot on:

Technological progress, with its exponential law of increase, holds the key to the future. Labor displacement will proceed even to automatic factories. The magic of remote control will be commonplace. Humanity’s most versatile servant will be the electron tube. The communication and transportation inventions will smooth out regional differences and level us in some respects to uniformity. But the heterogeneity of material culture will mean specialists and languages that only specialists can understand. The countryside will be transformed by technology and farmers will be more like city folk. There will be fewer farmers, more wooded land with wild life. Personal property in mechanical conveniences will be greatly extended. Some of these will be needed to prop up the weak who will survive.

Inevitable technological progress and abundant natural resources yield a higher standard of living. Poverty will be eliminated and hunger as a driving force of revolution will not be a danger. Inequality of income and problems of social justice will remain. Crises of life will be met by insurance.

Not only are the big trends recognizable, but the specifics are interesting too: yes, there's no mention of the microchip, but it strikes me that "the electron tube" is the functional equivalent in his vision. It's also heartening because Ogburn (here's a pretty good biography) was noted for his work at Columbia on social trends, and argued for the growing importance of technology as a driver of human affairs and the future (obviously). He was elected first president of the Society for the History of Technology, but died before he could take office.

illustration from William F. Ogburn, You and Machines, via flickr

Some of his work was controversial– his 1934 pamphlet You and Machines was banned on the grounds that it was too left-wing– but the rest of his work was more mainstream, and as Rudy Volti argues (in a recent Technology and Culture article available behind the Project MUSE firewall), deals with issues that have been at the center of the history of technology and STS:

Ogburn's seminal work on technology was Social Change with Respect to Cultural and Original Nature… [which] introduces the concept that has been his greatest sociological legacy: cultural lag. As he explains: "The thesis is that the various parts of modern culture are not changing at the same rate, some parts are changing much more rapidly than others; and that since there is a correlation and interdependence of parts, a rapid change in one part of our culture requires readjustments through other changes in the various correlated parts of culture."

A 1950 edition of the book more explicitly lays out his theory of the "role of an advancing material culture in bringing about social change," and breaks it down into four parts:

"Invention" is still given top billing, complemented by "accumulation" (the store of past inventions, which expands at an exponential rate) and the diffusion of inventions from other cultures. The fourth element of social change is "adjustment," the process through which lagging cultural elements catch up with the changes driven by invention, accumulation, and diffusion.

You can see Ogburn's model in his 1931 New York Times piece.

Andy Clark on cognitive prosthetics

Andy Clark argues:

[W]e seem to be entering an age in which cognitive prosthetics (which have always been around in one form or another) are displaying a kind of Cambrian explosion of new and potent forms. As the forms proliferate, and some become more entrenched, we might do well to pause and reflect on their nature and status. At the very least, minds like ours are the products not of neural processing alone but of the complex and iterated interplay between brains, bodies, and the many designer environments in which we increasingly live and work.

Ann Blair suggests, not so fast:

[We assume] that modern technology is creating a problem that our culture and even our brains are ill equipped to handle. We stand on the brink of a future that no one can ever have experienced before.

But is it really so novel? Human history is a long process of accumulating information, especially once writing made it possible to record texts and preserve them beyond the capacity of our memories. And if we look closely, we can find a striking parallel to our own time: what Western Europe experienced in the wake of Gutenberg’s invention of printing in the 15th century, when thousands upon thousands of books began flooding the market, generating millions of copies for sale. The literate classes experienced exactly the kind of overload we feel today — suddenly, there were far more books than any single person could master, and no end in sight.

Four Scenarios for the End of the American Century by 2025

University of Wisconsin history professor Alfred McCoy is blogging about a project he and an international team of scholars have just completed, a series of scenarios on "the end of the American century." This is part of a larger project titled "U.S. Empire Project: Rise & Decline of American Global Power," which seems to be keeping alive Madison's rich tradition of radical scholarship.

It's not clear from the description of the project what kinds of methods they used to craft the four scenarios (or how they were chosen, etc.), but I hope to learn more about the project soon. From McCoy's post:

As a half-dozen European nations have discovered, imperial decline tends to have a remarkably demoralizing impact on a society, regularly bringing at least a generation of economic privation. As the economy cools, political temperatures rise, often sparking serious domestic unrest.

Available economic, educational, and military data indicate that, when it comes to U.S. global power, negative trends will aggregate rapidly by 2020 and are likely to reach a critical mass no later than 2030. The American Century, proclaimed so triumphantly at the start of World War II, will be tattered and fading by 2025, its eighth decade, and could be history by 2030….

Viewed historically, the question is not whether the United States will lose its unchallenged global power, but just how precipitous and wrenching the decline will be. In place of Washington's wishful thinking, let’s use the National Intelligence Council's own futuristic methodology to suggest four realistic scenarios for how, whether with a bang or a whimper, U.S. global power could reach its end in the 2020s (along with four accompanying assessments of just where we are today). The future scenarios include: economic decline, oil shock, military misadventure, and World War III. While these are hardly the only possibilities when it comes to American decline or even collapse, they offer a window into an onrushing future.

Architectural movements and their reinterpretation

Okay, here’s one for all my smart friends.

In my scenarios project, I’m taking an approach in which I treat scenarios not as texts to be read from start to finish, but as a combination– a package or portmanteau– of formal content, tacit knowledge, media of various types (reports, maps, rigorous analytical stuff, more imaginative stories, etc.), and even events or performances (e.g., workshops, client engagements). One of the things I’m interested in is following how scenarios get used in different contexts, and how the constituent parts of scenarios are sometimes carved off from the whole, repurposed and reused.

I think there’s a parallel here to architectural movements and their impacts. Something like neoclassicism or the International Style isn’t a single concept; it’s a whole package of ideas and forms, and while it can be influential worldwide, it’s not influential in the same way everywhere. Sometimes different elements are pulled out and emphasized in different parts of the world: think of how modern architecture in Brazil and Japan has played out, with the former being much more sculptural and sensual. Local materials may blunt the strangeness of a foreign style. Or guiding principles inspire very different kinds of works: Art Nouveau in Vienna and Aberdeen are pretty different creatures.

This is stock in trade in the history of architecture, but I’m a lot more familiar with specific periods in architectural history, or the works of particular architects, than I am with the historiography; so while I can point to lots of examples of this kind of localization and reinterpretation, I don’t know of anyone who’s written about the process in more general terms. Do such articles exist?

[To the tune of Joshua Rifkin: The Bach Ensemble, “Kyrie: Kyrie Eleison #2,” from the album Mass in B Minor (BWV 232) (a 4-star song, imo).]

I will never escape the shadow of Robert Merton

I spent years in graduate school reading Robert Merton's sociology of science: I trucked his collection of essays The Sociology of Science (which I bought in an excellent little overstuffed bookstore on the edge of campus) around with me from Penn to Williams to Stanford to Berkeley to Davis to Chicago and back to California, and had his work assigned in any number of classes.

In all those years, I never realized that he had a second intellectual life as, essentially, a futurist; or more specifically, as a scholar of ideas that we use in everyday thinking about the future. But a few weeks ago I found a 1936 article of his on "The Unanticipated Consequences of Purposive Social Action" while researching a piece on unintended consequences. So it should really come as no surprise, when I idly searched for scholarly works on the concept of "self-fulfilling prophecies," that Merton would have written one of the definitive articles, this time in the Antioch Review in 1948 (it's available on JSTOR).

His generation of social scientists– along with contemporaries like John Kenneth Galbraith, Daniel Bell, Robert Heilbroner, and Donald Michael (now less well-known but still very interesting)– really knew how to ask big questions that connected contemporary issues with futures.

The very long shadow of the history of technology

A confession: when it comes to thinking about the future, I hold two views. On one hand, I find the black swan work of Nassim Taleb– the argument that the speed and complexity of the modern world have left it vulnerable to more, and more unpredictable, crises– pretty convincing. (Call this the New View.)

On the other, I also believe that much of what we claim is novel about this modern age is not so new. Many facets of globalization– the importance of migration, global trade, etc.– are actually as old as civilization. I also believe that other notions, like a belief in greater vulnerability to epidemics and financial panics (or, just as worrying, the belief that we are now immune from such things), are a product of a relatively short-term view of history. You can better understand our current world, and think more clearly about the future, if you stretch your view of the past from the last 50 years to the last 500 or 5,000. (Call this the Long View.)

Obviously, the New View and the Long View are contradictory. I get around that by not thinking about both of them at the same time. But I'm trying to construct a framework that fits them together.

This morning I ran across another data-point in the Long View: a new piece by Diego Comin, Erick Gong, and William Easterly looking at very long-term trends in technology and economic development. As Easterly explains,

We collected crude but informative data on the state of technology in various parts of the world in 1000 BC, 0 AD, and 1500 AD.

1500 AD technology is a particularly powerful predictor of per capita income today. 78 percent of the difference in income today between sub-Saharan Africa and Western Europe is explained by technology differences that already existed in 1500 AD – even BEFORE the slave trade and colonialism.

From the abstract (pdf):

The emphasis of economic development practitioners and researchers is on modern determinants of per capita income such as quality of institutions to support markets, economic policies chosen by governments, human capital components such as education and health, or political factors such as violence and instability.

Could this discussion be missing an important, much more long-run dimension to economic development?… Is it possible that history as old as 1500 AD or older also matters significantly for today’s national economic development? A small body of previous growth literature also considers very long run factors in economic development…. This paper explores these questions both empirically and theoretically. To this end, we assemble a new dataset on the history of technology over 2,500 years of history prior to the era of colonization and extensive European contacts…. We detect signs of technological differences between the predecessors to today’s modern nations as long ago as 1000 BC, and we find that these differences persisted and/or widened to 0 AD and to 1500 AD (which will be the three data points in our dataset, with 1500 AD estimated from a different collection of sources than 1000 BC and 0 AD). The persistence of technological differences from one of these three “ancient history” data points to the next is high, as well as robust to controlling for continent dummies and other geographic factors.

Our principal finding is that the 1500 AD measure is a statistically significant predictor of the pattern of per capita incomes and technology adoption across nations that we observe today.

Of course, one can get into how this is a different set of forces than most futurists are interested in– but to the degree that it serves as a corrective to the tacit view held by some futurists that history doesn't matter at all– a kind of social science version of transhumanism, in which thanks to technology (or migration or whatever) we're able to ignore the past and its gravitational pull– it's worth reading and pondering.

Memory arts and mindware

Ruth Evans takes a historical perspective on Andy Clark's natural-born cyborgs argument, and his claim that "human cognition is not just embodied but embedded: not mind in body, but both mind and body enmeshed in a wider environment of ever-growing complexity that we create and exploit to make ourselves smarter."

From the abstract:

The philosopher and cognitive scientist Andy Clark has argued that humans have always been ‘natural-born cyborgs,’ that is, they have always collaborated and merged with non-biological props and aids in order to find better environments for thinking. These ‘mindware’ upgrades (I borrow the term ‘mindware’ from Clark, 2001) extend beyond the fusions of the organic and technological that posthumanist theory imagines as our future. Moreover, these external aids do not remain external to our minds; they interact with them to effect profound changes in their internal architecture. Medieval artificial memory systems provide evidence for just this kind of cognitive interaction. But because medieval people conceived of their relationship to technology in fundamentally different ways, we need also to attend to larger epistemic frameworks when we analyze historically contingent forms of mindware upgrade. What cultural history adds to our understanding of embedded cognition is not only a recognition of our cyborg past but a historicized understanding of human reality.

This reminds me some of the work of the cognitive anthropology crowd, which I find necessarily speculative but extremely ambitious and interesting.

You could also re-work Paul Saenger's work on word spacing and intellectual history in light of Clark.

H. G. Wells on Professors of Foresight

Via Stories of the Future, I came across this 1932 essay of H. G. Wells, "Wanted: Professors of Foresight":

It seems an odd thing to me that though we have thousands and thousands of professors and hundreds of thousands of students of history working upon the records of the past, there is not a single person anywhere who makes a whole-time job of estimating the future consequences of new inventions and new devices. There is not a single Professor of Foresight in the world. But why shouldn’t there be? All these new things, these new inventions and new powers, come crowding along; every one is fraught with consequences, and yet it is only after something has hit us hard that we set about dealing with it.

Tonight we are confronted with two facts, one bad and one good; the first, which has only been hinted at, that acts of war have become hideously immediate and far reaching; and the second that the whole round world can be brought together into one brotherhood, into one communion, one close-knit freely communicating citizenship, far more easily today, than was possible with even such a little country as England a century ago.

Tell me if that second paragraph couldn't have been written this afternoon. This should make us pause in any claims that ours is a unique period in human history, in which the opportunities for collaboration and mutual understanding are unprecedented. Ten years after Wells published this essay, where was the world? Fully engaged in World War II.

Kids reenact the American Revolution

My wife has been assigning video production projects in her class, but this raises the game:

While at first it looks like nothing more than a mix of two entertainment genres that were best left unconnected– namely, elementary school skits and Web video– it’s actually wonderfully subversive at times. The Charles Beard shout-out (at 1:01) makes the whole thing worthwhile, and the First Thanksgiving is just incredible.

[To the tune of Sting, “History Will Teach Us Nothing,” from the album Nothing Like The Sun (a 2-star song, imo).]

Justin Fox on economists versus historians

Harvard Business Review blogger Justin Fox talks about why economists have become more prominent public intellectuals than historians. He argues that

economists had managed a remarkable balancing act between making the guts of their work totally incomprehensible — and thus forbiddingly impressive — to the outside world while continuing to offer reasonably straightforward conclusions. The basic form of an academic economics paper is a couple of comprehensible paragraphs at the beginning and a couple of comprehensible paragraphs at the end, with a bunch of really-hard-to-follow math or statistical analysis in the middle. An academic history paper, on the other hand, is often an uninterrupted cascade of semi-comprehensible jargon that neither impresses a lay reader nor offers any clear conclusions.

The one economist in the audience had another suggestion. Most economic work was aimed at prediction, and the world is always hungry for predictions. He added that most macroeconomic predictions are worthless (he was a microeconomist), but that doesn't seem to have damped the demand for them.

Once again, it's clear that Philip Tetlock's work explains everything.

But seriously, it seems to me that one of the key features of any form of prediction or forecast– or even something less formal, like scenarios– should be transparency: if you're asking people to base their actions on the expectation that certain futures are more important to prepare for than others, it's imperative that you be able to explain to yourself and others how you came to think those futures were worth taking more seriously. It's never possible to clearly describe every step in your thinking: all knowledge work involves a measure of intuition and tacit knowledge that's acquired over years of practice.

There should be no shame in acknowledging that, and I think we have a lot more to gain than to lose from greater transparency. Obviously it opens your work up to criticism, but also to improvement– and potentially, rapid improvement. It's essential for making users smarter. The most thoughtful clients I've worked with were the ones who best understood what I do. (Knowing how someone else works doesn't make them obsolete: I love to read about tailoring, but that doesn't make me a cutter.) And it's important for any work that people aren't just going to apply, but are expected to adapt and extend– which is exactly what happens with scenarios and forecasts. Good tinkering and hacking depends on an ability to get under the hood, play with the parts, understand why things are put together this way rather than that, and thus see new possibilities and uses for products. Perhaps scenarios and forecasts should be designed not just to be read, but torn apart, recombined, and reused: they should be able to stand being assembled in new ways and used in contexts their authors never imagined.

[via Ezra Klein]


© 2019 Alex Soojung-Kim Pang, Ph.D.
