Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make

Category: History of Science / STS

Riki Kuklick, contemplative computing, and the challenge of real life

A few weeks ago I spoke at a memorial service for one of my thesis advisors, Riki Kuklick. While I was at Penn I also gave a couple of other talks, on postacademic careers and contemplative computing; but all three turned out, one way or another, to touch on Riki and her influence on me.

After I returned home, I noodled around with the talks, and eventually put them together. The result wouldn’t have been appropriate in any of the three venues, but it better reflects what I was struggling to say in separate places on different days.

Introduction

In September 2013 I returned to Philadelphia to speak at a memorial service for one of my favorite professors, Henrika Kuklick. Exactly thirty Septembers earlier, I had stepped into my first classroom with Riki, for her course on the sociology of knowledge. It was the beginning of an association that would shape the next eight years of my life at Penn, and beyond.

Even though my father was a professor, and I was lucky to have some great teachers and role models at Penn, Riki lived the life of the mind in a way that was especially vivid and accessible. It goes without saying that she was as brilliant as the other professors who most deeply influenced me at Penn– her colleagues Rob Kohler and Thomas Hughes; art historian David Brownlee; and strategist and systems thinker Russ Ackoff– but more than any of them, she was a model for aspiring scholars.

Riki took unreserved, transparent pleasure in the craft of scholarship, in writing, teaching, talking shop with students. Her stories of her latest agony writing what she called “the Great American monograph” kept me and other graduate students entertained.

For students trying to become scholars, her willingness to pull back the curtain on academic life was refreshing and reassuring. My decision to work on Victorian science was influenced in no small part by her accounts of living in England and working in the archives there.

The Problem of the Real World

The importance of academic models like Riki for aspiring scholars shouldn’t be underestimated, because academic life is often looked at skeptically by people who see themselves as firmly rooted in the “real world.”

As my years at Penn drew on, some of my old friends and relatives expressed the opinion that all this education was just a way of avoiding going into the real world. The real world was the place where people DID things, made money, got stuff done. The university was fine if it helped you get a job, but otherwise there was little point to it. Well, if the university was NOT the real world, then I wanted no part of it. I wanted to be a professor; the campus would be MY real world.

That didn’t work out: I graduated into a terrible job market, and after finishing my first book and a couple of postdocs I became a consultant. But then I made a surprising discovery: the “real world” was actually a great place to pursue the life of the mind.

Working as a futurist means grappling constantly with epistemological issues around the possibility of predicting the future, your professional credibility, and the standards by which your work should be judged– all familiar themes in the sociology of science. In the mid-1990s, thanks to the growth of the Internet, the rising importance of the service economy, the ferocious pace of technological and global change, and other factors, the boundary between the world of ideas and the “real world” was collapsing. In order to survive in today’s economy, organizations have to think seriously about what they’re doing and why, and have models that explain how the world works and how it’s changing. In their worldly impact, ideas are more real than ever.

One reason I was able to continue my own intellectual life was that I had Riki’s pursuit of it as a model. There was nothing unreal about the life of the mind the way she lived it, or her love of the craft of scholarship. Her own professional life was lived in the ivory tower; she would have regarded the prospect of working with C-suite executives with horror. Despite this, she gave me the means to see the life of the mind as a devotion rather than just a profession, as an internal discipline as well as an academic one.

In a sense, I was also applying to my own life another lesson Riki taught me: that we should question what others believe is inevitable and inescapable, because what appears fixed may in fact be contingent and changeable. The expertise that may seem unassailable, the assumptions that seem self-evident, the truths that claim to be eternal, all may not be as real as they seem– or like a great movie, their greatness may be a blend of hard work, clever staging, and a willing suspension of disbelief.

Seeing that the boundaries between the academic world and “real world” could be more porous than I’d believed helped me create a life that borrowed from both worlds. It let me uproot my own well-cultivated prejudice against corporate life. It freed me to reimagine academic life as something more portable and useful than I’d previously imagined. It let me see that one could make a life that combined the vita activa and vita contemplativa.

Another Real World: IRL

That experience of moving between worlds had a subtle but important resonance in my latest book. While writing The Distraction Addiction, I ran up against the sensibility that Facebook, text messaging, the Web, and the other things that make up the digital world can ONLY be distractions from a well-lived life; that proximate physical interactions are naturally superior to anything we can experience online; and that the best solution to our electronic troubles is simply to turn technologies off. We should get offline in order to spend more time in the real world, where we can have a real life. The simple and apparently innocuous acronym “IRL” turns out to be a kind of intellectual virus. It packs a lot of unexpected information and moral judgment in a very small package.

This claim is one side of an argument that’s into its third decade. In the 1990s and the early days of the World Wide Web, figures like John Perry Barlow and Esther Dyson declared that cyberspace was a new world separate from and superior to the physical world; critics answered that the Internet was a threat to literature, social development, even our memory and cognitive abilities. To me this debate had a ring of familiarity. If the distinction between the academic world and real world doesn’t make a lot of sense, I wondered, could the same be true of the apparently huge gap between digital life and real life?

Merging Worlds

Once I dug deeper, I saw that just as the distance between academic life and real life was overhyped, so too was the distance between digital life and real life. Technologies like smartphones, locative services, and wireless Internet access have erased the functional boundary between bits and atoms, while ecommerce, email, and social media have woven the digital world into our everyday lives.

Even more profoundly, I realized, using technologies is not something that makes us less human, or takes us away from our natural selves. Since the invention of stone tools two million years ago, human bodies have co-evolved with our physical tools, while our minds have co-evolved with our cognitive tools. We are, as philosopher Andy Clark puts it, natural-born cyborgs. At its best, this entanglement of person and technology extends our cognitive and physical abilities, gives us great pleasure, and makes us more human.

The challenge with smartphones and social media, then, is not to learn to give them up, but to learn to use them wisely. We need to practice what I call contemplative computing, developing ways of working and interacting with information technologies that help us be more mindful and focused– and thus better people– rather than be endlessly distracted and frustrated.

By better understanding the nature of attention and distraction, by studying how our interactions with technologies go bad, and by experimenting with new ways of using them, we can resolve the paradoxes these technologies seem to bring into our lives. Using them wisely helps us become wiser about ourselves. Being more mindful about HOW we use technologies helps us be more mindful WHILE using them.

This leads me to argue that we should push back against the moral distinction between academic life or digital life on one hand, and real life on the other. We shouldn’t think in terms of a “real life” versus a “digital life” any more than we should think of our lives in the library or laboratory as unreal.

IRL = In Richer Life

To put it another way, we should redefine what the acronym IRL means. When people talk about “going IRL,” one of the things they’re doing is expressing a desire for self-improvement: turning off the devices, going camping, or spending time with family and friends. The impulse is laudable, but the assumption that it can only happen when you hit the off switch is incorrect.

Instead, we should think of RL as a richer life, one that isn’t driven mainly by distractions, but reflects a serious attempt to create meaning in the world, to do things that matter with our lives, to build and extend our selves. This is an effort in which the thoughtful, judicious, mindful use of technology can play a role– and in which those habits of mind that we think of as “academic” can also be intensely useful. We can build lives that aren’t merely real, but richer, using tools that take form in silicon and electrons, or tools that are encoded in words and ideas.

Practicing contemplative computing requires taking a more critical, ethnographic approach to how we use technology: asking basic questions about why we use technologies, noticing our unconscious habits, examining how we think about these tools, and watching how they affect the way we think about ourselves. All these ideas could have come from one of Riki’s classes, even though they’re applied in an area that seems outside her scholarly interests.

Riki and the Richer Life

But that ability to follow ideas wherever they lead, to pursue diversions until they reveal something unexpected yet connected to your original interests, is just me channeling another of Riki’s habits.

Riki was an astonishing conversationalist– indeed, it was hard to get a word in edgewise. If you didn’t know her, you might listen to her monologues and think she was just free-associating. But if you listened carefully, you discovered that she would start a sentence, interrupt herself and veer off onto another subject, then do it again, and again– and then systematically work her way back, until twenty minutes later she finished that first sentence. That ability to draw together a dozen different subjects in a single conversation, to weave between and weave together different ideas, never failed to amaze her students, and I suspect there’s an echo of it in my writing even today.

But in a sense the questions I’m working on now are not outside her area at all. What Riki showed me, through her work and her life, is that far from being an escape from real life, the life of the mind can serve as a model for how to build richer lives.

Indeed, there’s a parallel between our engagement with books and ideas, and our dual lives in the physical and digital worlds.

The categories of “real world” on one hand, and “digital world” or “academic world” on the other, can be remade, and in the course of remaking them we can make better, richer lives for ourselves. A more thoughtful understanding of our everyday engagements with technology can make our lives better. It’s an attempt to make sense of what it means to be human, to rethink the divide between people and technologies, and to see that the challenge and the opportunity we face is not to learn how to live in real life, but to learn how better to use tools and time to have a richer life.

 

“As we go about our lives using our mysterious technologies, what kinds of people are we enabling them to make up?”

Deanna Day, a grad student at my alma mater, wrote a nice little piece on “Harry Potter, Wizards, and How We Let Technology Create Who We Are.” It gets deep into the weeds of the Harry Potter universe, but it makes a serious point about how magic and technology can shape their users:

Muggles and wizards alike are mystified by the mechanisms of objects like iPads and Sorting Hats, and this ignorance can often, ironically, create a deep sense of trust in these objects. We create stories that explain their behavior, and when our tools work, it cements the validity of those stories. How else to explain their mechanism, be it magical or mechanical? But when we allow our technologies to remain opaque, we also prevent ourselves from seeing the crucial ways they make us who we are….

Most of the piece is about how wands work, who can use them, and their relationship to their users (e.g., the whole “wand chooses the wizard” thing). She concludes:

[T]he stories that wizards tell about their tools don’t match up with how they’re used in practice. The wand chooses the wizard, because that’s what wizards want to believe about their type of magic. In this story, wizards are special, and wands are objective proof. In another example, the Sorting Hat is believed to reveal one’s true identity, until an arguing student reveals that the Hat’s interpretation — and its social consequences — are much more negotiable than its song would imply.

In this (and many other) ways, the wizarding world exists in parallel with the muggle world…. By pointing out some of the ways that the technologies of the wizarding world are constructed — and the kinds of wizards they construct — we might also be better able to see the workings of our own muggle magic. As we go about our lives using our mysterious technologies, what kinds of people are we enabling them to make up?

 

“What is there in this that is unbearable and beyond endurance?”

Jessica Francis Kane, writing in The Atlantic, talks about a Marcus Aurelius quotation that she took to heart:

Book 8, #36

Do not disturb yourself by picturing your life as a whole; do not assemble in your mind the many and varied troubles which have come to you in the past and will come again in the future, but ask yourself with regard to every present difficulty: 'What is there in this that is unbearable and beyond endurance?' You would be ashamed to confess it! And then remind yourself that it is not the future or what has passed that afflicts you, but always the present, and the power of this is much diminished if you take it in isolation and call your mind to task if it thinks that it cannot stand up to it when taken on its own.

She reflects:

I thought about how Marcus Aurelius's concerns and mine differed, but I was inspired by the idea that the spirit of them, separated by so many centuries, was similar. His words helped me get to the desk, and stay there, during all the years it took me to write my first good story. Writing is hard, but is it unbearable? Who would say that it is? Even asking the question, I'm reminded of the one exclamation in the passage: "You would be ashamed to confess it!" His words helped me navigate rejection, which is certainly no fun, but if you ask yourself if it's unbearable, you find yourself preparing the next self-addressed stamped envelope pretty quickly. The words helped me survive the protracted sale of my first novel, and they reminded me to start writing again after a long hiatus after the birth of my first child. I wasn't sure how to make room for writing with a baby. It is difficult, but beyond endurance? I got myself back to the desk.

Personally, I think nothing prepared me for writing as well as studying the Victorians. Not because they invented the world as we know it (in many ways they did), or because their work was awesome (though it was), but because they got so much done. Tomes, multivolume histories, three-decker novels. Theories, scientific discoveries, expeditions, surveys. Buildings, massive urban redesigns, vast public works, and more than a few dark Satanic mills. New ways of seeing the world, of traveling it, and of recording it.

And they still managed to take month-long vacations, or at least have high tea. The older I get, the more impressive that part is– and, I begin to suspect, the more important it is for understanding why they were able to get so much done. It wasn’t just the absence of television or Facebook. My intuition now is that they were productive because they had a better sense of when to quit for the day. They could be more productive because they were more measured.

Granted, I have absolutely no hard evidence for this, and I'm sure it'll be years before I can really chase it down. But their lives were about as well-documented as you can get without FitBit and SenseCam, so I'll bet you really could study their work habits: how much time they spent at work and play, how they saw the differences between the two, and how that balance made them great.

Remembering Riki Kuklick

Yesterday I found out that one of my mentors from college and graduate school, Henrika Kuklick, died.

Riki was one of the professors who got me hooked on the history of science, and along with Rob Kohler helped make me who I am. In the fall of my freshman year I took a seminar with Tom Hughes, mainly because it sounded interesting and he had a Ph.D. from the University of Virginia, and then in the spring I had a class with Hughes and Rob, who would go on to be my undergraduate and graduate advisor. In my sophomore year I took Riki’s sociology of science class, and from then on hardly a semester went by when I wasn’t taking something with her.

Riki was a kind of intellectual performer I’d never encountered before. I never knew anyone who could keep track of so many thoughts: I marveled at how she could start a sentence, divert herself, then go off on something else, but then work her way back and finish the sentence twenty minutes later. She had a kind of unreserved enthusiasm for life and ideas that really resonated with me; my decision to work on Victorian science was influenced in no small part by her description of living in England and working in the archives there. When I was a bit older and had more of a critical sensibility, I found her scholarship to be really outstanding, erudite without being purposely complicated: I taught her “Great Zimbabwe Ruins” article in several of my classes, and it always went over well.

She was also a great person and teacher, always supportive and generous, great at helping you think through arguments. Not the closest reader, though; lots of chapters came back with “Good work” scrawled at the end, and little more. (That’s why you needed Rob Kohler on your committee. That man could line edit a diffraction grating.)

There are lots of people who can hardly remember classes from college, or the professors they had. Riki, in contrast, introduced me to a set of questions about the ways people, ideas, and technologies interact that I’m still dealing with. It’s why I dedicated my first book to her and Rob. And I think I’ll spend the rest of my life working on things that we talked about. Fortunately they’re very big questions.

I find that as I close in on 50, I don’t particularly notice my age: I’ve had some grey hair since I was in graduate school (it’ll do that to you); aside from bifocals, I’m in no worse physical shape than I was then (though that’s not the highest bar ever set); and more important, I’m a better writer and thinker than I’ve ever been in my life. But what I can’t comprehend is other people getting older, too: my parents are in their 70s, which I find weird, and Riki was 70, which to me is inconceivable: my memory of her was fixed in the 1980s.

It’s one of life’s ironies that the gap a person leaves when they’re gone is as large as the impact they made when they were alive. By that standard, Riki’s passing leaves a very large gap indeed.

On Gingrich’s use of historical analogy: Was his failure to get on the VA ballot Pearl Harbor, Pickett’s Charge, or 9/11?

Conor:

Everything must be made into a sweeping analogy. And his instinct for grandiosity is so pronounced that only a small group of recurring subjects are fit for the comparisons that he offers. Would anyone other than Newt Gingrich respond to failing to get on a ballot by asking, “Okay, what’s the historical analogy?” Even the “the” is perfect — as if there is one definitive analogy that fits….

Knowing Gingrich regards Pearl Harbor as “the analogy” for effectively losing Virginia, what comparison will he think appropriate if he loses the entire campaign? The Bataan Death March? Lincoln’s tragic night at the theater? The fall of Rome? Or maybe he’ll win, and God help us if so. Has there ever been a politician more inclined to make history? In a president of the United States, that is a dangerous impulse.

There’s actually a serious point here, as Conor observes. For all his love of history– or his love of himself as an historian– Gingrich’s use of history is remarkably shallow: it really consists of a kind of Mad Libs drawing on the Revolutionary War, the Civil War (how many people has Gingrich challenged to “a series of Lincoln-Douglas debates”?), and the Cold War; comparisons of himself to Washington, Reagan, and Thatcher; and apocalyptic language that would have warmed the hearts of Leo Strauss or Nostradamus.

He truly is a stupid person’s idea of what a smart person is like.

Also, this handy flow chart.

Has anyone written a history of the idea of “real time”?

Has anyone written an article on the term "real time"– where it comes from, how it's been used in the last few decades, and what it means today? In particular, I'm interested in how and when the "realness" of things like computer networks and information flows became more "real" than that of more "natural" everyday human life.

Remembering Theodore Roszak

Theodore Roszak, author of The Making of a Counter Culture and The Longevity Revolution– and another two dozen books, more or less– has died. I interviewed Roszak a few years ago for a project on the future of aging, and loved the passion and energy of his first book, on the counterculture. In effect, in those two books Roszak chronicled the Baby Boomer generation’s great impact on American society and culture. As I explained a while ago,

Roszak spent his career at Cal State Hayward, and retired a few years ago. A decade ago, he went through a medical crisis “that would have killed our parents or grandparents. I came through this realizing that… I might live another twenty or thirty years,” he recalled to me last fall.

His experience revealed two things. First, surviving an event like this is “a profoundly transformative, spiritual experience” that makes you “wonder what you’re going to do with what you now see as a gift.” Second, he wasn’t alone: for more and more people, events like this are becoming a passage into a new, as-yet undefined phase of life.

I really became a fan of his when I came upon a review he wrote of Buckminster Fuller’s Operating Manual for Spaceship Earth– one of the smartest, most critical attacks on Fuller’s worldview I’ve ever read.

My other substantive encounter with Roszak happened a decade ago, around his book From Satori to Silicon Valley, which he was kind enough to let me republish on my Making the Macintosh project. He was quite supportive of my interest in the book (which makes an argument about the link between counterculture and computing later explored by Fred Turner and John Markoff, and more recently applied to theoretical physics by David Kaiser), and gracious in letting me reprint it.

Finally, I recently rediscovered a book he published in 1986, The Cult of Information, which I’d picked up at a book sale and never actually looked at in detail. It turns out to prefigure some of what I talk about in contemplative computing, particularly in its insistence on the difference between human and computer intelligence and memory:

Two distinct elements come together in the computer: the ability to store information in vast amounts, the ability to process that information in obedience to strict logical procedures. Each of these will be taken up in turn in Chapters 5 and 6 and explored for its relationship to thought. There we will see how the cult of information fixes upon one or the other of these elements (sometimes both) and construes its intellectual value. Because the ability to store data somewhat corresponds to what we call memory in human beings, and because the ability to follow logical procedures somewhat corresponds to what we call reasoning in human beings, many members of the cult have concluded that what computers do somewhat corresponds to what we call thinking…. The burden of my argument is to insist that there is a vital distinction between what machines do when they process information and what minds do when they think.

Another book giveaway

In the course of reorganizing my home office (I’m starting serious work on my next book, on contemplative computing), I’ve made another cull of my book collection. Like last year, I’ve got several large boxes of books that are duplicates, books I read long ago, or books I’m honestly never going to read. Some are in pristine condition (alas), while others are annotated. I’m looking to give them away to someone who can use them– preferably a grad student in history of science / STS or some related field, but that’s highly negotiable.

Many of them are really outstanding, but the trajectory of my professional work is such that they’re no longer really relevant for me, and I’ve got more books coming in the door every day. Here’s a picture of many, but not quite all, the books I’m giving away.

Books to give away (full-size version of the photo on Flickr)

The terms of the giveaway are:

  1. The books are free; all you pay is postage (about $25-30 per box).
  2. You take the whole set, which consists largely of books in history of science; contemporary science, science policy, or science and business; and a few in history of art and architecture. I don’t really have time to publish a catalog of all the books or mail them out individually; besides, as a collection it has some measure of integrity or structure that I think is of some small value, and worth preserving.

If you’re interested, contact me at askpang at gmail dot com.

The Titanic as symbol

Edward Tenner has a piece in The Atlantic about the Titanic's continued importance in Ireland as "an icon of the vanished glories of Belfast shipbuilding at its peak." This makes sense: when I was in Cambridge, I saw someone wearing a t-shirt that said,

The Titanic: Built by 15,000 Irishmen. Sunk by one Englishman.

As someone said, the past isn't a different country. It isn't even past.

Robert Darnton on “a font of proverbial nonwisdom”

Robert Darnton challenges “five myths about the information age” that, taken together, “constitute a font of proverbial nonwisdom.”

  1. “The book is dead.” Wrong: More books are produced in print each year than in the previous year.
  2. “We have entered the information age.”… [E]very age is an age of information, each in its own way and according to the media available at the time.
  3. “All information is now available online.” The absurdity of this claim is obvious to anyone who has ever done research in archives.
  4. “Libraries are obsolete.” Everywhere in the country librarians report that they have never had so many patrons.
  5. “The future is digital.” True enough, but misleading.

It used to be said that the difference between God and Robert Darnton was that God was everywhere, while Darnton was everywhere but Princeton. Now that he’s Harvard’s university librarian, I wonder whether the joke has migrated and updated itself.

[To the tune of The Asteroids Galaxy Tour, “The Golden Age,” from the album The Golden Age – EP (a 4-star song, imo).]