Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make


“I believe an interview needs to be prepared ahead of time to sound spontaneous”

From a Paris Review interview with Italo Calvino:

I could try to improvise but I believe an interview needs to be prepared ahead of time to sound spontaneous. Rarely does an interviewer ask questions you did not expect. I have given a lot of interviews and I have concluded that the questions always look alike. I could always give the same answers. But I believe I have to change my answers because with each interview something has changed either inside myself or in the world. An answer that was right the first time may not be right again the second.

This is pretty much exactly my experience of being interviewed. You want to sound fresh and interested in the question, but you also have to negotiate a couple of possibilities: that you're answering a question you've been asked before (on a satellite radio tour, perhaps one you were asked ten minutes earlier); or that you're being asked something you've never really thought about, and so have to come up with something interesting on the fly.

Thinking out loud never sounds that interesting (at least not when I do it). Audiences like it when you've thought of something, not when you're thinking of something.

More than just a calendar

Virginia Heffernan laments the demise of datebooks like the Filofax:

It’s hard to remember, surveying my dull Google version (“parents in town,” “book club”), that a Filofax was also a place for plot arcs, self-invention and self-regulation. It was, in every sense, a diary — a forward-running record, unlike backward-running blogs. The quality of the paper stock, the slot for the pen, the blank but substantial cover, the hints of grand possibilities that came with the inserts — all of these inspired not just introspection but also the joining of history: the mapping of an individual life onto the grand old Gregorian-calendar template….

[N]ow that I’ve shelved my Filofax in favor of a calendar program that seems somehow to flatten existence, I realize that another year is passing without my building up the compact book of a year’s worth of Filofax pages that, every December, I used to wrap in a rubber band and put on a shelf, just as my new refills came in the mail.

If there is one thing we've discovered about print media, especially in the wake of the disappearance of artifacts like the card catalog and the encyclopedia, it is that readers and users didn't treat print merely as an inefficient carrier of information that wanted to be digital (or free, or expensive, as Stewart Brand put it). They developed all kinds of other uses for print that increased its utility, were taken for granted, and tended to be overlooked by engineers. Engineers looked at the Filofax and saw a digital calendar-in-waiting; in Heffernan's hands, by contrast, it was "a place for plot arcs, self-invention and self-regulation. It was, in every sense, a diary — a forward-running record, unlike backward-running blogs."

We don't just act on information or media; we interact with it, and the character of those interactions, as much as the information itself, defines our relationships with media. One reason I still prefer printed books to digital ones is that it's difficult to annotate digital books in a way I find satisfactory: when I'm reviewing a book, or using it in my work, I need to be able to underline, annotate, add Post-its, and make notes, to document my dialogue with or reflections on the book. (This goes far beyond the kind of annotation you can make on ebooks today, and is a world away from leaving comments on blogs or hitting the "Like" button on a Web page.) This kind of reading is more like a martial art than the quiet, interior activity many people imagine when they think about "reading." And while I don't do it with everything (I never got the calendar bug, for example), there are a few activities in which the affordances of print media support practices and interactions that electronic media cannot.

Deleting

About a year ago I wrote about Web 2.0 as a time machine for my generation, and my suspicion that "mine may be the last generation that has the experience of losing touch with friends." This concerned me because

when it comes to shaping identity, the ability to forget can be as important as the ability to remember. It's easy to implore people not to forget who they are; but sometimes, in order to become someone better, you need to forget a little bit.

Likewise,

Forgetting insults and painful events, we all recognize, is a pretty healthy thing for individuals: a well-adjusted person just doesn't feel the same shock over a breakup after ten years (if they can even remember the name of Whoever They Were), nor do they regard a fight from their childhood with anything but clinical detachment. Collectively, societies can also be said to make decisions about what they choose to remember, and how to act toward the past. Sometimes this happens informally, but for practical reasons: think of national decisions to avoid deep reflection on wars or civil strife, in the interests of promoting national unity and moving forward.

The idea that digital and human memory work differently, and that we fail to recognize the difference between the two at our peril, is something I've been writing about for a while. So I was very interested to see a review by Henry Farrell in Times Higher Education of Viktor Mayer-Schoenberger's new book Delete: The Virtue of Forgetting in the Digital Age. It sounds like a book I need to read… or at least footnote!

At its heart, his case against digital memory is humanist. He worries not only that it will change the way we organise society, but that it will damage our identities. Identity and memory interact in complicated ways. Our ability to forget may be as important to our social relationships as our ability to remember. To forgive may be to forget; when we forgive someone for serious transgressions, we in effect forget how angry we once were at them.

Delete argues that digital memory has the capacity both to trap us in the past and to damage our trust in our own memories. When I read an old email describing how angry I once was at someone, I am likely to find myself becoming angry again, even if I have since forgiven the person. I may trust digital records over my own memory, even when these records are partial or positively misleading. Forgetting, in contrast, serves not only as a valuable social lubricant, but also as a bulwark of good judgment, allowing us to give appropriate weight to past events that are important, and to discard things that are not. Digital memory – which traps us in the past – may weaken our ability to judge by distorting what we remember.

[To the tune of Sukhwinder Singh, "Marjaani Marjaani," from the album Saavn Celebrates Bollywood (I give it 3 stars).]

Project Prezi

A while ago I created a Prezi for an end of cyberspace talk. Prezi has a cool feature that lets you create a path or trail through a presentation (very Vannevar Bush).

I've realized that because of the trail feature, this presentation isn't just a single talk, or at least it doesn't need to be. Rather, I can use it as a kind of online studio for displaying everything I talk about in this project, and just create different paths through the Prezi for different talks.

I think I'm going to make this a persistent post, so it always stays on the front page. If you want to go directly to the Prezi, here it is.

This is your brain on multimedia

Ed Yong at Not Exactly Rocket Science covers a new study on media multitasking and its impact on cognitive control:

You might think that this influx of media would make the heaviest of users better at processing competing streams of information. But Eyal Ophir from Stanford University thinks otherwise. From a group of 262 students, Ophir identified two sets of 'light' and 'heavy' multimedia multi-taskers from the extreme ends of the group. The heavy users were more likely to spend more time reading, watching TV, surfing the web, sending emails or text messages, listening to music and more, and more likely to do these at the same time.

The heavy group also fared worse at tasks designed to test their ability to filter out irrelevant information or, surprisingly, to switch from one task to another. In short, they show poorer "cognitive control", a loosely grouped set of abilities that include allocating attention and blocking out irrelevancy in the face of larger goals. They're more easily distracted by their many incoming streams of data, or less good at shining the spotlight of their attention on a single goal, even though they are similar to the light group in terms of general intelligence, performance on creativity tests, basic personality types, and proportion of women to men….

The key question here is whether heavy multimedia use is actually degrading the ability to focus, or whether people who are already easily distracted are more likely to drown themselves in media. "This is really the next big question," says Ophir. "Our study makes no causal claims; we have simply shown that media multitaskers are more distractable." The next step is to follow a group of people with different media habits over time to see how their mental abilities shift, and that's something that Ophir is working to set up.

Nonetheless, as ever-larger computer screens support more applications (Google Wave, anyone?), and social norms shift towards more immediate responses, it seems that multitasking is here to stay and perhaps merely in its infancy. It's important to understand if these technologies will shift our portfolio of mental skills, or equally if people who are naturally easy to distract will gravitate towards this new media environment, and encounter difficulties because of it.

Paul Miller on government and cyberspace

Paul Miller has an interesting piece in The Independent on government and cyberspace:

The relationship between government and the internet has always been tense. “Governments of the Industrial World, you weary giants of flesh and steel”, typed John Perry Barlow in 1996, “your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter and there is no matter here.”

His Declaration of the Independence of Cyberspace spread quickly among the libertarian digerati of the time. For those who craved a space over which governments could have no influence, it was an appealing idea. They also believed that the internet age would herald an era when decentralised technology could do away with the need for government at all.

John Perry Barlow and his friends, of course, were wrong. The internet hasn’t swept away government, neither has the internet completely escaped government intervention… But we’re still just at the beginning of understanding the relationship between government and the ways that the internet can help deliver public goods – sometimes through government itself and sometimes through new lightweight public service start-ups. As we attempt to understand what might be possible, we need to replace Barlow’s black or white ‘cyberspace versus government’ with a new understanding of the way that online tools could help us to live the lives we want to lead….

What Meetup and the hundreds of other online businesses that facilitate real world activity show is that the real power of the net in the future won’t be about information or content – although that’s what we use it for mainly these days – its real power is organisation away from the computer itself. The most successful services will be those with a ‘Why Don’t You’ ethic, which encourages us away from the screen and to be active participants in the world outside….

There was a time when digital technologies were about a new space, detached from the physical. The digerati took William Gibson’s word ‘cyberspace’ and made it their own. This was a place where the pioneers would be safe from governments or corporations or anybody impinging upon their freedom. It didn’t quite turn out like that. Actually, there’s no such thing as cyberspace. Cyberspace is dead. But I don’t think we should mourn it because what we should be working on is much more exciting. What we’ve realised is that the power of the internet is in changing the real world.

Those last three sentences summarize about half of my book.

Read it in the line for tickets to Maker Faire

My latest article, on tinkering and the future, has been published in the new issue of Vodafone's Receiver Magazine. The piece is an effort to draw together a couple of my research and personal interests (though the boundaries between those two categories are pretty blurry), and to see the tinkering / DIY movement as one piece in an emerging strategy for creating better futures.

Almost forty years ago, the Whole Earth Catalog published its last issue. For the American counterculture, it was like the closing of a really great café: the Catalog had brought together the voices of contributors, readers and editors, all unified by a kind of tech-savvy, hands-on, thoughtful optimism. Don't reject technology, the Catalog urged: make it your own. Don't drop out of the world: change it, using the tools we and your fellow readers have found. Some technologies were environmentally destructive or made you stupid, others were empowering and trod softly on the earth; together we could learn which were which.

Millions found the Catalog's message inspirational. In promoting an attitude toward technology that emphasized experimentation, re-use and re-invention, seeing the deeper consequences of your choices, appreciating the power of learning to do it yourself and sharing your ideas, the Whole Earth Catalog helped create the modern tinkering movement. Today, tinkering is growing in importance as a social movement, as a way of relating to technology and as a source of innovation. Tinkering is about seizing the moment: it is about ad-hoc learning, getting things done, innovation and novelty, all in a highly social, networked environment.

What is interesting is that at its best, tinkering has an almost Zen-like sense of the present: its 'now' is timeless. It is neither heedless of past and future, nor in headlong pursuit of immediate gratification. Tinkering offers a way of engaging with today's needs while also keeping an eye on the future consequences of our choices. And the same technological and social trends that have made tinkering appealing seem poised to make it even more pervasive and powerful in the future. Today we tinker with things; tomorrow, we will tinker with the world.

OFF=ON

From the Trendwatching report "OFF=ON":

[T]he near-total triumph of the ‘online revolution’ (1.4 billion people online, anyone?), which now has the ‘offline world’ more often than not playing second fiddle in everything from commerce to entertainment to communications to politics.

In fact, ‘offline’ is now so intertwined with ‘online’ that a whole slew of new products and services and campaigns are just waiting to be dreamed up by … well… you? Our definition:

OFF=ON | More and more, the offline world (a.k.a. the real world, meatspace or atom-arena) is adjusting to and mirroring the increasingly dominant online world, from tone of voice to product development to business processes to customer relationships. Get ready to truly cater to an ONLINE OXYGEN generation even if you’re in ancient sectors like automotive or fast moving consumer goods.

For this briefing, we chose to focus on hands-on innovation. Which to us means coming up with new goods, services and experiences. And as this is about current OFF=ON developments, we’re excluding researched-to-death topics like straightforward ecommerce or cross-media strategies….

[C]yberspace as we know it (read: a wondrous world of control and make-believe restricted to desktops at home or in poorly-lit offices, and laptops that don’t venture too far from spotty hotspots) is about to vanish, and will be replaced by something that is everywhere, enabling consumers if not enticing them to actually venture out into the—you guessed it—real world.

More on Facebook portraits

Yet another interesting reflection on Facebook portraits, this time by Columbia architecture professor Kazys Varnelis (whose recent book I mentioned in a previous post):

The Facebook self-portrait makes everyone a superstar, famous for no particular reason, but notable for their embrace of fame. So it is that on Facebook, I see friends who I never thought of as self-conscious take photographs of remarkable humor, intelligence, and wry self-deprecation. The Facebook self-portrait insists upon mastery over one's self-image and the instant feedback of digital photography allows us this. Not happy? Well, try again.

Long ago, when I was in high school, I read a book on the Bloomsbury group. I remember that the caption underneath a group photograph in the book (whose title now escapes me) pointed out that even in this über-hip clique, only one member was relaxed, only one understood that the right pose for the camera was a calculated non-pose. Our idea of the self can be read through such images: from the stiff formality of the painted portrait to the relaxed pose of the photograph to the calculated self-consciousness of the Facebook digital image. Each time, the self becomes a more cunning manipulator of the media. Each time, the self becomes more conscious of being defined outside itself, in a flow of impulses rather than a notion of inner essence.

So it was that in reading the first article, I felt that the author missed his friend Caroline's point when she told him "You can never be too cool for your past." As your images catch up to you in network culture, you have to become the consummate manipulator of your image, imagery from the past being less an indictment of present flaws and more an indicator of your ability to remake yourself.

Technology and solitude

Sitting in the quiet living room in the pre-dawn hours, I came across William Deresiewicz's essay on technology, sociability, and solitude in the Chronicle Review. For those who have access to it, it's well worth reading.

One book that influenced me when I was younger was Anthony Storr's Solitude. I didn't actually read that much of it, and I doubt I understood it very well, but the idea that solitude was worthwhile and rewarding, and nothing to be afraid of, was a novel concept for me. Deresiewicz argues that his students, who've grown up with MySpace and text messaging (among other things), have lost most opportunities to learn and benefit from being alone.

If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.

So we live exclusively in relation to others, and what disappears from our lives is solitude. Technology is taking away our privacy and our concentration, but it is also taking away our ability to be alone. Though I shouldn't say taking away. We are doing this to ourselves; we are discarding these riches as fast as we can. I was told by one of her older relatives that a teenager I know had sent 3,000 text messages one recent month. That's 100 a day, or about one every 10 waking minutes, morning, noon, and night, weekdays and weekends, class time, lunch time, homework time, and toothbrushing time. So on average, she's never alone for more than 10 minutes at once. Which means, she's never alone.

I once asked my students about the place that solitude has in their lives. One of them admitted that she finds the prospect of being alone so unsettling that she'll sit with a friend even when she has a paper to write. Another said, why would anyone want to be alone?

To that remarkable question, history offers a number of answers. Man may be a social animal, but solitude has traditionally been a societal value. In particular, the act of being alone has been understood as an essential dimension of religious experience…. For the still, small voice speaks only in silence.

One thing that jumped out at me was Deresiewicz's contrast between the physical solitude that used to characterize being online and the situation today. "Connecting" online used to be a physically isolating experience, done at desks, in front of desktops. Today, though, you don't have to be alone to go online: just as cellphones and mobile Web technologies make it less likely that you'll ever be offline, and lower the bar for jumping onto the Web, they make it less likely that you'll be fruitfully alone.

But as the Internet's dimensionality has grown, it has quickly become too much of a good thing. Ten years ago we were writing e-mail messages on desktop computers and transmitting them over dial-up connections. Now we are sending text messages on our cellphones, posting pictures on our Facebook pages, and following complete strangers on Twitter. A constant stream of mediated contact, virtual, notional, or simulated, keeps us wired in to the electronic hive…. Not long ago, it was easy to feel lonely. Now, it is impossible to be alone.

Of course, we all know plenty of people who manage to feel lonely even today, and it's possible to resist the pull of technology: there are people who rebel against constant connectivity on the grounds that it's too intrusive and distracting. But still, I think Deresiewicz points to a bigger trend that most of us will recognize.

A rich essay. Worth reading.
