Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make

Tag: attention (page 1 of 2)

Nicholas Carr, The Shallows

Spent the day (and most of the evening) reading Nicholas Carr’s new book, The Shallows. The money graf:

“[I]f, knowing what we know today about the brain’s plasticity, you were to set out to invent a medium that would rewire our mental circuits as quickly and thoroughly as possible, you would probably end up designing something that looks and works a lot like the Internet. It’s not just that we tend to use the Net regularly, even obsessively. It’s that the Net delivers precisely the kind of sensory and cognitive stimuli– repetitive, intensive, interactive, addictive– that have been shown to result in strong and rapid alterations in brain circuits and functions.” It’s “the single most powerful mind-altering technology… since the book.” (116)

I’ll have a more detailed analysis of the book on the Contemplative Computing blog tomorrow. I’m trying to make my way through as much of what you might think of as the digital Cassandra literature as I can stand.

PowerPoint doesn’t make you stupid, and LOLcats doesn’t rewire your brain

Via Duke professor Cathy Davidson, I just came across this L. A. Times piece by Christopher Chabris and Daniel Simons (they're the authors of The Invisible Gorilla). The essay takes aim at "digital alarmism," the argument that the Internet is making us stupider by "trap[ping] us in a shallow culture of constant interruption as we frenetically tweet, text and e-mail," both leaving us less time to read Proust and rewiring our brains so we're incapable of paying serious attention to… anything.

More at Contemplative Computing.

Calendars, concentration, and creativity

Via Lifehacker, a nice little essay on “the chokehold of calendars,” and how we’ve accidentally (or thoughtlessly) designed them to kill our productivity and concentration:

The idea of a calendar as a public fire hydrant for colleagues to mark is ludicrous. The time displayed on your calendar belongs to you, not to them….

The problem with calendars is that they are additive rather than subtractive. They approach your time as something to add to rather than subtract from. Adding a meeting is innocuous. You’re acting on a calendar. A calendar isn’t a person. It isn’t even a thing. It’s an abstraction. But subtracting an hour from the life of another human being isn’t to be taken lightly. It’s almost violent. It’s certainly invasive. Shared calendars are vessels you fill by taking things away from other people.

“I’m adding a meeting” should really be “I’m subtracting an hour from your life.”

Amen to that….

Dan Ariely on the paradox of productivity tools

Dan Ariely has a good post about why our current “productivity tools” generate time-wasting or addictive behavior: he looks to B. F. Skinner’s work on “schedules of reinforcement” that found that random rewards inspired more work than predictable rewards. (It got more work out of rats, anyway. Come to think of it, it also works for graduate students.)

Ariely comments that Skinner’s work

gives me a better understanding of my own e-mail addiction, and more important, it might suggest a few means of escape from this Skinner box and its variable schedule of reinforcement. One helpful approach I’ve discovered is to turn off the automatic e-mail-checking feature. This action doesn’t eliminate my checking email too often, but it reduces the frequency with which my computer notifies me that I have new e-mail waiting (some of it, I would think to myself, must be interesting, urgent, or relevant). Another way I am trying to wean myself from continuously checking email (a tendency that only got worse for me when I got an iPhone), is by only checking email during specific blocks of time. If we understand the hold that a random schedule of reinforcement has on our email behavior, maybe, just maybe we can outsmart our own nature.

There’s also this observation about Skinner’s own work habits:

Skinner had a trick to counterbalance daily distractions: As soon as he arrived at his office, he would write 800 words on whatever research project he happened to be working on—and he did this before doing anything else. Granted, 800 words is not a lot in the scheme of things, but if you think about writing 800 words each day you would realize how this small output can add up over time.

This is something I try to do, but I need to be more disciplined about it. There aren’t THAT many e-mails waiting for me in the morning that require my immediate attention, and I suspect that I’m actually more likely to lose track of tasks or not reply to a message if I read it, think to myself “I’ll deal with this later,” then set it aside. For me, the in-box is not nearly as effective a place to stack tasks as, say, a physical pile (or even better, a written list in my little Moleskine notebook).

[To the tune of The Fixx, “Secret Separation,” from the album The Best of the Fixx (a 3-star song, imo).]

The five most endangered words of the realtime internet era are: “Let me think about that”

Tweetage Wasteland makes a good point:

The five most endangered words of the realtime internet era are:

Let me think about that.

Shirley Sherrod, the former rural development director for the Agriculture Department in Georgia, found that out the hard way when she was fired by the Obama administration for her delivery of a supposedly racist speech. The speech was creatively excerpted, political bloggers and cable news commentators blew up the story, it entered the Twitterverse, and boom, Sherrod was asked to resign from her position.

Unfortunately, no one seemed to have time to listen to the whole speech. Once they did, Sherrod was showered with apologies and found herself taking calls from the President.

This story is less about politics and more about pace. It provides a clear example of how our Facebook and Twitter behaviors are bleeding over into the rest of our lives…. When confronted with the realtime web’s constant flow of incoming information, who has time for a full set of facts? We each take a few seconds to consider a one hundred forty character blurb and then hammer out our reactions by way of a Tweet or status update.

Laptops, classrooms, and discussion

An article in the Washington Post (via the Volokh Conspiracy) on the mixed value of laptops in the classroom:

A generation ago, academia embraced the laptop as the most welcome classroom innovation since the ballpoint pen. But during the past decade, it has evolved into a powerful distraction. Wireless Internet connections tempt students away from note-typing to e-mail, blogs, YouTube videos, sports scores, even online gaming — all the diversions of a home computer beamed into the classroom to compete with the professor for the student's attention.

This isn't just confined to colleges and graduate schools (law schools figure prominently in the article): I encounter a similar issue in workshops that I run. Especially here in the Valley, within ten minutes at least one person in a group of fifteen is going to have their Blackberry in their lap, checking their messages. It's so common I no longer take it personally, and I find it doesn't really work very well to ask people to turn things off, or remind them that they should be paying attention. People know they should be paying attention. They haven't forgotten.

Instead, I take it as a challenge to be more creative and engaging. And I'm not the only one:

José A. Bowen, dean of the Meadows School of the Arts at Southern Methodist University, is removing computers from lecture halls and urging his colleagues to "teach naked" — without machines. Bowen says class time should be used for engaging discussion, something that reliance on technology discourages.

I think this is good advice. I prefer not to use PowerPoint in talks or lectures, because I find that I spend more time interacting with the technology than actually talking to students. But more fundamentally, Bowen's advice cuts against what you might call the information delivery model of teaching– the idea that the point of being in the classroom is to work through a more-or-less formal set of exercises to master a body of information. Everyone has better things to do in the classroom: there are more intensive and social kinds of learning that you can practice when you're with other people, and that you can't when you're alone or online.

Hands-up displays

John Murrell on augmented reality:

As we know from extensive science fiction research, one day we will be equipped with unobtrusive and tastefully designed technology that will project before our eyes a heads-up display of information related to whatever real-life scene we're looking at. That level of augmented reality, however, is a ways down the road, and unfortunately that road is likely to be strewn with the broken bodies of early adopters.

Thanks to the growth in smartphones equipped with large screens, cameras, compasses and GPS, location- and marker-based augmented reality (AR) is in the early stages of a hype cycle. Companies like Layar are building browser apps that look where you're looking and pull in layers of data from reference sources and social media. Startup Wikitude on Wednesday launched a new update of its software for Android handsets that integrates social tagging of physical locations, and an iPhone version is on the way. Apple's App Store recently got its first AR offering when an app called Metro Paris Subway added a feature that superimposes labels for station locations and points of interest over the view through your iPhone.

At this early stage in AR evolution, however, the displays are not heads-up, but hands-up, and that means we will be seeing a new class of situational zombies roaming our streets. We’ve already grown used to dodging around the people with heads bowed over their phones in the texting prayer position and the distracted pedestrians engrossed in conversation with their invisible companions over their Bluetooth headsets. Soon we'll be seeing more folks shuffling around with their smartphone screen held up in their line of vision, absorbed in their augmented reality data, and we'll be faced with a dilemma: keep a watchful eye on these people and tackle them before they wander into traffic or fall into a manhole, or just allow the Darwinian process to cull the herd.

While part of the point of some augmented reality research is to avoid exactly that kind of zombie state, by creating technologies that layer information on top of views (or display them on things, or what have you), I suspect Murrell is onto something. I got my iPhone a few months ago (http://www.askpang.com/2008/11/construction-in.html), and quickly found that I couldn't check my e-mail and walk at the same time: listening to music is no problem, and I can even usually stay alert while listening to The Bugle; but e-mail was different. Not because the interface is so incredibly compelling, but because I was so accustomed to tuning out the rest of the world when I checked my mail: my brain had trained itself to go into a kind of tunnel-vision mode, which meant I couldn't trust my body to avoid potholes or streetlights while my phone downloaded messages.

[To the tune of Jean Sibelius, "Finlandia, Op. 26," from the album Finlandia/Tone Poems (I give it 3 stars).]

Facebook, Twitter Revolutionizing How Parents Stalk Their College-Aged Kids

[To the tune of The Police, "Every Breath You Take," from the album Message In A Box: The Complete Recordings (Disc 4) (I give it 4 stars).]

Texting while driving: Still unsafe, stupid, and unaccountably popular

From the Good Morning Silicon Valley blog:

Simple common sense should tell us that trying to text while driving is as stupid and dangerous as trying to crochet. We shouldn’t need a bunch of studies calculating and quantifying the risk to goad us into a response, but if that’s what it takes, here’s the latest. A Virginia Tech study that outfitted the cabs of long-haul trucks with video cameras found that when the drivers were texting, their collision risk was 23 times greater than when they had their attention on the road — a figure far higher than the estimates coming out of lab research, and a rate far more dangerous than other driving distractions. And at the University of Utah, research on college students using driving simulators showed texting raised the crash risk by eight times. The variance in the figures is beside the point. “You’re off the charts in both cases,” said Utah professor David Strayer. “It’s crazy to be doing it.”

And the heck of it is, people already know that and they keep doing it anyway.

This is a near-perfect example of how most humans are geniuses at rationalization: yes, I know it's dangerous, but I'll be careful and do it just this time, because I really need to let the office know where that file is. Oh wait, they've got another question. Well, it would be more dangerous to wait and put the phone down, so I'll just– dammit, can't the kids find anything by themselves? Okay, now I'll make up for it by really focusing on the road.

It's also a nice example of the kind of dissonance created when we take practices and technologies designed for one use context and move them into another– a phenomenon that mobile technologies make increasingly common. It was hard to take a Macintosh SE or IBM PC Junior on the road; a smartphone, on the other hand, is a perfect storm of transportable, always-on, and just usable enough when you're doing other things to be dangerous.

[To the tune of Jean Sibelius, "Tapiola, Op. 112," from the album Finlandia/Tone Poems (I give it 2 stars).]

Slow communications manifesto

I'm noticing an uptick in the number of articles on digital sabbaths, zeroing out, or whatever you want to call it. This from John Freeman in the Wall Street Journal:

It is time to launch a manifesto for a slow communication movement, a push back against the machines and the forces that encourage us to remain connected to them. Many of the values of the Internet are social improvements—it can be a great platform for solidarity, it rewards curiosity, it enables convenience. This is not the manifesto of a Luddite, this is a human manifesto. If the technology is to be used for the betterment of human life, we must reassert that the Internet and its virtual information space is not a world unto itself but a supplement to our existing world, where the following three statements are self-evident.

1. Speed matters…. "The speed at which we do something—anything—changes our experience of it…. The Internet has provided us with an almost unlimited amount of information, but the speed at which it works—and we work through it—has deprived us of its benefits. We might work at a higher rate, but this is not working."…

2. The Physical World matters. A large part of electronic communication leads us away from the physical world. Our cafes, post offices, parks, cinemas, town centers, main streets and community meeting halls have suffered as a result of this development…. Sitting in the modern coffee shop, you don't hear the murmur or rise and fall of conversation but the continuous, insect-like patter of typing. The disuse of real-world commons drives people back into the virtual world, causing a feedback cycle that leads to an ever-deepening isolation and neglect of the tangible commons.

3. Context matters. We need context in order to live, and if the environment of electronic communication has stopped providing it, we shouldn't search online for a solution but turn back to the real world and slow down. To do this, we need to uncouple our idea of progress from speed, separate the idea of speed from efficiency, pause and step back enough to realize that efficiency may be good for business and governments but does not always lead to mindfulness and sustainable, rewarding relationships.

In a different register but playing some of the same themes, Mercury News tech columnist Troy Wolverton confesses, "I've been thinking I need to take a break from technology."

Resisting the urge to check my e-mail on my phone, say. Finding something else to do when the TV's not on at night than retreat to my computer for some Web surfing or game playing. Focusing on the people in my life, rather than the gadgets….

Reading a newspaper Web site on my iPhone while sitting next to my son may seem no different from when my dad used to read a real newspaper while I was eating breakfast as a kid. But the iPhone tends to be a lot more engrossing and addictive than a physical newspaper — and not just because the latter keeps getting thinner.

I can peruse hundreds of newspapers on my iPhone, seeking out those stories and topics I'm most interested in. If that gets dull, I can check my e-mail. If there's nothing there to grab my attention, there's always my Facebook app or a game. In short, it's hard to pull away. And once you're entrapped, it's hard to pay attention to anything else.

[To the tune of Keith Jarrett Trio, "Five Brothers," from the album The Out Of Towners (I give it 1 star).]

This is your brain on multimedia

Ed Yong at Not Exactly Rocket Science covers a new study on media multitasking and its impact on cognitive control:

You might think that this influx of media would make the heaviest of users better at processing competing streams of information. But Eyal Ophir from Stanford University thinks otherwise. From a group of 262 students, Ophir identified two sets of 'light' and 'heavy' multimedia multi-taskers from the extreme ends of the group. The heavy users were more likely to spend more time reading, watching TV, surfing the web, sending emails or text messages, listening to music and more, and more likely to do these at the same time.

The heavy group also fared worse at tasks designed to test their ability to filter out irrelevant information or, surprisingly, to switch from one task to another. In short, they show poorer "cognitive control", a loosely grouped set of abilities that include allocating attention and blocking out irrelevancy in the face of larger goals. They're more easily distracted by their many incoming streams of data, or less good at shining the spotlight of their attention on a single goal, even though they are similar to the light group in terms of general intelligence, performance on creativity tests, basic personality types, and proportion of women to men….

The key question here is whether heavy multimedia use is actually degrading the ability to focus, or whether people who are already easily distracted are more likely to drown themselves in media. "This is really the next big question," says Ophir. "Our study makes no causal claims; we have simply shown that media multitaskers are more distractable." The next step is to follow a group of people with different media habits over time to see how their mental abilities shift, and that's something that Ophir is working to set up.

Nonetheless, as ever-larger computer screens support more applications (Google Wave, anyone?), and social norms shift towards more immediate responses, it seems that multitasking is here to stay and perhaps merely in its infancy. It's important to understand if these technologies will shift our portfolio of mental skills, or equally if people who are naturally easy to distract will gravitate towards this new media environment, and encounter difficulties because of it.


© 2017 Alex Soojung-Kim Pang, Ph.D.
