Alex Soojung-Kim Pang, Ph.D.

I study people, technology, and the worlds they make


Memory improvements after bariatric surgery

Yes, it sounds a bit like an obscure joke, but via Future Pundit comes this new piece of research suggesting “a link between weight loss and improved memory and concentration.” From the abstract:

Growing evidence has shown that obesity is associated with poor neurocognitive outcomes. Bariatric surgery has been shown to be an effective intervention for morbid obesity and can result in improvement of many co-morbid medical conditions that are associated with cognitive dysfunction. The effects of bariatric surgery on cognition are unknown….

We performed a prospective study [of a] total of 150 subjects (109 bariatric surgery patients enrolled in the Longitudinal Assessment of Bariatric Surgery project and 41 obese control subjects who had not undergone bariatric surgery). These 150 subjects completed a cognitive evaluation at baseline and at 12 weeks of follow-up. The demographic, medical, and psychosocial information was also collected to elucidate the possible mechanisms of change. [S]urgery patients had improved memory performance at 12 weeks of follow-up… The present results suggest that cognitive impairment is common in bariatric surgery patients, although these deficits might be at least partly reversible. Future studies are needed to clarify the underlying mechanisms, in particular, longitudinal studies using neuroimaging and blood markers.

[To the tune of The Asteroids Galaxy Tour, “The Golden Age,” from the album The Golden Age – EP (a 4-star song, imo).]

Memory arts and mindware

Ruth Evans takes an historical perspective on Andy Clark's natural-born cyborgs argument and on the claim that "human cognition is not just embodied but embedded: not mind in body, but both mind and body enmeshed in a wider environment of ever-growing complexity that we create and exploit to make ourselves smarter."

From the abstract:

The philosopher and cognitive scientist Andy Clark has argued that humans have always been ‘natural-born cyborgs,’ that is, they have always collaborated and merged with non-biological props and aids in order to find better environments for thinking. These ‘mindware’ upgrades (I borrow the term ‘mindware’ from Clark, 2001) extend beyond the fusions of the organic and technological that posthumanist theory imagines as our future. Moreover, these external aids do not remain external to our minds; they interact with them to effect profound changes in their internal architecture. Medieval artificial memory systems provide evidence for just this kind of cognitive interaction. But because medieval people conceived of their relationship to technology in fundamentally different ways, we need also to attend to larger epistemic frameworks when we analyze historically contingent forms of mindware upgrade. What cultural history adds to our understanding of embedded cognition is not only a recognition of our cyborg past but a historicized understanding of human reality.

This reminds me some of the work of the cognitive anthropology crowd, which I find necessarily speculative but extremely ambitious and interesting.

You could also re-work Paul Saenger's work on word spacing and intellectual history in light of Clark.

How Flickr changes my view of the world

In a recent article on experiments using automatic digital photography to improve the memories of Alzheimer's patients, I was struck by these paragraphs:

When researchers began exploring it as a memory aid a few years ago, they had patients and caregivers look at all the pictures together.

Although the exercise helped improve retention of an experience, it was evident that a better way would be to focus on a few key images that might unlock the memories related to it. The interactive nature of that approach would give patients a greater sense of control over their recollections, and allow them to revisit past experiences rather than simply know they had happened.

They soon realized that the capriciousness of memory made answers elusive. For one subject, a donkey in the background of a barnyard photo brought back a flood of recollections. For another, an otherwise unremarkable landscape reminded the subject of a snowfall that had not been expected.

The idea that "the capriciousness of memory" would make efforts to automatically generate summaries of events difficult mirrors my own experience: I have entire trips that I recall through a couple apparently random things– the look of a hotel room, what I had for dinner. Likewise, looking at an entire album of pictures doesn't necessarily do much for me in terms of helping me remember more of an event.

I wonder if the scientists have tried getting their subjects to consciously manipulate those records afterwards– to make a photo album, for example– to see if that process of sorting helps improve recall. I remember trips much better if I write about them, or choose pictures to put online, much as I remember books better when I take notes on them. In fact, it's safe to say that the ritual of going through pictures, tagging them, and uploading them has both made it easier for me to remember these places, and changed my view of the world.

Let me explain.

One of the Web services I use a lot is the photo sharing site Flickr (if you don't believe me, just go to my account and see for yourself). I'm a fairly obsessive photographer, mainly because I like good pictures, but I'm not a very good one. With a film camera, you really pay for artistic mediocrity or technical clumsiness: you have to throw the same amount of money at a good picture as at a bad one. With digital cameras, on the other hand, you can play the lottery: take enough pictures, and some of them will accidentally be good. I'm also a doting father whose children aren't old enough to put up a serious fight when I get out the camera. And finally, digital cameras are small enough to fit in a pocket, so my Canon PowerShot is always handy. I don't have to plan to carry a camera with me: it's one of the things I always have when I walk out the door.

One of my favorite features in Flickr is its mapper, which lets you tell Flickr where in the world your picture was taken. Essentially, you put a digital pin in an online map, much as you would in a real map. Flickr and Yahoo! Maps got together to provide the service in 2006, and since then I've become a slightly fanatical geotagger. It started out as pure geekdom: I'd written stuff about the future of geolocation services and information, so it seemed a good chance to play with a future I had already described. But now I do it because it's a way to help me remember my pictures, and where I took them.
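
(A side note for the technically inclined: under the hood, a pin is just a latitude and longitude attached to a photo, and Flickr exposes the same operation through its public API. Below is a minimal sketch of what a single geotag amounts to, assuming the third-party flickrapi Python package and your own API credentials; the key, photo ID, and coordinates are placeholders, and the exact call style can vary between library versions.)

```python
import flickrapi

API_KEY = "your-api-key"        # placeholder: substitute your own Flickr API key
API_SECRET = "your-api-secret"  # placeholder

flickr = flickrapi.FlickrAPI(API_KEY, API_SECRET)
flickr.authenticate_via_browser(perms="write")  # writing geodata requires an OAuth token

# Pin one photo to a spot near the Elizabeth Bridge in Budapest.
# The photo ID and coordinates are illustrative, not real.
flickr.photos.geo.setLocation(photo_id="123456789", lat="47.4906", lon="19.0486")
```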

When I'm in a place, I like to walk. I want to know enough to stay out of bad neighborhoods, to find interesting ones, and to be aware of significant landmarks. I don't want to miss the big attractions, but I also want the freedom to happen upon that perfect little cafe and pastry shop, or the brilliant bookstore that's not in any of the guidebooks. (How many travelers define themselves as people who want to escape the boundaries of the guidebooks?) This style of wandering is one reason I absolutely love certain cities. In London, for example, you can't go three blocks without coming upon something grand and historic, a charming little square, or an interesting piece of street life. You can never be sure which you'll find. It's one reason Samuel Johnson could say that when you're tired of London, you're tired of life. Likewise, Singapore and Budapest reward walking, though for different reasons: Singapore is a kind of life-sized scenario of what a prosperous, benevolently authoritarian, multicultural Asian Century could be like, with amazing food. Budapest is a wonderful Old European city, alternating twisty streets, grand boulevards, the magnificent Danube, and faded (but rapidly renovating) buildings and apartment blocks, with great coffee on every block.

So I like to wander. But once I'm back in my room, and have uploaded my pictures from the day, I want to reconstruct my path, and figure out where I've been. I used to do this on maps, tracing out my route with a highlighter. This wasn't always very successful. It required remembering street names, knowing how many blocks it had been since I'd turned left last, or estimating how far I'd walked on the boulevard or embankment before stopping to take those pictures. Given that I often walk at night– my days are taken up with work– all this was tough. Putting that information onto a map that often was in an unfamiliar language didn't make things easier, either.

But what turned me into a Flickr map fanatic? And what bigger lesson could that possibly hold?

The act of putting pictures on the Flickr map combines three different kinds of knowledge. First, it draws on your physical memory of travel and picture-taking. Second, it draws on your visual memory. And third, it connects those two kinds of knowledge and memory to a formal system, the logic of the map. Putting these together helps you connect your personal, street-level view of a place with a higher-level, abstract understanding of it.

Consider picture-taking first. Like all forms of knowledge-creation, picture-taking is a physical activity as well as an intellectual or technical one, and that physicality can be something that helps fix in your memory the event of taking the picture. I have pictures of Waimea Canyon, on the island of Kauai, that I can't look at without being reminded of a long drive, and the pleasant contrast between the warmth of the coast and the chilly interior. I'd probably have long forgotten those sensations without the picture, and without the sensations I'd have a harder time placing the picture; but both memories live together and reinforce each other. Often the order of pictures in a photo stream can be used to reconstruct an evening's path. Something in the distance in one picture is in the center of another, or a corner in one photo is turned in the next. With the visual cues that the photographs provide, combined with a few memories of turning down this street and that boulevard, and a couple landmarks as reference points, I can reconstruct my steps pretty accurately.

Flickr lets you put pictures on an ordinary street map, which is just a grid with street names, rivers, train lines, and the occasional park. Sometimes that's enough information; but when it's not, I switch to the satellite mode, which overlays aerial photographs atop the street map. I find that the satellite photographs let me establish much more precisely just where I was, what this photograph shows, and where it should go on the map. Without it, I can place pictures on the right block; with the satellite photos, I can get to within a few feet.

Of course, that requires knowing how to decode satellite photographs, and how to relate that information to my own experience. Figuring out how to connect what you see in your photograph to what's on a satellite picture is a skill that we didn't have to learn before. Unless you worked for the CIA or had a particularly sadistic geography teacher, you never had to make that connection; and until recently satellite photos weren't easy for ordinary people to get. You could think of the Flickr mapping tool as a giant machine that gives people the chance to learn how to read satellite pictures. Maybe it's a cartographic Ender's Game, training a generation of open-source spooks who twenty years from now won't be fooled by doctored military recon photos or by what's really scant evidence of wrongdoing.

Translating the ground-eye view of a landmark or city grid into an aerial view isn't that hard, but it does need to be learned. London's Trafalgar Square becomes a set of long shadows (Nelson's column) with a few shapes (the lions around it, the fountains nearby); Leicester Square, trees and park paths bordered by the blocky shapes of theatres. Sometimes you learn how big something really is ("Boy, Suntec City really is HUGE"); when I'm trying to find someplace I've reached by taxi or subway, the satellite photos are the only way to find it. I've walked some parts of Copenhagen, for example, but there are some things– the new Information Technology University, for example– that I've only driven to; I don't know the ITU address, but because I know the shape of the building and have a pretty good sense of the buildings around it, I can find it on a satellite map.

Finally, putting the pictures on the map is a way to relate the personal experience and first person view to the formal, high-level view. They're my memories, organized; and organizing my memories builds my knowledge of– and arguably my understanding of– the place and how it's laid out. Given that I may post 500 pictures from a trip, and geocode almost all of them, the simple repetition of the exercise does a lot to fix in my mind what buildings are where, how places relate to each other, and what route I took when walking, say, from the Elizabeth Bridge to St. Stephen's Church in downtown Budapest.

Right now this kind of mapping is mainly fun (believe it or not) and educational, but it will really pay off in a couple years, when I can go back to a city with my e-paper travel journal, equipped with wifi and GPS. So equipped, I'll be able to call up those pictures in situ: see what Piccadilly Circus looked like the last time I was there, or see exactly where in Singapore I had those rice noodles so memorable I Flickred them. And I can see where I haven't been, since pictures serve as visual crumbs, dropped on the map to mark my earlier travels.

Sensecam, Alzheimer’s, and memory

An article in the New York Times on experiments using what sounds like Gordon Bell’s MyLifeBits technology “to help people with Alzheimer’s and other memory disorders.”

The concept was simple: using digital pictures and audio to archive an experience like a weekend visit from the grandchildren, creating a summary of the resulting content by picking crucial images, and reviewing them periodically to awaken and strengthen the memory of the event….

In Pittsburgh, researchers had Mr. Reznick go on three excursions with a Sensecam around his neck, a voice recorder in his shirt pocket, and a GPS unit. On one trip, he went to an exhibition of glass sculptures with his wife, Sylvia, his son and a granddaughter.

The Sensecam takes hundreds of pictures in a short period. When researchers began exploring it as a memory aid a few years ago, they had patients and caregivers look at all the pictures together.

I’m not surprised that scientists would be using this technology for Alzheimer’s patients. If today’s early adopters are twentysomethings, in the next couple decades we’re going to see a shift: the most augmented, information technology-intensive Americans are likely to be the elderly, who will be using these technologies (often embedded in more ordinary-looking everyday devices) to battle memory loss, stay independent, and of course post pictures to their Facebook accounts.

[To the tune of Emerson Lake & Palmer, “Tarkus,” from the album Live at the Wheeling Civic Center (November 12 1977) (a 2-star song, imo).]

Music, materiality, and memory

Music writer and candy fanatic Steve Almond (one of my wife's college classmates, interestingly) has a nice piece in the Boston Globe about music, materiality, and memory:

I start browsing the discs, and inevitably find one I haven’t heard in years and slip it onto the crappy boom-box I keep down there and pretty soon the record has transported me back to the exact time and place where I first fell in love with it. The physical object, in other words, becomes a time machine. And who in their right mind would throw away a time machine?

The younger generation has no romantic attachments to records as physical objects. To them, music exists as a kind of omnipresent atmospheric resource.

And it’s not that I begrudge them their online treasure troves or bite-size iPods. But I still miss the way it used to be, in the old days, when fans had to invest serious time and money to track down the album or song they wanted.

What I’m getting at here is a deeper irony: technology has made the pursuit of our pleasures much easier. But in so doing, I often wonder if it has made them less sacred. My children will grow up in a world that makes every song they might desire instantly available to them. And yet I sort of pity them that they will never know the kind of yearning I did.

As a young kid, before I could even afford records, I listened to the radio. I waited, sometimes hours, for the DJ to play one of the idiotic pop songs with which I’d (idiotically) fallen in love. And yet I can still remember the irrational glee I felt when the DJ finally did play "Undercover Angel" or "The Things We Do for Love."

Almond and I are the same age, and I completely get where he's coming from: I can still remember the pleasure of my favorite song finally coming on the radio, and rediscovering old music can sometimes be a Proustian experience.

But I don't feel like something is really lost by moving from one playback medium to another. Or rather, I understand why Almond feels that way, but it's not a universal for our generation.

Why do I think this? Maybe it's because, despite the audiophile's fetishization of the LP, I grew up in a pretty technologically heterogeneous musical environment: I had LPs, 45s, cassette tapes, a few 8-tracks, and of course the radio (AM and FM). The vinyl LP is the first-edition book of the music world: the technological object that comes to stand for an era or cultural moment, and that in so doing obscures all the other formats that surrounded it– just as the first edition obscures all the other kinds of printed matter that surrounded us way back before personal computers but didn't have much cultural significance (who has mourned the decline of the Sears catalog in the age of the Web?). So when CDs came along, it was kind of just one more thing.

I also think Almond somewhat overplays the idea that for kids, "music exists as a kind of omnipresent atmospheric resource," as if it didn't for us. How many times did our parents say, "Turn that music down!" How many times did we choose a particular restaurant, or go to the pool, or hang out somewhere, partly because of the music? I don't remember music being a rare commodity when I was a kid. It might have been harder to make it completely private– to go out in public plugged into your own audio universe, the way my kids do with their iPods– but the music was definitely there.

Another reason my experience differs is that I don't have a gigantic record collection that I've built up over decades. I once had a lot of LPs. Then I replaced them with a lot of CDs. Then all my CDs got stolen (I love Berkeley!). Then I rebuilt my collection, and again have a lot of CDs.

So iTunes– and more recently things like Concert Vault– allowed me to rediscover a lot of music that I hadn't heard in decades. In other words, Almond and I have the same experience, only he has his in his basement, and I have mine online. (There are virtues in deleting and forgetting, but on the whole I prefer rediscovery. Though you can't have the last without one of the first two, I suppose.)

But there's one other thing: as I discovered when I first upgraded to OS X and started dropping money into iTunes, finding an old song isn't only about getting back in touch with something I hadn't heard in a long time. Just as often it's about rediscovering the music. As I discovered about five years ago,

When I was young, I always had pretty lousy stereo equipment– often just a portable AM/FM radio, or a $39 stereo from K-Mart– and it turns out that, even though I heard some of these songs a thousand times, there was a lot of detail I missed. Now I hear it. Twenty years later.

Though it won't be long before we start fondly remembering CDs or the early days of music on the Web…. Actually, MIT professor Henry Jenkins has already gotten nostalgic: years ago he compared Napster and iTunes, and argued that for his generation, the former was far superior. "iTunes is about music as commodity," he wrote. "Napster was about music as mutual experience. iTunes is about cheap downloads; Napster was about file sharing– with sharing the key word."

For me, the process of rediscovering music is more like the experience of reconnecting with people on Facebook than being transported back in time: yes, they have the same names as they did when they were in college (well, some of them have the same names), but they're not the same people– and neither are you. But it's still nice to hear from– or just hear– them.

[To the tune of Greg Lake, "In the Court of the Crimson King," from the album Live at the Hammersmith Odeon, London (November 5 1981) (I give it 3 stars).]

Memory and Megabytes online

Just found an online reprint of Ellen Ullman's wonderful 2003 essay "Memory and Megabytes," originally published in American Scholar. It's one of my favorite short pieces ever, and started me thinking about the differences between human and machine memory.

Though her recent New York Times op-ed on adoption and knowing your family history is great, too:

I am not against … the trend… toward openness, a growing “right” to know. I simply want to give not-knowing its due.

I like mysteries. I like the sense of uniqueness that comes from having unknown origins (however false that sense may be).

[To the tune of Dead Man's Bones, "My Body's a Zombie for You," from the album Anti Sampler Fall 2009 (I give it 1 star).]

Deleting

About a year ago I wrote about Web 2.0 as a time machine for my generation, and my suspicion that "mine may be the last generation that has the experience of losing touch with friends." This concerned me because

when it comes to shaping identity, the ability to forget can be as important as the ability to remember. It's easy to implore people not to forget who they are; but sometimes, in order to become someone better, you need to forget a little bit.

Likewise,

Forgetting insults and painful events, we all recognize, is a pretty healthy thing for individuals: a well-adjusted person just doesn't feel the same shock over a breakup after ten years (if they can even remember the name of Whoever They Were), nor do they regard a fight from their childhood with anything but clinical detachment. Collectively, societies can also be said to make decisions about what they choose to remember, and how to act toward the past. Sometimes this happens informally, but for practical reasons: think of national decisions to avoid deep reflection on wars or civil strife, in the interests of promoting national unity and moving forward.

The idea that digital and human memory work differently, and that we fail to recognize the difference between the two at our peril, is something I've been writing about for a while. So I was very interested to see a review by Henry Farrell in Times Higher Education of Viktor Mayer-Schoenberger's new book Delete: The Virtue of Forgetting in the Digital Age. It sounds like a book I need to read… or at least footnote!

At its heart, his case against digital memory is humanist. He worries that it will not only change the way we organise society, but it will damage our identities. Identity and memory interact in complicated ways. Our ability to forget may be as important to our social relationships as our ability to remember. To forgive may be to forget; when we forgive someone for serious transgressions we in effect forget how angry we once were at them.

Delete argues that digital memory has the capacity both to trap us in the past and to damage our trust in our own memories. When I read an old email describing how angry I once was at someone, I am likely to find myself becoming angry again, even if I have since forgiven the person. I may trust digital records over my own memory, even when these records are partial or positively misleading. Forgetting, in contrast, not only serves as a valuable social lubricant, but also as a bulwark of good judgment, allowing us to give appropriate weight to past events that are important, and to discard things that are not. Digital memory – which traps us in the past – may weaken our ability to judge by distorting what we remember.

[To the tune of Sukhwinder Singh, "Marjaani Marjaani," from the album Saavn Celebrates Bollywood (I give it 3 stars).]

Stop what you’re doing, and listen to Feist’s “Inside and Out”

Yesterday I was lamenting the fact that the Bee Gees’ work is under-appreciated these days, that people are too distracted by the falsetto singing and the disco beats– which have hardened like amber around a lot of good music– to recognize the great craft underneath.

I just discovered a brilliant exception to this rule: Feist’s cover of their “Inside and Out.”


Feist – Inside And Out

It may be even better than the original. Certainly it’s one of those covers that is a really interesting mix of classic and very contemporary elements. Actually, it reminds me of the work of Japanese soul singer Misia, who for my money makes Celine Dion look (and sound) like Suzanne Vega:

My corner pub

Last night, as I was having an exceptional second beer in 24 hours (I’d had the first with dinner, and then went to the gym and sauna, so I thought I could risk it), I briefly lamented the fact that when I lived in Berkeley, I had a corner pub– the wonderful, loud, and interesting Bison Brewery, where I’d go, have a pint or two, and write. I wasn’t exactly a regular– the bartenders and I didn’t know each other’s names– but I still enjoyed the place. I don’t have a pub here. I drink so little it would hardly be possible. Still, it seemed a bit of a shame.

Today, as my wife took the kids and their friends to the movies, I headed over to Cafe Zoë, to do some work. (I’m now at that age– or maturity– where I see that solitude is an opportunity, not the absence of others.) I’ve been coming here for years, since back when it was under different ownership. As I was ordering my chai latte, I read a sign they’d just put up announcing a loyalty program. Visit ten times and your next coffee is free– a deal I’ll be able to take advantage of approximately every four days, even when I’m not running a tab. “I should sign up for that,” I said.



The owner– Zoë’s mother– said, “Oh, we’ll give you this drink for free. You’ve been here a lot more than ten times. You’re a regular.”

I guess I am.

Technology and imagination

One of the things I've been reconstructing in the cyberspace book is how going online was interpreted and experienced by people in the early days of the Web, and how the details of getting onto the Internet contributed to the feeling that you were going to another place when you went online. Trying to rebuild the experience years later, to pull together the different strands of practice and fiction and metaphor that people bring together when interpreting an experience, is a challenge, but one that I think is essential. Tonight I was struck by another example of how technology and our own mental worlds intersect.

This evening I set up a telescope and let the kids look at a couple planets, and after it rose over the garage, the Moon. My son's a bit young for the patience required for even simple stargazing– you have to be careful about touching the telescope and mount, to avoid creating vibrations that disturb the image. My daughter, however, quite enjoyed it: she kept saying, "That's really cool" when I showed her anything new, then proceeded to spin stories about what she was seeing, which is always a good sign that she's engaged.

When I was about nine or ten, I developed a passion for astronomy. I remember it being a special issue of National Geographic that sparked my interest; I don't know if that was really it, but for a time I read tons of astronomy books, devoured astronomy magazines, and had posters of the 1973 solar eclipse and the Andromeda galaxy on my wall. I didn't have a telescope, but we had a pair of binoculars that I would take out at night to look at the moon, major planets, and just random stuff– the dimmest stars I could see, areas where I knew Messier objects were but which were too faint for me to see.

My binoculars were pretty pathetic in objective optical terms, but they were plenty for me. I was reading a lot of science fiction at the time (my taste ran to Heinlein and Asimov– pretty canonical and, in retrospect, strangely unimaginative), and I could look at the dark space between stars and fill it with spaceships, or the Foundation, or an article about stellar evolution, or any number of other things. At that age, what you observe through the eyepiece is but a trigger to what you see in your brain. Indeed, I could get lost in space pretty easily, and usually had to be pulled back inside at bedtime. It wasn't the faint objects I saw in the binoculars that kept me going out: that was neat, but you could see much more impressive things in a book. And knowing that I was looking at the exact same things, but just a million times fainter, was usually more a reminder of how pathetic my own instruments were than a thought that connected me to the universe. What I liked about stargazing wasn't just seeing stars; it was the fevered mental state it triggered, the combination of observing, recall (an accurate memory feels very powerful at a certain age), and imagination.

Years later, when I was writing my dissertation on eclipse expeditions, I would sometimes think back to that poster, and wonder if there was a direct line from my childhood interest to this project. Would I have written about the history of astronomy if I had developed an interest in butterflies or geology?

After about half an hour, my daughter went in, my son came back out, and after a couple minutes looking at the moon declared he was cold. It got colder and darker. I didn't notice anything. I was somewhere else.
