Last week I was on the Wall Street Journal Web site, talking about Z2K, the New Year's Eve Zune collective meltdown. One of the really outstanding, subtle points I made (ha!), which unfortunately was left on the cutting room floor (or these days, on the studio's server), is that the problem illustrates how we use certain words to describe functions that both humans and machines perform, but which we perform in very different ways– and this ends up creating problems for us sometimes.

Take for example the concept of "memory." People and computers both have it, but at a fundamental level, human memory works very differently from computer memory, and in some ways using the same term for the two obscures more than it connects. Yes, I remember my phone number and computer passwords (barely); but my memory is an active thing. I'm constantly in the process of updating and reinterpreting old memories– putting personal things in new contexts, sifting through recollections for some shaft of light that can illuminate a current dilemma. Part of my "memory" consists of facts; but a much bigger part of it consists of things I know how to do. (When you get amnesia, you may forget who you are, but you don't forget how to walk. Traumas affect different forms of memory differently.)

So human memory is lots of different things– procedural, descriptive, somatic– and is not just information stored and retrieved. With computers, however, "memory" is something much more specific: it's about exact recall of specific things. If my computer gets imaginative with my credit card number, I'm in trouble. Indeed, computer memory is supposed to possess a kind of exactitude and reliability that our memories don't: it's supposed to be strong where human memory is frail.

To some degree, the Z2K problem might have been influenced by a failure to recognize that humans and computers deal very differently with something else that we describe using a common word: namely, time. One early theory held that the extra second in the year threw off Zunes. Humans aren't going to notice one second more or less in a year, but computers will. More generally, we think of time (or timekeeping, more precisely) as something that is absolute and completely standardized, like units of volume or the boiling point of water (well, Hasok Chang's brilliant book Inventing Temperature undermines that last idea): time marches on, time never changes.

Only it doesn't exactly. Years get a tiny bit longer or shorter, depending on leap days and leap seconds. The calendar isn't a unit of measure: it's more like a suit that's taken up and let out. For humans, those differences usually aren't an issue; but they can be a problem for computers. (What was the Y2K threat but a reflection of how computers and humans deal differently with time?)
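
Whatever the exact trigger turned out to be, whether an extra second or an extra day, the underlying failure mode is date arithmetic that quietly assumes every year is the same length. As a purely illustrative sketch (this is not the Zune's actual firmware; the function names, the 1980 epoch, and the iteration cap are my own inventions), here is how a naive days-to-year loop in C can spin forever when a year is one day longer than it expected:

    /*
     * Illustrative only: not the Zune's actual firmware. The function names,
     * the 1980 epoch, and the iteration cap are invented for this sketch.
     */
    #include <stdio.h>
    #include <stdbool.h>

    static bool is_leap_year(int year) {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    /* Naive conversion of "days since Jan 1, 1980" into a year. When exactly
       366 days remain in a leap year, neither branch makes progress, so the
       loop would spin forever; the cap exists only so this demo terminates. */
    static int year_from_days_naive(int days, int cap) {
        int year = 1980;
        while (days > 365 && cap-- > 0) {
            if (is_leap_year(year)) {
                if (days > 366) {
                    days -= 366;
                    year += 1;
                }
                /* BUG: days == 366 in a leap year -> nothing changes */
            } else {
                days -= 365;
                year += 1;
            }
        }
        return year;
    }

    /* Fixed conversion: always consume the current year's full length,
       whatever that length turns out to be, before moving on. */
    static int year_from_days_fixed(int days) {
        int year = 1980;
        int len = is_leap_year(year) ? 366 : 365;
        while (days > len) {
            days -= len;
            year += 1;
            len = is_leap_year(year) ? 366 : 365;
        }
        return year;
    }

    int main(void) {
        /* Dec 31, 2008 is day 10,593 if Jan 1, 1980 is day 1: the leftover
           366 days in the leap year 2008 trip the naive loop. */
        int days = 10593;
        printf("naive: %d (only finished because of the iteration cap)\n",
               year_from_days_naive(days, 1000));
        printf("fixed: %d\n", year_from_days_fixed(days));
        return 0;
    }

A human glancing at a calendar never notices the extra day; a loop that assumes every year has 365 of them notices nothing else.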

The same problem arises with words that describe activities people engage in both online and offline. For example, marketing guy Dave Evans writes:

You've no doubt heard of people with hundreds, thousands, or tens or hundreds of thousands of online friends… [O]bviously, when someone "friends" a hundred or a thousand people online, clearly they are thinking about something very different than a traditional friendship. [Thanks to Zoë for the link.]

Personally, I think the word "friend" means very different things on Facebook than in real life. Yes, there are people I know on Facebook whom I also see in my daily life, but it would be better if we had a term other than "friends" for people with whom we interact only on services like FB or MySpace or LinkedIn.

Update: And speaking of memory, it turns out I've written on this before:

One of the reasons that predictions about the death of the library, office, workplace, book, etc. in the age of the Internet have not come to pass is that these older institutions or technologies had uses that went beyond the strictly functional ones of information processing, storage, retrieval, and so on. Offices, for example, aren't just places where knowledge workers move zeros and ones around; the good ones are creative spaces that offer workers access to unique stores of informal knowledge.

A second reason that bits haven't triumphed over atoms is that some activities or functions look the same when done by computers (or networks) and people (or institutions), but turn out to have subtle but critical differences. Two examples recently crossed my radar.