Critics of computers in education have sometimes argued that computers prevent students from developing intellectual or manual skills that older media encouraged. The complexity and razzle-dazzle of computers, they contend, require students (and teachers) to concentrate more on hardware and software, and less on the actual subject they're supposed to be studying. This might provide a sense of accomplishment, a feeling that you're learning something, but it diverts attention from what you're not learning.

This latter effect is particularly insidious because there are subtler benefits to older media that are inadvertently lost in the rush to computerized pedagogy. The argument is probably made most forcefully in the humanities: the patience and concentration required to read serious literature can't be acquired by sitting in front of a computer screen, nor do the imaginative faculties develop as sharply when a student can hop between, say, a page of Pride and Prejudice and a clip of Keira Knightley and Rosamund Pike giggling under the covers after the ball at Netherfield.

But the argument is hardly restricted to English lit. When I was in high school, math and science teachers were arguing (mainly with us students, and to a lesser degree with each other) over the use of calculators in class. While the convenience of a calculator was undeniable, teachers argued that learning to solve equations by hand gave you a better feel for math, and taught patience and attention to detail.

Likewise, some engineering faculty argue that CAD programs may let students work quickly, but prevent them from developing the feel for objects that comes from old-fashioned drafting. In last Friday's Financial Times, architecture critic Edwin Heathcote made a similar argument, urging architects to get "back to the drawing board" – not because computers are inherently bad, but because there are important skills you never learn if your education takes place mainly on a screen.

Although still only in my (late) thirties, I was among the last architects to qualify without undergoing training in computer-aided design (CAD). When I started out, architects' offices were still cluttered with drawing boards and trolleys full of different-gauge pens, ink, scales, set-squares, even razor blades, which we used to erase mistakes on tracing paper. A decade later the drafting boards were turning up in skips and the plan chests were in antique shops, as quaint and obsolete as washstands. Like the machinery of the industrial revolution, computers were supposed to strip away the drudgery, leaving architects free to soar, to spend their time sketching on napkins during long lunches. Instead, architects' offices became satanic mills of banks of frazzled youngsters gazing at screens.

I'm overstating the case to make a point – computers have become an indispensable part of the construction process, whether for homes, offices or more ambitious projects, and we won't see the back of them until something better comes along. But in my opinion they have created a dangerous fissure between the brain and the hand which, far from leading to a utopian world of clinical perfection in the messy building process, is leading to an insidious and serious diminution of quality and thought in architecture.

Computers may be efficient at processing complex data, but they are far from efficient in the creative process. Sketching is not only practical but essential. It is the quickest, most accessible way to find out if a space, a vista, a progression can work and also to communicate it to others. It is the fundamental link between brain and hand transferred direct on to paper without interference by binary codes. It is a direct process, and it is the most human way of generating ideas. Once students begin to ignore the sketch and mediate their designs through data, the most basic developmental tool is lost. Cyberspace replaces real space, and the envisioning of architecture will increasingly ensure that real environments resemble nothing except other artificial environments – malls, theme parks, huge hotels and supermarkets rolled out in cloning programs – safe, easily reproducible, easily surveyed virtual space impinging on the real world.

It strikes me, though, that many of these criticisms rest on an assumption that dealing with computers automatically divorces you from the real world; that the seductive universe of zeroes and ones pulls your attention away from the messy world of atoms and people; and that the character of students' interactions with computers is very different from that of their interactions with paper, ink, compass, or modeling clay.

But is that necessarily true? More to the point, is it going to be true in the future?