(Deepest apologies to the ghost of André Malraux)
The Scholarly Kitchen writes today about the promise of human interfaces that would scan for information embedded in the environment and pass along relevant data to the user via a cell phone, special glasses, etc. I’m not interested in having electrodes attached to me to allow a stranger to strike up a casual conversation (“hey, buddy, do you know that we were both born in February and have light brown hair?”). But I’m very interested in technologies that could enhance users’ experiences of museums and move us away from relying on wall text, for example.
It’s easy to imagine applying an RFID chip to an object–anything from a painting to a piece of furniture to a piece of a building–that would pass along some basic info to someone in the vicinity holding a small handheld device. I had assumed that something like this would have to be done via geolocation, and imagined visitors wandering around with little GPS devices and aiming them at things they wanted to learn about. The problem, of course, is that much of the museum world is indoors, where satellite-based GPS doesn’t work. There’s also the problem of associating precise geospatial information, including altitude, with every object, and the need to update that information whenever a thing moves. With an RFID tag, the metadata travels with the object.
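The tag-plus-database idea can be sketched in a few lines: the scanned tag’s unique ID keys directly into the collections database, so nobody has to track coordinates at all. Everything below (the tag UIDs, the record fields, the catalog itself) is hypothetical, just to show the shape of the lookup.

```python
# Hypothetical sketch: resolving a scanned RFID tag's UID to a museum
# object record. Tag UIDs and the catalog structure are made up for
# illustration, not drawn from any real museum system.

from dataclasses import dataclass

@dataclass
class ObjectRecord:
    title: str
    maker: str
    date: str
    gallery: str  # updated when the object moves; the tag UID never changes

# A slice of the collections database, keyed by tag UID.
CATALOG = {
    "04:A3:1F:22": ObjectRecord("Armchair", "Unknown, French", "ca. 1750", "Gallery 12"),
    "04:B7:9C:0D": ObjectRecord("Portrait of a Lady", "Follower of Ingres", "1840s", "Gallery 3"),
}

def lookup(tag_uid: str) -> str:
    """Return display text for a scanned tag, or a fallback message."""
    record = CATALOG.get(tag_uid)
    if record is None:
        return "No information for this object."
    return f"{record.title}, {record.maker}, {record.date} ({record.gallery})"

print(lookup("04:A3:1F:22"))
```

The point of the sketch is the indirection: the tag carries only an ID, and everything a visitor sees comes from the database, so curators can correct a label or note a move without ever touching the object again.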
We’re already building the databases; now we just need to attach the stickers (with conservators’ help, of course).