Who's in Charge of the Augmented City?
I’m sitting at New York University (NYU), in a Brooklyn building of the Tandon School of Engineering, with wires sprawling across my body while I stare at a 98-inch TV screen. Glasses with inward-facing cameras track my pupils; a plaster wrapped around my finger measures my perspiration; cheek pads monitor whether I’m smiling; another device tracks my heart rate; and an electroencephalogram (EEG) headset—a contraption with wet, squidgy nodes on the end of prongs, similar to a head massager—monitors my brain activity. All while my face is being filmed.
The researchers want to quantifiably measure my responses and stress levels in two slightly different virtual environments. On the screen I’m looking at a basic scene created in Google SketchUp. The two mock-ups of an existing NYU building are identical apart from a couple of minor differences: one has wider windows and lighter wall colors. Within each environment I have to navigate my way up a set of stairs onto a mezzanine level, open a door, switch on a thermostat, and return.