Posts Tagged ‘navigation’

The affective intelligent driving agent!

Monday, January 11th, 2010

AIDA is part of the Sociable Car - Senseable Cities project which is a collaboration between the Personal Robots Group at the MIT Media Lab and the Senseable Cities Group at MIT. The AIDA robot was designed and built by the Personal Robots Group, while the Senseable Cities Group is working on intelligent navigation algorithms.


One of the aims of the project is to expand the relationship between the car and the driver with the goal of making the driving experience more effective, safer, and more enjoyable. As part of this expanded relationship, the researchers plan to introduce a new channel of communication between automobile and driver/passengers. This channel would be modeled on fundamental aspects of human social interaction, including the ability to express and perceive affective/emotional states and key social behaviors.

In pursuit of these aims they have developed the Affective Intelligent Driving Agent (AIDA), a novel in-car interface capable of communicating with the car's occupants using both physical movement and a high-resolution display. The interface is a research platform that can be used to evaluate various topics in the area of social human-automobile interaction. Ultimately, the research conducted using the AIDA platform should lead to new kinds of automobile interfaces and an evolution in the relationship between car and driver.

Currently the AIDA research platform consists of a fully functional robotic prototype embedded in a stand-alone automobile dash. The robot has a video camera for face and emotion recognition, touch sensing, and a laser projector embedded inside its head. A driving simulator is now being built around the AIDA platform in order to explore this new field of social human-automobile interaction. The researchers' intention is that a future version of the robot, based on the current research, will be installed in a functioning test vehicle.
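Out of curiosity, here is a minimal sketch of what the face-detection front end of such a camera-equipped dashboard robot might look like, using OpenCV's stock Haar cascade. The AIDA software itself is not public, so this is purely illustrative and not their actual pipeline.

```python
# Illustrative only: a minimal face-detection loop of the kind a social
# in-car interface might run on its camera feed. Assumes OpenCV (cv2) and
# a webcam at index 0; this is NOT AIDA's code.
import cv2

# Stock frontal-face Haar cascade shipped with OpenCV
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # A real system would hand this region to an emotion classifier;
        # here we simply draw a box around each detected face.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("driver camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```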

The robot is super cute. I do wonder how distracting it might be, though; maybe it should be installed in the back with the kids as a babysitter. Kids would have a blast with it! Don't miss this video!

Intelligent tactile materials for navigation

Sunday, January 25th, 2009

I recently investigated objects and textiles made with intelligent materials (mostly actuated) that modulate braille text. Texture and communication go hand in hand in braille, so smart materials are a natural medium to experiment with in that direction.
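To get a feel for what "modulating braille text" means computationally, here is a small sketch that maps letters to the standard six-dot braille cell; an actuated textile would then raise the corresponding bumps for each character. Only the dot encoding here is standard, and the `set_pin` hardware hook is imaginary.

```python
# Standard Grade 1 braille: each letter maps to a set of raised dots (1-6)
# in a 2x3 cell. An actuated material would raise these dots per character.
BRAILLE_DOTS = {
    "a": {1},          "b": {1, 2},       "c": {1, 4},
    "d": {1, 4, 5},    "e": {1, 5},       "f": {1, 2, 4},
    "g": {1, 2, 4, 5}, "h": {1, 2, 5},    "i": {2, 4},
    "j": {2, 4, 5},    "k": {1, 3},       "l": {1, 2, 3},
    "m": {1, 3, 4},    "n": {1, 3, 4, 5}, "o": {1, 3, 5},
    "p": {1, 2, 3, 4}, "q": {1, 2, 3, 4, 5}, "r": {1, 2, 3, 5},
    "s": {2, 3, 4},    "t": {2, 3, 4, 5}, "u": {1, 3, 6},
    "v": {1, 2, 3, 6}, "w": {2, 4, 5, 6}, "x": {1, 3, 4, 6},
    "y": {1, 3, 4, 5, 6}, "z": {1, 3, 5, 6},
}

def to_unicode_braille(text: str) -> str:
    """Render text as Unicode braille patterns (U+2800 block) for a quick
    visual check; dot n corresponds to bit n-1 of the code point offset."""
    cells = []
    for ch in text.lower():
        dots = BRAILLE_DOTS.get(ch, set())
        offset = sum(1 << (d - 1) for d in dots)
        cells.append(chr(0x2800 + offset))
    return "".join(cells)

def actuate(text: str, set_pin):
    """Drive a hypothetical 6-pin actuated cell, one character at a time.
    `set_pin(dot, raised)` stands in for whatever the real hardware needs."""
    for ch in text.lower():
        dots = BRAILLE_DOTS.get(ch, set())
        for dot in range(1, 7):
            set_pin(dot, dot in dots)

print(to_unicode_braille("exit"))  # -> ⠑⠭⠊⠞
```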

I recently searched the web for the kind of commercial work being done with braille and how products integrate tactile information into the design of everyday objects. What I found were mainly braille exit signs: signs that provide iconic information accompanied by two forms of text, visual and tactile. It is inspiring to see companies exploring braille in combination with emergency signage. One such company, for instance, specializes in exit and emergency lighting. I am curious to know how these systems translate for a visually impaired person. I am always intrigued when I go to Harvard and play with the tactile classroom signs around campus; they are both informational and pleasurable.

[Image: fire exit sign with braille text]

Wouldn't it be neat if this kind of electrical emergency light could also convey sonic and tactile information? That could be a next step for companies that design signs for navigation and emergency situations. Research is already exploring haptic information in garments as a means to convey meaning, comfort, and entertainment; commercial work could draw on that kind of exploration and go beyond static tactile text for navigation.