Archive for the ‘research’ Category

Making the Invisible Visible in Video

Thursday, June 21st, 2012

MIT researchers — graduate student Michael Rubinstein, recent alumni Hao-Yu Wu ‘12, MNG ‘12 and Eugene Shih SM ‘01, PhD ‘10, and professors William Freeman, Fredo Durand and John Guttag — will present new software at this summer’s Siggraph, the premier computer-graphics conference, that amplifies variations in successive frames of video that are imperceptible to the naked eye.
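The core idea is amplifying tiny temporal changes between frames. A minimal sketch of that idea, assuming grayscale frames and a simple Fourier-domain bandpass (the actual system uses spatial pyramids and more careful filtering; the parameter values here are illustrative assumptions):

```python
import numpy as np

def amplify_temporal_variation(frames, alpha=50.0, lo=0.8, hi=1.0, fps=30.0):
    """Amplify subtle temporal intensity changes in a video.

    frames: float array of shape (T, H, W), grayscale.
    A Fourier bandpass keeps only temporal frequencies in [lo, hi] Hz;
    the filtered signal is scaled by alpha and added back to the input.
    """
    T = frames.shape[0]
    freqs = np.fft.fftfreq(T, d=1.0 / fps)          # per-frame frequencies (Hz)
    spectrum = np.fft.fft(frames, axis=0)           # FFT along the time axis
    keep = (np.abs(freqs) >= lo) & (np.abs(freqs) <= hi)
    spectrum[~keep] = 0                             # zero everything outside the band
    bandpassed = np.real(np.fft.ifft(spectrum, axis=0))
    return frames + alpha * bandpassed
```

Fed a video with a barely visible 0.9 Hz flicker, this returns frames in which that flicker is roughly fifty times larger, which is the essence of making the invisible visible.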

See the researchers’ full video and learn more on the project’s webpage:

The next step after Clocky, Catapy!

Wednesday, November 16th, 2011

Go Catapy, go!

Catapy from Yuichiro Katsumoto on Vimeo.

At UIST this Monday: Scopemate, a robotic microscope!

Monday, October 17th, 2011

I am at UIST this Monday to present one of the projects I have worked on with my mentor Paul Dietz since I joined the Microsoft Applied Sciences Group. It is a quick but effective solution for anyone who likes to solder small components!

Scopemate is a robotic microscope that tracks the user for inspection microscopy. In this video, we propose a new interaction mechanism for inspection microscopy: a novel input device that combines an optically augmented webcam with a head tracker. The head tracker controls the inspection angle of a webcam fitted with appropriate microscope optics. This gives the operator full use of their hands while intuitively looking at the work area from different perspectives. This work was done by researchers Cati Boulanger and Paul Dietz in the Applied Sciences Group at Microsoft and will be presented at UIST 2011 this Monday as both a demo and a poster!
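The head-to-angle coupling at the heart of such a system can be sketched as a clamped linear map from tracked head offset to a tilt-servo angle. The ranges and units below are illustrative assumptions, not Scopemate's actual calibration:

```python
def head_to_servo_angle(head_x, x_min=-0.3, x_max=0.3,
                        servo_min=-30.0, servo_max=30.0):
    """Map a tracked head offset (hypothetical units, e.g. meters from
    center) to a microscope tilt-servo angle in degrees.

    The offset is clamped to [x_min, x_max], then mapped linearly onto
    [servo_min, servo_max], so leaning further than the tracker's range
    simply pins the view at the servo's limit.
    """
    x = max(x_min, min(x_max, head_x))              # clamp to tracked range
    t = (x - x_min) / (x_max - x_min)               # normalize to [0, 1]
    return servo_min + t * (servo_max - servo_min)  # rescale to servo range
```

In a real system this function would run inside the tracking loop, feeding each new head position to the servo so the inspection angle follows the operator's lean.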


The affective intelligent driving agent!

Monday, January 11th, 2010

AIDA is part of the Sociable Car - Senseable Cities project which is a collaboration between the Personal Robots Group at the MIT Media Lab and the Senseable Cities Group at MIT. The AIDA robot was designed and built by the Personal Robots Group, while the Senseable Cities Group is working on intelligent navigation algorithms.


One of the aims of the project is to expand the relationship between the car and the driver, with the goal of making the driving experience more effective, safer, and more enjoyable. As part of this expanded relationship, the researchers plan to introduce a new channel of communication between automobile and driver/passengers. This channel would be modeled on fundamental aspects of human social interaction, including the ability to express and perceive affective/emotional state and key social behaviors.

In pursuit of these aims they have developed the Affective Intelligent Driving Agent (AIDA), a novel in-car interface capable of communicating with the car's occupants using both physical movement and a high-resolution display. This interface is a research platform that can be used to evaluate various topics in the area of social human-automobile interaction. Ultimately, the research conducted using the AIDA platform should lead to the development of new kinds of automobile interfaces, and an evolution in the relationship between car and driver.

The AIDA research platform currently consists of a fully functional robotic prototype embedded in a stand-alone automobile dash. The robot has a video camera for face and emotion recognition, touch sensing, and an embedded laser projector inside its head. A driving simulator is being developed around the AIDA research platform in order to explore this new field of social human-automobile interaction. The researchers' intention is that a future version of the robot, based on the current research, will be installed in a functioning test vehicle.

The robot is super cute, though I wonder how distracting it might be. Maybe it should be installed in the back with the kids as a babysitter; they would have a blast with it! Don't miss this video!

Gesture Objects: movie making at the extension of natural play

Thursday, November 19th, 2009

I passed my PhD critique successfully! My committee: Hiroshi Ishii, Edith Ackermann, and Cynthia Breazeal. I will now focus on a few more studies and build a few more projects, as much as I can before graduating (in 9 months). A little bit on my presentation …


Gesture Objects: Play it by Eye - Frame it by Hand!

I started with my master's thesis, Dolltalk, where I establish the ability to access perspective as part of gesture analysis built into new play environments. I then move into a significant transition phase, where I research the cross-modal interface elements that contribute to various perspective-taking behaviors. I also present new technologies I implemented to conduct automatic film assembly.


The structure of my presentation

At each step, I present the studies that allow me to establish principles which I use to build the final project, the centerpiece of my third phase of research, Picture This. At its final point, Picture This is a fluid interface, with seamless integration of gesture, object, audio and video interaction in open-ended play.


With Picture This! children make a movie from their toys' views, using their natural gestures with toys to animate the characters and command the video-making assembly. I developed a filtering algorithm for gesture recognition through which angles of motion are detected and interpreted!

Finally, I developed a framework that I call “gesture objects” synthesizing the research as it relates to the field of tangible user interfaces.


Gesture Objects Framework: In a gesture object interface, the interface recognizes gestures while the user is holding objects, and the gesture control of those objects in physical space influences the digital world.

A .pdf of my slides!

Play-it-by-eye! Collect movies and improvise perspectives with tangible video objects

Monday, June 22nd, 2009

My journal paper Play-it-by-eye! Collect movies and improvise perspectives with tangible video objects has now been published by Cambridge University Press!

The paper in .pdf ->here<-

We present an alternative video-making framework for children with tools that integrate video capture with movie production. We propose different forms of interaction with physical artifacts to capture storytelling. Play interactions as input to video editing systems assuage the interface complexities of film construction in commercial software. We aim to motivate young users in telling their stories, extracting meaning from their experiences by capturing supporting video to accompany their stories, and driving reflection on the outcomes of their movies. We report on our design process over the course of four research projects that span from a graphical user interface to a physical instantiation of video. We interface the digital and physical realms using tangible metaphors for digital data, providing a spontaneous and collaborative approach to video composition. We evaluate our systems during observations with 4- to 14-year-old users and analyze their different approaches to capturing, collecting, editing, and performing visual and sound clips.

Do it yourself tangible systems!

Wednesday, April 15th, 2009

This is MIT Media Lab's open house, where we show our latest demos, ideas, and research. Adam Kumpf impressed all of us with the Trackmate initiative, an open-source system he designed to create an inexpensive, do-it-yourself tangible tracking system.

For over 20 years researchers have been looking at ways to go beyond the mouse and keyboard to interact with computers. One of the most promising areas has been tangible user interfaces: physical objects directly coupled with digital information. These new interfaces have typically required expensive technologies and complex installation procedures, limiting them to the context of specialized research labs and museums.

Trackmate is an open source initiative to create an inexpensive, do-it-yourself tangible tracking system. The Trackmate Tracker allows any computer to recognize tagged objects and their corresponding position, rotation, and color information when placed on a surface. Trackmate sends all object data via LusidOSC (a protocol layer for unique spatial input devices), allowing any LusidOSC-based application to work with the system.

Adam designed a special barcode system that allows an object to be detected even when rotated. It is pretty neat, as it not only distinguishes between objects (280 trillion unique IDs are possible) but also identifies their rotation.

This opens a world of applications, and my next project will make use of this brilliant technology. The project is open source and its components can be downloaded here.


Trackmate :: 5 ways to get started from adam kumpf on Vimeo.

Posted by Cati Vaucelle @ Architectradure

[tags] trackmate, DIY, Adam Kumpf, MIT, Media Lab, Tangible Interfaces, TUI, barcode, open source [/tags]

A Design for Extimacy and Fantasy-Fulfillment for the World of Warcraft Addict

Wednesday, February 25th, 2009

On April 13th, 2009, I will give a talk and participate in a panel organized at the MIT Museum, based on my idea of designing a WoW Pod for addicted players! It is a design we will have finished this week (more pictures soon…). It should be fun, and there will be quite a crowd!

On the WOW Pod: A Design for Extimacy and Fantasy-Fulfillment for the World of Warcraft Addict.

A panel discussion about the inducement of pleasure, fantasy fulfillment, and the mediation of intimacy in a socially-networked gaming paradigm such as World of Warcraft (WOW). Participants include Raimundas Malauskas (curator, Artists Space, NY); Visiting Scientist Jean Baptiste LaBrune (Media Lab); Laura Knott (Associate Curator, MIT Museum); MIT Gambit Lab researcher (TBA), and co-artists Marisa Jahn, Steve Shada, and Cati Vaucelle.

The WoW pod at Mixer in New York!!!

Monday, February 23rd, 2009

After receiving three grants: from the Council for the Arts at MIT, SHASS’s Peter de Florez Fund for Humor and from Eyebeam, the WoW Pod will be exhibited during the MIXER event in New York!

Cati Vaucelle, Steve Shada, and Marisa Jahn's WoW Pod is an immersive architectural space that provides and anticipates all the life needs of the World of Warcraft player. Outfitted with a toilet throne, a hydration system, and meals at the ready, the WoW Pod makes daily human functions possible without ever stepping away from the game. In addition, these tasty meals are cooked on a cookset that connects a hotplate to the computer, allowing the player to let their World of Warcraft avatar know when a meal is ready to eat.


The official call!
Eyebeam presents an alternate “World’s Fair” with airborne surveillance balloons, guerilla media towers, and computerized prayer booths. A temporary village occupied by a dozen creatively engineered pavilions, performances, and DJ sets by Tim Sweeney and Juan Maclean.

Friday, March 6 & Saturday, March 7, 2009
9PM - 2AM
Tickets: $15 per night in advance; $30 for both nights in advance; $20 per night at the door.
Eyebeam 540 W. 21st St. (btw 10th and 11th Aves.)
Limited press passes available: RSVP:
Installations will remain on view at Eyebeam, Saturday, March 7, Noon - 6PM, with free entry.

New York City, February 20, 2009 - MIXER, Eyebeam’s quarterly event series dedicated to showcasing leading artists in the fields of live audiovisual performance, interactive and participatory art, will present its fifth iteration on Friday, March 6 - Saturday, March 7, 2009. Using the World’s Fair as the framework, Eyebeam will transform its rugged warehouse space into a temporary village of utopian pavilions for a two-night extravaganza called MIXER: EXPO.

Both evenings will include musical guests: Tim Sweeney (Friday, Midnight - 2AM) and Juan Maclean (Saturday, Midnight - 2AM); multimedia pavilions by Angela Co + Aeolab, Anakin Koenig, Chris Jordan, and Caspar Stracke, and Not An Alternative; interactive installations by Taeyoon Choi and Cheon pyo Lee, The Institute for Faith-Based Technology, Mark Shepard, Cati Vaucelle, Steve Shada, and Marisa Jahn; and fashion performances by Di Mainstone.

MIXER: EXPO - Background

From London in 1851 to Chicago in 1893 and New York in 1939, the World’s Fair has been an influential cultural spectacle that promised a utopian “world of tomorrow” while packaging and promoting the national and corporate agendas of the day.

MIXER: EXPO is an alternate take on “World’s Fair” expositions, a faded cultural phenomenon that set the tone for urban planning in the 19th and 20th centuries. The World’s Fair also championed the philosophy of better living through technology, presenting innovative strategies that continue to resonate through contemporary life and leisure - from shopping malls and theme parks to natural history and science museums; broadcast media and exhibit display to sell consumer products, technological innovations, and nationalistic ideologies.

Like the best science fiction and social satire, MIXER: EXPO constructs a fictitious place in order to examine a world that might have been, that has come to be, or that might be on the horizon.

Musical Acts
Friday, Midnight - 2AM
Tim Sweeney (Beats in Space) is a respected international club DJ, remixer, and host of Beats In Space, a weekly radio show mixed live every Tuesday night on WNYU. Sweeney rocks the party with a mix of electro, disco and No Wave.

Saturday, Midnight - 2AM
Juan Maclean (DFA Records) first garnered attention in the early 90s as the guitarist/keyboardist for electro-punk band Six Finger Satellite, but has gone on to wider acclaim in the last decade as a solo artist on DFA Records (founded by James Murphy of LCD Soundsystem). Maclean’s recordings combine his multi-instrumental virtuosity with tight beat production inspired by house, techno, and funk classics. His DJ sets dig deeply into the same vault of musical riches.

Installations / Participating Artists
Taeyoon Choi and Cheon pyo Lee's sculptural installation and performance, Grey Belt, tells the story of an undiscovered nation located in a demilitarized zone. The land of Grey Zone is the world's purest natural site, secretly inhabited by mutant animals, abandoned war machines and the exiled living in a zero-gravity landscape.

Angela Co + Aeolab’s Weather Making Balloon utilizes NASA materials technology for its own “Space Mission”. The metalized thermoplastic skin of the Balloon functions as a mirrored surface through which attendees can be monitored and captured on film. Playful interaction with the responsive surface of the puffy, cloud-like Balloon masks its primary function as a surveillance tool.

The Institute of Faith-Based Technology, or InFaBat™, was founded in 2006 by techno-theologists Aaron Meyers and Jeff Crouse to bring religion into the digital age. Praying@Home is the name of a suite of technologies developed by InFaBat™ and installed for use at Eyebeam, which is designed to broadcast a worshipper’s “Prayer Signature” directly to God. Unlike humans, who need to take breaks from praying to fulfill biological needs, computers need no breaks, resulting in 24/7 prayer output. Praying@Home represents a revolutionary breakthrough in the field of Digital Prayer Technology.

Media artists Anakin Koenig, Chris Jordan, and Caspar Stracke pay tribute to the "retro-futurist" utopian dwellings of the 20th Century with TripleFlow, a large-scale inflatable architectural structure. Referencing Buckminster Fuller's geodesic dome, the three-chamber biomorphic dwelling creates a fluid, immersive experience through responsive lighting, and live audio and video performance by Jordan and Stracke.

The nomadic citizen of the world is never lost because she is always at home. Di Mainstone's SHAREWEAR questions this utopian ideal through a performance that incorporates a set of modular dresses exploring our desire for a connection to "home" in an increasingly transient world. Referencing familiar icons of the home, such as the armrest on our favorite sofa, SHAREWEAR comprises a pair of modular electronic dresses housed in crates that are unpacked, assembled on each performer's body, and then physically slotted to one another, unleashing the potential for intimate interactions.

The Subsumption Machine by activist collective Not An Alternative is a skeletal multi-level media tower hacked with video projections, TV monitors, billboards, stage sets, live video feeds, and surveillance cameras. As the audience walks through the chaotic architectural structure, they are captured on camera and unwittingly inserted into the media stream. The Subsumption Machine represents the postmodern dystopian world as a biopolitical “prison house of language”, and in a Warholian gesture, flattens all images into a non-hierarchical supersaturated mix.

Hertzian Rain is a wireless audio broadcast system designed by Mark Shepard that responds to bodily movement. Just as land and water are limited resources, Hertzian Rain demonstrates the limits of the electromagnetic spectrum. Wearing wireless headphones and carrying an umbrella covered with electromagnetic-shielding fabric, users walk around the exhibition space tuning into an audio broadcast of a live music performance while creating interference in the broadcast signal with the umbrella, as a result destroying the shared resource. Live performances will be provided by Doug Barret, Craig Shepard, Daniel Perlin, Al Laufeld and others.

Cati Vaucelle, Steve Shada, and Marisa Jahn's WoW Pod is an immersive architectural space that provides and anticipates all the life needs of the World of Warcraft player. Outfitted with a toilet throne, a hydration system, and meals at the ready, the WoW Pod makes daily human functions possible without ever stepping away from the game. In addition, these tasty meals are cooked on a cookset that connects a hotplate to the computer, allowing the player to let their World of Warcraft avatar know when a meal is ready to eat.

Tickets: $15 per night in advance; $30 for both nights in advance; $20 per night at the door. For more info and to purchase tickets visit


Founded in 1997, Eyebeam is an art and technology center that provides a fertile context and state-of-the-art tools for digital experimentation. It is a lively incubator of creativity and thought, where artists and technologists actively engage with the larger culture, addressing the issues and concerns of our time. Eyebeam challenges convention, celebrates the hack, educates the next generation, encourages collaboration, freely offers its output to the community, and invites the public to share in a spirit of openness: open source, open content and open distribution.

More info

Posted by Cati Vaucelle @ Architectradure

[tags] art, interaction-design, world of warcraft, technology, installation [/tags]

Papers at CHI 2009!

Tuesday, February 3rd, 2009

Yeah!! The two papers I wrote for CHI 2009 were accepted this year! One paper is entitled Design of Haptic Interfaces for Therapy; the second, a work in progress, is called Cost-effective Wearable Sensor to Detect EMF.

Design of Haptic Interfaces for Therapy
Touch is fundamental to our emotional well-being. Medical science is starting to understand and develop touch-based therapies for autism spectrum, mood, anxiety, and borderline disorders. Based on the most promising touch therapy protocols, we present the first devices that simulate touch through haptics to bring relief and assist clinical therapy for mental health. We present several haptic systems that facilitate collaboration between patients and doctors and potentially pave the way for a new form of non-invasive treatment that could be adapted from use in care-giving facilities to public use. We developed these prototypes working closely with a team of mental health professionals.

Download the .pdf ->here<-


Cost-effective Wearable Sensor to Detect EMF
This second paper is a work in progress, based on a circuit design that I did for Joe Paradiso's class (he is a co-author). While many designers have explored wearable EMF displays, I implemented a low-cost electric field sensor, with the aim of democratizing EMF reading.

Download the .pdf ->here<-


CHI will be in Boston this year, so that means lots of visits, parties, and hanging out with old friends!

Posted by Cati Vaucelle @ Architectradure