Category: interaction design

  • 22 Jun: Play-it-by-eye! Collect movies and improvise perspectives with tangible video objects

    My journal paper Play-it-by-eye! Collect movies and improvise perspectives with tangible video objects has now been published by Cambridge University Press!

    The paper in .pdf is available here.

    We present an alternative video-making framework for children with tools that integrate video capture with movie production. We propose different forms of interaction with physical artifacts to capture storytelling. Play interactions as input to video editing systems assuage the interface complexities of film construction in commercial software. We aim to motivate young users in telling their stories, extracting meaning from their experiences by capturing supporting video to accompany their stories, and driving reflection on the outcomes of their movies. We report on our design process over the course of four research projects that span from a graphical user interface to a physical instantiation of video. We interface the digital and physical realms using tangible metaphors for digital data, providing a spontaneous and collaborative approach to video composition. We evaluate our systems during observations with 4- to 14-year-old users and analyze their different approaches to capturing, collecting, editing, and performing visual and sound clips.


  • 01 Nov: Cup communicator

    If you’re new here, you may want to subscribe to my RSS feed to receive the latest Architectradure’s articles in your reader or via email. Thanks for visiting!

    Cup communicator by Duncan Wilson. Tug the cord to activate, squeeze to talk, and hold it to the mouth and ear.

    The design of the Cup Communicator focuses on the gesture of use and the relationship between user and object. I aim to explore the potential of the product as a medium for interaction and to reassess the way we use technology.

    The form and function of the Cup Communicator refer to the ‘two cans and a string’ children’s toy and the physical factors involved with that device. This typology and its associations remind us of the magic and playfulness of our first communication devices.


  • 19 Nov: Gesture Objects: movie making at the extension of natural play

    I passed my PhD critique successfully! My committee: Hiroshi Ishii, Edith Ackermann, and Cynthia Breazeal. I will now focus on a few more studies and on building as many more projects as I can before graduating (in 9 months). A little bit on my presentation …

    Gesture Objects: Play it by Eye – Frame it by Hand!

    I started with my master’s thesis, Dolltalk, where I established the ability to access perspective as part of gesture analysis built into new play environments. I then moved into a significant transition phase, where I researched the cross-modal interface elements that contribute to various perspective-taking behaviors. I also presented new technologies I implemented to conduct automatic film assembly.

    The structure of my presentation

    At each step, I present the studies that allowed me to establish the principles I used to build the final project, the centerpiece of my third phase of research: Picture This. In its final form, Picture This is a fluid interface, with seamless integration of gesture, object, audio, and video interaction in open-ended play.

    With Picture This! children make a movie from their toys’ views, using their natural gestures with the toys to animate the characters and drive the video-making assembly. I developed a filtering algorithm for gesture recognition, through which angles of motion are detected and interpreted!
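    The post does not include the algorithm itself, but the idea of filtering motion samples and interpreting their angles can be sketched roughly as follows. This is only an illustration: the function names, smoothing window, and angle thresholds are all hypothetical, not the actual Picture This implementation.

```python
import math

def moving_average(samples, window=5):
    """Simple low-pass filter: smooth noisy (x, y) motion samples."""
    smoothed = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        xs = sum(p[0] for p in chunk) / len(chunk)
        ys = sum(p[1] for p in chunk) / len(chunk)
        smoothed.append((xs, ys))
    return smoothed

def motion_angle(p0, p1):
    """Angle of motion between two smoothed positions, in degrees."""
    return math.degrees(math.atan2(p1[1] - p0[1], p1[0] - p0[0]))

def classify(angle):
    """Map a motion angle to a coarse gesture label (thresholds hypothetical)."""
    if -45 <= angle <= 45:
        return "sweep-right"
    if 45 < angle <= 135:
        return "lift"
    if -135 <= angle < -45:
        return "lower"
    return "sweep-left"

# A noisy, roughly horizontal toy trajectory:
path = [(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0)]
smoothed = moving_average(path)
angle = motion_angle(smoothed[0], smoothed[-1])
print(classify(angle))  # prints sweep-right
```

    A system like this would then map each coarse gesture label to a video-editing command (cut, pan, record), which is the spirit of using play gestures as input to film assembly.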

    Finally, I developed a framework that I call “gesture objects” synthesizing the research as it relates to the field of tangible user interfaces.

    Gesture Objects Framework: in a gesture object interface, the system recognizes gestures while the user is holding objects, and the gestural control of those objects in physical space influences the digital world.

    A .pdf of my slides!

  • 11 Jan: The affective intelligent driving agent!

    AIDA is part of the Sociable Car – Senseable Cities project which is a collaboration between the Personal Robots Group at the MIT Media Lab and the Senseable Cities Group at MIT. The AIDA robot was designed and built by the Personal Robots Group, while the Senseable Cities Group is working on intelligent navigation algorithms.

    One of the aims of the project is to expand the relationship between the car and the driver, with the goal of making the driving experience more effective, safer, and more enjoyable. As part of this expanded relationship, the researchers plan to introduce a new channel of communication between automobile and driver/passengers. This channel would be modeled on fundamental aspects of human social interaction, including the ability to express and perceive affective/emotional states and key social behaviors.

    In pursuit of these aims, they have developed the Affective Intelligent Driving Agent (AIDA), a novel in-car interface capable of communicating with the car’s occupants using both physical movement and a high-resolution display. This interface is a research platform that can be used as a tool for evaluating various topics in the area of social human-automobile interaction. Ultimately, the research conducted using the AIDA platform should lead to the development of new kinds of automobile interfaces, and to an evolution in the relationship between car and driver.

    Currently, the AIDA research platform consists of a fully functional robotic prototype embedded in a stand-alone automobile dash. The robot has a video camera for face and emotion recognition, touch sensing, and an embedded laser projector inside its head. A driving simulator is now being developed around the AIDA research platform in order to explore this new field of social human-automobile interaction. The researchers’ intention is that a future version of the robot, based on the current research, will be installed in a functioning test vehicle.

    The robot is super cute. I do wonder whether it might be distracting; maybe it should be installed in the back with the kids as a baby sitter. Kids would have a blast with it! Don’t miss this video!


  • 11 Dec: Video scenarios of tech-enabled behaviors

    Mobility platform videos for Intel by IDEO.

    Visualizing high-tech’s human-centered future, by IDEO.

    Tapping into the growing broadband access around the world, Intel has innovated a series of chipsets and mobile platforms that enable smarter, more efficient laptops, mobile phones, and PDAs. While these offerings represent new possibilities for original equipment manufacturers (OEMs), it’s not uncommon for OEMs to need an added push to invest in unproven technology. To bring to life the potential of this technology, IDEO worked with Intel to visualize what user behaviors might be enabled by the offerings. The work, which culminated in the form of three videos, points to Intel’s larger shift toward human-centered technology innovation.

    In looking at Intel’s next-generation products, IDEO had a clear sense of the emphasis on mobility. From a design perspective, the offerings were each exciting, offering new ways for people to live and work, but lacked the cohesion of a system. To integrate the platforms, IDEO developed user scenarios that merged product, interaction, and experience as they related to such behaviors as hands-free communication, social networking, and purchasing. These user scenarios were then fully storyboarded and scripted for video production.

    In depicting user behaviors through video scenarios, IDEO created a unified vision for Intel’s offerings that has served to foster alignment within Intel as well as to communicate product value to OEMs and service providers.

    By Architectradure


  • 19 Mar: Your umbrella is your better sword!

    As a child, I empowered my toys with all kinds of wills of their own. We have all, at some point, stood up on a chair pretending it was a boat crossing Niagara Falls. Well, Yuichiro Katsumoto did it. He creates objects that, with sound, become anything you want. By combining common household commodities with computers, Yuichiro Katsumoto works on ubiquitous computing. He has created a set of daily objects that give our everyday lives a whimsical spin.



    He created Amagatana, an umbrella for enjoying a blissful walk after the rain.

    Amagatana is a mystical sword for enjoying the blithe feeling after the rain. When you swing Amagatana, you hear the sound of swords clashing in your headphones. Amagatana seems to be just a plastic umbrella, and you seem to be just a cheerful person playing with it. However, in your hand the umbrella exists beautifully as a “sword”. On your way home, Amagatana offers you a world of make-believe, letting you feel like the heroes of comics, cartoons, and video games. It is your own pleasure, which nobody can notice.
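    Katsumoto has not published how Amagatana works internally, but the core behavior described, playing a clash sound when the umbrella is swung, could plausibly be driven by a threshold on acceleration magnitude. The sketch below is purely my assumption: the threshold value, function name, and sample readings are hypothetical.

```python
GRAVITY = 9.81          # m/s^2, baseline reading when the umbrella is at rest
SWING_THRESHOLD = 25.0  # hypothetical: acceleration magnitude that counts as a swing

def detect_swings(magnitudes, threshold=SWING_THRESHOLD):
    """Return indices where acceleration magnitude crosses the threshold upward,
    i.e. the moments a sword-clash sound would be triggered."""
    swings = []
    above = False
    for i, m in enumerate(magnitudes):
        if m >= threshold and not above:
            swings.append(i)  # rising edge: play the clash sample here
        above = m >= threshold
    return swings

# Simulated accelerometer magnitudes: two distinct swings amid resting readings.
readings = [9.8, 10.2, 31.0, 40.5, 12.0, 9.9, 28.3, 9.8]
print(detect_swings(readings))  # prints [2, 6]
```

    Detecting only the rising edge (rather than every sample above the threshold) keeps one swing from triggering a burst of overlapping clash sounds.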

    He also created Fula, a muffler for warming the user’s body and soul on a cold winter’s day. Ordinary mufflers protect our body against the cold by blocking the cold air. Fula, however, warms the user by encouraging him to move physically, fluttering its fabric in a superhero-like manner in response to the user’s muscular contractions. By acting in accordance with the flutter, the user can warm his body, and, by seeing the reflection of his heroic self in store windows, his soul as well. Fula can also be used in conjunction with Amagatana, so that the two interact together.

    I found this awesome video on YouTube. You gotta watch it, it is too funny.

    Posted by Cati Vaucelle @ Architectradure



  • 26 Jun: Hungry stuffed animals

    The Hungries by Dana Gordon is a family of plush monsters with an impressive appetite for RFID. Each Hungry has a different personality, expressed through its own unique voice. It listens to secrets whispered into its ear; when its arm is pulled, it plays them back in its own voice. When the Hungries are fed to one another, their voices are mixed in a burp-like way.
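    As a rough sketch of the voice-mixing behavior described above (the sample format, gain, and mixing rule are my assumptions, not Dana Gordon’s implementation), two recorded clips could be blended like this:

```python
def mix_voices(clip_a, clip_b, gain=0.5):
    """Mix two mono clips (lists of samples in [-1.0, 1.0]) into one.
    The shorter clip is padded with silence; output is clamped to range."""
    n = max(len(clip_a), len(clip_b))
    a = clip_a + [0.0] * (n - len(clip_a))
    b = clip_b + [0.0] * (n - len(clip_b))
    mixed = []
    for sa, sb in zip(a, b):
        s = gain * sa + gain * sb
        mixed.append(max(-1.0, min(1.0, s)))  # clamp to avoid distortion
    return mixed

# One Hungry's whispered secret blended with another's voice:
secret = [0.5, 1.0]
other = [0.25]
print(mix_voices(secret, other))  # prints [0.375, 0.5]
```

    In the toy itself, a mix like this would presumably be triggered when one monster’s RFID tag is read by the other, then pitch-shifted or filtered to give the burp-like character.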


  • 30 Jul: The SynchroMate


    The SynchroMate fits snugly in the palm of one’s hand (…) it encourages serendipitous synchronous interaction by indicating when a message is being composed for you by a distant companion, through gentle vibrations and pulsing concentric circles of lush colors on the display.

    SynchroMate: A Phatic Technology for Mediating Intimacy, by Martin R. Gibbs, Steve Howard, Frank Vetere, Marcus Bunyan (2006)

    Abstract

    By and large interaction design has been concerned with information exchange – technologies for the collection, processing and transmission of informational content. This design sketch discusses preliminary ideas about an alternative way to think about interactive technologies – phatic technologies – that are less concerned with capturing and communicating information and more about the establishment and maintenance of social connection. Drawing on insights and inspiration gleaned from a recent field-based study of the role of interactive technologies within intimate relationships we outline our preliminary ideas concerning technologies to support phatic interaction. Using materials collected during our fieldwork as design inspirations, we developed design sketches for phatic technologies intended to support playful connection between intimates. One of these sketches – SynchroMate – is presented. SynchroMate is a phatic technology designed to mediate intimacy by affording serendipitous synchronous exchanges.

    Full case study



  • 31 Jul: The next UI breakthrough, physicality?


    The Design of Future Things by Donald A. Norman

    A discussion by Donald A. Norman of the passage from command-line and graphical user interfaces to tangible user interfaces, in particular motion-based interfaces.

    Introduction

    In my previous column I discussed the reemergence of command line language. Once these were the ways we used our operating systems and applications. Now they are reemerging within search engines. They are hidden and not easy to learn about, but I expect them to grow in power and, over time, become the dominant means of interaction.

    In this column I will talk about a second trend, one that also has much earlier origins: the return to physical controls and devices. In the theoretical fields that underlie our field, this is called embodiment: See Paul Dourish’s book, Where the Action Is. But the trend is far more extensive than is covered by research on tangible objects, and somewhat different from the philosophical foundations implied by embodiment, so I use the term “physicality.”

    Physicality: the return to physical devices, where we control things by physical body movement, by turning, moving, and manipulating appropriate mechanical devices.

    Reference

    Norman, D. A. 2007. The next UI breakthrough, part 2: physicality. interactions 14, 4 (Jul. 2007), 46-47.

    Column on Command Line Interfaces available online

    Full paper available at the ACM digital library