Friday, September 9, 2011

Tangible Bubbles TUI in relation to TUI framework

Tangible Bubbles allows children to record their voices by speaking into a hollow container such as the balloon or accordion prototype. The design is light and manageable for small children, so it acts as a toy. The Tangible Bubble, which now encapsulates the child's recorded voice, replays the sound and distorts it, changing pitch and speed in response to intuitive manipulation like squeezing. Further, the "bubbles" of their recordings can be released onto a "window." There the encapsulated bubble messages can be dragged to "Grandma's Door." This action results in the voice recording being emailed to "Grandma!" Alternatively, the window can serve as a two-way communication system if the other user is "online."

This concept of making sound tangible and movable is similar to the Marble Answering Machine case study in The TAC Paradigm: Specifying Tangible User Interfaces (Ryokai, Raffle, Brooks). The answering machine saved encoded voice mails onto marbles that could be manipulated in two ways: a message marble could be placed in an indentation on a playback machine to hear the message, or in an indentation on an augmented phone to call the sender back. Both examples turn intangible sound into recordings that are physically represented, by the balloon or accordion and by the marble, and both make the sound malleable.

Both examples follow the guidelines of a Token and Constraint (TAC) structure. In the marble message example, it is clear that the marble is a pyfo: it acts as the token that is constrained by the playback machine and the augmented phone. For Tangible Bubbles, the TAC element comes in two forms. First, the child's face is the trigger that starts recording when pressed against the constraint, the opening of the container. Second, the bubbles act as tokens that must be directed to specific mail recipients and can only be manipulated within the physical constraints of the balloon or accordion.
Another element of TUIs is the input, sense, and output pattern. In A taxonomy for and analysis of tangible interfaces, Fishkin highlights the "Sketchpad," a small computer display that clears when shaken, as an example. Here the shaking movement is like our voice recording: both are inputs, physical events caused by the user. The way the object registers that event is the sensing component, and the reactions, the screen erasing and the sound being distorted, are the output feedback.
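To make that pattern concrete, here is a toy sketch of the input → sense → output loop in Python. All the names and the specific mappings are hypothetical, invented just to illustrate the pattern; neither the Sketchpad nor Tangible Bubbles actually works this way in code.

```python
# Toy sketch of the TUI input -> sense -> output pattern.
# The event names and feedback strings are made up for illustration.

def sense(event):
    """Sensing: classify the raw physical input into an action."""
    if event == "shake":       # like shaking the Sketchpad
        return "clear_display"
    if event == "squeeze":     # like squeezing the Tangible Bubble
        return "distort_sound"
    return None

def output(action):
    """Output: the feedback the user perceives in response."""
    if action == "clear_display":
        return "display erased"
    if action == "distort_sound":
        return "recording played back at altered pitch"
    return "no feedback"

# Input: physical events caused by the user.
for user_input in ["shake", "squeeze"]:
    print(output(sense(user_input)))
```

Each physical event flows through the same three stages, which is what makes the pattern a useful lens for comparing otherwise very different devices.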

The Tangible Bubble also clearly uses Fishkin's concept of the metaphor. The whole idea of a TUI, taking a virtually represented concept and physically embodying it in something real, needs the metaphor. It is what instills the intuitive understanding and the links between the virtual and the tangible. This example depends on how we visualize words and speech. Remember how in comic strips the speech of the characters is represented by bubbles? In this way the Tangible Bubble becomes like a bubble emitted from the child's mouth that is eventually transported onto the whiteboard and sent off endearingly to a relative!

In summary, our sound bubble is our token, "coupled," an attribute that the TAC paradigm points out, with the sound of the child's voice. This coupling is enabled by the metaphor, the intuitive understanding that the bubble represents the sound recording. Then, after our input has been sensed and the sound has been recorded, the bubble is manipulated within the constraints provided by the embodied object: the accordion or balloon is compressed to raise or lower the pitch and speed. Finally, the exported result is the altered token, for example a new recording that is lower in pitch and slower.
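The squeeze-to-distortion step could be sketched as a simple mapping from compression to playback parameters. This is only a minimal illustration under assumed numbers; the function name, the 0-to-1 squeeze scale, and the halving factors are all my own invention, not part of the actual prototype.

```python
# Hypothetical mapping from how far the balloon/accordion is compressed
# (0.0 = untouched, 1.0 = fully squeezed) to playback pitch and speed.
# Here, more compression lowers the pitch and slows the playback,
# matching the "lower in pitch and slower" example in the summary.

def playback_parameters(squeeze):
    squeeze = max(0.0, min(1.0, squeeze))  # clamp to the valid range
    pitch_factor = 1.0 - 0.5 * squeeze     # down to half pitch when fully squeezed
    speed_factor = 1.0 - 0.5 * squeeze     # down to half speed when fully squeezed
    return pitch_factor, speed_factor

# Untouched bubble plays back unchanged; a full squeeze halves pitch and speed.
print(playback_parameters(0.0))
print(playback_parameters(1.0))
```

The point is just that the physical constraint (how far the container can compress) directly bounds how much the token (the recording) can be altered.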

That's my understanding so far of the Tangible Bubble and how it fits into TUI frameworks!

UIST: the crazy fun of Brainstorming

So... I've somehow thrown myself into a User Interface Software and Technology Student Competition that's based on Microsoft's TouchMouse!

"Get your thinking caps on and ideas flowing for the third annual UIST Student Innovation Contest (SIC). The goal of the contest is to innovate new interactions on state-of-the-art hardware. We give you the latter, and you show us what you can do.

This year we're going to be working with the brand, spanking new Microsoft TouchMouse. In addition to supplying you the hardware for free, Microsoft is providing exclusive access to a pre-release of the TouchMouse API. This lets you get at the underlying 2D capacitive image captured by the mouse’s sensor matrix; you’ll get a chance to hack together some cool demos before everyone else gets their hands on the API. You even get to keep the mouse after you're done!"

About six of us started brainstorming ideas for the mouse implementation and, oh, how crazy our ideas were! It was great! We were all random, freely spouting thoughts that ranged from semi-serious contraptions, curious musings stemming from a small inkling, and detailed, articulated ideas (albeit those were few) to quirks we just liked for fun. Brainstorming chaos aside, we were all respectfully attentive to each other's ideas, recording each one, bouncing and feeding off them, and giving each its equal due on the idea table before it got bumped over by, or morphed into, an even better idea.

I'm wondering how much of our ideas I should reveal since this is a competition! :P

But some things we thought about:

  • We made note to avoid using the mouse like a controller. This is a very common use of technical equipment, and the only selling point would be how inventive we could be with the gesture coordination.
  • We recognized our tendency to draw from already established, typical setups like controller-plus-screen and the treacherous screen-mouse-keyboard ones.
  • And so we worked to move away from the little screen by expanding its size,
  • moving the screen or mouse movement away from its typical orientation, whether horizontal or vertical,
  • and considering the output, for example, eliminating most visuals and focusing on tactile output.
  • We found that a very innovative aspect of the TouchMouse is its ability to sense gestures through material, so we wanted to optimize that quality.

Some things I learned:

  • I tend to tone down my imagination for fear of an inability to achieve. For example, I would say, "That's a great idea, but I doubt we'll be able to finish it, or it would be hard. Let's stay with something simpler." I've become a very cautious person, someone unwilling to be vulnerable and take chances on the fact that I might fail. I want to change, and to remember that even if I can't reach the moon I'd end up among the stars anyway. So, be creative; don't let fear and reality stop you. You can always prune back later.
  • Robots are fun and exciting but can be hard and time consuming.
  • I think we tend toward them because they can be automated to be akin to us.
  • Make any object an animal or fuzzy and the idea is infinitely better.
  • Turtles purr.
That's it for today! Keep posted for more hints about our project event!

Tuesday, September 6, 2011

A Step into the Tangible World... or, how to make the electronic virtual world more tangible and more able to be stepped into.

I'm a Wellesley College student originally from Texas taking Orit Shaer's Tangible User Interface class from the computer science department!

I'm quite nervous to take the class because of how much there is that I don't know.
I'm quite excited because I have so much to learn!!

Wish me luck!

I plan to post my thoughts, the new things I've learned, and what I'm struggling with!