Posts Tagged ‘Interaction’
Quintilian is a framework for hand-based gestural interaction in immersive environments. It was implemented to take advantage of the existing real-time motion capture system in the AlloSphere, focusing on the psycholinguistic and ergonomic aspects of everyday human interactions. The framework is intended to provide a basis for building a human-centric interaction space derived from the meaningful communication of gestural discourse (semiotic values) and the manipulative properties of physical interaction (ergodic values). These principles are applied to interaction tasks in immersive environments: navigation within virtual worlds and manipulation of their objects. A Device Server plugin was also implemented for the framework, making it easier for existing applications to adopt a much more natural mode of interaction than currently exists.
Quintilian was my Master’s project and the accompanying thesis can be found here: Quintilian by Ritesh Lala.
Interactive Installation @ MAT End of the Year Show 2011. (June 9, 2011)
Animus is an interactive multimedia installation inspired by quorum sensing in bacteria. It stands as a metaphor for the governing spirit of the system (a collective brain, if you will): the cell-to-cell communication in microbes that defines their gene expression and the collective behaviors that emerge from it. Microorganisms use this kind of communication constantly to gauge their population density and, on crossing a certain threshold, display behaviors that range from bioluminescence and toxin secretion to sporulation and conjugation.
Animus places the user as a controller for this system, where how they choose to interact with it defines its outcomes. The user receives continuous feedback on the threshold required to display a certain behavior (bioluminescence, in this case), which affects how they interact with it, eventually making them more a part of the system than its controller.
The installation uses a Kinect to create a sensing field in front of the projection, making direct manipulation of the system possible, unaffected by any background noise. The user needs to perform a certain gesture to identify themselves as the controller and begin affecting the system's behavior, which is reflected in the audio interface and the display projection.
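The quorum-sensing metaphor behind Animus can be sketched as a simple threshold rule: interaction raises the colony's "population density", density decays without interaction, and crossing the threshold triggers the behavior. The names and constants below are illustrative only, not taken from the installation itself.

```python
# Illustrative sketch of a quorum-sensing-style threshold rule.
# THRESHOLD and DECAY are assumed values, not from Animus.

THRESHOLD = 1.0   # density at which bioluminescence is triggered (assumed)
DECAY = 0.95      # density fades slowly when interaction stops (assumed)

def step(density, interaction_strength):
    """Advance one frame: decay the density, add new interaction,
    and report whether the threshold behavior is active."""
    density = density * DECAY + interaction_strength
    glowing = density >= THRESHOLD
    return density, glowing

# Sustained interaction pushes the system over the threshold;
# once interaction stops, the glow persists briefly, then fades.
density = 0.0
for frame, strength in enumerate([0.2, 0.3, 0.4, 0.5, 0.0, 0.0]):
    density, glowing = step(density, strength)
    print(f"frame {frame}: density={density:.2f} glow={glowing}")
```

The decay term is what makes the feedback loop interesting for a participant: the system rewards sustained engagement rather than a single gesture.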
Interactive Multimedia Project (MAT 594O : Sensors)
This project was an exploration linking ideas in Human-Computer Interaction and Computational Geometry to create an engaging audio-visual environment. The concept was to trigger a series of transformations of 3D forms (based on the Superformula proposed by Johan Gielis) and of sound (vibrato) in virtual 3D space. All actions are triggered through hand gestures, captured by an Arduino with an accelerometer: the rotation of the hand is measured on the X, Y, and Z axes, and quick up/down movements change the SuperShape. Further processing of the signal received by the system could be used to create more expressive transformations of the SuperShapes.
The user/viewer is presented with an interface (an Arduino attached to either the right or left hand) that engages the system through stimuli of sight, sound, and motion. HCI, computer graphics, and electronic music were concurrent areas of research.
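The Superformula itself is a single equation giving radius as a function of angle, r(φ) = (|cos(mφ/4)/a|^n2 + |sin(mφ/4)/b|^n3)^(−1/n1), which makes it easy to sample shapes. The sketch below shows the 2D case; how the project actually mapped accelerometer readings to these parameters is not specified here, so the sampling helper is a generic illustration rather than the project's code.

```python
import math

def superformula(phi, m, n1, n2, n3, a=1.0, b=1.0):
    """Gielis Superformula: radius as a function of the angle phi."""
    t1 = abs(math.cos(m * phi / 4.0) / a) ** n2
    t2 = abs(math.sin(m * phi / 4.0) / b) ** n3
    return (t1 + t2) ** (-1.0 / n1)

def supershape_2d(m, n1, n2, n3, steps=200):
    """Sample a closed 2D supershape as a list of (x, y) points.
    Gesture input could drive m, n1, n2, n3 to morph the shape."""
    pts = []
    for i in range(steps):
        phi = 2.0 * math.pi * i / steps
        r = superformula(phi, m, n1, n2, n3)
        pts.append((r * math.cos(phi), r * math.sin(phi)))
    return pts
```

With n1 = n2 = n3 = 2 the formula collapses to a unit circle for any m; varying m adds rotational symmetry, and varying the exponents pinches or inflates the lobes, which is what makes the parameter space attractive for continuous gestural control.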
Check out the video:
Group Collaboration Project for MAT 200C: Spring 2010
I worked with a team of six other people to create a multi-touch surface table for MAT. We used FTIR (frustrated total internal reflection) for touch detection, the TUIO protocol for tracking, and Processing for the display.
The concept of Natural User Interfaces is becoming more and more widespread, from consumer appliances to research instruments. In this domain of technology, where user experiences range from simple touch screens to fluid interfaces, multi-touch surfaces have an important role to play. For some they may promise a more integrated, interactive, and intuitive multi-user solution, while for others they may represent unwanted change. This project takes a positive approach toward adapting to this technology and investigates, to a certain extent, its mechanics. It also presents a brief overview of certain applications for such an interface and how they can be expanded. We conclude that touch interfaces provide a sense of instant feedback and prompt reaction, making for a richer and faster experience. Further investigation and user feedback would lead to more streamlined applications that could take collaborative software to the next level.
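The core of FTIR touch detection is that fingertips pressing the surface frustrate the internal reflection and show up as bright blobs in the infrared camera image; thresholding the frame and grouping connected bright pixels yields touch points, which a tracker (TUIO in our case) then follows over time. The sketch below illustrates that blob-extraction step on a toy grayscale frame; the frame data and threshold value are illustrative, not from the actual camera pipeline.

```python
# Illustrative blob extraction for FTIR touch detection: threshold a
# grayscale frame, flood-fill connected bright regions, and return their
# centroids as touch points. Threshold and frame values are assumed.

def find_touches(frame, threshold=200):
    """Return (x, y) centroids of connected bright regions in a frame,
    given as a 2D list of grayscale values."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one blob, collecting its pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                mean_y = sum(p[0] for p in pixels) / len(pixels)
                mean_x = sum(p[1] for p in pixels) / len(pixels)
                touches.append((mean_x, mean_y))
    return touches
```

In a real pipeline this runs per frame after background subtraction, and the resulting centroids are matched frame-to-frame and broadcast as TUIO cursor events to client applications such as the Processing display.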