Blog Archives

All Posts in Exploration

September 29, 2013

Aether

Exploration @ CIID & Volvo | 2014

During a three-day exploration of haptics and materials, we were given the brief to imagine a haptic prototype that could be built into the driverless “Car of the Future”. The concept of Aether imagines the car as a ‘membrane’ that acts as a filter between the interior and exterior environments. After a rapid brainstorming session, we decided to focus on abstracting the sensation of wind felt when sticking a hand out of the car window and moving it in a wave-like pattern. The goal was to give future car passengers that subtle feeling of wind at times when conditions make it impossible to engage with the outdoors.

We experimented with different materials that might create the tickling, vibratory sensations typically associated with the gesture. Strips of straws were attached to ‘The Plank’, vibrating and moving according to a slider that let the user control the speed of the car in motion. The more the user ‘accelerated’, the faster the motor vibrated, accompanied by a wind noise generated by a Max patch. This worked in tandem with a servo motor that adjusted the angle of a screen above the straws, guiding the user’s hand in the wave-like pattern we aimed to replicate.
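A minimal sketch of how the pieces could be tied together in an Arduino-style loop, assuming a slide potentiometer for the ‘accelerator’, a PWM-driven vibration motor on the motor shield, a hobby servo for the screen, and a serial link that the Max patch listens to for the wind sound; the pin numbers and mappings below are illustrative, not the original circuit.

```cpp
// Hypothetical sketch of the slider-to-motor/servo mapping described above.
// Pin numbers, ranges, and the serial link to the Max patch are assumptions.
#include <Servo.h>

const int SLIDER_PIN = A0;   // slide potentiometer ("acceleration")
const int MOTOR_PIN  = 5;    // vibration motor via motor shield (PWM pin)
const int SERVO_PIN  = 9;    // servo tilting the screen above the straws

Servo screenServo;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  screenServo.attach(SERVO_PIN);
  Serial.begin(9600);        // the Max patch can read this value for the wind noise
}

void loop() {
  int slider = analogRead(SLIDER_PIN);            // 0..1023
  int motorSpeed = map(slider, 0, 1023, 0, 255);  // faster "car" -> stronger vibration
  int screenAngle = map(slider, 0, 1023, 20, 70); // steeper wave guide at higher speed

  analogWrite(MOTOR_PIN, motorSpeed);
  screenServo.write(screenAngle);
  Serial.println(slider);                         // scale the wind sound in Max

  delay(20);
}
```

In this sketch the same slider reading drives all three outputs, which keeps the vibration, screen angle and wind noise in step with the perceived speed.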

IDEATION

CIRCUIT & MOTOR SHIELD

Testing "The Plank" with different materials

INTERIOR OF AETHER

In collaboration with Bethany Snyder, Chiayu Hsu, Myoungeun Kim

Mentored by: Bill Verplank (Stanford University), David Gauthier (CIID Research) & Jakob Bak (CIID Research)

September 29, 2013

eyeChitra

Exploration @ Srujana Innovation Center LVPEI | 2013

eyeChitra is a portable ERG (electroretinogram) scanner that ophthalmologists can use to detect and diagnose retinal diseases and disorders such as retinitis pigmentosa, Leber’s congenital amaurosis, choroideremia, achromatopsia and cone dystrophy at an early stage, helping to prevent blindness and other retinal problems. The prototype was mainly intended to reduce the cost of the equipment used in hospitals: the machines currently used for electroretinography are huge, bulky, cost around $20,000 and are not portable. The aim was to turn this into a handheld device that could reach the remotest parts of the world and cost no more than $200. The eyeChitra device was tested with visually challenged people at the L.V. Prasad Eye Institute and produced positive results.
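As a rough illustration of the acquisition side of such a low-cost device (not the actual eyeChitra firmware), the sketch below assumes an external amplifier conditioning the electrode signal into the microcontroller’s input range, sampled at roughly 1 kHz and streamed over serial for display; the pin, sample rate and protocol are assumptions.

```cpp
// Hypothetical acquisition loop for a low-cost ERG front end: an external
// amplifier conditions the corneal electrode signal, and the microcontroller
// samples it on A0 and streams the values over serial for plotting/analysis.
const int ERG_PIN = A0;
const unsigned long SAMPLE_INTERVAL_US = 1000;  // ~1 kHz sampling (assumed)
unsigned long lastSample = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  unsigned long now = micros();
  if (now - lastSample >= SAMPLE_INTERVAL_US) {
    lastSample = now;
    int value = analogRead(ERG_PIN);  // amplified ERG waveform, 0..1023
    Serial.println(value);            // stream to a laptop/phone for display
  }
}
```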

PROTOTYPE

TESTING

USER TESTING

Testing with Dr. N. Rao (Founder & Chair, LVPEI)

Mentored by: Ramesh Raskar (MIT Media Lab), Dr. Anthony Vipin Das (LVPEI), Everett Lawson (MIT Media Lab), Amy Canham (MIT Media Lab)

Download high-resolution images and video here.

September 29, 2013

Imaging in 3 Dimensions

Exploration @ CIID | 2014

Working on the brief “make the invisible visible”, this project explored the process of capturing photographic data in three dimensions. We were inspired to think of creative approaches to imaging and filmmaking, while at the same time exploring the techniques with which technology could help us achieve that. We started the experiment by mounting a Microsoft Kinect and a DSLR camera on a tripod with a laser-cut mount. We then used the RGBDToolkit to augment the high-definition video stream with 3D scan data from the Kinect. The toolkit let us calibrate the DSLR camera to the depth sensor, allowing their data streams to be merged. Using its visualization application, we combined the footage and applied different 3D rendering styles. The output was visually interesting, since we were able to capture a phenomenon of the real world on screen. The final video was assembled from the render data retrieved from RGBDToolkit.
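To show what the calibration step actually buys us, here is an illustrative sketch (not RGBDToolkit’s code) of merging the two streams: a depth pixel is unprojected into 3D and reprojected into the DSLR image using assumed intrinsics and a calibrated rotation/translation; every numeric value below is a placeholder.

```cpp
// Illustrative DSLR-to-depth merge: unproject a depth pixel to a 3D point,
// transform it by the calibrated extrinsics, and project it into the colour image.
#include <array>
#include <cstdio>

struct Intrinsics { float fx, fy, cx, cy; };   // focal lengths and principal point

// Depth-camera pixel (u, v) with depth z (metres) -> 3D point in depth-camera space
std::array<float, 3> unproject(const Intrinsics& d, float u, float v, float z) {
  return { (u - d.cx) * z / d.fx, (v - d.cy) * z / d.fy, z };
}

// Apply the calibrated rotation R (row-major) and translation t, then project
// into the DSLR image to find the colour pixel for that depth sample.
std::array<float, 2> projectToColour(const Intrinsics& c,
                                     const float R[9], const float t[3],
                                     const std::array<float, 3>& p) {
  float x = R[0]*p[0] + R[1]*p[1] + R[2]*p[2] + t[0];
  float y = R[3]*p[0] + R[4]*p[1] + R[5]*p[2] + t[1];
  float z = R[6]*p[0] + R[7]*p[1] + R[8]*p[2] + t[2];
  return { c.fx * x / z + c.cx, c.fy * y / z + c.cy };
}

int main() {
  Intrinsics depthCam {580.0f, 580.0f, 320.0f, 240.0f};    // placeholder Kinect-like values
  Intrinsics colourCam{1500.0f, 1500.0f, 960.0f, 540.0f};  // placeholder DSLR values
  float R[9] = {1,0,0, 0,1,0, 0,0,1};                      // identity = cameras aligned
  float t[3] = {0.05f, 0.0f, 0.0f};                        // assumed 5 cm baseline
  auto p  = unproject(depthCam, 400.0f, 220.0f, 1.2f);
  auto uv = projectToColour(colourCam, R, t, p);
  std::printf("maps to colour pixel (%.1f, %.1f)\n", uv[0], uv[1]);
}
```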

CALIBRATING THE KINECT

SETUP

RENDERED IMAGE

In collaboration with Peter Otto Kuhberg

Mentored by: Timo Arnall (BERG London, Oslo School of Architecture & Design) & Matt Cottam (Tellart, Rhode Island School of Design)

October 26, 2014

Stamping the City

Exploration @ CIID | 2014

Stamping the City is a project in which we tried to understand the contextual approach to naming places in a city and how the culture around a place reshapes it; culture plays an important role. Who decided what a place should be called? Those were the questions on our minds. Alongside a few trips to the library, we built a light wand driven by an Arduino, programmed to display text split row by row so that sweeping the wand through the air writes out a full word. We then marked famous landmarks of Copenhagen, capturing the scenes with long-exposure shots. The names were downloaded automatically for our current location using the Google Maps API.
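A hypothetical persistence-of-vision sketch for the light wand, assuming an eight-LED strip on digital pins and a per-column font bitmap; in the real build the text would come from the place name fetched via the Google Maps API, and the pins, font data and timing here are illustrative only.

```cpp
// Persistence-of-vision idea: flash one column of a character bitmap at a time,
// so a long-exposure shot of the moving wand reveals the whole word.
const int LED_PINS[8] = {2, 3, 4, 5, 6, 7, 8, 9};

// 5-column bitmap approximating the letter 'N' (one byte per column;
// each bit drives one LED of the strip)
const byte LETTER_N[5] = { B11111111, B01100000, B00011000, B00000110, B11111111 };

void setup() {
  for (int i = 0; i < 8; i++) pinMode(LED_PINS[i], OUTPUT);
}

void showColumn(byte column) {
  for (int bit = 0; bit < 8; bit++) {
    digitalWrite(LED_PINS[bit], bitRead(column, bit) ? HIGH : LOW);
  }
}

void loop() {
  // Sweep through the letter's columns; the real wand would loop over
  // every character of the downloaded place name.
  for (int col = 0; col < 5; col++) {
    showColumn(LETTER_N[col]);
    delay(2);                 // column on-time: tuned to the speed of the sweep
  }
  showColumn(0);              // dark gap between letters
  delay(4);
}
```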

BUILD PROCESS

PLACES: COPENHAGEN OPERA HOUSE

COPENHAGEN ROYAL PALACE

KONGENS NYTORV

NYHAVN

In collaboration with Saurabh Datta

Mentored by: Timo Arnall (BERG London, Oslo School of Architecture & Design) & Matt Cottam (Tellart, Rhode Island School of Design)

October 26, 2014

fADE

Exploration @ Jaypee Institute of Information Technology | 2011

fADE (fluidic Aura Desktop Environment) is an exploration into a spatial operating environment driven by hand gestures. The design of the system is inspired by MIT Media Lab’s g-stalt and Oblong Industries’ g-speak operating environments. A set of gestures has been defined that comes naturally to users, so nothing has to be memorized, and a specific function is assigned to each gesture. The system can also accommodate custom gestures to suit a user’s requirements. The hardware implementation uses a Kinect sensor to detect the user’s hand movements and depth. A laptop acts as the central server, collecting and processing the data received via the Kinect; the software then deciphers the hand movements and executes the desired operation based on the predefined gesture functions. The system has further been scaled to accommodate multiple users.
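A minimal sketch of the gesture-mapping idea (not the original fADE implementation), assuming hand positions have already been extracted from the Kinect skeleton stream; the thresholds and the mapping of conditions to gestures below are assumptions.

```cpp
// Classify one frame of tracked hand positions into a predefined gesture.
#include <cmath>
#include <cstdio>

struct Hand { float x, y, z; bool tracked; };

enum class Gesture { None, Zoom, Draw, Pause, OneHandNav, TwoHandNav };

Gesture classify(const Hand& left, const Hand& right, float prevHandGap) {
  if (left.tracked && right.tracked) {
    float gap = std::hypot(right.x - left.x, right.y - left.y);
    if (std::fabs(gap - prevHandGap) > 0.05f) return Gesture::Zoom;  // hands moving apart/together
    return Gesture::TwoHandNav;                                      // both hands steady -> pan
  }
  if (right.tracked) {
    if (right.z < 0.6f) return Gesture::Draw;    // hand pushed towards the sensor
    return Gesture::OneHandNav;
  }
  return Gesture::Pause;                         // no hands tracked -> pause/stop
}

int main() {
  Hand left  {-0.20f, 0.10f, 0.9f, true};
  Hand right { 0.25f, 0.12f, 0.9f, true};
  Gesture g = classify(left, right, /*prevHandGap=*/0.30f);
  std::printf("gesture id: %d\n", static_cast<int>(g));
}
```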

GESTURES:

ZOOM

DRAW

PAUSE/STOP

SINGLE HAND NAVIGATION

TWO HAND NAVIGATION

November 6, 2014

Posture Patrol

Exploration @ CIID | 2014

After long hours spent hunched in front of computers, we decided to investigate ways in which a device could sense “incorrect” posture when a person begins to slouch. During this exploration we worked with bend sensors as detectors of posture, and on how to incorporate our findings into a wearable device, Posture Patrol, that gives the user feedback when the back begins to curve into an incorrect position. We made a bend sensor from neoprene, conductive fabric, thread and Velostat, and applied it vertically along the back at the point of maximum inflection. The sensor is attached to a microcontroller concealed in a wristband pocket, with an LED that indicates when the person begins to slouch. When the wearer holds the “correct” upright posture, the LED stays off; as the wearer loses this posture, the LED lights up in response to the readings received by the wristband microcontroller.
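A hedged sketch of the slouch-detection loop, assuming the fabric bend sensor sits in a voltage divider read on an analog pin and the wristband LED is switched at a calibrated threshold; the pins, divider values and threshold direction are assumptions, not the original build.

```cpp
// Slouch detection: read the bend-sensor voltage divider and light the LED
// when the reading crosses a calibrated threshold.
const int SENSOR_PIN = A0;         // bend sensor + fixed resistor voltage divider
const int LED_PIN    = 13;         // feedback LED on the wristband
const int SLOUCH_THRESHOLD = 600;  // calibrate while the wearer sits upright

void setup() {
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);              // handy for calibrating the threshold
}

void loop() {
  int reading = analogRead(SENSOR_PIN);   // resistance changes as the back curves
  Serial.println(reading);

  // Upright posture -> LED off; slouching -> LED on.
  digitalWrite(LED_PIN, reading > SLOUCH_THRESHOLD ? HIGH : LOW);
  delay(50);
}
```

In practice the threshold (and whether slouching raises or lowers the reading) depends on how the divider is wired, so it has to be calibrated per wearer while sitting upright.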

CONCEPT SKETCH

BEND SENSOR

FEEDBACK DEVICE

TESTING SENSOR WITH THERMOCHROMIC INK

After many prototypes using different conductive materials and feedback indicators, such as thermochromic ink, we learned that the bend sensor in this application behaved more like a pressure sensor. Applying it directly to the back, with a layer of thin paper and clothing worn over the top, was necessary for the sensor to register the change in resistance and so drive the LED.

In collaboration with Bethany Snyder

Mentored by: Hannah Perner-Wilson (Plusea), David A. Mellis (MIT Media Lab), David Gauthier (CIID Research)