Background

I was interested in making everyday objects responsive to human interaction without affecting their appearance, and I wanted to explore what it would mean to be able to interact with any object.

To allow complex interactions, I decided to use pattern recognition so that any surface could make sense of a user's touch gestures. I prototyped a few ways to make objects respond to touch gestures on their surfaces.

Testing Basic Patterns

Pattern - C


Pattern - I


I first experimented with patterns on the Num Keys on my laptop's keyboard.

It is a 3x3 grid of buttons, so any pattern I created by pressing a series of Num Keys could be mapped to a unique output.

I connected an Arduino Uno with some LEDs to my laptop. I ran a Processing sketch that encoded each sequence of pressed Num Keys as a binary pattern and translated each pattern into a known symbol. It then sent the mapped command to the Arduino, which lit the LEDs in the corresponding pattern.

So making different patterns on the Num Keys led to different combinations of LEDs lighting up as per a predefined mapping.
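
For illustration, here is a minimal Processing-style sketch of that logic, not the original code: it assumes the Num Keys arrive as the characters '1' to '9', treats each key as one bit of a 9-bit pattern, and uses the Enter key to mark the end of a gesture. The example patterns for C, I and L and the single-byte serial command are placeholders.

import processing.serial.*;

Serial arduino;     // serial link to the Arduino driving the LEDs
int pattern = 0;    // 9-bit gesture pattern, one bit per Num Key

void setup() {
  size(200, 200);
  // assumption: the Arduino shows up as the first serial port
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() { }

void keyPressed() {
  if (key >= '1' && key <= '9') {
    pattern |= 1 << (key - '1');     // set the bit for this key
  }
  if (key == ENTER || key == RETURN) {
    matchPattern(pattern);
    pattern = 0;                     // start a new gesture
  }
}

void matchPattern(int p) {
  // bit 0 = key 1 ... bit 8 = key 9, keys laid out as 7-8-9 / 4-5-6 / 1-2-3
  if (p == unbinary("111001111"))      sendSymbol('C');  // top row, left column, bottom row
  else if (p == unbinary("010010010")) sendSymbol('I');  // middle column
  else if (p == unbinary("001001111")) sendSymbol('L');  // left column plus bottom row
  else println("unknown pattern: " + binary(p, 9));
}

void sendSymbol(char s) {
  println("recognised " + s);
  arduino.write(s);   // the Arduino maps this byte to an LED pattern
}

On the Arduino side, the received byte only has to select which LEDs to switch on, which is a simple lookup.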




Recognizing the Gesture L from its Binary Pattern

Basic Patterns on Surfaces

Copper sticky-tape

I then implemented this on an actual surface: a 3x3 grid of capacitive touch points with the same pattern-detection logic as before.

The touch points were pieces of household aluminium foil, connected to an Arduino Uno with jumper wires; their readings emulated the Num Pad keys from my first experiment.

This setup was sensitive enough to pick up gestures from the opposite side of a 4mm thick piece of plywood. So making patterns on the front surface of a seemingly normal piece of plywood led to different combinations of LEDs lighting up.
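
As a rough sketch of how the laptop side could treat the foil electrodes exactly like the Num Keys, assuming the Arduino prints one line per loop with nine raw capacitance readings in grid order (the serial format and the touch threshold here are assumptions, not the exact protocol I used):

import processing.serial.*;

Serial arduino;
int TOUCH_THRESHOLD = 200;   // placeholder, tuned per material and surface thickness
int gesturePattern = 0;      // accumulates electrodes touched during one gesture

void setup() {
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');  // fire serialEvent() once per full line
}

void draw() { }

// assumed line format: "12 8 950 14 7 10 880 860 9"
// - one raw reading per foil electrode, in the same order as the Num Keys
void serialEvent(Serial s) {
  String line = s.readString();
  if (line == null) return;

  int[] readings = int(splitTokens(trim(line)));
  if (readings.length != 9) return;

  for (int i = 0; i < 9; i++) {
    if (readings[i] > TOUCH_THRESHOLD) {
      gesturePattern |= 1 << i;    // electrode i plays the role of Num Key i+1
    }
  }
  println("pattern so far: " + binary(gesturePattern, 9));
  // deciding when a gesture ends (e.g. after a short pause) is omitted here;
  // the finished pattern goes through the same lookup as the keyboard version
}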

I also tested the same concept by replacing the aluminium foil with ITO (Indium Tin Oxide) coated PET film and applying it to a window, which gave me a transparent grid of capacitive touch points on glass.

This was my first demonstration of having inanimate objects respond to us.

Touch-sensitive environments could add depth to Mixed Reality experiences by integrating physical interaction with objects and surfaces.

3x3 grid on the Plywood

3x3 electrodes on the Plywood

3x3 electrodes on ITO

Further Work

The Wacom Bamboo sketchpad

Some training images for 'A'

Planning the high-res touchpad

I worked with a friend, Vikas, while in college to take this concept a step further.

We implemented richer gesture recognition by using OCR to recognize the actual shapes and symbols drawn as gestures on the surface.

We developed and tested this with a Wacom Bamboo touch pad, using the 26 letters of the alphabet as the known gestures, and hooked the logic into a Processing sketch for a quick demonstration.

So writing different letters on the Bamboo made the Processing sketch change the window colour on the laptop. We further planned to deploy this with a high-resolution capacitive grid on a piece of plywood as a demonstration.
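
The actual recognition used OCR trained on letter images; as a much simpler stand-in that shows the idea, the Processing sketch below rasterises a pen stroke (the Bamboo behaves like a normal pointing device) onto a coarse grid, picks the stored template with the largest overlap, and changes the window colour on a match. The grid size, the two hand-written templates and the colours are assumptions for illustration, not our trained recognizer.

import java.util.HashMap;

final int GRID = 5;                      // coarse bitmap resolution for strokes
boolean[][] strokeGrid = new boolean[GRID][GRID];
HashMap<Character, boolean[][]> templates = new HashMap<Character, boolean[][]>();

void setup() {
  size(400, 400);
  background(255);
  // hypothetical templates; the real ones came from training images of letters
  templates.put('I', fromRows(new String[] { "..#..", "..#..", "..#..", "..#..", "..#.." }));
  templates.put('L', fromRows(new String[] { "#....", "#....", "#....", "#....", "#####" }));
}

void draw() { }

void mouseDragged() {
  // pen strokes on the Bamboo arrive as mouse drags
  int cx = constrain(mouseX * GRID / width, 0, GRID - 1);
  int cy = constrain(mouseY * GRID / height, 0, GRID - 1);
  strokeGrid[cy][cx] = true;
  line(pmouseX, pmouseY, mouseX, mouseY);
}

void mouseReleased() {
  char best = '?';
  int bestScore = 0;
  for (Character c : templates.keySet()) {
    int score = overlap(strokeGrid, templates.get(c));
    if (score > bestScore) { bestScore = score; best = c; }
  }
  println("recognised: " + best);
  if (best == 'I') background(0, 120, 255);        // each letter gets its own colour
  else if (best == 'L') background(255, 120, 0);
  strokeGrid = new boolean[GRID][GRID];            // clear for the next gesture
}

int overlap(boolean[][] a, boolean[][] b) {
  int n = 0;
  for (int y = 0; y < GRID; y++)
    for (int x = 0; x < GRID; x++)
      if (a[y][x] && b[y][x]) n++;
  return n;
}

boolean[][] fromRows(String[] rows) {
  boolean[][] t = new boolean[GRID][GRID];
  for (int y = 0; y < GRID; y++)
    for (int x = 0; x < GRID; x++)
      t[y][x] = rows[y].charAt(x) == '#';
  return t;
}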


This project was surely the most exciting one I have worked on. The process of going from an idea to the actual demonstration was amazing.

Specs for fabricating the high-res touchpad on plywood
