

Assignment: write an application that interfaces with an input/output device

For this assignment I decided to take a step forward on my final project: I am going to connect a light sensor, using the board I designed during the input devices week, to Processing.

Why? The idea is to place the sensor close to the iPhone to detect the amount of light that each screen is emitting, and to use that data to better understand what is happening on the device, in order to control it more accurately. For example, if an application on the iPhone takes longer than expected to open, you can detect that and wait before touching the screen again.

The application will have two main parts:
— the reading of the light sensor
— an interface simulator of the iPhone, specifically the screens inside Instagram, to reproduce the positions of the touch events from taking a picture to sharing it on the social network

I start with the simulator part. I take my iPhone, open Instagram, take a picture and share it, capturing a screenshot at each step and sending them to my computer. Here you can see the sequence that I want to reproduce with the machine I am going to build for my final project.


0. open Instagram

1. go to "take a photo" in the main menu
2. focus
3. take photo
4. next
5. press near the photo to write a caption
6. write something and press on the dark area or "accept"
7. share
8. repeat from step 1?
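The sequence above can be sketched as an ordered list of named touch steps in Java (the language underneath Processing). The (x, y) coordinates here are hypothetical placeholders; the real values are the ones recorded by clicking through the simulator.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the Instagram touch path as an ordered list of named steps.
// All coordinates are placeholders, not measured values.
class TouchSequence {
    static class Step {
        final String name;
        final int x, y; // touch position in screen pixels (hypothetical)
        Step(String name, int x, int y) { this.name = name; this.x = x; this.y = y; }
    }

    static List<Step> instagramPath() {
        List<Step> steps = new ArrayList<>();
        steps.add(new Step("open Instagram",      160, 240));
        steps.add(new Step("take a photo (menu)", 160, 440));
        steps.add(new Step("focus",               160, 200));
        steps.add(new Step("take photo",          160, 420));
        steps.add(new Step("next",                290,  30));
        steps.add(new Step("caption field",       160, 100));
        steps.add(new Step("accept",              290, 440));
        steps.add(new Step("share",               160, 300));
        return steps;
    }
}
```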


What do I want to get from the simulator?
— the coordinates where the machine has to touch
— a visualization of all these coordinates at the same time, to study where to put the sensor, for example

I load all the screenshots into Processing. Each click loads the next screen and leaves a mark on a white image at the side, generating the visualization. While moving the mouse over the screenshot, I can see its position in X and Y.
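The click-to-advance logic can be modeled as a small class, separate from the Processing drawing code. This is a minimal sketch, assuming the screenshots are numbered in order: each click records a mark for the visualization and advances to the next screen, wrapping around at the end.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of the simulator state: current screenshot index plus
// the list of recorded click positions (the marks on the white image).
class ScreenSimulator {
    final int screenCount;
    int current = 0;                             // index of the screenshot shown
    final List<int[]> marks = new ArrayList<>(); // clicks kept for the visualization

    ScreenSimulator(int screenCount) { this.screenCount = screenCount; }

    // Called on each mouse click over the screenshot area.
    void click(int x, int y) {
        marks.add(new int[] { x, y });           // leave a mark on the side image
        current = (current + 1) % screenCount;   // load the next screen, wrapping
    }
}
```

In the real sketch, `click(mouseX, mouseY)` would be called from Processing's `mousePressed()`, and the marks drawn onto the white side image.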



The next step is to connect the light sensor to the application. I do that by following instructions from Arduino and from Filip Tejchman.

The application now shows the value coming from the sensor; together with the coordinates, I have enough data to build a machine able to reproduce the path inside Instagram.
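On the Processing side, the sensor value arrives as text over serial. Assuming the board prints one ASCII reading per line (as in the standard Arduino AnalogReadSerial example), parsing could look like this sketch; the 0..1023 range assumes a 10-bit ADC reading.

```java
// Sketch of parsing one serial line into a light value, assuming the
// board sends one ASCII number per line, e.g. "512\n".
class SensorReader {
    // Returns the light value in 0..1023, or -1 for a malformed line.
    static int parseLine(String line) {
        if (line == null) return -1;
        try {
            int v = Integer.parseInt(line.trim());
            return (v >= 0 && v <= 1023) ? v : -1;
        } catch (NumberFormatException e) {
            return -1;
        }
    }
}
```

In Processing, the line itself would come from the Serial library, e.g. `port.readStringUntil('\n')`.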



password: fab

Some future improvements could be:
— generate a text file with the data
— write the data to a file in a format that is ready for the machine to read
— connect the app to the machine to have direct control over its movements
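The first two improvements could be sketched as a small export function. The "step;x;y" line format below is an assumption, since the machine's real input format is not defined yet; the idea is just something a controller could parse line by line.

```java
import java.util.List;

// Sketch of exporting recorded touch coordinates as machine-readable
// text: one "step;x;y" line per touch. The format is a placeholder.
class Exporter {
    static String toMachineText(List<int[]> coords) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < coords.size(); i++) {
            sb.append(i).append(';')
              .append(coords.get(i)[0]).append(';')
              .append(coords.get(i)[1]).append('\n');
        }
        return sb.toString();
    }
}
```

Writing the resulting string to disk is then a one-liner with Processing's `saveStrings()` or Java's file APIs.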



download files

