This project was completed by Alastair Cooke and me after we agreed with our Secondary School / Sixth Form that they would fund the project in 2010 (straight after our GCSEs).
The table consisted of an approximately 40-inch piece of acrylic on top of an open-frame box that we built. We fitted sheets of anodised aluminium to the outer edges so that the panels could be removed easily for maintenance, and it looked fantastic too.
The FTIR (frustrated total internal reflection) effect was used to detect where someone had pressed the screen. The acrylic screen is flooded with infrared light from a strip of LEDs along its edge; when the surface is touched, the total internal reflection is frustrated at the point of contact and light scatters downward. The following image shows this:
We used a BenQ projector to display the image and a Sony PlayStation Eye webcam to detect touches, along with a mirror to enlarge the projected image without raising the height of the table. To let the camera see infrared while blocking visible light, we placed three pieces of old camera film in front of its sensor. We initially used old Windows PCs to run the detection, but kept suffering crashes in the touch-detection driver. To keep costs down, we located an old MacBook with a broken screen, which of course made it useless for nearly every purpose except ours; it still powers the machine today. To keep the whole box cool, vent holes were cut on opposite sides and a bathroom extractor fan installed.
The software included Community Core Vision (used to calibrate and run the tracking that converts bright spots in the camera feed into touch coordinates) and Firefox (used in the final product, along with a custom-built 'website', to display the interface).
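Community Core Vision did the heavy lifting for us, but the core idea behind that tracking step is simple: threshold the bright infrared spots in each greyscale camera frame and take the centroid of each connected blob as a touch point. The sketch below is a toy illustration of that idea, not CCV's actual code; the frame data, threshold, and minimum blob size are made up, and a real pipeline would also subtract a background frame and smooth out noise:

```python
def find_touches(frame, threshold=200, min_area=2):
    """frame: 2D list of 0-255 greyscale values. Returns blob centroids (x, y)."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill the connected bright region starting here.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:  # ignore single-pixel noise
                    avg_y = sum(p[0] for p in pixels) / len(pixels)
                    avg_x = sum(p[1] for p in pixels) / len(pixels)
                    touches.append((avg_x, avg_y))
    return touches

# One bright 2x2 blob in an otherwise dark 6x6 frame:
frame = [[0] * 6 for _ in range(6)]
for y, x in ((2, 2), (2, 3), (3, 2), (3, 3)):
    frame[y][x] = 255
print(find_touches(frame))  # one touch at (2.5, 2.5)
```

In the real table, CCV then maps these camera-space coordinates onto the projected image (that is what its calibration grid is for) before handing them to the application.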
A video of some of its operations is included below.