Update: It seems this somehow got on TechVibes & HackerNews
So for Vancouver’s Barcamp 2010, we made a pong game for the iPad that people could control using their eyes.
Unfortunately, this isn’t a Vancouver Barcamp summary. However, my good friend Mack Flavelle wrote a great summary, and another guy named Shivanand, whom I wish I had met, wrote another awesome one.
Instead, this is a post explaining the project I hacked together during Vancouver Barcamp 2010 with Craig & Tom: eyePong.
I hate when people make me read to the end, so here’s a video of the project:
What is pong?
For those who don’t know what pong is, it was an old arcade game where a ball bounces back and forth between two human-controlled paddles. You can Google it or read the Wikipedia article for more info, but a picture should pretty much explain it:
The eye-tracking equipment is from Craig Hennessey, founder of Mirametrix, and I worked with developer & overall hilarious person Tom Schultz (@appskicker).
eyePong
By the time I joined them on Saturday, Tom had already gotten the backend “secret sauce” working, which passed the x & y coordinates of the eyes to the iPad over a TCP socket connection. We pair programmed (read: I watched him type slowly on this weird wireless keyboard while we both laughed a lot) the initial pong paddle & the ball. I then took over, fixed the paddle’s position, recorded a persistent high score (so you could play against others), added sound for the ball hitting the paddle, & made the direction & velocity of the paddle change the way the ball moves.
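We haven’t published any of this code yet (see “The Code” below), but here’s a minimal sketch of what the iPad-side listener could look like in modern Swift. The host, port, and plain-text “x,y” line format are all assumptions for illustration, not the actual protocol:

```swift
import Foundation
import Network

/// Minimal sketch of an iPad-side client that reads gaze coordinates from
/// the tracker over TCP. The host, port, and "x,y\n" line format are all
/// assumptions for illustration -- the real protocol wasn't published.
final class GazeClient {
    private let connection: NWConnection
    private var buffer = Data()
    var onGaze: ((Double, Double) -> Void)?

    init(host: String = "192.168.1.10", port: UInt16 = 4242) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
    }

    func start() {
        connection.start(queue: .main)
        receive()
    }

    private func receive() {
        connection.receive(minimumIncompleteLength: 1, maximumLength: 1024) {
            [weak self] data, _, isComplete, error in
            guard let self = self, let data = data, error == nil else { return }
            self.buffer.append(data)
            // TCP is a stream, so accumulate bytes and split on newlines.
            while let newline = self.buffer.firstIndex(of: UInt8(ascii: "\n")) {
                let line = String(decoding: self.buffer[..<newline], as: UTF8.self)
                    .trimmingCharacters(in: .whitespaces)
                self.buffer.removeSubrange(...newline)
                let parts = line.split(separator: ",")
                if parts.count == 2, let x = Double(parts[0]), let y = Double(parts[1]) {
                    self.onGaze?(x, y)   // one gaze sample: drive the paddle from this
                }
            }
            if !isComplete { self.receive() }
        }
    }
}
```

From there, the game loop just maps each gaze sample onto the paddle’s position, which is essentially all it needs from the tracker.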
The green “box” is the pong ball, and the white square is the calculated eye gaze. It’s pretty cool how people tend to watch the ball as it moves around.
Videos
Here’s the same video as above: some random guy playing it near the end of Barcamp & me narrating (I hate silent videos).
Here’s a movie of Craig playing the game & me hiding my laughter while someone gives a completely unrelated Barcamp talk. Ironically, it’s more of a silent video, which I usually hate.
The Code
Obviously the code’s pretty messy, since we pushed this out in, literally, a few hours. I’ll be cleaning it up a bit & maybe putting it on GitHub if there’s enough interest (read: ping me if you’re interested). Craig’s plan is to make an API out of it.
This would be great for people who are physically disabled. I assume this would also work on the iPhone?
That was actually one of the original purposes of the device!
This would absolutely work on the iPhone.
I was around and talked to Craig and Tom. You were busy trying to calibrate it and I didn’t want to bother you then. Great job getting the app refined by the end of the day. I’ve been looking for the video over the weekend.
Another application of this tech (as I was discussing with Craig) is to combine it with a clicker of sorts to allow setting boolean states on large sets of data (say, selecting 10 pictures from 100). I would be very interested in the GitHub repo when it’s ready.
Let me know if I can be of any help.
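For the curious, here’s a hypothetical sketch of that boolean-selection idea: the gaze point acts as a cursor, and a clicker press toggles whatever item is being looked at. All the types and names are invented for illustration; nothing like this exists in the eyePong code.

```swift
import CoreGraphics

/// Gaze + clicker selection sketch: the gaze point is the cursor, and a
/// single clicker press toggles the item currently under it.
struct Thumbnail {
    let frame: CGRect       // where the picture sits on screen
    var isSelected = false  // the boolean state being toggled
}

final class GazeSelector {
    private(set) var thumbnails: [Thumbnail]
    private var gaze: CGPoint = .zero

    init(thumbnails: [Thumbnail]) {
        self.thumbnails = thumbnails
    }

    /// Feed in gaze samples as they arrive from the tracker.
    func updateGaze(to point: CGPoint) {
        gaze = point
    }

    /// Call once per clicker press: flip the item under the user's gaze.
    func clickerPressed() {
        if let i = thumbnails.firstIndex(where: { $0.frame.contains(gaze) }) {
            thumbnails[i].isSelected.toggle()
        }
    }
}
```

Selecting 10 pictures out of 100 then becomes 10 glances and 10 clicks, with no pointing device involved.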
Just a dumb question…
1. It looks like you are shooting IR light directly at the iris…
2. This doesn’t register in our vision, but IR light in a laser, for example, can blind.
Is it possibly damaging for the eyes to be directly targeted by IR light, either in the short term or over a long period of time?
Great question – apparently the device sits far enough from your eyes that it’s not too bad. And you obviously shouldn’t spend long periods doing it.
Also, a laser is simply very focused light – these IR LEDs are not focused at all and disperse their photons widely, much like the indicator LEDs you find in circuits & stuff.
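To put some rough intuition behind that (a back-of-the-envelope illustration, not a safety analysis of this particular device): a source emitting power P into a solid angle Ω produces an irradiance at distance d of roughly

E ≈ P / (Ω · d²)

A laser keeps Ω vanishingly small, so nearly all of P lands on one tiny spot no matter the distance; a diffuse LED spreads the same P over a wide Ω, so the power density at the eye falls off quickly.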
Hope that answers your question!
Craig Hennessey has since founded a new eye-tracking startup called Gazepoint (http://www.gazept.com). They offer eye trackers for under $500, which is crazy, since existing systems cost $5,000 to $10,000.