Interactive tables for fun and, er, fun.
There are a number of variations on how to build one, but the one we're planning on trying seems to be the simplest: Build a custom table with a frosted glass or perspex top, and place a projector in the base, projecting onto the bottom of the frosted surface. Additionally, have a camera under the table, pointing at the surface, to detect touches and objects.
One existing system along these lines is trackmate, which combines 2d barcodes with open source software to let you tag and track objects. Its example configurations involve a frosted plexiglass surface, with even illumination and a camera placed underneath. None of them directly supports surfaces with images projected onto them, though.
This instructable demonstrates the construction of a multitouch table that supports both touch detection and a projector, through a technique called frustrated total internal reflection. A strip of infra-red LEDs shines along the edge of the panel; touching the panel disrupts the internal reflection, allowing an infra-red camera under the table to detect your touches. Since it uses infra-red, it's not affected by the projector's visible light.
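The touch-detection half of that setup boils down to finding bright 'blobs' in the infra-red camera frame. Here's a toy sketch of the idea in Python - touchlib does something far more robust, with smoothing and frame-to-frame tracking, but the core is just thresholding and grouping adjacent bright pixels:

```python
def find_blobs(frame, threshold=200):
    """Return centroids of bright regions in a grayscale frame.

    A toy stand-in for touchlib's blob detection: threshold the
    frame, then group adjacent bright pixels with a flood fill.
    `frame` is a list of rows of pixel intensities (0-255).
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected blob.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Two synthetic 'touches' on an 8x8 dark frame:
frame = [[0] * 8 for _ in range(8)]
for y, x in ((1, 1), (1, 2), (2, 1), (2, 2)):
    frame[y][x] = 255
for y, x in ((5, 5), (5, 6), (6, 5), (6, 6)):
    frame[y][x] = 255
print(find_blobs(frame))  # → [(1.5, 1.5), (5.5, 5.5)]
```

In a real build you'd hand each centroid to the gesture layer, and track blobs across frames so a dragged finger keeps the same identity.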
A good overview of different techniques for multitouch tables is available here.
The projector is another consideration: There's limited space in the table, if we want it to be a reasonable height, but most projectors will only create a fairly small image at those distances. One solution is a short throw projector - a projector with a particularly wide angle lens. Some of these can produce an image over a meter wide at a distance of only about 70cm (a typical height for a table)! They're not even much more expensive than regular projectors, these days.
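The geometry here is simple: projectors are specified by a 'throw ratio', the throw distance divided by the image width. A quick back-of-the-envelope comparison (the ratios below are ballpark figures for illustration - check the actual spec sheet of any projector you're considering):

```python
def image_width(throw_distance_m, throw_ratio):
    """Throw ratio = throw distance / image width, so
    image width = throw distance / throw ratio."""
    return throw_distance_m / throw_ratio

# A typical standard projector (ratio ~1.8) vs. a short throw
# projector (ratio ~0.6), both mounted 0.7 m below the surface:
print(round(image_width(0.7, 1.8), 2))  # → 0.39 (m, too small)
print(round(image_width(0.7, 0.6), 2))  # → 1.17 (m, table-sized)
```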
Our ideal interactive table would combine several of these features: We want to be able to interact with it with fingers, but we also want to be able to place tagged objects on it and have them recognized. Unfortunately, nobody seems to have tried combining the two yet: The trackmate examples all use even visible-light illumination to read the tags, while the multitouch examples use infra-red illumination to detect touches, which isn't going to work for reading printed tags.
My thus-far hypothetical approach is a hybrid: use the ordinary visible-light illumination provided by the projector and, after calibrating the camera and projector against each other, subtract the projected image from the image the camera records, feeding the difference to the routines that recognize touches and tags. While I hope this will work, it'll take some experiments to be certain.
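To make the idea concrete, here's a minimal sketch of the subtraction step, assuming the calibration has already warped the camera frame into the projector's coordinate space (which is the hard, and so far untested, part). In practice something like opencv would do this per-frame on real images:

```python
def difference_mask(projected, captured, threshold=40):
    """Subtract what we projected from what the camera saw.

    Pixels differing by more than `threshold` are candidate
    touches or tags. Both images are lists of rows of grayscale
    values, already aligned by the (hypothetical) calibration.
    """
    return [
        [abs(c - p) > threshold for p, c in zip(prow, crow)]
        for prow, crow in zip(projected, captured)
    ]

projected = [[100, 100, 100],
             [100, 100, 100]]
captured  = [[100, 100,  30],   # a finger/tag darkens one pixel
             [105,  95, 100]]   # small sensor noise elsewhere
print(difference_mask(projected, captured))
# → [[False, False, True], [False, False, False]]
```

The threshold is what absorbs sensor noise and imperfect calibration; tuning it (and whether a fixed value is even enough) is exactly what the planned tests need to answer.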
On the software side of things, there are a number of libraries available. trackmate, as already mentioned, tracks custom 2d barcodes, while touchlib takes care of recognizing and tracking 'blobs' such as fingers. At a lower level, libraries like opencv provide primitives for doing image processing yourself.
Finally, what applications do we want to use this for? Besides all the stuff we can already run (mostly demos, so far), what I would really like to use this for is augmented boardgaming. I have two games in mind to start: The 18xx series of games, and RPGs.
The goals for RPGs are fairly straightforward: Provide an interface to simulate tactical movement for battles, where the DM can control things and players can interact with the grid. Additionally, provide some utilities for tracking all the things that usually require manual bookkeeping. Finally, for extra bonus points, be able to recognize dice thrown on the surface, so players can roll the dice and have the computer recognize the outcome.
My goals for the 18xx games are a bit more involved. I'd like to implement a complete interactive-table version of them, starting by simulating just the board. The board in the 18xx games uses hexagonal tiles, which presents challenges all its own - there don't seem to be any robust hex tile engines for Python. Pygame Utilities has one, but it's incomplete, and PGU is no longer maintained. fife may have one, but it dies with a bus error any time I try to run a demo on my Mac. Unless someone surfaces with a recommendation of another one, it looks like I might have to write my own, which will at least be an interesting challenge. This page links to a lot of useful resources on handling hex grids on a computer.
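If I do end up writing my own, the usual starting point is axial coordinates: each hex gets a (q, r) pair, the six neighbours are fixed offsets, and distance has a pleasantly simple closed form. A sketch of the core:

```python
# Axial coordinates for a hex grid: each hex is addressed by
# (q, r), and its six neighbours are these fixed offsets.
HEX_DIRECTIONS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def neighbours(q, r):
    """The six hexes adjacent to (q, r)."""
    return [(q + dq, r + dr) for dq, dr in HEX_DIRECTIONS]

def hex_distance(a, b):
    """Distance in hex steps between two axial coordinates."""
    dq, dr = a[0] - b[0], a[1] - b[1]
    return (abs(dq) + abs(dr) + abs(dq + dr)) // 2

print(len(neighbours(0, 0)))          # → 6
print(hex_distance((0, 0), (2, -1)))  # → 2
print(hex_distance((0, 0), (3, 3)))   # → 6
```

Track routing on an 18xx board then becomes graph search over these coordinates, with each tile contributing edges between its hex sides.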
Hex tile editors are similarly sparse, with most of them written for Windows or DOS(!) and unmaintained. The tiled map editor is quite a nice looking editor which appears not to support hexagonal tiles - until you look closer and notice that it exists in two versions, the 'new' QT version and the 'old' Java version. The Java version, while no longer maintained, is perfectly usable, and supports hex tiles. It also has quite a nice output format. Huzzah!
In terms of how this will interface with the interactive table, what I'd like is a simulation of the board, with real physical hex tiles that you can place down to lay new track. Once you've placed the tiles you want and oriented them the way you want, you can tap a 'submit' button, and the game will read the trackmate codes from the bottom of the tiles, figure out what they are and how they're facing, and add them to its own view of the board. You can then remove the physical tiles. This seems like the best of both worlds: you get the intuitive usage of the physical game, without the clutter of easily knocked tiles on the board all the time. The game can then do routing and so forth for other game phases entirely in the computer, allowing you to simply tap on tiles to set up a route.
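One nice property of hex tiles is that 'how they're facing' only has six legal answers. So however the tag-reading layer reports a tile's rotation (I'm assuming here we can recover an angle from the tag image - that's untested), snapping it to a legal orientation is trivial:

```python
def snap_orientation(angle_degrees):
    """Map a measured tag rotation to one of a hex tile's six
    legal orientations (0-5), i.e. the nearest multiple of 60°."""
    return round(angle_degrees / 60.0) % 6

print(snap_orientation(0))    # → 0
print(snap_orientation(118))  # → 2 (close to 120°)
print(snap_orientation(355))  # → 0 (wraps around past 360°)
```

That also gives a cheap sanity check: if the measured angle sits far from any multiple of 60°, the tile was probably placed crooked, and the game can ask the player to straighten it before accepting the 'submit'.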
That's it for now. Apologies for the lack of a 'real' blog post - a combination of busy-ness, other things on my mind, and lack of inspiration for a 'real' post led to this braindump instead. If you have any ideas or suggestions about our interactive table project or about implementing a hex tile engine, please speak up in the comments!