Augmented Reality in Blender got some attention a while ago. Several people [here, here and here] reported successfully combining the Python ARToolkit bindings (PyARTK), Blender and the Blender VideoTexture plugin. Because I needed to do some experiments with AR, I tried PyARTK myself, but it failed to compile on any Linux platform I tried (unlike ARToolkit itself, which works fine).
Not willing to spend several days fixing dependencies, I decided to forget about PyARTK and connect Blender to ARToolkit directly. However, my limited C skills called for a ‘creative’ solution.
The above diagram illustrates the solution.
- Vloopback 1.3 splits the webcam video into two virtual streams
- ARToolkit 2.72.1 does the marker detection and tracking
- A modified simpleTest writes the transformation matrix to a buffer
- A Python script reads the buffer and controls the BGE
- The VideoTexture plugin for Blender shows the video stream
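The hand-off between simpleTest and Blender is just a small file on tmpfs. The actual modification lives in simpleTest's C code; the sketch below restates the idea in Python, with the path and the 16-float text format being my own assumptions, not taken from the original code. Writing to a temporary file and renaming it over the target keeps the update atomic, so the reader in the BGE never sees a half-written matrix:

```python
import os
import tempfile

BUFFER_PATH = "/dev/shm/ar_matrix.txt"  # assumed tmpfs location

def write_matrix(matrix, path=BUFFER_PATH):
    """Write a 4x4 matrix as 16 whitespace-separated floats, atomically."""
    text = "\n".join(" ".join("%.6f" % v for v in row) for row in matrix)
    # Write to a temp file in the same directory, then rename it over the
    # target: on POSIX the rename is atomic, so a concurrent reader sees
    # either the old or the new matrix, never a torn mix of both.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path))
    try:
        with os.fdopen(fd, "w") as f:
            f.write(text + "\n")
        os.rename(tmp, path)
    except OSError:
        os.unlink(tmp)
        raise
```

Note that ARToolkit's arGetTransMat actually yields a 3x4 matrix; padding it with a final [0, 0, 0, 1] row gives the 4x4 form assumed here.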
First, I’m using vloopback to split the webcam stream into two virtual streams, both accessible through /dev/video3 (in this case). One of these streams is used by a modified version of ARToolkit’s simpleTest. This tool analyses the video, finds a marker and writes the resulting transformation matrix to a buffer in RAM (simply a file on a tmpfs mount). This buffer is read by a Python script in the Blender Game Engine, which uses it to align the camera and objects properly. The Game Engine uses the VideoTexture plugin to show the webcam’s virtual stream in real time.
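On the Blender side, the per-frame work is just parsing those 16 floats back into a matrix. A minimal reader sketch, assuming the same file path and layout as the writer (both are my assumptions); the BGE-specific part depends on the Blender version and scene setup, so it is only indicated in a comment:

```python
BUFFER_PATH = "/dev/shm/ar_matrix.txt"  # must match whatever the writer uses

def read_matrix(path=BUFFER_PATH):
    """Parse 16 whitespace-separated floats into a 4x4 row-major matrix.

    Returns None when the buffer is missing or malformed, so the game
    loop can simply keep last frame's orientation.
    """
    try:
        with open(path) as f:
            values = [float(v) for v in f.read().split()]
    except (IOError, ValueError):
        return None
    if len(values) != 16:
        return None
    return [values[i:i + 4] for i in range(0, 16, 4)]

# Hooked up to a logic tic in the BGE, this would look roughly like:
#   cam = <the scene's active camera>
#   m = read_matrix()
#   if m:
#       ...apply m to cam's world transform (API depends on Blender version)
```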
The performance of this setup is better than I originally expected. I’ve tested it on two dual-core laptops (2.5 GHz and 1.6 GHz) with NVIDIA graphics cards (one a G96M Quadro FX 770M, the other older). For now it’s Linux-only (tested on Ubuntu Karmic), mainly because of the vloopback trick; the other tools are pretty much platform-independent, I think. Compared to the standalone simpleTest program there is some lag, but it doesn’t really affect the AR interaction.
- I’m working on a multi-marker setup
- I’ll write some supporting Python scripts for the BGE