When I first started this blog I had little idea how Multi-Touch worked, or what was practical within the bounds of readily available technology. I originally intended to make a large (40 inch) rear-projection DI table, which, through many mistakes and decisions, has since turned into a rear-projection LLP table. Unfortunately, my projector bulb recently died, killing this table, and at the moment I have a front LLP setup using an old 17 inch monitor. This setup, while it works pretty well, has some major disadvantages. The lasers have to point away from the camera (so they don't appear as blobs), so illumination can only come from one side. This causes really bad occlusions, making multi-user interaction nearly impossible (aside from the fact that it is on a 17 inch 5:4 monitor).
I came across the LumenLab database of monitors a few days ago, and my 17 inch monitor (Samsung 712N) was on the list of those that didn't have FCC issues, so I figured I would have a go at taking it apart and mounting the camera behind it. If it turns out to work well I might make a nice box for it.
Taking apart this monitor was amazing, and I _HIGHLY_ recommend it for anyone trying to find a small monitor to take apart for such a project. It is amazing how well built and easy to disassemble it is; it's almost like they took us into consideration when they designed it! I'll post a more detailed description with pictures later (once I take some :)).
It's time for a new tutorial! This time we will build a slightly (but only slightly) more realistic example. We will use PyGame to render a green box, let us drag it around, and resize it by pinching and squeezing. As usual, if you have any suggestions regarding tutorials, or an idea for a tutorial I should write, send me an email (it can be found in the contact me section to the right), or say something in the comments.
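The heart of pinch-to-resize is just comparing the distance between two touches before and after they move. Here is a minimal sketch of that math; the function name, signature, and the minimum-size clamp are my own illustration, not code from the tutorial itself:

```python
import math

def pinch_scale(w, h, old_pts, new_pts, min_size=20):
    """Scale a (w, h) box by the ratio of the distance between two
    touch points after they moved to the distance before they moved."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    old_d = dist(*old_pts)
    if old_d == 0:
        return w, h  # touches on top of each other; no meaningful ratio
    ratio = dist(*new_pts) / old_d
    # Clamp so a hard squeeze can't shrink the box into nothing
    return max(min_size, w * ratio), max(min_size, h * ratio)
```

Spreading two fingers from 100px apart to 200px apart doubles the box; squeezing them together shrinks it, down to the clamp.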
Lately I have been working on a Multi-Touch physics application. It is pretty much a big 2D physics sandbox: you can spawn new objects, make joints, and then throw them at stuff. It's written in Python, using Box2D as the physics engine, TouchPy for TUIO parsing, and PyGame for graphical output.
It does support a ZUI, but not a "full" one. The simulation also has limits, so if an object happens to leave them it will freeze (as seen in the video below). It also does not implement the gear joint, but all the other joints are supported.
It's still in early development, so it has its share of bugs. The biggest has to do with triangles. When you choose to make an arbitrary polygon, it works fine with any number of vertices between 4 and 8, but once you try 3, weird things happen: half of the time it works, and the other half it doesn't. Due to the way PyBox2D was coded, I can't catch this error either (or any Box2D error, for that matter), so the program will unceremoniously exit.
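Since the error can't be caught after the fact, one workaround is to validate polygons before they ever reach the engine. Box2D generally wants counter-clockwise winding and a non-degenerate area, and skinny user-drawn triangles are the easiest way to violate that. This is a sketch of a pre-flight check I'd try, not the sandbox's actual code; the `min_area` threshold is an arbitrary guess:

```python
def polygon_area(verts):
    """Signed area via the shoelace formula; positive means
    counter-clockwise winding (the order Box2D expects)."""
    area = 0.0
    for i in range(len(verts)):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % len(verts)]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def safe_polygon(verts, min_area=0.05):
    """Return a CCW copy of verts, or None if the polygon is a
    near-degenerate sliver that would likely crash the engine."""
    a = polygon_area(verts)
    if abs(a) < min_area:
        return None                     # too thin; refuse to spawn it
    if a < 0:
        verts = list(reversed(verts))   # flip clockwise input to CCW
    return verts
```

Anything that comes back `None` simply doesn't get spawned, so the program never hands Box2D a shape it might die on.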
Code can be had from the downloads page, and here is a video!
It's been a few weeks since I wrote a tutorial (or posted on this blog, for that matter), so I figured it was about time. In this tutorial we will start from scratch and use a different method to get touches. By the end you will have used PyGame to write a program that draws a circle under each touch and follows them as they move. Most importantly, you will understand how to do it yourself!
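The core bookkeeping is a dictionary mapping each touch's TUIO session id to its latest position; drawing is then just one circle per entry. A minimal sketch of that idea (the class and method names are mine, and I'm assuming TUIO's usual normalized 0-1 coordinates, so the draw step scales by the window size):

```python
class TouchTracker:
    """Keeps the current position of every touch, keyed by its
    TUIO session id, so we can draw one circle per finger."""
    def __init__(self):
        self.touches = {}

    def down(self, sid, x, y):
        self.touches[sid] = (x, y)   # new finger on the table

    def move(self, sid, x, y):
        self.touches[sid] = (x, y)   # finger slid to a new spot

    def up(self, sid):
        self.touches.pop(sid, None)  # finger lifted; forget it

# Inside the PyGame loop you would then do something like:
# for x, y in tracker.touches.values():
#     pygame.draw.circle(screen, (255, 255, 255),
#                        (int(x * width), int(y * height)), 15)
```

Because the dict is keyed by session id, two fingers that cross paths stay distinct, which is what makes the circles follow each touch correctly.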
This is the second part in an endless series of tutorials dealing with Multi-Touch Development in Python. If you have not read the first one yet, it can be found here.
In the last tutorial we made a really useless example, where whenever a TUIO event happened, it would print out "Something left the table", "Something moved on the table", or "Something was placed on the table". We are going to extend this program further to print out where it happened.
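The change amounts to threading the event's coordinates into each message. Here is the shape of it as plain functions; the handler names and the `(x, y)` signature are illustrative stand-ins, not TouchPy's actual callback API, and the coordinates are TUIO's normalized 0-1 values:

```python
def on_add(x, y):
    # A new blob appeared; report where
    return "Something was placed on the table at (%.2f, %.2f)" % (x, y)

def on_move(x, y):
    # An existing blob moved; report its new position
    return "Something moved on the table to (%.2f, %.2f)" % (x, y)

def on_remove(x, y):
    # A blob vanished; report its last known position
    return "Something left the table at (%.2f, %.2f)" % (x, y)
```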
I have noticed a lot of tutorials dealing with Multi-Touch in C++, C#, or AS3. There seems to be a lack of Python tutorials, but there is a growing number of people wanting to use Python for Multi-Touch apps. I intend to change that with a set of tutorials teaching people how to write Multi-Touch apps in Python, starting with setting up the environment and going all the way to writing a PhotoApp clone with Python-Lux (which is not released yet, and I'm not going to say when it will be released in any of these guides. If that's all you came for, sorry to disappoint you).
These tutorials assume basic knowledge of Python, and at times may use some obscure feature of the Standard Library. If you have any questions about something you don't understand, or anything else about Python development and such, feel free to email me at xelapond @ gmail . com, or PM me on NUIgroup (username xelapond). I also usually hang around IRC in #nuigroup, username xelapond. I am happy to assist you with any problems you are having, and I would love feedback on how well this tutorial goes.
For those of you that have been keeping up with my blog, you probably know that I ordered some lasers from Aixiz but had not gotten them to work. Well, I was about to send them back when I noticed a NUIgroup member named AixiZ. He said there was some misinformation about the voltages: they are 5V devices, not 3.2V. Well, I found a 5V wall wart, cut off the end, hooked them up to a breadboard, and they work! I am going to try to get them lined up so my setup will actually work; in the meantime, some pics:
These are the 10mW 780nm lasers from Aixiz.
I was searching around Craigslist for a broken projector (read: working projector with a dead bulb) so I could hack a different bulb into it and not pay $400 for a $6 bulb. I found one, a Proxima DP6850. It's a really nice projector: 1024×768 (1280×1024 max), digital keystoning and zoom, two RGB inputs, S-Video, and built-in speakers. The only places it falls short are brightness (1500 lumens) and bulb life (1500 hours). Bulb life didn't really matter to me, and I figured it would probably be brighter once I go sticking a huge metal halide bulb in there. It is circa 2000, and retailed for $8500. The Craigslist seller wanted $75 for it, so I figured I really had nothing to lose, and if it didn't work at all, at least I would end up with a lot of good lenses and LCD filters.
I bought it this morning, only to arrive home and find out the bulb works! I don't know how much longer it will last, so I am going to wait until I am ready to mount it in my table before playing with it (excluding the image warping).
A while ago I saw that openFrameworks demo where OpenGL shaders are used to warp a playing video onto a whiteboard. I could not find the code anywhere, so I used The GIMP to perspective-transform an image. I achieved the same effect, and was able to get a picture of the Earth onto a piece of paper. Video below:
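For the curious, the math behind both the shader version and GIMP's perspective tool is the same: each point is mapped through a 3×3 homography matrix. A small sketch of that mapping (my own illustration; neither the demo's nor GIMP's actual code, and how you obtain `H` from the four target corners is a separate problem):

```python
def apply_homography(H, x, y):
    """Map the point (x, y) through a 3x3 homography matrix H
    (row-major nested lists). The divide by w is what produces
    the perspective foreshortening."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w
```

With the identity matrix points pass through unchanged; a nonzero bottom row is what tilts the image "into" the scene, like the Earth picture landing on the paper.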
I noticed some people were disappointed with the poor speed my last video depicted, so I made a new one. This one was taken with my Canon digital camera, so there isn't any desktop-recording overhead. The demo is with 30 and 100 items, respectively.
I (and a few others) have been thinking about what to do with the desktop on multi-touch tables. It is not optimal: the buttons are small, it requires a second button on the mouse, and it has a fixed sense of direction (a defined up and down). I was thinking: what if we threw all the stuff you would normally have on your [Linux] desktop onto a big ZUI plane? That way, objects (text documents, code, pictures) would represent themselves as their content, so pictures would be the pictures they are, and you could manipulate them just like in the PhotoApp, rotating and zooming with gestures. It would get crowded, so I thought that if you could just circle a bunch of items and have them stack up, you could have a pretty cool interface. A demo of the stacking is below; it's entirely random. It seems to work pretty well, so I am going to take this further with real pictures, apps, widgets, and some other sorting/searching methods (grid, line, flow). Here is a discussion on NUIgroup about WIMP and how it doesn't work for multi-touch.
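Detecting which items were circled boils down to treating the finger's path as a closed polygon and running a point-in-polygon test on every item. This is a sketch of that idea using standard ray casting, not the demo's actual code; the names are mine:

```python
def inside(point, lasso):
    """Ray-casting point-in-polygon test: is `point` inside the
    closed loop (list of (x, y) vertices) the user circled?"""
    x, y = point
    hit = False
    n = len(lasso)
    for i in range(n):
        x1, y1 = lasso[i]
        x2, y2 = lasso[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # This edge crosses the horizontal ray from (x, y);
            # toggle if the crossing is to the right of the point.
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                hit = not hit
    return hit

def circled_items(items, lasso):
    """Return the item positions that fall inside the lasso,
    i.e. the ones that should collapse into a stack."""
    return [p for p in items if inside(p, lasso)]
```

An odd number of edge crossings means the point is inside, which works even for the wobbly, self-overlapping loops fingers actually draw.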
Also, I’m sorry for not commenting the code. If you have any questions about the code just leave a question in the comments, and Ill probably get back to you within a day.
At the moment the code is _really_ messy, but here it is for anyone that wants it. I will clean it up later; this is just something I hacked up in a few minutes to see if it would work. You will need Python, PyGame, and Rabbyt if you want to run it. Here is the code link.
NOTE: Without the desktop recording app (gtk-recordmydesktop) there was zero lag, even with my crappy graphics card (an NVidia 7300) and 100 items.