Archive for the ‘DIY’ Category
When I first started this blog I had little idea how Multi-Touch worked, or what was practical and within the bounds of readily available technology. I originally intended to make a large (40 Inch) Rear-Projection DI Table, which, through many mistakes and decisions, has now turned into a rear-projection LLP Table. Unfortunately, my projector bulb recently died, killing this table, and at the moment I have a front LLP setup using an old 17 Inch monitor. This setup, while it does work pretty well, has some major disadvantages. The lasers have to point away from the camera (so they don’t appear as blobs), so illumination can only come from one side. This causes really bad occlusions, making Multi-User interaction nearly impossible (aside from the fact that it is on a 17 Inch 5:4 monitor).
I came across the LumenLab database of monitors a few days ago, and my 17 Inch Monitor(Samsung 712N) was on the list of those that didn’t have FCC Issues, so I figured I would give it a go at taking it apart and mounting the camera behind it. If it turns out to work well I might make a nice box for it.
Taking apart this monitor was amazing, and I _HIGHLY_ recommend it for anyone trying to find a small monitor to take apart for such a project. It is amazing how well built and easy to take apart it is; it’s almost like they took us into consideration when they designed it! I’ll post a more detailed description with pictures later (once I take some :)).
This is the second part in an endless series of tutorials dealing with Multi-Touch Development in Python. If you have not read the first one yet, it can be found here.
In the last tutorial we made a really useless example, where whenever a TUIO event happened, it would print out “Something left the table”, “Something moved on the table”, or “Something was placed on the table”. We are going to extend this program to also print out where it happened.
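As a taste of where we are headed, here is a minimal sketch of the extended handlers. The Event class below is a hypothetical stand-in for whatever object your TUIO library hands you (TUIO coordinates are normalized, 0.0 to 1.0), so only the message format matters here:

```python
class Event:
    """Hypothetical TUIO event carrying a session id and normalized coords."""
    def __init__(self, session_id, x, y):
        self.session_id = session_id
        self.x = x  # 0.0 (left edge) .. 1.0 (right edge)
        self.y = y  # 0.0 (top edge)  .. 1.0 (bottom edge)

def on_add(event):
    return "Something was placed on the table at (%.2f, %.2f)" % (event.x, event.y)

def on_move(event):
    return "Something moved on the table to (%.2f, %.2f)" % (event.x, event.y)

def on_remove(event):
    return "Something left the table at (%.2f, %.2f)" % (event.x, event.y)

e = Event(1, 0.25, 0.75)
print(on_add(e))  # → Something was placed on the table at (0.25, 0.75)
```

In the real program these would be hooked up as the three TUIO callbacks from the first tutorial, with print instead of return.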
For those of you who have been keeping up with my blog, you probably know that I ordered some Lasers from Aixiz, but had not gotten them to work. Well, I was about to send them back when I noticed a NUIgroup member named AixiZ. He said there was some misinformation about the voltages: they are 5v devices, not 3.2v. Well, I found a 5v wall wart, cut off the end, hooked them up to a breadboard, and they work! I am going to try to get them lined up so my setup will actually work, so in the meantime, some pics:
These are the 10mW 780nm Lasers from Aixiz.
I noticed some people were disappointed with the poor speed my last video depicted, so I made a new one. This one was taken with my Canon digital camera, so there isn’t any desktop recording overhead. This demo is with 30 and 100 items, respectively.
I (and a few others) have been thinking about what to do with the desktop on multi-touch tables. It is not optimal, because of small buttons, the requirement for a second mouse button, and its fixed sense of direction (there is a defined up and down). I was thinking, what if we threw all the stuff you would normally have on your [Linux] desktop onto a big ZUI plane? That way, objects (text documents, code, pictures) would represent themselves as their content, so pictures would be the pictures they are, and you could manipulate them just like in the PhotoApp, rotating and zooming with Gestures. It would get crowded, so I thought if you could just circle a bunch of them and they would stack up, you could have a pretty cool interface. A demo of the stacking is below; it’s entirely random. It seems to work pretty well, so I am going to take this further with real pictures, apps, widgets, and some other sorting/searching methods (grid, line, flow). Here is a discussion on NUIgroup about WIMP and how it doesn’t work for multi-touch.
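For the curious, the core of the stacking idea can be sketched in a few lines: treat the finger stroke as a polygon, and group every item whose center falls inside it into one stack. This is just an illustration (the stroke and item positions here are made up), not the demo code itself:

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test: return True if (px, py) is inside the closed polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            # x coordinate where this edge crosses the horizontal ray from the point
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def stack_items(stroke, items):
    """Split items into (stacked, rest) by whether they sit inside the stroke."""
    stacked = [it for it in items if point_in_polygon(it[0], it[1], stroke)]
    rest = [it for it in items if it not in stacked]
    return stacked, rest

# A rough square "circle" drawn around the origin:
stroke = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
print(stack_items(stroke, [(0, 0), (5, 5)]))  # → ([(0, 0)], [(5, 5)])
```

In the real demo the stroke would come from the tracker as a list of touch positions, and the stacked items would then get animated into a pile.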
Also, I’m sorry for not commenting the code. If you have any questions about it, just leave a question in the comments and I’ll probably get back to you within a day.
At the moment the code is _really_ messy, but here it is for anyone that wants it. I will be cleaning it up later; this is just something I hacked up in a few minutes to see if it would work. You will need Python, PyGame and Rabbyt if you want to run it. Here is the code link.
NOTE: Without the desktop recording app (gtk-recordmydesktop) there was zero lag, even on my crappy graphics card (NVidia 7300) with 100 items.
You may have seen random sporadic posts like “A post for my rss reader to eat”, or “I know it will work this time, stupid rss parser keeps looping”. Well, I have been writing an RSS reader, which runs on the small LCD I got at the Rotary Auction (see this post for details on that). Anyway, a friend recently dumped an old Compaq laptop on me, with a broken screen (including hinges) and a messed up CD drive. It has Ubuntu on it, so I figured it was good enough to make an RSS reader out of. I hooked it up to the small cabinet LCD over S-Video, wrote some software, and now have a cool little RSS reader that sits there day and night, churning through RSS feeds. I’ll post some pics and a video (and the code) when I get back.
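The actual code comes later, but the core of it is roughly this: grab the feed, pull out the item titles, and cycle them on the display. Here is a minimal sketch using only the standard library (the real reader fetches over the network and sleeps between refreshes, which is skipped here):

```python
import xml.etree.ElementTree as ET

def feed_titles(rss_xml):
    """Return the item titles from an RSS 2.0 document string."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title") for item in root.iter("item")]

# A tiny inline feed instead of a live URL, just for illustration:
sample = """<rss version="2.0"><channel><title>Demo</title>
<item><title>First post</title></item>
<item><title>Second post</title></item>
</channel></rss>"""

print(feed_titles(sample))  # → ['First post', 'Second post']
```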
I hacked up a little power supply using an LM317, a trimpot, a 220 Ohm resistor and a breadboard. It regulates a steady 3.2 volts from a 5 volt input. I hooked the lasers up, and I get nothing. I know my power supply works, because it will drive a small DC motor. This leads me to believe there is some quirk with the lasers. I will be calling Aixiz tomorrow or Tuesday to see what’s up.
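For anyone checking my numbers: the LM317 holds 1.25v between its OUT and ADJ pins, so Vout = 1.25 * (1 + R2/R1). Assuming the 220 Ohm resistor is R1 (OUT to ADJ) and the trimpot is R2 (ADJ to ground), which is the usual arrangement, here is the quick calculation for 3.2v:

```python
V_REF = 1.25  # LM317 reference voltage between OUT and ADJ, in volts
R1 = 220.0    # fixed resistor, in ohms

def vout(r2):
    """Output voltage for a given bottom-leg resistance r2 (ohms)."""
    return V_REF * (1 + r2 / R1)

def r2_for(v_target):
    """Bottom-leg resistance needed to hit v_target volts."""
    return R1 * (v_target / V_REF - 1)

print(round(r2_for(3.2), 1))  # → 343.2  (ohms on the trimpot)
print(round(vout(343.2), 2))  # → 3.2
```

So the trimpot just needs to be dialed to about 343 Ohms; a 500 Ohm or 1k pot gives plenty of range.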
Here is a discussion on NUIgroup forums, apparently other people have had this same issue.
I have been coding a new multi-touch mouse driver for GNU/Linux, in Python of course. I got basic tapping down, but I wanted to go further. I thought it would be cool to be able to put all five fingers down and have it trigger the Compiz scale plugin, or similar. I looked around for a non-hackish way of doing it, and was appalled by the [lack] of documentation, and even support, for interfacing Python to Compiz Fusion.
So, what would any DIY Python/GNU/Linux freak do? Write his own binding, of course! I wrote pyCompiz just for this purpose. It’s really simple; you can call any Compiz plugin in one line or less :D To trigger the scale plugin: compiz.call('scale', 'initiate_key'). I have it hosted on a Google Code page for anyone interested. I don’t plan on updating it very often, only for bug fixes. There really aren’t any features that could be added; it’s such a simple thing. Anyway, here is the link. It’s under the GPLv2 License, so anyone can use it. Happy Hacking!
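To give an idea of how the driver uses it, here is a sketch of the five-finger check. The compiz.call line is pyCompiz as described above; everything else (the shape of the finger list, the loop) is illustrative, not the actual driver code:

```python
def gesture_for(fingers):
    """Map the current finger count to a pyCompiz plugin action, if any."""
    if len(fingers) >= 5:
        return ("scale", "initiate_key")
    return None

# In the real driver this runs once per tracker frame, roughly:
#   action = gesture_for(current_fingers)
#   if action:
#       compiz.call(*action)   # pyCompiz call, as shown above

print(gesture_for([(0.1, 0.2)] * 5))  # → ('scale', 'initiate_key')
print(gesture_for([(0.1, 0.2)] * 2))  # → None
```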
I just received the lasers I talked about in an earlier post. I have not messed around with them at all yet, but I will try them out tomorrow. I’ll post some first results, and hopefully I can have the tracker working and the lasers mounted by Monday.
Being the DIY freak that I am, I could not bring myself to buy a power supply for the lasers from Aixiz, so once I get mine working I’ll post schematics if anyone wants to duplicate it. It’s a really simple LM317 circuit.
Here are the parts I got from Aixiz:
I got four of each, but I will definitely try it with 3, 2 and 1, and post some vids so people can see how many we really need.
I figured out what was wrong with my Multi-Touch Box. Due to its large dimensions, I only had about half the illumination I needed. That rather bothers me, because I don’t really want to spend another two days just soldering LEDs. So I found a better solution ;)
There is another Multi-Touch method that I have not talked about before. It is called LLP (Laser Light Plane). Basically, you put one infrared laser in each corner of the screen, pointing straight up. You then put a line lens above each laser. Line lenses take a dot and scatter it into a solid line. This is then projected over the entire surface. The goal is to get this plane as close to the actual surface as possible (just above it). Then, when you touch it, the light is scattered downward and picked up by a camera.
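As a rough sanity check of the geometry: a line lens with fan angle theta spreads the beam into a line about 2 * d * tan(theta / 2) wide at distance d, which is why a wide-angle lens in each corner can sweep the whole surface. The numbers below are just for illustration:

```python
import math

def line_width(distance, fan_angle_deg):
    """Approximate width of the projected line at a given distance from the lens."""
    return 2 * distance * math.tan(math.radians(fan_angle_deg / 2))

# A 120 degree lens, half a meter out:
print(round(line_width(0.5, 120), 2))  # → 1.73  (meters wide)
```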
I ordered four Infrared 10mW 780nm Lasers and four 120 Degree Line Lenses from Aixiz.com, for a total of $38. I should be receiving them by Friday. I’ll post my success (or failure) then, with some videos if it worked.