FreeTrack Forum

Tracking in a real (3D) room - Tracking motion in an enclosed room
spectralshift #1 19/02/2008 - 00h49

Class : Apprenti
Posts : 1
Registered on : 19/02/2008


Hey all, I just came across FreeTrack and find it brilliant.  I have an idea I've been toying with for a couple of weeks (it came about from Johnny's Wii hack, and I found FreeTrack while doing research)... but I have no idea if it's at all feasible. I'm keeping it in the general forum since it doesn't have all that much to do with FreeTrack specifically (although I'd probably be borrowing heavily from the code :D )

What I want to do is create a real 3D space - a room - that can track the movement of a person inside it... for now, just XYZ position (since I'd build that first before setting up multiple emitters!)

Not having worked with the system/concept before, I gather the major limitation is the strength of the IR diodes and the angle of the IR beam.  The more powerful diodes I've looked at trade off beam angle to the point of making them useless.  According to the datasheet for the SFH485P (which seems to be the diode of choice), I'd need only six in a ring to ensure that the combined output at the weakest viewing angle is still greater than 100% of a single diode on-axis (about 140%+ with quick mental math).  Of course, I'm choosing to ignore any artifacts generated by having multiple emitters.  Needing a full 360° emitter makes life difficult, heh.
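
Rough sanity check on that mental math (a Python sketch; the +/-40° half-intensity angle and the cosine-power falloff are my own assumptions read off the datasheet curve, not measured values):

# Estimate the worst-case combined brightness of a ring of six SFH485P LEDs.
# Assumptions: half-intensity angle of +/-40 degrees, cos^n falloff model.
import math

half_angle = math.radians(40)
n = math.log(0.5) / math.log(math.cos(half_angle))   # falloff exponent, ~2.6

def relative_intensity(off_axis_deg):
    a = math.radians(abs(off_axis_deg))
    return math.cos(a) ** n if a < math.pi / 2 else 0.0

viewer_deg = 30.0                 # worst case: halfway between two LEDs
led_angles = range(0, 360, 60)    # six LEDs, one every 60 degrees
total = sum(relative_intensity(((viewer_deg - led + 180) % 360) - 180)
            for led in led_angles)
print(f"worst-case combined brightness: {total:.0%} of one LED on-axis")
# prints roughly 140%, which matches the figure above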

I think it will work, but I don't know if I can pick up the light at those strengths from a greater distance than FreeTrack is normally used at.  The problem of mapping the sensor data into 3D space doesn't seem all that difficult, assuming only XYZ is needed - and for this project, that's all I'd need.  Assuming I can build a decent 360° emitter and get it picked up reasonably well by a camera, I'd likely place eight cameras to cover the room.  I'd process up to eight XY values and pass them to a rendering computer to reconstruct the location of the 'person camera', in order to render a mock object on a screen as the person walks around the room - emulating an object at a distance.
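
For the 'mapping into 3D space' part, the usual approach is linear triangulation: every camera that sees the marker gives one 2D point, and with known camera projection matrices the XYZ position falls out of a small least-squares problem. A minimal sketch (Python/NumPy; the projection matrices and pixel values are placeholders I'd get from calibration):

# Linear (DLT) triangulation: recover the marker's XYZ from the 2D detections
# of however many cameras currently see it. Assumes each camera's 3x4
# projection matrix P (intrinsics times [R|t]) is known from calibration.
import numpy as np

def triangulate(projections, pixels):
    """projections: list of 3x4 arrays; pixels: list of (u, v) detections."""
    rows = []
    for P, (u, v) in zip(projections, pixels):
        rows.append(u * P[2] - P[0])    # each view adds two linear
        rows.append(v * P[2] - P[1])    # constraints on the homogeneous point
    A = np.vstack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                 # back to ordinary XYZ

# usage with two (made-up) cameras that both see the marker:
# xyz = triangulate([P_cam1, P_cam2], [(u1, v1), (u2, v2)])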

As far as I can tell, there are no theoretical limits to the XYZ detection - but I'm pretty clueless on this topic (and yes, I'll be doing it in stages :D ).  Has anything like this ever been talked about?  Is it utterly impossible to build an emitter strong enough to be picked up from a moderate distance (say up to 10-15 feet, or even half that, 5-7.5 feet)?

I know some concerns over calibration and such will come up, but I don't foresee a problem since I control the physical environment.  I'd use the same hardware, validate the feedback, and test incrementally by placing the emitter at known locations and measuring the output of each camera.
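
That incremental calibration idea maps onto a standard least-squares fit: put the emitter at a handful of surveyed spots, note the pixel it lands on in each camera, and solve for that camera's projection matrix. A rough sketch (Python/NumPy; the point lists are placeholders):

# Per-camera calibration from known emitter positions (direct linear transform).
# xyz_points: surveyed emitter positions in room coordinates (at least six,
# not all on one plane); uv_points: where each one appeared in this camera.
import numpy as np

def fit_projection_matrix(xyz_points, uv_points):
    rows = []
    for (X, Y, Z), (u, v) in zip(xyz_points, uv_points):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 4)         # 3x4 projection matrix, up to scale

# Re-projecting the calibration points afterwards gives the validation step
# described above, before trusting a camera for live tracking.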


(Now I just need LCDs that bend!)
Deimos #2 22/02/2008 - 02h23

Class : Beta Tester
Posts : 120
Registered on : 07/11/2007


Hi. There are a few possible approaches to a 360-degree emitter. The best is probably the one used by professional motion capture hardware: spherical retroreflective markers illuminated by a strong IR light source - like 10 IR LEDs running at or above maximum ratings, about 100mA. But LEDs driven with high current generate a lot of heat, so they need to be mounted on a circuit board (preferably a custom-made one - prototype boards have rather thin traces). Attempting to solder them to wires will probably result in the LEDs being damaged after some use. Also, I don't really know where you can get hold of such markers (eBay maybe?) or how much they would cost.
Of course, there is also the solution of glowing markers. If you want them to be visible from large angles, try getting a small block of some heavily diffusive material (like polyamide - just so you know what to look for, the white hard drive power connectors in computers are made from it), drill 5mm holes in the material and put a flat-topped LED inside each hole (or more than one, preferably each facing a different direction).
Here's a random polyamide part put on one of the LEDs on my FreeTrack cap (the LEDs are filed down flat):
[image: polyamide part on the LED, front view]
And the same seen from the back (LED faces away from the camera):
[image: the same part, back view]
The pics were taken with an IR-modified Labtec webcam.
Note that the hole in which the LED is embedded is drilled all the way through the part, so some light escapes through the hole instead of dispersing in the material. The LED is running at a rather low current (I don't remember exactly, probably about 30mA); you can make the marker much brighter by increasing the current.
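If you do push the current up, the resistor and heat figures are just Ohm's law. A quick sanity check in Python (the 1.5 V forward drop is a typical IR LED figure I'm assuming here, check the actual datasheet):

# Ohm's-law check for driving one IR LED harder; all values are examples.
supply_v  = 5.0     # supply voltage
forward_v = 1.5     # LED forward voltage drop (assumed, see datasheet)
current_a = 0.100   # target drive current, 100 mA

resistor_ohms  = (supply_v - forward_v) / current_a
resistor_watts = (supply_v - forward_v) * current_a    # heat in the resistor
led_watts      = forward_v * current_a                 # heat in the LED itself

print(f"series resistor: {resistor_ohms:.0f} ohm, dissipating {resistor_watts:.2f} W")
print(f"LED dissipation: {led_watts:.2f} W")
# at 100 mA that's already ~0.15 W in the LED, which is why it needs a board to sink heat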
Edited by Deimos on 22/02/2008 at 02h25.
Kestrel #3 22/02/2008 - 06h38

Class : Webmaster (admin)
Posts : 780
Registered on : 13/07/2007


The tracking points really aren't an obstacle to be concerned about; the commercial standard is to use retroreflective points lit by IR LEDs situated around each camera.

FreeTrack is far removed from 3D real-space tracking because it uses weak-perspective algorithms for determining rotation, and the translation is just guesswork. From what you've described, you're essentially creating a mocap system from scratch, so you'd need identical cameras with known intrinsic parameters; you'd have to account for those parameters as well as lens distortion, and use an iterative perspective algorithm (or equivalent) to locate and correlate the points in accurate real-world coordinates.
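
For the intrinsics and distortion part, off-the-shelf tools cover that step; here is a rough sketch using OpenCV's chessboard calibration (the cv2 calls are real OpenCV functions, the file names and board size are placeholders):

# Per-camera intrinsic calibration and distortion correction with OpenCV.
import cv2
import numpy as np

board = (9, 6)                               # inner corners of a printed chessboard
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for fname in ["cal_01.png", "cal_02.png"]:   # placeholder calibration shots
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)

# later, undistort a raw marker detection before triangulating it:
raw = np.array([[[320.0, 240.0]]], np.float32)
ideal = cv2.undistortPoints(raw, K, dist, P=K)   # pixel coords with lens distortion removed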
