1

Hi all,

I'm new to Kinovea, and I cannot get the joint angle tracking to stay attached to the joint markers I put on the person. I have tried brightly colored strips of tape, bright button stickers, etc., but regardless of how small or large I make the tracking windows, my markers end up on some other portion of the bike, floor, or background within the first 2 seconds of stationary cycling. I've tried different colored backdrops behind me and changing my own clothes, and nothing is working. Does anyone have any recommendations or proven tutorials/schemes that work for this?

My university has the classic round, reflective markers for IR camera systems, but those are not effectively picked up by a simple desktop webcam or phone camera. I was hoping to use this software to let students analyze their own videos, so I wanted something that would work with a range of smartphone or tablet cameras rather than relying on our single biomechanics lab space, but I obviously need a simple scheme I can give students (i.e., wear black leggings or shorts, place X on the joints of interest, etc.).

I would appreciate any advice.

2

I also encountered the same problem. I bought a 30 mm reflective ball; the front part could be tracked precisely, but the rear part couldn't.

3

It's a shame, because I really wanted an elegant automated joint angle tracking solution, but I've ended up porting most of my efforts over to Tracker. I've found it less intuitive for biomechanics specifically (since it's built for pure physics) but more effective when it comes to tools like automated tracking. If Tracker does make a mistake and loses the marker, it prompts you to indicate where the marker is in that frame, and correcting it usually rights the ship for several seconds and sometimes for the rest of the video. The downside is that to compute angles automatically you have to export the coordinates of the tracked markers to a spreadsheet and run the calculations separately. Still, I could not get Kinovea to RGB-match bright red X's on black leggings with no black or red anywhere else in the frame (almost everything else was white or light grey), while Tracker has minimal difficulty with such obvious contrast.

4

If you can share the video privately, it would be great to see where it falls short so I can improve the algorithm or the experience.

The core tracking algorithms in Tracker and Kinovea are fairly similar: both track a template inside a search area and update the template based on some heuristics.
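In rough form the idea is something like this. The sketch below uses OpenCV's matchTemplate and is only an illustration, not the actual code of either program; the function name, window size, and border handling are placeholders.

```python
# Minimal template-tracking sketch (OpenCV); illustrative only,
# not Kinovea's or Tracker's actual code.
import cv2
import numpy as np

def track_in_frame(frame_gray, template, last_pos, search_half=60):
    """Slide the template inside a search window centered on the last
    known position and return (best_position, match_score).
    frame_gray and template are grayscale; border cases are omitted."""
    x, y = last_pos
    th, tw = template.shape
    H, W = frame_gray.shape

    # Crop the search area around the last known position.
    x0, y0 = max(0, x - search_half), max(0, y - search_half)
    x1, y1 = min(W, x + search_half), min(H, y + search_half)
    search = frame_gray[y0:y1, x0:x1]

    # Normalized cross-correlation of the template at every
    # position inside the search area.
    scores = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)

    # Convert back to full-frame coordinates (center of the best match).
    best_x = x0 + best_loc[0] + tw // 2
    best_y = y0 + best_loc[1] + th // 2
    return (best_x, best_y), best_score
```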

Depending on the failure case, you might try different values for the match and update thresholds. This is how Kinovea controls template updating (Tracker uses different mechanics, with "evolve" and "tether").

For example, if it latches onto the wrong marker too often, you can increase the "match threshold". When tracking fails and you replace the point manually, the template is reset.

I see now that it doesn't actually stop the playback when it loses tracking. I agree it probably should; I think this behavior was changed at some point. I'll fix that, or at least make it an option you can toggle on/off.

You can also increase the update threshold to 0.85 or 0.90: if the tracked object visually changes a lot, this will update the template more often and can improve things.
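Putting the two thresholds together, the per-frame decision is roughly as follows. This is a simplified sketch of the logic described above, not the actual source; the names and default values are only illustrative.

```python
# Simplified sketch of the per-frame decision using the two thresholds.
# Not Kinovea's actual source; names and values are illustrative.
MATCH_THRESHOLD = 0.5   # below this the match is rejected (tracking failure)
UPDATE_THRESHOLD = 0.8  # below this (but above match) the template is refreshed

def handle_match(score, best_patch, template):
    """Return (status, template) for one frame given the best match score."""
    if score < MATCH_THRESHOLD:
        # No acceptable match inside the search window: tracking failed.
        return "failed", template
    if score < UPDATE_THRESHOLD:
        # Acceptable match but the appearance has drifted: adopt the new
        # patch as the template so it can follow gradual changes.
        return "tracked", best_patch
    # Very strong match: keep the existing template to avoid drift.
    return "tracked", template
```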

One of the most important things is that the object window, which defines the template, should be as tight as possible around the physical marker to avoid picking up too much background (while still containing some detail). The search window should be large enough to cover the marker's motion between frames, but not so large that it could get confused with another marker.

A round-shaped marker will work better (in both programs) because an "X" shape varies under rotation. (We don't search for rotated versions of the template; we only slide the template inside the search area and compute a score based on pixel correspondence.)
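As a quick illustration of that point (a toy example, not code from either program): the normalized cross-correlation of a pattern against a slightly rotated copy of itself drops much faster for an X than for a filled disc.

```python
# Toy demonstration of rotation sensitivity; illustrative only.
import cv2
import numpy as np

def rotated_score(patch, angle=15):
    """Match the patch against a rotated copy of itself."""
    h, w = patch.shape
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    rotated = cv2.warpAffine(patch, M, (w, h))
    return cv2.matchTemplate(rotated, patch, cv2.TM_CCOEFF_NORMED)[0, 0]

size = 41
# "X" marker: two diagonal lines.
x_marker = np.zeros((size, size), np.uint8)
cv2.line(x_marker, (5, 5), (size - 6, size - 6), 255, 3)
cv2.line(x_marker, (5, size - 6), (size - 6, 5), 255, 3)
# Round marker: a filled disc.
disc = np.zeros((size, size), np.uint8)
cv2.circle(disc, (size // 2, size // 2), 12, 255, -1)

print("X after 15 deg rotation:   ", rotated_score(x_marker))
print("disc after 15 deg rotation:", rotated_score(disc))
```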

If you can share the video with me I would love to test it.

5

I remember now: the issue with immediately killing the tracking as soon as it fails is that it breaks when the object is temporarily occluded, for example when an arm or leg passes in front of it. The idea was that when this happens, it keeps looking, and when the occlusion is over it might recover the tracking by itself if the marker is still inside the search window.

What I would like to do is add an option somewhere to set how many frames of failed tracking are allowed before playback is stopped. 3 or 4 frames should be a good default, I think, but if you want you could set it to 1 and get the same behavior as in Tracker.

Or maybe playback should always stop on failure, and it's the track that is force-closed after n failures.
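Something along these lines, as a rough sketch of the idea (hypothetical, not actual code from either program):

```python
# Hypothetical sketch of the "allowed failed frames" option;
# names and structure are illustrative, not Kinovea's actual code.
MAX_FAILED_FRAMES = 3  # proposed default; 1 would mimic Tracker's behavior

class TrackMonitor:
    def __init__(self, max_failed=MAX_FAILED_FRAMES):
        self.max_failed = max_failed
        self.consecutive_failures = 0

    def on_frame(self, tracked_ok):
        """Call once per frame; returns True if playback should stop."""
        if tracked_ok:
            # Recovered (e.g. the occlusion is over): reset the counter.
            self.consecutive_failures = 0
            return False
        self.consecutive_failures += 1
        # Stop playback (or force-close the track) once the tolerance
        # is exceeded, so the user can fix the marker position.
        return self.consecutive_failures >= self.max_failed
```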