901

(9 replies, posted in General)

"4. Usage of the distortion parameters to compute undistorted coordinates and usage of these coordinates to make all measurements in Kinovea."

Some measurements made while filming the checkerboard pattern roughly orthogonal to the camera axis. The pictured lines, each spanning 5 squares, are within ±1% of each other in measured length (this still holds close to the border of the image).
http://www.kinovea.org/screencaps/0.8.x/lensdistortion/calibrated.png

For comparison, the same lines without lens distortion calibration:
http://www.kinovea.org/screencaps/0.8.x/lensdistortion/uncalibrated.png
(Lines at the edges of the image measure as little as 50% of their ground-truth length.)

For this test, lens distortion coefficients were computed in an external (free) application and imported automatically. The geometric calibration used a grid spanning 10 squares, marked as 200% in length.

902

(9 replies, posted in General)

"2. A module to compute lens distortion parameters based on the distortion grids you added throughout the video."

Distortion map computed using data from the grids added to the video of the previous snapshot.

As a vector field:
http://www.kinovea.org/screencaps/0.8.x/lensdistortion/vectorfield-small.png

As a distortion grid:
http://www.kinovea.org/screencaps/0.8.x/lensdistortion/distortiongrid-small.png

GoPro Hero 2 at 1920×1080@30fps (170°). Calibration used 5 sets of 25 points fed to OpenCV's cvCalibrateCamera2 to compute the distortion coefficients and camera intrinsic parameters. The distortion grid is drawn using the direct equations to re-distort the coordinates.
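For illustration, here is a minimal Python sketch of the same idea, not Kinovea's actual code (which is C#): the direct Brown-Conrady equations, with the (k1, k2, p1, p2, k3) coefficient layout used by calibrateCamera / cvCalibrateCamera2, map an undistorted pixel to its distorted position. The intrinsics and coefficient values below are placeholders, not the Hero 2 results.

```python
def distort_point(px, py, fx, fy, cx, cy, k1, k2, p1, p2, k3=0.0):
    """Map an undistorted pixel coordinate to where it lands in the distorted image."""
    # Normalize to the ideal (pinhole) image plane.
    x = (px - cx) / fx
    y = (py - cy) / fy
    r2 = x * x + y * y

    # Radial and tangential terms of the Brown-Conrady model.
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y

    # Back to pixel coordinates.
    return fx * xd + cx, fy * yd + cy

# Placeholder intrinsics for a 1920x1080 frame.
print(distort_point(1800.0, 900.0, fx=900.0, fy=900.0, cx=960.0, cy=540.0,
                    k1=-0.25, k2=0.05, p1=0.0, p2=0.0))
```

To draw the curved grid, each straight grid line can be sampled into points and each point passed through such a function.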

903

(10 replies, posted in General)

Hello,
That's right, I did not add any new arrow tools in this version.
I want to focus on the quantitative analysis aspects for a while before returning to presentation features.
I keep them in mind though.

The first two topics I will likely work on for 0.8.23 are:
1. A "data analysis" window to review kinematics data : trajectory plots, projective view of points and lines on the plane, possibly a heatmap style plot, possibly an angle/angle or angle/angular velocity diagram for coordination analysis. With facilities to export the filtered values and the resulting plots.
2. Lens distortion compensation for measurements, with import/export of distortion profiles and distorted view of the coordinate systems.

I also want to do some research on time distortions (e.g. rolling shutter), camera pose estimation, frontal area computation, time-of-flight cameras, etc.

904

(0 replies, posted in Français)

Experimental version, please report any regressions you find!

Installer: Kinovea.Setup.0.8.22.exe

The announcement topic is on the English forum.

905

(10 replies, posted in General)

Experimental version, feedback needed!
Beware of regressions and report anything suspicious. Do not assume the issue is known.

Installer: Kinovea.Setup.0.8.22.exe

Highlights:

General

  • New locales: Japanese, Serbian Cyrillic, and Macedonian.

  • Autosave and crash recovery mechanism.

  • Better video quality when saving.

  • Drag & drop KVA file on top of video.

Trajectories

  • Tracking parameters can be changed manually from the configuration dialog.

  • Subpixel accuracy everywhere.

  • 2D kinematics: velocity per component, acceleration, coordinates.

  • Angular kinematics through a "best-fit circle" of the trajectory.

  • Filtering of kinematics data through a Butterworth filter with automatic selection of the cutoff frequency (a rough sketch of this type of filtering is included after the highlights).

  • Tracking data for trackable drawings is saved to KVA file.

Other

  • The coordinate system display will work in perspective when using plane calibration.

  • The synchronization logic has been rewritten and many bugs with regard to synchronization or dual saving should be fixed.

  • Many bugs have been hunted down and fixed, entire parts rewritten to be more testable, and new tests added. A special effort went into this release to improve the internal and external quality of the software, as it is increasingly used in sport science classrooms around the world. If you find an issue, please report it immediately, on the bug tracker (preferred), here on the forum, or by email.
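As a side note on the Butterworth item above, here is a rough Python/SciPy sketch of that kind of zero-phase low-pass filtering. It only illustrates the general technique; it is not Kinovea's code, and the cutoff frequency is passed in by hand rather than selected automatically.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_series(values, fs, cutoff_hz, order=2):
    """Zero-phase low-pass Butterworth filtering of one coordinate series.

    values    : raw samples (e.g. x positions over time)
    fs        : capture framerate in Hz
    cutoff_hz : cutoff frequency in Hz (chosen automatically in Kinovea,
                supplied manually in this sketch)
    """
    b, a = butter(order, cutoff_hz / (fs / 2.0))  # normalized cutoff in (0, 1)
    return filtfilt(b, a, values)                 # forward-backward pass, no phase lag

# Toy example: a 1 Hz sine sampled at 30 fps with added digitization noise.
t = np.linspace(0.0, 2.0, 60)
noisy = np.sin(2.0 * np.pi * t) + 0.05 * np.random.randn(t.size)
smoothed = smooth_series(noisy, fs=30.0, cutoff_hz=5.0)
```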

The raw changelog is here.

Thanks!

----
Some screencaps:

The new trajectory configuration dialog with tracker parameters:
http://www.kinovea.org/screencaps/0.8.x/0822-configure-trajectory.png


Best-fit circle on a rotational trajectory:
http://www.kinovea.org/screencaps/0.8.x/0822-bestfit-somersault.png


Perspective coordinate system:
http://www.kinovea.org/screencaps/0.8.x/0822-coordinatesystem.png

Link to the website: http://slowmovideo.granjow.net/

This is definitely something to check out and experiment with. It will compute the optical flow of the scene and interpolate frames between the existing ones to create a fluid slow motion effect.

The graph editor requires some trial and error, as it's very generic and defaults to segments instead of curves. You can basically map the input time (vertical axis) to the output time (horizontal axis) in any way you want, to create slow, static, fast, or reverse motion.
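To make the idea concrete, here is a tiny generic sketch of such a time curve (only an illustration of time remapping in general, not slowmoVideo's implementation): control points map output time to input time, and each output frame is rendered from whatever source time the curve points to.

```python
import numpy as np

# Hypothetical control points: output time (s) -> input (source) time (s).
# A shallow slope gives slow motion, a flat segment a freeze, a falling one reverse.
curve_out = np.array([0.0, 2.0, 3.0, 5.0])
curve_in  = np.array([0.0, 1.0, 1.0, 0.5])

fps_out = 30.0
out_times = np.arange(0.0, curve_out[-1], 1.0 / fps_out)
in_times = np.interp(out_times, curve_out, curve_in)   # straight segments between points

for t_out, t_in in zip(out_times[:3], in_times[:3]):
    # A real renderer would blend (or flow-interpolate) the two source frames around t_in.
    print(f"output frame at {t_out:.3f}s reads source time {t_in:.3f}s")
```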

I got it to crash a few times, so definitely save your project often.

I don't know if it would be useful to have this type of slow motion at analysis time, since the frames are built from scratch and will not give you any information you are missing. For qualitative analysis it may be more pleasing to watch and help with feeling the movement. For export purposes it would be very nice to warp time around the key images in the time-freeze export, for example.

Chas Tennis wrote:

Golf Ball Impact  - But for the next level of studying golf - ball impact - the Jello Effect distortion seriously distorts the ball and club interaction.  I don't know how much research has been done on the impact of the golf club head and the ball.  I don't believe that it is routinely done for golf instruction. Unfortunately, I doubt that many golf instructors or golfers will pursue this level of detail. I'm sure the details of impact are very important in determining the ball's flight .......

Regarding golf, the folks at Quintic sell a dedicated rig for putting analysis: a small high-speed industrial camera attached to a platform, with additional flood lighting (link to software page). I have no expertise whatsoever in golf mechanics. I don't know if it could be used for the full-speed swing; they have probably tried, but they don't market it this way. In any event, it means there is indeed interest and research in club/ball impact, and not only at the "fundamental research" level.

908

(9 replies, posted in General)

1. A new drawing tool "distortion grid" allowing the user to place a grid on top of a checkerboard-like pattern.
(The easy part).

http://www.kinovea.org/screencaps/0.8.x/lensdistortion/drawingdistort2.png
(I have enlarged the width of the grid lines for the screenshot.)

At 6x zoom:
http://www.kinovea.org/screencaps/0.8.x/lensdistortion/drawingdistort-zoom.png

Manually placing the grid points might prove cumbersome. Many calibration tools have a function to automatically find corners in the image; we'll see.

909

(9 replies, posted in General)

For some reason, existing 2D analysis software packages for sport do not seem to have provisions to correct for, or take into account, lens distortion¹.
This is unfortunate because the 2D measurements assume that all points are coplanar, and lens distortion bends that plane.

Stating the obvious:

http://www.kinovea.org/screencaps/0.8.x/lensdistortion/distorted-measures.png
Fig 1. Ridiculous measurements on a checkerboard pattern displayed on a flat LCD screen, filmed with a 170° lens (GoPro Hero 2 at 1080p).


The green line was used for calibration and has a length of 5 squares. The red lines have the same pixel length as the green line, so without any other form of correction they also display the same physical length, which is obviously wrong.
The wide-angle lens mainly exhibits radial distortion; no amount of plane alignment or perspective plane calibration is going to fix this.

The currently suggested approach is to first undistort the video in external software and then use the undistorted video for your 2D analysis.

The problems with this approach are that:

  • You need to do it for every input video,

  • It may degrade the quality of the footage as it implies a re-encoding,

  • The software used may be oriented more towards "pleasing to the eye" results than mathematical rigor, and may use a simplified distortion model.

It's not just fisheye lenses. Almost all cameras have some form of distortion even though it's not always readily noticeable.


Here is the plan to integrate lens distortion into Kinovea.

  1. A new drawing tool "distortion grid" allowing the user to place a grid on top of a checkerboard-like pattern.

  2. A module to compute lens distortion parameters based on the distortion grids you added throughout the video.

  3. A module to save and load distortion parameters as "profiles", for reuse with all videos from that specific camera.

  4. Usage of the distortion parameters to compute undistorted coordinates and usage of these coordinates to make all measurements in Kinovea.

  5. Usage of the distortion parameters to re-distort straight lines into curves for display purposes. Interesting for lines, angles, rectangular grids, the perspective plane, and coordinate systems.

  6. [Nice-to-have] Import of calibration files from third-party photogrammetry software.

Note that it is not planned to undistort the images themselves in real time. Rather, I would like to keep the video untouched, but use the distortion parameters for measurements and display.
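To give a rough idea of step 4 (a sketch only, using OpenCV's Python bindings rather than Kinovea's C# code, and with placeholder intrinsics): each digitized point is mapped back to its undistorted pixel position, and distances, angles, and the plane calibration are then computed from those corrected coordinates.

```python
import numpy as np
import cv2

# Placeholder camera matrix and distortion coefficients (k1, k2, p1, p2, k3).
camera_matrix = np.array([[900.0,   0.0, 960.0],
                          [  0.0, 900.0, 540.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.05, 0.0, 0.0, 0.0])

def undistort_pixel(px, py):
    """Return the undistorted pixel coordinate of a digitized point."""
    src = np.array([[[px, py]]], dtype=np.float64)
    # Passing P=camera_matrix maps the normalized result back to pixel units.
    dst = cv2.undistortPoints(src, camera_matrix, dist_coeffs, P=camera_matrix)
    return float(dst[0, 0, 0]), float(dst[0, 0, 1])

print(undistort_pixel(1800.0, 900.0))
```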

I will use this topic to post updates on the progress of this feature. It may take a while since there are plenty of other topics in progress: completion of 0.8.22, bug fixing, etc. Your feedback is nonetheless highly appreciated.

----
¹: The popular commercial packages for 2D analysis (Dartfish, Quintic) do not seem to have any lens-distortion-related features. If you know otherwise, please post a comment.
The Centre for Sports Engineering Research at Sheffield Hallam University (UK) has developed a software package called Check2D; to my knowledge it is the first effort to address the issue in the context of sport study. Background and case study (PDF).
Software packages rebuilding 3D coordinates from multiple cameras are immune to lens distortion.

I'm actually cautious with very high framerates for now. During some experiments I found that there seems to be "time noise", with the framerate varying around its nominal value. The higher the framerate, the higher the error. When digitizing coordinates, this is aggravated by the reduced resolution.

As the frame interval is used to compute the velocity of tracked objects, time bias on high-framerate videos has a massive impact and introduces larger errors than at lower framerates. At 1000 fps a 1/10 px digitization error can already be catastrophic for acceleration measurement, so if the frame time itself is not exact…
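As a back-of-the-envelope sketch (all numbers are invented for illustration), this is how a small per-frame timing error corrupts finite-difference velocity when the nominal interval is assumed:

```python
import numpy as np

fps_nominal = 1000.0
dt_nominal = 1.0 / fps_nominal                 # assumed frame interval: 1 ms
true_speed = 5.0                               # m/s, constant motion for simplicity

# Frames are actually captured at jittered instants (here ±2% timing noise),
# but the velocity is computed as if the interval were exactly nominal.
rng = np.random.default_rng(0)
actual_dt = dt_nominal + rng.normal(0.0, 0.02 * dt_nominal, size=100)
positions = np.cumsum(true_speed * actual_dt)

velocity = np.diff(positions) / dt_nominal
print(f"mean {velocity.mean():.3f} m/s, std {velocity.std():.3f} m/s "
      f"(true value {true_speed:.1f} m/s)")
```

The same absolute timing error is a far smaller fraction of the frame interval at 30 fps than at 1000 fps, which is why the effect mostly shows up at high framerates, and it is amplified again when differentiating a second time for acceleration.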

I haven't found much literature on the subject yet (contrary to spatial distortions), but I intend to explore the subject more when time permits. The time bias introduced by the rolling shutter is another time distortion I would like to experiment with.

Do you know how the framerate fidelity is assessed at the manufacturer level?

Even the simplest experiment, filming a high-resolution stopwatch displaying milliseconds, is not trivial to implement, as this sort of stopwatch is not so common.

In addition to this, the things we see at 1000fps that we don't see at 300fps are probably relevant to different analysis contexts. For evaluating limb or implement movement in the context of sport technique, 300fps might be sufficient. Medical applications or "ball" analysis might benefit from higher framerates.

It is a niche market for sure. Whether the niche is large enough to sustain a business is a tough question.

Also noteworthy are the alleged specs of the upcoming GoPro Hero 4: 4K @ 30fps, 1080p @ 120fps, 720p @ 240fps… in a consumer price range. I think it has a rolling shutter though, and a shutter speed that cannot be adjusted, so whether it's suitable depends on the final application.

Somewhat related to wide-angle views, I've also noted a currently live Kickstarter project for a 360° camera in the affordable range. I don't know if they can pull it off.

912

(2 replies, posted in Bug reports)

Exfal wrote:

I'm trying to use Kinovea (version 0.8.15) to combine pairs of videos together into a single video. Sometimes they work, but most of the time the two videos in the combined video don't stay in sync (i.e. one plays at a normal-ish speed while the other is much slower)

The synchronization mechanism has been rewritten and many of these issues should be fixed in 0.8.22.

913

(3 replies, posted in General)

Good thinking on the calibration by hose; it would reduce the problem to one dimension instead of three, tracking the position of the point along the curve. However, I think it would be very hard to get right: holding the calibration hose in mid-air so that it matches the trajectory. Even with the help of the actual video overlaid on top of the live view, it would be mostly guesswork to position and curve it properly, with no real way of knowing the bias relative to the actual trajectory in 3D space.

The positions between two marks would also not increment linearly due to perspective, although this could be mitigated by interpolating.

Another way might be to get a depth map with a Kinect-like device. The athlete might have to wear IR-reflective markers. A way to actually calibrate that third axis would still be needed, but maybe the Kinect already provides this information.

914

(3 replies, posted in General)

You need to have a calibration reference in the video. Add a line drawing on top of something of a known length, then right-click the line and go into "Calibrate measure" to set the physical length of the object.

The accuracy of the measurement will depend on several factors.

The line of known length should be on the plane of motion, otherwise there will be perspective error.

The calibration object should be as large as possible and as close to the center of the image as possible to limit error due to camera lens distortion.

It should also be as close as possible to where the trajectory will take place. One idea would be to use the leg of the kicker, if static points are clearly visible on it during the kick and assuming it doesn't deform (see also the reservations about rotation below).

To limit camera lens distortion, avoid fisheye-type lenses by all means. With a regular camera you will want to move back and then zoom in; this will flatten the scene and reduce perspective errors. Use a tripod and remote-control the recording to minimize camera movement between calibration and measurement. This is especially important if you record several attempts with the same calibration frame.

The camera's optical axis should be perpendicular to the plane of motion to avoid perspective errors. At the very least you will want to make sure it's parallel to the ground, using a small spirit level for example.

I'm not too familiar with how much of a rotational component there is during the kick. Not all of the motion will take place on a single plane. You need to keep that in mind when comparing the speeds of several subjects if they have different kicking styles.

The next version is coming along and will have several improvements on this topic (sub-pixel tracking, configurable tracker parameters, acceleration, angular speed, etc.).

915

(6 replies, posted in Français)

Hello,
How does the problem show up? Does nothing happen when you drag the camera into the second capture screen? Is there an error message?