If you go to Options > Time you can choose a format for timecodes. Using "Total milliseconds", for example, should turn all times into numerical values, both in the interface and in the spreadsheet export. You can then use them for time arithmetic.

In the data analysis window for tracks, the times are always in milliseconds, regardless of the time format option. So copying to the clipboard and pasting into a spreadsheet should also allow arithmetic.
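
As a minimal sketch of the kind of arithmetic this enables (assuming a "h:mm:ss.mmm" timecode string; adjust the parsing to the format you actually export):

```python
# Minimal sketch: convert a "h:mm:ss.mmm" timecode string to total
# milliseconds so that differences become plain integer arithmetic.
def timecode_to_ms(tc: str) -> int:
    hms, _, ms = tc.partition(".")
    h, m, s = (int(x) for x in hms.split(":"))
    return ((h * 60 + m) * 60 + s) * 1000 + int(ms or 0)

# Elapsed time between two events:
print(timecode_to_ms("0:00:01.250") - timecode_to_ms("0:00:00.750"))  # 500 ms
```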

I haven't experienced this issue myself yet, but it has been widely reported that a recent auto-update of Windows 10 is breaking MJPEG streams for applications based on DirectShow. This impacts, for example, the Logitech cameras in Kinovea.

There is a long thread over at the MS Dev forums, and the issue is apparently breaking many high-profile applications like Skype.

Basically, they wanted to let multiple applications consume camera streams, so they moved the decoding stage upstream in the pipeline and removed access to the compressed stream.

They are working on a fix which should be pushed through auto-update in September.

This is definitely handled differently in 0.8.24.

When using comparison, both videos should progress relative to a common absolute time. There is also an option in the preferences to unlock the speed sliders if that's required. The speed slider percentage is relative to real time (it takes the capture framerate into account if it has been set).
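
As an illustration of that arithmetic (this is my reading of the option, not Kinovea's actual code): to play motion at a given fraction of real time, the player has to display the capture framerate scaled by that fraction.

```python
# Illustration only: map a real-time-relative speed percentage to the
# frame rate shown on screen, when a capture framerate has been set.
def screen_fps(capture_fps: float, speed_percent: float) -> float:
    # At 100% the motion unfolds at real-time speed, so the player must
    # display capture_fps frames per second; lower percentages slow it down.
    return capture_fps * speed_percent / 100.0

# A clip captured at 240 fps and played at 12.5% shows 30 fps on screen,
# i.e. the typical nominal playback rate of such a file.
print(screen_fps(240, 12.5))  # 30.0
```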

After the synchronization point has been set between videos, the times before it are indeed expressed as negative.

0.8.25 also has an additional setting to force a different reference playback framerate, in case the video metadata is wrong.

It's also my experience that most "cheaper" sensors will auto-expose in low-light conditions and degrade the framerate to 1/exposure duration.
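
The arithmetic is simple: a frame cannot be exposed for longer than the frame interval, so the framerate is capped at the reciprocal of the exposure duration.

```python
# A sensor cannot expose a frame for longer than the frame interval,
# so auto-exposure caps the framerate at 1 / exposure duration.
def max_fps(exposure_seconds: float) -> float:
    return 1.0 / exposure_seconds

print(max_fps(0.100))  # 100 ms exposure in dim light -> at most 10 fps
```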

Thanks for providing the straight-from-camera files. Here are the links:
- ex100f-1.avi. 240fps, 1/10000, 512×384px. (8.17 MB).
- ex100f-2.avi. 480fps, 1/10000, 224×160px. (7.36 MB).

Cool. Thanks for posting!

That's a lot of light for a 100 µs opening, which is good. Was it a very sunny day? It also seems you are facing the sun, which would help with the short exposure time. The dynamic range doesn't seem very high though. I wonder how much light would be required for indoor filming.

The rolling shutter distortion is visible on the club on the way down.

Do you still have the raw file straight from the camera, and could you upload it somewhere? I'm wary of YouTube compression artifacts. We can also host it here if it's not too large.

Do we know what sensor this device is using?

This operation is not currently possible but that's a very good idea!

A slight generalization of this would be "alignment by coordinate systems" or "by calibration", as the approach could also work with the line-based calibration / coordinate system (just an origin and a scale; the axes stay aligned with the image axes). Even if it's less accurate, it's often the only calibration available.

Lens distortion correction might also come into play.

The original goal with superposition was actually to compute this transform matrix automatically, refining it using the video sequence to ignore the foreground layer. I very much like the idea of being able to do something manually before automating it.

By the way, we need the full homography matrix, not just an affine transform, as it has to map an arbitrary quad to an arbitrary quad. I've been thinking about how to finally build a platform to experiment with these ideas more easily. I also need to revisit and homogenize the matrix maths in some places. No ETA.
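
For reference, here is a sketch (numpy, not Kinovea code) of the standard direct linear transform that estimates the 3×3 homography from four point correspondences, which is exactly the quad-to-quad mapping an affine matrix cannot represent:

```python
import numpy as np

def homography(src, dst):
    # Direct linear transform: each correspondence (x, y) -> (u, v)
    # contributes two linear equations in the 9 entries of H.
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector with the smallest
    # singular value (H is only defined up to scale).
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

src = [(0, 0), (1, 0), (1, 1), (0, 1)]            # unit square
dst = [(0, 0), (2, 0.1), (2.2, 1.3), (-0.1, 1)]   # arbitrary quad
H = homography(src, dst)
p = H @ np.array([1.0, 0.0, 1.0])
print(p[:2] / p[2])  # ~ (2, 0.1): quad corner maps to quad corner
```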

This looks pretty promising!

Yeah, this is the color model. Unfortunately I wasn't able to test it at the time, and 0.8.24 won't work with any of the color models.

The issue should have been fixed for 0.8.25, but the code is still based on the Basler Pylon API v4, which was the current version when I worked on this last fall. Since then, Basler has updated their software stack to v5, which will break compatibility. I will have to revisit this for 0.8.26.

coxisambo wrote:

Sometimes the camera is not well placed and it is not 100% horizontal.

The grid coordinate system should be well suited for this, as it's one of its main purposes. Add a perspective grid, right-click one of the corners, and enter the calibration dialog.

coxisambo wrote:

Another thing is to calculate inclinations, or an angle between two segments that are not interconnected by an axis of movement. Then an angle of "four" points would be the point. Digitization is then from distal to proximal in both arms.

Yes, that would be a nice tool to have. It might be doable as a custom tool.

Well, the circle tool is designed more as an annotation tool than a measurement tool. As such it's not included in the spreadsheet export. The center and radius are saved (in pixels) in the KVA file.

The marker tool is going to be the preferred option for exporting the individual coordinates of objects.

I just thought of the fact that you are filming underwater!

The lens distortion is going to be different because of the refractive index of water vs. air. Ideally you should perform the lens distortion calibration underwater as well, not reuse the coefficients computed in air. I don't know exactly how much of a difference it makes, but I think it's worth a test.
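
If you want to try it, a common way to compute the coefficients is OpenCV's chessboard calibration. The sketch below assumes a set of chessboard images shot underwater in a hypothetical underwater_calib folder (this is generic OpenCV usage, not Kinovea's own calibration pipeline):

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # inner corner count of the printed chessboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for path in glob.glob("underwater_calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

# dist holds (k1, k2, p1, p2, k3); expect different values than in air,
# since the water/air refraction is folded into the distortion model.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print(rms, dist.ravel())
```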

Yes, the frame shift most likely depends on the format, or even on the encoder in the camera. If you want, you can send me a file exhibiting the problem so I can see if it's a bug that could be fixed. If it's less than 5 MB, send it to joan at kinovea dot org; if it's larger, host it somewhere else and send me the link.

Regarding filtering, I would still suggest testing a digitization of a file for which you have ground truth available, if possible. Ideally it should come from a physical measurement system, not from another optics-based system. The filtering helps smooth out the minuscule noise introduced by the manual or automated tracking process, where even subpixel placement at 600% zoom might not be enough to get the exact coordinates. I would expect this to be universally beneficial for precision/repeatability.

Note that the radial distortion calibration will also not be perfect, and it is usually less accurate at the periphery. The tracking works only in 2D, so deviation from the plane of motion is also going to add errors. If you are computing derivatives, the noise is going to increase the error. If you compute or save acceleration data, for example, I would definitely try to evaluate the accuracy first, to know where you stand.
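
To see why derivatives are the weak point, note that finite differencing amplifies position noise roughly by the sampling rate at each differentiation step:

```python
import numpy as np

fs = 100.0                       # example capture framerate in Hz
t = np.arange(0, 2, 1 / fs)
pos = t**2 / 2 + 0.001 * np.random.randn(t.size)  # smooth path + 1 mm noise
vel = np.gradient(pos, 1 / fs)   # true velocity is t
acc = np.gradient(vel, 1 / fs)   # true acceleration is 1
print(np.std(vel - t), np.std(acc - 1.0))  # error grows with each derivative
```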

There is a filtering pass on the raw coordinates to remove the high-frequency noise produced by the digitization. There is more information about the exact process and the selected cutoff frequency in the About tab of the data analysis dialog.

The approach comes from the sport science literature; I don't know how relevant it is to the burst swimming of fish, but I think it should still be better than the raw coordinates.
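
The details are in that About tab, but as a generic sketch of the sport-science approach (a zero-phase low-pass Butterworth filter; the cutoff below is a placeholder, not Kinovea's actual value):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth(coords, fs_hz, cutoff_hz=6.0, order=2):
    # filtfilt runs the filter forward and backward, removing the
    # high-frequency noise without introducing phase lag.
    b, a = butter(order, cutoff_hz / (fs_hz / 2.0))
    return filtfilt(b, a, coords)

fs = 100.0
t = np.arange(0, 2, 1 / fs)
raw = np.sin(2 * np.pi * t) + 0.05 * np.random.randn(t.size)  # noisy track
print(np.std(raw - smooth(raw, fs)))  # mostly the removed noise
```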

The spreadsheet export from the main menu does not apply the filtering; it's just the raw data. The shift by 4 frames is strange. Maybe it's one of those files where the first image has a time coordinate different from zero, which causes issues.

Did you export via Spreadsheet from the main menu, or through the data analysis dialog from the trajectory's context menu?

Please try the zooming bug again in 0.8.24; there were important changes regarding precision since 0.8.15, and the coordinates are now stored with subpixel precision. I'm interested to know if you can still reproduce it.

The bug where an angle drawing would vanish into thin air has been fixed for 0.8.25. To avoid the problem in 0.8.24, you should keep the mouse button down until the second leg of the angle is placed: 1. mouse down to add the angle, 2. keep the mouse down and move away a bit, 3. mouse up.