1

Hello Joan,

Recently I stumbled upon some strange behaviour in the timing of the delay. I'm using Kinovea 0.8.27 with a 1080p30 IP camera.
When I open the video stream there is an initial delay of around 0.3 seconds, because an IP stream is not 100% realtime (this is expected behaviour). The reported framerate is 'stable' at 29.9-30.1 fps. The display framerate is set to 30 fps.

Now when I increase the delay to 1800 frames (60 seconds at 30 fps), the actual delay in the video is around 85 seconds.
https://i.ibb.co/WWLkqGV/Kinovea-Delay-issue.png

When I record the delayed stream, I get around 21 fps recorded (so the playback is way too fast).

If I now change the display framerate from 30 fps to 21 fps (and reset the delay to 60 s at 21 fps = 1260 frames), the actual delay in the video is the correct 60 seconds, and playback of a new recording is at normal speed.
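
A quick back-of-the-envelope check (my own numbers, not anything taken from the Kinovea code): it looks like the slider converts seconds to frames using the configured 30 fps, while the frames are actually consumed at the ~21 fps display rate, which would explain the stretch:

double configuredFps = 30.0;                          // framerate the delay slider assumes
double actualFps = 21.0;                              // framerate actually achieved
double delayFrames = 60.0 * configuredFps;            // 1800 frames, requested as "60 s"
double actualDelaySeconds = delayFrames / actualFps;  // ~85.7 s, matching the ~85 s I observe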

Proposal for 3 changes:
1. Indicate the actual display framerate in the status bar on top of the video.
2. Fix the delay slider timing+frames to the actual display framerate, so the reported delay is correct.
3. Set the correct fps (actual display fps) in the header of the recorded video.

2

Hi,
In addition to the configured framerate and the display framerate, there is also the actual, received framerate. It's possible that Kinovea is not receiving frames at the correct speed for whatever reason (low light, network conditions, camera is lying, etc.). There should be a "Signal" field in the infobar that shows up after a while. This shows the framerate received.

The way recording with delay works has completely changed in the next version, so we'll see if that fixes your particular problem. It will now always save at the camera's configured framerate; the display framerate is not used anymore.

It's possible the issue is that the stream received is at 30 but Kinovea wasn't able to record it fast enough to disk unless you force it down to 21. Or it's possible the stream received is at 21 in the first place. The "Signal" value should give the answer to that. Another way I check is to pause the stream and navigate through the recent frames with the delay slider to see if the stopwatch matches what the delay says.

Based on your description it looks like the actual received framerate is 21, even before attempting to record to disk. Unless the camera is lying, this should be fixable in other ways (most probably by adding more light, so that auto-exposure doesn't force a lower framerate).

For this case of the camera not sending what it's configured to, currently the best way to fix it is at playback time, by going to the menu Video > Configure timing and fixing the framerate there. In the future I'll also look into adding an option to save the "received framerate" in the video. Ideally that option would be per-camera and saved in the preferences, since the goal is to fix cameras that are lying about their framerate. As the release of the next version is coming soon, I've pushed this to the roadmap for the one after.

3

Hello Joan,

The 'Signal' field shows 29.9-30.1 fps received, so that seems to be okay (the camera is even capable of 1080p50). If I record the video stream directly in Kinovea, it records the correct 30 fps, so the hard disk is not the bottleneck either.

Good to hear that you are working on the delayed recording for the next version. Do you have any idea when the new version will become available? I would love to test the new delay features.

4 (edited by arnopluk 2019-11-29 16:22:15)

Hi Joan,
I just noticed that you have updated the build environment to something feasible. Thank you very much for that. I immediately tried the latest Kinovea version from GitHub!
The recording results from the delayed video stream are indeed much better: I now get the expected 30 fps recording. However, the display framerate is unfortunately very low (1 fps). From the logfile I get:

CaptureScreen - Nominal camera framerate: 0 fps, Monitor framerate: 59 fps, Custom display framerate: 30 fps, Final display framerate: 1 fps.

cameraGrabber.Framerate is 0 for some reason; I haven't found where that info comes from. Because of that, the Math.Min() chain pulls the display framerate down to 0, and the final Math.Max() clamps it to 1 fps.
If I change the code in Kinovea.ScreenManager.CaptureScreen.Connect2() (lines 609-611) as follows, it works for IP cameras, but I don't know enough about the details to say this is an actual fix:

// Start from the configured display framerate, capped by the monitor refresh rate.
double slowFramerate = Math.Min(displayFramerate, monitorFramerate);

// The camera framerate is 0 for IP cameras at this point, so fall back to the
// measured pipeline frequency, and only cap by it when we actually have a value.
double framerate = cameraGrabber.Framerate;
if (framerate == 0)
    framerate = pipelineManager.Frequency;
if (framerate != 0)
    slowFramerate = Math.Min(slowFramerate, framerate);

// Original line, which pulled the result down to 0 when the camera framerate was unknown:
// slowFramerate = Math.Min(slowFramerate, cameraGrabber.Framerate);

slowFramerate = Math.Max(slowFramerate, 1);

5

Oh this is great! :)

I reproduced the problem and pushed a fix, let me know if it works for you.

The goal of this piece of code is to set the timer refresh rate for the display, and the only thing we really want is to avoid it being too high. For example, if this is a high-speed camera running at 500 fps, we don't want the burden of refreshing the UI at that rate; it would compete for computer resources with the recording, which actually *needs* to happen at 500 fps. So it needs to stay a `min()`. In some cases though, the camera framerate can't be known in advance; this is the case for IP cameras (for now, until maybe ONVIF is implemented or something). So in that case we'll just ignore the (unknown at this point anyway) camera framerate altogether and use the configured display framerate or the monitor framerate.
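
To make that concrete, the selection boils down to something like this (a simplified sketch of the intent, not the exact code I pushed; the helper name and parameters are only for illustration):

// Simplified sketch (illustrative, not the exact committed code): cap the UI refresh
// rate with min(), but only take the camera framerate into account when it is known.
static double GetDisplayTimerFramerate(double displayFramerate, double monitorFramerate, double cameraFramerate)
{
    double slowFramerate = Math.Min(displayFramerate, monitorFramerate);

    // IP cameras report 0 at connection time, so skip the camera framerate instead
    // of letting it drag the result down to the 1 fps floor.
    if (cameraFramerate > 0)
        slowFramerate = Math.Min(slowFramerate, cameraFramerate);

    return Math.Max(slowFramerate, 1);
}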

The `cameraGrabber.Framerate` is going to come from the camera modules (FrameGrabber.cs in Kinovea.Camera.HTTP for example).

`pipelineManager.Frequency` is only going to be valid after some frames have been received, so it shouldn't be used during connection.
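
Just to illustrate why (this is not the actual PipelineManager code, only a generic sketch of a received-framerate counter): the value has to be computed from the timestamps of frames that have already arrived, so it only becomes meaningful after a window of frames has been received.

// Generic illustration of a received-framerate estimator (not Kinovea's implementation).
using System;
using System.Collections.Generic;

class ReceivedFramerateEstimator
{
    private readonly Queue<DateTime> arrivals = new Queue<DateTime>();

    public void OnFrameReceived(DateTime now)
    {
        arrivals.Enqueue(now);
        if (arrivals.Count > 60)      // keep roughly the last couple of seconds at 30 fps
            arrivals.Dequeue();
    }

    // Returns 0 until at least two frames have been received.
    public double Frequency
    {
        get
        {
            if (arrivals.Count < 2)
                return 0;

            DateTime[] a = arrivals.ToArray();
            double seconds = (a[a.Length - 1] - a[0]).TotalSeconds;
            return seconds > 0 ? (a.Length - 1) / seconds : 0;
        }
    }
}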

Thanks!

6

Thank you for the update. It works perfectly for me now!

In 0.8.25 I proposed a fix to update the delay when changing it with the hotkeys. This doesn't work anymore (it seems to be due to the nud); setting the delay with the slider is also very buggy because of this.
The old fix should be removed. I'm working on a new one (using sldrDelay.Force(value) works for the hotkeys, but not for the nud).

In my opinion the nud should have a fixed maximum (it is currently calculated from the framerate, so it can change over time?), and maybe everything should be in frames instead of seconds for the nud?

Besides that, I would suggest changing the delay slider to a linear slider.

7

Yeah, before the nud it was really hard to set small values precisely; that's why the slider is logarithmic. I guess this is no longer really relevant, so yes, it could use a linear slider instead. It could also be an option. The scenario for very small values is to match two cameras that have different capture latencies.
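
For reference, the difference between the two is only in how the slider position maps to a delay value; roughly something like this (a generic sketch, not the actual slider code; the 0.1 s minimum is just an illustrative value):

// Generic sketch of the two mappings, where t is the slider position normalized to [0..1]
// and maxDelay is the largest selectable delay in seconds.
static double LinearDelay(double t, double maxDelay)
{
    return t * maxDelay;
}

static double LogarithmicDelay(double t, double maxDelay)
{
    // Exponential growth from ~0.1 s up to maxDelay: most of the slider travel is spent
    // on small values, which is what made precise small delays easy to set before the nud.
    double minDelay = 0.1;
    return minDelay * Math.Pow(maxDelay / minDelay, t);
}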

I still think it should be in seconds though. Internally everything is in frames, but from a user point of view I don't think frames make sense for the general concept of delay; what is the scenario where you think about delay in frames? Also, for the case of pre/post recording, like recording for x seconds before and after a trigger event, it's natural to have this in seconds.

8

I agree that seconds is a more natural unit for the delay. Let's try to make that work.

I hadn't considered the 'capture latency' (or synchronisation) use case for the delay. This would indeed be better with a logarithmic scale (or maybe this is covered by using the nud?).

I think I narrowed the problem down to a loop in updating values:

SldrDelay_ValueChanged
--> calls: presenter.View_DelayChanged
--> calls: DelayChanged
--> calls: view.UpdateDelay (with delay in seconds recalculated from pipeline frequency)
--> calls: UpdateDelayLabel
--> changes: nudDelay.Value , so calls: NudDelay_ValueChanged
--> changes: sldrDelay.Value , so calls: SldrDelay_ValueChanged (with delay recalculated from a different framerate)

9

I added a flag to the ValueChanged event from the Nud to prevent it from recursively updating the slider.
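
In essence the guard looks like this (a simplified sketch of the approach, not the exact code from the pull request; only the control and handler names that already appear in this thread are taken from the real code):

// Simplified sketch of the guard (illustrative, not the exact code of the pull request).
private bool updatingDelay;   // true while the controls are being updated programmatically

public void UpdateDelay(double seconds)
{
    updatingDelay = true;
    try
    {
        nudDelay.Value = (decimal)seconds;   // raises NudDelay_ValueChanged
        sldrDelay.Value = seconds;           // raises SldrDelay_ValueChanged
    }
    finally
    {
        updatingDelay = false;
    }
}

private void NudDelay_ValueChanged(object sender, EventArgs e)
{
    // Ignore events caused by our own programmatic update; this is what breaks the loop.
    if (updatingDelay)
        return;

    sldrDelay.Value = (double)nudDelay.Value;   // only propagate genuine user input
}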

I created a pull request on GitHub for the changes I made to the code. Please feel free to check and comment on it.

10

Merged!
Super thanks! :D