You're right, at the moment it's not really possible to do this.

You can use start/stop tracking to prevent objects from being tracked after the section you are interested in.
When you start tracking, it automatically changes the visibility option from the default fading to "Always visible". (You can change it back, but then the drawing might fade out before the tracking section is over.)

In the next version there will be more control over each drawing's visibility/fading. In theory this will let you do it, since you can keep a drawing visible for a set number of frames, but matching the duration manually will be tedious. I'll check how to support such an option in a friendlier way.

This sounds awesome! Thank you for looking into this.

edit: Tested with various alphabets and it seems to work beautifully. Thanks!

(2 replies, posted in General)

Hi,
You are right, the documentation is obsolete on this point: the capture system was rewritten for performance reasons, and this particular feature, in the way it was written, wasn't compatible with the changes.
I'm looking at supporting this again for the next version with a different approach to keep the recording performance intact.

You can design your own grid. There is no "designer" application though; it has to be created manually in an XML file.

It's a bug. For some reason there was an explicit test capping the output framerate to 100 fps. I traced it to a specific commit in 0.8.22 but I don't remember why I put it there in the first place. I can't think of any adverse effects of removing this limit at the moment. Should be fixed in the next version.

(36 replies, posted in General)

NeilHa wrote:

* Update. 09Feb20. No response from Code Laboratories or the original author of the CL PS EYE driver regarding plans for a 64bit version so have switched to a different driver which appears to work pretty well in 64bit mode with Kinovea 9.1 on both Windows 10 and 7 64bit. It can be found here :  https://github.com/jkevin/PS3EyeDirectS … rBeta2.msi

This is great!


dong979us wrote:

Once the delayed video is captured, how do I make it so that it pops up in New window automatically so that i dont have to click on it.

You can do File > Open replay folder observer… and point it to the directory where the videos are captured. Any time a new video file is created in this folder it will automatically open and start playing in that screen.
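Conceptually, an observer like this only has to notice newer files appearing in the folder. Here is a rough polling sketch in Python; the extension list and poll interval are assumptions for illustration, not Kinovea's actual implementation:

```python
import os
import time

VIDEO_EXTS = {".mp4", ".avi", ".mkv"}  # assumed set of recorded-video extensions

def newest_video(folder):
    """Return the most recent video file in the folder, or None.
    Uses modification time as a proxy for creation time."""
    candidates = [
        os.path.join(folder, name)
        for name in os.listdir(folder)
        if os.path.splitext(name)[1].lower() in VIDEO_EXTS
    ]
    if not candidates:
        return None
    return max(candidates, key=os.path.getmtime)

def watch(folder, on_new_video, poll_seconds=0.5):
    """Poll the folder and call on_new_video(path) whenever a newer file shows up."""
    last = newest_video(folder)
    while True:
        time.sleep(poll_seconds)
        current = newest_video(folder)
        if current is not None and current != last:
            last = current
            on_new_video(current)
```

In practice a real implementation would more likely use filesystem change notifications than polling, but the effect is the same: the newest file in the watched folder is opened as soon as it appears.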

dong979us wrote:

How do I make it so that delayed video is played at slower speed all the time so that I do not have to set it every recorded video?

Once you make the videos open in a replay folder observer, you can reduce the speed and it should keep that setting for the next videos.

dong979us wrote:

Can the drawing be transfered so that I don't have to draw it on every recording? Or at least save the drawing setting??

If you want to have specific drawings always loaded whenever a video is opened, you can do this:
- Add the drawing.
- Save as .KVA (File > Save, Save only the analysis).
- Rename it to "playback.kva".
- Copy it to the application data directory, which you can get from Help > Open log folder.

When you do this, every video will open playback.kva and import its content.
This can also be done with the capture screen, the file has to be named "capture.kva".
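The copy-and-rename steps above can also be scripted. A minimal sketch in Python; the actual application data path varies per machine (check Help > Open log folder for the real one), so it is passed in as a parameter here:

```python
import shutil
from pathlib import Path

def install_default_kva(saved_kva, appdata_dir, target_name="playback.kva"):
    """Copy a saved .kva annotation file into the application data
    directory under the special name that is picked up at load time."""
    src = Path(saved_kva)
    dst = Path(appdata_dir) / target_name
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(src, dst)
    return dst
```

Pass `target_name="capture.kva"` to install the equivalent file for the capture screen.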

One thing that is in the works, though, is having the drawings you add in the capture screen automatically saved as a KVA file alongside the recorded video. I think this would solve your problem in a better way, and it will also help with setting the time origin and other things.

Could you describe the symptoms in more detail? When you go to the camera tab, is there no entry for the camera? Or is there a spot but the thumbnail is empty? Or do you see the thumbnail, but when you open it the screen is empty, or black, etc.?
Could you send me the log.txt to joan at kinovea.org? Thanks!


Hi, yes, the PS Eye webcam will only work with 32-bit applications. I don't know of any workaround. At the moment the 32-bit build is broken and it's a bit of a pain to maintain, so the incentive isn't there. Right now there is no plan to add back 32-bit support. If someone contributes it and it's not a chore to maintain, I'll merge it though.

(1 reply, posted in General)

Hi,
You can go to menu Tools > Angular kinematics. In the lower right corner there are buttons to export the data to CSV. The first column will be the time in milliseconds and then one column per angle. You can check/uncheck sources to include/exclude angles.
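Assuming the layout described above (time in milliseconds in the first column, then one column per angle), the exported CSV can be read back with a few lines of Python; the header names here are hypothetical examples:

```python
import csv
import io

def read_angles_csv(text):
    """Parse an angular-kinematics CSV export: first column is time (ms),
    each remaining column is one angle series keyed by its header name."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    times = []
    series = {name: [] for name in header[1:]}
    for row in reader:
        times.append(float(row[0]))
        for name, value in zip(header[1:], row[1:]):
            series[name].append(float(value))
    return times, series
```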

Hi, there are multiple framerates to distinguish: the one announced by the vendor on their website, the one exposed by the driver (which can be configured in AMCap, Kinovea, or other DirectShow applications), and finally the one that is actually sent by the camera.

If the info bar says something like

1280×720 @ 120 fps (MJPG) - Signal: 101.00 fps

Provided the exposure duration is short enough (less than 1/fps), it means the camera isn't really sending what the driver is announcing: the driver says 120, the camera sends 101. This seems to be a recurring problem with these modules from Shenzhen.

AMCap doesn't do any dynamic measurement as far as I know, it shows what the driver says. In Kinovea you will also be shown what the driver says in the configuration window, but then while streaming it will count the frames really received from the camera and show you the actual framerate.
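The "Signal" figure described above amounts to counting the frames actually received over a time window. A sketch of that measurement, independent of any camera API:

```python
def measured_fps(frame_timestamps):
    """Estimate the real framerate from the arrival times (in seconds)
    of the frames actually received, regardless of what the driver claims."""
    if len(frame_timestamps) < 2:
        return 0.0
    elapsed = frame_timestamps[-1] - frame_timestamps[0]
    # N timestamps delimit N-1 frame intervals.
    return (len(frame_timestamps) - 1) / elapsed
```

A camera whose driver announces 120 fps but which delivers frames 101 times per second would yield roughly 101.0 here, matching the info-bar example above.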

Does the camera not work at all in 0.9.1 or does it work with this framerate discrepancy?


The threshold should floor at 60 fps, can you really set it to 30?
To be clear, this mechanism is only used by the capture screen. When the video opens in playback, what does the header above the screen say? It should say 24 fps. If both files show 24 fps in the header, the composite video should also be 24 fps. I'll double-check the behavior when the files don't have the same framerate; it's possible there is some trickery that makes it fall back to 100 fps for some reason.


Thanks, I think I can reproduce both issues. The Synchronize button is definitely broken for the right video; this looks systematic, collateral damage from recent changes in this area. Will fix ASAP.

I can also reproduce the auto playback issue by manually recording each camera to force a gap in file creation times. The dual playback needs to be made more robust. In theory, when the late video starts it should force the other one to move back, and they should continue in sync. This seems to work when done manually but not when the videos are started automatically.

There is also a third issue you might encounter from time to time: both videos start replaying, but not properly locked, and one of them starts to drift away after a few iterations.


inorkuo wrote:

i'm still not sure what the threshold and replacement settings do but I will play around with it.

Typically when capturing from a high-speed camera, the camera is sending, let's say, 300 fps. In the final file we don't write 300 fps but something more reasonable like 30 fps, so the player can read it without trying to decode at 300 fps, which would be too intensive. Physical high-speed cameras and phones do the same when saving.

The first line, "Framerate replacement threshold", is the value above which this behavior is active. By default it's 150 fps. Anything under this will keep its value in the final file; anything above it will be replaced by something else. I had a hard time with the wording; it's hard to convey the meaning succinctly.

The second line, "Replacement framerate" is the actual framerate that will be written in the video metadata.

The old behavior was that anything above 100 fps was silently converted to 30 fps.
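Put together, the two settings reduce to a simple rule. A sketch with the defaults mentioned above (how the boundary is handled at exactly the threshold is an assumption):

```python
def file_framerate(capture_fps, threshold=150.0, replacement=30.0):
    """Framerate written into the recorded file's metadata:
    keep the capture rate at or below the threshold, replace it above."""
    return capture_fps if capture_fps <= threshold else replacement
```

With the old hard-coded values this would have been `file_framerate(fps, threshold=100.0)`, which is why a 120 fps capture used to come out tagged as 30 fps.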

inorkuo wrote:

there are a few more issues I've run across. in dual playback mode, the "synchronize videos on the current frames" does not work.

Can you describe how the problem manifests itself? What this is supposed to do is create a "time origin" point in each video at the respective current position. Synchronization then works based on these time origins.

- When you hit the button, does it change the timing in the individual videos (they should now show negative times before the sync point)?
- Does it work if you move each video to the desired point and set up each individual video instead, using "Mark current time as time origin" (button or right-click in each video)?
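In terms of the time origin mechanism, each video's timestamps become signed offsets from its own origin, and synchronization aligns the zeros. Roughly:

```python
def to_relative(times_ms, origin_ms):
    """Express absolute video times as signed offsets from the time
    origin; times before the origin become negative."""
    return [t - origin_ms for t in times_ms]
```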

inorkuo wrote:

the last issue is with automatic playback with two playback screens, when new videos are created, often only one will play and I have to manually press play to get both to play together. it seems that there is a correlation to the difference in the length of the two videos. I have the "stop recording by duration" set to 3 seconds but sometimes, one video will be 3.1s and the other will be 2.5s. it seems that when the difference is small, both videos will play. when the difference is larger, only one will play.

Are the cameras captured to the same folder? This is not supported in replay at the moment, as it just looks for the most recent file. They should be saved to different folders, and each replay screen should be pointed to its respective folder. But even then, both screens should load the same video and start playback, so I'm not sure what's going on.

One video being 2.5 s indicates another issue: either there are frame drops, or the camera isn't sending a true 120 fps.

You can add the bug on GitHub and attach the screenshots there. Most probably there is a reset of the scaling, so the video and drawings are exported at the original video size, and a revert to the current scale (based on the video fitting the screen space, or other custom zooming) is missing.

Ah yes, the first idea should totally be doable. Even by default, it feels like a key image sitting exactly at the time origin should be framed in red to match the color scheme and the tick mark in the timeline.