Another day: serious work during the day, hacking in the evening.
We now have an image of the movement from which we will be able to extract descriptors. (We will use these descriptors to look for known patterns.)
Of course, everything runs in real time at a pretty good frame rate.
Note the small line at the top left of the visualization window: it shows the global direction of the movement. In this sequence, I moved my arm toward my head.
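The post doesn't say how that direction indicator is computed; one common approach is to average the per-pixel motion (optical flow) vectors over the frame and take the angle of the mean vector. A minimal sketch, assuming the flow field is available as a NumPy array of (dx, dy) vectors (the function name and array shape are illustrative, not the actual code):

```python
import numpy as np

def global_motion_direction(flow):
    """Return the dominant motion direction in degrees
    (0 = right, 90 = up), given a (H, W, 2) flow field."""
    mean_dx = flow[..., 0].mean()
    mean_dy = flow[..., 1].mean()
    # Negate dy because image y coordinates grow downward
    return np.degrees(np.arctan2(-mean_dy, mean_dx))

# Example: a flow field where everything moves straight up
# (dy = -1 in image coordinates)
flow = np.zeros((4, 4, 2))
flow[..., 1] = -1.0
print(global_motion_direction(flow))  # → 90.0
```

Averaging the whole field keeps the indicator stable against noisy individual vectors, which fits a small "global direction" arrow like the one described.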
Canard took the time to do some code refactoring and successfully built it on Windows.