Practice log entries tagged vision research

25th June 2017

Stephen Meschke

# vision research, 1 minute

Major breakthrough: smoothing the data with a Savitzky-Golay (savgol) filter improves the quality tremendously.


from scipy import signal
x_smooth = signal.savgol_filter(x, window_length, polyorder)
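
Roughly what a complete call looks like, with made-up data and example values for window_length and polyorder (not necessarily the values I actually used):

import numpy as np
from scipy import signal

# hypothetical x-coordinates of one tracked ball, one value per video frame
x = np.array([312, 318, 309, 325, 331, 340, 338, 352, 360, 357, 371, 380], dtype=float)

# window_length must be odd and larger than polyorder; 7 and 3 are example values
x_smooth = signal.savgol_filter(x, window_length=7, polyorder=3)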

Total practice time: 1 minute

Location: Linux

Comments (0)

24th June 2017

Stephen Meschke

# vision research, 1 minute

(F + D) / (V + D) = #balls / #hands (Shannon's juggling theorem: F is the flight time of a ball, D is the dwell time while a ball is held in a hand, and V is the vacant time while a hand is empty)

https://youtu.be/fbC7B6a6His
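
A quick numerical check of the relationship; the times below are made-up values in frames, not measurements:

# Shannon's juggling theorem: (F + D) / (V + D) = balls / hands
F = 66.0   # flight time of a ball, in frames (made up)
D = 18.0   # dwell time, ball held in a hand (made up)
V = 6.0    # vacant time, hand empty (made up)
balls, hands = 7, 2

print((F + D) / (V + D))   # 3.5
print(balls / hands)       # 3.5, the two ratios agree for steady juggling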

Total practice time: 1 minute

Location: Linux

Comments (0)

22nd June 2017

Stephen Meschke

# vision research, 1 minute

Juggling Edge user unigamer made a really cool graph, which showed me how to graph in 3D. I made this: https://youtu.be/GGLzph6IZYc?t=6s.

I need to redo this with two cameras.
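
A minimal sketch of that kind of 3D plot with Matplotlib, using a made-up track (frame number on one axis, x and y pixel position on the others); not necessarily how unigamer did it:

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the '3d' projection on older Matplotlib

# made-up track: x and y pixel positions of one ball, one per frame
frames = np.arange(200)
x = 320 + 150 * np.sin(frames / 15.0)
y = 240 - 180 * np.abs(np.sin(frames / 15.0))

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.plot(frames, x, y)
ax.set_xlabel('frame')
ax.set_ylabel('x (pixels)')
ax.set_zlabel('y (pixels)')
plt.show()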

Total practice time: 1 minute

Location: Linux

Comments (0)

16th June 2017

Stephen Meschke

# vision research, 1 minute

Graphed 17 catches of 7 ball cascade: https://drive.google.com/open?id=0B7QqDexrxSfwZUhKamliLVJ2Ykk

I recorded this in 480p@120fps. The tracking program does much better at higher frame rates or lower resolutions. It is counter-intuitive that it would work better at lower resolution. This is the best data that I have produced.

This is also the fourth frame rate that I have used: 30, 48, 60, and now 120 frames per second. The data doesn't work well for comparative purposes if the frame rates are different.

I have made some improvements to my tracking program, and am currently working on applying a Kalman filter. Kalman filters are rocket-science-level confusing. Literally: Kalman filters were famously used in the Apollo program to help land on the moon.
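
A minimal constant-velocity Kalman filter for a 2D ball track with OpenCV's cv2.KalmanFilter; the state is (x, y, vx, vy), and the noise values and measurements below are placeholders, not what I'm actually using yet:

import cv2
import numpy as np

# 4 state variables (x, y, vx, vy), 2 measured variables (x, y)
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

measurements = [(320, 240), (324, 231), (329, 224), (333, 219)]  # placeholder tracker output
for mx, my in measurements:
    prediction = kf.predict()   # predicted position, useful when the tracker loses the ball
    kf.correct(np.array([[mx], [my]], dtype=np.float32))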

Going forward, the best path is to write a better tracking program, and then re-record the juggling data at 120 fps.

Total practice time: 1 minute

Location: Linux

Comments (0)

14th June 2017

Stephen Meschke

# vision research, 1 minute

Idea for juggling data set:

Imagine a magical set of balls. The balls change to red when touched with the left hand, and stay red until they are touched with the right hand. When they are touched with the right hand, they turn green, and stay green until they are touched by the left hand.

This would be really helpful for analyzing a juggling graph. If all the green peaks were lower than the red peaks, it would be obvious that the throws from the right hand were lower than the throws from the left hand.

Using various methods (finding local maxima in velocity, or local maxima in x position) I can accurately determine when a ball is touched by a hand; I can also determine which hand touches the ball.

But how should I go about this? So many interesting questions!!!
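
A rough sketch of the local-maxima idea above, assuming the camera is oriented so that right-hand catches are the right-most points of a ball's path (local maxima in x) and left-hand catches the left-most (local minima); the data is made up:

import numpy as np
from scipy.signal import argrelextrema

# made-up x-coordinates of one ball, one value per frame
x = np.array([100, 140, 180, 210, 230, 220, 190, 150, 110, 90, 105, 145, 185, 215], dtype=float)

right_catches = argrelextrema(x, np.greater)[0]   # local maxima in x: right hand -> ball turns green
left_catches = argrelextrema(x, np.less)[0]       # local minima in x: left hand -> ball turns red

events = sorted([(int(f), 'green') for f in right_catches] +
                [(int(f), 'red') for f in left_catches])
print(events)   # [(4, 'green'), (9, 'red')]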

Total practice time: 1 minute

Location: Linux

Comments (3)

13th June 2017

Stephen Meschke

# vision research, 1 minute

Worked on some of Daniel Simu's data. The video that he provided was not that high in resolution, but the contrast between the balls and the rest of the frame was excellent. My tracking programs worked much better on Daniel's video than on the ones that I had recorded in higher resolution! The tracker got lost only a few times.

I need to record a short clip of 5b cascade to compare myself to him. I also need to figure out how to compare his data to mine. I am left with a lot of questions, but am interested and excited.

Daniel's data:

Graph: https://drive.google.com/open?id=0B7QqDexrxSfwc2tUMHBlQU5CSlU
Data: https://drive.google.com/open?id=0B7QqDexrxSfwdWpGOGl4TGhJRlk
Tracked Video: https://drive.google.com/open?id=0B7QqDexrxSfwMFlUYWx5NnZBYVE

Total practice time: 1 minute

Location: Linux

Comments (0)

10th June 2017

Stephen Meschke

# vision research, 1 minute

Position, velocity, acceleration graph and video: https://youtu.be/m-T94mpD8TU

Total practice time: 1 minute

Location: Linux

Comments (0)

9th June 2017

Stephen Meschke

# vision research, 1 minute

Read a couple of papers:

Functional Analysis of Juggling Trajectories (http://www.gribblelab.org/publications/2013_EJS_ramsay.pdf).

Using an OptiTrack system, the authors measured the position of a hand during juggling, which differs from my approach of measuring the positions of the balls. An interesting graph showed the velocity and acceleration of the hand. So far I have only been looking at position, but I should make graphs that show velocity and acceleration as well.
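
A small sketch of how velocity and acceleration could come straight from the position data I already have, using finite differences; the frame rate and positions below are made up:

import numpy as np

fps = 120.0                                # example frame rate
dt = 1.0 / fps                             # time between frames, in seconds
t = np.arange(100) * dt
y = 2.0 * t - 4.9 * t**2                   # made-up vertical position of a ball, in metres

velocity = np.gradient(y, dt)              # first derivative of position
acceleration = np.gradient(velocity, dt)   # second derivative, roughly -9.8 throughout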

Analysis of juggling data: Landmark and continuous registration of juggling trajectories (https://projecteuclid.org/download/pdfview_1/euclid.ejs/1414588170).

Uses the same data set as the previous paper. The average cycle time in this data set is 712 milliseconds. Some cycles were faster and some slower, due to variations in how high the balls were thrown. This paper explains how to register all the cycles to a constant 712 milliseconds.

---

Total practice time: 1 minute

Location: Linux

Comments (0)

5th June 2017

Stephen Meschke

# vision research, 1 minute

Promo video for Juggling Data Set: https://youtu.be/Y1nBBMQ9H8c

Total practice time: 1 minute

Location: Linux

Comments (0)

3rd June 2017

Stephen Meschke

# vision research, 1 minute

Counting Catches.

When a ball is caught, it stops traveling left-to-right and starts traveling right-to-left (or vice versa). This produces peaks in a graph where the x-coordinate is plotted against frame_number.

This seems to work really well for counting catches.

video: https://drive.google.com/open?id=0B7QqDexrxSfwZGZVN0kyR2tORzg
code: https://drive.google.com/open?id=0B7QqDexrxSfwMDFCc29OekN1SDA
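
A minimal version of that count (made-up data, not the code linked above): every change in the sign of the frame-to-frame x movement is a direction reversal, i.e. a catch.

import numpy as np

# made-up x-coordinate of one ball, one value per frame
x = np.array([90, 120, 160, 200, 230, 215, 180, 140, 100, 95, 130, 170, 210], dtype=float)

dx = np.diff(x)             # per-frame horizontal movement
sign = np.sign(dx)          # +1 moving right, -1 moving left (smooth first if dx hits 0)
catches = np.count_nonzero(sign[1:] != sign[:-1])
print('catches:', catches)  # 2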

Total practice time: 1 minute

Location: Linux

Comments (2)
