Practice log entries tagged vision research

10th July 2017

Stephen Meschke

# vision research, 1 minute

I have been thinking about the injury that Peter Bone sustained while juggling: link to thread about topic. I want to give him the best text-based remote diagnosis possible over the internet, so I have done some research into his injury.

Using the same tracking programs from the Juggling Data Set, I created this: pose estimation of 2-ball juggling.

I am not sure how to interpret this, but I believe it would be useful to show that just after a ball is caught, the triceps fires, pulling the hand down. This rapid downward movement of the hand can put a lot of stress on the ulnar collateral ligament if the biceps is not relaxed.

My diagnosis is a torn UCL, and my treatment recommendation is ulnar collateral ligament reconstruction (also known as Tommy John surgery), where a ligament taken from the leg of a cadaver is implanted into the arm. Tommy John surgery is used effectively for Major League Baseball pitchers, who often see an increase in velocity after the surgery.

Total practice time: 1 minute

Location: Linux

Comments (0)

1st July 2017

Stephen Meschke

# vision research, 1 minute

Going from graph to siteswap:

http://imgur.com/a/dpZD3 (the patterns are 5550, 552, and 744)

Found that most throws are within 2% of the predicted siteswap value.
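
Roughly the check, as a sketch: estimate each throw's siteswap value from its measured flight time, assuming a throw of value n is airborne for about n * beat - dwell seconds. The flight times, beat length, and dwell below are made-up stand-ins, not measured values:

# Made-up numbers standing in for measurements from one clip.
flight_times = [0.96, 0.93, 0.97]  # seconds each ball spent in the air
beat = 0.25                        # seconds per siteswap beat
dwell = 0.30                       # seconds a caught ball sits in the hand

for t in flight_times:
    # flight ~= value * beat - dwell, so invert to estimate the throw value
    estimate = (t + dwell) / beat
    print(round(estimate, 2))      # ~5.0, i.e. the 5s in 5550, within a few percent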

Total practice time: 1 minute

Location: Linux

Comments (0)

30th June 2017

Stephen Meschke

# vision research, 1 minute

Camshift Tracking + Kalman Filter: https://youtu.be/430lPkbvmc0

Having trouble with these lines:

import cv2
import numpy as np

# State is (x, y, dx, dy); the measurement is the tracked (x, y) position.
kalman = cv2.KalmanFilter(4, 2)
kalman.measurementMatrix = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], np.float32)
kalman.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kalman.processNoiseCov = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], np.float32) * 0.002
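
I think the part I am still missing is the per-frame usage. A minimal sketch, continuing from the setup above, where (cx, cy) stands in for whatever the tracker reports as the ball centre on each frame:

# Continuing from the setup above: one predict/correct cycle per frame.
for cx, cy in [(100.0, 200.0), (102.0, 195.0), (104.0, 191.0)]:  # stand-in centres
    kalman.predict()                                             # project the state forward one frame
    corrected = kalman.correct(np.array([[cx], [cy]], np.float32))
    print(corrected[0, 0], corrected[1, 0])                      # filtered x, y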

Total practice time: 1 minute

Location: Linux

Comments (0)

28th June 2017

Stephen Meschke

# vision research, 1 minute

Created site for the juggling data set:

https://sites.google.com/view/jugglingdataset/

Total practice time: 1 minute

Location: Linux

Comments (0)

27th June 2017

Stephen Meschke

# vision research, 1 minute

Worked on a new tracking program that uses camshift instead of optical flow. This is a completely different mechanism for tracking the balls. This new method is quite a bit faster than the optical flow method because it is more automated and requires less user input.

Demonstration of the camshift tracking program: https://youtu.be/TCct--xtKp0

Code is in the Python Tutorials folder: https://drive.google.com/open?id=0B7QqDexrxSfwcmROb1ByNkhqOEU

The camshift program works well for indoor juggling, but poorly for outdoor juggling. The optical flow program works well outdoors, but poorly indoors.
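
Not the actual program (that is in the Drive folder above), but roughly the shape of a CAMShift tracker in OpenCV; the video file name and the hand-selected ROI are placeholders:

import cv2
import numpy as np

cap = cv2.VideoCapture('juggling.mp4')  # placeholder file name
ok, frame = cap.read()

# Select one ball by hand; CAMShift follows its colour histogram from there.
x, y, w, h = cv2.selectROI('select ball', frame, False)
hsv_roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    box, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    print(box[0])  # the tracked ball centre (x, y) in this frame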

Total practice time: 1 minute

Location: Linux

Comments (0)

25th June 2017

Stephen Meschke

# vision research, 1 minute

Major breakthrough: smoothing the data with a Savitzky-Golay (savgol) filter improves the quality tremendously.


from scipy import signal
x_smooth = signal.savgol_filter(x, window_length, polyorder)
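
For reference, a self-contained example with made-up data; window_length must be odd and greater than polyorder, and 15/3 are just starting values, not tuned ones:

import numpy as np
from scipy import signal

# Noisy stand-in for one ball's tracked x coordinate.
t = np.linspace(0, 2 * np.pi, 200)
x = np.sin(t) + np.random.normal(scale=0.05, size=t.size)

x_smooth = signal.savgol_filter(x, window_length=15, polyorder=3)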

Total practice time: 1 minute

Location: Linux

Comments (0)

24th June 2017

Stephen Meschke

# vision research, 1 minute

(F + D) / (V + D) = #balls / #hands

Here F is flight time, D is dwell time (ball held in a hand), and V is vacant time (hand empty); this is Shannon's juggling theorem.

https://youtu.be/fbC7B6a6His
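
A quick numeric check with made-up timings, pretending it is a 4 ball pattern juggled with 2 hands:

F = 0.55  # flight time per throw, in seconds (made up)
D = 0.15  # dwell time, ball held in the hand
V = 0.20  # vacant time, hand empty
print((F + D) / (V + D))  # 2.0, i.e. 4 balls / 2 hands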

Total practice time: 1 minute

Location: Linux

Comments (0)

22nd June 2017

Stephen Meschke

# vision research, 1 minute

Juggling Edge user unigamer made a really cool graph; it showed me how to graph in 3D. Made this: https://youtu.be/GGLzph6IZYc?t=6s.

I need to redo this with two cameras.
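
For my own notes, the kind of 3D plot I mean in matplotlib, with made-up coordinates standing in for one tracked ball:

import numpy as np
import matplotlib.pyplot as plt

# Made-up (x, height) track for one ball over two seconds.
t = np.linspace(0, 2, 240)
x = np.sin(4 * np.pi * t)
height = np.abs(np.cos(4 * np.pi * t))

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot(x, t, height)
ax.set_xlabel('x (pixels)')
ax.set_ylabel('time (s)')
ax.set_zlabel('height (pixels)')
plt.show()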

Total practice time: 1 minute

Location: Linux

Comments (0)

16th June 2017

Stephen Meschke

# vision research, 1 minute

Graphed 17 catches of 7 ball cascade: https://drive.google.com/open?id=0B7QqDexrxSfwZUhKamliLVJ2Ykk

I recorded this in 480p@120fps. The tracking program does much better at higher frame rates or lower resolutions. It is counter-intuitive that it would work better at lower resolution. This is the best data that I have produced.

This is also the fourth frame rate that I have used: 30, 48, 60, and now 120 frames per second. The data doesn't compare well across recordings if the frame rates are different.
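
One way to make recordings at different frame rates comparable is to put everything on a time axis in seconds first; a sketch of what I mean (the function name is my own):

import numpy as np

def to_time_base(positions, fps):
    """Turn per-frame samples into time (s) and velocity (pixels per second)."""
    positions = np.asarray(positions, dtype=float)
    t = np.arange(positions.size) / fps   # frame index -> seconds
    velocity = np.gradient(positions, t)  # pixels per second, fps-independent
    return t, velocity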

I have made some improvements to my tracking program, and am currently working on applying a Kalman filter. Kalman filters are rocket-science-level confusing. Literally: Kalman filtering was developed for the Apollo program to help land on the Moon.

Going forward, the best path is to write a better tracking program and then re-record the juggling data at 120 fps.

Total practice time: 1 minute

Location: Linux

Comments (0)

14th June 2017

Stephen Meschke

# vision research, 1 minute

Idea for juggling data set:

Imagine a magical set of balls. The balls change to red when touched with the left hand, and stay red until they are touched with the right hand. When they are touched with the right hand, they turn green, and stay green until they are touched by the left hand.

This would be really helpful for analyzing a juggling graph. If all the green peaks were lower than the red peaks, it would be obvious that the throws from the right hand were lower than the throws from the left hand.

Using various methods (finding local maxima in velocity, or local maxima in x position), I can accurately determine when a ball is touched by a hand, and I can also determine which hand touches the ball.

But how should I go about this? So many interesting questions!!!
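
One rough way to go about it, assuming I already have a smoothed (x, y) track per ball in image coordinates (y grows downward, so catches show up as local maxima in y), and assigning the hand by comparing the catch position to the pattern's centre line:

import numpy as np
from scipy.signal import argrelextrema

def label_catches(x, y, fps=120):
    """Return (frame, hand) guesses for each catch in one ball's track."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    catch_frames = argrelextrema(y, np.greater, order=int(fps * 0.1))[0]
    centre = np.median(x)  # rough left/right dividing line of the pattern
    return [(f, 'left' if x[f] < centre else 'right') for f in catch_frames]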

Total practice time: 1 minute

Location: Linux

Comments (3)
