Practice log entries tagged vision research


9th January 2019

vision research, 1 minute

I built a Claude Shannon Juggling Machine. Instead of using physical objects, I built the machine virtually, in Python 3. The UI and some of the functions use the computer vision library OpenCV. As Scott said in the original post, the fun of this device is not in building it; the interesting part is tuning it so that it juggles.

This program is object-oriented, and the balls are the objects. Each ball object has two attributes: (1) a tuple holding the ball's (x, y) coordinates, and (2) a tuple holding the ball's velocity vector (speed and direction).

Several functions let the balls interact with the environment: they move a ball, apply gravity to it, bounce it off an object, and so on. In the main loop of the program, all of these functions are called on every ball to produce juggling.
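A minimal sketch of what the ball objects and physics functions might look like (the names and constants here are illustrative, not necessarily the ones in the actual program):

```python
# Illustrative sketch of the ball objects and physics functions described
# above; names and values are my guesses, not the actual program's.

GRAVITY = (0.0, 0.5)  # per-frame acceleration; y grows downward, as in OpenCV

class Ball:
    def __init__(self, position, velocity):
        self.position = position  # (x, y) coordinates
        self.velocity = velocity  # (vx, vy) velocity vector

def apply_gravity(ball):
    vx, vy = ball.velocity
    ball.velocity = (vx + GRAVITY[0], vy + GRAVITY[1])

def move(ball):
    x, y = ball.position
    vx, vy = ball.velocity
    ball.position = (x + vx, y + vy)

def bounce_off_floor(ball, floor_y, bounce_factor=1.0):
    x, y = ball.position
    vx, vy = ball.velocity
    if y >= floor_y and vy > 0:  # ball reached the floor while moving down
        ball.velocity = (vx, -vy * bounce_factor)

# One step of the main loop: call every physics function on every ball.
balls = [Ball((100, 0), (2.0, 0.0)), Ball((200, 50), (-1.5, -3.0))]
for ball in balls:
    apply_gravity(ball)
    move(ball)
    bounce_off_floor(ball, floor_y=480)
```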

The machine is tuned by adjusting the parameters, which are set before running the program or changed at runtime from the keyboard. The most important parameters are speed and rotation: each number of balls requires a specific combination of the two to juggle. (Graph: speed and rotation required for each number of balls.)
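The runtime tuning could be wired up along these lines. In the real program the keypress would come from OpenCV's `cv2.waitKey`; the particular key bindings and step sizes below are invented for illustration:

```python
# Sketch of adjusting parameters from the keyboard at runtime.
# Key bindings and step sizes are invented; in the actual program
# the keypress would come from cv2.waitKey() in the main loop.

params = {'speed': 10.0, 'rotation': 45.0}

def handle_key(key, params):
    """Adjust the tunable parameters based on a keypress code."""
    if key == ord('w'):
        params['speed'] += 0.5
    elif key == ord('s'):
        params['speed'] -= 0.5
    elif key == ord('d'):
        params['rotation'] += 1.0
    elif key == ord('a'):
        params['rotation'] -= 1.0
    return params

# Inside the main loop it would look something like:
#   key = cv2.waitKey(1) & 0xFF
#   handle_key(key, params)
```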

To find these values, I tuned the machine at runtime. The tuning was tedious, and I am searching for a way to mathematically derive the parameters that produce juggling for n balls. Is there a formula relating the number of balls to the combination of speed and rotation?
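One candidate starting point is Shannon's own juggling theorem, (F + D) * H = (V + D) * N, where F is flight time, D is dwell time, V is vacancy time, H is the number of hands (here, paddles), and N is the number of balls. It does not directly give speed and rotation, but it constrains the flight time each throw must produce, which speed and rotation together determine. A small sketch, with example timing values that are purely illustrative:

```python
def required_flight_time(n_balls, hands=2, dwell=0.3, vacancy=0.2):
    """Flight time F implied by Shannon's juggling theorem,
    (F + D) * H = (V + D) * N, solved for F.
    The default dwell/vacancy values are illustrative only."""
    return (vacancy + dwell) * n_balls / hands - dwell

# e.g. more balls require proportionally longer flight times:
times = [required_flight_time(n) for n in (3, 5, 7)]
```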

Please try this out for yourself:

Total practice time: 1 minute

Location: Linux

3rd January 2019

Juggling Simulator, vision research, 1 minute

Machine progress: https://imgur.com/a/2ugHkUZ

Total practice time: 1 minute

Location: Linux

2nd January 2019

vision research, 1 minute

Progress on machine: https://imgur.com/a/USnXxG4

Fitting the parameters is impractical to do manually, but it can be accomplished easily with brute-force for loops.
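The brute-force search might look like this; `juggles()` is a stand-in for whatever success test the simulator uses (e.g. no ball dropped for some number of frames), and the parameter ranges are made up:

```python
def find_working_parameters(juggles, speeds, rotations):
    """Brute-force search: try every (speed, rotation) pair and collect
    the ones for which the simulation juggles successfully.
    `juggles(speed, rotation)` stands in for running the simulator."""
    working = []
    for speed in speeds:
        for rotation in rotations:
            if juggles(speed, rotation):
                working.append((speed, rotation))
    return working

# Example with a toy success test (the real one would run the simulation):
hits = find_working_parameters(
    lambda s, r: s == 12 and r == 30,
    speeds=range(8, 16),
    rotations=range(20, 40, 5),
)
```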

Total practice time: 1 minute

Location: Linux

31st December 2018

vision research, 1 minute

Goal: Build a Claude Shannon Juggling Machine in Python.

Progress: https://imgur.com/a/W9MWpxb

https://github.com/smeschke/juggling/blob/master/miscellaneous/%20claude_shannon_juggling_machine.py

Steps:

1. Define the components of the machine
- balls
- floor for balls to bounce off
- arm apparatus
- bearing (height off ground)
- arm between bearing and paddle
- rotation of arm in degrees
- motor
2. Define physics of environment
- gravity
- no air resistance
- balls have a bounce factor of 1.0 on the floor
- balls have a bounce factor of 0.2 on the paddle

Programming challenges:

1. gravity
2. ball floor bouncing interaction
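The ball/floor interaction is fiddly because a discrete time step can carry the ball below the floor. One common fix, sketched here as an illustration rather than as what the program actually does, is to reflect the overshoot along with the velocity; the same function with bounce=0.2 would model the deader paddle contact:

```python
def bounce_step(y, vy, floor_y, gravity=0.5, bounce=1.0):
    """Advance one frame with gravity, bouncing off the floor.
    With a discrete time step the ball can overshoot the floor,
    so the overshoot distance is reflected back above it.
    Illustrative sketch; y grows downward."""
    vy += gravity          # apply gravity
    y += vy                # move the ball
    if y > floor_y:        # ball passed through the floor this frame
        y = floor_y - (y - floor_y) * bounce  # reflect the overshoot
        vy = -vy * bounce                     # reverse and damp velocity
    return y, vy
```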

Total practice time: 1 minute

Location: Linux

5th December 2018

vision research, 1 minute

Several people have expressed interest in my tracking software. Feels good. One person used the HSV color picker program to make a cool video about triangles. Another person used Tracking_Colorspaces to track a three-ball cascade.

Lissajous curve: it looks exactly like a long exposure of a cascade pattern.
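For reference, a Lissajous curve is the parametric curve x = A sin(at + d), y = B sin(bt); with a 1:2 frequency ratio it traces the figure-eight that a long-exposure cascade resembles. A tiny sketch for generating points:

```python
import math

def lissajous(t, A=1.0, B=1.0, a=1, b=2, delta=math.pi / 2):
    """Point on a Lissajous curve x = A*sin(a*t + delta), y = B*sin(b*t).
    The a=1, b=2 ratio gives the figure-eight shape that resembles a
    long-exposure three-ball cascade."""
    return A * math.sin(a * t + delta), B * math.sin(b * t)

# Sample one full period of the curve:
points = [lissajous(2 * math.pi * i / 200) for i in range(200)]
```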

Total practice time: 1 minute

Location: Linux

20th November 2018

vision research, 1 minute

Decided to go in a different direction with the Juggling Data Set.

I am going to focus on videos that are 848x480 @ 120 fps, and convert all the other videos to that resolution and frame rate.

By focusing on only one type of video, it's easier to present and analyze the data.
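The conversion step could be scripted along these lines. This assumes ffmpeg is installed and on the PATH, and the exact filter flags are my suggestion, not necessarily what was actually used:

```python
import subprocess

def convert_command(src, dst, width=848, height=480, fps=120):
    """Build an ffmpeg command that rescales a video and resamples its
    frame rate (assumes ffmpeg is installed; flags are a suggestion)."""
    return [
        'ffmpeg', '-i', src,
        '-vf', f'scale={width}:{height},fps={fps}',
        dst,
    ]

# To actually run it on a file:
#   subprocess.run(convert_command('in.mp4', 'out.mp4'), check=True)
```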

Goals:

• Make a Patreon page for the Juggling Data Set
• Convert videos to one resolution and frame rate

Total practice time: 1 minute

Location: Linux

31st October 2018

vision research, 1 minute

https://imgur.com/a/HesBjq8

Total practice time: 1 minute

Location: Linux

16th August 2018

vision research, 1 minute

How to find juggling balls: Look Up!

Sky detection is a solved problem in computer vision. Using a convex hull, the sky can easily and quickly be segmented from the rest of an image (seen in light blue). Once the sky is segmented, detecting the balls is a simple matter of finding the contours.

This robust method detects the balls extremely quickly, but it does not work at all below the horizon, where the juggler interacts with the props. It also only detects the objects; it does not track them.
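The principle can be sketched without OpenCV on a toy grayscale image: treat every bright pixel as sky, then group the remaining dark pixels into connected blobs. The real program uses a convex hull and cv2 contours; this flood-fill version only illustrates the idea:

```python
def detect_dark_blobs(image, sky_threshold=200):
    """Toy version of balls-against-the-sky detection. Pixels at or above
    sky_threshold count as sky; connected groups of darker pixels are
    candidate balls. `image` is a 2D list of grayscale values.
    Returns one list of (row, col) pixels per blob."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if seen[sy][sx] or image[sy][sx] >= sky_threshold:
                continue
            # Flood-fill a new blob of connected dark pixels.
            blob, stack = [], [(sy, sx)]
            seen[sy][sx] = True
            while stack:
                y, x = stack.pop()
                blob.append((y, x))
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and image[ny][nx] < sky_threshold):
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            blobs.append(blob)
    return blobs
```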

The camera must be set up so that it looks toward a bright sky (i.e., the setting sun is at the juggler's back). In this configuration the balls are backlit. The video doesn't look great, but it is optimized for detecting the dark juggling balls against the bright sky.

This only works on a clear day, or a day with uniform haze.

Example: http://juggling.tv/16952

https://github.com/smeschke/juggling/blob/master/convexHull_previous_points.py

Total practice time: 1 minute

Location: Linux

29th May 2018

vision research, 1 minute

Total practice time: 1 minute

Location: Linux

1st May 2018

vision research, 1 minute

I have dropped more recently than ever before: https://imgur.com/a/7nW4C44

Total practice time: 1 minute

Location: Linux