
Midterm Project: Ball Thrower

3/2/2020

Goal

The goal for this robot was to throw a ball into a cup that can be placed at any distance from the robot. The robot uses machine learning to determine how fast to spin the motor to hit the ball into the cup, and the user tells the robot to launch remotely through System Link Cloud.

The Process

Version 1: While the first robot had neat cable management, it could not throw the ball more than 20 cm.

Features: Incorporated a motor to spin the throwing arm and a motor to rotate the platform to the angle chosen by the user. The platform was relatively high, and the cable management was very neat.

Future Improvements: Throwing distance 

IMG_20200301_232019.jpg

Version 2: The second robot design was very similar to the final robot, but the platform was much closer to the ground than in the final iteration. This design was changed because, while the ball could be thrown far (up to 94 cm), the platform was lower than the rim of the chosen cup, which made it very hard to throw the ball into the cup.

Updates: Version 2 incorporates the same features as Version 1 but uses a longer hitting arm, increasing the throwing distance to 94 cm.

Future Improvements: After collecting data points to determine the correlation at platform angles of 0, 10, 30, and 90 degrees, this design was changed because the platform was shorter than the chosen cup, so the platform height had to be increased and the data recollected.

Data Collected from Version 2

The data was collected by throwing the ball starting at 200 degrees/s with the platform at a 0 degree angle, taking the measurement 10 times, and then increasing the speed by 100 degrees/s. This was repeated at different platform angles ranging from 0 to 45 degrees. The measurements were collected by coloring the table with a whiteboard marker and then seeing where the mark was removed (which shows where the ball hit the table). The distance between the ultrasonic sensor and the mark on the table was measured manually with a measuring tape. As shown in the graphs, the distance traveled by the ball increased as the platform angle and motor speed increased.
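To make the sweep concrete, here is a minimal Python sketch of how the Version 2 measurements could be organized and averaged per platform angle and motor speed. The structure follows the protocol described above, but the angle list, the upper speed bound, and the distance readings are placeholders, not my actual data.

    from statistics import mean

    # Sweep protocol from Version 2: start at 200 deg/s and step by 100 deg/s,
    # with 10 throws per (angle, speed) setting. Values below are placeholders.
    ANGLES_DEG = [0, 10, 30, 45]              # platform angles within the 0-45 degree range
    SPEEDS_DPS = list(range(200, 1001, 100))  # motor speeds in deg/s (upper bound illustrative)

    def summarize(measurements):
        """measurements maps (angle_deg, speed_dps) -> list of 10 measured distances in cm.
        Returns the mean distance for each setting."""
        return {setting: mean(dists) for setting, dists in measurements.items()}

    # Example with placeholder readings for a single setting:
    example = {(0, 200): [18.0, 19.5, 17.8, 18.2, 19.0, 18.7, 18.9, 17.5, 18.4, 19.1]}
    print(summarize(example))  # {(0, 200): 18.51}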

Click on the Graphs and Table Below to Learn More

Challenges

Despite collecting data for Version 2, I realized once I implemented machine learning on my robot that both the robot design and the data-collection method were inadequate. I chose my cup after collecting data, which was a mistake: the cup was taller than my platform, and the data I collected measured the distance the ball traveled until it hit the ground, not until it reached the cup. Therefore, I increased the height of the platform and redesigned how I would collect data.

IMG_20200303_224922.jpg

Version 3: Version 3 has the same features as Version 2, with the improvement of a higher platform.

Future Improvements: In future iterations, I would like to increase the height even more. However, because a motor needs to spin the platform, this places some constraints on the stability and height of the design. The machine learning code could also be improved by continuing to add more data samples and by implementing other forms of machine learning (such as K-Nearest Neighbors).

Data Collected from Version 3

I developed new training code along with a new way to collect data for Version 3. I started with an initial platform angle and an initial motor speed of 100 degrees/s. First, I watched to see roughly where the ball fell, and then I would try to catch the ball in the metal cup. Once I succeeded, I would write down the distance between the cup and the ultrasonic sensor, as measured by the ultrasonic sensor. After the ball successfully landed in the cup five more times, I proceeded to the next speed (increasing by 50 degrees/s each time) by pressing a touch sensor. A second touch sensor was attached to the EV3 to decrease the speed by 50 degrees/s in case I wanted to return to a previous speed. During this training portion, the EV3 would launch the ball only when I pressed a button on the EV3 itself.

Due to time constraints, data was only collected with the platform at 40, 60, and 90 degrees. The data for the 90 degree platform angle was only collected up to a motor speed of 550 degrees/s, and only up to 650 degrees/s for the 60 degree platform angle. The data collected from Version 3 with this new technique followed the same trend as the Version 2 data, with the distance increasing as the speed and angle increased. While the distances were shorter than those measured for Version 2, this is because the Version 2 distances were measured to where the ball hit the table, while the Version 3 distances were measured to where the ball landed in the cup. If I had measured where the ball hit the table for Version 3, the ball would most likely have traveled farther than in Version 2 due to the increased platform height.
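The training code itself is shown in the screenshot below; as a rough paraphrase of the procedure just described, here is a Python sketch assuming the ev3dev2 API. The port assignments, the 180 degree arm swing, the reset speed, and the debounce delays are my own illustrative assumptions, not taken from the real program.

    #!/usr/bin/env python3
    # Rough sketch of the Version 3 training loop, assuming the ev3dev2 Python API.
    # Ports, the 180 degree arm swing, and the reset speed are illustrative assumptions.
    from time import sleep
    from ev3dev2.motor import LargeMotor, OUTPUT_A, SpeedDPS
    from ev3dev2.sensor import INPUT_1, INPUT_2, INPUT_4
    from ev3dev2.sensor.lego import TouchSensor, UltrasonicSensor
    from ev3dev2.button import Button

    arm = LargeMotor(OUTPUT_A)          # throwing-arm motor
    speed_up = TouchSensor(INPUT_1)     # touch sensor: next speed (+50 deg/s)
    speed_down = TouchSensor(INPUT_2)   # touch sensor: previous speed (-50 deg/s)
    eyes = UltrasonicSensor(INPUT_4)    # measures the distance to the cup
    buttons = Button()                  # a brick button triggers each launch

    speed = 100                         # initial motor speed in deg/s

    while True:
        if speed_up.is_pressed:
            speed += 50
            sleep(0.3)                  # crude debounce
        if speed_down.is_pressed:
            speed = max(50, speed - 50)
            sleep(0.3)
        if buttons.enter:               # launch only when the brick button is pressed
            arm.on_for_degrees(SpeedDPS(speed), 180)   # swing the arm to hit the ball
            arm.on_for_degrees(SpeedDPS(200), -180)    # reset the arm
            # Write down eyes.distance_centimeters when the ball lands in the cup.
            print(speed, eyes.distance_centimeters)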

Click on the Graphs to Learn More

Training Code

training code.PNG

The Machine Learning

Once the data was collected, I implemented my equations in the code shown below, using different linear fits for the different platform angles to determine how fast to spin the motor to hit the ball into the cup. K-Nearest Neighbors was also attempted for the machine learning portion of the code, but after difficulties, I stayed with linear regression. Platform angles under 30 degrees never resulted in the ball landing in the cup due to the height of the cup, so they were not included in the machine learning portion.
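As a concrete illustration of this step, here is a minimal Python sketch: fit a line (distance vs. motor speed) for each platform angle with numpy.polyfit, then invert the chosen angle's fit to get the motor speed for the cup distance reported by the ultrasonic sensor. The coefficient values below are placeholders, not the fits from my data.

    import numpy as np

    def fit_line(speeds_dps, distances_cm):
        """Least-squares linear fit: distance ≈ slope * speed + intercept."""
        slope, intercept = np.polyfit(speeds_dps, distances_cm, 1)
        return slope, intercept

    def speed_for_distance(distance_cm, slope, intercept):
        """Invert the fit to choose a motor speed (deg/s) for a target distance."""
        return (distance_cm - intercept) / slope

    # One fit per trained platform angle; these coefficients are placeholders.
    fits = {40: (0.05, -2.0), 60: (0.07, -5.0), 90: (0.04, -1.0)}

    angle = 60               # platform angle chosen by the user
    cup_distance = 30.0      # cm, as read by the ultrasonic sensor
    slope, intercept = fits[angle]
    print(round(speed_for_distance(cup_distance, slope, intercept)))  # -> 500 deg/s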

Machine Learning Code

machine learning code.PNG

Accuracy:

After running 50 trials per platform angle range (10 trials in each 10 cm distance increment):

  • With a platform angle of 30-50 degrees, I had 62% accuracy.

  • With a platform angle of 50-70 degrees, I had 60% accuracy.

  • With a platform angle of 70-90 degrees, I had 52% accuracy.

Projectile Motion

One of the modes of the robot is "Projectile Motion." In this mode, the robot throws the ball based on the kinematics equations rather than the trained model.

Calculations for Projectile Motion Equation

IMG_20200302_023535.jpg
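To complement the calculation in the image above, here is a hedged Python sketch of the standard kinematics solve: find the launch speed v that carries the ball a horizontal distance d when launched at angle θ from a height h above where it should land. The height, the arm radius, and the v = ωr conversion to a motor speed are my own assumptions for illustration, not the exact numbers used on the robot.

    import math

    G = 981.0  # gravitational acceleration in cm/s^2 (all lengths in cm)

    def launch_speed(d_cm, theta_deg, h_cm):
        """Speed (cm/s) needed to travel d_cm horizontally when launched at theta_deg
        above horizontal from h_cm above the landing height. Derived from
        d = v*cos(theta)*t and 0 = h + v*sin(theta)*t - 0.5*g*t^2."""
        theta = math.radians(theta_deg)
        denom = 2 * math.cos(theta) ** 2 * (h_cm + d_cm * math.tan(theta))
        if denom <= 0:
            raise ValueError("target not reachable at this launch angle")
        return math.sqrt(G * d_cm ** 2 / denom)

    def motor_speed_dps(v_cm_s, arm_radius_cm):
        """Assumed mapping from launch speed to motor speed: v = omega * r."""
        return math.degrees(v_cm_s / arm_radius_cm)

    v = launch_speed(d_cm=30.0, theta_deg=45.0, h_cm=5.0)           # illustrative numbers
    print(round(v), round(motor_speed_dps(v, arm_radius_cm=15.0)))  # launch cm/s, motor deg/s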

Code for Projectile Motion 

projectile.PNG

Accuracy: 

  • From 5-15 cm: 90% accuracy

  • From 15-35 cm: 70% accuracy

  • 35 cm+: 40% accuracy

System Link Cloud

While there were previous assignments that involved commanding the robot remotely, this was my first time truly in charge of implementing this part, and I feel as though I fully understand the process and implementation of System Link Cloud. While the project required that System Link Cloud be used to launch the ball, I also used System Link Cloud to choose the angle of the platform. The user can choose which mode to run the robot in: the Training Mode to collect data, the Projectile Motion Mode, which throws the ball based on kinematics equations, or the Machine Learning Mode, which throws the ball based on the linear equations fit to the data collected during training.

The EV3 data is also sent to System Link Cloud: the distance of the cup from the ultrasonic sensor is displayed along with whether the ball successfully launched and is ready to launch again (shown with text and an LED that is green when ready to launch and red when not yet ready).
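The actual implementation is in the code screenshots below; as a rough illustration of the control flow just described, here is a Python sketch of the dashboard tags and polling loop. The tag names and the read_tag/write_tag and robot helpers are hypothetical placeholders, not the real System Link Cloud API or my actual tag names.

    # Hypothetical sketch of the System Link Cloud control flow; every helper below
    # is a placeholder, not the real SystemLink API or the robot code.
    import time

    def read_tag(name):                   # stand-in: read a dashboard tag from the cloud
        raise NotImplementedError

    def write_tag(name, value):           # stand-in: write a dashboard tag to the cloud
        raise NotImplementedError

    def measure_cup_distance():           # stand-in: ultrasonic reading in cm
        raise NotImplementedError

    def tilt_platform(angle_deg):         # stand-in: rotate the platform motor
        raise NotImplementedError

    def throw_ball(mode, angle_deg, distance_cm):   # training / projectile / machine learning
        raise NotImplementedError

    def control_loop():
        while True:
            mode = read_tag("mode")                # Training, Projectile Motion, or Machine Learning
            angle = read_tag("platform_angle")     # platform angle chosen by the user
            if read_tag("launch"):                 # user pressed Launch on the dashboard
                write_tag("ready", False)          # text + red LED: not ready yet
                distance = measure_cup_distance()
                write_tag("cup_distance_cm", distance)  # displayed on the dashboard
                tilt_platform(angle)
                throw_ball(mode, angle, distance)
                tilt_platform(0)                   # return platform to its starting position
                write_tag("launch", False)
                write_tag("ready", True)           # text + green LED: ready to launch again
            time.sleep(0.5)                        # poll the cloud at a modest rate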

Dashboard on System Link Cloud

Interface.PNG

System Link Cloud Code

systemlink1.PNG
systemlink2.PNG
systemlin3.PNG
systemlink4.PNG

Challenges

One of the most challenging aspects of my robot design was the tilting platform. Because the trajectory of the ball changes at different angles, I had to collect data for many possible platform angles. When I connected the tilting platform to System Link Cloud, I designed the code so that, after each time it received user input, it would tilt the platform to the chosen angle and return it to its initial position after running through the program. I noticed, however, that the platform did not always return to exactly where it had begun, and as the platform continued to tilt, the placement error kept growing. Throughout my tests I also found the ultrasonic sensor reading different values even when the cup remained in the same place as in the previous test, due to the roundness of the cup.
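One mitigation I could try for the noisy ultrasonic readings (not something the current code does) is to take several readings and keep the median, as in the short sketch below; read_distance_cm is a hypothetical stand-in for the actual sensor read.

    import statistics
    import time

    def median_distance_cm(read_distance_cm, samples=7, delay_s=0.05):
        """Take several ultrasonic readings and return their median, which is less
        sensitive to an occasional bad echo off the round cup than a single reading."""
        readings = []
        for _ in range(samples):
            readings.append(read_distance_cm())
            time.sleep(delay_s)
        return statistics.median(readings)

    # Example with a fake sensor that returns one bad echo among good readings:
    fake = iter([30.1, 30.3, 55.0, 29.9, 30.2, 30.0, 30.4])
    print(median_distance_cm(lambda: next(fake), samples=7, delay_s=0))  # 30.2

For the platform drift, one option would be to re-home the platform against a physical stop or a touch sensor before each run instead of relying only on relative motor moves.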

Reflections

This semester was my first time working with Lego and robotics, and it is very exciting to see how much I have learned during this class and this project; it has also shown me where I can grow and expand my robot's capabilities. In my next version of this robot, I plan to continue testing my skills with machine learning and try to correctly implement K-Nearest Neighbors and K-Means. In the future I plan to iterate through my designs before testing, making sure I have all of the parts incorporated as needed (for instance, choosing the cup before collecting data). After talking to the employees at Lego, I learned that it is easier to detect a flat object than a round object with the ultrasonic sensor, so in the future I would choose a rectangular object for more accurate readings. If I had more time on this assignment, I would have collected more data, which would have increased my accuracy in the machine learning portion.
