This is the FIRST of many blog posts detailing the escapades of the programming team. I am Sam Weaver, Programming Lead, and I will be posting about how our programming team performed this season. This post focuses on the efforts of Gamma Squadron, one of the many subteams of the Programming Team, of which I was the captain. Gamma Squadron was responsible for creating the Vision driver-assist code that ran on our robot. Though the code is not used in the current iteration of the robot, it was an excellent learning experience for us, and it was very powerful back when we had our high shooter.

First steps: Planning

The first thing we did after kickoff was analyze where vision could be used in this year's game. We determined it had two major potential uses: targeting boulders for driver intake, and targeting the high goal for accurate shooting. We chose to focus on the latter. After weighing the options, we decided to run our code, written in Python 2.7 with OpenCV, on a Jetson TK1 from FIRST Choice.

Our first step was to add a light to the robot and threshold out any noise, leaving a mask with just the target highlighted, something similar to this:


Then we realized that to do the kinds of operations we wanted, we needed the orientation and distance of the target relative to the robot. This was the most difficult part.

Enter cv2.solvePnP!

The aforementioned function is steeped in linear algebra, something many of our programmers were unfamiliar with; within Gamma Squadron, we spent a week of the season just trying to figure this bit out. Eventually, we discovered that we could pass it the real-world dimensions of the target and the four corners of the target we found in the image, and it would calculate a distance and angle. We then tested it at several known positions and used a calculator to fit a function that could correct the output at any point, and voila! Oh, and let's not forget the camera calibration!


We decided to use the excellent RobotPy project's version of NetworkTables for Python. This allowed our code to communicate with the roboRIO regardless of whether it was running on the Jetson on the bot or on a driver computer for testing.
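Publishing vision results over NetworkTables looks roughly like the sketch below. The server address is a placeholder (XXXX stands for a team number), the key names are made up, and the small stub class only exists so the sketch runs even where pynetworktables isn't installed; on the robot the real import is used:

```python
# Sketch of publishing vision output with RobotPy's pynetworktables.
try:
    from networktables import NetworkTables
except ImportError:
    # Minimal in-memory stand-in so this sketch runs without the
    # library; it mimics only the calls used below.
    class _Table(dict):
        def putNumber(self, key, value):
            self[key] = float(value)
        def getNumber(self, key, default):
            return self.get(key, default)

    class NetworkTables(object):
        _tables = {}

        @classmethod
        def initialize(cls, server=None):
            pass  # the real call starts a client thread

        @classmethod
        def getTable(cls, name):
            return cls._tables.setdefault(name, _Table())

# "XXXX" is a placeholder for the team number.
NetworkTables.initialize(server="roborio-XXXX-frc.local")
table = NetworkTables.getTable("vision")

# Publish what the pose math gave us; the roboRIO code reads the
# same keys to drive the alignment. Key names are hypothetical.
table.putNumber("distance_m", 2.0)
table.putNumber("yaw_deg", -3.5)
```

Because NetworkTables just syncs keys between hosts, the same script works unchanged whether it runs on the Jetson or on a laptop pointed at the robot's network.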

Integration with our Drivers

Gamma Squadron worked with Delta Fleet to decide on the best driver control scheme for this, and they settled on having drivers hold a button to vision-align. One button would position our robot at an accurate distance from the goal, and the other would rotate us to the correct angle. Together, these let our bot aim for the high goal from anywhere in the courtyard!
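The hold-to-align behavior can be sketched as a simple proportional controller: while the button is held, the vision error (distance or angle) is mapped to a drive command, with a tolerance band to stop when aligned. The gain, tolerance, and clamp values here are invented for illustration, not our real tuning:

```python
def align_command(error, kp=0.02, tolerance=0.5, max_output=0.4):
    """Map a vision error (e.g. yaw in degrees) to a motor command.

    Hypothetical gains: kp scales error to output, tolerance is the
    "close enough" band, max_output caps the drivetrain command.
    """
    if abs(error) <= tolerance:
        return 0.0  # aligned: stop turning/driving
    command = kp * error
    # Clamp so a large error never sends a violent command.
    return max(-max_output, min(max_output, command))
```

One such loop per button (distance and angle) is enough for this scheme; releasing the button simply stops calling it and returns control to the driver.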