Robotics is at the leading edge of Industry 4.0, AI and the Edge revolution. Let's look at how we can create an FPGA-controlled robot arm.
Robotics is at the forefront of the Industry 4.0 and edge revolution, along with artificial intelligence and machine learning.
As such, I thought it would be fun to create a base robot arm project that we can come back to and extend with several features, such as:
Inverse kinematics - determining the joint angles required to place the end effector at a desired position.
AI / ML - object classification during operation.
Networked control - enabling remote control at the edge.
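To give a flavour of the inverse kinematics item, here is a minimal sketch for a planar two-link arm. The link lengths and target coordinates are illustrative only, not taken from this project's arm, and a real six-servo arm would need a fuller kinematic model:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics (elbow-down solution).

    Returns the shoulder and elbow angles in radians that place the
    end effector at (x, y), or raises ValueError if it is unreachable.
    """
    # Law of cosines gives the elbow angle from the target distance
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.atan2(math.sqrt(1.0 - c2 * c2), c2)  # elbow-down branch
    # Shoulder angle: direction to the target, minus the offset
    # introduced by the bent elbow
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

Feeding the resulting angles back through forward kinematics is a quick sanity check that the solution lands on the requested target.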
This example uses a robot arm driven by six servos under the control of a Zynq SoC. It will be controllable using either a simple software interface or two Pmod joysticks for direct control.
The first thing we need to do is work out how we are going to control the servo position. Servos are among the simplest motors to drive, and are ideal for robotics as they hold their position provided we maintain the same drive signal.
So what is the drive signal for a servo? Most servos in the class we are using expect a 60 Hz PWM waveform. Within the 16.66 ms period of the 60 Hz waveform, the signal is held high for between 0.5 ms and 2.5 ms. The duration of this high pulse drives the servo across its range of motion between 0 and 180 degrees.
Driving a 0.5 ms pulse selects the 0 degree position, while a 2.5 ms pulse results in 180 degrees. The 90 degree position is therefore maintained by driving the signal high for 1.5 ms.
Therefore, increasing or decreasing the pulse width by approximately 11.1 us moves the servo by 1 degree (the 2.0 ms usable range divided by 180 degrees).
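The angle-to-pulse-width mapping above is easy to capture in a small helper. A sketch of it in Python (the function name is mine, not from the project):

```python
def angle_to_pulse_ms(angle_deg):
    """Map a servo angle (0-180 degrees) to a pulse width in ms.

    0 degrees -> 0.5 ms, 90 degrees -> 1.5 ms, 180 degrees -> 2.5 ms.
    """
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("angle out of range")
    # 2.0 ms of usable range over 180 degrees, ~11.1 us per degree
    return 0.5 + (angle_deg / 180.0) * 2.0
```

Whatever generates the PWM, be it a timer counter or an external controller, ultimately just has to realise this pulse width within each 16.66 ms frame.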
While we have the triple timer counters, which are capable of providing the necessary PWM signals, we also need to supply servo power at 6 volts in this application. The simplest approach is therefore to use the Adafruit PWM Shield, which not only provides the 6 V power rail but also performs the level conversion on the PWM signals.
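The Adafruit PWM shield is built around the PCA9685, a 16-channel, 12-bit I2C PWM controller, so in software each pulse width becomes an off-count out of the 4096 ticks that make up one PWM frame. A hedged sketch of that conversion follows; the actual I2C register writes are omitted, and the function name is mine:

```python
def pulse_to_pca9685_count(pulse_ms, frame_hz=60):
    """Convert a pulse width in ms to a 12-bit PCA9685 off-count.

    The PCA9685 divides each PWM frame into 4096 ticks; a pulse that
    turns on at tick 0 turns off at tick (pulse_ms / period_ms) * 4096.
    """
    period_ms = 1000.0 / frame_hz
    count = round(pulse_ms / period_ms * 4096)
    # Clamp to the valid 12-bit range
    return max(0, min(4095, count))
```

At 60 Hz this places the 1.5 ms centre position at roughly count 369 of 4096, with the 0.5 ms and 2.5 ms endpoints near counts 123 and 614.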