from the camera to find the location of the clusters (objects) and then feed the objects' 3D positions to the inverse kinematics code as the desired end-effector position. You will then be able to use ROS to command the robot's joints so that the end-effector moves to this desired position. Feel free to experiment and design your own project.

2 Vision-Aided Numerical Inverse Kinematics Control of the Robot Arm

The objective of this part is to merge the numerical inverse kinematics from the previous part with the robot's perception package to create a vision-aided inverse kinematics mission planner. The depth camera first captures the AprilTag attached to the robot's arm, which lets ROS compute the homogeneous transformation of the robot's base frame relative to the camera (and vice versa). This transformation is used to convert the camera's depth readings of scene objects into homogeneous transformations expressed in the robot's base frame. The inverse kinematics module then maps these transformations to joint-angle set-points so that the robot can grasp and release the objects. For a short introduction to image processing basics, you can refer to this lesson.

The camera we used is the Intel RealSense D415 depth camera, which offers 1920×1080 resolution at 30 fps, a 65° × 40° field of view, and an ideal operating range of 0.5 m to 3 m. Its stereo vision technology, together with an infrared sensor, allows us to estimate the 3D structure of the scene.

2.1 Physical Setup and Running the Perception Pipeline

First, get the physical setup ready. Construct the stand and fasten the RealSense camera onto the 1/4-inch screw at its top. Then position the stand in your designated workspace and adjust the goose-neck or ball/socket joint to aim the camera at your tabletop. Finally, arrange the target objects and the robot so that both the objects and the AprilTag are clearly visible in the camera's view. Fig. 1 shows a sample setup for your reference.
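The two geometric steps described above (back-projecting a depth pixel into the camera frame, then re-expressing it in the robot's base frame using the AprilTag-derived transform) can be sketched with plain NumPy. This is a minimal illustration, not the actual perception package: the function names, the intrinsics values, and the example camera pose `T_base_cam` are all hypothetical stand-ins for what the calibration and AprilTag detection would provide.

```python
import numpy as np

def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into the camera frame
    using the pinhole model. Intrinsics (fx, fy, cx, cy) are assumed to
    come from the camera's calibration; values below are illustrative."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth, 1.0])  # homogeneous point, camera frame

def to_base_frame(T_base_cam, p_cam):
    """Re-express a homogeneous point in the robot base frame.
    T_base_cam is the 4x4 transform that the AprilTag detection yields."""
    return T_base_cam @ p_cam

# Hypothetical pose: camera 0.6 m above the base origin, looking straight down.
T_base_cam = np.array([
    [1.0,  0.0,  0.0, 0.0],
    [0.0, -1.0,  0.0, 0.0],
    [0.0,  0.0, -1.0, 0.6],
    [0.0,  0.0,  0.0, 1.0],
])

# A pixel at the principal point with 0.5 m depth lies on the optical axis,
# so in the base frame it sits 0.1 m above the base origin.
p_cam = deproject(960, 540, 0.5, fx=1380.0, fy=1380.0, cx=960.0, cy=540.0)
p_base = to_base_frame(T_base_cam, p_cam)
```

The resulting `p_base[:3]` is what would be handed to the inverse kinematics module as the desired end-effector position. In the actual pipeline, ROS's tf tree performs this frame bookkeeping once the camera-to-base transform is published.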
Computer Vision News: Vision-aided Screw Theory-based…