The project uses a UR5e from Universal Robots for automated screw driving.
The novelty of this system lies in the integration of a depth camera mounted on the end effector, enabling the robot to adapt to different screw-hole locations without manual tuning.
Project Duration: 6 weeks
Authors:
- Nicholas (James) Bell: Nick's LinkedIn Profile
- Jiawen Zhang: Jiawen's LinkedIn profile
- David Nie: David's LinkedIn profile
Supervisor:
- Daryl Lee: Daryl's LinkedIn profile
Video link: https://www.youtube.com/shorts/ufF1myLWomA
brain_routine_test sometimes sends two commands to brain; is_busy support needs to be added in brain.
- [Week 10 Mon][Nicholas, Jiawen] Fine-tuned the brain routine to align with hole displacement and optimised the screwdriving control routine.
- [Week 9 Thu][Nicholas, Jiawen] Refined the movement package for precise Cartesian control commands.
- [Week 9 Wed][Jiawen] Attempted the jogging command to improve precision of arm movement.
- [Week 9 Tue] Move-to-screw fully working; changed the home position (change in the arm-movement pkg); ignore large screws.
- [Week 9 Mon] Added an OOI frame that indicates the screw hole; added transformation-to-real-coordinate support.
- [Week 8 Thu][Jiawen] Attempted Cartesian movement for the movement package, but MoveIt was unable to finish planning the path.
- [Week 8 Wed] Added support for converting to RealCoor with respect to base_link; brought back the transformation pkg.
- [Week 8 Wed] Fixed the centroid-locating algorithm.
- [Week 8 Tue][Nicholas, Jiawen] Attempted orientation constraint and established movement package to control UR5e robot arm based on joint constraints.
- [Week 8 Tue] Enabled collision checking; added toolpoint_link and camera_link; deprecated the transformation pkg.
- [Week 8 Mon][Jiawen] Created the base code for the movement package, including an arm brain and arm movement.
- [Week 7 Sun] Added dy_trans between camera_socket and camera; tuned the end_effector scale.
- [Week 7 Sun] End_effector_description package complete; system_launch now launches with UR5e, camera, and end_effector visualisation.
- [Week 7 Sat] Added a stub for the screwdriving routine.
- [Week 7 Wed] Vision and Brain framework completed; testing package added.
- [Week 7 Wed][Nicholas] Established Arduino serial communication inside ROS and mapped basic motor control functions inside the End Effector package.
- [Week 7 Tue][Nicholas] End Effector design complete and 3D printed.
- Rebase first (so that the latest commits are on top):
  git fetch origin          # updates origin/master
  git rebase origin/master  # rebases the current branch onto origin/master
- Make sure you squash the commits when merging
- Make sure the codebase is stable
- Add to "Recent Updates" if it's a feature update
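The rebase-then-squash flow above can be sketched end-to-end in a throwaway repository. This is a minimal demo, assuming git >= 2.28 (for `init -b`); a local master branch stands in for origin/master, since no remote is configured here:

```shell
# Hypothetical demo in a throwaway repo; "master" stands in for origin/master.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b master
git config user.email "you@example.com"
git config user.name "You"

echo base > file.txt && git add file.txt && git commit -qm "base"

# Work on a feature branch with a couple of WIP commits
git checkout -qb feature
echo a >> file.txt && git commit -qam "wip 1"
echo b >> file.txt && git commit -qam "wip 2"

# Meanwhile master moves ahead (what `git fetch origin` would reveal)
git checkout -q master
echo upstream > other.txt && git add other.txt && git commit -qm "upstream work"

# Rebase the feature branch so the latest commits are on top
git checkout -q feature
git rebase -q master          # in the real workflow: git rebase origin/master

# Squash the whole branch into a single commit when merging
git checkout -q master
git merge -q --squash feature
git commit -qm "feature: squashed"
git log --oneline
```

The squash merge keeps master's history to one commit per feature, which is what the contribution rule above asks for.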
- movement (services and publisher model for communicating with the UR5e)
  - arm_brain
  - arm_movement
  - README.md (for movement)
- vision (centroid-locating services and publisher model)
  - Vision Server
- end_effector (end_effector-related publisher and control algorithms)
  - end_effector (server offering services such as screwdriving, light on/off, status report)
  - arduino_serial (for bridging Arduino and ROS2)
  - end_effector_description (contains end_effector visualisation, camera launch, and UR5e launch)
- brain
  - brain
- interfaces (custom messages and services)
  - Src
    - BrainCmd (for testing individual packages)
    - VisionCmd (interface with the Vision Module)
    - EndEffectorCmd
    - ArmCmd (interface with MoveIt and the UR5e)
    - BrainRoutineCmd (runs the closed-loop operation)
    - PublishOoiCmd (publishes an OOI frame that indicates the target screw hole)
    - RealCoorCmd (converts to coordinates with respect to base_link)
  - Msg
    - N/A
- transformations (static transformation publishers)
  - Src
    - camera_dy_trans (dy_broadcaster between camera_socket and camera)
    - ooi_server (publishes OOI, converts to RealCoor with respect to base_link)
- end_effector visualisation in Rviz
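The service interfaces listed above follow ROS 2's standard `.srv` layout: request fields above a `---` separator, response fields below. A hypothetical sketch of what a RealCoorCmd.srv could look like (field names are illustrative, not the project's actual definition):

```text
# Request: a point in the camera frame (illustrative field names)
geometry_msgs/Point camera_point
---
# Response: the same point expressed with respect to base_link
geometry_msgs/Point base_link_point
bool success
```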
The code is tuned for small circles
python3 blob_detection.py
python3 blob_detection_im.py
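The centroid-locating step can be sketched as a connected-component centroid pass over a binary mask. This is a simplified stand-in for blob_detection.py (which would operate on real camera frames, e.g. via OpenCV); all names below are illustrative:

```python
import numpy as np

def locate_centroids(mask: np.ndarray, min_area: int = 5) -> list[tuple[float, float]]:
    """Return (row, col) centroids of connected white regions in a binary mask.

    Illustrative sketch of the centroid-locating idea; the real pipeline is
    assumed to use OpenCV blob detection on camera images.
    """
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill one 4-connected component
                stack, pixels = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:  # ignore specks below the size threshold
                    ys, xs = zip(*pixels)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Example: one 3x3 blob centred at row 2, col 2
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
print(locate_centroids(mask))  # → [(2.0, 2.0)]
```

The `min_area` threshold mirrors the "tuned for small circles" note: size filtering is how undersized noise (or, with an upper bound, the large screws skipped in Week 9) would be excluded.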
Step 1: In one terminal, run ros2 launch brain system_launch.py
Step 2: In another terminal, run ros2 run testing brain_vision_test, or other testing files
Step 1: In one terminal, run ros2 launch brain without_endeffector_launch.py
Step 2: In another terminal, run ros2 run testing brain_vision_test, or other testing files
- use_fake indicates whether a real or fake UR5e is used.
- There are launch files that launch only the end-effector, or the end-effector with the UR5e without driver support.