Jetbot Diff Drive

|| A ROS package for the WaveShare Jetbot AI Kit, a differential drive robot ||

Jetbot_Diff_Drive

To get the previous version of the files, download this repo and run git checkout cd8f47e inside the repository folder.
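For example (a minimal sketch; the repo URL is the one used in the installation steps below):
    git clone https://github.com/issaiass/jetbot_diff_drive
    cd jetbot_diff_drive
    git checkout cd8f47e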

Brief Review

This project includes all the files necessary to reproduce a simulation of the WaveShare Jetbot AI Kit model in rviz and gazebo, so you can visualize the camera and control and navigate the differential drive robot.

At this point there are several SLAM and navigation algorithms that you can explore below, such as gmapping, karto and hector.

The structure of this repository is based mainly on the husky, turtlebot3 and Raymon Wijnands' move_base_flex repositories. Other repositories, such as those from Intel and PAL Robotics, are mentioned below.

Several sensors are included in the simulation like:

  • 1 x RGB Camera
  • 1 x Intel Realsense Depth Camera D435
  • 1 x RPLIdar Laser Scan
  • 8 x Sonar
  • 1 x GPS
  • 1 x IMU
  • 1 x Odometry (see issues at the end of the document)

All sensors, excluding the jetbot camera, can be visually enabled or disabled.

NOTE:

  • For the realsense you will first need to build the plugin that is included here
  • For the IMU, GPS, odometry and sonar you will need the hector_gazebo_plugins folder from here or here
  • Several dependencies are needed; see the package.xml of each package.

The robot is a WaveShare Jetbot AI Kit and its main goal is navigation.

Below are a few image examples of the outcome.

The project tree:

The project is now divided into several folders, so you can easily and effectively execute each file.

Using Jetbot Differential Drive Package

NOTE: By default, all sensors are visually enabled.

  • Copy the ball folder from the jetbot_diff_drive/model folder into the ~/.gazebo/models folder so the soccer ball also loads, for example:
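    # a minimal sketch, assuming the repo was cloned into ~/catkin_ws/src as described below
    mkdir -p ~/.gazebo/models
    cp -r ~/catkin_ws/src/jetbot_diff_drive/model/ball ~/.gazebo/models/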
  • See the prerequisites in the package.xml of each package.
  • Create a ROS workspace and build it while still empty:
    cd ~
    mkdir -p catkin_ws/src
    cd catkin_ws
    catkin_make
  • Open the .bashrc with nano:
    nano ~/.bashrc
  • Insert this line at the end of the ~/.bashrc file to source your workspace:
    source ~/catkin_ws/devel/setup.bash
  • Clone this repo in the ~/catkin_ws/src folder by typing:
    cd ~/catkin_ws/src
    git clone https://github.com/issaiass/jetbot_diff_drive --recursive
    git clone https://github.com/issaiass/realsense_gazebo_plugin
    git clone https://github.com/issaiass/hector_gazebo_plugins
    cd ..
  • Go to the workspace root folder ~/catkin_ws and build it by running catkin_make to ensure everything compiles, for example:
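    # rebuild the workspace with the newly cloned packages (sketch of the step described above)
    cd ~/catkin_ws
    catkin_make
    # open a new terminal or source ~/.bashrc so the new packages are found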
  • Now you can test the packages in several ways
  • For just the robot description
    # 1st terminal - mount only the robot description
    roslaunch jetbot_description description.launch
    # 2nd terminal (optional) - get the robot description
    rosparam get /robot_description
  • Visualizing only the robot
  • You can see more available parameters by pressing tab
    # visualize the robot in rviz
    roslaunch jetbot_viz view_model.launch
    # example... visualize the robot in rviz and disable intel realsense
    roslaunch jetbot_viz view_model.launch realsense_enable:=false
  • To view only the gazebo simulation (no control)
  • There are more parameters for enabling or disabling sensors
  • Press tab if you want to see the full list of parameters
  • Basic spawning of the robot
    # spawn jetbot model in gazebo in turtlebot3_world
    roslaunch jetbot_gazebo spawn_jetbot.launch
    # example... spawn jetbot model in gazebo, other world
    roslaunch jetbot_gazebo spawn_jetbot.launch world_name:=<your_world>
  • For controlling the jetbot in gazebo and visualize in rviz
    # launch the jetbot to control it in gazebo and visualize in rviz simultaneously
    roslaunch jetbot_control control.launch
    # OR
    # Same as above but with multiple terminals (4 terminals to launch)
    roslaunch jetbot_gazebo spawn_jetbot.launch
    roslaunch jetbot_viz view_model.launch
    roslaunch jetbot_control jetbot_controller_manager.launch
    roslaunch jetbot_control jetbot_rqt_robot_steering.launch

Finally, control the robot with the rqt steering controller
  • For robot navigation (it is not fine-tuned at this checkpoint):
    # 1st terminal, launch gazebo
    roslaunch jetbot_gazebo spawn_jetbot.launch
    # 2nd terminal, launch navigation node (dynamic window approach or time elastic band)
    # <option> = teb or dwa
    roslaunch jetbot_navigation jetbot_navigation.launch local_planner:=<option>
    # 2nd terminal, or launch navigation node (dynamic window approach only)
    # <option> = 0 or 1, 0 = move_base 1 = move_base_flex
    # Let's say we want move_base_flex, then the argument is 1
    roslaunch jetbot_navigation jetbot_navigation.launch move_base_flex:=<option>
  • For robot slam:
    # 1st terminal, launch gazebo
    roslaunch jetbot_gazebo spawn_jetbot.launch
    # 2nd terminal, launch slam node
    # <option>: gmapping, hector or karto
    roslaunch jetbot_navigation jetbot_slam.launch slam_methods:=<option>
    # 3rd terminal, launch a controller (option 1)
    roslaunch jetbot_control jetbot_rqt_control_steering.launch
    # 3rd terminal, launch a controller (option 2)
    rosrun jetbot_twist_keyboard teleop_twist_keyboard.py
    # 4th terminal, save the map when finished
    rosrun map_server map_saver -f <path_and_name_of_map>
  • You can enable/disable any sensor in the launch file.
  • You should see roscore and all configurations loading successfully; a quick check is shown below.
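  • A quick sanity check (generic ROS introspection commands, not specific to this repo; the map path is only an example):
    # list the running nodes and topics after launching
    rosnode list
    rostopic list
    # save the finished map to an example location
    mkdir -p ~/maps
    rosrun map_server map_saver -f ~/maps/jetbot_world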

Results

You can see the results in this youtube video.

Last video update - Jetbot AI Kit Karto SLAM:

Previous videos list:

Jetbot AI Kit Hector SLAM

Jetbot AI Kit move_base_flex and dwa planner

SLAM using gmapping

Navigation Stack with DWA and TEB Planners

Odometry Plugin

Sonar Plugin

GPS Plugin

IMU Plugin

RPLidar

Realsense and PCL Demo in gazebo and rviz

ROS Controllers and Camera Plugin in gazebo and rviz

The video only shows the application running, not the explanation of the code.

Video Explanation

I will try my best to make an explanatory video of the application, as in this youtube video.

Last video update - Explaining Jetbot AI Kit Karto SLAM:

Previous videos list:

Explaining Jetbot AI Kit Hector SLAM

Explaining Jetbot AI Kit move_base_flex and dwa planner

Explaining Move Base Flex

Explaining Jetbot AI Kit gmapping SLAM

Explaining Navigation Stack with DWA and TEB Planners

Explaining Odometry Plugin

Explaining Sonar Plugin

Explaining GPS Plugin

Explaining IMU Plugin

Explaining how to solve Model Sinking, not moving or drifting

Explaining RPLidar

Explaining Realsense and PCL in gazebo and rviz

Explaining ROS Controllers and Camera Plugin in gazebo and rviz

Issues
  • When the navigation stack is running, at some point the map is not aligned with the laser scan; we have to test more to see whether it is AMCL or other map parameters related to local planning.
  • The URDF needs some modification: if you disable the imu link, the gps will not link and will cause an error.
  • Always leave both imu_enable and gps_enable set to true. I will fix that later.
  • Planners are not fine-tuned and sometimes cause the bot to go back and forth.
  • For some reason the p3d odometry plugin from libhector always reads (in my case) the frame id and child frame id as odom; the next plan is to make a simple node package that gets the transform between base_link and base_footprint and publishes it on a topic (see the quick check after this list).
  • Cartographer is not working; it does not publish submap_list.
  • hdl_graph_slam maps the realsense point clouds, but the position of the robot is odd; we are inspecting the input values to the tf transform inside the hdl_graph_slam configuration.
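
A quick way to inspect the frames while debugging the odometry issue (generic tf and rostopic tooling; the /odom topic name and the base_footprint/base_link frame names are assumptions):
    # print the current transform between the two frames the planned node would publish
    rosrun tf tf_echo base_footprint base_link
    # check which frame_id and child_frame_id the odometry plugin is actually publishing
    rostopic echo -n 1 /odom
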
Future Work

Planning to add to this project:

  • :x: Probably I will add effort controllers
  • :heavy_check_mark: Navigation capabilities
  • :x: Navigation fine tuning
  • :heavy_check_mark: Computer Vision capabilities
  • :x: OpenVINO as an inference engine for future deep learning based projects
Contributing

Your contributions are always welcome! Please feel free to fork and modify the content, but remember to submit a pull request when you are done.

:iphone: Having Problems?

License
