# Tailored_Target_Jun_for_2024

**Repository Path**: ucas-sas-robot-team/tailored-target-jun-for-2024

## Basic Information

- **Project Name**: Tailored_Target_Jun_for_2024
- **Description**: Baseline for self-target research since 2023
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-03-10
- **Last Updated**: 2024-04-06

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Tailored code for self-target 2024

## Getting started

### Hardware requirements

- Intel RealSense camera (or you can use our recorded bags for debugging)
- USB-to-serial (ttyUSB) module

### Environment setup

- Ubuntu 20.04
- ROS 2 Foxy
- Intel RealSense SDK and its ROS 2 wrapper

### IMU serial setup

ROS 2 Foxy and Humble do not ship the `serial` library, so install it manually:

```bash
# make sure that you have sourced your ROS 2 environment
cd
git clone https://gitee.com/ucas-sas-robot-team/serial.git # a mirror of a ROS 2 serial repo on GitHub
cd serial
mkdir build
cd build
cmake ..
sudo make install
```

### System setup

```bash
cd
git clone https://gitee.com/ucas-sas-robot-team/tailored-target-jun-for-2024
cd tailored-target-jun-for-2024
sudo apt-get update && rosdep install --from-paths src --ignore-src -r -y
```

```bash
sudo apt install libgeometry-msgs-dev # for the geometry_msgs #include headers
```

### Build

Select the robot type in `src/rm_vision/rm_vision_bringup/config/robot_type.yaml`. The mounting orientations of the IMU, camera, and other components are loaded at launch time based on this setting.

```bash
# assume you are already in the root of the workspace tailored-target-jun-for-2024
colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release
```

### Run

Make sure the IMU is mapped to `ttyUSBimu` and the CAN serial adapter to `ttyUSBcan`. If not, change the ports in the config file. The serial port names are currently fixed.

In one window (start the nodes):

```bash
source install/setup.bash
ros2 launch rm_vision_bringup all.launch.py
```

In another window (set the camera parameters):

```bash
ros2 param set /camera/camera rgb_camera.auto_exposure_priority false
ros2 param set /camera/camera rgb_camera.enable_auto_exposure false
ros2 param set /camera/camera rgb_camera.exposure 30
```

Start the serial node only through `all.launch.py`, not through `serial_driver.launch`, because the muzzle calibration coefficients are stored in `shoot_caliberate.yaml`.

With the current logic in the code, plug in the IMU first and the CAN USB adapter second, so that the IMU ends up as `ttyUSBimu` and the CAN adapter as `ttyUSBcan`.

#### Without hardware, using our bags

In one window:

```bash
source install/setup.bash
ros2 launch rm_vision_bringup no_hardware.launch.py
```

In another window, download our bags from the [Baidu Netdisk link](https://pan.baidu.com/s/1IdKN-XNSiF6MLV-Dg0wDOg?pwd=w6fw), unzip them, navigate to the folder containing the extracted `848x480x60_30_mid_trans`, and play the bag:

```bash
ros2 bag play 848x480x60_30_mid_trans
```

#### With hardware

##### All in one

```bash
source install/setup.bash
ros2 launch rm_vision_bringup all.launch.py
```

##### Separate

In one window:

```bash
source install/setup.bash
ros2 launch rm_vision_bringup no_hardware.launch.py
```

In another window:

```bash
ros2 launch realsense2_camera rs_launch.py camera.profile:=848x480x60
```

We set the camera exposure to 30.

Launch the IMU:

```bash
source install/setup.bash
ros2 launch fdilink_ahrs odom_tf.launch.py
```

#### Visualization

You can use rviz2 to visualize the results.

```bash
# enter the root of the workspace
source install/setup.bash
rviz2 -d src/rm_vision/rm_vision_bringup/rviz/visualize.rviz
```
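Besides RViz and rqt, a small script can confirm that the transforms are actually being published. The sketch below is illustrative and not part of this repository; it only assumes the frame names listed in the Design section below (`odom`, `camera_color_optical_frame`).

```python
# tf_check.py (illustrative sketch, not part of this repository)
import rclpy
from rclpy.node import Node
from tf2_ros import Buffer, TransformListener


class FrameCheck(Node):
    """Periodically report whether odom -> camera_color_optical_frame is resolvable."""

    def __init__(self):
        super().__init__('frame_check')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.check)

    def check(self):
        # can_transform() returns True once the whole TF chain is available.
        ok = self.buffer.can_transform(
            'odom', 'camera_color_optical_frame', rclpy.time.Time())
        self.get_logger().info(
            f'odom -> camera_color_optical_frame available: {ok}')


def main():
    rclpy.init()
    rclpy.spin(FrameCheck())


if __name__ == '__main__':
    main()
```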
You can use rqt to inspect the node graph and TF tree:

```bash
rqt
```

#### Notice

How to check and set the camera exposure:

```bash
ros2 param get /camera/camera rgb_camera.exposure
ros2 param set /camera/camera rgb_camera.exposure 30.0
```

How to change parameters:

1. `ros2 param set xxx`
2. Edit the parameters in `node_params.yaml`
3. Change the ratios in the code

How to fix "/dev/ttyUSB0 permission denied":

```bash
sudo usermod -a -G dialout $USER
```

then reboot.

## Design

### Frame transforms

We use two ways to create frames. There are five frames in total: `odom` (the inertial frame), `gimbal_link`, `gun_link`, `imu`, and `camera_color_optical_frame`.

Only when the data is expressed in an inertial frame can the EKF produce a proper pose estimate, so we define an `odom` frame at the center of the gimbal with no rotation. (That is, we treat rotations as non-inertial but treat translations as inertial.) Note that the IMU alone is not enough to build an odometry.

1. Create a node.
   - Properties: it can publish both static and dynamic frames, but every time a frame changes we need to rebuild.
   - Reference: https://docs.ros.org/en/foxy/Tutorials/Intermediate/Tf2/Adding-A-Frame-Cpp.html
   - Usage: we only publish one transform, with parent frame `odom` and child frame `gimbal_link`, to record the rotation of the gimbal.

The control program for running without the lower-level controller still needs to be finished.

## Future work (TODO list)

1. Add the workflow for launching both sentry gimbals entirely on the NUC
2. Write a README for parameter calibration
3. Write the on-field workflow, including tuning the red/blue detection
4. Test the IMU hot-plug workflow

The robot CAN and IMU serial ports are defined as follows:

- For the infantry and hero robots, the CAN port is `/dev/ttyUSBcan` and the IMU is `/dev/ttyUSBimu`.
- For the sentry robot, the CAN port is `/dev/ttyUSBcan` and the IMUs are `/dev/ttyUSBimuup` and `/dev/ttyUSBimudown`.

To bring up both gimbals of the sentry robot, we need to start:

- two sets of: attitude solving, armor detection, and target tracking;
- one set of: serial transmit/receive;
- one set of: control without the lower-level controller.

With the current structure, the launch files would become rather messy. The launch files decide which parameters to load based on a string, and there is no way to pass that string from a higher-level Python launch file down to a lower-level one, because the parameter values are resolved at runtime while the launch file only generates a "description" of them.

To finish the task quickly, the hero and infantry launch method is kept as is: launch directly. For the sentry's dual-gimbal case, we write a new launch file from scratch rather than reusing the existing launch files, so that we modify as little code as possible and keep the focus on more important things.

For now, the auto-aim pipeline must be decoupled from the serial transmit/receive code; get the single-robot code running first.
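As a starting point for the sentry dual-gimbal bringup described above, a namespaced launch file could look like the sketch below. This is only an illustration of the structure (two namespaced pipelines plus one shared serial node): the package names, executable names, and parameter keys are placeholders, not the actual names used in this repository.

```python
# sentry_dual.launch.py (hypothetical sketch, not part of this repository)
from launch import LaunchDescription
from launch.actions import GroupAction
from launch_ros.actions import Node, PushRosNamespace


def gimbal_pipeline(ns: str, imu_port: str) -> GroupAction:
    """One attitude-solving / detection / tracking pipeline, isolated in a namespace."""
    # Package, executable and parameter names below are placeholders.
    return GroupAction([
        PushRosNamespace(ns),
        Node(package='imu_solver', executable='imu_solver_node',
             parameters=[{'port': imu_port}]),
        Node(package='armor_detector', executable='armor_detector_node'),
        Node(package='armor_tracker', executable='armor_tracker_node'),
    ])


def generate_launch_description() -> LaunchDescription:
    return LaunchDescription([
        # Two full pipelines, one per gimbal.
        gimbal_pipeline('gimbal_up', '/dev/ttyUSBimuup'),
        gimbal_pipeline('gimbal_down', '/dev/ttyUSBimudown'),
        # A single shared serial transmit/receive node on the CAN adapter.
        Node(package='rm_serial', executable='rm_serial_node',
             parameters=[{'port': '/dev/ttyUSBcan'}]),
        # The "no lower-level controller" control node would also be started here.
    ])
```

With `GroupAction` plus `PushRosNamespace`, node names, topics, and parameters of the two pipelines stay separated without having to thread a robot-name string through nested launch files.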