Commit 3e86814

Merge branch 'main' into feature/smooth_ndt_map_update
2 parents: 4a991a0 + 82864fb
64 files changed: +6216 −105 lines changed


.github/CODEOWNERS
+6 −9

@@ -42,7 +42,6 @@ common/tier4_camera_view_rviz_plugin/** makoto.ybauta@tier4.jp uken.ryu@tier4.jp
 common/tier4_control_rviz_plugin/** taiki.tanaka@tier4.jp
 common/tier4_datetime_rviz_plugin/** isamu.takagi@tier4.jp
 common/tier4_debug_rviz_plugin/** takayuki.murooka@tier4.jp
-common/tier4_debug_tools/** ryohsuke.mitsudome@tier4.jp
 common/tier4_localization_rviz_plugin/** isamu.takagi@tier4.jp takamasa.horibe@tier4.jp
 common/tier4_logging_level_configure_rviz_plugin/** kosuke.takeuchi@tier4.jp satoshi.ota@tier4.jp takamasa.horibe@tier4.jp
 common/tier4_perception_rviz_plugin/** yukihiro.saito@tier4.jp
@@ -124,11 +123,11 @@ perception/detection_by_tracker/** yoshi.ri@tier4.jp yukihiro.saito@tier4.jp
 perception/elevation_map_loader/** kosuke.takeuchi@tier4.jp taichi.higashide@tier4.jp
 perception/euclidean_cluster/** yukihiro.saito@tier4.jp
 perception/ground_segmentation/** abrahammonrroy@yahoo.com dai.nguyen@tier4.jp shunsuke.miura@tier4.jp yukihiro.saito@tier4.jp
-perception/image_projection_based_fusion/** dai.nguyen@tier4.jp kotaro.uetake@tier4.jp shunsuke.miura@tier4.jp tao.zhong@tier4.jp yoshi.ri@tier4.jp yukihiro.saito@tier4.jp
+perception/image_projection_based_fusion/** dai.nguyen@tier4.jp koji.minoda@tier4.jp kotaro.uetake@tier4.jp shunsuke.miura@tier4.jp tao.zhong@tier4.jp yoshi.ri@tier4.jp yukihiro.saito@tier4.jp
 perception/lidar_apollo_instance_segmentation/** yukihiro.saito@tier4.jp
 perception/lidar_apollo_segmentation_tvm/** ambroise.vincent@arm.com xinyu.wang@tier4.jp
 perception/lidar_apollo_segmentation_tvm_nodes/** ambroise.vincent@arm.com xinyu.wang@tier4.jp
-perception/lidar_centerpoint/** kenzo.lobos@tier4.jp
+perception/lidar_centerpoint/** kenzo.lobos@tier4.jp koji.minoda@tier4.jp
 perception/lidar_centerpoint_tvm/** carl.liu@autocore.ai xinyu.wang@tier4.jp
 perception/map_based_prediction/** kotaro.uetake@tier4.jp kyoichi.sugahara@tier4.jp shunsuke.miura@tier4.jp takayuki.murooka@tier4.jp tomoya.kimura@tier4.jp yoshi.ri@tier4.jp
 perception/multi_object_tracker/** yoshi.ri@tier4.jp yukihiro.saito@tier4.jp
@@ -192,10 +191,9 @@ planning/motion_velocity_smoother/** fumiya.watanabe@tier4.jp makoto.kurihara@ti
 planning/objects_of_interest_marker_interface/** fumiya.watanabe@tier4.jp kosuke.takeuchi@tier4.jp zulfaqar.azmi@tier4.jp
 planning/obstacle_avoidance_planner/** kosuke.takeuchi@tier4.jp takamasa.horibe@tier4.jp takayuki.murooka@tier4.jp
 planning/obstacle_cruise_planner/** kosuke.takeuchi@tier4.jp satoshi.ota@tier4.jp takayuki.murooka@tier4.jp yuki.takagi@tier4.jp
-planning/obstacle_stop_planner/** berkay@leodrive.ai satoshi.ota@tier4.jp shumpei.wakabayashi@tier4.jp taiki.tanaka@tier4.jp tomoya.kimura@tier4.jp
+planning/obstacle_stop_planner/** berkay@leodrive.ai bnk@leodrive.ai satoshi.ota@tier4.jp shumpei.wakabayashi@tier4.jp taiki.tanaka@tier4.jp tomoya.kimura@tier4.jp
 planning/obstacle_velocity_limiter/** maxime.clement@tier4.jp
 planning/path_smoother/** maxime.clement@tier4.jp takayuki.murooka@tier4.jp
-planning/planning_debug_tools/** taiki.tanaka@tier4.jp takamasa.horibe@tier4.jp takayuki.murooka@tier4.jp
 planning/planning_test_utils/** kyoichi.sugahara@tier4.jp takamasa.horibe@tier4.jp
 planning/planning_topic_converter/** kosuke.takeuchi@tier4.jp satoshi.ota@tier4.jp shumpei.wakabayashi@tier4.jp
 planning/planning_validator/** kosuke.takeuchi@tier4.jp takamasa.horibe@tier4.jp
@@ -209,12 +207,12 @@ planning/sampling_based_planner/sampler_common/** maxime.clement@tier4.jp
 planning/scenario_selector/** fumiya.watanabe@tier4.jp satoshi.ota@tier4.jp shumpei.wakabayashi@tier4.jp taiki.tanaka@tier4.jp takamasa.horibe@tier4.jp takayuki.murooka@tier4.jp tomoya.kimura@tier4.jp
 planning/static_centerline_optimizer/** kosuke.takeuchi@tier4.jp takayuki.murooka@tier4.jp
 planning/surround_obstacle_checker/** satoshi.ota@tier4.jp
-sensing/gnss_poser/** koji.minoda@tier4.jp yamato.ando@tier4.jp
+sensing/gnss_poser/** anh.nguyen.2@tier4.jp kento.yabuuchi.2@tier4.jp koji.minoda@tier4.jp masahiro.sakamoto@tier4.jp ryu.yamamoto@tier4.jp shintaro.sakoda@tier4.jp taiki.yamada@tier4.jp yamato.ando@tier4.jp
 sensing/image_diagnostics/** dai.nguyen@tier4.jp
 sensing/image_transport_decompressor/** yukihiro.saito@tier4.jp
 sensing/imu_corrector/** koji.minoda@tier4.jp taiki.yamada@tier4.jp yamato.ando@tier4.jp
 sensing/livox/livox_tag_filter/** ryohsuke.mitsudome@tier4.jp
-sensing/pointcloud_preprocessor/** abrahammonrroy@yahoo.com shunsuke.miura@tier4.jp yukihiro.saito@tier4.jp
+sensing/pointcloud_preprocessor/** abrahammonrroy@yahoo.com kyoichi.sugahara@tier4.jp shunsuke.miura@tier4.jp yukihiro.saito@tier4.jp
 sensing/radar_scan_to_pointcloud2/** satoshi.tanaka@tier4.jp shunsuke.miura@tier4.jp taekjin.lee@tier4.jp yoshi.ri@tier4.jp
 sensing/radar_static_pointcloud_filter/** satoshi.tanaka@tier4.jp shunsuke.miura@tier4.jp taekjin.lee@tier4.jp yoshi.ri@tier4.jp
 sensing/radar_threshold_filter/** satoshi.tanaka@tier4.jp shunsuke.miura@tier4.jp taekjin.lee@tier4.jp yoshi.ri@tier4.jp
@@ -242,8 +240,7 @@ system/system_error_monitor/** fumihito.ito@tier4.jp
 system/system_monitor/** fumihito.ito@tier4.jp tetsuhiro.kawaguchi@tier4.jp
 system/topic_state_monitor/** ryohsuke.mitsudome@tier4.jp
 system/velodyne_monitor/** fumihito.ito@tier4.jp
-tools/simulator_test/simulator_compatibility_test/** shpark@morai.ai shumpei.wakabayashi@tier4.jp tomoya.kimura@tier4.jp
-vehicle/accel_brake_map_calibrator/accel_brake_map_calibrator/** taiki.tanaka@tier4.jp takeshi.miura@tier4.jp tomoya.kimura@tier4.jp
+vehicle/accel_brake_map_calibrator/** taiki.tanaka@tier4.jp takeshi.miura@tier4.jp tomoya.kimura@tier4.jp
 vehicle/external_cmd_converter/** takamasa.horibe@tier4.jp
 vehicle/raw_vehicle_cmd_converter/** makoto.kurihara@tier4.jp taiki.tanaka@tier4.jp takamasa.horibe@tier4.jp
 vehicle/steer_offset_estimator/** taiki.tanaka@tier4.jp

perception/image_projection_based_fusion/launch/pointpainting_fusion.launch.xml
+2 −2

@@ -52,7 +52,7 @@
     <param from="$(var class_remapper_param_path)"/>
     <param name="rois_number" value="$(var input/rois_number)"/>

-    <!-- This parameter shall NOT be included in param file. See the PR from the git blame for details. -->
+    <!-- This parameter shall NOT be included in param file. See also: https://github.com/autowarefoundation/autoware.universe/pull/6169 -->
     <param name="build_only" value="$(var build_only)"/>

     <!-- rois, camera and info -->
@@ -93,7 +93,7 @@
     <param from="$(var class_remapper_param_path)"/>
     <param name="rois_number" value="$(var input/rois_number)"/>

-    <!-- This parameter shall NOT be included in param file. See the PR from the git blame for details. -->
+    <!-- This parameter shall NOT be included in param file. See also: https://github.com/autowarefoundation/autoware.universe/pull/6169 -->
     <param name="build_only" value="$(var build_only)"/>

     <!-- rois, camera and info -->

perception/image_projection_based_fusion/package.xml
+1 −0

@@ -10,6 +10,7 @@
   <maintainer email="dai.nguyen@tier4.jp">badai nguyen</maintainer>
   <maintainer email="kotaro.uetake@tier4.jp">Kotaro Uetake</maintainer>
   <maintainer email="tao.zhong@tier4.jp">Tao Zhong</maintainer>
+  <maintainer email="koji.minoda@tier4.jp">Koji Minoda</maintainer>
   <license>Apache License 2.0</license>

   <buildtool_depend>ament_cmake_auto</buildtool_depend>

perception/lidar_apollo_instance_segmentation/README.md
+1 −0

@@ -47,6 +47,7 @@ None
 | `use_constant_feature` | bool | false | The flag to use direction and distance feature of pointcloud. |
 | `target_frame` | string | "base_link" | Pointcloud data is transformed into this frame. |
 | `z_offset` | int | 2 | z offset from target frame. [m] |
+| `build_only` | bool | `false` | shutdown the node after TensorRT engine file is built |

 ## Assumptions / Known limits

perception/lidar_apollo_instance_segmentation/launch/lidar_apollo_instance_segmentation.launch.xml
+4 −0

@@ -2,6 +2,7 @@
   <arg name="input/pointcloud" default="/sensing/lidar/pointcloud"/>
   <arg name="model" default="model_128"/>
   <arg name="output/objects" default="labeled_clusters"/>
+  <arg name="build_only" default="false" description="shutdown node after TensorRT engine file is built"/>

   <arg if="$(eval &quot;'$(var model)'=='model_16'&quot;)" name="base_name" default="vlp-16"/>
   <arg if="$(eval &quot;'$(var model)'=='model_64'&quot;)" name="base_name" default="hdl-64"/>
@@ -24,5 +25,8 @@
     <param name="precision" value="$(var precision)"/>
     <param name="target_frame" value="$(var target_frame)"/>
     <param from="$(var param_file)"/>
+
+    <!-- This parameter shall NOT be included in param file. -->
+    <param name="build_only" value="$(var build_only)"/>
   </node>
 </launch>

perception/lidar_apollo_instance_segmentation/src/detector.cpp
+5 −0

@@ -48,6 +48,11 @@ LidarApolloInstanceSegmentation::LidarApolloInstanceSegmentation(rclcpp::Node *
     return;
   }

+  if (node_->declare_parameter("build_only", false)) {
+    RCLCPP_INFO(node_->get_logger(), "TensorRT engine is built and shutdown node.");
+    rclcpp::shutdown();
+  }
+
   // GPU memory allocation
   const auto input_dims = trt_common_->getBindingDimensions(0);
   const auto input_size =

perception/lidar_centerpoint/package.xml
+1 −0

@@ -5,6 +5,7 @@
   <version>1.0.0</version>
   <description>The lidar_centerpoint package</description>
   <maintainer email="kenzo.lobos@tier4.jp">Kenzo Lobos-Tsunekawa</maintainer>
+  <maintainer email="koji.minoda@tier4.jp">Koji Minoda</maintainer>
   <license>Apache License 2.0</license>

   <buildtool_depend>ament_cmake_auto</buildtool_depend>

perception/tracking_object_merger/src/utils/utils.cpp
+45 −23

@@ -324,43 +324,61 @@ autoware_auto_perception_msgs::msg::TrackedObjectKinematics objectKinematicsVXMe
   // copy main object at first
   output_kinematics = main_obj.kinematics;
   auto sub_obj_ = sub_obj;
-  // do not merge reverse direction objects
+  // do not merge if motion direction is different
   if (!objectsHaveSameMotionDirections(main_obj, sub_obj)) {
     return output_kinematics;
-  } else if (objectsYawIsReverted(main_obj, sub_obj)) {
-    // revert velocity (revert pose is not necessary because it is not used in tracking)
-    sub_obj_.kinematics.twist_with_covariance.twist.linear.x *= -1.0;
-    sub_obj_.kinematics.twist_with_covariance.twist.linear.y *= -1.0;
   }

   // currently only merge vx
   if (policy == MergePolicy::SKIP) {
     return output_kinematics;
   } else if (policy == MergePolicy::OVERWRITE) {
-    output_kinematics.twist_with_covariance.twist.linear.x =
-      sub_obj_.kinematics.twist_with_covariance.twist.linear.x;
+    // use main_obj's orientation
+    // take sub_obj's velocity vector and convert into main_obj's frame, but take only x component
+    const auto sub_vx = sub_obj_.kinematics.twist_with_covariance.twist.linear.x;
+    const auto sub_vy = sub_obj_.kinematics.twist_with_covariance.twist.linear.y;
+    const auto main_yaw = tf2::getYaw(main_obj.kinematics.pose_with_covariance.pose.orientation);
+    const auto sub_yaw = tf2::getYaw(sub_obj_.kinematics.pose_with_covariance.pose.orientation);
+    const auto sub_vx_in_main_frame =
+      sub_vx * std::cos(sub_yaw - main_yaw) + sub_vy * std::sin(sub_yaw - main_yaw);
+    output_kinematics.twist_with_covariance.twist.linear.x = sub_vx_in_main_frame;
+
     return output_kinematics;
   } else if (policy == MergePolicy::FUSION) {
+    // use main_obj's orientation
+    // take sub_obj's velocity vector and convert into main_obj's frame, but take only x component
     const auto main_vx = main_obj.kinematics.twist_with_covariance.twist.linear.x;
     const auto sub_vx = sub_obj_.kinematics.twist_with_covariance.twist.linear.x;
+    const auto sub_vy = sub_obj_.kinematics.twist_with_covariance.twist.linear.y;
+    const auto main_yaw = tf2::getYaw(main_obj.kinematics.pose_with_covariance.pose.orientation);
+    const auto sub_yaw = tf2::getYaw(sub_obj_.kinematics.pose_with_covariance.pose.orientation);
+    const auto sub_vel_in_main_frame_x =
+      sub_vx * std::cos(sub_yaw - main_yaw) + sub_vy * std::sin(sub_yaw - main_yaw);
     // inverse weight
     const auto main_vx_cov = main_obj.kinematics.twist_with_covariance.covariance[0];
     const auto sub_vx_cov = sub_obj_.kinematics.twist_with_covariance.covariance[0];
+    const auto sub_vy_cov = sub_obj_.kinematics.twist_with_covariance.covariance[7];
+    const auto sub_vel_cov_in_main_frame_x =
+      sub_vx_cov * std::cos(sub_yaw - main_yaw) * std::cos(sub_yaw - main_yaw) +
+      sub_vy_cov * std::sin(sub_yaw - main_yaw) * std::sin(sub_yaw - main_yaw);
     double main_vx_weight, sub_vx_weight;
     if (main_vx_cov == 0.0) {
       main_vx_weight = 1.0 * 1e6;
     } else {
       main_vx_weight = 1.0 / main_vx_cov;
     }
-    if (sub_vx_cov == 0.0) {
+    if (sub_vel_cov_in_main_frame_x == 0.0) {
       sub_vx_weight = 1.0 * 1e6;
     } else {
-      sub_vx_weight = 1.0 / sub_vx_cov;
+      sub_vx_weight = 1.0 / sub_vel_cov_in_main_frame_x;
     }
-    // merge with covariance
+
+    // merge velocity with covariance
     output_kinematics.twist_with_covariance.twist.linear.x =
-      (main_vx * main_vx_weight + sub_vx * sub_vx_weight) / (main_vx_weight + sub_vx_weight);
+      (main_vx * main_vx_weight + sub_vel_in_main_frame_x * sub_vx_weight) /
+      (main_vx_weight + sub_vx_weight);
     output_kinematics.twist_with_covariance.covariance[0] = 1.0 / (main_vx_weight + sub_vx_weight);
+
     return output_kinematics;
   } else {
     std::cerr << "unknown merge policy in objectKinematicsMerger function." << std::endl;
@@ -453,21 +471,25 @@ autoware_auto_perception_msgs::msg::Shape shapeMerger(

 void updateExceptVelocity(TrackedObject & main_obj, const TrackedObject & sub_obj)
 {
+  // do not update if motion direction is different
   if (!objectsHaveSameMotionDirections(main_obj, sub_obj)) {
-    // do not update velocity
-    return;
-  } else if (objectsYawIsReverted(main_obj, sub_obj)) {
-    // revert velocity (revert pose is not necessary because it is not used in tracking)
-    const auto vx_temp = sub_obj.kinematics.twist_with_covariance.twist.linear.x;
-    main_obj = sub_obj;
-    main_obj.kinematics.twist_with_covariance.twist.linear.x = -vx_temp;
     return;
-  } else {
-    // update velocity
-    const auto vx_temp = sub_obj.kinematics.twist_with_covariance.twist.linear.x;
-    main_obj = sub_obj;
-    main_obj.kinematics.twist_with_covariance.twist.linear.x = vx_temp;
   }
+  // take main_obj's velocity vector and convert into sub_obj's frame
+  // use sub_obj's orientation, but take only x component
+  const auto main_vx = main_obj.kinematics.twist_with_covariance.twist.linear.x;
+  const auto main_vy = main_obj.kinematics.twist_with_covariance.twist.linear.y;
+  const auto main_yaw = tf2::getYaw(main_obj.kinematics.pose_with_covariance.pose.orientation);
+  const auto sub_yaw = tf2::getYaw(sub_obj.kinematics.pose_with_covariance.pose.orientation);
+  const auto main_vx_in_sub_frame =
+    main_vx * std::cos(main_yaw - sub_yaw) + main_vy * std::sin(main_yaw - sub_yaw);
+
+  // copy sub object to fused object
+  main_obj = sub_obj;
+  // recover main object's velocity
+  main_obj.kinematics.twist_with_covariance.twist.linear.x = main_vx_in_sub_frame;
+
+  return;
 }

 void updateOnlyObjectVelocity(TrackedObject & main_obj, const TrackedObject & sub_obj)

perception/traffic_light_arbiter/CMakeLists.txt
+1 −0

@@ -6,6 +6,7 @@ autoware_package()

 ament_auto_add_library(${PROJECT_NAME} SHARED
   src/traffic_light_arbiter.cpp
+  src/signal_match_validator.cpp
 )

 rclcpp_components_register_nodes(${PROJECT_NAME}

perception/traffic_light_arbiter/README.md
+19 −5

@@ -8,6 +8,19 @@ This package receives traffic signals from perception and external (e.g., V2X) c

 A node that merges traffic light/signal state from image recognition and external (e.g., V2X) systems to provide to a planning component.

+### Signal Match Validator
+
+When `enable_signal_matching` is set to true, this node validates the match between perception signals and external signals.
+The table below outlines how the matching process determines the output based on the combination of perception and external signal colors. Each cell represents the outcome when a specific color from a perception signal (columns) intersects with a color from an external signal (rows).
+
+| External \ Perception | RED     | AMBER   | GREEN   | UNKNOWN | Not Received |
+| --------------------- | ------- | ------- | ------- | ------- | ------------ |
+| RED                   | RED     | UNKNOWN | UNKNOWN | UNKNOWN | UNKNOWN      |
+| AMBER                 | UNKNOWN | AMBER   | UNKNOWN | UNKNOWN | UNKNOWN      |
+| GREEN                 | UNKNOWN | UNKNOWN | GREEN   | UNKNOWN | UNKNOWN      |
+| UNKNOWN               | UNKNOWN | UNKNOWN | UNKNOWN | UNKNOWN | UNKNOWN      |
+| Not Received          | UNKNOWN | UNKNOWN | UNKNOWN | UNKNOWN | UNKNOWN      |
+
 ### Inputs / Outputs

 #### Input
@@ -28,8 +41,9 @@ A node that merges traffic light/signal state from image recognition and externa

 ### Core Parameters

-| Name                        | Type   | Default Value | Description |
-| --------------------------- | ------ | ------------- | ----------- |
-| `external_time_tolerance`   | double | 5.0           | The duration in seconds an external message is considered valid for merging |
-| `perception_time_tolerance` | double | 1.0           | The duration in seconds a perception message is considered valid for merging |
-| `external_priority`         | bool   | false         | Whether or not external signals take precedence over perception-based ones. If false, the merging uses confidence as a criterion |
+| Name                        | Type   | Default Value | Description |
+| --------------------------- | ------ | ------------- | ----------- |
+| `external_time_tolerance`   | double | 5.0           | The duration in seconds an external message is considered valid for merging |
+| `perception_time_tolerance` | double | 1.0           | The duration in seconds a perception message is considered valid for merging |
+| `external_priority`         | bool   | false         | Whether or not external signals take precedence over perception-based ones. If false, the merging uses confidence as a criterion |
+| `enable_signal_matching`    | bool   | false         | Whether to validate the match between perception signals and external signals. If set to true, the node verifies that the colors match and publishes them only if they are identical |

perception/traffic_light_arbiter/config/traffic_light_arbiter.param.yaml
+1 −0

@@ -3,3 +3,4 @@
   external_time_tolerance: 5.0
   perception_time_tolerance: 1.0
   external_priority: false
+  enable_signal_matching: false

0 commit comments