
ros2-foxy: OS-0 sensor hangs while activating #79

Closed
crowncastlejsl opened this issue Mar 10, 2023 · 29 comments
crowncastlejsl commented Mar 10, 2023

Describe the bug

Driver can connect to the sensor and successfully pulls the metadata, but it hangs while the sensor is activating. The topics are visible using RQT, but data never actually gets published.

Setup

docker-compose.yml - container runtime configuration

ouster:
    build:
      context: https://github.com/ouster-lidar/ouster-ros.git#ros2-foxy
      args:
        - ROS_DISTRO=foxy
    restart: unless-stopped
    network_mode: host
    tty: true
    stdin_open: true
    environment:
      - ROS_DOMAIN_ID=101
      - ROS_DISTRO=foxy
      - WORKSPACE=/var/lib/build # needed for /ros_entrypoint.sh to find install/setup.bash
    volumes:
      - ../scripts/ros_entrypoint.sh:/ros_entrypoint.sh
      - ../config/sensors/ouster_params.yml:/var/lib/build/src/ouster-ros/ouster-ros/config/parameters.yaml
    entrypoint:
      - /ros_entrypoint.sh
    command:
      - ros2
      - launch
      - ouster_ros
      - sensor.independent.launch.py
      - viz:=false

ouster_params.yml - sensor configuration

ouster:
  os_sensor:
    ros__parameters:
      sensor_hostname: 10.0.0.32
      udp_dest: 10.0.0.60
      lidar_mode: 2048x10
      timestamp_mode: TIME_FROM_ROS_TIME
      udp_profile_lidar: LEGACY
      lidar_port: 7501
      imu_port: 7502
      metadata: ''

  os_cloud:
    ros__parameters:
      tf_prefix: left_os_zero
      timestamp_mode: TIME_FROM_ROS_TIME

ros_entrypoint.sh - Docker container entrypoint

#!/bin/bash
set -e

# setup ros environment
ROS_SETUP="/opt/ros/$ROS_DISTRO/setup.bash"
if [ -f "$ROS_SETUP" ]; then
    source "$ROS_SETUP"
    echo "Sourced $(readlink -f $ROS_SETUP)"
fi

# setup ROS packages
PKG_SETUP="$WORKSPACE/install/setup.bash"
if [ -f "$PKG_SETUP" ]; then
    source "$PKG_SETUP"
    echo "Sourced $(readlink -f $PKG_SETUP)"
fi

exec "$@"

Results

Command

docker-compose -f ./docker/docker-compose.yml up --build ouster

Output

Sourced /opt/ros/foxy/setup.bash
Sourced /var/lib/build/install/setup.bash
[INFO] [launch]: All log files can be found below /var/lib/build/.ros/log/2023-03-10-20-16-24-748088-rapid-dev-box-1-1
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [os_sensor-1]: process started with pid [54]
[INFO] [os_cloud-2]: process started with pid [56]
[INFO] [os_image-3]: process started with pid [58]
[os_image-3] [INFO] [1678479384.872404521] [ouster.os_image]: contacting get_metadata service; attempt no: 1/10
[os_cloud-2] [INFO] [1678479384.877239482] [ouster.os_cloud]: contacting get_metadata service; attempt no: 1/10
[os_sensor-1] [INFO] [1678479385.097626349] [ouster.os_sensor]: TIME_FROM_ROS_TIME timestamp mode specified. IMU and pointcloud messages will use ros time
[os_sensor-1] [INFO] [1678479385.097917648] [ouster.os_sensor]: Will send UDP data to 10.0.0.60
[os_sensor-1] [INFO] [1678479385.190119356] [ouster.os_sensor]: Sensor 10.0.0.32 configured successfully
[os_sensor-1] [INFO] [1678479385.190318752] [ouster.os_sensor]: Starting sensor 10.0.0.32 initialization...
[os_sensor-1] [2023-03-10 20:16:25.190] [ouster::sensor] [info] initializing sensor: 10.0.0.32 with ports: 7501/7502
[os_sensor-1] [INFO] [1678479386.649603744] [ouster.os_sensor]: ouster client version: 0.7.1+unknown-release
[os_sensor-1] product: OS-0-64-U02, sn: 122226001747, firmware rev: v2.3.1
[os_sensor-1] lidar mode: 2048x10, lidar udp profile: LEGACY
[os_sensor-1] [INFO] [1678479386.649715174] [ouster.os_sensor]: No metadata file was specified, using: 10.0.0-metadata.json
[os_sensor-1] [INFO] [1678479386.650035131] [ouster.os_sensor]: Wrote sensor metadata to 10.0.0-metadata.json
[os_sensor-1] [INFO] [1678479386.651548360] [ouster.os_sensor]: reset service created
[os_sensor-1] [INFO] [1678479386.653285624] [ouster.os_sensor]: get_metadata service created
[os_image-3] [INFO] [1678479386.653573389] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 1/10
[os_cloud-2] [INFO] [1678479386.653738287] [ouster.os_cloud]: waiting for get_metadata service to respond; attempt no: 1/10
[os_sensor-1] [INFO] [1678479386.654336994] [ouster.os_sensor]: get_config service created
[os_sensor-1] [INFO] [1678479386.655261165] [ouster.os_sensor]: set_config service created
[INFO] [launch.user]: os_sensor activating...
[os_image-3] [INFO] [1678479386.657114815] [ouster.os_image]: retrieved sensor metadata!
[os_cloud-2] [INFO] [1678479386.657131660] [ouster.os_cloud]: retrieved sensor metadata!

Platform (please complete the following information):

  • Ouster Sensor? OS-0
  • Ouster Firmware Version?
"sensor_info": 
    {
        "base_pn": "",
        "base_sn": "",
        "build_date": "2022-06-08T01:26:30Z",
        "build_rev": "v2.3.1",
        "image_rev": "ousteros-image-prod-aries-v2.3.1+20220608012528.patch-v2.3.x",
        "initialization_id": 8247304,
        "prod_line": "OS-0-64-U02",
        "prod_pn": "840-103574-06",
        "prod_sn": "122226001747",
        "proto_rev": "",
        "status": "RUNNING"
    }
  • ROS version/distro? ROS2-foxy
  • Operating System? Linux
  • Machine Architecture? x86 and arm64
@crowncastlejsl crowncastlejsl added the bug Something isn't working label Mar 10, 2023
@Samahu Samahu self-assigned this Mar 21, 2023

Samahu commented Mar 21, 2023

Interesting, I'll try to reproduce tomorrow.


Grisly00 commented Mar 24, 2023

I think I have the same issue with an OS1. The driver works well in WSL with Ubuntu Jammy and Humble (as long as the UDP ports are forwarded), but not in an Ubuntu Jammy Docker container. The sensor gets configured (i.e., it switches from operating mode "Standby" to "Normal") but then hangs at:
[component_container_mt-1] [INFO] [1679653979.326010969] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 4/10

See the full log
$ros2 launch ouster_ros sensor.launch.xml sensor_hostname:=192.168.1.60 lidar_port:=7502 imu_port:=7503

[INFO] [launch]: All log files can be found below /home/grisi/.ros/log/2023-03-24-11-11-56-349559-molisens-349
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [component_container_mt-1]: process started with pid [361]
[INFO] [bash-2]: process started with pid [363]
[INFO] [bash-3]: process started with pid [365]
[INFO] [bash-4]: process started with pid [367]
[INFO] [static_transform_publisher-5]: process started with pid [369]
[INFO] [static_transform_publisher-6]: process started with pid [373]
[static_transform_publisher-5] [INFO] [1679652716.688072914] [ouster.stp_sensor_imu]: Spinning until stopped - publishing transform
[static_transform_publisher-5] translation: ('0.000000', '0.000000', '0.000000')
[static_transform_publisher-5] rotation: ('0.000000', '0.000000', '0.000000', '1.000000')
[static_transform_publisher-5] from 'os_sensor' to 'os_imu'
[static_transform_publisher-6] [INFO] [1679652716.693022295] [ouster.stp_sensor_lidar]: Spinning until stopped - publishing transform
[static_transform_publisher-6] translation: ('0.000000', '0.000000', '0.000000')
[static_transform_publisher-6] rotation: ('0.000000', '0.000000', '0.000000', '1.000000')
[static_transform_publisher-6] from 'os_sensor' to 'os_lidar'
[component_container_mt-1] [INFO] [1679652716.874694400] [ouster.os_container]: Load Library: /home/grisi/MOLISENS/molisens_ws/install/ouster_ros/lib/libos_sensor_component.so
[component_container_mt-1] [INFO] [1679652716.885163776] [ouster.os_container]: Found class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterSensor>
[component_container_mt-1] [INFO] [1679652716.885294091] [ouster.os_container]: Instantiate class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterSensor>
[INFO] [launch_ros.actions.load_composable_nodes]: Loaded node '/ouster/os_sensor' in container '/ouster/os_container'
[component_container_mt-1] [INFO] [1679652716.898561801] [ouster.os_container]: Load Library: /home/grisi/MOLISENS/molisens_ws/install/ouster_ros/lib/libos_cloud_component.so
[component_container_mt-1] [INFO] [1679652716.901720241] [ouster.os_container]: Found class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterCloud>
[component_container_mt-1] [INFO] [1679652716.901792870] [ouster.os_container]: Instantiate class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterCloud>
[component_container_mt-1] [INFO] [1679652716.908755480] [ouster.os_cloud]: contacting get_metadata service; attempt no: 1/10
[component_container_mt-1] [INFO] [1679652718.364315653] [ouster.os_sensor]: Will use automatic UDP destination
[component_container_mt-1] [INFO] [1679652718.448792975] [ouster.os_sensor]: Sensor 192.168.1.60 configured successfully
[component_container_mt-1] [INFO] [1679652718.448907176] [ouster.os_sensor]: Starting sensor 192.168.1.60 initialization...
[component_container_mt-1] [2023-03-24 11:11:58.448] [ouster::sensor] [info] initializing sensor: 192.168.1.60 with ports: 7502/7503
[component_container_mt-1] [INFO] [1679652719.716175158] [ouster.os_sensor]: ouster client version: 0.7.1+202c536-release
[component_container_mt-1] product: OS-1-64-U02, sn: 992201000519, firmware rev: v2.3.0
[component_container_mt-1] lidar mode: 1024x10, lidar udp profile: RNG19_RFL8_SIG16_NIR16_DUAL
[component_container_mt-1] [INFO] [1679652719.716262591] [ouster.os_sensor]: No metadata file was specified, using: 192.168.1-metadata.json
[component_container_mt-1] [INFO] [1679652719.716401862] [ouster.os_sensor]: Wrote sensor metadata to 192.168.1-metadata.json
[component_container_mt-1] [INFO] [1679652719.717029177] [ouster.os_sensor]: reset service created
[component_container_mt-1] [INFO] [1679652719.717335979] [ouster.os_sensor]: get_metadata service created
[component_container_mt-1] [INFO] [1679652719.717430748] [ouster.os_cloud]: waiting for get_metadata service to respond; attempt no: 1/10
[component_container_mt-1] [INFO] [1679652719.717735073] [ouster.os_sensor]: get_config service created
[component_container_mt-1] [INFO] [1679652719.717975130] [ouster.os_sensor]: set_config service created
[bash-2] Transitioning successful
[component_container_mt-1] [INFO] [1679652719.719185517] [ouster.os_cloud]: retrieved sensor metadata!
[bash-3] Transitioning successful
[INFO] [launch_ros.actions.load_composable_nodes]: Loaded node '/ouster/os_cloud' in container '/ouster/os_container'
[component_container_mt-1] [INFO] [1679652719.760760674] [ouster.os_container]: Load Library: /home/grisi/MOLISENS/molisens_ws/install/ouster_ros/lib/libos_image_component.so
[component_container_mt-1] [INFO] [1679652719.762364334] [ouster.os_container]: Found class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterImage>
[component_container_mt-1] [INFO] [1679652719.762402606] [ouster.os_container]: Instantiate class: rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterImage>
[component_container_mt-1] [INFO] [1679652719.765768648] [ouster.os_image]: contacting get_metadata service; attempt no: 1/10
[component_container_mt-1] [INFO] [1679652719.765842619] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 1/10
[INFO] [bash-3]: process has finished cleanly [pid 365]
[INFO] [bash-2]: process has finished cleanly [pid 363]
[bash-4] QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-grisi'
[bash-4] [INFO] [1679652721.856418288] [rviz2]: Stereo is NOT SUPPORTED
[bash-4] [INFO] [1679652721.856576954] [rviz2]: OpenGl version: 4.5 (GLSL 4.5)
[bash-4] [INFO] [1679652721.921919205] [rviz2]: Stereo is NOT SUPPORTED
[component_container_mt-1] [WARN] [1679652721.988104380] [ouster.os_cloud]: New subscription discovered on topic '/ouster/points', requesting incompatible QoS. No messages will be sent to it. Last incompatible policy: RELIABILITY_QOS_POLICY
[bash-4] [INFO] [1679652722.897403148] [rviz2]: Stereo is NOT SUPPORTED
[component_container_mt-1] [INFO] [1679652729.766439155] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 2/10
[component_container_mt-1] [INFO] [1679652739.767111198] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 3/10
[component_container_mt-1] [INFO] [1679652749.767764013] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 4/10
[component_container_mt-1] [INFO] [1679652759.768630278] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 5/10
[component_container_mt-1] [INFO] [1679652769.769938338] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 6/10
[component_container_mt-1] [INFO] [1679652779.770803561] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 7/10
[component_container_mt-1] [INFO] [1679652789.771498287] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 8/10
[component_container_mt-1] [INFO] [1679652799.772305887] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 9/10
[component_container_mt-1] [INFO] [1679652809.772905806] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 10/10
[component_container_mt-1] [ERROR] [1679652819.773809215] [ouster.os_image]: get_metadata service timed out or interrupted
[ERROR] [launch_ros.actions.load_composable_nodes]: Failed to load node 'os_image' of type 'ouster_ros::OusterImage' in container '/ouster/os_container': Component constructor threw an exception: get_metadata service timed out or interrupted
[component_container_mt-1] [ERROR] [1679652819.777422482] [ouster.os_container]: Component constructor threw an exception: get_metadata service timed out or interrupted


Samahu commented Mar 25, 2023

@Grisly00 thanks for sharing the info. This is mostly an issue with the underlying DDS driver: the asynchronous callback does not fire reliably. The only real solution for us is probably to not have this dependency to start with.
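For context, the "attempt no: n/10" lines in the logs above come from a bounded retry loop around the get_metadata service. A minimal sketch of that pattern in plain Python (illustrative only, not the actual driver code; `wait_for_service` and `check_ready` are hypothetical names) shows why a hard attempt limit fails loudly instead of hanging forever when the callback never fires:

```python
import time

def wait_for_service(check_ready, attempts=10, interval_s=10.0, sleep=time.sleep):
    """Poll a readiness check instead of relying on an async callback.

    check_ready: callable returning True once the service has responded.
    Mirrors the driver's 'attempt no: n/10' log pattern; raises after
    the last attempt so the caller never blocks indefinitely.
    """
    for attempt in range(1, attempts + 1):
        print(f"waiting for get_metadata service to respond; "
              f"attempt no: {attempt}/{attempts}")
        if check_ready():
            return attempt  # service answered on this attempt
        sleep(interval_s)
    raise TimeoutError("get_metadata service timed out or interrupted")
```

This matches the failure mode in Grisly00's log: ten attempts, then "get_metadata service timed out or interrupted". The hang reported in the original issue is the opposite case, where the process blocks without this bound ever being reached.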


Samahu commented Apr 5, 2023

@crowncastlejsl I was able to reproduce the issue. Did you happen to try sensor.composite.launch.py instead of sensor.independent.launch.py, and did it give you the same or a different result?

@crowncastlejsl (Author)

Previous commit

To try the sensor.composite.launch.py launch file, I had to change the Docker build context to use the same commit as when I first created the issue.

ouster:
    build:
      context: https://github.com/ouster-lidar/ouster-ros.git#4fbf576eebcfeeb00d7e9a42f016e84152ca09c5
      args:
        - ROS_DISTRO=foxy
    ...
    command:
      - ros2
      - launch
      - ouster_ros
      - sensor.composite.launch.py # changed to other launch file
      - viz:=false

New error messages:

[component_container_mt-1] [ERROR] [1680715918.147608892] [ouster.os_container]: Component constructor threw an exception: tf_prefix
[ERROR] [launch_ros.actions.load_composable_nodes]: Failed to load node 'os_cloud' of type 'ouster_ros::OusterCloud' in container '/ouster/os_container': Component constructor threw an exception: tf_prefix
...
[component_container_mt-1] [ERROR] [1680715919.221592574] [ouster.os_sensor]: exception thrown while configuring the sensor, details: sensor_hostname

Current ros2-foxy branch

New error message, using sensor.independent.launch.py

[ERROR] [launch_ros.actions.lifecycle_node]: Failed to make transition 'TRANSITION_CONFIGURE' for LifecycleNode '/ouster/os_sensor'
[os_sensor-1] [ERROR] [1680725907.046408387] [ouster.os_sensor]: exception thrown while configuring the sensor, details: mtp_dest
[os_sensor-1] [WARN] [1680725907.046743305] []: Error occurred while doing error handling.


Samahu commented Apr 6, 2023

All the reported issues are related to the missing parameters tf_prefix and sensor_hostname. Did you forget to pass/set the params_file?

For the current ros2-foxy branch, there are two parameters (mtp_dest, mtp_main) that were added to the provided parameters.yaml, which is what the log is complaining about. You would need to include them in the yaml file that you supply to the driver through the params_file arg.


Samahu commented Apr 7, 2023

@crowncastlejsl I pushed a fix (#100) that resolves an issue when running the driver on an arm processor. Would you mind giving it a try to see whether it resolves the issue that you are seeing?


Samahu commented Apr 7, 2023

@crowncastlejsl Just to make it easier for you, here is what your updated parameters file should look like:

ouster:
  os_sensor:
    ros__parameters:
      sensor_hostname: 10.0.0.32
      udp_dest: 10.0.0.60
      mtp_dest: ''
      mtp_main: false
      lidar_mode: 2048x10
      timestamp_mode: TIME_FROM_ROS_TIME
      udp_profile_lidar: LEGACY
      lidar_port: 7501
      imu_port: 7502
      metadata: ''
  os_cloud:
    ros__parameters:
      tf_prefix: left_os_zero
      timestamp_mode: TIME_FROM_ROS_TIME


Samahu commented Apr 12, 2023

I merged my proposed fix; let me know if you are still observing the issue after updating to the latest.

@Samahu Samahu closed this as completed Apr 12, 2023
@crowncastlejsl (Author)

Thanks for your help with this issue. I apologize for not testing things out sooner - I've been away from the hardware for a little while.

Unfortunately, the behavior is still the same as in the original issue.

Platform:

~$ uname -v
#76~20.04.1-Ubuntu SMP Mon Mar 20 15:54:19 UTC 2023
~$ uname -i
x86_64

Files

Folder Structure

$ tree ./
./
├── config
│   └── sensors
│       └── ouster_params.yml
├── docker
│   └── docker-compose.yml
└── scripts
    └── ros_entrypoint.sh

4 directories, 3 files

docker-compose.yml

version: '3.5'
services:
  ouster:
    build:
      context: https://github.com/ouster-lidar/ouster-ros.git#ros2-foxy
      args:
        - ROS_DISTRO=foxy
    restart: unless-stopped
    network_mode: host
    tty: true
    stdin_open: true
    environment:
      - ROS_DOMAIN_ID=101
      - ROS_DISTRO=foxy
      - WORKSPACE=/var/lib/build # needed for /ros_entrypoint.sh to find install/setup.bash
    volumes:
      - ../scripts/ros_entrypoint.sh:/ros_entrypoint.sh
      - ../config/sensors/ouster_params.yml:/var/lib/build/src/ouster-ros/ouster-ros/config/parameters.yaml
    entrypoint:
      - /ros_entrypoint.sh
    command:
      - ros2
      - launch
      - ouster_ros
      - sensor.independent.launch.py
      - viz:=false

ros_entrypoint.sh

#!/bin/bash
set -e

# setup ros environment
ROS_SETUP="/opt/ros/$ROS_DISTRO/setup.bash"
if [ -f "$ROS_SETUP" ]; then
    source "$ROS_SETUP"
    echo "Sourced $(readlink -f $ROS_SETUP)"
fi

# setup ROS packages
PKG_SETUP="$WORKSPACE/install/setup.bash"
if [ -f "$PKG_SETUP" ]; then
    source "$PKG_SETUP"
    echo "Sourced $(readlink -f $PKG_SETUP)"
fi

exec "$@"

ouster_params.yml

ouster:
  os_sensor:
    ros__parameters:
      sensor_hostname: 10.0.0.32
      udp_dest: 10.0.0.60
      mtp_dest: ''
      mtp_main: false
      lidar_mode: 2048x10
      timestamp_mode: TIME_FROM_ROS_TIME
      udp_profile_lidar: LEGACY
      lidar_port: 7501
      imu_port: 7502
      metadata: ''

  os_cloud:
    ros__parameters:
      tf_prefix: left_os_zero
      timestamp_mode: TIME_FROM_ROS_TIME

Command

docker-compose -f ./docker/docker-compose.yml up --build

Output

Sourced /opt/ros/foxy/setup.bash
Sourced /var/lib/build/install/setup.bash
[INFO] [launch]: All log files can be found below /var/lib/build/.ros/log/2023-04-12-21-00-00-427250-rapid-dev-box-1-1
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [os_sensor-1]: process started with pid [54]
[INFO] [os_cloud-2]: process started with pid [56]
[INFO] [os_image-3]: process started with pid [58]
[os_image-3] [INFO] [1681333200.539731685] [ouster.os_image]: contacting get_metadata service; attempt no: 1/10
[os_cloud-2] [INFO] [1681333200.541380883] [ouster.os_cloud]: contacting get_metadata service; attempt no: 1/10
[os_sensor-1] [INFO] [1681333200.775053796] [ouster.os_sensor]: TIME_FROM_ROS_TIME timestamp mode specified. IMU and pointcloud messages will use ros time
[os_sensor-1] [INFO] [1681333200.775169884] [ouster.os_sensor]: Will send UDP data to 10.0.0.60
[os_sensor-1] [INFO] [1681333200.862090173] [ouster.os_sensor]: Sensor 10.0.0.32 configured successfully
[os_sensor-1] [INFO] [1681333200.862202116] [ouster.os_sensor]: Starting sensor 10.0.0.32 initialization...
[os_sensor-1] [2023-04-12 21:00:00.862] [ouster::sensor] [info] initializing sensor: 10.0.0.32 with lidar port/imu port: 7501/7502
[os_sensor-1] [INFO] [1681333202.312438585] [ouster.os_sensor]: ouster client version: 0.8.1+unknown-release
[os_sensor-1] product: OS-0-64-U02, sn: 122226001747, firmware rev: v2.3.1
[os_sensor-1] lidar mode: 2048x10, lidar udp profile: LEGACY
[os_sensor-1] [INFO] [1681333202.312544168] [ouster.os_sensor]: No metadata file was specified, using: 10.0.0-metadata.json
[os_sensor-1] [INFO] [1681333202.312774899] [ouster.os_sensor]: Wrote sensor metadata to 10.0.0-metadata.json
[os_sensor-1] [INFO] [1681333202.313560652] [ouster.os_sensor]: reset service created
[os_sensor-1] [INFO] [1681333202.314174592] [ouster.os_sensor]: get_metadata service created
[os_cloud-2] [INFO] [1681333202.314538614] [ouster.os_cloud]: waiting for get_metadata service to respond; attempt no: 1/10
[os_sensor-1] [INFO] [1681333202.314595076] [ouster.os_sensor]: get_config service created
[os_image-3] [INFO] [1681333202.314648060] [ouster.os_image]: waiting for get_metadata service to respond; attempt no: 1/10
[os_sensor-1] [INFO] [1681333202.315292552] [ouster.os_sensor]: set_config service created
[INFO] [launch.user]: os_sensor activating...
[os_cloud-2] [INFO] [1681333202.316645141] [ouster.os_cloud]: retrieved sensor metadata!
[os_image-3] [INFO] [1681333202.316738767] [ouster.os_image]: retrieved sensor metadata!


Samahu commented Apr 12, 2023

No worries, I will re-open the ticket until I have this figured out. Mind sharing your Docker Engine version?

@Samahu Samahu reopened this Apr 12, 2023
@crowncastlejsl (Author)

Certainly:

Docker Version

~$ docker version
Client: Docker Engine - Community
 Version:           23.0.3
 API version:       1.42
 Go version:        go1.19.7
 Git commit:        3e7cbfd
 Built:             Tue Apr  4 22:06:10 2023
 OS/Arch:           linux/amd64
 Context:           default

Server: Docker Engine - Community
 Engine:
  Version:          23.0.3
  API version:      1.42 (minimum version 1.12)
  Go version:       go1.19.7
  Git commit:       59118bf
  Built:            Tue Apr  4 22:06:10 2023
  OS/Arch:          linux/amd64
  Experimental:     true
 containerd:
  Version:          1.6.20
  GitCommit:        2806fc1057397dbaeefbea0e4e17bddfbd388f38
 runc:
  Version:          1.1.5
  GitCommit:        v1.1.5-0-gf19387a
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0


Samahu commented Apr 20, 2023

I am on the same Docker version, so that isn't the issue.

First, I want to check whether you have had the chance to pull and try the code after we merged #103 and #106. There were changes that may help resolve the issue that you are facing. 🤞

In case this didn't work, I would kindly ask you to modify the sensor.independent.launch.py file to set the log level to debug on os_sensor and os_cloud. I have some DEBUG-level log statements that could help me pinpoint where the process is stopping on your machine. All you need to do is add the field arguments=['--ros-args', '--log-level', 'debug'] to both os_sensor and os_cloud, then send me the log files for both nodes, which should have file names like os_sensor_xxxx.log and os_cloud_xxxx.log located under the ~/.ros folder.

But first, try pulling the latest changes and check whether that helps at all. Thank you.


crowncastlejsl commented Apr 21, 2023

Using the repo URL with the ros2-foxy branch causes Docker to use the current state of that branch when it builds the image (see docker-compose.yml details above). All I do to test is run the following command:

docker-compose -f ./docker/docker-compose.yml up --build

Output

Sourced /opt/ros/foxy/setup.bash
Sourced /var/lib/build/install/setup.bash
[INFO] [launch]: All log files can be found below /var/lib/build/.ros/log/2023-04-21-14-44-09-230672-rapid-dev-box-1-1
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [os_sensor-1]: process started with pid [54]
[INFO] [os_cloud-2]: process started with pid [56]
[INFO] [os_image-3]: process started with pid [58]
[os_image-3] [INFO] [1682088249.344200172] [ouster.os_image]: OusterImage: node initialized!
[os_cloud-2] [INFO] [1682088249.344516621] [ouster.os_cloud]: OusterCloud: node initialized!
[os_sensor-1] [INFO] [1682088249.578258315] [ouster.os_sensor]: TIME_FROM_ROS_TIME timestamp mode specified. IMU and pointcloud messages will use ros time
[os_sensor-1] [INFO] [1682088249.578473558] [ouster.os_sensor]: Will send UDP data to 10.0.0.60
[os_sensor-1] [INFO] [1682088249.736821102] [ouster.os_sensor]: Sensor 10.0.0.32 configured successfully
[os_sensor-1] [INFO] [1682088249.736928298] [ouster.os_sensor]: Starting sensor 10.0.0.32 initialization...
[os_sensor-1] [2023-04-21 14:44:09.737] [ouster::sensor] [info] initializing sensor: 10.0.0.32 with lidar port/imu port: 7501/7502
[os_sensor-1] [INFO] [1682088263.847479619] [ouster.os_sensor]: ouster client version: 0.8.1+unknown-release
[os_sensor-1] product: OS-0-64-U02, sn: 122226001747, firmware rev: v2.3.1
[os_sensor-1] lidar mode: 2048x10, lidar udp profile: LEGACY
[os_sensor-1] [INFO] [1682088263.847682087] [ouster.os_sensor]: No metadata file was specified, using: 10.0.0-metadata.json
[os_sensor-1] [INFO] [1682088263.847957275] [ouster.os_sensor]: Wrote sensor metadata to 10.0.0-metadata.json
[os_image-3] [INFO] [1682088263.848289750] [ouster.os_image]: OusterImage: retrieved new sensor metadata!
[os_cloud-2] [INFO] [1682088263.848363602] [ouster.os_cloud]: OusterCloud: retrieved new sensor metadata!
[os_sensor-1] [INFO] [1682088263.848945960] [ouster.os_sensor]: reset service created
[os_sensor-1] [INFO] [1682088263.849686296] [ouster.os_sensor]: get_metadata service created
[os_sensor-1] [INFO] [1682088263.850073457] [ouster.os_sensor]: get_config service created
[os_sensor-1] [INFO] [1682088263.850484840] [ouster.os_sensor]: set_config service created
[INFO] [launch.user]: os_sensor activating...

To test the modifications, I modified the file in the running container and then restarted it.

Modified sensor.independent.launch.py

def generate_launch_description():

    ...

    os_sensor = LifecycleNode(
        package='ouster_ros',
        executable='os_sensor',
        name='os_sensor',
        namespace=ouster_ns,
        parameters=[params_file],
        output='screen',
        arguments=['--ros-args', '--log-level', 'debug']
    )

    ...

    os_cloud = Node(
        package='ouster_ros',
        executable='os_cloud',
        name='os_cloud',
        namespace=ouster_ns,
        parameters=[params_file],
        output='screen',
        arguments=['--ros-args', '--log-level', 'debug']
    )

    ...

It seems to get stuck in some kind of loop after some initialization.

os_cloud log

[DEBUG] [1682091610.962662948] [rclcpp]: signal handler installed
[DEBUG] [1682091610.962697622] [rcl]: Couldn't parse arg 0 (/var/lib/build/install/ouster_ros/lib/ouster_ros/os_cloud) as a remap rule in its deprecated form. Error: Expected lexeme type (19) not found, search ended at index 57, at /tmp/binarydeb/ros-foxy-rcl-1.1.14/src/rcl/lexer_lookahead.c:239
[DEBUG] [1682091610.962701847] [rcl]: Arg 2 (--log-level) is not a --param nor a -p flag.
[DEBUG] [1682091610.962704167] [rcl]: Arg 2 (--log-level) is not a --remap nor a -r flag.
[DEBUG] [1682091610.962705851] [rcl]: Arg 2 (--log-level) is not a --params-file flag.
[DEBUG] [1682091610.962707942] [rcl]: Got log level: debug

[DEBUG] [1682091610.962709824] [rcl]: Arg 5 (-r) is not a --param nor a -p flag.
[DEBUG] [1682091610.962712176] [rcl]: Got remap rule : __node:=os_cloud

[DEBUG] [1682091610.962713762] [rcl]: Arg 7 (-r) is not a --param nor a -p flag.
[DEBUG] [1682091610.962716463] [rcl]: Got remap rule : __ns:=/ouster

[DEBUG] [1682091610.962718047] [rcl]: Arg 9 (--params-file) is not a --param nor a -p flag.
[DEBUG] [1682091610.962719790] [rcl]: Arg 9 (--params-file) is not a --remap nor a -r flag.
[DEBUG] [1682091610.962772701] [rcl]: Got params file : /var/lib/build/install/ouster_ros/share/ouster_ros/config/parameters.yaml
total num param files 1
[DEBUG] [1682091610.962772867] [rclcpp]: deferred_signal_handler(): waiting for SIGINT or uninstall
[DEBUG] [1682091610.962796231] [rcl]: Initializing wait set with '0' subscriptions, '2' guard conditions, '0' timers, '0' clients, '0' services
[DEBUG] [1682091610.962808581] [os_cloud]: Load library libos_cloud_component.so
[DEBUG] [1682091610.969074335] [os_cloud]: Instantiate class rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterCloud>
[DEBUG] [1682091610.969102571] [rcl]: Couldn't parse arg 0 (/var/lib/build/install/ouster_ros/lib/ouster_ros/os_cloud) as a remap rule in its deprecated form. Error: Expected lexeme type (19) not found, search ended at index 57, at /tmp/binarydeb/ros-foxy-rcl-1.1.14/src/rcl/lexer_lookahead.c:239
[DEBUG] [1682091610.969113459] [rcl]: Initializing node 'os_cloud' in namespace ''
[DEBUG] [1682091610.969123522] [rcl]: Using domain ID of '101'
[DEBUG] [1682091610.972855662] [rcl]: Initializing publisher for topic name '/rosout'
[DEBUG] [1682091610.972882251] [rcl]: Expanded topic name '/rosout'
[DEBUG] [1682091610.973454709] [rcl]: Publisher initialized
[DEBUG] [1682091610.973476995] [rcl]: Node initialized
[DEBUG] [1682091610.973531401] [rcl]: Initializing service for service name 'os_cloud/get_parameters'
[DEBUG] [1682091610.973542132] [rcl]: Expanded service name '/ouster/os_cloud/get_parameters'
[DEBUG] [1682091610.973802785] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.973809957] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/get_parametersRequest
[DEBUG] [1682091610.973813606] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/get_parametersReply
[DEBUG] [1682091610.973816470] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.974189953] [rcl]: Service initialized
[DEBUG] [1682091610.974226527] [rcl]: Initializing service for service name 'os_cloud/get_parameter_types'
[DEBUG] [1682091610.974238132] [rcl]: Expanded service name '/ouster/os_cloud/get_parameter_types'
[DEBUG] [1682091610.974316380] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.974323349] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/get_parameter_typesRequest
[DEBUG] [1682091610.974328889] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/get_parameter_typesReply
[DEBUG] [1682091610.974333232] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.974715901] [rcl]: Service initialized
[DEBUG] [1682091610.974739240] [rcl]: Initializing service for service name 'os_cloud/set_parameters'
[DEBUG] [1682091610.974745817] [rcl]: Expanded service name '/ouster/os_cloud/set_parameters'
[DEBUG] [1682091610.974816189] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.974822169] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/set_parametersRequest
[DEBUG] [1682091610.974851274] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/set_parametersReply
[DEBUG] [1682091610.974854577] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.975954619] [rcl]: Service initialized
[DEBUG] [1682091610.975982471] [rcl]: Initializing service for service name 'os_cloud/set_parameters_atomically'
[DEBUG] [1682091610.975992645] [rcl]: Expanded service name '/ouster/os_cloud/set_parameters_atomically'
[DEBUG] [1682091610.976075307] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.976080256] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/set_parameters_atomicallyRequest
[DEBUG] [1682091610.976083623] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/set_parameters_atomicallyReply
[DEBUG] [1682091610.976086928] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.976459321] [rcl]: Service initialized
[DEBUG] [1682091610.976483036] [rcl]: Initializing service for service name 'os_cloud/describe_parameters'
[DEBUG] [1682091610.976494394] [rcl]: Expanded service name '/ouster/os_cloud/describe_parameters'
[DEBUG] [1682091610.976569064] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.976573927] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/describe_parametersRequest
[DEBUG] [1682091610.976577336] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/describe_parametersReply
[DEBUG] [1682091610.976580684] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.976955729] [rcl]: Service initialized
[DEBUG] [1682091610.976979195] [rcl]: Initializing service for service name 'os_cloud/list_parameters'
[DEBUG] [1682091610.976989205] [rcl]: Expanded service name '/ouster/os_cloud/list_parameters'
[DEBUG] [1682091610.977047047] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.977053372] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_cloud/list_parametersRequest
[DEBUG] [1682091610.977058189] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_cloud/list_parametersReply
[DEBUG] [1682091610.977063247] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.977285326] [rcl]: Service initialized
[DEBUG] [1682091610.977309965] [rcl]: Initializing publisher for topic name '/parameter_events'
[DEBUG] [1682091610.977358042] [rcl]: Expanded topic name '/parameter_events'
[DEBUG] [1682091610.977623231] [rcl]: Publisher initialized
[DEBUG] [1682091610.977687149] [rcl]: Finalizing event
[DEBUG] [1682091610.977691516] [rcl]: Event finalized
[DEBUG] [1682091610.977898427] [rcl]: Initializing subscription for topic name '/parameter_events'
[DEBUG] [1682091610.977905936] [rcl]: Expanded topic name '/parameter_events'
[DEBUG] [1682091610.978113058] [rcl]: Subscription initialized
[DEBUG] [1682091610.978149396] [rcl]: Finalizing event
[DEBUG] [1682091610.978154705] [rcl]: Event finalized
[DEBUG] [1682091610.978220175] [rcl]: Initializing publisher for topic name '/tf'
[DEBUG] [1682091610.978227255] [rcl]: Expanded topic name '/tf'
[DEBUG] [1682091610.978743612] [rcl]: Publisher initialized
[DEBUG] [1682091610.978798064] [rcl]: Finalizing event
[DEBUG] [1682091610.978803007] [rcl]: Event finalized
[DEBUG] [1682091610.978967587] [rcl]: Initializing subscription for topic name 'metadata'
[DEBUG] [1682091610.978992282] [rcl]: Expanded topic name '/ouster/metadata'
[DEBUG] [1682091610.979254246] [rcl]: Subscription initialized
[DEBUG] [1682091610.979295429] [rcl]: Finalizing event
[DEBUG] [1682091610.979302713] [rcl]: Event finalized
[INFO] [1682091610.979335978] [ouster.os_cloud]: OusterCloud: node initialized!
[DEBUG] [1682091610.979402918] [rcl]: Waiting without timeout
[DEBUG] [1682091610.979405113] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.979409000] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.979410752] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.979412145] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.979421385] [rcl]: Subscription taking message
[DEBUG] [1682091610.979438544] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091610.979447906] [rcl]: Waiting without timeout
[DEBUG] [1682091610.979458089] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.979460211] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.979461775] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.979464365] [rcl]: Subscription taking message
[DEBUG] [1682091610.979467697] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091610.979472345] [rcl]: Waiting without timeout
[DEBUG] [1682091610.979474098] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.979476120] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.979478592] [rcl]: Waiting without timeout
[DEBUG] [1682091610.979480256] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091612.753426037] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091612.753493207] [rcl]: Subscription taking message
[DEBUG] [1682091612.753572208] [rcl]: Subscription take succeeded: true
[INFO] [1682091612.753601418] [ouster.os_cloud]: OusterCloud: retrieved new sensor metadata!
[DEBUG] [1682091612.773302495] [rcl]: Initializing publisher for topic name 'imu'
[DEBUG] [1682091612.773323970] [rcl]: Expanded topic name '/ouster/imu'
[DEBUG] [1682091612.773603119] [rcl]: Publisher initialized
[DEBUG] [1682091612.773634774] [rcl]: Finalizing event
[DEBUG] [1682091612.773637613] [rcl]: Event finalized
[DEBUG] [1682091612.773651312] [rcl]: Initializing publisher for topic name 'points'
[DEBUG] [1682091612.773656288] [rcl]: Expanded topic name '/ouster/points'
[DEBUG] [1682091612.773742948] [rcl]: Publisher initialized
[DEBUG] [1682091612.773751569] [rcl]: Finalizing event
[DEBUG] [1682091612.773753465] [rcl]: Event finalized
[DEBUG] [1682091612.773774070] [rcl]: Initializing subscription for topic name 'lidar_packets'
[DEBUG] [1682091612.773777587] [rcl]: Expanded topic name '/ouster/lidar_packets'
[DEBUG] [1682091612.773918822] [rcl]: Subscription initialized
[DEBUG] [1682091612.773935723] [rcl]: Finalizing event
[DEBUG] [1682091612.773937891] [rcl]: Event finalized
[DEBUG] [1682091612.773954637] [rcl]: Initializing subscription for topic name 'imu_packets'
[DEBUG] [1682091612.773958106] [rcl]: Expanded topic name '/ouster/imu_packets'
[DEBUG] [1682091612.774002208] [rcl]: Subscription initialized
[DEBUG] [1682091612.774008562] [rcl]: Finalizing event
[DEBUG] [1682091612.774010164] [rcl]: Event finalized
[DEBUG] [1682091612.774023353] [rcl]: Waiting without timeout
[DEBUG] [1682091612.774025119] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091612.774028258] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091612.774029863] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091612.774033195] [rcl]: Waiting without timeout
[DEBUG] [1682091612.774034678] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091612.774627117] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091612.774634386] [rcl]: Subscription taking message
[DEBUG] [1682091612.774650646] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091612.774708985] [rcl]: Waiting without timeout
[DEBUG] [1682091612.774711662] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091612.774713987] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091612.774716665] [rcl]: Waiting without timeout
[DEBUG] [1682091612.774718153] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091612.775423873] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091612.775428129] [rcl]: Subscription taking message
[DEBUG] [1682091612.775432651] [rcl]: Subscription take succeeded: true

os_sensor log

[DEBUG] [1682091610.962614737] [rclcpp]: signal handler installed
[DEBUG] [1682091610.962654916] [rcl]: Couldn't parse arg 0 (/var/lib/build/install/ouster_ros/lib/ouster_ros/os_sensor) as a remap rule in its deprecated form. Error: Expected lexeme type (19) not found, search ended at index 58, at /tmp/binarydeb/ros-foxy-rcl-1.1.14/src/rcl/lexer_lookahead.c:239
[DEBUG] [1682091610.962659267] [rcl]: Arg 2 (--log-level) is not a --param nor a -p flag.
[DEBUG] [1682091610.962661559] [rcl]: Arg 2 (--log-level) is not a --remap nor a -r flag.
[DEBUG] [1682091610.962663276] [rcl]: Arg 2 (--log-level) is not a --params-file flag.
[DEBUG] [1682091610.962665400] [rcl]: Got log level: debug

[DEBUG] [1682091610.962667328] [rcl]: Arg 5 (-r) is not a --param nor a -p flag.
[DEBUG] [1682091610.962669732] [rcl]: Got remap rule : __node:=os_sensor

[DEBUG] [1682091610.962671395] [rcl]: Arg 7 (-r) is not a --param nor a -p flag.
[DEBUG] [1682091610.962674179] [rcl]: Got remap rule : __ns:=/ouster

[DEBUG] [1682091610.962675786] [rcl]: Arg 9 (--params-file) is not a --param nor a -p flag.
[DEBUG] [1682091610.962677564] [rcl]: Arg 9 (--params-file) is not a --remap nor a -r flag.
[DEBUG] [1682091610.962729840] [rcl]: Got params file : /var/lib/build/install/ouster_ros/share/ouster_ros/config/parameters.yaml
total num param files 1
[DEBUG] [1682091610.962731737] [rclcpp]: deferred_signal_handler(): waiting for SIGINT or uninstall
[DEBUG] [1682091610.962753703] [rcl]: Initializing wait set with '0' subscriptions, '2' guard conditions, '0' timers, '0' clients, '0' services
[DEBUG] [1682091610.962767410] [os_sensor]: Load library libos_sensor_component.so
[DEBUG] [1682091610.967214507] [os_sensor]: Instantiate class rclcpp_components::NodeFactoryTemplate<ouster_ros::OusterSensor>
[DEBUG] [1682091610.967238654] [rcl]: Couldn't parse arg 0 (/var/lib/build/install/ouster_ros/lib/ouster_ros/os_sensor) as a remap rule in its deprecated form. Error: Expected lexeme type (19) not found, search ended at index 58, at /tmp/binarydeb/ros-foxy-rcl-1.1.14/src/rcl/lexer_lookahead.c:239
[DEBUG] [1682091610.967246356] [rcl]: Initializing node 'os_sensor' in namespace ''
[DEBUG] [1682091610.967257045] [rcl]: Using domain ID of '101'
[DEBUG] [1682091610.970938534] [rcl]: Initializing publisher for topic name '/rosout'
[DEBUG] [1682091610.970955460] [rcl]: Expanded topic name '/rosout'
[DEBUG] [1682091610.971591185] [rcl]: Publisher initialized
[DEBUG] [1682091610.971604570] [rcl]: Node initialized
[DEBUG] [1682091610.971644974] [rcl]: Initializing service for service name 'os_sensor/get_parameters'
[DEBUG] [1682091610.971649579] [rcl]: Expanded service name '/ouster/os_sensor/get_parameters'
[DEBUG] [1682091610.971802489] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.971806572] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_parametersRequest
[DEBUG] [1682091610.971808320] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_parametersReply
[DEBUG] [1682091610.971809842] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972064477] [rcl]: Service initialized
[DEBUG] [1682091610.972080076] [rcl]: Initializing service for service name 'os_sensor/get_parameter_types'
[DEBUG] [1682091610.972083678] [rcl]: Expanded service name '/ouster/os_sensor/get_parameter_types'
[DEBUG] [1682091610.972112415] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.972114640] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_parameter_typesRequest
[DEBUG] [1682091610.972116355] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_parameter_typesReply
[DEBUG] [1682091610.972117870] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972277324] [rcl]: Service initialized
[DEBUG] [1682091610.972286442] [rcl]: Initializing service for service name 'os_sensor/set_parameters'
[DEBUG] [1682091610.972289540] [rcl]: Expanded service name '/ouster/os_sensor/set_parameters'
[DEBUG] [1682091610.972312309] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.972314291] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/set_parametersRequest
[DEBUG] [1682091610.972332151] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/set_parametersReply
[DEBUG] [1682091610.972333828] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972463042] [rcl]: Service initialized
[DEBUG] [1682091610.972471800] [rcl]: Initializing service for service name 'os_sensor/set_parameters_atomically'
[DEBUG] [1682091610.972475098] [rcl]: Expanded service name '/ouster/os_sensor/set_parameters_atomically'
[DEBUG] [1682091610.972497659] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.972499522] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/set_parameters_atomicallyRequest
[DEBUG] [1682091610.972501089] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/set_parameters_atomicallyReply
[DEBUG] [1682091610.972502599] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972622789] [rcl]: Service initialized
[DEBUG] [1682091610.972631789] [rcl]: Initializing service for service name 'os_sensor/describe_parameters'
[DEBUG] [1682091610.972634789] [rcl]: Expanded service name '/ouster/os_sensor/describe_parameters'
[DEBUG] [1682091610.972656054] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.972657848] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/describe_parametersRequest
[DEBUG] [1682091610.972659429] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/describe_parametersReply
[DEBUG] [1682091610.972660961] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972756082] [rcl]: Service initialized
[DEBUG] [1682091610.972764814] [rcl]: Initializing service for service name 'os_sensor/list_parameters'
[DEBUG] [1682091610.972767639] [rcl]: Expanded service name '/ouster/os_sensor/list_parameters'
[DEBUG] [1682091610.972789434] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.972791231] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/list_parametersRequest
[DEBUG] [1682091610.972792783] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/list_parametersReply
[DEBUG] [1682091610.972794288] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.972984954] [rcl]: Service initialized
[DEBUG] [1682091610.973007086] [rcl]: Initializing publisher for topic name '/parameter_events'
[DEBUG] [1682091610.973014987] [rcl]: Expanded topic name '/parameter_events'
[DEBUG] [1682091610.973135064] [rcl]: Publisher initialized
[DEBUG] [1682091610.973197963] [rcl]: Finalizing event
[DEBUG] [1682091610.973201997] [rcl]: Event finalized
[DEBUG] [1682091610.973437068] [rcl]: Initializing subscription for topic name '/parameter_events'
[DEBUG] [1682091610.973444193] [rcl]: Expanded topic name '/parameter_events'
[DEBUG] [1682091610.973791488] [rcl]: Subscription initialized
[DEBUG] [1682091610.973825761] [rcl]: Finalizing event
[DEBUG] [1682091610.973830020] [rcl]: Event finalized
[DEBUG] [1682091610.973892266] [rcl]: Initializing publisher for topic name '~/transition_event'
[DEBUG] [1682091610.973901935] [rcl]: Expanded topic name '/ouster/os_sensor/transition_event'
[DEBUG] [1682091610.974204347] [rcl]: Publisher initialized
[DEBUG] [1682091610.974225911] [rcl]: Initializing service for service name '~/change_state'
[DEBUG] [1682091610.974232942] [rcl]: Expanded service name '/ouster/os_sensor/change_state'
[DEBUG] [1682091610.974370507] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.974375339] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/change_stateRequest
[DEBUG] [1682091610.974378541] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/change_stateReply
[DEBUG] [1682091610.974381391] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.974607362] [rcl]: Service initialized
[DEBUG] [1682091610.974615372] [rcl]: Initializing service for service name '~/get_state'
[DEBUG] [1682091610.974621560] [rcl]: Expanded service name '/ouster/os_sensor/get_state'
[DEBUG] [1682091610.974668579] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.974672232] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_stateRequest
[DEBUG] [1682091610.974675128] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_stateReply
[DEBUG] [1682091610.974694583] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.974875919] [rcl]: Service initialized
[DEBUG] [1682091610.974882046] [rcl]: Initializing service for service name '~/get_available_states'
[DEBUG] [1682091610.974887704] [rcl]: Expanded service name '/ouster/os_sensor/get_available_states'
[DEBUG] [1682091610.974929938] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.974933742] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_available_statesRequest
[DEBUG] [1682091610.974936664] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_available_statesReply
[DEBUG] [1682091610.974939501] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.975067013] [rcl]: Service initialized
[DEBUG] [1682091610.975071194] [rcl]: Initializing service for service name '~/get_available_transitions'
[DEBUG] [1682091610.975075908] [rcl]: Expanded service name '/ouster/os_sensor/get_available_transitions'
[DEBUG] [1682091610.975110518] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.975113827] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_available_transitionsRequest
[DEBUG] [1682091610.975116734] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_available_transitionsReply
[DEBUG] [1682091610.975119585] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.975253860] [rcl]: Service initialized
[DEBUG] [1682091610.975260057] [rcl]: Initializing service for service name '~/get_transition_graph'
[DEBUG] [1682091610.975265129] [rcl]: Expanded service name '/ouster/os_sensor/get_transition_graph'
[DEBUG] [1682091610.975284020] [rmw_fastrtps_cpp]: ************ Service Details *********
[DEBUG] [1682091610.975287552] [rmw_fastrtps_cpp]: Sub Topic rq/ouster/os_sensor/get_transition_graphRequest
[DEBUG] [1682091610.975290741] [rmw_fastrtps_cpp]: Pub Topic rr/ouster/os_sensor/get_transition_graphReply
[DEBUG] [1682091610.975293767] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.975516000] [rcl]: Service initialized
[DEBUG] [1682091610.975812257] [rcl]: Initializing client for service name 'os_sensor/change_state'
[DEBUG] [1682091610.975823722] [rcl]: Expanded service name '/ouster/os_sensor/change_state'
[DEBUG] [1682091610.975853973] [rmw_fastrtps_cpp]: ************ Client Details *********
[DEBUG] [1682091610.975858315] [rmw_fastrtps_cpp]: Sub Topic rr/ouster/os_sensor/change_stateReply
[DEBUG] [1682091610.975861899] [rmw_fastrtps_cpp]: Pub Topic rq/ouster/os_sensor/change_stateRequest
[DEBUG] [1682091610.975865233] [rmw_fastrtps_cpp]: ***********
[DEBUG] [1682091610.976562721] [rcl]: Client initialized
[DEBUG] [1682091610.976649304] [rcl]: Waiting without timeout
[DEBUG] [1682091610.976661227] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.976676667] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.976683839] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.976689024] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.976725764] [rcl]: Subscription taking message
[DEBUG] [1682091610.976790752] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091610.976823191] [rcl]: Waiting without timeout
[DEBUG] [1682091610.976863327] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.976871782] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.976877300] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.976887854] [rcl]: Subscription taking message
[DEBUG] [1682091610.976954749] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091610.976983756] [rcl]: Waiting without timeout
[DEBUG] [1682091610.976991706] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.976999218] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.977004562] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.977015068] [rcl]: Subscription taking message
[DEBUG] [1682091610.977026900] [rcl]: Subscription take succeeded: true
[DEBUG] [1682091610.977068920] [rcl]: Waiting without timeout
[DEBUG] [1682091610.977077236] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [1682091610.977084242] [rcl]: Subscription in wait set is ready
[DEBUG] [1682091610.977089570] [rcl]: Guard condition in wait set is ready
[DEBUG] [1682091610.977097879] [rcl]: Subscription taking message
[DEBUG] [1682091610.977109437] [rcl]: Subscription take succeeded: true

Again, thanks for your help with this issue. I'd be glad to send you the actual log files if you think they'd be helpful. I also plan to recreate the issue on the arm64 (Jetson AGX Xavier) target later today.

@Samahu
Contributor

Samahu commented Apr 21, 2023

I did look briefly at your logs and compared them. The only difference I observed is that your sensor log somehow stops updating and progressing after about 14 ms, which makes me think the timer used within the driver has stopped or gotten stuck. Can I ask how many cores the machine you are running your docker compose on has? I wonder whether low CPU/RAM could be interacting badly with the underlying RMW.

I see that you are running with FastRTPS; have you had the chance to test with any other RMW implementation?

@crowncastlejsl
Author

I don't think my system is the issue - the CPU is a 14-core 12th Gen Intel(R) Core(TM) i7-12800H, and there's 32 GB of RAM.

Regarding the other things to try:

  • arm64 system - currently having some unrelated issues, but I still intend to recreate this issue once those are resolved. This system currently runs the ROS1 version of the driver without any problems, but is also considerably less powerful than my dev laptop.
  • Timer - I intend to eventually use PTP time from a server hosted in another container. When trying to diagnose this issue, I reverted to using ROS time because I thought it would be simpler. If you think it's worth trying, I can move to using the PTP time.
  • RMW - I know that ROS2 has several types of middleware that can be used, but I'm not familiar with the differences between them. Is there a specific one that you would suggest using?

@Samahu
Contributor

Samahu commented Apr 25, 2023

Timer - I intend to eventually use PTP time from a server hosted in another container. When trying to diagnose this issue, I reverted to using ROS time because I thought it would be simpler. If you think it's worth trying, I can move to using the PTP time.

The choice of timestamp mode is irrelevant here; you can keep using TIME_FROM_ROS_TIME mode. I was referring to the ROS timer callbacks which the driver uses to receive and process packets. These appear, for some reason, to have stopped updating in your logs, whereas on my end they continue to update to the point where I can see the sensor node get configured and activated.

RMW - I know that ROS2 has several types of middleware that can be used, but I'm not familiar with the differences between them. Is there a specific one that you would suggest using?

Based on your logs you are using FastRTPS. You could try switching to CycloneDDS, since it is free and reasonably easy to set up and use. Assuming you have CycloneDDS properly installed on your system, you can then replace FastRTPS with CycloneDDS by setting RMW_IMPLEMENTATION=rmw_cyclonedds_cpp as an environment variable. Note that you need to perform a clean rebuild every time you switch the underlying RMW implementation. If you happen to try that, please let me know the outcome.
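
A minimal sketch of the switch (the Foxy package name and the clean-rebuild step are assumptions from memory, not verified against your container setup):

```shell
# Assumed prerequisite steps (not executed here):
#   sudo apt-get install ros-foxy-rmw-cyclonedds-cpp
#   rm -rf build/ install/ log/ && colcon build   # clean rebuild after switching RMW
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
echo "$RMW_IMPLEMENTATION"   # prints: rmw_cyclonedds_cpp
```

In a docker-compose setup, the same variable can go under the service's `environment:` section instead of being exported in a shell.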

@crowncastlejsl
Author

I cloned the repo down into a subdirectory (ros2-foxy branch, including submodules) and modified the Dockerfile to add CycloneDDS according to the instructions here - basically just installing an apt package and setting an environment variable. It doesn't seem to have changed anything, though.

Start of Dockerfile:

ARG ROS_DISTRO=rolling

FROM ros:${ROS_DISTRO}-ros-core AS build-env
ENV DEBIAN_FRONTEND=noninteractive \
    BUILD_HOME=/var/lib/build \
    OUSTER_ROS_PATH=/opt/catkin_ws/src/ouster-ros \
    RMW_IMPLEMENTATION=rmw_cyclonedds_cpp

RUN set -xue \
# Turn off installing extra packages globally to slim down rosdep install
&& echo 'APT::Install-Recommends "0";' > /etc/apt/apt.conf.d/01norecommend \
&& apt-get update \
&& apt-get install -y       \
    build-essential         \
    cmake                   \
    fakeroot                \
    dpkg-dev                \
    debhelper               \
    python3-rosdep          \
    python3-rospkg          \
    python3-bloom           \
    python3-colcon-common-extensions \
    ros-foxy-rmw-cyclonedds-cpp

...

Is there a way to otherwise inspect the ROS time to make sure it's incrementing appropriately in the container?
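
One rough self-check I can think of (an assumption on my part: with use_sim_time off, ROS time in Foxy falls back to the system clock, so confirming the container's clock advances is a reasonable proxy):

```python
# Rough proxy check: verify the container's system clock is advancing,
# since ROS time without sim time is just the system clock.
import time

t0 = time.time()
time.sleep(0.1)
t1 = time.time()
print(t1 > t0)  # True when the clock is advancing
```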

@Samahu
Contributor

Samahu commented Apr 25, 2023

Log statements in your logs (the ones you submitted earlier) stop being emitted after 14 ms. Specifically, the message Timeout calculated based on next scheduled timer: false stops at some point. It should continue getting logged until the node gets configured and activated.

@crowncastlejsl
Author

crowncastlejsl commented May 1, 2023

I reran the last test with the recent updates to the ros2-foxy branch (still using cyclonedds), but it still has the same issue.

Updated ouster_params.yml for tf_prefix deprecation
ouster:
  os_sensor:
    ros__parameters:
      sensor_hostname: 10.0.0.32
      udp_dest: 10.0.0.60
      mtp_dest: ''
      mtp_main: false
      lidar_mode: 2048x10
      timestamp_mode: TIME_FROM_ROS_TIME
      udp_profile_lidar: LEGACY
      lidar_port: 7501
      imu_port: 7502
      metadata: ''

  os_cloud:
    ros__parameters:
      sensor_frame: left_os_zero
      lidar_frame: left_os_zero
      imu_frame: left_os_zero
      timestamp_mode: TIME_FROM_ROS_TIME

I think I was unclear about the log messages before. I only posted the start of the log files because the lines all seemed to repeat after a little bit. Now, I've attached about 15s worth of the full log files, but I modified the timestamps to show relative time from the start of the launch.

os_cloud_rel_time.log
os_sensor_rel_time.log

Something I hadn't noticed before is a couple of lines in the sensor node log with [rclcpp]: executor taking a service client response from service '/ouster/os_sensor/change_state' failed to take anything. It seems like that might be related to the node failing to fully change states.

Potentially interesting portion of os_sensor_rel_time.log
[DEBUG] [0:00:01.808408] [rcl]: Initializing publisher for topic name 'lidar_packets'
[DEBUG] [0:00:01.808420] [rcl]: Expanded topic name '/ouster/lidar_packets'
[DEBUG] [0:00:01.808752] [rcl]: Publisher initialized
[DEBUG] [0:00:01.808784] [rcl]: Initializing publisher for topic name 'imu_packets'
[DEBUG] [0:00:01.808794] [rcl]: Expanded topic name '/ouster/imu_packets'
[DEBUG] [0:00:01.808928] [rcl]: Publisher initialized
[DEBUG] [0:00:01.808997] [rcl]: Sending service response
[DEBUG] [0:00:01.809115] [rcl]: Waiting without timeout
[DEBUG] [0:00:01.809121] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [0:00:01.809172] [rcl]: Guard condition in wait set is ready
[DEBUG] [0:00:01.809177] [rcl]: Guard condition in wait set is ready
[DEBUG] [0:00:01.809181] [rcl]: Client in wait set is ready
[DEBUG] [0:00:01.809208] [rcl]: Client taking service response
[DEBUG] [0:00:01.809220] [rcl]: Client take response succeeded: false
[DEBUG] [0:00:01.809227] [rclcpp]: executor taking a service client response from service '/ouster/os_sensor/change_state' failed to take anything
[DEBUG] [0:00:01.809250] [rcl]: Waiting without timeout
[DEBUG] [0:00:01.809255] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [0:00:01.809262] [rcl]: Guard condition in wait set is ready
[DEBUG] [0:00:01.809274] [rcl]: Waiting without timeout
[DEBUG] [0:00:01.809278] [rcl]: Timeout calculated based on next scheduled timer: false
[DEBUG] [0:00:01.813978] [rcl]: Service in wait set is ready
[DEBUG] [0:00:01.814006] [rcl]: Service server taking service request
[DEBUG] [0:00:01.814011] [rcl]: Service take request succeeded: true
[DEBUG] [0:00:01.814041] [ouster.os_sensor]: on_activate() is called.
[DEBUG] [0:00:01.814082] [rcl]: Initializing timer with period: 0ns
[DEBUG] [0:00:01.814114] [rcl]: Sending service response
[DEBUG] [0:00:01.814144] [rcl]: Waiting with timeout: 0s + 0ns
[DEBUG] [0:00:01.814146] [rcl]: Timeout calculated based on next scheduled timer: true
[DEBUG] [0:00:01.814163] [rcl]: Timer in wait set is ready
[DEBUG] [0:00:01.814165] [rcl]: Guard condition in wait set is ready
[DEBUG] [0:00:01.814166] [rcl]: Guard condition in wait set is ready
[DEBUG] [0:00:01.814168] [rcl]: Client in wait set is ready
[DEBUG] [0:00:01.814175] [rcl]: Calling timer
[DEBUG] [0:00:01.814211] [rcl]: Client taking service response
[DEBUG] [0:00:01.814214] [rcl]: Client take response succeeded: false
[DEBUG] [0:00:01.814217] [rclcpp]: executor taking a service client response from service '/ouster/os_sensor/change_state' failed to take anything

@Samahu
Contributor

Samahu commented May 1, 2023

I do get the same message about the executor, so it is unlikely to be the culprit here. But I will re-examine the log files you attached and try to find out what's going astray.

P.S. The error message doesn't happen when using FastRTPS.

@crowncastlejsl
Author

Again, thanks for your help. Can you currently reproduce things on your side? Please let me know if there's anything else you'd like me to try.

@Samahu
Contributor

Samahu commented May 1, 2023

I re-generated the logs locally (not using Docker this time). The only differences I observed were the following:

my logs state: Expanded and remapped service name, Expanded and remapped topic name
instead of just: Expanded service name or Expanded topic name
This could just be due to a different ROS version/update and is unlikely to be the issue.

The other difference, which I don't understand, is that your logs don't display the firmware rev as I would expect. For example, my logs look like this:

[ouster.os_sensor] ouster client version: 0.8.1+d730798-relwithdebinfo
product: OS-1-128, sn: xxxxxxxxxxxx, firmware rev: v3.0.0-rc.2

Your logs only show the following:

[ouster.os_sensor]: ouster client version: 0.8.1+unknown-release

So unless you explicitly edited these lines out, I am not sure how this could happen, since it is a single log statement and should include information about the firmware rev.

@crowncastlejsl
Author

Sorry, there was a bug in my conversion script that was eliminating lines that didn't have a timestamp themselves, which would include any multi-line log statements.

Just for reference, here is the corrected script; convert_timestamps was missing the final yield line branch for lines without a timestamp.

Relative time conversion script
import re
from datetime import datetime
from pathlib import Path

regex = re.compile(r'\[(\d+\.\d+)\]')

def convert_timestamps(text):
    start = None
    for line in text.splitlines():
        if (m := regex.search(line)):
            ts = float(m.group(1))
            dt = datetime.fromtimestamp(ts)
            if start is None:
                start = dt
            new_line = regex.sub(f'[{dt-start}]', line)
            yield new_line
        else:
            yield line

def convert_file(path: Path):
    base = '_'.join(path.stem.split('_')[:2])
    with path.with_name(f'{base}_rel_time.log').open('w') as file:
        for line in convert_timestamps(path.read_text()):
            file.write(f'{line}\n')

p = sorted(Path('logs').glob('os_sensor*.log'))[0]
print(p.name)
convert_file(p)

p = sorted(Path('logs').glob('os_cloud*.log'))[0]
print(p.name)
convert_file(p)
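
For anyone following along, here is a minimal, self-contained sanity check of the fixed `else: yield line` branch. The sample log lines below are made up, but they mimic the multi-line firmware-rev statement that was being dropped:

```python
import re
from datetime import datetime

regex = re.compile(r'\[(\d+\.\d+)\]')

def convert_timestamps(text):
    start = None
    for line in text.splitlines():
        if (m := regex.search(line)):
            ts = float(m.group(1))
            dt = datetime.fromtimestamp(ts)
            if start is None:
                start = dt
            yield regex.sub(f'[{dt - start}]', line)
        else:
            # The branch the buggy version was missing: lines without a
            # timestamp (e.g. continuation lines of a multi-line log
            # statement) are passed through unchanged.
            yield line

sample = (
    "[INFO] [1683000000.000000] [ouster.os_sensor]: ouster client version: 0.8.1\n"
    "product: OS-0-64-U02, firmware rev: v2.3.1\n"
    "[INFO] [1683000001.500000] [ouster.os_sensor]: next message"
)
out = list(convert_timestamps(sample))
print(out[1])  # → product: OS-0-64-U02, firmware rev: v2.3.1
```

With the fix, the continuation line carrying the firmware rev survives the conversion instead of disappearing from the output.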

Updated log files:
os_cloud_rel_time.log
os_sensor_rel_time.log

The portion you mentioned is now there:

[INFO] [0:00:01.805704] [ouster.os_sensor]: ouster client version: 0.8.1+unknown-release
product: OS-0-64-U02, sn: 122226001747, firmware rev: v2.3.1

I notice that my firmware revision is behind yours. Does that need to be updated?

@Samahu
Contributor

Samahu commented May 2, 2023

I notice that my firmware revision is behind yours. Does that need to be updated?

No, firmware 2.3.1 isn't too far behind and should work with the existing ROS2 driver.

@Samahu
Contributor

Samahu commented May 2, 2023

Unfortunately the logs didn't expose any perceivable problem, and I am unable to replicate the issue locally even though I used your scripts. Did you get a chance to try FastRTPS to see if it works better with your setup?

@Samahu
Contributor

Samahu commented May 17, 2023

@crowncastlejsl I have recently merged a change that replaces the use of ROS timers for polling data from the sensor with regular threads. It may or may not help with your specific case; there shouldn't be any changes to your build configuration, so feel free to pull and try.
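
For readers curious what that change means in generic terms, here is a rough sketch of the dedicated-polling-thread pattern. This is not the actual driver code (the real driver is C++); `fake_source`, the queue, and the stop event are stand-ins to illustrate the idea:

```python
import queue
import threading
import time

def poll_loop(source, out_q, stop_event):
    # A dedicated thread blocks on the data source directly, instead of
    # relying on an executor-driven timer callback that can stall when
    # the executor is busy with other work.
    while not stop_event.is_set():
        packet = source()          # a blocking read in a real driver
        if packet is not None:
            out_q.put(packet)
        else:
            time.sleep(0.01)       # avoid busy-spinning when idle

# Stand-in source that yields three "packets" then runs dry.
packets = iter(range(3))
def fake_source():
    return next(packets, None)

q_out = queue.Queue()
stop = threading.Event()
t = threading.Thread(target=poll_loop, args=(fake_source, q_out, stop), daemon=True)
t.start()
time.sleep(0.1)
stop.set()
t.join()
print(list(q_out.queue))  # → [0, 1, 2]
```

The key point is that the polling loop no longer competes with other callbacks for executor time; it only hands finished packets off through a queue.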

@Samahu
Contributor

Samahu commented Jul 17, 2023

Hi @crowncastlejsl, sorry it has been a while since we interacted. I just wanted to notify you about a recent update to the driver that improves many aspects of its behavior, one of which should address the issue you had earlier on. The new changes combine the three disjointed components that composed the driver into a single node, which removes the inter-dependency between those components. The issue you were having (although I couldn't reproduce it in my tests) should be alleviated by these changes. Please give it a try and let me know whether we can close this ticket or you are still facing the issue.

Note that to use the combined mode, you will need to use driver.launch.py as documented in the README.md.

@crowncastlejsl
Author

Hi @Samahu, thanks for the update and for your help with this issue!

I think the change you mentioned was the one that fixed it. Everything seems to be working now, so I think we can close.
