Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Multiple issues while attempting to create a simple robot for localization in simulation using amcl

Hello, I am following a basic tutorial. I created my own robot in Gazebo and RViz and added laser and camera sensors to it. I then used a map from Clearpath to try to localize this robot using AMCL. There are a few issues I am running into that I have been unable to fix so far.

1. When launched in RViz, the robot needs the fixed frame set to "odom", but the map requires "map" to function. If I select "odom", the map in RViz jitters about a point; if I select "map", the robot jitters instead. I am not entirely sure how to fix that. This doesn't happen in Gazebo, so I am assuming it doesn't contribute to any errors, but any suggestions on what I am missing or misunderstanding?

2. As soon as I run the launch file for amcl (which loads the map and starts the amcl node and the move_base node with the config params), I load it up in RViz, select "2D Pose Estimate", and point the arrow toward a particular path in the map from the robot. I am attaching an image below: the green circle is where the robot starts, and the red arrow underneath it is the 2D pose estimate I set. As you can see, as soon as I did that, the costmap depicts the area as if it were blocked off. I then select "2D Nav Goal" and set it as per the red arrow on the left side. Since, per the costmap, the area in front of the robot is blocked, it takes the opposite path to try to reach the goal. I am trying to understand why this is happening and how to rectify it. Is there a specific parameter somewhere that I need to tweak, or am I initializing something incorrectly? ![image description](/upfiles/15100328906307295.png)

3. After the robot decided to take the other (much longer) route, I get the following result after a while (I am unfortunately running this in a virtual machine, so the robot movement is slow and I can't speed it up). I have marked areas similar to the image above to give an idea. The robot is now located here and is still trying to find its way around: https://imgur.com/a/FhB61 And I get the following errors/warnings:

```
[ERROR] [1510033498.355228432, 2137.605000000]: None of the points of the global plan were in the local costmap, global plan points too far from robot
[ WARN] [1510033498.369760487, 2137.609000000]: The origin for the sensor at (-0.49, -0.43) is out of map bounds. So, the costmap cannot raytrace for it.
[ WARN] [1510033500.647048940, 2138.619000000]: The origin for the sensor at (-0.49, -0.46) is out of map bounds. So, the costmap cannot raytrace for it.
[ WARN] [1510033507.102539142, 2141.625000000]: The origin for the sensor at (4.69, 0.95) is out of map bounds. So, the costmap cannot raytrace for it.
[ERROR] [1510033526.597402587, 2147.620000000]: None of the 279 first of 279 (279) points of the global plan were in the local costmap and free
[ERROR] [1510033526.597706226, 2147.620000000]: None of the points of the global plan were in the local costmap, global plan points too far from robot
```

Also, the actual position of the robot in Gazebo doesn't match the one shown in RViz. The robot is still moving, but the position is way off in RViz vs. Gazebo: https://imgur.com/a/X14Q3 I am trying to figure out **how to identify the problem here** and **how to go about fixing and improving it**. Would appreciate any help to learn this better. Thank you!

P.S. There are several issues with this editor that I ran into while posting: I couldn't upload all the images in one go, I couldn't add proper tags, and I couldn't scroll in the text editor at all. Where can I report those?

I want to publish an initialpose to AMCL, but I get an error

I tried to write a node that publishes a message on the initialpose topic, but I get these errors:

```
Error: TF_NAN_INPUT: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of a nan value in the transform (-nan -nan -nan) (-nan -nan -nan -nan)
         at line 244 in /tmp/binarydeb/ros-kinetic-tf2-0.5.16/src/buffer_core.cpp
Error: TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (-nan -nan -nan -nan)
         at line 257 in /tmp/binarydeb/ros-kinetic-tf2-0.5.16/src/buffer_core.cpp
```

My code:

```cpp
#include <ros/ros.h>
#include <geometry_msgs/PoseWithCovarianceStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "initialpose_pub");
  ros::NodeHandle nh_;
  ros::Publisher initPosePub_ =
      nh_.advertise<geometry_msgs::PoseWithCovarianceStamped>("initialpose", 2, true);
  ros::Rate rate(1.0);
  while (nh_.ok())
  {
    // get time
    ros::Time scanTime_ = ros::Time::now();
    // create msg
    geometry_msgs::PoseWithCovarianceStamped initPose_;
    // set the time & frame
    initPose_.header.stamp = scanTime_;
    initPose_.header.frame_id = "map";
    // position
    initPose_.pose.pose.position.x = 0.f;
    initPose_.pose.pose.position.y = 0.f;
    // angle: the quaternion must be normalized. Leaving all four fields at
    // zero produces an invalid (zero-length) quaternion, which is what
    // triggers the TF_DENORMALIZED_QUATERNION / NaN errors above.
    initPose_.pose.pose.orientation.z = 0.f;
    initPose_.pose.pose.orientation.w = 1.f;  // identity rotation (zero yaw)
    // publish msg
    initPosePub_.publish(initPose_);
    // sleep
    rate.sleep();
  }
  return 0;
}
```

I don't know why.

AMCL Particle Weights Are Always Equal - Why?

I'm driving the Turtlebot around the turtlebot_world Gazebo environment. I'm printing out the AMCL particle filter poses and the weights assigned to the poses, and when I drive around for a while (as long as 5 minutes), the weights assigned to the poses in the particle filter are all set to the same value (particle weight = 0.00196, for 500 particles, so they sum to 1, of course). I expected there to be some variety in the particles' weights - some higher, some lower. My theories as to why they're all equal:
1. I could be printing the weights at the wrong point of the AMCL code
2. It's a demo Gazebo world and a demo AMCL map - is the set-up too "perfect" for different hypotheses to result in different weights or something?

Here's a code snippet:
**File:** `AMCL/src/amcl_node.cpp` **Function:** `AmclNode::laserReceived(const sensor_msgs::LaserScanConstPtr& laser_scan)` **~Line 1245**, added the following:

```cpp
for (int i = 0; i < set->sample_count; i++)
{
  ROS_DEBUG("Weight: %f", set->samples[i].weight);
}
```
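For context, in a typical Monte Carlo localization loop the weights are only informative between the measurement update and the resampling step: resampling draws particles in proportion to weight and then resets every weight to uniform, so printing right after resampling shows them all equal, and the pose information lives in where the particles are rather than in their weights. A minimal, self-contained sketch of low-variance (systematic) resampling in plain Python — this illustrates the idea, it is not amcl's actual `pf` code:

```python
import random

def low_variance_resample(particles, weights):
    """Resample particles in proportion to weight, then reset weights to uniform.

    After this step every weight is 1/N: the weight differences have been
    converted into particle density, which is why inspecting weights
    post-resampling shows them all equal.
    """
    n = len(particles)
    total = sum(weights)
    norm = [w / total for w in weights]
    step = 1.0 / n
    r = random.uniform(0.0, step)   # single random offset for all draws
    c = norm[0]                     # running cumulative weight
    i = 0
    new_particles = []
    for m in range(n):
        u = r + m * step            # evenly spaced sampling points
        while u > c and i < n - 1:
            i += 1
            c += norm[i]
        new_particles.append(particles[i])
    return new_particles, [1.0 / n] * n
```

If the weights you print are equal, the likely explanation is simply that the print lands after this reset rather than after the measurement update.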

How to calculate localization error for a robot moving in a custom map using amcl and navigation stack?

I have a robot simulated in Gazebo and RViz, along with a map. I can use amcl to localize it and move it to a specific spot on the map. But how can I gauge how accurate the localization is over the course of the robot's navigation? Is there a way to calculate that? My robot has only a camera and a laser rangefinder as sensors, nothing else. When it reaches the goal, it appears to be at the exact position. How do I figure out how well or poorly the localization did?
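In simulation, one common approach is to treat Gazebo's model state as ground truth (for example the `/gazebo/model_states` topic, or a ground-truth odometry plugin) and log the difference from `/amcl_pose` over the run. The metric itself is simple; here is a minimal sketch in plain Python — the function name and the `(x, y, yaw)` tuple convention are my own, not a ROS API:

```python
import math

def pose_error(est_xy_yaw, true_xy_yaw):
    """Euclidean position error and absolute heading error between an
    estimated pose (e.g. from /amcl_pose) and a ground-truth pose
    (e.g. from Gazebo's /gazebo/model_states)."""
    ex, ey, eyaw = est_xy_yaw
    tx, ty, tyaw = true_xy_yaw
    pos_err = math.hypot(ex - tx, ey - ty)
    # Wrap the heading difference into [-pi, pi] before taking its magnitude,
    # so e.g. +3.1 rad vs -3.1 rad counts as a small error, not ~6.2 rad.
    dyaw = (eyaw - tyaw + math.pi) % (2 * math.pi) - math.pi
    return pos_err, abs(dyaw)
```

Logging these two values at each `/amcl_pose` update gives an error-over-time curve; summarizing with the mean and maximum over the run is a reasonable first report.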

Robot rotates around a point instead of navigating to Goal Position when using the navigation stack

Hello, I have a custom robot with 2 wheels, a camera, and a laser rangefinder. I am trying to localize the robot using amcl. I had everything set up: when RViz launched, I would set the initial 2D pose estimate and then set the 2D Nav Goal in RViz as well. However, I then implemented a basic node that publishes the goal so the robot moves to it directly. When I run that node, RViz displays the correct path to the goal, but the robot starts moving forward and after a while (usually around the same spot) begins to rotate around a single point. I am not sure why that's the case; it works fine via RViz but not via the node. The code for the node:

```cpp
#include <ros/ros.h>
#include <move_base_msgs/MoveBaseAction.h>
#include <actionlib/client/simple_action_client.h>

typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

int main(int argc, char** argv)
{
  ros::init(argc, argv, "node_name");  // the node name must be a string

  // create the action client
  // true causes the client to spin its own thread
  MoveBaseClient ac("move_base", true);

  // Wait up to 5 seconds for the action server to become available
  ROS_INFO("Waiting for the move_base action server");
  ac.waitForServer(ros::Duration(5));
  ROS_INFO("Connected to move base server");

  // Send a goal to move_base
  move_base_msgs::MoveBaseGoal goal;
  goal.target_pose.header.frame_id = "map";
  goal.target_pose.header.stamp = ros::Time::now();
  goal.target_pose.pose.position.x = 0.995;
  goal.target_pose.pose.position.y = -2.996;
  goal.target_pose.pose.orientation.w = 1;
  ac.sendGoal(goal);

  // Wait for the action to return
  ac.waitForResult();

  if (ac.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
    ROS_INFO("You have reached the goal!");
  else
    ROS_INFO("The base failed for some reason");
  return 0;
}
```

I run the above using `rosrun package_name node_name`. This is what it looks like right now; the robot started to rotate instead of following the green path. ![image description](/upfiles/15106129894124305.png)


Transform [sender=unknown_publisher]Unknown reason for transform failure

Hello everybody, I implemented **amcl** for my robot and it works reasonably well, but when I change the **fixed frame to /map** I get an error on the laser scan! Does anybody know why? Thanks in advance for the help. ![image description](/upfiles/15111761357529778.png)


When working with amcl why does the robot try to move closer to the wall in an open area rather than navigating more towards the goal?

For example, the following: ![image description](/upfiles/15112309833933597.png) The robot moves closer to the wall instead of following the path to the goal (the green line). I have noticed this happen every time. Why is that? Is there a particular parameter that influences this? Once the robot gets close to the wall, it seems to get better at navigating straight toward the goal. Or is it something related to the amcl algorithm(s)? Here is the robot shortly after getting close to the wall and then moving toward the goal again. ![image description](/upfiles/15112311322680271.png) What causes such a pattern/behavior?

map, odom, robot footprint are not aligned in RViz

Hello, I have a custom robot which I am trying to localize using amcl. I am using a map from Clearpath. I am seeing a very odd issue when simulating the robot in RViz:

- The robot model "cuts across" the map.
- The "odom" frame is underneath the "map" and "robot_footprint" frames.
- At times the "robot_footprint" frame fluctuates with respect to the "odom" or "map" frames.

Here are the images to show the above (please ignore my awful paint skills): ![image description](/upfiles/15125158316476968.png) As you can see:

1. shows "odom" underneath the "map" frame.
2. shows "robot_footprint" and "map" on the same plane (the plane is the global costmap).
3. shows the robot model "cutting" across the map.
4. shows "robot_footprint" fluctuating/flickering with respect to "map"/"odom".

I am not entirely sure how to fix this. All of the above are with "map" selected as the "Fixed Frame" under "Global Options". Here are some configuration details.

**Part of URDF** (the snippet was garbled on posting; the recoverable values are an origin of `0 0 0.1 0 0 0` and an inertia of `ixx="0.1" ixy="0" ixz="0" iyy="0.1" iyz="0" izz="0.1"`)

**costmap_common_params.yaml**

```yaml
map_type: costmap
origin_z: 0.0
z_resolution: 1
z_voxels: 2
obstacle_range: 2.5
raytrace_range: 3.0
publish_voxel_map: false
transform_tolerance: 0.5
meter_scoring: true
robot_radius: 0.3
inflation_radius: .6
observation_sources: laser_scan_sensor
laser_scan_sensor: {sensor_frame: hokuyo, data_type: LaserScan, topic: /robot/laser/scan, marking: true, clearing: true}
```

**global_costmap_params.yaml**

```yaml
global_costmap:
  global_frame: map
  robot_base_frame: chassis
  update_frequency: 10.0
  publish_frequency: 5.0
  width: 40.0
  height: 40.0
  resolution: 0.05
  origin_x: -20.0
  origin_y: -20.0
  static_map: true
  rolling_window: false
```

**local_costmap_params.yaml**

```yaml
local_costmap:
  global_frame: map
  robot_base_frame: chassis
  update_frequency: 10.0
  publish_frequency: 5.0
  width: 10.0
  height: 10.0
  resolution: 0.05
  static_map: false
  rolling_window: true
```

Any help is appreciated, thank you!

Is this amcl pose with covariance good? I barely added any parameters. How can I make my attempt more challenging?

![image description](/upfiles/15125863049756063.png) The above seems to be good. The ellipse/circle remains small and the arrow doesn't show too large a range. But the weird part is that I didn't add any parameters to the amcl node. So the default parameter values do such a good job? **What can I do to challenge my robot simulation such that I can tune different parameter values and gradually observe the results?** I don't like that I get good results without doing much work here.

How to set the initial pose manually in amcl?

The amcl node initializes its pose using `initial_pose_sub_ = nh_.subscribe("initialpose", 2, &AmclNode::initialPoseReceived, this);` I was wondering where this message is published from, and whether there is anything I should watch out for when setting the initial pose manually.
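For what it's worth, the usual publisher on `initialpose` is RViz's "2D Pose Estimate" tool. amcl can also be seeded from parameters at launch instead of via a message; a sketch of this (the parameter names come from the amcl documentation, the values here are illustrative):

```xml
<!-- Illustrative: seed amcl's initial pose at launch instead of
     publishing to /initialpose -->
<node pkg="amcl" type="amcl" name="amcl">
  <param name="initial_pose_x" value="0.0"/>
  <param name="initial_pose_y" value="0.0"/>
  <param name="initial_pose_a" value="0.0"/>  <!-- yaw in radians -->
</node>
```

If you do publish the message yourself, the header's `frame_id` should be the map frame and the orientation quaternion must be normalized (e.g. `w = 1` for zero yaw), or tf consumers downstream will reject the resulting transform.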

Navigation with amcl and cartographer

When I use amcl and cartographer for navigation, after I set the initialpose and the goal in RViz, the pointcloud doesn't stay around the robot model: the robot model is in one place and the pointcloud is in another. The starting point of the planned route is on the model, but the odom frame is not on the model. I want to know how to solve this problem, thank you!

Synchronize amcl_pose with image?

Hi, I am trying to collect synchronized pairs of /amcl_pose and /image_rect_color on the Fetch robot. Here's the code I'm running:

```python
#!/usr/bin/env python2
# amcl_listener
from copy import deepcopy
import rospy
from math import sin, cos
from geometry_msgs.msg import PoseWithCovarianceStamped as Pose
from sensor_msgs.msg import Image
import cv2
import pdb as pdb

class amcl_listener(object):
    def __init__(self):
        # Create a node
        rospy.init_node("amcl_listener")
        self.amcl_list = []
        self.image_list = []
        # Make sure sim time is working
        while not rospy.Time.now():
            pass
        self.img_sub = rospy.Subscriber(
            '/head_camera/rgb/image_rect_color', Image, self.cb_image)
        self.amcl_sub = rospy.Subscriber(
            '/amcl_pose', Pose, self.cb_amcl)
        self.curimg = None
        self.amcl_pose = None
        while not rospy.is_shutdown():
            pass

    def cb_image(self, msg):
        self.curimg = msg
        print "got image at ", rospy.Time.now()

    def cb_amcl(self, msg):
        self.amcl_pose = msg
        print "got amcl at ", rospy.Time.now()

if __name__ == "__main__":
    start = amcl_listener()
```

Here's the problem: the difference between the time the first image message arrives and the time the first amcl pose message arrives is close to 2 seconds according to rospy.Time.now(). I tried comparing header stamps, but they are also problematic. Any help would be appreciated!
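A note on the approach above: `rospy.Time.now()` inside a callback measures *arrival* time, and `/amcl_pose` is only published when amcl performs an update, so a ~2 s gap between first arrivals can be expected even when the data are fine; matching should be done on `header.stamp` instead. In ROS the standard tool for this is `message_filters.ApproximateTimeSynchronizer`. The core idea — pair each image with the pose whose stamp is nearest, within a tolerance — can be sketched in plain Python with stand-in `(stamp, msg)` tuples instead of ROS messages:

```python
def pair_by_stamp(images, poses, slop=0.1):
    """Pair (stamp, msg) tuples from two streams by nearest header stamp.

    This mimics what message_filters.ApproximateTimeSynchronizer does for
    topics like /amcl_pose and /image_rect_color: match on header.stamp
    (when the data was produced), not on arrival time.
    """
    pairs = []
    for t_img, img in images:
        # find the pose whose stamp is closest to this image's stamp
        t_pose, pose = min(poses, key=lambda p: abs(p[0] - t_img))
        if abs(t_pose - t_img) <= slop:
            pairs.append((img, pose))
    return pairs
```

With real messages, the same effect is one `ApproximateTimeSynchronizer([img_sub, amcl_sub], queue_size, slop)` plus a single callback receiving both messages.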

Will A* in AMCL direct robot into grey area on map

I would like to edit the map produced by a SLAM algorithm to add no-go areas. These areas do not have a physical barrier. Would greying them out be sufficient to prevent the A* algorithm used for planning from directing the robot into them? If not, how would one accomplish this? Also, can AMCL make use of .png maps, or does it have to be the default .pgm? Thanks

What is the benefit of using LIDAR and Encoders vs Indoor Beacons

I'm trying to figure out why there is so much documentation on using LIDAR and odometry data for indoor localization, but not much on using indoor Bluetooth beacons for robot localization. A lot of people seem to use either LIDAR + encoders or beacons only (Bluetooth beacon triangulation). I'm confused as to why there are no mentions of indoor localization solutions that use both LIDAR + encoders and Bluetooth beacons. I'm also wondering how a single-plane LIDAR (such as RPLidar) can determine a robot's location in a large environment (a 50 x 50 meter room), especially within similar hallways that give similar laser scan readings. All of my research on indoor localization points to LIDAR + odometry, and there are hardly any mentions of indoor navigation and localization using Bluetooth or radio beacon triangulation. Why is that, and what are the advantages and disadvantages of each system? If I were to incorporate beacons into my indoor localization system, what would be the appropriate route given that I have LIDAR and odometry data for AMCL (where would the Bluetooth beacon system fit into the equation)?

translate a map from map server

I want to reuse a map for localization, so I load a saved map from the map server. Ideally I would like the map to sit at the fixed coordinate origin (0, 0), but when I start the server, the upper-left corner of the saved map is placed at the origin. How can I move the map? In the corresponding yaml file I find a value for `origin: [0.0, 0.0, 0.0]`. The documentation says this is the pose of the lower-left corner of the map. However, when I edit this value, the map doesn't translate in the visualization. What would be the way to go here? I could publish a static transform for the map frame, but then I would like to publish it only once at startup, or under certain circumstances, for example if my localization in the map is too far off. I can't see how to realize this in code. It seems to me to be a very standard problem; maybe someone here can help me. Thanks
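Two things worth checking against the map_server documentation. First, `origin` in the YAML is the (x, y, yaw) pose of the *lower-left* corner of the image in the map frame, and it only takes effect after restarting map_server; if the visualization doesn't change, make sure the edited YAML is the one actually being loaded. An illustrative map YAML (file name and values are examples, not taken from the question):

```yaml
# map.yaml -- 'origin' is the pose of the lower-left corner of the image
# in the map frame. For a 40 m x 20 m map, an origin of [-20.0, -10.0, 0.0]
# centers the map on the map-frame origin (after restarting map_server).
image: mymap.pgm
resolution: 0.05
origin: [-20.0, -10.0, 0.0]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
```

Second, for the publish-once static transform, tf2's static transform publisher (`rosrun tf2_ros static_transform_publisher x y z yaw pitch roll parent_frame map`) sends a latched static transform, so "publish only once at startup" is exactly the behavior you get; re-running it with new arguments would cover the "re-seat the map when localization drifts" case.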

How to get /PoseWithCovarianceStamped from /odom

Hi, I was wondering how it is possible to feed the amcl package with `geometry_msgs/PoseWithCovarianceStamped`, since most odometry packages publish `nav_msgs/Odometry`. Is there any way or code to get `geometry_msgs/PoseWithCovarianceStamped` from `nav_msgs/Odometry`? Thanks a lot! Cheers, André Castro.
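Since `nav_msgs/Odometry` already contains a `geometry_msgs/PoseWithCovariance` in its `pose` field, the conversion is just copying the header and that field into a `PoseWithCovarianceStamped` — no math involved. (Note also that amcl itself reads odometry through tf rather than from a topic.) A minimal sketch using stand-in objects; in a real node these would be the ROS message classes wired to a rospy subscriber and publisher:

```python
from types import SimpleNamespace

def odom_to_pose_with_cov_stamped(odom):
    """Copy the header and pose (with covariance) out of an Odometry-like
    message into a PoseWithCovarianceStamped-like message.

    nav_msgs/Odometry.pose is already a geometry_msgs/PoseWithCovariance,
    so the conversion just keeps the header and pose and drops the twist.
    """
    return SimpleNamespace(header=odom.header, pose=odom.pose)

# In a real node this would be wired up roughly as:
#   pub = rospy.Publisher('pose_cov', PoseWithCovarianceStamped, queue_size=1)
#   rospy.Subscriber('odom', Odometry,
#                    lambda m: pub.publish(odom_to_pose_with_cov_stamped(m)))
```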

AMCL without Laser Scans

Hi, I'm trying to locate a robot inside a map. I have odometry fused with an IMU, a 2D map, and the initial position inside the map, but I don't have the `/scan` topic due to a noisy sensor. Is there a way to do localization with those? I can't use the amcl package because it needs scan-to-map matching. I can also provide fiducial markers to update the world->odom tf, but not all the time. I was looking at this [tutorial](http://wiki.ros.org/bfl/Tutorials/Example%20of%20using%20a%20particle%20filter%20for%20localization%20by%20bfl%20library) and this is the kind of thing I want to have. Thanks very much! André Castro.

Get a Quality Score from AMCL

I'm trying to figure out a way of creating a quality score for the amcl_pose. The most naive solution I could think of is to see "how small/big" the particle cloud gets, and normalize both sizes as 0 and 1. Then map the current cloud size to that scale. Has anyone done something similar before? Or does anyone know where I can get the particle cloud size from the AMCL package or the code directly? The point is that when the robot gets really lost (the quality score drops below an acceptable area), I'll ignore it. Also open to other ideas.
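Two places to look: amcl publishes the particle set on `/particlecloud` (a `geometry_msgs/PoseArray`), so the cloud's spatial spread can be computed directly from those poses, and `/amcl_pose` already carries a row-major 6x6 covariance whose x/y terms summarize how concentrated the filter is. A sketch of a covariance-based score — the function, its name, and the `worst_var` threshold are my own and would need tuning, not anything from the amcl package:

```python
import math

def amcl_quality(cov36, worst_var=1.0):
    """Map an /amcl_pose covariance (row-major 6x6, 36 floats) to a
    score in [0, 1]. Uses the x and y variances (indices 0 and 7);
    'worst_var' is a hand-tuned combined variance at which the
    score reaches 0.
    """
    sigma = math.sqrt(max(cov36[0] + cov36[7], 0.0))
    return max(0.0, 1.0 - sigma / math.sqrt(worst_var))
```

Used as described in the question: when the score drops below an acceptable threshold, treat the pose estimate as unreliable and ignore it.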