Channel: ROS Answers: Open Source Q&A Forum - RSS feed
Viewing all 398 articles
Browse latest View live

hector_localization

Hi folks, I'm running the navigation stack with a TurtleBot and everything works well. Now I'm trying to replace amcl localization with hector_localization. I installed hector_localization in my catkin workspace and compiled it successfully. My question is: how can I use hector_localization in place of amcl in the navigation stack? I searched the web but didn't find an answer. Any help is appreciated. Regards, Assil.

Problem with LaserScan in RViz

Hi, I have been using a depthimage_to_laserscan node to create a scan topic which I then use in gmapping and amcl. When I visualize the scan topic, it drops many messages when used with amcl (even though the rate of /scan is almost 30 Hz) but works perfectly with gmapping. I have checked the tf tree and it is fine, and the very same setup that works perfectly with gmapping has problems with amcl. My fixed frame is map in both cases.

I tried changing the fixed frame to depth_image for amcl and that solved the problem. But I need to use it for navigation, so my fixed frame must be /map. Could it be that the map is not available at some instants, and therefore the tf is not available (a message complaining that the transform to depth_camera_link is not available is shown under the LaserScan status)?

I also get a new error message complaining about canBus: `Bus::open: Can not get ibus-daemon's address.` But this is always issued and gmapping works fine. I have realized that amcl_pose is only published when I initialize the robot in RViz; apart from that, nothing is published on this topic. I also get this RViz error for LaserScan: `for frame [/depth_camera_link]: No transform to fixed frame /map. Tf error: [lookup would require extrapolation into the future]`

Sorry if this was too long. Thanks in advance for your help.

What to prepare to run amcl node correctly?

I use Ubuntu 12.04 64-bit with ROS Hydro. If I want to run the amcl node, what should I prepare? Here is my to-do list as I understand it:

1. Run Gazebo with the PR2 and construct an environment.
2. Build a map with the gmapping node.
3. Save the map to .pgm and .yaml files.
4. Load the .pgm and .yaml files with map_server.
5. Run the amcl launch file.
6. Teleoperate the PR2.
7. Receive the amcl_pose topic to get an accurate robot pose.

Is there anything I am missing? I also have several questions:

a. Should I write a node to publish the odom topic?
b. How can I tell whether the pose that amcl outputs is precise or not? What are normal settings for Gazebo and for a real scene?
c. When should I set the initial pose? What is the correct way to set it, and how do I know the correct initial pose? By trial and error?

Thank you very much!
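For question (b), one sanity check (a sketch, not a standard amcl feature) is to compare the pose amcl reports against ground truth, which in a Gazebo simulation is available from the simulator itself, and look at the translation and heading error. The helper below is plain Python with the ROS subscriber plumbing omitted; the `(x, y, yaw)` tuple format is an assumption for the sketch:

```python
import math

def pose_error(est, truth):
    """Compare an (x, y, yaw) pose estimate against ground truth.

    Returns (translation_error_m, abs_yaw_error_rad). The yaw difference
    is wrapped into [-pi, pi] so that e.g. 350 deg vs 10 deg counts as 20 deg.
    """
    ex, ey, eyaw = est
    tx, ty, tyaw = truth
    trans_err = math.hypot(ex - tx, ey - ty)
    yaw_err = (eyaw - tyaw + math.pi) % (2 * math.pi) - math.pi
    return trans_err, abs(yaw_err)

# Example: estimate is 3 cm off in x, 4 cm in y, and 0.1 rad in yaw.
err, yaw = pose_error((1.03, 2.04, 0.1), (1.0, 2.0, 0.0))
# err is 0.05 m and yaw is 0.1 rad (up to floating point)
```

Logging these two numbers while teleoperating gives a direct feel for how trustworthy the amcl output is in a given map.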

How to use robot_pose_ekf and other ROS Packages?

Hi there, I am super new to ROS. I have a decent amount of programming experience and have academic coursework under my belt in AI and robotics. I have also worked through the ROS tutorials. That being said, I have 0 real world experience with ROS. I am working on a project where I need to use robot_pose_ekf and maybe amcl. I think I understand how to publish the messages they subscribe to, but I have no idea how to play around with the code for these packages, or how to get them on my computer, or how to actually use them with my robot, aside from what they publish and subscribe to. Any kind of help on how to use these tools would be super appreciated. Thank you!! (PS - I have the sensors needed for all of these things)

displaying the robot's actual trajectory in RViz

Hi all, I have a mobile robot which is navigating around a room, and I already have the map of the room. I am using rotary encoders for odometry and fusing the data from the rotary encoders and an IMU using [robot_pose_ekf](http://wiki.ros.org/robot_pose_ekf). I am using [amcl](http://wiki.ros.org/amcl?distro=jade) for localization and [move_base](http://wiki.ros.org/move_base?distro=jade) for planning, with the SBPL Lattice Planner as the global planner and the Trajectory Planner as the local planner. When I run RViz, I can visualize the global plan (desired trajectory) generated by move_base, but I would also like to visualize the actual trajectory (based on odometry and amcl) followed by the robot, so that I can see how different the actual trajectory is from the desired one. Is there any ROS package which does this (I am not able to find one)? If not, what is the best way to visualize the robot's actual trajectory in RViz? Any help will be appreciated. Thanks in advance. Naman Kumar
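One common approach is a small node that subscribes to `/amcl_pose` and appends each pose to a `nav_msgs/Path`, which RViz's Path display can then draw alongside the global plan. A minimal sketch of just the accumulation logic in plain Python (the rospy publisher/subscriber wiring is assumed and omitted; the distance threshold is an illustrative choice):

```python
class TrajectoryAccumulator:
    """Accumulate (x, y) poses into a trajectory for later display.

    In a real node, feed this from an /amcl_pose (or odometry) callback
    and republish the points as a nav_msgs/Path for RViz's Path display.
    """

    def __init__(self, min_step=0.05):
        # Skip points closer than min_step metres to the previous one,
        # so a slow or stationary robot does not bloat the path message.
        self.min_step = min_step
        self.points = []

    def add(self, x, y):
        if self.points:
            lx, ly = self.points[-1]
            if (x - lx) ** 2 + (y - ly) ** 2 < self.min_step ** 2:
                return False  # too close to the previous point; dropped
        self.points.append((x, y))
        return True

traj = TrajectoryAccumulator(min_step=0.1)
for p in [(0, 0), (0.02, 0), (0.2, 0), (0.25, 0.05)]:
    traj.add(*p)
# only the points at least 0.1 m apart are kept: (0, 0) and (0.2, 0)
```

In the real node each accepted point would become a `geometry_msgs/PoseStamped` appended to the Path message before republishing it on a latched topic.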

Using AMCL for navigation with navigation stack, std_msgs/Range, Range_sensor layer

I have set up the navigation stack on my differential-drive robot, which has three ultrasonic sensors 45 degrees apart (I know the map is not going to be great with these sensors, but they are my only option). I have a global and local costmap set up similar to the navigation stack tutorial ([robot_setup](http://wiki.ros.org/navigation/Tutorials/RobotSetup "navigation_stack")), and I incorporated three range_sensor_layers for my three ultrasonic sensors in the costmaps. My question is: what should I start with, AMCL with a pre-built pgm map of the environment, or gmapping? My ultimate goal is an autonomous 2D mapping robot. I am currently trying to use AMCL with a template map, but I do not know how to translate the ultrasonic readings into a form AMCL can use; I understand AMCL expects a laser scan to be published on the /scan topic. Any help will be greatly appreciated; please let me know if any further information is needed.
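One trick sometimes used here (an assumption about what could work, not something the navigation stack documents for sonars) is to publish a sparse `sensor_msgs/LaserScan` built from the three Range readings, with every other beam set out of range so amcl's beam model skips it. A sketch of just the ranges-array construction, assuming the three sonars sit at -45, 0 and +45 degrees; the message stamping and publishing are omitted:

```python
import math

def ranges_from_sonars(r_minus45, r_zero, r_plus45, beam_count=181,
                       angle_min=-math.pi / 2, angle_inc=math.pi / 180,
                       out_of_range=float('inf')):
    """Build a LaserScan-style ranges array from three sonar readings.

    The sonars are assumed mounted at -45, 0 and +45 degrees; every
    other beam is set out of range so a consumer like amcl ignores it.
    """
    ranges = [out_of_range] * beam_count
    for angle, r in ((-math.pi / 4, r_minus45), (0.0, r_zero),
                     (math.pi / 4, r_plus45)):
        idx = int(round((angle - angle_min) / angle_inc))
        ranges[idx] = r
    return ranges

scan = ranges_from_sonars(1.2, 0.8, 2.0)
# the three readings land at indices 45 (-45 deg), 90 (0 deg), 135 (+45 deg)
```

Note that amcl's sensor model is built around lidar-like data, so with only three wide-cone beams the pose estimate will be rough at best; this only shows how to satisfy the message interface.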

Cause path planning to take wider turns

I am having trouble getting my robot to find a path through doorways; sometimes the global costmap just doesn't leave enough space. *A* solution is to decrease the inflation_radius, but then it plans paths with very tight corners, meaning the robot sometimes runs over items near the wall that stick out below sensor level. Is there a way to coerce the planner into accepting tight spaces when there is no alternative, like doorways, but taking a wide berth around corners when possible? It feels like I need another "inflation" box for "desired personal space".
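For what it's worth, the inflation layer exposes a second knob besides `inflation_radius`: `cost_scaling_factor`, which controls how quickly the inflated cost decays with distance from an obstacle. A lower factor makes the decay more gradual, so the planner prefers a wide berth where space allows while a small radius still leaves doorways traversable. A sketch of the relevant costmap fragment (values are illustrative starting points, not a tuned configuration):

```yaml
inflation_layer:
  # Small enough that doorways stay passable (illustrative value).
  inflation_radius: 0.55
  # Lower values make cost fall off more slowly with distance, nudging
  # plans away from walls wherever there is room (default is 10.0).
  cost_scaling_factor: 3.0
```

This does not add a hard "personal space" constraint, but tuning the two parameters together is the usual way to trade doorway clearance against corner-cutting.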

AMCL with robot_pose_ekf

I have a map that I built with gmapping, but now I need to use the AMCL package. AMCL is working, but the result is very bad. I am using robot_pose_ekf (odom_combined --> base_footprint); do you think there is a problem with this fusion? I use the 2D Pose Estimate button for the initial pose. My lidar is an RPLIDAR (5 Hz, 6 m range). My odom_frame_id is odom_combined and my base_frame_id is base_footprint. **What can I do with the AMCL parameters to get a good result? Please help me.** This is my AMCL result: http://i.imgur.com/eztGjLg.png This is my tf tree: http://i.imgur.com/uftS8h5.png These are my AMCL parameters:
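For a short-range lidar like this, the amcl parameters most often tuned are the particle counts, the update thresholds, the laser range limits, and the odometry noise alphas. A sketch with illustrative starting values only (parameter names are from the amcl wiki page; this is not the asker's actual configuration and not a known-good set):

```yaml
amcl:
  odom_model_type: diff          # differential-drive odometry model
  odom_frame_id: odom_combined
  base_frame_id: base_footprint
  min_particles: 500
  max_particles: 3000
  update_min_d: 0.1              # update after 10 cm of translation...
  update_min_a: 0.2              # ...or ~11.5 degrees of rotation
  laser_max_range: 6.0           # match the RPLIDAR's ~6 m range
  laser_max_beams: 60
  odom_alpha1: 0.2               # raise the alphas if robot_pose_ekf's
  odom_alpha2: 0.2               # combined odometry is noisy; lower
  odom_alpha3: 0.2               # them if it is trustworthy
  odom_alpha4: 0.2
```

A mismatched `laser_max_range` or overly optimistic odometry alphas are common causes of the particle cloud diverging, so those are worth checking first.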

No laser scan received

I'm trying to use **amcl** to localize my robot using a bag file. These are the steps I take:

1) Publish the **map**: `rosrun map_server map_server map.yaml`
2) Run **amcl**: `rosrun amcl amcl scan:=base_scan _odom_frame:=odom_combided`
3) Play the bag file: `rosbag play small.bag`

Published topics are: /amcl/parameter_descriptions /amcl/parameter_updates /amcl_pose /base_odometry/odom /base_odometry/odometer /base_odometry/state /base_scan /clock /initialpose /map /map_metadata /move_base_simple/goal /particlecloud /robot_pose_ekf/odom_combined /rosout /rosout_agg /tf /tilt_scan /torso_lift_imu/data /torso_lift_imu/is_calibrated

But the **amcl** node continuously prints this message: `[ WARN] [1391522838.082207618]: No laser scan received (and thus no pose updates have been published) for 1391522838.082082 seconds. Verify that data is being published on the /base_scan topic.` And **/amcl_pose** publishes nothing. **rxgraph** output:
![rxgraph](/upfiles/13915357152216836.png)

AMCL default scan topic?

Hi, I'm using amcl with the default scan topic and it keeps telling me: "No laser scan received (and thus no pose updates have been published) for XXX seconds". I have double-checked my scan topic and set an initial pose in RViz. Thanks for your answers.

AMCL update rate

Hi all, there is a parameter "update_min_d" in AMCL that specifies the minimum update distance. Even though I set the parameter as small as 0.005, the data I receive indicates the minimum update distance is about 0.02. Is there any solution to this problem?
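One possible explanation (an assumption, not something the question confirms): amcl can only check its motion thresholds when a laser scan arrives, so the spacing between updates is bounded below by robot speed divided by scan rate, no matter how small `update_min_d` is. A quick back-of-the-envelope check:

```python
def min_update_distance(speed_m_s, scan_rate_hz, update_min_d):
    """Smallest update spacing amcl can achieve under this assumption.

    Updates are gated on incoming scans, so once update_min_d drops
    below the distance travelled between two scans, consecutive updates
    are still at least one inter-scan distance apart.
    """
    per_scan = speed_m_s / scan_rate_hz
    return max(per_scan, update_min_d)

# A robot moving at 0.2 m/s with a 10 Hz lidar travels 2 cm between
# scans, which would match the ~0.02 m observed with update_min_d=0.005.
spacing = min_update_distance(0.2, 10.0, 0.005)
```

If this is the cause, only a faster scan rate or a slower robot can shrink the observed spacing further.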

navigation stack dependencies

Hi all, I am new to ROS and following the ROS navigation tutorials for Indigo to implement AMCL. I have questions about the step that sets up the dependencies:

`catkin_create_pkg my_robot_name_2dnav move_base my_tf_configuration_dep my_odom_configuration_dep my_sensor_configuration_dep`

I am using a Hokuyo UST-10LX, which works with the urg_node. Is "my_sensor_configuration_dep" simply urg_node? Or is it set up with the tutorial below? http://wiki.ros.org/navigation/Tutorials/RobotSetup/Sensors Whereby I have to create a package "my_sensor_configuration" with catkin_create_pkg and use the code in the above tutorial? I have looked at http://answers.ros.org/question/191971/navigation-package-dependencies/ and http://answers.ros.org/question/10775/navigation-tutorial-dependencies/ but I am not sure whether it is just "urg_node" or has to be set up based on http://wiki.ros.org/navigation/Tutorials/RobotSetup/Sensors

Tutorials I am following: http://wiki.ros.org/navigation/Tutorials/RobotSetup/Sensors http://wiki.ros.org/navigation/Tutorials/RobotSetup Please assist!

AMCL acceptable robot speed

I am using amcl. My robot's speed is 40 cm/s. The AMCL probability estimation is good, but the robot keeps getting lost on the gmapping map. Do you think the problem is the speed value? My problem (I used the 2D Pose Estimate button for the initial pose): ![image description](/upfiles/14406672849162762.png)

AMCL loss problem

Hello, I am working with AMCL. In a small area AMCL works very well, but in a corridor AMCL always gets lost in the middle. My AMCL parameters are good, so the amcl pose estimate (pose array) doesn't spread. Why does it always get lost in the same area? Is the problem the lidar's max range or the floor type?

Lidar: Hokuyo, 5.4 m max range. Type of floor: tile. The length of the corridor: ![image description](/upfiles/14413770717411418.jpg)

[**VIDEO ABOUT PROBLEM:**](https://www.youtube.com/watch?v=VFss-e_lgmQ&feature=youtu.be) VIDEO INFO: **video timing (03:08): the robot is actually in front of the first door, but the amcl estimate is still in front of the first floor. video timing (04:07): the robot is actually in front of the third door, but the amcl estimate is still in front of the second door.**

Problem when using the navigation stack in RViz

Hello everybody, I'm trying to use the navigation stack in RViz. My fake_move_base_amcl.launch file looks like this: When I run the launch file, the map and RobotModel appear, but there is no autonomous navigation. The warnings are as follows:

[INFO] [WallTime: 1441624427.388872] Started DiffController (base_controller). Geometry: 0.26m wide, 4100.0 ticks/m.
process[rviz-8]: started with pid [16162]
[ WARN] [1441624432.486553111]: Waiting on transform from base_footprint to map to become available before running costmap, tf error:
[ WARN] [1441624437.562957259]: Waiting on transform from base_footprint to map to become available before running costmap, tf error:

Thank you for any kind of help. The tf tree looks like: ![image description](/upfiles/14416752072757177.png)

How to build a map using slam_gmapping & amcl

Hi, I want to build a map using the slam_gmapping algorithm with RPLIDAR data. Currently I can generate a good, high-quality map when the RPLIDAR is static, by which I mean a map with clear boundaries. However, when I move the RPLIDAR by hand, the boundary is not that smooth. I think the reason is poor scan registration while the RPLIDAR is moving. Since AMCL can estimate the location of the RPLIDAR, I want to use this information with slam_gmapping to generate a map of good quality. Is it possible? If yes, how? Thanks very much! :)

AMCL pose estimates spreading out: good or bad?

Hello, I can't figure out how to decide whether my amcl parameter tuning is good. Is the AMCL pose estimate (pose array) spreading out good or bad? Which image shows the better result? This is the result with my new amcl parameters: ![image description](/upfiles/1441953768493385.png) This is the result with my old amcl parameters: ![image description](/upfiles/14419538666311267.png)

amcl estimation quality

I would like to know how much to trust the amcl output pose. One thought is to see how well the laser data correlates with obstacles in the cost map, another is to calculate the variance of the particle locations. Is there some standard way to estimate the quality of the amcl pose?
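As a sketch of the second idea (particle spread), the trace of the 2-D position covariance gives a single scalar that grows as the filter becomes less certain. Plain Python with no ROS types; in practice the input would be the poses from the `/particlecloud` topic:

```python
def particle_spread(particles):
    """Return the trace of the sample covariance of particle (x, y)
    positions: small when the cloud is tight (confident pose), large
    when the particles are spread out (uncertain pose)."""
    n = len(particles)
    mx = sum(p[0] for p in particles) / n
    my = sum(p[1] for p in particles) / n
    var_x = sum((p[0] - mx) ** 2 for p in particles) / n
    var_y = sum((p[1] - my) ** 2 for p in particles) / n
    return var_x + var_y

tight = [(1.0, 2.0), (1.01, 2.0), (0.99, 2.01)]    # confident cloud
loose = [(0.0, 0.0), (3.0, 1.0), (1.0, 4.0)]       # uncertain cloud
# particle_spread(tight) is far smaller than particle_spread(loose)
```

Note that amcl already publishes a covariance with `/amcl_pose` (a `geometry_msgs/PoseWithCovarianceStamped`), which encodes essentially the same information without recomputing it from the particles.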

Using robot_localization with amcl

Hi all, I am starting to set up the awesome [robot_localization](http://wiki.ros.org/robot_localization) package. I am using rotary encoders (`nav_msgs/Odometry`, both pose and twist) and an IMU (`sensor_msgs/Imu`), which provide continuous data, and an [Ultrawide band sensor](http://www.decawave.com/products/evk1000-evaluation-kit), which provides global absolute position data as a `nav_msgs/Odometry` message (pose only: x, y and yaw). As suggested on the wiki page of the package, I can fuse data from the encoders and the IMU in the `odom_frame`, which gives the transform `odom_frame -> base_link_frame`. For the `map_frame`, I can fuse data from the rotary encoders, IMU and Ultrawide band sensor (it is similar to GPS in the sense that it provides global absolute position data), which gives the transform `map_frame -> odom_frame`. But next I am using [amcl](http://wiki.ros.org/amcl) to improve the pose estimate, and amcl also provides the `map_frame -> odom_frame` transformation. So I want to ask how I should deal with the transformations (specifically `map_frame -> odom_frame`). Should I just not use the `map_frame -> odom_frame` transformation from `robot_localization` and let amcl take care of it? If yes, how can I not use the transformation? Or is there a better way to deal with it?

**Update 1:** If I give the output of AMCL and UWB to a second instance of `ekf_localization_node`, it will give me the required transformation (`map_frame -> odom_frame`) and will publish the output odometry message on the `odometry/filtered` topic, but I am using [move_base](http://wiki.ros.org/move_base), which expects the robot pose on the `/amcl_pose` topic. So I can remap the output on `odometry/filtered` to `amcl_pose` and change the message type from `nav_msgs/Odometry` to `geometry_msgs/PoseWithCovarianceStamped`. Does that sound okay?
Also, another approach I am considering is to run the first instance of `ekf_localization_node` (in the `odom_frame`) with rotary encoders and IMU as input, which will give the `odom_frame -> base_link_frame` transformation, and then run a second instance of `ekf_localization_node` (in the `map_frame`) with UWB, rotary encoders and IMU as input. I am slightly confused about how the second instance will work. Will it modify the `odom_frame -> base_link_frame` tf by considering (or changing) the data from the UWB in the `odom_frame`? If yes, then I can directly use the modified `odom_frame -> base_link_frame` tf (which includes UWB, encoders and IMU) in amcl (and ignore the `map_frame -> odom_frame` tf from the second instance of `ekf_localization_node`), and everything should be fine, right?

**Update 2:** Thanks @Tom Moore for the answer! So just to clarify: run the first instance of `ekf_localization_node` in the `odom_frame` with wheel odometry and IMU data as input, which gives us the `odom_frame -> base_link_frame` transformation. Then run the second instance of `ekf_localization_node` in the `map_frame` with wheel odometry, IMU data, UWB data (it is in the `map_frame`) and `amcl_pose`, which gives us the `map_frame -> odom_frame` transformation using the existing `odom_frame -> base_link_frame` tf. Thanks in advance. Naman Kumar
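A sketch of what the second (map-frame) instance's configuration might look like under the setup described in Update 2. The topic names are placeholders, the values are illustrative, and each `_config` matrix (x, y, z, roll, pitch, yaw, then velocities and accelerations, 15 booleans) would need adjusting to the actual sensors; whether feeding amcl's output back in alongside wheel odometry is appropriate depends on the covariances involved, so this only shows the parameter shape:

```yaml
# Second ekf_localization_node instance: publishes map -> odom.
frequency: 30
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: map               # fusing absolute data, so work in the map frame

odom0: /wheel_odometry         # placeholder topic name; fuse twist only here
odom0_config: [false, false, false,  false, false, false,
               true,  true,  false,  false, false, true,
               false, false, false]

imu0: /imu/data                # placeholder topic name; yaw and yaw rate
imu0_config: [false, false, false,  false, false, true,
              false, false, false,  false, false, true,
              false, false, false]

odom1: /uwb_odometry           # placeholder: UWB absolute pose (x, y, yaw)
odom1_config: [true,  true,  false,  false, false, true,
               false, false, false,  false, false, false,
               false, false, false]

pose0: /amcl_pose              # amcl's PoseWithCovarianceStamped
pose0_config: [true,  true,  false,  false, false, true,
               false, false, false,  false, false, false,
               false, false, false]
```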

setting initial pose of the robot using amcl

Hi all, this is basically a continuation of [this question](http://answers.ros.org/question/218137/using-robot_localization-with-amcl/), but I am asking it as a new question because it is different. I am running two instances of [robot_localization](http://wiki.ros.org/robot_localization): the first instance has rotary encoders and an IMU as input in the `odom_frame` and gives the `odom_frame -> base_link_frame` tf; the second instance has rotary encoders, IMU, an ultra-wideband sensor and AMCL (`/amcl_pose`) as input and gives the `map_frame -> odom_frame` tf, so I am not using the `map_frame -> odom_frame` transformation from [amcl](http://wiki.ros.org/amcl). Now when I try to set the initial pose from RViz or programmatically, it does not seem to work and does not change the position of the robot, I guess because `amcl` does not publish the `map_frame -> odom_frame` (or `base_link`) transformation. **Is there a way to set the initial pose (`/initialpose` topic) of the robot using `amcl` when `amcl` is not publishing the `map_frame -> odom_frame` (or `base_link`) transformation?** Thanks in advance. Naman Kumar

