
Abstract Map

A Python implementation of the spring-dynamics based abstract map used in "Robot Navigation in Unseen Spaces using an Abstract Map"

Summary of the abstract map's purpose

By using the abstract map, a robot navigation system can employ symbols to purposefully navigate in unseen spaces. Here we show some videos of the abstract map in action, and link to the open source code we make available alongside the published results. The code includes a Python implementation of the abstract map, a 2D simulator for reproducing the zoo experiments, and a mobile phone app for playing symbolic spatial navigation games.
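To give a flavour of the spring-dynamics model named in the paper title, below is a minimal, self-contained Python sketch of the general idea: landmarks are treated as point masses, each piece of symbolic spatial information becomes a spring whose natural length encodes a suggested distance, and integrating the damped dynamics settles a predicted layout. This is only an illustration, not the toolbox's actual API; the class, parameters, and constraint encoding are assumptions made for the example.

    # Illustrative only -- NOT the abstract map toolbox's actual API.
    # Landmarks are point masses; each piece of symbolic spatial information
    # becomes a spring whose natural length encodes a suggested distance.
    import numpy as np

    class SpringLayout:
        def __init__(self, landmarks):
            self.names = list(landmarks)
            # Start every landmark at a random 2D guess
            self.pos = {n: np.random.uniform(-1.0, 1.0, 2) for n in self.names}
            self.vel = {n: np.zeros(2) for n in self.names}
            self.springs = []  # (landmark_a, landmark_b, rest_length, stiffness)

        def add_constraint(self, a, b, rest_length, stiffness=1.0):
            # e.g. "the lion is past the cafe" might suggest a spring between
            # "lion" and "cafe" with some assumed rest length
            self.springs.append((a, b, rest_length, stiffness))

        def settle(self, steps=2000, dt=0.01, damping=0.9):
            # Integrate damped spring dynamics until the layout stops moving
            for _ in range(steps):
                force = {n: np.zeros(2) for n in self.names}
                for a, b, rest, k in self.springs:
                    d = self.pos[b] - self.pos[a]
                    dist = np.linalg.norm(d) + 1e-9
                    f = k * (dist - rest) * d / dist  # Hooke's law along the spring
                    force[a] += f
                    force[b] -= f
                for n in self.names:
                    self.vel[n] = damping * (self.vel[n] + dt * force[n])
                    self.pos[n] += dt * self.vel[n]
            return self.pos

    # Hypothesise where an unseen goal ("lion") might sit relative to known places
    layout = SpringLayout(["entrance", "cafe", "lion"])
    layout.add_constraint("entrance", "cafe", rest_length=5.0)
    layout.add_constraint("cafe", "lion", rest_length=3.0)
    print(layout.settle())

In the real toolbox the constraints are derived from symbolic spatial information rather than hand-specified distances; the sketch only shows the settling mechanism.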


Open source abstract map resources

The first open source contribution is Python code for a full implementation of the abstract map, available here. The implementation features:

We also provide a configured 2D simulator for reproducing the zoo experiments in a simulation of our environment. The simulator package includes:

Lastly, we provide code for the mobile application used by human participants in the zoo experiments. The phone application, created with Android Studio, includes the following:

We hope these tools help people engage with our research and, more importantly, aid future research into the problem of developing robots that utilise symbols in their navigation processes.

Videos of the abstract map in action

Below are videos for five different symbolic navigation tasks completed as part of the zoo experiments described in our paper. In the experiments the robot was placed in an environment with no existing map, and given a symbolic goal like “find the lion”. Could the abstract map, using only the symbolic spatial information available from the AprilTags in the environment, successfully find the goal?

Human participants who had never visited the environment before were given the same task, with the results showing that the abstract map enables symbolic navigation performance comparable to that of humans in unseen built environments.
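As a purely illustrative aside, the symbolic spatial information read from a tag has to be decomposed into structured parts before it can shape the map. The toy parser below splits a simple directional phrase into a (figure, spatial relation, context) triple; it is not the toolbox's parser, and the relation list, function name, and expected phrase format are assumptions made for the example.

    # Illustrative only -- not the toolbox's parser. The relation list,
    # function name, and expected phrase format are assumptions.
    import re

    RELATIONS = ["right of", "left of", "towards", "beyond", "past", "down", "near"]

    def parse_phrase(phrase):
        """Split e.g. 'lion is past the cafe' into ('lion', 'past', 'cafe')."""
        for rel in RELATIONS:  # longer relations listed first so they match greedily
            pattern = rf"(?P<figure>.+?)\s+(?:is|are)\s+{rel}\s+(?:the\s+)?(?P<context>.+)"
            m = re.match(pattern, phrase, re.IGNORECASE)
            if m:
                return (m.group("figure").strip(), rel, m.group("context").strip())
        return None

    print(parse_phrase("lion is past the cafe"))          # ('lion', 'past', 'cafe')
    print(parse_phrase("toilets are near the entrance"))  # ('toilets', 'near', 'entrance')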

Find the lion

Find the kingfisher

Find the polar bear

Find the anaconda

Find the toilets

Acknowledgements & citing our work

This work was supported by the Australian Research Council’s Discovery Projects Funding Scheme under Project DP140103216. The authors are with the QUT Centre for Robotics.

If you use the abstract map in your research, or for comparisons, please cite our work:

@ARTICLE{9091567,
    author={B. {Talbot} and F. {Dayoub} and P. {Corke} and G. {Wyeth}},
    journal={IEEE Transactions on Cognitive and Developmental Systems},
    title={Robot Navigation in Unseen Spaces using an Abstract Map},
    year={2020},
    volume={},
    number={},
    pages={1-1},
    keywords={Navigation;Robot sensing systems;Measurement;Linguistics;Visualization;symbol grounding;symbolic spatial information;abstract map;navigation;cognitive robotics;intelligent robots.},
    doi={10.1109/TCDS.2020.2993855},
    ISSN={2379-8939},
    month={},
}

Further information

Please see our paper for full details of the abstract map’s role in robot navigation systems & the zoo experiments shown above.

If you want more information about the abstract map, see our related publications, or contact the authors via email: b.talbot@qut.edu.au