NeuroPathFinder is a project that aims to enhance pathfinding algorithms and robotics simulations through complex indoor environment mapping and advanced pathfinding techniques. The repository currently hosts a module that plots a navigation map for use in such simulations, and it will soon include implementations of various pathfinding algorithms.
- Navigation Map Plotting: A Python script that plots a navigation map with predefined obstacles and customizable start and goal points, validating that neither point lies on an obstacle. This feature is essential for testing pathfinding algorithms in a simulated environment (see the sketch after this list).
- Pathfinding Algorithms (Coming Soon): Implementations of various pathfinding algorithms for navigating the plotted maps efficiently.
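As a rough illustration of what the plotting module does, the minimal sketch below draws a map with rectangular obstacles and checks that the start and goal points do not fall on an obstacle. The obstacle coordinates, point values, and function names here are illustrative assumptions, not the script's actual API; see navigation_map_plotter.py for the real implementation.

import matplotlib.pyplot as plt
import matplotlib.patches as patches

# Illustrative obstacles as (x, y, width, height) rectangles; the real
# script defines its own predefined obstacle set.
OBSTACLES = [(2, 2, 3, 1), (6, 5, 1, 4)]

def on_obstacle(point, obstacles):
    """Return True if the point lies inside any obstacle rectangle."""
    px, py = point
    return any(x <= px <= x + w and y <= py <= y + h
               for x, y, w, h in obstacles)

def plot_map(start, goal, obstacles=OBSTACLES):
    # Enforce the invariant described above: start/goal must be free cells.
    if on_obstacle(start, obstacles) or on_obstacle(goal, obstacles):
        raise ValueError("Start and goal points must not lie on an obstacle.")
    fig, ax = plt.subplots()
    for x, y, w, h in obstacles:
        ax.add_patch(patches.Rectangle((x, y), w, h, color="gray"))
    ax.plot(*start, "go", label="start")
    ax.plot(*goal, "r*", label="goal")
    ax.set_xlim(0, 10)
    ax.set_ylim(0, 10)
    ax.set_aspect("equal")
    ax.legend()
    plt.show()

plot_map(start=(0.5, 0.5), goal=(9.0, 9.0))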
- Python 3.6 or higher
- Matplotlib library
Ensure Python 3.6 or higher is installed on your system. You can download it from python.org.
It's recommended to use a virtual environment for Python projects. To set it up:
python -m venv venv
source venv/bin/activate # On macOS/Linux
.\venv\Scripts\activate # On Windows
To install Matplotlib, run the following command in your terminal:
pip install matplotlib
To run the Navigation Map Plotter, navigate to the plotter directory and execute:
python navigation_map_plotter.py
This command plots the navigation map using the default start and goal points along with the predefined obstacles. To use custom start and goal points, edit the corresponding values in the script, for example as sketched below.
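A hypothetical customization might look like the following; the variable names start and goal are assumptions for illustration, so match whatever navigation_map_plotter.py actually defines:

# In navigation_map_plotter.py (names assumed for illustration):
start = (1.0, 1.0)   # custom start point; must not lie on an obstacle
goal = (8.0, 8.0)    # custom goal point; must not lie on an obstacle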
The repository includes a research paper titled "Navigating in Complex Indoor Environments: A Comparative Study". This paper offers a comprehensive comparison of various pathfinding algorithms in complex indoor environments, highlighting the challenges and proposing innovative solutions to enhance navigation accuracy and efficiency.
Contributions to NeuroPathFinder are welcome! Whether it's implementing new algorithms, enhancing the navigation map plotter, or improving the documentation, your contributions are valuable to us.
This project is licensed under the MIT License - see the LICENSE file for details.