OUR SOLUTION


A navigation system is a set of tools and processes that help individuals or vehicles determine their location, plan routes, and navigate to desired destinations. It includes components such as positioning, mapping, route planning, guidance, real-time updates, and user interfaces.


Introducing the Guide Navigation System, or GuideNS™: a Visual-Inertial SLAM (Simultaneous Localization and Mapping) based navigation system for forklifts and autonomous robots operating in indoor warehouses and other GPS-denied environments. Our Visual-Inertial SLAM technology localizes the three-dimensional (3D) position and orientation of moving assets in real time with centimeter-level accuracy, across multiple floors or large areas.



HOW IT WORKS


The key components of our navigation solution are:

  • Edge device - a reference hardware package with an off-the-shelf stereo camera + IMU (inertial measurement unit) sensor and a low-power embedded computer.
  • Core algorithms - Visual-Inertial SLAM based navigation software running on the edge device. It provides localization and mapping, route planning and guidance, and even motion control for robot platforms, all on the edge device.
  • Cloud backend - our backend collects and stores real-time navigation data and makes it available via web APIs for both real-time streaming and historical queries. It also enables coordination between multiple vehicles, whether manually driven or autonomous.
  • Data visualization & analytics - our browser-based dashboard allows real-time data visualization and historical data analysis.

Guide Robotics Navigation Service

Depending on the use case (manually driven vs. on-robot), we use on-device and cloud APIs to deliver indoor positioning and other navigation functionality as a service. We support modern web-based APIs, including gRPC and REST, for cloud-based operation (e.g., interfacing with Warehouse Management Systems), as well as ROS (Robot Operating System) for on-robot navigation.
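To give a concrete flavor of consuming such an API, the sketch below parses a real-time pose message from a streaming endpoint. The message schema, field names, and asset identifier are illustrative assumptions for this example, not the published GuideNS™ API:

```python
import json

def parse_pose_update(payload: str) -> dict:
    """Parse a pose message from a real-time navigation stream.

    The field names here are illustrative assumptions, not the
    actual GuideNS schema.
    """
    msg = json.loads(payload)
    return {
        "asset_id": msg["asset_id"],
        "position_m": tuple(msg["position_m"]),              # x, y, z in meters
        "orientation_wxyz": tuple(msg["orientation_wxyz"]),  # unit quaternion
        "timestamp_s": msg["timestamp_s"],
    }

# Example payload, as it might arrive over a WebSocket or gRPC stream.
sample = json.dumps({
    "asset_id": "forklift-07",
    "position_m": [12.5, 3.2, 0.0],
    "orientation_wxyz": [1.0, 0.0, 0.0, 0.0],
    "timestamp_s": 1700000000.25,
})
pose = parse_pose_update(sample)
```

The same pose data would typically be exposed as a ROS topic for on-robot consumers and as a historical-query endpoint for Warehouse Management System integrations.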


Talk to us to learn more about successful applications of GuideNS™ in practice.



OUR VISUAL-INERTIAL SLAM


Visual-Inertial SLAM is a cutting-edge technique that fuses visual data from cameras with inertial measurements from IMUs to create a 3D representation of the environment while simultaneously estimating the sensor's pose (position and orientation) within that environment. The visual information is extracted from the camera images, while the IMU provides motion measurements, including acceleration and angular velocity. By combining these two sensor modalities, Visual-Inertial SLAM overcomes the limitations of using either visual or inertial sensors alone, resulting in more robust and accurate localization and mapping capabilities.
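The core idea of fusing a drifting, high-rate inertial signal with an absolute, lower-rate visual measurement can be illustrated with a simple complementary filter on a single heading angle. This is a deliberately minimal stand-in for a full SLAM estimator, and the rates, bias, and blend gain below are illustrative assumptions:

```python
def fuse_heading(heading, gyro_rate, dt, visual_heading, alpha=0.98):
    """One complementary-filter step: propagate with the gyro, correct with vision."""
    predicted = heading + gyro_rate * dt                       # IMU propagation (drifts)
    return alpha * predicted + (1.0 - alpha) * visual_heading  # visual correction

# Simulate a stationary vehicle whose gyro has a constant bias.
dt = 0.01          # 100 Hz IMU
gyro_bias = 0.01   # rad/s of spurious rotation reported by the gyro
fused = 0.0
integrated = 0.0   # IMU-only dead reckoning, for comparison
for _ in range(2000):  # 20 seconds
    fused = fuse_heading(fused, gyro_bias, dt, visual_heading=0.0)
    integrated += gyro_bias * dt

# IMU-only drift grows without bound; the fused estimate stays bounded.
```

The same principle, applied in full 3D with landmark maps and a proper probabilistic estimator, is what lets Visual-Inertial SLAM remain accurate where either sensor alone would fail.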


Our Visual-Inertial SLAM algorithm (key elements shown below) is optimized for real-time performance on low-cost hardware. Localization is highly accurate: the system builds visual landmark maps on the fly and pre-loads them on subsequent runs, delivering centimeter-level accuracy even in challenging visual environments. Our navigation solution for AGVs (automated guided vehicles) and AMRs (autonomous mobile robots) can incorporate sensor fusion with Lidar and wheel odometry to maximize the precision and efficiency of the vehicles’ operation.
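The principle behind combining sources such as visual localization and wheel odometry can be sketched as a textbook inverse-variance (minimum-variance) fusion of two independent estimates. This is a one-dimensional illustration with made-up numbers, not our production filter:

```python
def fuse_estimates(x_vis, var_vis, x_odom, var_odom):
    """Minimum-variance fusion of two independent 1-D position estimates."""
    w = var_odom / (var_vis + var_odom)     # weight toward the lower-variance source
    x = w * x_vis + (1.0 - w) * x_odom
    var = (var_vis * var_odom) / (var_vis + var_odom)  # always below either input
    return x, var

# Visual localization says 1.00 m (tight); wheel odometry says 1.20 m (looser).
x, var = fuse_estimates(1.00, 0.01, 1.20, 0.04)
```

Note that the fused variance is smaller than that of either input, which is why adding Lidar or wheel odometry improves precision rather than merely providing redundancy.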


Guide Robotics Visual SLAM

WHY US


The core Visual-Inertial SLAM technology was developed and field-tested by SRI International for more than a decade with extensive funding from the US Government. While many open-source Visual-Inertial SLAM algorithms are available, we believe our solution has a significant advantage over the competition based on the following factors.


The Guide Robotics advantage