Location in Space and Robot Axis – The degrees of freedom of a robot typically refer to the number of movable joints of a robot. The following article relates specifically to industrial robots as they are used by industrial robot integrators, including Motion Controls Robotics, a Level 4 Certified Servicing Integrator for FANUC Robotics. What does it mean to have a two- or three-axis pick and place unit, or a two-, four-, five- or six-axis robot, or even a seven-axis robot, and how does that relate to the degrees of freedom of a robot? A relatively technical article on Wikipedia describes degrees of freedom in general terms.

This article describes our first steps in the development of an AR HMD for in-car, aerospace and naval applications. Over several years we have developed our LPVR middleware. In the first version the purpose of this middleware was to enable location-based VR with a combination of optical and IMU-based headset tracking. Building on this foundation we extended the system to work as a tracking solution for transportation platforms such as cars, ships or airplanes (Figure 1). In contrast to stationary applications, where an IMU is sufficient to track the rotations of an HMD, in the in-vehicle use-case an additional IMU needs to be fixed to the vehicle, and the information from this sensor needs to become part of the sensor fusion. We realized this with our LPVR-DUO tracking system. Applying this middleware to existing augmented reality headsets on the market turned out to be challenging. Most AR HMDs use their own proprietary tracking technology that is only suitable for stationary use-cases but doesn't work in moving vehicles. Accessing such a tracking pipeline in order to extend it with our sensor fusion is usually not possible.

LPNAV enables automated guided vehicles (AGVs) to rapidly understand their environment and be ready for safe and efficient operation without calibration, manual map building etc. With the help of LPNAV, mobile logistics platforms can operate (localize) in both indoor and outdoor environments with the same set of sensors (Figure 1) and a unified map, e.g. when transporting an item from inside a warehouse to a truck parked in front of the warehouse. This offers a big cost-saving potential for applications in which the transition from indoor to outdoor settings has so far required specialized equipment or manual handling. In a previous post we showed the capability of LPNAV to operate in a small, crowded indoor environment. After further optimization of the algorithm we are now able to show the system working well in outdoor settings. Uncontrolled outdoor environments are particularly challenging, as lighting conditions can vary strongly and perception can be disturbed by pedestrians, passing cars etc.

LPNAV-VAC combines three different data sources in order to calculate a robot's position inside a room: an inertial measurement unit, data from the robot's wheel encoders and video images from a camera installed on the robot (Figure 1). A central computing unit combines the information from these data sources to simultaneously create a map of the robot's surroundings and calculate the position of the robot inside the room. It is essential that the sensor fusion algorithm is able to dynamically update the map it is constructing. As new sensor information arrives, the map is continuously adapted to reflect an optimized view of the robot's environment. While this principle of simultaneous localization and mapping (SLAM) is an established method for some robot navigation systems, those solutions tend to rely on laser scanners (LIDAR) or vision-only reconstruction. The combination of all available data sources in the robot allows LPNAV-VAC to create high-definition maps of the environment while using low-cost, off-the-shelf components.
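The vehicle-compensated tracking described above can be reduced to one core step: express the headset orientation relative to the vehicle rather than relative to the world, by combining the readings of the two IMUs. The following is a minimal sketch of that step, assuming both IMUs report world-frame orientations as unit quaternions; all names are illustrative and not part of the actual LPVR-DUO API.

```python
# Sketch: removing vehicle motion from the headset orientation.
# Both IMUs report world-frame orientations as quaternions (w, x, y, z);
# the head-in-vehicle orientation is conj(q_vehicle) * q_head.
# Illustrative only -- not the LPVR-DUO implementation.

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def q_conj(q):
    """Conjugate, i.e. the inverse rotation for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def head_in_vehicle(q_head_world, q_vehicle_world):
    """Orientation of the HMD expressed in the vehicle frame."""
    return q_mul(q_conj(q_vehicle_world), q_head_world)
```

A useful sanity check: when the car turns and the passenger's head turns with it, both world-frame quaternions are equal, and the head-in-vehicle orientation stays at identity, so the virtual content stays fixed to the cabin.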
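Of the three LPNAV-VAC data sources, the wheel encoders contribute a dead-reckoned pose estimate that the fusion then corrects with IMU and camera data. A minimal sketch of such an odometry update for a differential-drive platform follows; the encoder resolution and wheel geometry are assumed values for illustration, not LPNAV parameters.

```python
import math

# Sketch: dead-reckoning pose update from wheel encoder ticks on a
# differential-drive robot. All constants are assumed example values.

TICKS_PER_REV = 1024   # encoder resolution (assumed)
WHEEL_RADIUS = 0.05    # wheel radius in metres (assumed)
WHEEL_BASE = 0.30      # distance between the wheels in metres (assumed)

def odometry_step(pose, left_ticks, right_ticks):
    """Advance a pose (x, y, heading) by one encoder reading."""
    x, y, theta = pose
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * dist_per_tick
    d_right = right_ticks * dist_per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_BASE
    # Midpoint approximation of the arc travelled during this step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```

Encoder-only dead reckoning accumulates drift from wheel slip and quantization, which is precisely why the IMU and camera observations are fused in and the map is continuously corrected rather than built from odometry alone.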