What is SLAM technology?
Updated: 2022-11-11 17:33:31
Simultaneous Localization and Mapping (SLAM) is the field concerned with how a robot moving through an unknown environment can localize itself and build a map of that environment at the same time.
Simply put, SLAM lets a robot use its sensors to work out where it is, where it is going, how to get there, and what lies in front of it. The system then derives its orientation and plans a path based on this environmental information.
A little hard to understand? No problem, let's take an example.
Let's say you are on a business trip to an unfamiliar city. To familiarize yourself quickly with the environment and complete your task of checking into a hotel, you should do the following things.
1. Feature extraction
Observe the surroundings with your eyes and remember their features.
2. Map construction
Construct a 2D or 3D map of the environment in your brain based on the information obtained from your eyes.
3. Bundle Adjustment or EKF
As you walk, you constantly acquire new features and landmarks and adjust your mental map model.
4. Trajectory
Determine your current position from the landmark features you have already acquired along the way.
5. Loop-closure Detection
After you have walked for a while, compare the landmarks around you with those in your mental map to check whether you have returned to a place you passed before.
These five steps are performed simultaneously, which is exactly what the name Simultaneous Localization and Mapping describes.
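To make the loop concrete, here is a toy 2D sketch in Python of how those five steps fit together. Everything in it is invented for illustration: the landmark positions, the noise levels, and the simple 50/50 blending that stands in for a real EKF or bundle adjustment. It is not how a production SLAM system is implemented.

```python
import numpy as np

# Toy 2D illustration of the five steps above, not a real SLAM system.
rng = np.random.default_rng(0)
landmarks = np.array([[2.0, 1.0], [4.0, 3.0], [1.0, 4.0]])  # ground truth, unknown to the robot

true_pose = np.zeros(2)   # where the robot actually is
est_pose = np.zeros(2)    # where the robot thinks it is
world_map = {}            # landmark id -> estimated position (the map)
trajectory = [est_pose.copy()]

for _ in range(4):
    motion = np.array([1.0, 0.5])
    true_pose = true_pose + motion
    # 4. Trajectory: predict the new pose from (noisy) odometry.
    est_pose = est_pose + motion + rng.normal(0, 0.05, 2)

    # 1. Feature extraction: observe each landmark as a noisy relative offset.
    observations = {i: lm - true_pose + rng.normal(0, 0.05, 2)
                    for i, lm in enumerate(landmarks)}

    for i, rel in observations.items():
        if i not in world_map:
            # 2. Map construction: add a newly seen landmark to the map.
            world_map[i] = est_pose + rel
        else:
            # 3. Adjustment (a crude stand-in for EKF / bundle adjustment):
            #    correct the pose against the mapped landmark, then refine the landmark.
            est_pose = 0.5 * est_pose + 0.5 * (world_map[i] - rel)
            world_map[i] = 0.5 * world_map[i] + 0.5 * (est_pose + rel)

    # 5. Loop closure would compare current observations against distant,
    #    previously mapped areas; it is omitted from this toy loop.
    trajectory.append(est_pose.copy())

print("estimated trajectory:\n", np.round(trajectory, 2))
print("estimated map:", {i: np.round(p, 2) for i, p in world_map.items()})
```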
As an indispensable technology for autonomous mobile robots, SLAM is receiving more and more attention.
SLAM is widely used in robotics, UAVs, autonomous driving, AR, VR, and other fields, where it relies on sensors to provide autonomous localization, map construction, path planning, navigation, and other machine functions.
Laser SLAM or vision SLAM?
The approaches currently used in SLAM are mainly divided into two categories by sensor: LiDAR-based laser SLAM and camera-based visual SLAM (VSLAM).
Cameras are often used as a robot's "eyes" because of their small size, low power consumption, and low cost, and they form the basis of visual SLAM. Like the human eye, the camera is a primary source of external information and captures massive, texture-rich data from the environment, which is visual SLAM's main advantage.
The robot maps its surroundings from the camera's image information, transmits the result to its "brain," and the system then makes a judgment to complete the robot's positioning.
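As a rough illustration of the feature-extraction front end of visual SLAM, the sketch below runs OpenCV's ORB detector on a synthetic image pair. The drawn shapes and the 10-pixel shift are stand-ins for two consecutive camera frames; a real pipeline would work on live images and feed the matched features into pose estimation and mapping.

```python
import cv2
import numpy as np

# Two fake "camera frames": simple shapes, then the same image shifted sideways.
frame1 = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(frame1, (60, 60), (140, 140), 255, -1)   # a bright box as "texture"
cv2.circle(frame1, (220, 120), 40, 180, -1)            # a gray disc as "texture"
frame2 = np.roll(frame1, 10, axis=1)                    # simulate sideways camera motion

# Detect and describe features in both frames.
orb = cv2.ORB_create(nfeatures=200)
kp1, des1 = orb.detectAndCompute(frame1, None)
kp2, des2 = orb.detectAndCompute(frame2, None)
print(f"frame 1: {len(kp1)} keypoints, frame 2: {len(kp2)} keypoints")

# Match descriptors between the two frames; matched features act as the
# landmarks that localization and mapping are built on.
if des1 is not None and des2 is not None:
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    print(f"{len(matches)} features matched between the frames")
```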
However, processing this information is difficult and computationally complex, and the results are easily affected by lighting conditions, so in some cases visual SLAM alone is not enough.
That's why laser SLAM is here to help.
Laser SLAM uses 2D or 3D LiDAR (single- or multi-line LiDAR). 2D LiDAR is generally used on indoor robots (such as floor sweepers), while 3D LiDAR is generally used in unmanned vehicles, robots, AMR/AGV, etc. The emergence and popularity of LiDAR have led to faster and more accurate measurements and richer information.
LiDAR distance measurement is more accurate, its error model is simple, it runs stably in all but a few special environments, and its point clouds are easier to process, so it adapts well to dynamically changing environments. Laser SLAM theory is also relatively mature, and the corresponding products are more abundant.
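As a rough illustration of why LiDAR data is convenient to work with, the sketch below turns a fabricated 2D scan into metric (x, y) points and recovers a small translation between two scans. The range values are invented, and a real scan matcher (for example ICP) would iterate correspondence and alignment instead of the single averaging step used here.

```python
import numpy as np

# Convert a fake 2D LiDAR scan (angles + ranges) into metric (x, y) points.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)        # 2D LiDAR field of view
ranges = 2.0 + 0.5 * np.sin(3 * angles)                  # fabricated range readings
scan1 = np.stack([ranges * np.cos(angles), ranges * np.sin(angles)], axis=1)

# Simulate the robot moving, which shifts the same points in the sensor frame.
true_motion = np.array([0.30, 0.10])                      # robot moved 30 cm in x, 10 cm in y
scan2 = scan1 - true_motion + np.random.normal(0, 0.01, scan1.shape)

# With known point correspondences, the translation is just the mean offset.
estimated_motion = (scan1 - scan2).mean(axis=0)
print("estimated motion:", np.round(estimated_motion, 3))  # approximately [0.30, 0.10]
```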
Comparing the two shows that laser SLAM and visual SLAM each have their own strengths and limitations, and fusing them lets each compensate for the other's weaknesses.
For example, vision works stably in texture-rich dynamic environments and can provide very accurate point-cloud matching for laser SLAM, while LiDAR's precise direction and distance information becomes even more powerful on correctly matched point clouds.
In environments with very poor lighting or little texture, the positioning provided by laser SLAM lets the vision system record scenes that carry little information on their own.
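One simple way to picture such fusion is inverse-variance weighting of two independent pose estimates, as in the sketch below. The poses and variances are purely illustrative; a real system would fuse the sensors inside a filter or factor-graph optimizer rather than averaging final poses.

```python
import numpy as np

# Hypothetical pose estimates from the two subsystems and their (made-up) variances.
pose_laser = np.array([1.02, 0.48])    # pose from laser SLAM (accurate range)
var_laser = 0.01
pose_visual = np.array([0.95, 0.55])   # pose from visual SLAM (rich texture, noisier)
var_visual = 0.04

# Inverse-variance weighting: the more certain estimate gets the larger weight.
w_laser = (1 / var_laser) / (1 / var_laser + 1 / var_visual)
fused_pose = w_laser * pose_laser + (1 - w_laser) * pose_visual
print("fused pose:", np.round(fused_pose, 3))
```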
Future Applications
SLAM technology has already been deployed successfully in many fields, including indoor mobile robots, AR/VR, drones, driverless vehicles, and so on.
In the future, continuing improvements in sensor accuracy and gradual reductions in cost will bring revolutionary changes to more industries.
As SLAM technology heats up, more and more talented people will enter the field of mobile robotics, bringing fresh energy, new technical directions, and new research areas.