AEye has introduced a groundbreaking technology that delivers considerable advances in perception and motion planning for autonomous vehicles. Termed "iDAR" (Intelligent Detection and Ranging), this new form of intelligent data collection combines the world’s first agile MOEMS LiDAR, pre-fused with a low-light camera and embedded artificial intelligence to create software-definable and extensible hardware that can dynamically adapt to real-time demands. The company has also made available its iDAR Development Partner Program for OEMs, Tier 1s and universities interested in integrating iDAR into their vehicles.
iDAR's intelligent sensing directly addresses shortcomings of first-generation spinning or raster-scanning LiDAR technologies, which silo sensors and rely on rigid, asymmetrical data collection that either over-samples or under-samples information. This dynamic exposes an inherent trade-off between density and latency in legacy sensors, which restricts or eliminates the ability to do intelligent sensing. For example, while traditional 64-line systems can hit an object once per frame (every 100ms or so), AEye's intelligent sensing enables the system to selectively revisit any chosen object twice within 30 microseconds. This embedded intelligence optimizes data collection, enabling the system to transfer less data while delivering better-quality, more relevant content.
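The density-versus-latency trade-off described above can be illustrated with a toy shot scheduler. This is a hypothetical sketch, not AEye's implementation: a fixed raster scan spends its shot budget uniformly, sampling every cell at most once per frame, while an agile scanner can reallocate shots to revisit a region of interest within the same frame.

```python
# Hypothetical illustration of the density/latency trade-off.
# A fixed raster samples uniformly; an agile scanner spends extra
# shots revisiting a region of interest (ROI) inside the same frame.

def raster_schedule(grid_cells, budget):
    """Visit cells round-robin; each cell is sampled at most once per pass."""
    return [grid_cells[i % len(grid_cells)] for i in range(budget)]

def agile_schedule(grid_cells, roi_cells, budget, roi_weight=3):
    """Revisit ROI cells roi_weight times as often as background cells."""
    schedule, i = [], 0
    while len(schedule) < budget:
        cell = grid_cells[i % len(grid_cells)]
        schedule.append(cell)
        if cell in roi_cells:
            # Immediately revisit the interesting cell (intra-frame revisit).
            schedule.extend([cell] * min(roi_weight - 1, budget - len(schedule)))
        i += 1
    return schedule

cells = list(range(8))
print(raster_schedule(cells, 12).count(3))      # 2 samples of cell 3
print(agile_schedule(cells, {3}, 12).count(3))  # 3 samples of cell 3
```

The point of the sketch is only that, for the same shot budget, an agile scanner trades uniform coverage for extra revisits where they matter.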
iDAR is designed to intelligently prioritize and interrogate co-located pixels (2D) and voxels (3D) within a frame, enabling the system to target and identify objects within a scene 10-20x more effectively than LiDAR-only products. Additionally, iDAR is capable of overlaying 2D images on 3D point clouds for the creation of True Color LiDAR. Its embedded AI capabilities enable iDAR to utilize thousands of existing and custom computer vision algorithms, which add intelligence that can be leveraged by path planning software.
Three core features of iDAR underpin its ability to emulate human vision:
- Agile LiDAR - iDAR uses a distributed architecture and at-the-edge processing to dynamically track targets and objects of interest, while always critically assessing general surroundings. This enables path-planning software to address regions and objects of interest, or to apply differentiated focus on select objects or obstacles. By adding intelligence to the sensor layer, objects of interest can be identified and tracked with minimal computational latency. For example, the system can identify objects and then revisit them even within the same frame, giving the perception and path-planning layers the ability to make calculations such as multi-directional velocity and acceleration vectors simultaneously. This allows for faster and better prediction of behavior and intent.
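The intra-frame revisit described above yields two range measurements close together in time, from which a radial velocity (and, with a further sample, an acceleration) can be estimated by finite differences. A minimal sketch with made-up range values, not AEye's actual timing or algorithm:

```python
# Hypothetical sketch: estimating radial velocity from two revisits of the
# same object inside one frame. Range values below are illustrative only.

def radial_velocity(r1_m, r2_m, dt_s):
    """Finite-difference radial velocity from two range samples."""
    return (r2_m - r1_m) / dt_s

def radial_acceleration(v1_mps, v2_mps, dt_s):
    """Finite-difference acceleration from two velocity estimates."""
    return (v2_mps - v1_mps) / dt_s

# Object revisited 30 microseconds apart (the figure quoted in the text).
r1, r2 = 50.0000, 49.9991   # metres (invented values)
v = radial_velocity(r1, r2, 30e-6)
print(round(v, 1))  # -30.0 m/s, i.e. closing at roughly 108 km/h
```

Two samples 30 microseconds apart would make such estimates available well before the next full frame, which is the latency advantage the passage claims.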
- True Color LiDAR - iDAR’s True Color LiDAR instantaneously overlays 2D real-world color on 3D data, adding computer-vision intelligence to 3D point clouds. Traditional LiDAR-based systems have to post-process, with inherent delays and computational drain due to registration and alignment challenges. True Color LiDAR’s unique fusion enables absolute color and distance segmentation, and co-location without registration processing, incurring almost no computational penalty. This enables much greater accuracy and speed in interpreting signage, emergency warning lights, brake versus reverse lights, and other scenarios that have historically been tricky for legacy LiDAR-based systems to navigate. In addition, this approach dramatically expands the ability to do “enhanced training” for autonomous vehicles.
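For contrast, colorizing a point cloud conventionally means projecting each 3D point through a calibrated camera model and looking up its pixel, the registration step the passage says pre-fused sensors avoid. A minimal pinhole-projection sketch of that conventional post-processing approach (the intrinsics are made-up values):

```python
import numpy as np

# Conventional post-hoc colorization: project LiDAR points into the camera
# image and look up per-point color. This is the registration work that a
# pre-fused (boresighted) sensor pair would not need to do at runtime.

def colorize(points_xyz, image, K):
    """points_xyz: (N,3) in camera frame; image: (H,W,3); K: 3x3 intrinsics."""
    uvw = points_xyz @ K.T                          # [u*z, v*z, z] per point
    uv = (uvw[:, :2] / uvw[:, 2:3]).round().astype(int)
    h, w = image.shape[:2]
    valid = (points_xyz[:, 2] > 0) & (uv[:, 0] >= 0) & (uv[:, 0] < w) \
            & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors = np.zeros((len(points_xyz), 3), dtype=image.dtype)
    colors[valid] = image[uv[valid, 1], uv[valid, 0]]
    return colors  # per-point RGB; zeros for points outside the image

# Illustrative intrinsics and a tiny synthetic gray image.
K = np.array([[100.0, 0, 32], [0, 100.0, 24], [0, 0, 1]])
img = np.full((48, 64, 3), 200, dtype=np.uint8)
pts = np.array([[0.0, 0.0, 5.0], [10.0, 0.0, 5.0]])  # second point projects outside
print(colorize(pts, img, K))
```

Even this simplified version shows the per-point projection, bounds checking, and lookup that accumulate as computational overhead in post-processed fusion.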
- Software-definable and Extensible LiDAR - iDAR's software-definable hardware adds three feedback loops that do not exist today: one at the sensor layer, one at the perception layer, and another with path planning software. By enabling customizable data collection in real-time, the system is able to adapt to the environment and dynamically change performance based on the customer’s/host’s applications and needs. In addition, it can emulate legacy systems, define regions of interest, focus on threat detection, and/or be programmed for variable environments, such as highway or city driving. This configurability leads to optimized data collection, reduced bandwidth, improved vision perception and intelligence, and faster motion planning for autonomous vehicles.
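The software-definable behavior described above amounts to reconfiguring scan parameters at runtime in response to feedback from perception and path planning. A hypothetical configuration sketch, with all field names and values invented for illustration (this is not AEye's API):

```python
# Hypothetical sketch of software-definable scan modes. Every field name
# and number here is invented for illustration, not AEye's actual API.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanMode:
    frame_rate_hz: float   # how often the full scene is refreshed
    roi_revisit_hz: float  # extra revisit rate inside regions of interest
    max_range_m: float     # ranging depth this mode prioritizes
    roi_deg: tuple         # (az_min, az_max, el_min, el_max) region of interest

HIGHWAY = ScanMode(frame_rate_hz=10, roi_revisit_hz=200, max_range_m=300,
                   roi_deg=(-10, 10, -2, 2))   # narrow, long-range cone ahead
CITY = ScanMode(frame_rate_hz=20, roi_revisit_hz=100, max_range_m=120,
                roi_deg=(-60, 60, -10, 10))    # wide field, shorter range

def select_mode(speed_mps):
    """Feedback from path planning: choose a scan mode from vehicle speed."""
    return HIGHWAY if speed_mps > 20 else CITY

print(select_mode(30).max_range_m)  # 300
```

The feedback loops the passage describes would close such a selection continuously, with the sensor, perception, and path-planning layers each able to request a reconfiguration.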
AEye will demonstrate iDAR and announce its automotive product suite at CES 2018 in Las Vegas from January 9-12. To make an appointment for a live demonstration or to learn more about participating in AEye’s iDAR Development Partner Program, contact firstname.lastname@example.org or visit AEye at CES at Booth 2506.