Automotive Aftermarket Technology


Autonomous vehicles aided by AEye's next generation of artificial perception

Tuesday, May 22, 2018 - 07:00

PLEASANTON, Calif., May 22, 2018 /PRNewswire/ -- AEye, a leader in artificial perception systems, introduces a new sensor data type called Dynamic Vixels™, which are designed to more intelligently acquire and adapt data for the company's iDAR™ (Intelligent Detection and Ranging) perception system. This advancement in AEye technology further strengthens its biomimicry approach to visual perception, essentially enabling vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions.

In simple terms, Dynamic Vixels combine pixels from digital 2D cameras with voxels from AEye's Agile 3D LiDAR (Light Detection and Ranging) sensor into a single super-resolution sensor data type. For the first time, all the data captured in pixels and voxels is integrated in real time into a data type that can be dynamically controlled and optimized by artificial perception systems at the point of data acquisition. Dynamic Vixels create content that inherits both the ability to evaluate a scene using the entire existing library of 2D computer vision algorithms and the ability to capture 3D and 4D data concerning not only location and intensity but also deeper insights such as the velocity of objects.
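AEye has not published the actual format of a Dynamic Vixel, but conceptually a fused camera-plus-LiDAR record can be pictured as a single data type carrying 2D appearance alongside 3D position, intensity, and velocity. The following sketch is purely illustrative – every field name here is an assumption for explanatory purposes, not AEye's data structure:

```python
from dataclasses import dataclass

@dataclass
class FusedVixel:
    """Illustrative pixel + voxel record (hypothetical, not AEye's format)."""
    # 2D camera contribution: color, usable by standard computer vision algorithms
    r: int
    g: int
    b: int
    # 3D LiDAR contribution: position in meters and return intensity
    x: float
    y: float
    z: float
    intensity: float
    # "4D" contribution: radial velocity of the surface, in m/s
    velocity: float

# Example: one camera pixel fused with its corresponding LiDAR return –
# a dark object (low color values, low intensity) approaching at ~50 km/h
sample = FusedVixel(r=28, g=28, b=30,
                    x=1.2, y=-0.4, z=42.0,
                    intensity=0.1, velocity=-13.9)
print(sample.velocity)
```

The point of fusing at acquisition time, per the release, is that both kinds of information are available in one record from the start, rather than being registered and aligned in post-processing.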

"There is an ongoing argument about whether camera-based vision systems or LiDAR-based sensor systems are better," said Luis Dussan, Founder and CEO of AEye. "Our answer is that both are required – they complement each other and provide a more complete sensor array for artificial perception systems. We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing. The difference is significantly better performance."

AEye's iDAR perception system mimics how a human's visual cortex evaluates a scene and calculates potential driving hazards. Using embedded artificial intelligence within a distributed architecture, iDAR employs Dynamic Vixels to critically and actively assess general surroundings to maintain situational awareness, while simultaneously tracking targets and objects of interest. As a core data element for a scalable, integrated system, Dynamic Vixels enable iDAR to act reflexively to deliver more accurate, longer range and more intelligent information faster.   


"One nice consequence that comes out of the architecture is that we give our customers the ability to add the equivalent of 'human reflexes' to their sensor stack," said Dussan.

Dynamic Vixels can also be encrypted. This patented technology enables each sensor pulse to deal appropriately with challenging issues such as interference, spoofing, and jamming – issues that will become increasingly important as millions of units are deployed worldwide.

Simply put, this new way of collecting and inspecting data using the iDAR system's at-the-edge processing enables the autonomous vehicle to more intelligently assess and respond to situational changes within a frame, thereby increasing the safety and efficiency of the overall system. For example, iDAR can identify objects with minimal structure, such as a bike, and differentiate objects of the same color, such as a black tire on asphalt. In addition, Dynamic Vixels can leverage the unique capabilities of agile LiDAR to detect changing weather and automatically increase power during fog, rain, or snow.

Likewise, iDAR's heightened sensory perception allows autonomous vehicles to detect contextual changes – for example, the direction a child is facing can be used to calculate the probability of the child stepping into the street, enabling the car to prepare for a sudden stop.

"There are three best practices we have adopted at AEye," said Blair LaCorte, Chief of Staff. "First: never miss anything; second: not all objects are equal; and third: speed matters. Dynamic Vixels enable iDAR to acquire a target faster, assess a target more accurately and completely, and track a target more efficiently – at ranges greater than 230m with 10% reflectivity."

The iDAR perception system includes inventions covered by recently awarded foundational patents, including 71 intellectual property claims on the definition, data structure and evaluation methods of Dynamic Vixels. These patented inventions deliver significant performance benefits, including 16x greater coverage, a 10x faster frame rate, and 7-10x more relevant information that boosts object classification accuracy, all while using 8-10x less power.

AEye's first iDAR-based product, the AE100 artificial perception system, will be available this summer to OEMs and Tier 1s launching autonomous vehicle initiatives.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a new form of intelligent data collection that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area and backed by world-renowned investors, including Kleiner Perkins Caufield & Byers, Airbus Ventures and Intel Capital. For more information, please visit www.aeye.ai.
