Automotive - Berkeley, California, United States
Self-driving has been an aspiration of mankind for many years, and recent advancements in this space have been tremendous. LiDAR and camera sensors have gained unprecedented momentum, but they are limited and cannot perform well in poor weather and lighting (e.g., fog, rain, snow, sunset). Radar sensors, by contrast, are generally insensitive to weather and lighting conditions. Existing radar sensors, however, are crude, limited by the physics of their signal purity, and therefore not accurate enough to bridge the gap left open by LiDAR and camera. We are developing a substantially more accurate radar sensor by synthesizing a radar signal that is 100x purer, at a cost that is 100x less than that of a LiDAR.

Lives are at stake on the roads, and car technology still relies on human drivers, who are inherently unsafe. With our 100x purer signal, we provide location information of unprecedented accuracy, transforming radar data from crude to sharp. We enable cars to "see" their environment accurately under all driving conditions, turning self-driving into safe-driving.

LiDAR and camera are limited to visual information. Our radar exploits electromagnetic information that is orthogonal to visual data, and data orthogonality is critical for sensor fusion and safety. This new type of data is uncharted territory for AI/ML/DL/NN, which we harvest and utilize. One example is the non-visual detection of pedestrians. The same data can also be used to determine material composition, to classify and recognize objects, and to build electromagnetic maps.

We accelerate the path to Level 5 autonomous vehicles by providing crucial information needed for a well-rounded sensor fusion technology, in all weather conditions.