Autonomous Vehicle Technology: Everywhere, in Every Way, for Everyone

May 30, 2022

To celebrate National Autonomous Vehicle Day, we delve into the scalable approach behind the various aspects of our self-driving technologies.

At Mobileye, we’re on a mission to bring the benefits of self-driving technology everywhere, in every way, for everyone.

This is not just a marketing slogan for us here at Mobileye. It’s the central philosophy that guides us in developing our technologies to scale: to locations around the world, in a variety of applications, and for the mass market.

To celebrate National Autonomous Vehicle Day in the United States, we’re pleased to expand on what we mean by “everywhere, in every way, for everyone.” Join us for a glimpse behind the curtain at our scalable-by-design approach to self-driving technologies.

Testing Around the World

To pave the way towards the rapidly approaching future of autonomous vehicles, Mobileye has spent the past several years testing our AVs not in a single, geo-fenced environment, but in real-world conditions, across a variety of locations around the globe.

To date, our AV test fleet has tackled roads in Israel, Germany, France, the United States, Japan, and China. Each new location poses fresh challenges for our technology to overcome and new environments to put our tech to the test, whether on rural, urban, suburban, or interurban roadways. And we don’t steer clear of the toughest city-center conditions in places like Manhattan, Paris, and Jerusalem.

Road Experience Management™

The breadth and scope of these locations are enabled by underlying technologies designed for global scalability and adaptability. Our Road Experience Management (REM™) mapping system, for example, crowdsources data from the cameras of some 1.5 million REM-enabled vehicles already on the road, all equipped with our computer-vision technology. We’ve found this combination of cost-effective cameras and the power of the crowd to be far more efficient and scalable than the typical method of scanning by dedicated LiDAR mapping vehicles.

REM compiles this data into the Mobileye Roadbook™, our highly precise AV map of the driving environment worldwide. REM has already mapped billions of kilometers of roadway around the world, and is currently mapping new roads (and continuously updating the existing map) at a rate of millions of kilometers every day.

When our AVs reach a new location that hasn’t been added to the Mobileye Roadbook yet, all we need to do is push a proverbial button to compile the maps from data we have already collected, and our autonomous vehicles are good to go.
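To illustrate the crowdsourcing idea in miniature: many vehicles each report noisy sightings of the same landmarks, and repeated observations are aggregated into a single, more precise map entry. The sketch below is purely illustrative (the function and field names are ours, not REM’s actual format or algorithm), but it shows why a large fleet of inexpensive cameras can out-map a handful of survey vehicles: noise averages out across observations.

```python
from collections import defaultdict
from statistics import mean

def compile_roadbook(observations):
    """Toy crowd-sourced map aggregation (illustrative only, not REM's format).

    Each vehicle reports (landmark_id, x, y) sightings; repeated sightings of
    the same landmark are averaged into one consensus map entry.
    """
    clusters = defaultdict(list)
    for landmark_id, x, y in observations:
        clusters[landmark_id].append((x, y))
    # Average all sightings of each landmark into a single position.
    return {lid: (mean(p[0] for p in pts), mean(p[1] for p in pts))
            for lid, pts in clusters.items()}
```

With two slightly-off sightings of the same road sign, the compiled entry lands between them; the more vehicles that drive past, the tighter the estimate gets.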

Responsibility-Sensitive Safety

Our Responsibility-Sensitive Safety model (RSS) is similarly developed with scalability at its core. RSS is our open mathematical model for AV safety, designed to engender public trust in self-driving technologies.

RSS consists of five universal “rules of the road” through which all of the AV’s decisions are filtered. Because these rules are transparent and independently verifiable, RSS is designed to be integrated into a broad array of standards and regulations across industry and government. And its parameters can be adjusted to fit local driving cultures in different parts of the world.
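Because RSS is an open, published model, its best-known rule can be written down directly: the minimum safe longitudinal following distance, which depends on both vehicles’ speeds, the rear vehicle’s response time, and assumed worst-case acceleration and braking bounds. The sketch below follows the formula from the published RSS paper; the parameter names are illustrative, and tuning values like the response time and braking bounds is exactly how the model adapts to local driving cultures.

```python
def rss_safe_longitudinal_distance(v_rear, v_front, rho,
                                   a_max_accel, a_min_brake, a_max_brake):
    """Minimum safe following distance per the published RSS formulation.

    v_rear, v_front : speeds (m/s) of the rear and front vehicles
    rho             : response time (s) of the rear vehicle
    a_max_accel     : max acceleration the rear car might apply during rho
    a_min_brake     : min braking the rear car is guaranteed to apply after rho
    a_max_brake     : max braking the front car might suddenly apply
    """
    # Worst case: the rear car accelerates for the full response time...
    v_after_rho = v_rear + rho * a_max_accel
    # ...then brakes gently, while the front car brakes as hard as possible.
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_after_rho ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)  # the formula is clamped at zero
```

A stricter (larger) `rho` or a weaker assumed `a_min_brake` yields a larger safe gap, which is how the same transparent rule can be parameterized differently for, say, Jerusalem versus Munich traffic.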

Different Solutions for Different Applications

The product of all our research and development of self-driving technologies takes many forms.

Mobileye Chauffeur™, for example, is designed for consumer autonomous vehicles – like the one we’re currently developing with Zeekr.

Meanwhile, Mobileye Drive™ is designed to enable autonomous commercial vehicles – such as robotaxis, self-driving shuttles, and autonomous delivery platforms. It’s already being integrated by a range of customers and partners including Udelv, Transdev, Lohr, Benteler, Beep, Schaeffler, Moovit, and Sixt.

Both of these turnkey self-driving solutions incorporate passive (cameras) and active (radar and LiDAR) sensors, developed and operating independently of each other, under an approach we call True Redundancy™. This method tasks each of the two parallel subsystems with creating complete and independent models of the environment on which the driving policy can then base its decisions. Compared to the typical industry approach of sensor fusion (which relies on one combined model of the driving environment), this approach results in a more robust sensing system and creates an additional failsafe.
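The structural difference from sensor fusion can be sketched in a few lines. In the hypothetical pattern below (all names are ours, not Mobileye’s actual interfaces), each subsystem delivers a complete world model, the driving policy is evaluated against each model separately, and the more conservative decision wins; if one subsystem drops out, the other still carries the full task — the failsafe the paragraph above describes. A fusion-style design would instead merge the two models first and plan once.

```python
def plan_with_true_redundancy(camera_model, radar_lidar_model, policy):
    """Illustrative True Redundancy pattern (hypothetical interface).

    Each subsystem independently builds a *complete* environment model.
    The policy runs on every available model, and the most conservative
    decision (here, the lowest target speed) is the one executed.
    """
    models = [m for m in (camera_model, radar_lidar_model) if m is not None]
    if not models:
        raise RuntimeError("both sensing subsystems unavailable")
    # Evaluate the driving policy independently per subsystem model.
    decisions = [policy(m) for m in models]
    return min(decisions)  # take the more cautious of the two plans
```

Usage with a toy policy that reads a speed limit out of each model: if the cameras say 15 m/s is safe but radar/LiDAR says 12 m/s, the vehicle drives at 12; if the radar/LiDAR model is unavailable, the camera-only model alone still produces a valid plan.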

An added benefit of the True Redundancy approach is that we’re able to take the camera-only subsystem from our developmental AVs and adapt it into Mobileye SuperVision™. This hands-free/eyes-on premium driver-assistance system is already in production in the Zeekr 001.

The EyeQ® Family of SoCs

All of these solutions and more are built around EyeQ, our family of automotive-grade Systems-on-Chip.

Now in its sixth generation, the EyeQ family has been incorporated into more than 100 million vehicles to date. Each successive iteration is based on the same core expertise and architecture, and builds upon the accumulated experience of the generations that have come before.

EyeQ is a highly efficient, extensively proven, and broadly scalable family of SoCs that’s trusted by customers around the world to handle a broad spectrum of advanced mobility applications. It’s the brain behind everything Mobileye does, from driver-assistance and autonomous-driving systems to retrofit collision-avoidance devices and data services for transportation infrastructure and smart city planning.

Self-Driving Mobility for the Masses

From the dawn of the automobile more than a century ago through to the present day, the use of a private automobile has generally been available only to those with the means to buy one and the ability (and license) to drive one. By removing the driver from the equation, however, self-driving Mobility-as-a-Service stands to open up private mobility, at a more attainable cost, to a far wider user base – including children, the elderly, and individuals with physical and mental disabilities.

Mobileye is working with an array of partners to bring self-driving mobility services to locations around the world. We’ve already run our first self-driving MaaS pilot project in France, and have broader services due to commence in both Germany and Israel before the end of this year – with additional projects already in varying states of progress with partners worldwide.

After years of development and decades of leadership, we’ve come to embrace the power of technology and the promise of the autonomous future to transform the way people and goods get around. And we’re working to make that long-held dream a reality – not just somewhere, in some ways, for some people, but everywhere, in every way, for everyone.