Recording of our Webinar With Experiment Details
In this webinar, Regulus Cyber researchers shared the findings of the Tesla Model 3 experiment (including never-before-seen footage), explained what happened, and discussed what can be done to prevent such incidents. The webinar is aimed at anyone interested in autonomous vehicles and at stakeholders in driverless technology.
In the first week of June, Regulus Cyber experts test-drove the Tesla Model 3 using Navigate on Autopilot (NOA), an active guidance feature of Tesla's Enhanced Autopilot platform. It is meant to make following a route to a destination easier by suggesting and making lane changes and taking interchange exits, all under driver supervision.
While it initially required drivers to confirm lane changes using the turn signals before the car moved into an adjacent lane, current versions of Navigate on Autopilot allow drivers to waive the confirmation requirement if they choose, meaning the car can activate the turn signal and start turning on its own.
A Tesla Model S was also tested. During the spoofing experiment it showed different results, such as wrong navigation cues, incorrect battery-power warnings (when calculating the remaining distance), and changes to the suspension. However, the spoofing did not affect the actual driving, since that car did not have the NOA feature available.
The Model S test revealed a link between the car's navigation and air suspension systems. As a result, the height of the car changed unexpectedly while moving, because the suspension system "thought" it was driving through various locations during the test: either smooth roadways, where the car is lowered for greater aerodynamics, or "off-road" streets, which cause the car to raise its undercarriage to avoid obstacles on the road.
The Tesla Model 3 was successfully spoofed in several attack scenarios. The Navigate on Autopilot feature is highly dependent on GNSS reliability, and spoofing resulted in multiple high-risk situations for the driver and passengers. Spoofing the Model 3 while Navigate on Autopilot was engaged led to extreme deceleration and acceleration, rapid lane-change suggestions, unnecessary signaling, multiple attempts to exit the highway at incorrect locations, and extreme driving instability. This test demonstrates the crucial dependence of any Level 2+ autonomous navigation on GNSS, and the serious threat spoofing poses to drivers and passengers using these features.
Make: Tesla Motors
Model: Model 3
Software version: v9.0 (2019.16.3.2 a19d0e4)
Equipment used for spoofing and hacking
Jammer – ADALM-PLUTO configurable SDR, manufactured by Analog Devices (~$150).
Spoofer – bladeRF SDR, manufactured by Nuand (~$400), with external PPS sync, connected to a laptop.
*All equipment was purchased online and is easily accessible to anyone.
Important terms regarding Tesla mentioned in this research
Cruise – a mode in which the driver designates a maximum speed and the car maintains it.
Autopilot – builds on cruise; this mode can only be activated when the car's cameras recognize lane markings. While it is engaged, the car takes on three additional tasks: maintaining a safe distance from the car in front, adjusting speed according to road conditions, and keeping to the middle of the lane. The driver has to hold the wheel briefly every few seconds.
Navigate on Autopilot (NOA) – Tesla's semi-autonomous mode. It can only be activated when the car is driving on a road with at least two lanes in each direction and has a set destination. This mode includes all the features of cruise and Autopilot, plus two additional tasks: changing lanes to maintain maximum speed and pass slower vehicles (the car does require the driver to confirm the lane change with the turn signal), and autonomously exiting highways at the relevant interchange. The exit maneuver engages on its own and does not require driver confirmation: the car automatically activates the turn signal, changes lanes, and physically turns off the highway into the exit, continuing up to 250 m before requiring the driver to regain control.
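The mode hierarchy described above can be summarized in a small Python sketch. This is purely an illustrative paraphrase of the behavior as described in this research, not Tesla's actual implementation; all names and conditions are our own:

```python
# Toy model of the Tesla driving-mode hierarchy described above.
# Illustrative only; names and conditions are ours, not Tesla's code.

def available_modes(lane_markings_visible: bool,
                    divided_highway: bool,
                    destination_set: bool) -> list:
    """Return which driver-assistance modes can currently be engaged."""
    modes = ["cruise"]  # speed keeping is always available
    if lane_markings_visible:
        # Autopilot adds distance keeping, speed adaptation, lane centering
        modes.append("autopilot")
        if divided_highway and destination_set:
            # NOA adds autonomous lane changes and interchange exits
            modes.append("noa")
    return modes
```

For example, with lane markings visible on a divided highway and a destination set, all three modes are available; without visible lane markings, only cruise can be engaged.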
Autonomous Vehicle Spoofing Experiments
During the Tesla Model 3 experiment, the spoofing antenna was mounted on the roof. This was done to simulate an external attack and to see whether the car is capable of isolating itself from the spoofing; this is the typical scenario in which an external attacker would try to influence the car. It also prevented the spoofing from affecting any nearby cars or other GNSS receivers.
A spoofer can easily use an off-the-shelf high-gain directional antenna to reach a range of up to a mile, and with an added amplifier a range of a few miles is very much possible. Spoofing has already been demonstrated across dozens of miles, for example in the Black Sea spoofing attack of June 2017.
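A rough free-space link budget makes the range claim plausible. The sketch below is our own back-of-the-envelope calculation (not from the Regulus experiment); the transmit power and antenna gain are assumed values for a small SDR with a directional antenna, compared against the roughly -128.5 dBm level of genuine GPS L1 signals at the Earth's surface:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

L1_MHZ = 1575.42       # GPS L1 carrier frequency
GENUINE_DBM = -128.5   # approximate received power of a real GPS L1 signal

def spoofer_rx_dbm(tx_dbm: float, ant_gain_dbi: float,
                   distance_km: float) -> float:
    """Spoofing power arriving at the target, ignoring cable losses."""
    return tx_dbm + ant_gain_dbi - fspl_db(distance_km, L1_MHZ)

# An assumed 10 dBm SDR with a 10 dBi directional antenna at ~1 mile:
margin = spoofer_rx_dbm(10, 10, 1.6) - GENUINE_DBM  # dB above real signals
```

Even this modest setup arrives tens of dB stronger than the authentic satellite signals, which is why adding an amplifier extends the usable range to several miles.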
Regulus Cyber initially discovered the Tesla vulnerability during its ongoing study of the threat that easily accessible spoofing technology poses to GNSS (global navigation satellite systems, also known as GPS) receivers. The Regulus Cyber researchers found that spoofing attacks on the Tesla GNSS (GPS) receiver could easily be carried out wirelessly and remotely.
Tesla emphasizes that “in both of these scenarios until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain ready to take manual control of their car at all times.”
It appears the Tesla Navigate On Autopilot (NOA) has no reliance on GPS for the actual physical driving decisions. It relies on its own visual sensors, just like a human driver.
There is one exception. The feature "Navigate on Autopilot" (NOA) uses GPS and Google map data, because the point of that feature is to follow a route. The car uses GPS and map data only to determine what lane it should be in and what exits to take; actual control of the car is still the job of the onboard sensors. That means spoofing is essentially manipulating the car's autonomous turning decisions, so an attacker can remotely cause the car to turn while NOA is engaged.
Sensor fusion combined the GNSS data with the camera data, producing the mistaken turn off the highway.
Tesla's computer uses the GPS position to understand where it is, the cameras to identify lanes and exits, and the radar to avoid collisions and keep its distance from other cars. Tesla does not use LiDAR, and even if it did, it wouldn't matter: the experiment shows that relying on GNSS for autonomous navigation makes spoofing a wireless threat that can manipulate the car's path.
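The division of labor just described, with route-level decisions driven by GNSS and maneuver execution driven by the cameras, can be sketched as a toy model. This is our simplification for illustration, not Tesla's actual fusion logic, and the 150 m trigger distance is an assumption based on the spoofed position used in the test:

```python
# Toy illustration of the split described above (our simplification):
# route progress comes from GNSS + map, lane execution from the camera.

EXIT_TRIGGER_M = 150  # assumed distance at which NOA begins an exit maneuver

def should_begin_exit(gnss_dist_to_exit_m: float) -> bool:
    """Route-level decision, driven purely by the GNSS/map position."""
    return gnss_dist_to_exit_m <= EXIT_TRIGGER_M

def steer(camera_sees_exit_lane: bool, begin_exit: bool) -> str:
    """Vehicle-level execution: the camera finds a lane to carry it out."""
    if begin_exit and camera_sees_exit_lane:
        return "signal right and leave the highway"
    return "keep lane"

# True position: 2500 m from the exit -> no maneuver.
# Spoofed position: 150 m before the exit -> the camera-visible
# pit-stop turnoff is treated as the exit lane and the car turns.
```

The point of the sketch is that the cameras never need to be fooled: feeding a false GNSS position is enough to flip the route-level decision, and the vision system then faithfully executes the wrong maneuver.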
Tesla Model 3 Experiment Description
The test was designed to reveal how the semi-autonomous Model 3 would react to a spoofing attack. The Regulus Cyber test began with the car driving normally with Navigate on Autopilot (NOA) activated, maintaining a constant speed and a position in the middle of the lane.
The test started with normal driving, Navigate on Autopilot engaged, on a main highway at 95 km/h. The navigation destination was a nearby town, requiring the car to autonomously exit at an interchange 2.5 km ahead.
Using a small roof-mounted antenna with a range of about 1 meter (3 feet), the researchers transmitted fake satellite coordinates that were picked up by the Model 3's receiver. The coordinates corresponded to a location on the highway 150 meters before the exit. At the exact moment the car was spoofed to the new location, it was passing a dotted white line on its right-hand side, marking a small road into an emergency pit stop.
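Choosing the fake coordinates is simple geodesy: take the exit's position and step back 150 meters along the road heading. A minimal sketch using the standard great-circle destination-point formula (the coordinates and bearing below are hypothetical placeholders, not the actual test-site values):

```python
import math

EARTH_R = 6371000.0  # mean Earth radius, meters

def offset_position(lat_deg, lon_deg, bearing_deg, dist_m):
    """Great-circle destination point: move dist_m along bearing_deg."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    d = dist_m / EARTH_R  # angular distance
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Hypothetical exit at 32.0800 N, 34.7800 E on a road heading east (90 deg):
# the spoofed fix is 150 m back along the road, i.e. on bearing 270.
spoof_lat, spoof_lon = offset_position(32.0800, 34.7800, 270.0, 150.0)
```

The resulting point sits 150 m west of the placeholder exit, which is all the spoofer needs to encode into the counterfeit satellite signals.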
Although the car was still 2.5 km from the planned exit when the spoofing attack began, it reacted as if the exit were just 150 meters (500 feet) away: slowing abruptly from 96 km/h (60 MPH) to 24 km/h, activating the right turn signal, and making a right turn off the main road into the emergency pit stop. During the sudden turn the driver's hands were in his lap, since he was not prepared for the turn to happen so quickly; by the time he grabbed the wheel and regained manual control, it was too late to maneuver back to the highway safely.
Yoav Zangvil, Regulus Cyber CTO and co-founder, explains that GNSS spoofing is a growing threat to ADAS and autonomous vehicles. “Until now, awareness of cybersecurity issues with GNSS and sensors has been limited in the automotive industry. But as dependency on GNSS is on the rise, there’s a real need to bridge the gap between its tremendous inherent benefits and its potential hazards. It’s crucial today for the automotive industry to adopt a proactive approach towards cybersecurity.”
The Regulus Cyber testing is designed to assess the impact of spoofing with low-cost, open-source hardware and software, the same kind of technology that is accessible to anyone via e-commerce websites and open-source projects online. It is the very same hardware teenagers use to cheat at Pokémon GO and Uber drivers use to fake their commutes. This dangerous technology is everywhere.
Taking control of Tesla’s GPS with off-the-shelf tools took less than one minute. The researchers were able to remotely affect various aspects of the driving experience, including navigation, mapping, power calculations, and the suspension system. Under attack, the GNSS system displayed incorrect positions on the maps, making it impossible to plot an accurate route to the destination.
Prior to the Model 3 road test, Regulus Cyber provided its Model S research results to the Tesla Vulnerability Reporting Team, which responded with the following points at that time:
“Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn’t demonstrate any Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.
The effect of GPS spoofing on Tesla cars is minimal and does not pose a safety risk, given that it would at most slightly raise or lower the vehicle’s air suspension system, which is not unsafe to do during regular driving or potentially route a driver to an incorrect location during manual driving.
While these researchers did not test the effects of GPS spoofing when Autopilot or Navigate on Autopilot was in use, we know that drivers using those features must still be responsible for the car at all times and can easily override Autopilot and Navigate on Autopilot at any time by using the steering wheel or brakes, and should always be prepared to do so.”
“This is a distressing answer by a car manufacturer that is the self-proclaimed leader in the autonomous vehicle race,” Zangvil comments. “As drivers and safety/security experts, we’re not comforted by vague hints towards future safeguards and statements that dismiss the threats of GPS attacks.” He offers the following counterpoints in response:
Attacks against any GPS system are indeed considered a crime because their effects are dangerous, as we’ve shown, yet the same devices we used to simulate the attacks are legally accessible to any person, online via e-commerce sites. Taking steps to “introduce safeguards for the future” indicates that spoofing is, in fact, a major issue for Tesla, which relies heavily on GNSS.
In the case of cars, a spoofing attack is confusing in the best case, and a threat to safety in more severe scenarios. The more GPS data is leveraged in automated driver assistance systems, the stronger and more unpredictable the effects of spoofing become. The fact that spoofing causes unforeseen results like unintentional acceleration and deceleration, as we've shown, clearly demonstrates that GNSS spoofing raises a safety issue that must be addressed.
In addition, the spoofing attack made the car engage in a physical maneuver off the road, providing a dire glimpse into the troubled future of autonomous cars that would have to rely on unsecure GNSS for navigation and decision-making.
Given that the public's trust still has to be earned as the automotive industry moves toward autonomy, the leading players are accountable for responsible deployment of new technology.
As Tesla clearly stated, drivers are responsible for overriding Autopilot under a spoofing attack, so it appears its Autopilot system can't be trusted to function safely under such an attack. Because every GNSS/GPS broadcast system can be affected by GNSS/GPS spoofing, the issue is everyone's problem and shouldn't be ignored; furthermore, governments and regulators that have a mandate to protect the public's safety must engage in proactive measures to ensure only safe GNSS receivers are used in cars.
“According to Tesla, they’ll soon be releasing completely autonomous cars utilizing GNSS, which means that, in theory, an attacker could remotely control the car’s route planning and navigation,” Zangvil says. “We’re obligated to ask what steps they’re taking to address this threat, and whether new safeguards will be implemented in its next generation of entirely autonomous cars.”
Although Regulus Cyber researchers tested only the Model S and Model 3, they concluded that the "disturbing vulnerability" of Tesla's GNSS system is most likely an industry-wide issue, since "the most common GNSS chipsets used are all vulnerable to our testing; you can learn more about it in the Regulus Cyber resiliency report."
"Just a few months ago we saw that during a spoofing incident at a car show in Geneva, seven different car manufacturers complained that their cars were being spoofed. This incident proves that many other automotive companies working on the next generation of autonomous cars are also vulnerable to these attacks. As an industry, to win public trust and succeed, every car manufacturer should be proactive and prepare against these threats," Zangvil says.
Yonatan Zur, Regulus Cyber CEO and co-founder, emphasized that this goes way beyond Regulus Cyber and Tesla: "We designed a product to protect vehicles from GNSS spoofing because we believe it is a real threat. We have ongoing research regarding this threat, as we believe it's an issue that needs solving. These new semi-autonomous features offered on new cars place drivers at risk and provide us with a dangerous glimpse of our future as passengers in driverless cars. By reporting and sharing incidents such as this, we can ensure that autonomous technology will be safe and trustworthy."
Outdoor Spoofing Video By Regulus Cyber Researchers
TOWARDS NAVIGATION SAFETY FOR AUTONOMOUS CARS
Article by InsideGNSS – https://insidegnss.com/towards-navigation-safety-for-autonomous-cars/