Cybersecurity Firm Claims That Tesla Cars Are Vulnerable to GPS Spoofing Attacks
The Israeli firm Regulus Cyber claims that spoofing attacks on the Tesla GNSS (GPS) receiver can be carried out wirelessly and remotely, and that its researchers managed to interfere with a Model 3's Autopilot system.
According to the firm, during a test drive using Tesla's Navigate on Autopilot feature, "a staged attack caused the car to suddenly slow down and unexpectedly veer off the main road."
The Regulus Cyber researchers say such attacks could "easily be carried out wirelessly and remotely," exploiting security vulnerabilities in mission-critical telematics, sensor fusion, and navigation capabilities.
Navigate on Autopilot, part of Tesla's Enhanced Autopilot platform, is meant to make following a route to a destination easier by suggesting and making lane changes and taking interchange exits, all under driver supervision. While the feature initially required drivers to confirm lane changes with the turn signal before the car moved into an adjacent lane, current versions let drivers waive that confirmation, meaning the car can activate the turn signal and change lanes on its own.
The Regulus Cyber test began with the car driving normally, with Navigate on Autopilot engaged and maintaining a constant speed and position in the middle of the lane. "Although the car was three miles away from the planned exit when the spoofing attack began, the car reacted as if the exit was just 500 feet away—abruptly slowing down, activating the right turn signal, and making a sharp turn off the main road. The driver immediately took manual control but couldn't stop the car from leaving the road," the company says.
The testing also revealed a link between the car's navigation and air suspension systems. The car's ride height changed unexpectedly while it was moving: because the suspension system "thought" the car was passing through different locations, it lowered the body on what it took to be smooth roadways, for better aerodynamics, and raised the undercarriage on supposedly rough or off-road sections to clear obstacles.
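Tesla has not published how navigation and suspension are coupled, but the behavior Regulus describes is consistent with ride height being chosen from map data keyed to the reported GPS position. The following Python sketch is purely illustrative; the tile lookup, road classes, and heights are assumptions, not Tesla's implementation:

```python
# Hypothetical road-class map keyed by a coarse GPS tile; a real vehicle
# would use detailed map data. All values here are invented.
ROAD_CLASS_BY_TILE = {
    (37, -122): "highway",   # smooth road: lower the body for aerodynamics
    (38, -122): "off_road",  # rough road: raise the undercarriage for clearance
}

RIDE_HEIGHT_MM = {"highway": 120, "default": 140, "off_road": 175}

def tile_for(lat: float, lon: float) -> tuple[int, int]:
    """Quantize a GPS fix into a coarse map tile (illustrative only)."""
    return (int(lat), int(lon))

def target_ride_height(lat: float, lon: float) -> int:
    """Pick a suspension height from the road class at the reported position.

    The only input is the reported position, so a spoofed (lat, lon)
    changes the ride height even though the physical road has not changed.
    """
    road_class = ROAD_CLASS_BY_TILE.get(tile_for(lat, lon), "default")
    return RIDE_HEIGHT_MM[road_class]

print(target_ride_height(37.4, -122.2))  # 120 mm: believed to be on a highway
print(target_ride_height(38.1, -122.2))  # 175 mm: believed to be off-road
```

Because position is the sole input to a lookup like this, no physical change to the road is needed for the suspension to react; a spoofed fix is enough.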
Tesla's Vulnerability Reporting Team responded with the following points:
Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn't demonstrate any Tesla-specific vulnerabilities, that hasn't stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.
The effect of GPS spoofing on Tesla cars is minimal and does not pose a safety risk, given that it would at most slightly raise or lower the vehicle's air suspension system, which is not unsafe to do during regular driving, or potentially route a driver to an incorrect location during manual driving.
While these researchers did not test the effects of GPS spoofing when Autopilot or Navigate on Autopilot was in use, we know that drivers using those features must still be responsible for the car at all times and can easily override Autopilot and Navigate on Autopilot at any time by using the steering wheel or brakes, and should always be prepared to do so.
These marketing claims are simply a for-profit company's attempt to use Tesla's name to mislead the public into thinking there is a problem that would require the purchase of this company's product. That is simply not the case. Safety is our top priority, and we do not have any safety concerns related to these claims.
Before the demonstration, Regulus physically affixed an antenna to the roof of the Model 3 and wired it into the car's systems. Mounting the antenna directly on the roof let the firm transmit at far lower power than a remote attack would require, greatly reducing the risk of accidentally affecting other, unrelated GPS devices nearby.
The attack essentially used GNSS spoofing to convince the vehicle that it was somewhere it wasn't, prompting it to turn onto the wrong road. That doesn't mean the car would crash into a nearby tree: the car's radar, ultrasonic sensors, and suite of eight cameras would kick in to avoid such an incident. The point is that an autonomous or semi-autonomous automotive application only uses GPS to decide which road to take; what is or is not a road at all is decided by the local sensors.
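Tesla's actual planner is proprietary, but this division of labor can be sketched as a simple decision gate in which navigation merely nominates a maneuver and local perception must independently approve it. Everything below, the function, its inputs, and the outcomes, is a hypothetical illustration:

```python
def plan_maneuver(gps_says_exit_ahead: bool,
                  camera_sees_exit_lane: bool,
                  radar_clear: bool) -> str:
    """Illustrative decision gate: GPS proposes, local sensors dispose.

    Navigation (GPS) only nominates the maneuver (take the exit); whether
    there is actually a drivable lane with clear space is decided by the
    camera and radar stack. Spoofed GPS can trigger a wrong-but-legal
    turn onto a real road; it cannot, by itself, steer the car into a tree.
    """
    if not gps_says_exit_ahead:
        return "continue"
    if camera_sees_exit_lane and radar_clear:
        return "take_exit"   # maneuver is plausible on the real road
    return "abort_maneuver"  # navigation and perception disagree

# Spoofed GPS claims an exit, but perception sees no drivable lane:
print(plan_maneuver(gps_says_exit_ahead=True,
                    camera_sees_exit_lane=False,
                    radar_clear=True))   # -> abort_maneuver
```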
On the other hand, GNSS spoofing could become a real problem in the future. Safeguards against such attacks are still weak, and with the right tools, bad actors could achieve hostile GPS takeovers.
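One widely discussed class of safeguard, not specific to Tesla, is a plausibility check that cross-references each GPS fix against sensors an RF attacker cannot reach, such as wheel odometry. A minimal sketch, with illustrative names and thresholds:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fix_is_plausible(prev_fix, new_fix, dt_s, wheel_speed_mps, slack=1.5):
    """Flag a GPS fix whose implied jump exceeds what the odometry allows.

    prev_fix/new_fix: (lat, lon) tuples; dt_s: seconds between fixes;
    wheel_speed_mps: speed from the wheel sensors, which a radio
    attacker cannot spoof. `slack` absorbs normal GPS noise.
    """
    jump_m = haversine_m(*prev_fix, *new_fix)
    return jump_m <= wheel_speed_mps * dt_s * slack + 10.0  # +10 m noise floor

# A fix that "teleports" the car roughly 2.5 miles in one second,
# as in the Regulus exit-spoofing test, fails the check:
print(fix_is_plausible((37.0000, -122.0000), (37.0363, -122.0000),
                       dt_s=1.0, wheel_speed_mps=30.0))  # False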