Elon Musk says Tesla Vision will soon detect turn signals, hand gestures
The latest Full Self-Driving (FSD) Beta v9 software update from Tesla uses Tesla Vision - a computer vision system that relies on optical imagery rather than on readings from the radar sensor that was previously part of Tesla's sensor suite. The company's CEO announced that the Tesla Vision system will soon be able to detect turn signals on other vehicles, police and ambulance lights, hazard lights and even hand gestures, Electrek reported.
Musk has touted the latest software update as "mind-blowing" and said it has the ability to improve quickly through machine learning (ML), the report stated. Tesla owners who have received the new software update through the early access program have already started testing it, and some have reported specific improvements in the system, such as the detection of taillights.
These improvements to Tesla Vision should allow the self-driving system to react to emergency vehicles when needed; Tesla is also working on recognizing sirens and alarms. Further, the EV maker is betting on cameras and neural nets for its self-driving system, aiming to bring it closer to human driving abilities. "The whole road system is designed to work with optical imagers (eyes) and neural nets (brain). That's why cameras and silicon neural nets are the solution," Musk told Electrek last month.
Currently, Tesla is testing its latest FSD Beta v9 software update on only roughly 2,000 vehicles via the early access program. The company plans to release the update to a wider audience in "about a month," depending on the feedback it gets from its test drivers.
While releasing the latest update, Tesla warned its early access users to use the software with caution. It cautioned that drivers must pay constant attention to the road and be prepared to act immediately in case of an emergency, especially at blind corners, when crossing intersections, or on narrow roads.