Autonomous cars built on open source

(2019 Forbes article)




Brad Templeton with an early version of the Stanford Robocar.



About the author: Brad Templeton is a software engineer, an evangelist of robotic cars since 2007, and worked on Google's self-driving car project in its early years. He is the founder of ClariNet, chairman emeritus of the Electronic Frontier Foundation, a director of the Foresight Institute, and founding faculty at Singularity University.



It is difficult to follow the norms of open development and amateur tinkering when you are building a robot that weighs 1.3 tons and sending it down streets full of pedestrians. Yet amateur innovators matter to any emerging technology. How do we resolve this contradiction?



Recently, there have been several interesting announcements in the field of open source development.





Open source packages from Udacity, EB Robinos, Autoware, Nvidia, and comma.ai have also been released. And although we are not yet at the point where you can download a toolkit and build a homemade autonomous car, that time will come.



I have a long history of working with open source tools, and I have released several software packages of my own under free licenses. As chairman of the Electronic Frontier Foundation, I often defended the rights of such software.



These tools will help developers work faster and team up to create better autonomous cars. This is especially true of the tools used to build software for robotic cars, since there is a strong incentive for collaboration in this area. Teams will integrate useful open source components into their cars where licenses allow it, and some teams will even be able to build vehicles entirely from open source. There are examples of open source projects that far outperform much more costly commercial initiatives. On top of this, there are good reasons to argue that open source tools can be more secure, since every part is exposed to the gaze of the whole world. Of course, attackers can see the source too, and that can help them find vulnerabilities, but there are many more good guys who also see the code and work to protect it.



All of this is to be expected, but there is a problem for the people who are the stars of open source development: lone developers. They improve systems simply because they use them themselves, want to make them better, and share their results. Such people are responsible for most of the open tools we use, although some large and important projects were, of course, built by big teams with professional funding.



Can you take a set of open source software, download it to your car, and have it drive itself while you watch TV? Can you make it work with no driver at all, so it can come pick you up? Can you make changes to your liking, or download other people's modifications, and take them out on the road? That is a much harder question.



People do these things all the time in other areas of programming. But a phone is very different from a car. A security hole in your phone can give attackers access to your personal data, even your billing information, and that is serious. A vulnerability in a car, however, can cost you your life, or the lives of others on the road.



Self-certification



Today, regulators mostly follow a hands-off policy. And although they are getting ahead of events by writing rules for technologies that do not yet exist, they focus mainly on self-certification requirements for vehicles produced by the major market players.



Self-certification means that a company tests its vehicle and declares that it really meets the goals and safety requirements set out in the rules; neither third parties nor the state verifies any of this. Instead, if it turns out the company made a mistake, or, even worse, lied about compliance, it faces severe punishment in court. Ideally, that punishment should be painful enough to motivate companies to test even better than any third party or government would.



This is not as unreasonable as it sounds. The truth is that nobody knows how to build an independent testing center, because it is unclear what it would check or how it would work; even the government has no idea. For ordinary cars there are well-understood tests, such as crash tests, and the set of aspects checked by external bodies varies greatly from country to country. But in most cases, the people who build cars, or components for them, know far more about safety testing and validation than anyone else, and motivating manufacturers to be honest is an effective technique.



In the case of robotic cars, such tests can only be done through a deep examination of the software and how it operates; you must understand the system thoroughly to evaluate it. There are a few universal tests that independent labs can run, and those will be genuinely useful. Over time there will be more tests and more labs to run them, but developers will keep inventing new approaches that do not fit the old tests. Standards and regulations can only capture techniques that are generally accepted and relevant in the short term; they are of little use in a rapidly changing field where new approaches are invented all the time.



Cost



While self-certification seems to be the only system that can work for a lone developer, the principles at its core unfortunately require that the certifier have enough weight to stand behind its claims. An individual, as a rule, does not have the means to face the consequences of certifying improperly. Groups of people can, but it is hard even for them. Tests will also be expensive if done by a third party. All of this means that certification will be a rare practice, rational only for software that runs thousands of cars.



Even if there is a software package certified as safe, what happens if you modify parts of the code that affect safety? Can you then take such a car onto a road full of obstacles? This remains a hard question.



Solutions



A possible solution is to buy insurance. Unfortunately, this has its own difficulties. The risk will most likely be higher than what typical modern policies cover, and insurance companies have no way to determine how good a programmer you are or how safe your modifications are.



You should be able to make your modifications and drive on the road in a safe testing mode, with at least one, and probably two, testers monitoring the new software and ready to take control at the first sign of trouble. This is how cars are tested today; it is how the Tesla Autopilot was tested. It works, but only for serious developers willing to invest a lot of time and resources; it does not work for amateurs who just want to tweak the software in their own cars.



There may be a scheme in which a programmer, after a short safe-driving check, sends their changes to a larger organization that integrates them into its own tests. Such an organization could test hundreds of (non-conflicting) modifications from different developers at the same time. The programmers could share the cost, pooling not only the testing but also the work of bringing their code to a level where certification is possible. Even so, it is still expensive.



The "mentor"



Another possibility is to create a kind of software "mentor". It could be a simple certified program that is, in effect, a safe automated driver. In fact, it could be taken from the core of a recognized, highly reliable system that holds all the safety certificates. Your car would run your own modified system, while the mentor constantly monitors its work. If your system does something the mentor does not like, the mentor takes control and drives the car to a safe place, or home.



The mentor will drive conservatively, which means that with this technique you cannot build anything more aggressive than the mentor itself. If the mentor says stop and your program says go, your program is overridden. A human driver, in turn, may have higher priority than the mentor. Functions that require more authority than the mentor can grant cannot be developed and road-tested by lone developers, although they can still work on them in simulators.
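The priority order described above (human over mentor, mentor over the modified stack) can be sketched as a simple per-cycle arbitration step. This is a hypothetical illustration of the idea, not the design of any real system; all types and names here are invented.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple


class Source(Enum):
    HUMAN = "human"    # a human driver always wins
    MENTOR = "mentor"  # certified safety supervisor
    CUSTOM = "custom"  # amateur-modified driving stack


@dataclass
class Command:
    steering: float  # steering angle, radians
    throttle: float  # 0..1 to accelerate; negative values brake


def arbitrate(human: Optional[Command],
              mentor_approves: bool,
              mentor_fallback: Command,
              custom: Command) -> Tuple[Source, Command]:
    """Decide which command reaches the actuators on each control cycle.

    The mentor continuously evaluates the custom stack's proposed
    action; if it disapproves, its own conservative command (for
    example, "brake and pull over") replaces the custom one. A human
    override, when present, outranks both.
    """
    if human is not None:
        return Source.HUMAN, human
    if not mentor_approves:
        return Source.MENTOR, mentor_fallback
    return Source.CUSTOM, custom
```

The point of the sketch is the fixed ordering: the custom stack only ever drives with the mentor's ongoing consent, so it can be less capable than the mentor but never more aggressive.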



In fact, simulators are the salvation of amateur engineers, especially since simulators keep getting better. They exist to quickly expose problems and bad decisions in new patches, and to let a car log many virtual miles before it ever drives on a real road.
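One way a simulator could serve as such a gate: before a patched stack is allowed anywhere near mentored road testing, it must pass a fixed battery of recorded scenarios. Below is a minimal sketch of that idea; the scenario and planner interfaces are invented for illustration and do not correspond to any real simulator's API.

```python
from typing import Callable, List

# A planner maps an observation dict to a throttle value; a scenario
# runs a planner and returns True if the run ended safely. Both
# interfaces are hypothetical simplifications.
Planner = Callable[[dict], float]
Scenario = Callable[[Planner], bool]


def simulation_gate(planner: Planner,
                    scenarios: List[Scenario],
                    required_pass_rate: float = 1.0) -> bool:
    """Run every scenario against the modified planner and report
    whether it clears the bar for on-road (mentored) testing."""
    if not scenarios:
        return False  # no evidence, no road time
    passed = sum(1 for scenario in scenarios if scenario(planner))
    return passed / len(scenarios) >= required_pass_rate


# Two toy scenarios: the planner sees an obstacle distance in meters.
def must_brake_when_close(planner: Planner) -> bool:
    return planner({"obstacle_m": 2.0}) <= 0.0  # expect braking


def may_cruise_when_clear(planner: Planner) -> bool:
    return planner({"obstacle_m": 100.0}) > 0.0  # expect forward motion


def cautious_planner(obs: dict) -> float:
    return -1.0 if obs["obstacle_m"] < 10.0 else 0.3
```

A real battery would of course contain thousands of recorded and synthesized scenarios, but the shape is the same: the patch earns road time only by passing them all.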



Most likely, the mentor should be based on a certified build of the same open source stack the amateur engineer is modifying. And if the programmer discovers that an action the mentor refused to allow is actually safe, they can file a bug report (and even a fix), and eventually (though not immediately) the mentor can be improved to permit more behaviors in the programs under test.



Security is a particular concern. Modified software may contain holes that let an attacker take control of the car. We need the mentor to work reliably, but a further level of quality is its ability to recognize a malicious program trying to deceive it. Any vulnerability in the mentor itself that could lead it to commit an unsafe action would be a problem.



The need for artisans



We need the people the automotive world calls artisans, or, as we call them in the computer world, hackers. (Using "hacker" to mean a computer criminal is common mainly outside the community of software developers; within it, the word means something like "master craftsman", and the computer-crime specialist is just one type of hacker.) All the big car companies began as artisanal workshops. Many states even have laws that carve out exceptions to the usual vehicle safety rules for one-off vehicles built by artisans, provided they meet basic road-safety requirements. And it works, largely because these vehicles always have a driver, usually their creator. In the software world, hackers (in the non-criminal sense) have contributed an enormous share of the innovation.



We might even want to support the idea of a small car company that distributes or sells modified robocar systems, with an integrated mentor, to interested customers. Unlike customers in other fields, these customers put other people at risk, not just themselves; but innovation also requires small businesses.



A mentor like the one described here does not yet exist. But it should be in our plans, because a world of driverless vehicles without the innovations of artisans and hackers will be less advanced, and less safe, than one in which more than just the big companies can contribute.









About ITELMA
We are a large automotive component company, employing about 2,500 people, including 650 engineers.



We are perhaps the strongest competence center in Russia for the development of automotive electronics. We are growing rapidly and have opened about 30 vacancies (including in the regions) for positions such as software engineer, design engineer, lead development engineer (DSP programmer), and others.



We face many interesting challenges from the automakers and groups driving the industry. If you want to grow as a specialist and learn from the best, we would be glad to see you on our team. We are also ready to share our expertise in the most important things happening in automotive. Ask us any questions; we will answer and discuss.

