Self-driving cars are gradually being developed by teams at Google, Tesla, and many other companies. These machines are still in the early stages of development, and it's not yet clear whether the market will show much interest in such a product.
However, one Google Car designer isn't doing it for the market. Chris Urmson, director of the search giant's self-driving car project, recently appeared at the 2015 TED Conference to talk about Google's plans for the self-driving car. Urmson first noted that the company plans to have its self-driving car ready to sell within the next five years. He also mentioned that this goal isn't driven just by company deadlines. Instead, Urmson wants to make sure his 11-year-old son has access to a self-driving car so that he won't have to worry about taking a driver's exam.
But why should people care about self-driving cars? Urmson noted that many cars already include driver-assistance tools that make driving easier, so adding full self-driving capability is a smaller leap than some might expect. He also noted that self-driving cars could help reduce time spent in traffic and decrease the number of automotive accidents.
But can the Google Car respond well to unusual situations? Urmson says yes. He went on to show how successfully the Google Car responded to abnormal situations, such as a child driving a toy electric car in the street or a woman chasing a duck across the road.
While these reports are certainly good signs for the future of the self-driving car, some designers note that all self-driving cars should give users the ability to intervene and take control when things go wrong.
Urmson was one of many speakers during the "Machines That Learn" TED session, which presented a variety of views on how artificial intelligence should be adopted. While Urmson was one of the stronger pro-AI voices at the event, other technologists and philosophers offered more cautious perspectives on the technology. Philosopher Nick Bostrom noted that AIs need to be designed to share the value system of their owners and creators. Oren Etzioni of the Allen Institute for Artificial Intelligence noted that AI-powered technology does not necessarily need to be given the ability to act autonomously yet in order to operate.