With So Many Parties on Board, Self-Driving Is Closer Than You Think
In our last blog, we talked about some of the major players in autonomous vehicles and how rapidly the technology is evolving. Little did we know that, just days after our post, there would be a big upset involving one very prominent self-driving car application. Shortly after the December 14, 2016, rollout of Uber’s self-driving taxi service in San Francisco, CA, video emerged showing one of the autonomous vehicles running a red light. Although Uber said the car was being operated by a human driver at the time and had no passengers on board, the repercussions were swift: just a week after launch, Uber pulled its self-driving cars off the road when the state of California revoked the cars’ registrations.
Clearly, there are going to be some detours in the journey from great idea to self-driving cars for all. But what we can say for certain is that, even with the occasional setbacks, the technology behind autonomous vehicles continues to move forward, driving us to drastically rethink the machines that take us from point A to point B.
Where Are Autonomous Cars Now — and Where Are They Going?
The basic difference between the automation in the cars we drive day-to-day and that in the autonomous vehicles of the not-so-distant future comes down primarily to the level of driver involvement. Many newer vehicles today already include a range of controls that automatically handle certain aspects of driving for us, such as assisted braking and adaptive cruise control; these features are considered level 1 of automation.
The autonomous cars that are commercially available right now are around level 2 of automation; a good example is Tesla’s semi-autonomous driver assistance (“Autopilot”) capabilities that we talked about in our last blog on autonomous vehicles. Most of the prototypes being tested are at level 3, where the driver can safely stop paying attention in certain situations, or level 4, where the driver does not need to be engaged unless there are severe weather or unexpected road conditions. However, research has shown that a human “driver” who is checking texts or dozing off simply cannot refocus quickly enough to take back control and act effectively in an imminent emergency. Therefore, many manufacturers are concluding that levels 3 and 4 are simply not going to work, because the fail-safe (the human being) is not capable of responding fast enough to avoid a hazard to themselves or others.
The ultimate degree of automation is level 5, where a driver does not need to be present at all. This is the technology required for the fully realized ride-sharing future: the envisioned scenario in which no one owns cars and you simply summon a vehicle that comes to pick you up. The autonomous vehicle concepts of both Google and Mercedes are level 5, requiring no human input except the desired destination.
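To make the taxonomy concrete, here is a minimal sketch, entirely our own illustration rather than any manufacturer’s code, that maps each automation level to whether a human fallback driver is still expected to take over:

```python
# Illustrative lookup of driving-automation levels (loosely based on the SAE scale).
# The "human fallback" flag captures the handover problem discussed above:
# at levels 3 and 4, a person is still expected to take over in some situations.
AUTOMATION_LEVELS = {
    0: ("No automation", True),
    1: ("Driver assistance (e.g., adaptive cruise control)", True),
    2: ("Partial automation (e.g., Autopilot-style assistance)", True),
    3: ("Conditional automation", True),
    4: ("High automation", True),   # human needed only in severe weather / unusual roads
    5: ("Full automation", False),  # no driver required at all
}

def needs_human_fallback(level: int) -> bool:
    """Return True if a human driver must still be ready to intervene."""
    return AUTOMATION_LEVELS[level][1]

if __name__ == "__main__":
    for level, (name, _) in AUTOMATION_LEVELS.items():
        print(f"Level {level}: {name} - human fallback: {needs_human_fallback(level)}")
```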
What Powers the Automation?
While automakers and other players are guarded about revealing too many details, there are some basic technology components we know are integral to the design of autonomous vehicles. For example, several types of sensors are used to gather the visual input necessary for a self-driving car to navigate roadways successfully and safely. Cameras are a relatively inexpensive type of sensor that can provide the basic visual information a human would gather as he or she drives; multiple cameras are used to provide depth perception. Radar is already used for certain level 1 vehicle controls; the drawback is that radar is only good at close range.
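As a rough illustration of how a pair of cameras yields depth, here is a minimal sketch of the standard stereo-vision relationship; the numbers are made up for the example, and real systems are considerably more involved:

```python
def depth_from_stereo(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate distance to an object seen by two cameras.

    Uses the standard pinhole stereo relationship Z = f * B / d, where f is the
    focal length in pixels, B the distance between the cameras in meters, and
    d the disparity (horizontal pixel shift of the object between the two images).
    """
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive; zero would mean the object is at infinity.")
    return focal_length_px * baseline_m / disparity_px

# Example with made-up numbers: 700 px focal length, cameras 0.3 m apart,
# object shifted 14 px between the left and right images -> about 15 m away.
print(depth_from_stereo(700.0, 0.3, 14.0))
```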
LIDAR (light detection and ranging) is an onboard laser system that maps a car’s surroundings as it moves. You are probably familiar with LIDAR as the big spinning unit on top of most of Google’s test vehicles; solid-state LIDAR, which has no spinning parts, is also being used in some autonomous vehicles now in development. Although LIDAR creates highly accurate 3D mapping, it does have drawbacks; in addition to being very expensive, the technology is affected by weather, with its signal bouncing off rain or snow.
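To show what “mapping the surroundings” means in practice, here is a minimal sketch, again our own simplified example rather than any vendor’s data format, that converts raw LIDAR returns (a distance plus two beam angles) into 3D points around the car:

```python
import math

def lidar_return_to_point(distance_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LIDAR return (range plus beam angles) into an (x, y, z) point.

    Simple spherical-to-Cartesian conversion with the sensor at the origin:
    x points forward, y to the left, z up.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# A few made-up returns; the accumulated points form the car's local 3D map.
returns = [(12.0, 0.0, -1.0), (12.1, 0.5, -1.0), (8.4, 45.0, 0.0)]
point_cloud = [lidar_return_to_point(d, az, el) for d, az, el in returns]
print(point_cloud)
```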
In today’s already highly computerized cars, much of the transfer of sensor information occurs over a Controller Area Network (CAN bus), a protocol designed to allow microcontrollers and devices to communicate with each other without a host computer. However, truly autonomous cars require more robust data connectivity, as well as the computing power and software to take all the visual input and other gathered data, organize it, interpret it, and turn it into actionable driving, all in real time. This in turn requires:
* A significant (one might say, massive) amount of processing power
* Bringing data together from different sensors (“sensor fusion”) to compute something more than what could be determined by one sensor alone (see the sketch after this list)
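To make the sensor-fusion idea concrete, here is a minimal sketch, entirely our own illustration with made-up numbers, that blends a radar distance reading and a camera distance estimate into a single, more trustworthy figure by weighting each sensor according to how noisy it is:

```python
def fuse_estimates(radar_m: float, radar_var: float, camera_m: float, camera_var: float):
    """Fuse two distance estimates of the same object using inverse-variance weighting.

    Each sensor reports a distance plus a variance describing how noisy it is.
    The fused estimate leans toward the less noisy sensor, and its variance is
    smaller than either input, which is the whole point of sensor fusion.
    """
    w_radar = 1.0 / radar_var
    w_camera = 1.0 / camera_var
    fused = (w_radar * radar_m + w_camera * camera_m) / (w_radar + w_camera)
    fused_var = 1.0 / (w_radar + w_camera)
    return fused, fused_var

# Made-up readings: radar says the car ahead is 25.0 m away (variance 0.5);
# the camera says 26.2 m but is noisier (variance 2.0).
distance, uncertainty = fuse_estimates(25.0, 0.5, 26.2, 2.0)
print(f"Fused distance: {distance:.2f} m (variance {uncertainty:.2f})")
```

Real systems fuse many more signals over time, typically with Kalman-style filters, but the principle is the same: the combined estimate is better than what any single sensor could provide on its own.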
That is why machine learning — a type of artificial intelligence (AI) that gives computers the ability to, basically, learn “on their own” when exposed to new data — is a critical and rapidly developing aspect of autonomous driving design. Going a step further, the type of machine learning called *deep learning* seeks to emulate the way human beings gain new knowledge. By automating predictive analytics, deep learning could enable a car to get smarter every time it is driven and even learn from other drivers and other cars.
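As a toy illustration of that idea, ours alone and not any automaker’s system, here is a minimal sketch in which a model’s estimate of stopping distance keeps improving as each new trip contributes another observation:

```python
class BrakingDistanceModel:
    """Tiny online-learning example: stopping distance ~ coef * speed^2.

    Each completed trip contributes one (speed, observed stopping distance)
    pair, and the coefficient is nudged toward it, so the model keeps
    improving the more the car (or the whole fleet) is driven.
    """

    def __init__(self, coef: float = 0.05, learning_rate: float = 1e-6):
        self.coef = coef
        self.learning_rate = learning_rate

    def predict(self, speed_mps: float) -> float:
        return self.coef * speed_mps ** 2

    def update(self, speed_mps: float, observed_distance_m: float) -> None:
        error = self.predict(speed_mps) - observed_distance_m
        # Gradient of the squared error with respect to coef is 2 * error * speed^2.
        self.coef -= self.learning_rate * 2.0 * error * speed_mps ** 2

model = BrakingDistanceModel()
# Made-up observations gathered over many trips (speed in m/s, distance in m).
for speed, distance in [(20.0, 28.0), (25.0, 44.0), (30.0, 63.0), (15.0, 16.0)]:
    model.update(speed, distance)
print(f"Learned coefficient: {model.coef:.4f}; "
      f"predicted stopping distance at 30 m/s: {model.predict(30.0):.1f} m")
```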
Naturally, for safety an autonomous vehicle must be very smart to begin with! But whether machine learning is used to interpret sensory input and then select from a set of hard-coded driving decisions, or to map input (from sensors and other sources) directly to driving output from end to end, it is surely part of every car on the drawing board or on the road.
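The two design philosophies can be sketched side by side; this is a deliberately simplified illustration of the contrast, not any manufacturer’s actual control logic:

```python
def rule_based_policy(obstacle_distance_m: float, speed_mps: float) -> str:
    """Modular approach: perception output feeds a set of hard-coded driving rules."""
    if obstacle_distance_m < speed_mps * 2.0:   # less than roughly 2 seconds of headway
        return "brake"
    if obstacle_distance_m < speed_mps * 3.5:
        return "ease_off"
    return "maintain_speed"

def end_to_end_policy(camera_features, weights) -> float:
    """End-to-end approach: a learned model maps raw input straight to a control
    value (here, a single steering/throttle score) with no hand-written rules in
    between. The weights would come from training on large amounts of driving data."""
    return sum(f * w for f, w in zip(camera_features, weights))

# Rule-based: a 30 m gap at 20 m/s is below the 2-second threshold, so brake.
print(rule_based_policy(30.0, 20.0))
# End-to-end: made-up features and weights standing in for a trained network.
print(end_to_end_policy([0.8, 0.1, 0.4], [0.5, -1.2, 0.3]))
```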
What Are the Implications Beyond Driving?
Of course, wherever there is computer technology, information security is an important concern, and autonomous vehicles are no exception. In fact, the implications of hacking in self-driving cars are huge. Someone traveling in a multi-ton vehicle at a high rate of speed certainly does not want to have to worry that the vehicle might be taken over and remotely controlled by a third party.
In 2015, Chrysler recalled more than a million vehicles when two security researchers (not bad guys, fortunately) discovered a software vulnerability that allowed them to wirelessly hack a Jeep and take over its dashboard functions, steering, transmission, and brakes. Chrysler provided vehicle owners with a USB drive-based software update and took network-level steps to detect and block hacking via the vehicles’ cellular network connection. Then in 2016, a security lab demonstrated a hack of a Tesla Model S, remotely accessing and controlling the vehicle in both park and drive modes via the car’s CAN bus and a malicious WiFi hotspot. Within days, Tesla deployed an over-the-air software update to address the potential security issues.
These cases showed automakers and their partners that stepped-up security is vital, and stronger safeguards will continue to be put in place. However, with every autonomous car requiring a computer and network connectivity for driving to take place, it may only be a matter of time before someone hacks a self-driving vehicle while it is in use.
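As one example of the kind of safeguard being layered in, here is a minimal sketch, our own illustration using Python’s standard library rather than any automaker’s actual implementation, of authenticating a command with a shared secret so that a message injected by an outsider is rejected:

```python
import hashlib
import hmac

SECRET_KEY = b"example-shared-secret"  # in a real vehicle this would be provisioned securely

def sign_command(command: bytes) -> bytes:
    """Attach an HMAC-SHA256 tag so the receiver can verify who sent the command."""
    return hmac.new(SECRET_KEY, command, hashlib.sha256).digest()

def verify_command(command: bytes, tag: bytes) -> bool:
    """Accept the command only if the tag matches; constant-time comparison avoids timing leaks."""
    expected = hmac.new(SECRET_KEY, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

legit_command = b"APPLY_BRAKES"
tag = sign_command(legit_command)
print(verify_command(legit_command, tag))       # True: genuine command is accepted
print(verify_command(b"DISABLE_BRAKES", tag))   # False: forged command is rejected
```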
On a related note, privacy is also an increasingly challenging area for cars and drivers. Today, someone on the lam could switch off the GPS in their car and unfold a map. But with the innate functionality of an autonomous car requiring near-constant use of GPS, that vehicle must be tracked virtually all the time. In addition, while cameras serve important sensor functions and will provide input for machine learning, they are also surveillance tools. So, there is the potential for every inch of road space to be photographed and scrutinized on a near-constant basis once self-driving cars (and their system of camera sensors) are fully rolled out.
And on a more human note, while the advent of reliable, fully autonomous vehicles will create jobs in manufacturing and related fields, at some level it will also mean the loss of some jobs. Mobility-on-demand and ride-sharing applications will make taxi and bus drivers obsolete. Self-driving vehicles on the farm will increasingly reduce the number of human workers required. Truckers will still be needed to maintain their rigs, but they will not be driving them, so fewer workers will be required to support a fleet of trucks deployed autonomously. And in late 2016, an online takeaway (takeout) company launched robotic food delivery in London. So, from the package handler to the kid delivering pizza to your door, these workers may eventually be replaced by, if not autonomous vehicles, then a related technology in the form of delivery bots.
Why Is Metal Cutting Talking About It?
Metal Cutting offers custom metal cutting services for specialty products and precision small metal parts for a wide array of industries, including automotive companies and their technology partners, providing parts cut-off, machining, finishing, and other techniques needed for applications that require tight tolerances, specific surface finishes, and highly customized ends, tapers, and diameters.
Want to explore whether Metal Cutting is the best partner for *your* automotive parts or other applications? Download the free guide, 7 Secrets to Choosing a New Contract Partner: Technical Guide to Outsourcing Your Precision Metal Fabrication.