
Ford Smart Mobility Expands Fleet of Third-Generation Autonomous Vehicles

Ford is tripling its fleet of fully autonomous Ford Fusion Hybrid test vehicles – making it the largest in the automotive industry – and will use new-generation sensor technology as the company further accelerates its autonomous vehicle development plans.

Ford is tripling its fleet of Fusion Hybrid autonomous research vehicles this year, accelerating the development and testing of its virtual driver software in both urban and suburban environments. (Photo: Business Wire)

This year, Ford will add 20 Fusion Hybrid autonomous vehicles, bringing the company's autonomous fleet to about 30 vehicles being tested on roads in California, Arizona and Michigan.

“Using the most advanced technology and expanding our test fleet are clear signs of our commitment to make autonomous vehicles available for millions of people,” said Raj Nair, Ford executive vice president, Global Product Development, and chief technical officer. “With more autonomous vehicles on the road, we are accelerating the development of software algorithms that serve to make our vehicles even smarter.”

Building on more than a decade of Ford autonomous vehicle research, this expansion is a key element of Ford Smart Mobility – the plan to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience, and data and analytics. The newest vehicles are on Ford’s third-generation autonomous vehicle development platform, built using Fusion Hybrid sedans, similar to the second-generation platform.

Ford recently announced its fully autonomous cars will take to the streets of California this year. The company already tests autonomous vehicles at its proving grounds, as well as on public roads in Michigan. Ford was the first automaker to test a fully autonomous vehicle at Mcity – a 32-acre, full-scale simulated real-world urban environment at the University of Michigan.

Advances in sensing, software and hardware

Ford is using Velodyne’s newest LiDAR sensors – named Solid-State Hybrid Ultra PUCK™ Auto for its hockey puck-like size and shape – on its third-generation autonomous vehicle platform.

Solid-State Hybrid Ultra PUCK Auto sensors boast a longer range of 200 meters, making them the first auto-specific LiDAR sensors capable of handling varied driving scenarios. The Ultra PUCK will accelerate the development and validation of Ford’s virtual driver software, which serves as the decision-making brain that directs vehicle systems.

Solid-State Hybrid Ultra PUCK Auto’s lightweight, sleek design makes it optimal for packaging on a vehicle, such as on the sideview mirror. The design means Ford can reduce the number of LiDAR sensors from four to two on new Fusion Hybrid autonomous vehicles while collecting just as much useful data, thanks to the more targeted field of view.

“Adding the latest generation of computers and sensors, including the smaller and more affordable Solid-State Hybrid Ultra PUCK Auto sensors, helps bring Ford ever closer to having a fully autonomous vehicle ready for production,” said Jim McBride, Ford technical leader for autonomous vehicles.

The vehicle's hardware systems, which interact continuously with the virtual driver, are equally important.

Third-generation autonomous Fusion Hybrid sedans will have supplemental features and duplicate wiring for power, steering and brakes; these will act as backups if needed.

Ford’s autonomous journey

Ford has been using Velodyne LiDAR sensors for roughly a decade, dating back to Defense Advanced Research Projects Agency autonomous vehicle challenges.

Ford was among the first to use the Velodyne LiDAR sensor, an innovation that significantly changed the autonomous vehicle landscape. LiDAR emits short pulses of laser light to precisely scan the surrounding environment millions of times per second and determine the distance to objects, allowing the vehicle to create a real-time, high-definition 3D image of whatever’s around it.
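The ranging principle described above – measuring the round-trip time of a laser pulse – can be sketched in a few lines. This is an illustrative calculation only; the pulse timing value below is a hypothetical example, not a Ford or Velodyne specification.

```python
# Illustrative LiDAR time-of-flight ranging: distance is half the
# round-trip distance light travels between pulse emission and return.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance to an object given a laser pulse's round-trip time.

    The pulse travels out to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return after roughly 1.33 microseconds corresponds to about 200 m,
# the stated range of the Ultra PUCK sensor.
print(round(distance_from_pulse(1.334e-6)))  # ~200 meters
```

Repeating this measurement millions of times per second across a sweeping array of lasers is what lets the sensor assemble the real-time 3D point cloud the article describes.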

Ford’s first-generation autonomous vehicle platform was built using a Ford F-250 Super Duty for participation in the DARPA challenges in 2005 and 2007. In 2013, Ford introduced its second-generation autonomous vehicle platform, using a Fusion Hybrid sedan.

Ford was one of only six teams to participate in both the DARPA Grand Challenge and Urban Challenge events, supported by four engineers who are still on the company’s autonomous vehicle development team.

“We’ve come a long way since DARPA,” said McBride. “A decade ago, no one in the field knew what the art of the possible was. Today, we’re all hustling to make the most ambitious dreams become a reality.”

The first-generation autonomous vehicle platform helped Ford understand that fully autonomous driving would be technically feasible in the near future – and, through ambitious research, how the company could achieve it.

Fusion Hybrid sedans were chosen for the second-generation vehicles because they have the newest and most advanced electrical architecture. With the latest generation of computers and sensors – including the smaller, but more advanced Velodyne LiDAR HDL-32E sensor – Ford’s autonomous vehicle platform moved a step closer to production.

The objective of the second-generation vehicle fleet is to test many of the computing and sensor components required to achieve fully autonomous driving capability, as defined by SAE International Level 4, which does not require the driver to intervene or take control of the vehicle.
