Robo Truckers and the AI-Fueled Future of Transport

Concerns about artificial intelligence replacing long-haul drivers are not new, but the real story is more nuanced.

Economists and policymakers are becoming increasingly concerned about the effects of automation and artificial intelligence on employment—including whether some kinds of jobs will cease to exist at all. Trucking is often thought to be one of the first industries at substantial risk. The work is difficult, unsafe, and often deadly, and high rates of driver turnover are a constant problem in the industry. As a result, autonomous trucks have become a site of tremendous technical innovation and investment—and some forecasters project that truck driving will be one of the first major industries to be targeted by AI-driven automation.

Technology-driven unemployment is a real threat, but robotic trucks are very unlikely to decimate the trucking profession in one sudden phase transition. The path to fully autonomous trucking is likely to be a gradual slope, not a steep cliff—a trajectory shaped not only by technical roadblocks, but by social, legal, and cultural factors. Truck drivers’ daily work consists of many complex tasks other than driving trucks—maintenance, inspections, talking to customers, safeguarding valuable goods—many of which are far more difficult to automate than highway driving. A host of new legal regimes across states will be required to ensure that the technology can be deployed safely. And widespread apprehension around autonomous vehicles (and autonomous trucks especially) will likely delay adoption. All of these factors will slow the pace at which autonomous trucks take to American highways.

Instead of thinking about a sudden wave of trucker unemployment, then, we should think about how AI will change what truckers’ work looks like over the long haul. There will still be human truckers for a long time to come—but what it means to be a human trucker may change substantially. Rather than whole-cloth replacement of human truckers, autonomous technologies might require integration between humans and machines over a long period of time, as truckers are required to coordinate their work—and themselves—with the technology.

There are several possible forms this integration might take.

Passing the Baton

One vision of the future imagines machines and humans as coworkers. In this model, people and machines “pass the baton” back and forth to one another, like runners in a relay: The worker completes the tasks to which she is best suited, and the machine does the same. For example, a robot might take responsibility for mundane or routine tasks, while the human handles things in exceptional circumstances, or steps in to take over when the robot’s capacities are exceeded.

Human/robot teams hold some promise both because they try to seize on the relative advantages of each—and because the model presumes that humans get to keep their jobs. In fact, some believe that human jobs might become more interesting and fulfilling under such a model, if robots can take on more of the “grunt work” that humans currently are tasked with completing.

The human/robot team is not an especially farfetched idea for trucking work. In fact, most of us encounter a version of this model every time we sit behind a steering wheel. Modern cars commonly offer some form of technological assistance to human drivers (sometimes called “advanced driver-assistance systems”). Adaptive cruise control is an example: When a human driver activates it, the car automatically adjusts its own speed to maintain a given driving distance from the cars in front of it.
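At its core, adaptive cruise control is a simple feedback loop: compare the gap to the vehicle ahead against a desired time gap, then adjust speed accordingly. The sketch below is a bare-bones illustration of that idea; the function name, units, and gap rule are invented for the example and are not any manufacturer’s implementation.

```python
# A minimal sketch of the adaptive-cruise-control idea: hold the driver's set
# speed, but slow down to preserve a time gap behind the vehicle ahead.
# All names, units, and constants are illustrative, not any vendor's API.

def acc_target_speed(set_speed_mps: float,
                     lead_distance_m: float | None,
                     lead_speed_mps: float,
                     desired_gap_s: float = 2.0) -> float:
    """Return the speed (m/s) the car should aim for on this control cycle."""
    if lead_distance_m is None:
        return set_speed_mps          # no vehicle ahead: cruise at the set speed
    # The following distance we want at the lead vehicle's current speed.
    desired_gap_m = lead_speed_mps * desired_gap_s
    if lead_distance_m >= desired_gap_m:
        return set_speed_mps          # the gap is comfortable: cruise normally
    # Gap is too small: match the lead vehicle, shading slower the closer we are.
    shortfall = (desired_gap_m - lead_distance_m) / max(desired_gap_m, 1.0)
    return max(0.0, lead_speed_mps * (1.0 - 0.5 * shortfall))

# e.g., cruising at ~65 mph (29 m/s) behind a slower truck 35 m ahead:
print(acc_target_speed(29.0, lead_distance_m=35.0, lead_speed_mps=25.0))  # -> 21.25
```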

While adaptive cruise control might seem very different from a fully autonomous vehicle—and it is, technologically—the two technologies reside on the same spectrum. And in even the most advanced semiautomated technologies on the road today, humans are still required to be prepared to take control of the vehicle; that is, even if the machine has the baton most of the time, the human has to be prepared to grab it immediately when the machine doesn’t know what to do.

What would the handoff model mean for truckers? In theory, the truck would handle the bulk of the driving in good conditions, and the human trucker would take over in situations where the machine has trouble—say, in a construction zone or crowded intersection, or when visibility is poor. When the machine is in charge, the theory goes, the trucker might be “unshackled from the wheel” and freed up for other tasks.

This vision is similar to the transformation of the bank teller’s role after the advent of the ATM: The machine does the boring routine work, freeing up the human for more interesting or skill-matched pursuits. But it leaves open big questions about whether or how truckers would be paid for time in the cab while the truck drives itself—after all, if trucking companies are still paying big labor costs, are autonomous trucks worth the investment?—and also wouldn’t necessarily address problems around overwork and fatigue.

There’s another problem that’s even more fundamental. Baton-passing is incredibly—perhaps intractably—difficult to execute smoothly in situations like driving. Recall that the machine passes off responsibility to the human in the situations it finds most difficult: when conditions are unusual, when there is something in the environment it isn’t equipped to contend with, when there’s a mechanical malfunction or emergency. Those situations are very likely to be safety critical. One review of the scholarly literature found “a wealth of evidence” that automating some aspects of driving led to “an elevated rate of (near-) collisions in critical events as compared to manual driving … Essentially, if the automation fails unexpectedly with very little time for the human to respond, then almost all drivers crash.”

This problem is so severe because the time scale in which the baton is passed is minuscule: Because of the nature of driving, a human is likely to have an extremely short window—perhaps only a fraction of a second—in which to understand the machine’s request to intervene, assess the environmental situation, and take control of the vehicle. This tiny time window is the reason why human drivers in semiautonomous cars are warned that they must stay alert the entire time the car is driving. The popular image of humans relaxing, napping, texting, eating, and being otherwise freed up from the requirements of driving is patently unrealistic given the need for quick, safety-critical handoffs at current levels of automation.

Audible and visual alarms can help humans know when a handoff is coming, but the immediacy of the need to take control means that humans must still pay constant attention. However, a 2015 NHTSA study found that in some circumstances it could take humans a full 17 seconds to regain control after a vehicle alerted them to do so—long beyond what would be required to avoid an accident. At typical highway speed, a truck covers roughly a third of a mile in 17 seconds.

Not only is it hard for humans to intervene when intervention is called for, it’s cognitively unrealistic to expect humans to remain alert to the environment in case of emergencies—particularly as those emergencies become rarer, and as people’s driving skills atrophy. This is what human factors researcher Peter Hancock calls the “hours of boredom and moments of terror” problem. Humans are notoriously bad at staying attentive to monotonous situations in which there is only rarely something extremely important for them to notice and act upon. As Hancock frames it: “If you build vehicles where drivers are rarely required to respond, then they will rarely respond when required.”

This irony creates severe problems for human/robot handoffs in autonomous cars and trucks. So long as humans have some duty to monitor the driving environment—which they do at the current state of the art—humans will almost inevitably do a poor job at accepting the baton from the machine. Does this mean there’s no hope for safe autonomous vehicles? Not necessarily. If robots and humans make bad coworkers because of the weaknesses of the human, one solution might be to increase the level of automation even more, obviating the need for short-term handoffs to a human at all. This could create a second model of integration: network coordination.

Divide and Conquer

Another way to think about the division of labor between humans and machines is as a matter of more systemic work-sharing. Rather than focusing on in-the-moment driving, we might think about humans and machines as sharing truck-driving work in a broader way: by dividing up responsibilities over the driving route.

We’ve been thinking about the work of truck driving as a set of small, often simultaneous driving tasks: change lanes, hit the brakes, watch for road obstacles. We could instead think about it as a series of predictable segments: travel down the interstate, exit the highway and take local roads, steer around the receiver’s docks. In this model, humans and robots still share the labor of trucking work, but take turns being wholly responsible for driving—much as you and a friend might take turns driving on a road trip—with temporally and geographically predictable points of transition between the two of them. Some truckers already do this when they “drive team,” taking turns driving (often while one driver sleeps). Thinking of human/robot teams taking turns over these segments yields the second model of integration: network coordination. Several trucking technology firms have set their sights on this sort of model.
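One loose way to picture this division of labor is as a route broken into segments, with whole segments assigned to whichever party handles that context best. The sketch below is purely illustrative; the segment types, mileages, and assignment rule are invented for the example, not drawn from any company’s system.

```python
# An illustrative sketch of the "divide and conquer" idea: a route broken into
# segments, each handed wholly to the machine or to the human at predictable
# transition points. Segment types, mileages, and the rule are invented here.

from dataclasses import dataclass

AUTOMATION_FRIENDLY = {"interstate", "divided_highway"}

@dataclass
class Segment:
    description: str
    road_type: str   # e.g. "interstate", "local_road", "terminal_yard"
    miles: float

def assign_driver(segment: Segment) -> str:
    """Assign whole segments rather than moment-to-moment control."""
    return "autonomous system" if segment.road_type in AUTOMATION_FRIENDLY else "human driver"

route = [
    Segment("pickup yard to on-ramp", "local_road", 4.0),
    Segment("long haul on the interstate", "interstate", 412.0),
    Segment("off-ramp to receiver's dock", "local_road", 6.5),
]

for seg in route:
    print(f"{seg.description:32s} -> {assign_driver(seg)}")
```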

But wait, you might think. The reason for autonomous cars to hand off control to humans is that they aren’t fully capable of driving themselves—they can’t negotiate unexpected obstacles well, they lack humans’ tacit knowledge, they can fail catastrophically in new and complex situations. If this happens, how can we envision giving a machine total control over an entire portion of the route, without a human driver being expected to step in?

Part of the answer is that the difficulties autonomous vehicles encounter are “lumpy”—they’re much more likely to occur in some route segments than others. Though they’re far from perfect in any setting, autonomous vehicles perform much better on highways than on city streets: speeds are more constant, there are fewer intersections and unexpected obstacles, and contexts are generally more predictable and easier for a machine to negotiate. Things get much more complicated at the endpoints, when trucks leave the highways and venture into cities and towns to pick up or drop off loads. And when a truck arrives at a terminal, it doesn’t just drop its load immediately and take off. A trucker might spend hours at a terminal making “yard moves”—queuing to be loaded or unloaded, backing the truck into the right bay, and following the directions of the customer. Some truckers load and unload freight themselves; others coordinate with the customer’s unloading crew (or with “lumpers,” third parties who unload the delivery on behalf of the customer).

All of this requires irregular driving in response to immediate human direction, sometimes in large lots without lanes or traffic markings—and is nearly impossible for a machine to do on its own. (As a point of comparison, think of how planes taxi around at airports—despite the widespread use of autopilot in the air, there’s little chance that airport taxiing will be automated anytime soon.) So, a natural division of labor in trucking might be that advanced autonomous trucks drive themselves over the long haul, and humans take the wheel for the endpoints—what’s often called the “last mile” in transportation and logistics. In 2017, Uber announced such an approach: an autonomous truck network, connected by local hubs throughout the country. Autonomous trucks would run the long hauls between the hubs, and human truckers would pilot the trucks from hubs to delivery.

It isn’t a feasible model—yet. But some autonomous vehicle technology companies think the human/machine coordination challenges at current levels of semiautonomy are so difficult and intractable that they are essentially attempting to “skip” those levels, focusing their attention on developing vehicles that can drive with no human involvement under particular conditions (such as highway driving within a prespecified area or under only certain weather conditions). In trucking, if full autonomy could enable the truck to drive without the driver’s constant attention (and for longer time periods—since robots don’t get tired), the prospect seems more economically viable than a model that requires a driver to be engaged as a backup (and, presumably, paid).

However, the only way the network coordination model is a viable option is if the pay structure of trucking adjusts with it. Truckers are paid by the mile, and the great majority of miles driven (and thus money earned) are logged on the highway—not on traffic-packed local roads or while maneuvering around at a terminal. The parts of the job that network coordination models might automate are precisely the parts that make up the lion’s share of a trucker’s wages.
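A back-of-the-envelope calculation, using entirely made-up numbers, shows how mileage pay and highway automation collide:

```python
# Hypothetical, back-of-the-envelope numbers (not real rates or routes) showing
# why per-mile pay collides with automating the highway portion of a run.

per_mile_rate = 0.50                      # dollars per mile, invented for illustration
highway_miles, local_miles = 412.0, 10.5  # invented route split

full_route_pay = per_mile_rate * (highway_miles + local_miles)
last_mile_pay = per_mile_rate * local_miles

print(f"Pay for driving the whole route:       ${full_route_pay:,.2f}")  # $211.25
print(f"Pay if only the local segments remain: ${last_mile_pay:,.2f}")   # $5.25
# Under mileage pay, automating the easy highway segments removes the lion's
# share of earnings, even though the human still does the hardest driving.
```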

Truckers have argued for pay reform in the industry for decades, but have lacked the political capital to make change. Uber’s proposal seemed like it might be an unholy alliance that could actually help improve truckers’ lot: by working in its own interests, it might have had the power to reshape the industry’s pay structure and create a viable way forward for humans and machines to work together. But the company abruptly shuttered its autonomous truck division in July 2018, only months after announcing its hub-and-spoke model. Uber’s shift away from autonomous trucks suggests there is little hope of achieving the network coordination model anytime soon; the project would involve substantial regulatory change and infrastructure costs, and it’s hard to imagine other companies that could pull it off in the near term.

A variation on network coordination could involve allowing truck drivers to take the wheel remotely for the “last mile” of operation. Starsky Robotics, founded with significant venture capital investment in 2016, developed a “teleoperation” system in which trucks drove themselves to a certain point, and human drivers subbed in remotely from the highway exit to the terminal—as if they were playing a video game or operating a drone. In theory, such a system could allow a single driver to pilot dozens of vehicles a day, for short periods of time, all over the country—and still return home each night. (As one remote trucking executive framed it: “Think about the mom who is home driving a truck. She can drive multiple assets and never leave her kids.”) Some refer to this as a “call-center” model in which the robot calls into a human phone bank for support or handoff at predetermined points in the route.
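The “call-center” framing suggests a simple dispatch pattern: trucks reaching a handoff point wait in a queue until a remote operator is free. The sketch below illustrates that pattern only; the names and setup are invented, and it does not describe Starsky Robotics’ actual architecture.

```python
# A rough sketch of the "call-center" dispatch pattern: trucks nearing their
# handoff point wait in a queue for the next free remote operator. Entirely
# illustrative; it does not describe Starsky Robotics' actual architecture.

import queue

handoff_requests: "queue.Queue[str]" = queue.Queue()
available_operators = ["operator_1", "operator_2"]

def request_remote_takeover(truck_id: str) -> None:
    """A truck nearing its exit asks for a human to drive the last mile."""
    handoff_requests.put(truck_id)

def dispatch() -> list[tuple[str, str]]:
    """Pair waiting trucks with free operators, like calls routed in a phone bank."""
    assignments = []
    while available_operators and not handoff_requests.empty():
        assignments.append((handoff_requests.get(), available_operators.pop(0)))
    return assignments

request_remote_takeover("truck_17")
request_remote_takeover("truck_42")
print(dispatch())  # [('truck_17', 'operator_1'), ('truck_42', 'operator_2')]
```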

But it isn’t obvious that a model like this is sustainable, either. For one, the handoff problems seem likely to be only exacerbated by distance. And there are other problems unique to the model: Ford shut down a test of a similar system after the vehicles repeatedly lost their cell signal, leaving human operators unable to see the video feed. Starsky Robotics closed its doors in 2020; in a valedictory blog post, its chief executive chalked up the company’s closure in large part to the assessment that “supervised machine learning doesn’t live up to the hype” in terms of operational capability in autonomous trucking.

The Rise of the RoboTrucker

The future of trucking might someday look like these baton-passing or network-coordination models of shared labor. But right now, human/machine interaction in trucking looks very different. What we see happening in trucking now involves a much less discrete parceling-out of functions between humans and machines. Instead, truckers’ physical bodies and intelligent systems are being integrated into one another.

There are two kinds of technologies that turn truckers into RoboTruckers. The first is wearables, which monitor elements of the trucker’s internal bodily state and use them as metrics for management. For example:

  • The SmartCap is a baseball cap (also available as a headband) that detects fatigue by monitoring a driver’s brainwaves (essentially doing a constant EEG). Rear View Safety and Ford’s Safe Cap are similar systems. Systems like these can be configured to send an alert to a fleet manager or a family member, to flash lights in drivers’ eyes, to sound alarms, or to jolt the wearer back to alertness with vibrations.
  • Optalert, an Australian company, manufactures a pair of glasses that monitors the speed and duration of a trucker’s blinks in order to give him a real-time fatigue score.
  • Maven Machines’ Co-Pilot Headset detects head movement that suggests the driver is distracted (for example, looking down at a phone) or tired (for example, failing to check his side-view mirrors regularly).
  • Wrist-worn Actigraph systems both monitor and predict fatigue rates over time. The technology, initially developed by an Army research lab, blends biometric data about a trucker’s alertness with other data (like start time) to forecast how long he can drive before becoming too tired.

A number of other wearable devices are under development. For example, Steer, another wrist-based wearable being developed by a Latvian firm, measures heart rate and skin conductivity. It vibrates and flashes lights if it begins to detect signs of fatigue and delivers a “gentle electric shock” to the driver if fatigue continues. Mercedes has prototyped a vest to monitor a trucker’s heart rate; the system can stop the truck if it senses the trucker is having a heart attack.
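What these wearables have in common is a scoring-and-escalation loop: compute some measure of fatigue, then trigger progressively more intrusive interventions as it climbs. A generic sketch of that logic, with a score scale, thresholds, and actions invented for illustration rather than taken from any vendor, might look like this:

```python
# A generic sketch of the scoring-and-escalation loop behind fatigue wearables.
# The 0-100 scale, thresholds, and actions are invented for illustration and do
# not reflect any vendor's actual logic.

def fatigue_response(score: float) -> list[str]:
    """Map a fatigue score to progressively more intrusive interventions."""
    actions: list[str] = []
    if score >= 40:
        actions.append("vibrate the wearable")
    if score >= 60:
        actions.append("sound an in-cab alarm and flash lights")
    if score >= 80:
        actions.append("alert the fleet manager")
    return actions

for score in (25, 45, 65, 85):
    print(score, fatigue_response(score))
```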

The second is driver-facing cameras designed to detect the driver’s level of fatigue, often by monitoring his eyelids, tracking his gaze, and looking for signs of “microsleep.” Seeing Machines is one of several companies that market driver-facing cameras that use computer vision to monitor a driver’s eyelids and head position for signs of fatigue or inattention. If the driver’s eyes close or look away from the road for too long, the system sounds an alarm and sends a video to his boss—and can also cause the driver’s seat to vibrate in order to “goose” him back into attention. Another driver-facing camera vendor, Netradyne, uses deep learning and data from driver- and road-facing cameras to generate scores for drivers based on their safe and unsafe driving behaviors.
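Eyelid monitoring of this kind is often described in terms of PERCLOS, a long-studied drowsiness metric: the fraction of recent time the eyes are mostly closed. The sketch below illustrates the general idea only; the window length and threshold are assumptions, and it is not Seeing Machines’ or Netradyne’s actual algorithm.

```python
# An illustrative PERCLOS-style check: the fraction of recent video frames in
# which the driver's eyes are mostly closed. PERCLOS is a standard research
# metric for drowsiness; the window length and threshold below are assumptions,
# and this is not Seeing Machines' or Netradyne's actual algorithm.

from collections import deque

class DrowsinessMonitor:
    def __init__(self, window_frames: int = 900, threshold: float = 0.15):
        # 900 frames is roughly 30 seconds of video at 30 fps.
        self.window = deque(maxlen=window_frames)
        self.threshold = threshold

    def update(self, eyes_closed: bool) -> bool:
        """Feed one frame's eye state; return True if an alert should fire."""
        self.window.append(1 if eyes_closed else 0)
        perclos = sum(self.window) / len(self.window)
        window_full = len(self.window) == self.window.maxlen
        return window_full and perclos > self.threshold
```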

Some industry insiders believe that it’s only a matter of time before trucker wearables and driver-facing camera systems become standard—or even legally required. There are also early indications that such systems might be of interest for insurance purposes; one carrier’s safety director said he expects a mandate for use of fatigue monitoring “not from the feds, but from the underwriters.”

From the truckers’ point of view, there’s something viscerally offensive about the micromanagement enabled by these technologies. This is the felt reality of AI in trucking labor now: using AI to address human “weakness” through constant, intimate, visceral monitoring. There’s an enormous distance between the narrative of displacement that characterizes most public discussion of AI’s effects on truckers and how these effects are actually being experienced through these technologies. The threat of displacement is a real one, particularly to truckers’ economic livelihood—but driverless trucks are not yet part of truckers’ everyday experience, and drivers are not yet handing off a baton to or splitting routes with a robot coworker. Truckers’ encounters with automation and artificial intelligence have not yet supplanted them.

Instead, technologies like the ones we’ve discussed above represent a distinct and simultaneous threat: a threat of compelled hybridization, an intimate invasion into their work and bodies. AI in trucking today doesn’t kick you out of the cab; it texts your boss and your wife, flashes lights in your eyes, and gooses your backside. Though truckers are, so far, still in the cab, intelligent systems are beginning to occupy these spaces as well—in the process, turning worker and machine into an uneasy, confrontational whole.


Data Driven: Truckers, Technology, and the New Workplace Surveillance by Karen Levy. Copyright © 2023 by Princeton University Press. Reprinted by permission.