Professor Karen Levy has been studying the automation of the trucking industry. That is why New York Times writer Noam Scheiber interviewed her in preparation for his article on robots in the Amazon warehouse on Staten Island. Scheiber and Levy are studying the same thing, although one is looking at highly automated packing warehouses and the other at long-haul trucking.
This steady stripping of human judgment from work is one of the most widespread consequences of automation — not so much replacing people with robots as making them resemble robots.
That’s how Scheiber puts it. People react to this, of course. Levy tells of a trucker who had figured out how to play solitaire on the computer that the company installed in his cab and observes, “It was a super meaningful way for him to preserve a little bit of decisional autonomy.”
But that’s really where the trouble is, isn’t it? People need some “decisional autonomy” to reassure themselves that they are human; that they are persons. The needs of the system—shipping orders at the Amazon warehouse and maintaining the most efficient operation of trucks (and of truckers)—are not compatible with “decisional autonomy.”
The system works better with a single rationality engine. The people in the system work better when they are able to act as competent agents. This is one of the major dilemmas of our time, and I don’t see a happy resolution to it at all.
Marx imagined a communist utopia where mechanized industry did all the work and the people were free to pursue their own interests. Little groups of neighbors would form themselves into literary societies and orchestras and so on. That’s not how things are looking at capitalism’s current stage of development.
Here is a conversation from Levy’s study of truck drivers. Note the difference between the way the questions are put and the way the answers are put. “Consider,” she says, “the following exchange I had with one driver about how to get from Oregon to Indiana. 
Q: So you don’t use GPS though?
A: GPS? No. Honey, I’ve been driving for twenty-nine years, I’ve been all over the United States, I don’t need a GPS. I don’t even need a map.
Q: You don’t use a map?
A: [laughing] No.
A: Hell, no. I could drive—where do you want to go?
Q: West Lafayette, Indiana. […]
A: Go around Ontario, Oregon, over to Pocatello. Go south on Pocatello, go to McCammon, that’s 30, it runs—McCammon runs over to 80, I-80, that’ll come out by Little America, take Little America—or the 80, excuse me—run that over to Chicago, right? Get through Chicago, now from there it’s up to you which way you want to go. […] You’d have to go south on 65, down towards Indianapolis. […]
Q: So how do you learn all this [about different routes]?
A: Honey, driving them.
She sums up exchanges of this kind by saying: “Road knowledge” gleaned from years of experience serves as a clear source of value and professional identity for these workers. 
The Future of Road Knowledge
The principle works the same way at the Amazon center. The workers are doing repetitive jobs, and the question is how fast they can do them. A change in what a worker does that shaves one second off his part of the process is worth a lot of money to the company, and Scheiber’s piece gives us workers who look at it that way.
Here, for instance, is a “stower” named Jing Zhang:
Mr. Zhang seemed like a state-of-the-art Amazon employee — someone who saw the world through the eyes of a manager. “I try to find ways to make me more efficient,” he said. He figured out how to reduce wasted movement by unpacking the box closest to the shelving unit first, then replacing it with the next-closest box, rather than wandering to and from other boxes.
Seeing things through the eyes of a manager means privileging the system perspective over the personal perspective. It means the end of “road knowledge” as the trucker would put it and the end of decisional autonomy, as Levy herself puts it.
Shawn Chase has chosen a different way to keep his head in the game.
A picker named Shawn Chase said he motivated himself by competing with a friend in a different part of the warehouse to see who could earn the higher productivity ranking…“Last week I was 41st in the building,” he said. “This week I’m trying to be top 10.”
But, of course, “keeping your head in the game” is not the choice everyone makes. Karen Levy remembers a trucker “who had figured out how to play solitaire on the computer that the company installed in his cab.” Her comment about this strategy is that it was “a super meaningful way for him to preserve a little bit of decisional autonomy.”
This steady stripping of human judgment from work is one of the most widespread consequences of automation — not so much replacing people with robots as making them resemble robots. “The next pod comes, and a pod comes after that, and after that,” Mr. Long told me. “All day till you get off.”
That is Scheiber’s view. In the meantime, Levy says, “what you end up doing is making people better cogs.”
So there it is. If the future is the full use of robots, humans will have to find something to do. Maybe it will be what Marx envisioned. In the meantime, there will be a mixing together of robots and humans under the direction of the system devised for the process. This system will have no place for decisional autonomy. “Road knowledge” will disappear in due time, but long before that, it will be demeaned as a weakness. The picture to the right shows a much happier kind of resolution.
Decisional autonomy is easier to insist on when someone is directly taking it from you. A boss who is a bully will create a complete palette of acts of covert resistance among the workers. “You can’t do that to me” is the cry of the heart in such circumstances. But when “the system” devises the kind of interface with robots that strips away all decisional autonomy, it is hard to find someone to receive your acts of defiance.
If you are in that situation, you need to step up and be a corporate hero, like Jing Zhang, or a subversive, like the trucker playing solitaire. The system is stacked in favor of the robots, so larger and larger numbers of people are going to have to find a way to do without “decisional autonomy.”
We used to call that “being human.”
 I say “of course,” because I am thinking about my time and the people I have known. It may be that the workers of the future will simply not feel it as a loss.
See her article in Feminist Media Studies, April 2016, Vol. 16, Issue 2, pp. 361-365.
 And it isn’t just the automatic routing that bothers the truckers Levy studied. As two drivers put it in regulatory comments: “A computer does not know when we are tired, [f]atigued, or anything else. Any piece of electronics that is not directly hooked up to my body cannot tell me this. … I am also a professional [and] I do not need an [EOBR] telling me when to stop driving … I am also a grown man and have been on my own for many many years making responsible decisions!”
 In the Amazon line, the stowers come right before the pickers. In fact, Zhang is making an acute comment when he says, “Our customers are the pickers.” Making things easier for the next agents (whether human or robot) is, in fact, Zhang’s job.