Nobody could say exactly when the robots arrived. They seemed to have been smuggled onto campus during the break without any official announcement, explanation, or warning. There were a few dozen of them in total: six-wheeled, ice-chest-sized boxes with little yellow flags on top for visibility. They navigated the sidewalks around campus using cameras, radar, and ultrasonic sensors. They were there for the students, ferrying deliveries ordered via an app from university food services, but everyone I knew who worked on campus had some anecdote about their first encounter.
These stories were shared, at least in the beginning, with amusement or a note of performative exasperation. Several people complained that the machines had made free use of the bike paths but were ignorant of social norms: They refused to yield to pedestrians and traveled slowly in the passing lane, backing up traffic. One morning a friend of mine, a fellow adjunct instructor who was running late to his class, nudged his bike right up behind one of the bots, intending to run it off the road, but it just kept moving along on its course, oblivious. Another friend discovered a bot trapped helplessly in a bike rack. It was heavy, and she had to enlist the help of a passerby to free it. “Thankfully it was just a bike rack,” she said. “Just wait till they start crashing into bicycles and moving cars.”
Among the students, the only problem was an excess of affection. The bots were often held up during their delivery runs because the students insisted on taking selfies with the machines outside the dorms or chatting with them. The robots had minimal speech capacities—they were able to emit greetings and instructions and to say “Thank you, have a nice day!” as they rolled away—and yet this was enough to endear them to many people as social creatures. The bots often returned to their stations with notes affixed to them: Hello, robot! and We love you! They inspired a proliferation of memes on the University of Wisconsin–Madison social media pages. One student dressed a bot in a hat and scarf, snapped a photo, and created a profile for it on a dating app. Its name was listed as Onezerozerooneoneone, its age 18. Occupation: delivery boi. Orientation: asexual robot.
Around this time autonomous machines were popping up all over the country. Grocery stores were using them to patrol aisles, searching for spills and debris. Walmart had introduced them in its supercenters to keep track of out-of-stock items. A New York Times story reported that many of these robots had been christened with nicknames by their human coworkers and given name badges. One was thrown a birthday party, where it was given, among other gifts, a can of WD-40 lubricant. The article presented these anecdotes wryly, for the most part, as instances of harmless anthropomorphism, but the same instinct was already driving public policy. In 2017 the European Parliament had proposed that robots should be deemed “electronic persons,” arguing that certain forms of AI had become sophisticated enough to be considered responsible agents. It was a legal distinction, made within the context of liability law, though the language seemed to summon an ancient, animist cosmology wherein all kinds of inanimate objects—trees and rocks, pipes and kettles—were considered nonhuman “persons.”
It made me think of the opening of a 1967 poem by Richard Brautigan, “All Watched Over by Machines of Loving Grace”:
I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.