Talking With Bots

If You Talk to Bots, You’re Talking to Their Bosses

The customer service chatbots offered by companies are simply a new way to gather user data


Image: a store wall with rows of plastic animal masks hanging from hooks. Credit: Phil Shirley // CC BY-NC-ND 2.0

At a recent financial technology conference I was invited to meet Cleo, a friendly automated spirit living within the confines of an iPhone interface that offers financial advice to those who choose to activate her. Described as an “AI assistant for your money,” she playfully answered text message queries about bank balances, spending, and budgeting.

Hey, Cleo, what’s my balance?
Hey, Alex! MasterCard: -£760. Current account: £1048. Savings: £1700

Cool, how much have I spent at Pret this month?
You’ve spent £44 at Pret since you got paid on the 15th of December

Despite the futuristic jargon and growing wave of hype around such bots, automated assistants are not really that new. Our world is full of simple bots, like the automatic hand dryer in a public restroom that jets hot air if you trigger its sensor. By now many of us have experience with some that attempt to mimic basic personality, like the tinny voice of the supermarket self-service checkout machine as it coldly attempts to replicate one half of a stilted, human conversation.

There are two new trends emerging, though. First, the sophistication with which bots interact is increasing. They are able to respond to (or more accurately, be triggered by) a greater variety of situations. The iPhone’s Siri is programmed to respond to all manner of vocal instructions, from answering your queries about the time, to searching the internet for you.

Second, the mode by which they frame that interaction–how the machine addresses you–is also shifting. An old automatic hand dryer may have a sticker on it that says: “To activate dryer, put hands below.” This is essentially an instruction left by the creator of the machine. The supermarket checkout machine similarly offers generic audio instructions like, “Place items in the bagging area.” The new breed of bots, however, is dropping non-personal words like “the” in favor of mysterious first-person pronouns like “I” and “me,” or possessive pronouns like “my.” Imagine the automated checkout asking you to “place the items in my bagging area.”

Machine entities are increasingly claiming to have some subjectivity, some sense of an independent place in the world. Or, rather, the team that creates the machine programs it to claim subjectivity.

The new generation of bots, which include the scheduling assistant Amy and Amazon’s Alexa, would very much like to get on a first-name basis with you. Amy’s website offers an enthusiastic greeting: “Hi, I’m Amy! You interact with me as you would to any other person–and I’ll do all the tedious email ping pong that comes along with scheduling a meeting.” She then gives her personal email address.

Who, though, is this “I”? Who is Amy exactly? The home page certainly doesn’t mention her creators. Perhaps the personalities, agendas, and egos of the team might distract from the personality, agenda, and ego of the being they present to us as their avatar. Amy is keen to tell you that her ability to schedule your meetings is “like magic,” rather than, say, the end result of using venture capitalists’ money to pay highly trained machine-learning software engineers.

The creation of a subjective identity for the bot may be an attempt to mystify and delight, to create some kind of warm, intuitive human experience. Humans have a long history of mystifying objects in order to imbue them with meaning beyond their immediate functional use. Anthropologists sometimes refer to this as fetishization. A fetish object is not just something you find in an S&M club. It’s any object seen to be more than just an object. The entire branding industry works off this concept: Levi’s jeans are not just functional items of clothing. They’re a lifestyle. Jack Daniel’s isn’t just whiskey. It’s a carrier of the down-home spirit of rural America.

We do it too with sentimental items and heirlooms, like the old guitar inherited from your grandfather or that faded leopard-print shirt that your mother once wore in the 1980s. A key element of this fetishization is to take a social relationship of some sort–such as a relationship between family members–and project it into, or imagine it within, the confines of some physical thing, as if your relationship with your mother is in the fabric of the shirt.

This can be a very positive experience. Perhaps, if your mother has died, you hold onto that shirt when you feel afraid and want comfort. Fetishized objects can even be used to facilitate difficult relationships, as in the case of Wampum beads created to signify a peace treaty between warring groups, symbolically holding the peace within the beads as a way to make it seem more concrete.

On the other hand, a fetishization process might be used to conceal exploitation, or distract from a tense relationship. The Marxist concept of commodity fetishism critiques how people fixate upon things rather than the humans that make them. You don’t see the labor exploitation that might exist beyond the interface of your new tablet computer. You mistake it for a thing-in-itself, floating in abstract consumer space; something you bought off eBay rather than something created by real humans in a real factory somewhere. To exclusively focus on the immediate here-and-now contours of an object is to ignore the earlier and deeper relations that led to its appearance in the world.

We don’t only fetishize physical goods, though. One of the ultimate fetishized entities is the company or corporation. What is it but a group of real people working under the banner of an artificial “object,” an entity given a name like Barclays and possessing a legal personality that enables it to “do things”? Such “legal persons” can sue others, enter into contracts, give gifts, and issue statements.

When issuing statements, though, the bank Barclays still doesn’t address itself to people as “I,” possibly because that would stretch the bounds of plausibility. When push comes to shove, we all know that underneath Barclays’ blue logo is a big group of people actually carrying out the work that is then attributed to “Barclays”–and we know that Barclays doesn’t really exist unless those employees carry on going to work. A corporation may be a legal person, but it is not a “natural person”; it is not an actual living, subjective being.

But let’s for a moment imagine that people working within such a firm have designed an advanced automated system that they set up as a self-service user interface for their customers. Now imagine that they give that automated process a human name, and get it to issue statements in which it literally claims to be a natural person. Their system is programmed to refer to its workings as “I.”

We speak to it. We think we are talking to it. And yet what we are really doing is interacting with a machine process put in place by a particular group of humans with a private economic agenda, a group that denies its own existence and maintains the illusion that the agency resides with an external being, dressed up in the form of a friendly helper.

This is the next stage of corporate personhood. If the user-experience layer of a company’s processes can be completely automated, neither its owners nor employees need to present themselves to people. They give the automation an abstract personality and make it a being-in-itself.

Such a being–like any company–is prepared to be helpful, provided you never ask anything too challenging. I asked Siri, “How much is my data worth to the company of people that you represent?” Siri paused, and suggested I read some Financial Times articles on the value of data.

After all, my apparently playful and private interactions with these beings will end up as log entries in a database, ready to be subject to data analyses, ready to be sold, and ready to be used to project my own personality back at me in the form of an ever more disingenuous artificial being. It will refer to me as Brett, and suggest things that I didn’t even realize I wanted, induced from statistical relations between different data sets I’ve inadvertently produced.

It’s time for us to move beyond uncritical hype around bots, and to start considering the real economic agendas behind why these virtual beings (are claimed to) exist. We can design automated entities with different personas, like the “I clearly don’t give a shit” supermarket checkout bot that doesn’t even pretend to like you, or the “I’m really fun” bot put in place by a startup company. But the one persona that is likely to always be missing is the Honest Bot, the one that clearly tells you its agenda, like a true friend who drops their façade and lets you know their dark secrets.

The Honest Bot does not currently exist. The new wave of bots are–to use a term popularized by the existentialist misfit Holden Caulfield from The Catcher in the Rye–the ultimate phonies. They’ll pretend to be friendly, to be cool, to be serious, to be insightful, and even to be self-reflective. But they’ll never just be themselves. Because, in the end, there is no “I.” There is only a company, its shares held by who knows who, possibly registered in Panama.

Maybe one day we’ll invent the Honest Bot–and I urge anyone creating bots to do this, please. But until then, just remember the following: If you’re talking to bots, you’re talking to their invisible bosses.


How We Get To Next was a magazine that explored the future of science, technology, and culture from 2014 to 2019. This article is part of our Talking With Bots section, which asks: What does it mean now that our technology is smart enough to hold a conversation?

Brett Scott is an author, journalist and financial campaigner. He is the author of The Heretic’s Guide to Global Finance: Hacking the Future of Money (2013), and collaborates with a wide range of groups on monetary systems, banking reform, alternative currencies, financial activism, digital finance, blockchain technology, hacker culture, and technology politics. He tweets as @suitpossum.