In 1974, Pulitzer Prize-winning author Studs Terkel published an oral history of Americans on the job in his book Working. He interviewed people from all walks of life, whether newspaper delivery boy or stockbroker, public school teacher or sex worker.
Within months of starting her career, one of Terkel’s interviewees, flight attendant Terry Mason, found herself jet-setting to Europe and meeting celebrities. But soon enough, the less-than-glamorous reality of the profession sank in, she told him: “It’s always: the passenger is right. When a passenger says something mean, we’re supposed to smile and say, ‘I understand.’ … Even when they pinch us or say dirty things, we’re supposed to smile at them.”
Five years after Mason’s interview was published, sociologist Arlie Russell Hochschild coined the term “emotional labor” while studying this disconnect in flight attendants. By her assessment, the profession expected these women to play up a certain social role–”a beautiful and smartly dressed [American] Southern white woman, the supposed epitome of gracious manners and warm personal service”–in the interest of the company. Underneath, Hochschild found that flight attendants seethed with anger at disrespectful and abusive passengers. An outward smile often masked deep-seated anxiety, resentment, and symptoms of depression.
Around that same time, psychologist Christina Maslach began exploratory research in the area of job burnout, eventually authoring, in 1981, the now-standard Maslach Burnout Inventory, a 22-item survey that measures a person’s degree of professional burnout. Survey statements include, “I feel used up at the end of the workday,” “I’ve become more callous towards people since I took this job,” and “I feel [customers] blame me for some of their problems.”
Over three decades later, working in a service job still demands the stamina and resilience to handle a barrage of customer complaints–and often even abuse–with a smile. Professions that require emotional labor, which involves inducing or suppressing emotion for the sake of a job, continue to see unprecedented levels of attrition, especially among customer service representatives, flight attendants, doctors, nurses, school teachers, and hotel employees.
But as robotics and computing evolve, some researchers foresee a future where technology can relieve the long-held emotional burden of some of these professions. They think we owe it to the service workforce to offer a new frontline of robotic protection. In fact, they think it’s long overdue–but does that actually mean just automating these jobs away?
It must be noted that emotional labor is also very often gendered labor. It was no coincidence that Hochschild, a woman, coined the term and initially saw her research taken up most seriously in feminist circles. According to the U.S. Bureau of Labor Statistics, women make up 95 percent of all secretaries and administrative assistants, 93 percent of flight attendants, 73 percent of cashiers, 70 percent of waitresses, 66 percent of hotel desk clerks, and 65 percent of customer service representatives.
Even outside these conventional service sectors, women still bear the brunt of regulating emotions in the workplace. Hochschild noted that women generally handle the management of feelings more often than men, both at work and home. Even in higher-level corporate settings, where both genders are expected to defer to clients, women may be counted on for emotional extras like remembering birthdays or making small talk.
In today’s on-demand economy, where ratings and social media can make or break a company, this kind of work has more relevance than ever. Restaurant patrons who feel slighted won’t hesitate to give a dreaded one-star review on Yelp based on bad service alone. In some cities, Uber drivers need a near-perfect five-star rating to keep their accounts from being deactivated. Even in U.S. hospitals, healthcare professionals feel the pressure to smile and perform, as low patient satisfaction ratings–measured by a federally mandated survey that asks patients to rate their communication with doctors and nurses–lead to financial penalties taken from the hospital’s Medicare reimbursements.
Most current approaches to successfully performing emotional labor and easing professional burnout fall on the employee’s shoulders. One established method for curbing emotional exhaustion is called deep acting, or modifying one’s inner feelings to match outward expression. This, according to Hochschild, “from one point of view involves deceiving oneself as much as deceiving others.”
In her book The Managed Heart, Hochschild distinguishes deep acting from surface acting as two separate approaches to performing emotional labor. For instance, when dealing with a difficult passenger, a flight attendant who forces a smile while muttering obscenities under her breath adopts surface acting. The expression on her face is painted on and “not part of her true self.” But an employee who adopts deep acting might try deep breathing to calm herself down, or tell herself that the passenger is probably afraid of flying. She uses these tactics to minimize the gulf between her real feelings and those expected of her.
Deep acting, according to research, allows a worker to perform emotional labor with reduced emotional dissonance. Studies on administrative assistants and hotel service providers cite lower levels of stress, exhaustion, and cynicism in those who use the technique. While surface acting has been associated with job burnout and depression, those who practice deep acting tend to feel a greater sense of personal accomplishment at work.
But while this and other recommendations to try mindfulness practice, meditation, and exercise may help laborers better manage problem customers and stressful jobs, they squarely place the responsibility of dealing with the rigors of emotional labor on the “victim.” Customers are never told to take ownership of the way they treat those serving them, or encouraged to practice their emotional intelligence when interacting with service representatives. As Laurie Penny writes about turning the ideology of self-care into a politicized antidote to systemic issues: “Essentially, if we are sick, sad, and exhausted, the problem isn’t one of economics. There is no structural imbalance, according to this view–there is only maladaptation, requiring an individual response.”
That aside, when dealing with downright abusive customers, no amount of deep acting will suffice. Nor should it be expected to. This is where professions with especially high rates of turnover–retail, customer service, nursing, social work–might benefit from a technological buffer.
“In the future, robots may serve in a variety of support roles, such as home assistance, office support, nursing, childcare, education, and elder care,” said Gurit Birnbaum, a psychologist at the Interdisciplinary Center (IDC) Herzliya. Last year, Birnbaum and her colleagues published a study designed to answer a key question in human-robot interaction: Can a robot’s actions make humans feel comfortable and emotionally supported? If the answer was no, then robots probably couldn’t perform many service jobs.
In their experiment, a non-humanoid robot named Travis reacted responsively (e.g., with a head nod or reassuring words) to people relating emotional stories. Upon observing the interaction between the robot and human participants, the researchers noted approach behaviors toward Travis, such as leaning forward and making eye contact. The experiment’s participants also said they felt listened to and more open to the idea of robot companionship afterward.
Unbeknownst to the participants, Travis wasn’t entirely automated–in a Wizard of Oz-like setup, there was a human behind the curtain in control of the machine’s reactions. Nevertheless, the trial suggested that people are open to interacting with robots on an emotional level.
Other studies suggest that the human tendency to treat inanimate objects as social beings increases the likelihood that we will accept emotional labor performed by service robots. A 2013 experiment had subjects watch videos of a faceless human choking and punching a dinosaur robot named Pleo, accompanied by sound effects of suffering. Afterward, viewers reported feeling empathic concern for Pleo. Their skin conductance level–a measure of emotional arousal–had become elevated while watching the video, offering an objective measure of emotional response.
Enter a humanoid robot named Pepper. Released in 2015, the cherubic-faced bot has already found its place in retail stores and hospitals as an interactive information kiosk of sorts. From Tokyo-based SoftBank Robotics, Pepper is designed to perceive human emotions and react accordingly. The friendly-looking bot is also already a staple inside some Japanese homes, acting as the family’s robot companion.
Still, with well over 10,000 Peppers “in the workforce,” the robot’s empathy and emotion-recognition abilities need work. The Verge’s Tokyo reporter, Sam Byford, says Pepper’s spoken responses feel canned rather than synthesized in real time. For now, Pepper can only recite stock phrases and, for example, answer shoppers’ questions by sending text messages, because it’s not yet advanced enough to speak the answers aloud. Rather than conversing in a natural back-and-forth way, notes Byford, Pepper seems more like a walking, talking billboard that boasts about SoftBank’s other products.
At the moment, humanoid robots may be confined to the realm of gimmicky entertainment. But when the technology does get there–as it most certainly will–the potential for service robots is vast.
Then there’s Ellie, a clinical therapist whom I recently met in the lab of computer scientist Jonathan Gratch. In a soothing voice, she asked me questions like, “Where are you originally from?” and “What do you like about living in Los Angeles?” As I responded, Ellie nodded her head and smiled at all the right moments, immediately putting me at ease.
As charming as she is, Ellie isn’t a real person. She’s a virtual human–the creation of Gratch and his colleagues at the University of Southern California’s Institute for Creative Technologies. A software agent, she looks like an ordinary CGI character displayed on a screen. But she’s not scripted–behind the scenes, she uses sophisticated machine vision and voice analysis to interact with humans in real time.
“Researchers focused on emotional labor have suggested using virtual humans as a first line of defense, so to speak, in customer service,” said Gratch, who is also a computer science professor at USC’s Viterbi School of Engineering. “Anywhere you have people doing emotional labor, there could be a potential for this technology to serve that role without incurring the negative effects on a worker’s health.”
I certainly respect Gratch’s line of thinking, but isn’t he ignoring the fact that most customers hate interacting with automated customer service systems? Are we really going to change our minds just because we can see a “virtual human” on a computer screen?
After the brief demonstration, Gratch and his colleague lifted the curtain on Ellie. As it turned out, throughout our conversation she had been tracking an astounding amount of data about me: my facial expression, attention level, upper body movements, voice pitch, eye gaze, and smile level. She recorded, moment to moment, if I had seemed happy or sad, engaged or distant, and tweaked her responses accordingly.
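The kind of moment-to-moment adaptation described above can be pictured as a simple loop: track a few affect signals per frame, smooth them over a short window, and pick a response style from the result. The sketch below is purely illustrative–the signal names, weights, and thresholds are hypothetical and have nothing to do with Ellie’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class AffectFrame:
    # One moment of the kinds of signals a system like Ellie might track.
    smile: float      # 0.0 (none) to 1.0 (broad smile)
    gaze_on: bool     # whether the speaker is making eye contact
    pitch_var: float  # voice-pitch variability, a rough engagement cue

def engagement_score(frames, window=5):
    """Average the most recent frames into one engagement estimate (0-1)."""
    recent = frames[-window:]
    if not recent:
        return 0.0
    per_frame = [
        # Hypothetical weights: smile counts most, then gaze, then pitch.
        0.5 * f.smile + 0.3 * (1.0 if f.gaze_on else 0.0) + 0.2 * min(f.pitch_var, 1.0)
        for f in recent
    ]
    return sum(per_frame) / len(per_frame)

def choose_response(frames):
    """Pick a response style from the current engagement estimate."""
    score = engagement_score(frames)
    if score > 0.6:
        return "elaborate"   # speaker seems engaged: ask a follow-up question
    if score > 0.3:
        return "affirm"      # neutral: nod, give a brief acknowledgement
    return "comfort"         # disengaged or sad: slow down, reassure
```

A real system would fuse many more channels (facial expression, posture, gaze, prosody) with learned models rather than hand-set weights, but the overall shape–sense, estimate, adapt–is the same.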
Going one step further, Ellie actually increases people’s willingness to disclose personal or embarrassing information about themselves, compared to talking with a real human. Gratch’s point? There’s trust and affection there, and her facial presence helps. Another study comparing a virtual human with an audio-based system like Siri found that the embodied agent evoked less anger and a more positive mood when bad news was delivered.
Early experimental evidence suggests that humans tend to have more empathy for embodied, responsive characters. And when we inevitably get mad or frustrated, machines can’t experience emotional fatigue or feel pain, which is undoubtedly an advantage for their place in service work.
We can’t, however, shelve the unintended consequences of entirely replacing human service professionals with machines–specifically because the service industry is made up mostly of women and, in many hourly-wage roles, workers with less formal education. If such work becomes automated in the future, these people may no longer have to perform the emotional labor associated with their professions–but they could also find themselves out of a job. If robots begin to replace the occupations of specific demographics, this will quickly become a political issue, even if we get the technology right.
Guy Hoffman, an assistant professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell University, invented Travis and other robots to study how they can enhance our everyday lives. In his opinion, machines have already taken over service jobs previously held by bank tellers, airline check-in attendants, and elevator operators.
“In the 1940s or ’50s, you can imagine that saying hello to the elevator operator was an important part of a person’s day. Some may have said, ‘I would never enter an elevator not driven by a person,’ or ‘I don’t want to give up the morning greeting of my elevator person,’” said Hoffman. “But gradually, ‘human elevator operator’ as a profession disappeared. I think people underestimate our adaptation to change.”
Science and technology giants like Stephen Hawking, Bill Gates, and Elon Musk have already expressed their reservations about artificial intelligence, robot workers, and how far automation will go. Perhaps finding some middle ground will be the answer. One possible future, Gratch suggests, is that human service professionals are reserved for the elite, while the rest of us deal only with machines. No airline Platinum status? Your call gets transferred to a virtual human like Ellie.
“That’s one possible negative trajectory with this technology being used for more service jobs. But researchers are looking into the strengths of technology versus the strengths of people, rather than just swapping out people with technology,” said Gratch. “The real question is, is there some way to combine the two to do better?”
How We Get To Next was a magazine that explored the future of science, technology, and culture from 2014 to 2019. This article is part of our The Way We Work section, which looks at new developments in employment and labor. Supported by Pearson.