4 Jobs That We Shouldn’t Trust to Robots

Technology has become an integral part of everyday life, but there has to be a line, right? From hiring and firing employees to delivering medical news and sentencing criminals, some jobs still need a human touch.

It’s 2019 and robots are everywhere. They’re in our pockets, our cars, our coffee makers, and, yes, even our toilets. But there are some jobs that robots just aren’t cut out for, and our societal inability to tell the difference has caused some serious problems.

Technology evolves pretty fast. In fact, the only thing that evolves faster is our ability to integrate new technology into our daily lives. Smartphones have become almost a greater necessity than oxygen in their short decade of existence, virtual assistants are bringing the robot helper dream into reality for millions of households worldwide, and artificially intelligent software is automating more jobs than ever before.

However, while robots are particularly adept at answering trivia questions and reporting the weather, a lot of jobs require empathy, common sense, and other personality traits found exclusively in humans. Sometimes, in our drive to be as efficient as possible and use tech to ease our own workloads, we forget that.

Hiring

Not OK, computer. Amazon’s hiring AI had a sexist streak

Looking for a job is an incredibly stressful process, but hiring someone can be just as intense. After reading through hundreds of resumes and conducting dozens of interviews, it’s understandable that recruiters would eventually turn to robots to lighten the load. Unfortunately, as became apparent last year, expecting robots to be unbiased is a much bigger ask than you might realize.

According to Reuters, Amazon had to scrap its artificially intelligent recruiting software for being biased against women. The software, originally developed in 2014, was trained on resumes submitted by applicants over a 10-year period. Because of the tech industry’s infamous gender gap, most of those applicants were men, which led the software to penalize candidates with the word “women” anywhere on their CV.

This is one of the largest obstacles for artificial intelligence. Because AI learns from societal patterns and data, and our society has a tendency to be a bit sexist, systems like Amazon’s recruiting software inherently adopt sexist practices. To make this kind of software practical, developers need to find a way to write bias out of the programming. Unfortunately, we haven’t figured out how to do that with humans yet, let alone robots, so maybe we should stick to good old-fashioned human hiring.
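To see how easily this happens, consider a toy example: train a simple text classifier on historical hiring decisions that skew male, and it will learn to penalize words associated with women. Everything below — the data, the labels, the model choice — is invented for illustration; this is a minimal sketch of the failure mode, not Amazon’s actual system.

```python
# Hypothetical sketch of how historical bias leaks into a hiring model.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny invented "historical" dataset: past hires skew male, so resumes
# mentioning women's clubs or colleges were rarely labeled as hired.
resumes = [
    "chess club captain, software engineer",
    "software engineer, hackathon winner",
    "women's chess club captain, software engineer",
    "women's college graduate, software engineer",
]
hired = [1, 1, 0, 0]  # biased historical outcomes, not merit

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model learns a negative weight for the token "women", reproducing
# the bias baked into its training labels.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(weights["women"])  # negative: the word alone lowers the score
```

The model isn’t “deciding” anything sexist; it’s faithfully reproducing the pattern in its training data, which is exactly the problem.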

Firing

Getting fired sucks. Losing your source of income is enough to make anyone seriously worried about what the future holds, and there’s no robot that can replicate the human emotions needed to calmly and reasonably guide them through the transition. But that doesn’t mean some companies haven’t tried.

One story in particular illustrates the problem with an automated firing system. The saga saw Mr. Ibrahim Diallo fired from his job without reason or explanation. When he arrived at work, his key card wasn’t functioning properly. After being let in by a security guard, he found himself locked out of all workstations and computers. He talked to his manager, who assured him it was a mistake and that he was not fired. Shortly after that, two men arrived at his office with instructions to remove him from the building.

“I was fired. There was nothing my manager could do about it. There was nothing the director could do about it. They stood powerless as I packed my stuff and left the building,” wrote Diallo in a blog post detailing the situation.

Diallo’s company employed an automated system that, once an employee is marked as fired, activates a series of processes: deactivating key cards, revoking access to workstations, and alerting security that a now-ex-employee needs to be removed. Each process begins as soon as the previous one ends, with zero human interaction necessary.

“The system was out for blood and I was its very first victim.”
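For illustration, here’s a minimal, hypothetical sketch of what a chained offboarding pipeline like this might look like. The function names and steps are invented; the point is that a single status change triggers every step, with no human override anywhere in the chain.

```python
# Hypothetical offboarding automation — not the actual system at
# Diallo's employer. Each step fires as soon as the previous one
# completes, with no confirmation point anywhere.
def deactivate_keycard(employee_id: str) -> None:
    print(f"keycard for {employee_id} deactivated")

def revoke_system_access(employee_id: str) -> None:
    print(f"workstation and account access for {employee_id} revoked")

def notify_security(employee_id: str) -> None:
    print(f"security dispatched to escort {employee_id} out")

OFFBOARDING_STEPS = [deactivate_keycard, revoke_system_access, notify_security]

def on_employee_terminated(employee_id: str) -> None:
    # A single status change — even a paperwork lapse mislabeled as a
    # termination — runs every step to completion.
    for step in OFFBOARDING_STEPS:
        step(employee_id)

on_employee_terminated("i.diallo")
```

A design like this is efficient right up until the triggering record is wrong, at which point the absence of a human checkpoint becomes the whole story.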

However, as Diallo’s manager explained to him, he hadn’t actually been fired. A former manager had forgotten to file his paperwork correctly, which caused his contract to lapse and the system to list him as a former employee. The “firing” process kicked in automatically, and there was nothing anyone could do. It took three weeks for the company’s managers and directors to undo the mistake, costing Diallo three weeks’ salary and any trust he had in automated firing systems.

Doctoring

Doctors are some of the most well-trained professionals in the world, and technology has certainly improved their ability to do their jobs. In fact, robotic surgeries have become fairly common in hospitals around the country, helping skilled surgeons perform even more life-saving procedures. However, bedside manner is a vital tool in a good doctor’s utility belt, and robots just aren’t there yet. We just wish someone had told this to the following doctor…

At the Kaiser Permanente Medical Center in Fremont, California, one doctor thought it would be a good idea to deliver some bad news to a patient by way of a robot with a video-link screen. After the doctor told the 78-year-old man and his family that he had only a few days to live, the family expressed their extreme and understandable frustration at what they called “an atrocity of how care and technology are colliding.”

Robotic surgery is one thing. But telling someone that they’re going to die is not a job for a robot, no matter how creepily realistic some of them get. Human empathy, compassion, and even touch aren’t yet replicable in robots, and they are absolutely necessary when it comes to these kinds of medical duties.

Judging and Sentencing

Much like hiring new candidates, firing old employees, and practicing medicine on scared patients, judging is a particularly nuanced profession that requires a human touch to be done right. However, with the criminal justice system bogged down by long backlogs, employing an algorithm to handle the smaller details can be an attractive alternative to months-long wait times. That choice, however, has some pretty serious consequences if no one checks up on it.

In a ProPublica report titled Machine Bias, researchers looked into a computer program used in Florida to score criminal defendants on their likelihood of committing another crime. They found that not only were these assessments remarkably unreliable, they were also racially skewed, falsely flagging black defendants as future criminals at nearly twice the rate of white defendants.
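The core of that analysis is a simple fairness check: compare the false positive rate — people flagged as high risk who never went on to reoffend — across racial groups. The toy records below are invented to illustrate the calculation; ProPublica’s real analysis used thousands of Broward County, Florida cases.

```python
# Sketch of the false-positive-rate comparison behind ProPublica's
# finding. All records below are hypothetical.
from collections import defaultdict

# (group, flagged_high_risk, actually_reoffended)
records = [
    ("black", True, False), ("black", True, False), ("black", True, True),
    ("black", False, False), ("white", True, False), ("white", False, False),
    ("white", False, False), ("white", True, True),
]

def false_positive_rate(rows):
    """Share of people who did NOT reoffend but were still flagged high risk."""
    flags = [flagged for flagged, reoffended in rows if not reoffended]
    return sum(flags) / len(flags)

by_group = defaultdict(list)
for group, flagged, reoffended in records:
    by_group[group].append((flagged, reoffended))

for group, rows in sorted(by_group.items()):
    print(f"{group}: FPR = {false_positive_rate(rows):.2f}")
# In this toy data the black group's false positive rate (0.67) is twice
# the white group's (0.33) — the kind of gap ProPublica reported.
```

The unsettling part is how easy the check is to run; the disparity sat in production systems until someone outside the vendor bothered to compute it.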

These risk assessments are used in at least 10 US states to help judges decide everything from bond amounts to the length of prison sentences. And had ProPublica not taken the initiative to look into them, that bias would still be going entirely unchecked.

