THE 6th ANNUAL LIST OF EMERGING ETHICAL DILEMMAS AND POLICY ISSUES IN SCIENCE AND TECHNOLOGY FOR 2018
Sentencing software: The black-boxing of the American legal system
While we don’t know just how many jurisdictions are using it, we do know that sentencing software, such as that made by Equivant, is already out there. In 2016, Eric Loomis was sentenced to six years in prison for attempting to flee an officer and operating a motor vehicle without the owner’s consent. It didn’t help that the car had been used earlier that day in a drive-by shooting or that Loomis was a registered sex offender. At the sentencing hearing, the court mentioned that it arrived at the sentence with the help of a “COMPAS assessment,” which estimated Loomis’ risk of recidivism. COMPAS is a program sold by Northpointe, Inc. (since rebranded as Equivant) and marketed as a means to guide courts in their sentencing. According to Northpointe, the program is “designed to incorporate key scales from several of the most informative theoretical explanations of crime and delinquency…” But the issue, according to Loomis’ lawyers, is that the algorithm is designed by a private company that will not reveal how it works. The Wisconsin Supreme Court decided that Loomis had no right to examine the inner workings of Northpointe’s proprietary software.
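Northpointe won’t say what’s inside COMPAS, which is exactly the complaint. Purely as a hypothetical illustration (the feature names, weights, and logistic form below are invented, not Northpointe’s), a risk-scoring tool of this general kind boils down to something like this:

```python
import math

# Hypothetical feature weights -- invented for illustration; COMPAS's
# real inputs, weights, and model form are proprietary and unpublished.
WEIGHTS = {
    "prior_arrests": 0.30,
    "age_at_first_arrest": -0.04,
    "failed_to_appear": 0.50,
}
BIAS = -1.2

def risk_score(defendant):
    """Map defendant features to a 1-10 recidivism risk decile."""
    z = BIAS + sum(WEIGHTS[k] * defendant[k] for k in WEIGHTS)
    probability = 1 / (1 + math.exp(-z))             # logistic curve
    return min(10, max(1, round(probability * 10)))  # 1 = low risk, 10 = high

print(risk_score({"prior_arrests": 4, "age_at_first_arrest": 19,
                  "failed_to_appear": 1}))  # -> 4
```

The defendant sees only the final number; the weights, and the data they were fit on, stay behind the vendor’s wall.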
Emotion-sensing facial recognition: Are we having fun yet?
Rana el Kaliouby is the CEO and cofounder of Affectiva, the company that wants to see how annoyed you are while you shop, eat, play video games, and just generally do anything that could involve you spending money at some point. Affectiva’s emotion recognition software can be incorporated into all sorts of things in order to provide “deep insight into unfiltered and unbiased consumer emotional responses to digital content.” Essentially, this means the software lets companies see exactly how you respond and react while using their website, playing their game, or using their app, so that they can make adjustments to improve your experience (and their business). But how do they collect this data? Simple. Webcams. Say hello!
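Affectiva’s pipeline is proprietary, but systems like this generally pair off-the-shelf face detection with a trained emotion classifier. A minimal sketch using OpenCV’s stock face detector, with a stand-in classify_emotion function where a real product would plug in its trained model:

```python
import cv2  # pip install opencv-python

# OpenCV ships a pretrained Haar-cascade face detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels):
    """Stand-in for a trained emotion model (Affectiva's is proprietary)."""
    return "neutral"  # a real model would return e.g. joy, anger, surprise

cap = cv2.VideoCapture(0)  # the webcam saying hello
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        print("face at", (x, y), "->", classify_emotion(frame[y:y+h, x:x+w]))
cap.release()
```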
Ransomware: We’re under attack!
2017 saw millions of dollars paid to cyber criminals by companies and individuals. At its most basic, ransomware is like a virus that gets into your computer, system, or database and encrypts your files so you can no longer read them. The ransomers then demand money in exchange for the decryption key. To add to the indignity, some of them will also include a creepy clown photo or threats of physical violence.
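The machinery itself is just ordinary symmetric encryption pointed the wrong way. A harmless sketch of that mechanism using the Python cryptography package (on a throwaway byte string, not your files):

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # the attacker keeps this; the victim doesn't
cipher = Fernet(key)

document = b"Q3 payroll spreadsheet"
locked = cipher.encrypt(document)  # what's left on the victim's disk
print(locked[:30], "...")          # unreadable without the key

# Paying the ransom buys (at best) this one line:
print(cipher.decrypt(locked))      # b'Q3 payroll spreadsheet'
```

In a real attack, the symmetric key is typically itself encrypted with the attacker’s public key and wiped from the machine, which is why disk forensics rarely recovers it.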
You don’t have to be a computer genius to launch an attack. The Dark Web currently hosts about 45,000 ads for ransomware for sale, and a lot of it is designed to hit regular citizens. Since many of us blend our business and personal communications, the attacks can get pretty awkward. Among those attacked, 59% said they paid the ransom out of their own pockets (only 37% of ransoms were paid by their employers), with payouts averaging $1,400.
Helix: Life? There’s an app for that
Helix is the 23andMe for the App Generation. No longer about simply discovering your ancestry, this new wave of digital genomics is all about optimizing your body and mind and preparing for your future – straight from your phone, of course. Helix sequences your DNA once, then lets you buy individual apps that read that sequence and tell you everything you’ve always wanted to know, like whether or not you’re a night owl (because, presumably, you need genetic confirmation of that).
But how do we know certain characteristics can be measured accurately? How would potential customers educate themselves about the status of the research used to do genetic analyses? If we get into diagnosis, what role will the FDA play in monitoring these apps? Do you even want your phone to break the news that you have cancer?
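Mechanically, much of consumer genomics reduces to looking up single-letter variants (SNPs) in a stored genotype file. A sketch of the idea, with the “night owl” variant and its interpretations entirely invented for illustration:

```python
import csv, io

# Hypothetical chronotype variant -- the rsID and interpretations here
# are invented for illustration, not scientific or medical claims.
NIGHT_OWL_SNP = "rs0000001"
INTERPRETATION = {"AA": "likely night owl", "AG": "intermediate",
                  "GG": "likely early bird"}

# Stand-in for a raw genotype export (rsid, chromosome, position, genotype),
# the tab-separated format consumer services let customers download.
raw_export = io.StringIO(
    "# comment lines are skipped\n"
    "rs0000001\t11\t123456\tAG\n"
    "rs0000002\t4\t987654\tCC\n"
)

def read_genotypes(f):
    rows = (line for line in f if not line.startswith("#"))
    return {r[0]: r[3] for r in csv.reader(rows, delimiter="\t")}

genotypes = read_genotypes(raw_export)
call = genotypes.get(NIGHT_OWL_SNP, "not tested")
print(INTERPRETATION.get(call, f"no interpretation ({call})"))  # intermediate
```

Whether the variant-to-trait mapping rests on replicated science is precisely what a customer can’t see from the app store listing.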
Robot priests: Welcome to the robo-reformation
BlessU-2 was designed to be controversial and spark debate about the future of the church. When the Protestant Church of Hesse and Nassau rolled it out to mark the 500th anniversary of the Reformation in Wittenberg, Germany this summer, Stephan Krebs, one of its clergymen-creators, specifically said a robot could not hope to provide the pastoral care of a priest. The robot was built to deliver blessings in five languages while moving its arms and flashing lights at you (the latter of which seems fairly unnecessary), raising the question: is nothing sacred?! But BlessU-2 is not alone. Meanwhile, Japan’s SoftBank Robotics is producing a new line of Pepper robot Buddhist monks. Pepper can deliver blessings and beat on a drum in a fairly crude manner. But this robot isn’t meant to be a mere conversation starter. The plan is to have these robot monks deliver funeral rites to Japan’s booming elderly population for about one-fifth the cost of a traditional, human-led funeral. And this brings up an interesting question – could a robot be a legitimate addition to religious services for people and places that have no alternative?
Social credit systems: Street cred takes on a whole new meaning
In 2014, China’s State Council issued a report called “Planning Outline for the Construction of a Social Credit System,” announcing its plan to build a personal scorecard for every person and business in China based on their level of trustworthiness – and that participation will be mandatory for every Chinese citizen by the year 2020. Though it’s still unclear how, exactly, it will work, the system will attempt to rate people in four areas: “honesty in government affairs” (政务诚信), “commercial integrity” (商务诚信), “societal integrity” (社会诚信), and “judicial credibility” (司法公信).
The government is currently collecting data on shopping habits, credit ratings, online behavior, friend connections, and anything else that might be used to give each of China’s 1.3 billion people a score from 350 to 950. People and businesses with higher scores will find it easier to do business (anything from signing contracts to checking into hotels faster) – people with low scores, not so much. It’s been suggested that people with low ratings will face slower internet speeds; restricted access to restaurants, nightclubs, or golf courses; and even the loss of the right to travel freely abroad. Scores will influence a person’s rental applications, employment, ability to get insurance or a loan, and even social-security benefits.
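No scoring formula has been published, so the following is a deliberately invented toy; it shows only the general shape of such a system (weighted behavioral inputs clamped into the reported 350-950 band):

```python
# All weights and inputs invented for illustration; no official formula
# for China's social credit system has been published.
WEIGHTS = {
    "on_time_payments": 2.0,
    "flagged_posts": -15.0,      # online behavior
    "low_scoring_friends": -5.0, # friend connections count against you
}

def social_credit(citizen, base=600.0):
    raw = base + sum(w * citizen.get(k, 0) for k, w in WEIGHTS.items())
    return int(min(950, max(350, raw)))  # clamp to the reported 350-950 band

print(social_credit({"on_time_payments": 40, "flagged_posts": 2,
                     "low_scoring_friends": 3}))  # -> 635
```

Note the friend-connection input: if your contacts’ behavior feeds your score, every relationship becomes a liability calculation.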
The rise of the friendbot: From digital resurrection to custom friendship
Even grieving has taken on a new form in the 21st century. Eugenia Kuyda created an app called Luka last year after the death of her best friend Roman. Like many of us, Roman left behind a digital footprint made up of social media posts and chat messages. Kuyda collected those items from as many friends as she could and used them to create a sort of memorial chatbot designed to feel a lot like talking to Roman and to replicate the sense of intimacy and security she missed so much. Since then, thousands of people have chatted with Roman’s bot.
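Kuyda’s team trained a neural network on Roman’s messages. A much simpler way to get the flavor of a memorial bot is retrieval: answer each prompt with the closest line the person actually wrote. A sketch using scikit-learn, with an invented three-line corpus standing in for years of chat logs:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in corpus; a real friendbot would ingest years of chat history.
messages = [
    "ha, I told you that restaurant was overrated",
    "don't stress, it always works out somehow",
    "let's just take the night train and figure it out there",
]

vectorizer = TfidfVectorizer()
corpus_vectors = vectorizer.fit_transform(messages)

def friendbot(prompt):
    """Reply with the stored message most similar to the prompt."""
    sims = cosine_similarity(vectorizer.transform([prompt]), corpus_vectors)
    return messages[sims.argmax()]

print(friendbot("should we take the train?"))
# -> "let's just take the night train and figure it out there"
```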
“With Me” is an app created by a South Korean programmer that allows you to make a 3-D avatar of a dead loved one so that you can take posthumous selfies together (the catch is that the person needs to visit a 3-D scanning booth before their death in order to capture the right images). The avatar can even read your facial expressions and ask you what’s wrong or comfort you. These digital duplicates raise all sorts of issues about personal coping skills and resilience.
The Textalyzer: Stay below the legal limit of 0
Distracted driving is an increasing problem in the United States and all over the world. Automobile accident fatality rates have risen dramatically over the last two years. The suspected culprit? Texting and driving. Well, not only texting, but also Snapchatting, rifling through your playlist, checking e-mail, Candy Crushing, Tweeting, Facebooking, adjusting Google Maps, and scrolling through the Internet. Texting while driving is illegal in 47 states and the District of Columbia, but we see drivers of all ages doing it every day. One proposed solution is a machine dubbed the “Textalyzer.” It’s made by Cellebrite (the same company that made the software that can break into locked iPhones) and would give police officers the ability to access a driver’s phone after a crash or traffic infraction to see if the driver was using the device in the time leading up to the crash. The officer would plug the Textalyzer into the driver’s cell phone and retrieve a history of what they’ve been up to. Cellebrite claims the content of texts or searches will not be accessible to officers – just information about the time and length of usage on the phone. This data would, however, include exactly what apps you were using at exactly what time. Even a single swipe of the screen can be detected.
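Cellebrite hasn’t published how the Textalyzer works, but the report it’s described as producing (timestamps and app activity, no content) is easy to picture. A sketch over an invented event log:

```python
from datetime import datetime, timedelta

# Invented usage log of (timestamp, event) pairs -- note: no message content.
log = [
    (datetime(2018, 3, 2, 17, 41, 10), "screen swipe"),
    (datetime(2018, 3, 2, 17, 41, 12), "Messages: keyboard active"),
    (datetime(2018, 3, 2, 17, 41, 55), "Snapchat: camera opened"),
    (datetime(2018, 3, 2, 17, 42, 30), "screen off"),
]

def usage_before(crash_time, window_minutes=5):
    """Events in the window leading up to the crash -- metadata only."""
    start = crash_time - timedelta(minutes=window_minutes)
    return [(t.strftime("%H:%M:%S"), e) for t, e in log
            if start <= t <= crash_time]

for entry in usage_before(datetime(2018, 3, 2, 17, 42, 45)):
    print(*entry)
```

Even this metadata-only view reconstructs a minute-by-minute account of the driver’s attention, which is precisely what worries privacy advocates.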
The Citizen app: Keeping you safe or creating more criminals?
Developers behind an app called “Citizen” advertise it as a way for innocent citizens to stay safe and aware in areas wracked by crime. At its core, this controversial app acts as a digital police scanner, notifying people of ongoing crimes or major events in their area. It also allows for live streaming video directly through the app, providing “complete transparency of your neighborhood around you.” This gives users the option of filming both ongoing crimes and the police response to those crimes, all in an effort to improve the safety of communities everywhere. But “Citizen” was first released in October 2016 under a different name and different branding: “Vigilante,” which was promptly removed from Apple’s App Store for violating its review guidelines, with concerns centered on user safety. That first release was published with the tagline “Can injustice survive transparency?” alongside a dramatized video showing a violent assault being stopped by users of “Vigilante.” Police backlash against the release was strong, with the New York Police Department going so far as to say, “Crimes in progress should be handled by the NYPD and not a vigilante with a cellphone.”
Google Clips: Precious moments brought to you by Big Brother
Google introduced its cute little hands-free camera to the world on October 4, 2017, marketing it specifically to pet owners and parents (people who not only like to keep an eye on things, but like to take photos of things that won’t sit still). Clips is a blend of AI and facial recognition software designed to “capture beautiful, spontaneous images” of a person’s life. It can be set up anywhere (or attached to you) and will constantly scan its 130-degree field of view for the faces you interact with most (because they, presumably, belong to the people and pets you love). Then it will record up to 16GB of motion photos when it “senses” potentially picturesque moments. While there’s no camera display, you can monitor what’s happening live from your phone. If you’re thinking that sounds like constant surveillance, you’re not wrong. The obvious issue here is privacy. There will be those who don’t believe anything Google says about the camera not being a surveillance device (a fairer argument would be that it’s certainly not optimized to be one). But there’s also an interesting issue here about letting Google’s new AI algorithm, Moment IQ, decide which of your life’s moments truly deserve to be captured.
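Google hasn’t published Moment IQ’s internals, but the capture loop it implies is easy to caricature: score each frame for interestingness, favor familiar faces, save a clip when the score clears a threshold. Everything below (names, weights, threshold) is invented to illustrate that loop:

```python
FAMILIAR_FACES = {"toddler", "golden_retriever"}  # learned over time
THRESHOLD = 0.8
saved_clips = []

def interestingness(frame):
    """Stand-in for Moment IQ-style scoring: an invented heuristic."""
    score = frame["motion"] * 0.5
    score += 0.4 * len(FAMILIAR_FACES & frame["faces"])
    return min(score, 1.0)

# Simulated stream of frames from the camera's 130-degree view.
stream = [
    {"motion": 0.2, "faces": set()},
    {"motion": 0.9, "faces": {"toddler"}},
    {"motion": 0.7, "faces": {"toddler", "golden_retriever"}},
]

for i, frame in enumerate(stream):
    if interestingness(frame) >= THRESHOLD:
        saved_clips.append(i)  # in reality: buffer a short motion photo

print("captured clips at frames:", saved_clips)  # -> [1, 2]
```

The privacy question lives in one line of that loop: the device decides, continuously and on its own, which slices of your home life are worth keeping.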