Fernando Fischmann

Tech Ethics Issues We Should All Be Thinking About In 2019

3 January, 2019 / Articles

Technology doesn’t police itself. New technologies require us to pay attention, voice our concerns constructively, and demand accountability when people are harmed.

For the seventh year in a row, we have released a list of ten technologies that people should be aware of in the hopes of giving non-experts a window into what’s going on in labs around the world. The goal has always been to raise some of the ethical and policy issues that surround these technologies, not to scare anyone, but to drive home just how much the average American might be unaware of when it comes to what’s coming down the pipeline or already in their homes, potentially doing harm.

Over the years, the list has touched on everything from healthcare innovations to biased algorithms and advanced military weaponry. And while there’s been the occasional entry on head transplants and cyborg roaches, most of these issues will touch us all in some way. Our most challenging task will be to try to de-politicize as many of these discussions as possible so that we can truly talk things out. Of course, when it comes to implementing policy (or even suggesting that policy is the right way to approach an issue), politics creep right back in. Clearly, we’ve got a lot of work ahead of us before we can make headway, but that only means we’d better get on it as soon as possible.

In 2019, the list includes some technology you’ve definitely heard of (such as 5G) and some that will come as a surprise. This is the list for 2019:

Pet cloning: For $25k-$50k, you can now clone your cat or dog. However, there are no guarantees you’ll get a new pet that looks or acts like your old one, and animal-lovers would do well to note that the host animals used to gestate clones lead pretty miserable lives. Is it right to invest in this technology when so many animals already need homes?

DIY neurohacking: At-home neurostimulation devices have hit the market, but plans for making your own are all over the Internet. Customers hope that zapping their brains with a small electrical current will help improve everything from memory to attention, but we don’t know the long-term effects of neurostimulation. Combine that with high hopes and a few extra zaps with the intention of superpowering your brain and you’ve got a recipe for potential disaster. Should there be some kind of oversight, or should we let people do whatever they want to their brains? What about parents who let their children use such a device in the hopes of boosting their grades?

Behavioral biometrics: Forget PINs and passwords. More institutions are now using hand-eye coordination, the angle at which you hold your device, finger pressure, hand tremors, navigation patterns, and other hand movements to judge whether you’re really you when you log into an app. We all want to be protected from hackers, but we might also want to think about how this information is being collected, stored, and used. Do we have a right to know what our behavioral profiles look like? Will we have to sue companies to get the information?
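To make the idea concrete, here is a deliberately simplified sketch of how a behavioral-biometric check might work: the system stores an averaged "profile" of a user's measurements and compares each new login attempt against it. The feature names, numbers, and threshold below are invented for illustration; real systems use far richer models than a simple distance check.

```python
import math

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches_profile(stored, observed, threshold=1.0):
    """Accept the attempt if observed behavior is close enough to the
    stored profile; otherwise flag it for extra verification.
    The threshold is a made-up tuning parameter for this sketch."""
    return euclidean(stored, observed) <= threshold

# Hypothetical profile: [keystrokes/sec, device tilt (deg/10), touch pressure 0-1]
profile   = [5.2, 3.1, 0.62]
same_user = [5.0, 3.3, 0.60]   # small deviation -> accepted
imposter  = [2.1, 7.8, 0.95]   # large deviation -> flagged

print(matches_profile(profile, same_user))  # True
print(matches_profile(profile, imposter))   # False
```

Note that the stored profile itself is exactly the kind of behavioral data the paragraph above asks about: someone holds it, and users rarely get to see it.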

5G: 5G is on its way, but it’s not here yet, despite the false promises of companies like AT&T. With speeds of up to 1 gigabit per second, 5G has the ability to change the world by enabling speed-of-thought communication. But 5G is also an enormous and expensive infrastructural undertaking and will require new regulatory frameworks. Devices will have to be replaced to take advantage of the new network, increasing the amount of electronic waste we produce. Rural areas that still have incredibly slow wifi will be further left in the dust as the digital divide deepens. And to top it off, it will require an enormous amount of energy to run, far more than we’re able to produce through renewables. None of this will stop 5G from coming, but what should we be doing now to help prepare?

The datafication of children: Children don’t have any right to privacy when it comes to their parents, which is why it’s so alarming that parents are the biggest violators of their kids’ privacy. Ultrasound photos on Facebook and live updates from the delivery room mean children now have digital footprints before they’re even born. And because any data that can be hacked eventually will be, we’ve already seen children and their families extorted for money as a result. The FBI recently warned that kids are at risk at school as well: hackers have already stolen academic and behavioral data from thousands of schools. What does this mean for your child’s future, especially in an era when background checks are ubiquitous and all-knowing?

Insect Allies: DARPA’s military research can always be counted on for a good scare. It’s some of the strangest and most cutting-edge research in the world and while it’s designed for national security, it often raises a lot of ethical issues. Their Insect Allies Project has been around for a while, but has recently received more attention. The goal is to create genetically modified insects that can deliver viruses to plants. The viruses deliver new genes to the plants in an effort to make them more resistant to climate change and human interference. But it’s easy to argue that they can just as easily be used to decimate crops and wipe out the food supply of millions of people. Is this a biological weapon? Will it motivate other countries to develop the technology in defense?

Sidewalk Labs: If you really want an example of how the public should comment on new innovations, look to Toronto, where citizens are concerned about a new project designed to create a hyper-efficient city in an underutilized 12-acre area called Quayside. Through the use of interconnected sensors, a company called Sidewalk Labs (a subsidiary of Alphabet, the parent company of Google) wants to monitor traffic, pedestrians, weather, pollution, building occupancy, and sewage. But these sensors will track people and everything they do in an effort to create the smartest city possible, and that can’t happen without the help of a lot of third-party entities that need access to this data. If Sidewalk Labs wants the support of Toronto’s citizens, it’s going to have to answer some questions about where the data is going and how it plans to remediate any damage it causes to people or the environment.

Autonomous translation: Companies like Microsoft are currently trying to build AI that can perform real-time translations of human speech. But language is wildly complex and our slang and idioms make it difficult for a computer to get a good enough translation to trust it in tricky situations like military engagements. But translators aren’t always available, so we have no choice but to keep working on the technology. Despite its inevitable march forward, it’s worth thinking about whether you’d put your life in the hands of a translating computer.

Seeding trials: Of course, pharmaceutical companies and their affiliated researchers would never admit to participating in seeding trials, but they’ve become a common marketing ploy to get doctors familiar with new drugs and their potential off-label uses. Pharmaceutical companies recruit physicians to lead small studies on new drugs or devices and then publish the results. It’s already a conflict of interest when the researchers get paid by the companies, but it becomes even more complicated when the doctors have a vested interest in insisting their study was successful and encourage other healthcare workers to try it with their patients. It’s not good science, but it’s brilliant marketing.

Fernando Fischmann, scientist, innovator, and founder of Crystal Lagoons, recommends this article.


