From smartphone apps and China’s databases of its citizens to police use of DNA phenotyping, there are plenty of reasons to be concerned about the threat that new technologies pose to data privacy.
“Our data is ours — or it should be,” declared Democratic presidential candidate Andrew Yang. “At this point, our data is more valuable than oil. If anyone benefits from our data, it should be us. I would make data a property right that each of us shares.”
Yang has recommended that users “receive a share of the economic value generated from your data.” His proposal comes on the heels of the California Consumer Privacy Act (CCPA), which takes effect in January 2020 and creates “new consumer rights relating to the access to, deletion of, and sharing of personal information that is collected by businesses.” Privacy advocates in the Golden State also plan to put an initiative on the 2020 ballot to expand the CCPA with wider restrictions on data mining and the establishment of a state privacy enforcement agency.
Privacy and robotics
The arrest of William Merideth illustrates how personal privacy is colliding with the development and deployment of robots and drones. The Kentucky man was charged in 2015 with “wanton endangerment and criminal mischief” after shooting a drone out of the sky.
Merideth claimed to have acted in self-defense after observing the quadcopter hovering over his sunbathing teenage daughter. The court eventually agreed and dismissed all charges.
Since then, the American Civil Liberties Union has championed the privacy rights of Merideth and all Americans amid the proliferation of unmanned vehicles.
“Drones have many beneficial uses, including in search-and-rescue missions, scientific research, mapping, and more,” states the ACLU. “But deployed without proper regulation, drones equipped with facial recognition software, infrared technology, and speakers capable of monitoring personal conversations would cause unprecedented invasions of our privacy rights. Interconnected drones could enable mass tracking of vehicles and people in wide areas. Tiny drones could go completely unnoticed while peering into the window of a home or place of worship.”
Looking for data privacy and AI guidance
As popular demand grows for legislation protecting data — and, by extension, curbing artificial intelligence and autonomous systems — I reached out to Jules Polonetsky, CEO of the Future of Privacy Forum (FPF), for policy guidance.
“Data protection has really reached a massive turning point in the U.S.,” he explained. “For a very long time, we had consumer protection law, not privacy law, with the FTC, state regulators, or consumer affairs commissioners enforcing laws that prohibit businesses from engaging in deceptive or unfair practices.”
“The U.S. is long overdue for a comprehensive privacy law,” added Polonetsky. “In fact, we’re one of the only democratic countries in the world that doesn’t have one, but we’re rapidly moving in that direction.”
Polonetsky said he believes that deep learning systems have exacerbated the data privacy issue, especially because most people are frustrated when trying to understand how algorithms work.
“Machine learning has only added another element of concern because companies can learn things about you that you didn’t even know,” he said. “The typical approach of companies asking individuals for permission wasn’t quite working. People weren’t taking the consent requests seriously, ignored them, or couldn’t conceivably understand what machine learning could do with their data.”
The privacy advocate is particularly concerned about facial recognition software.
“When my face becomes something trackable, and the government or private companies can use it for marketing or have data and intelligence about me, we’ve really lost the last zone,” said Polonetsky. “FPF has worked out a set of best practices as a model for facial recognition. We distinguish between identifying people in public and counting how many people are in a space or how they move around a venue.”
The principles developed by the FPF can also apply to public safety and other systems, such as autonomous vehicles.
“Autonomous cars are a key area for us to draw those lines,” said Polonetsky. “Obviously, we understand that there is a need for a camera, and there is clear value in a camera alerting a driver that they’re looking away from the road, or in managing fleets and detecting that a driver has fallen asleep. Protecting that data by law could support such beneficial uses while guarding against unwanted surveillance and extreme uses by insurance companies or law enforcement.”
At the same time, he said, we need to bolster communities that take proactive measures to prevent abuse.
“The city of Portland currently has a proposal to ban government and private-sector uses of facial recognition,” Polonetsky said. “We need to put laws in place that can help allay fears and prevent harmful activity, so as to enable the societally beneficial uses.”
Urging transparency for roboticists
Polonetsky views regulations positively because they can enable engineers to focus their inventions on specific use cases and ultimately forge a better relationship with the public.
“My advice to designers and roboticists is that transparency and trust are 90% of the puzzle,” he said. “If I trust that you’re on my side, I’m eager for you to have my data, to help me and to support me.”
“Companies in a low-trust environment need to figure out how to lean in and ensure that consumers feel supported and comfortable that the use of their data will improve their lives,” added Polonetsky. “That doesn’t mean that the company can’t benefit financially while doing so, but are you doing it on the consumer’s behalf?”
Polonetsky said he is optimistic that more companies will embrace regulation to build greater loyalty with their users.
“I predict you’ll see major tech companies aggressively pivoting in this direction,” he said. “Although marketing and advertising may be a majority of their income today, their future plans involve smart cities, healthcare, cloud, machine learning, genetics, autonomous vehicles — all areas where you need huge trust.”
“You can see Apple leading in this direction, not only because they decided it’s a human rights value, but because they’re going into the healthcare market,” said the FPF CEO. “They’re being welcomed because they’ve done work to build that level of trust and confidence, while you see pushback from other companies delving into these fields.”
Learning to manage data hunger
In his new book, Digital Minimalism, Cal Newport confronts the tension between the social good of the Internet and the loss of humanity from overuse. He writes that the “irresistible attraction to screens is leading people to feel as though they’re ceding more and more of their autonomy when it comes to deciding how they direct their attention. No one, of course, signed up for this loss of control.”
The hunger for data is driving more people to echo Andrew Sullivan’s famous lament, “I used to be a human being.” As a funder of automation startups, I aim to improve the quality of life, not subvert it.
Polonetsky offers a salient path forward: “Most of us don’t truly want to live disconnected or hide from the world — we just want better control over the disruption that technology creates.”
Going to CES? Join me for my panel on “Robots and Other Humanoids at Retail” on Jan. 8 at 10 a.m. at the Las Vegas Convention Center.
Editor’s note: The Robot Report will also be moderating a panel on “5G and Robotics” at CES 2020, and Robotics Business Review will be running a track on “Robots for Good” at the event.