The entrepreneurs of Silicon Valley are undoubtedly finding many ways to make the world a better place — with tunnels, flying cars, interplanetary travel. Yet I can’t help noticing a growing divide between the problems people have and the problems tech companies are willing or able to solve.
This divide has two sources. The first is related to inequality: When people live in an elite bubble, they don’t experience the everyday frictions of normal life. So they focus on issues that would barely register for the rest of us workaday saps. That’s how you get Theranos, which was supposed to solve the problem of people who don’t like getting blood drawn. Even as someone who regularly faints at the doctor’s office, and whose veins are tricky to find, I don’t think that rises to the level of a real problem.
This part of the divide might not be insurmountable. Imagine sensitivity classes to educate Silicon Valley venture capitalists and start-up founders on “problems that don’t affect you personally but that you should consider in your product design.” Maybe, after some time, it would have an effect. I don’t want to be overly naive, because there’s little evidence, but it’s a theoretical possibility. After all, people want to think of themselves as good.
Unfortunately, the divide has a second, less tractable part. It stems from a conflict, mostly unacknowledged, at some of the largest internet companies. They want our personal data, but they don’t want the transparency and responsibility that should come with it. While they generate billions of dollars of ad revenue with our information, they remain opaque and unaccountable.
Examples abound. Google has organized the internet’s information, but it can’t seem to collect and share the pay data needed to know whether it’s compensating its female employees fairly. Facebook has plans to read people’s minds, yet it doesn’t seem interested in working with anti-extremism projects using technology that already successfully fights child pornography.
In a reasonable negotiation, we’d offer up our personal data and attention in return for basic services like making sure our emotional health is protected and our democratic process hasn’t been undermined. But again, these are problems that tech giants have no incentive to address. Instead, researchers must construct their own data sets to learn that Facebook makes people unhappy, and journalists must beg readers for the data needed to understand how people are profiled and how internet advertising affects democracy.
Look where the power is. That’s where data are lacking. Are our entrepreneurial saviors too busy waiting for the singularity and cursing traffic jams to care?
Dr. O’Neil is a mathematician who has worked as a professor, hedge-fund analyst and data scientist; contact her at firstname.lastname@example.org.