On October 27, 2012, Facebook CEO Mark Zuckerberg emailed his then director of product development: “I just can’t think of any instances where data has leaked from developer to developer and caused a real issue for us.” Had Zuckerberg prepared for the GMAT, he might have recognized the logical fallacy quietly undermining his argument. Hint: just because something has never happened in the past does not mean it will never happen in the future.
All hell broke loose with the Cambridge Analytica scandal, rooted in the run-up to the 2016 US presidential election and exposed in 2018. Only then would Zuckerberg, his product managers, and their teams have realized the consequences of their shortsightedness. Years later, Facebook is still seen as the Big Bad Wolf, and users are extremely wary of what they post on the platform and how. Were it not for Facebook’s timely monetization, it might have suffered the same fate as peers like Orkut and Myspace.
Products, privacy issues and more
Technology seems to have developed an ability to impinge on democracy and polarize societies. This is one of the biggest reasons why strong policies are being created to regulate its use.
Take, for instance, advanced image recognition technology, which sits quietly inside consumer products. It, too, is susceptible to data and privacy concerns, and could just as easily be trained to polarize the virtual world we live in. Big tech companies such as Apple and Google can automatically detect people, animals, and objects in images, and can even organize a photo library by the objects detected in each photo. This is impressive, especially to those of us who weren’t born into the luxurious world of instant photos.
The wild side of technology
One of my peers, a wildlife enthusiast, was looking through old pictures on an Apple device and noticed that the image recognition algorithm had spotted a wild animal: a leopard, perched silently on a branch, looking down on an entire battalion of tourists. Although on a wildlife safari, the group had remained blithely unaware of their proximity to the predator.
It is not unimaginable that such technologies could be used by government institutions and similar authorities to suppress bad publicity. They could strong-arm social media platforms into screening out unfavorable content, taking away the right to speech and expression, and make public enemies of ordinary people who were merely shedding light on events by publishing information.
The ethics of product management
Corporations tend to incentivize product managers to wear rose-tinted glasses and stay optimistic in their roles, usually with an eye on the employer’s bottom line. However, product managers are duty-bound to be pessimistic visionaries.
Tech employees need empathy, ethics, and vision, now more than ever. They should at least partially understand how a feature, a capability, or a product itself could negatively impact users’ lives, and at that point make an ethical choice. This is where C-suite members and their sergeants-at-arms, the developers, managers, and researchers, need to inculcate parts of the Hippocratic Oath:
- I will neither administer a deadly product or solution to [any client] who asks for it, nor will I make a suggestion to this effect.
- I will remember that there is art to technology as well as science, and that empathy and an understanding of disruptive outcomes may outweigh my [team’s] ability to build a product or solution.
- I will remember not to build a solution or product whose capabilities may uproot a person’s livelihood and economic stability. If I am to build a product that will replace a human workforce, these related problems are my responsibility.
Soundarya Murugaiyan works as a product management intern at Crayon Data. She is a product enthusiast and a future product manager, who has donned many hats, from risk management to advertising photography and entrepreneurship.