Democratizing privacy, not data: The revolving role-play in the product world
It’s a casual day. Your product signups are coming in droves, filling your dashboard with metrics. An ocean away, a cafe owner thinking about taking his business online checks out your product’s offering and signs up. Elsewhere, a teen enters a few details to download a game. Nothing unusual, maybe. In a parallel mise-en-scène, national security agents are discussing the next potential threat, something we’re all unwittingly part of.
Well, this story could be expanded like a précis-writing exercise, but do you see a common thread here? A product professional, a business owner, a teen, and a government: what unites them is that they’re all concerned with data. Data someone inputs, data that gets matched through algorithms, data that can be predicted. A simple signup for a product or the download of an app can open lanes to multiple roads of information. Over the last few years, the importance attached to data has taken a name, privacy, which at its core is one’s liberty to decide how one’s data is used. Many countries, through governmental regulation, have established distinct rules for how technology companies and their product people handle customers’ information. Somewhere in the realms of digitization, we’ve even arrived at a term, ‘techplomacy,’ associated with Casper Klynge, the very first Danish technology diplomat. This frames privacy and security not just at a personal scale but as a dedicated, combined effort of governments, tech companies, and citizens. It brings forward a collective understanding that everyone, from a company CEO to a user seeking a product or service, should be aware of what information is submitted, how it gets stored and processed, and when it can be inferred or deleted.
Indeed, the words ‘governance,’ ‘privacy,’ ‘regulation,’ and even ‘personal data’ have become commonplace. The effect of tech journalism, and the willingness of technologists to finally accept the predicament of privacy, has placed most of us in a space where we understand the implications and perils of privacy invasion. Interestingly, as a technologist and product professional, handling the ramifications of the General Data Protection Regulation (GDPR) and the business amendments it required was a rewarding challenge for me. At the time, it looked like a purely top-down effort from the European Union and the governments involved. It still is, but it also carries the work of a bottom-up strategy: as Brad Smith notes in his writing, the untiring work of Max Schrems. Schrems, an Austrian activist and author, raised a flag against the big giants over their data collection and usage practices. While there were attempts to bury his efforts, one can’t help but think how they nudged the EU to pick up the threads of a worldwide regulation of data protection and privacy. The product world took a sharp turn in most of its practices after the ratification of GDPR, and I was amazed at how Schrems’ droplets gathered into an ocean.
Not too long thereafter came Alastair Mactaggart, the real-estate developer and investor who overheard a party guest talk about how far data collection overrides privacy, and who kickstarted the incremental foundations of what became the California Consumer Privacy Act (CCPA). Despite being regional, the CCPA has turned many heads and nudged fellow states (and nations beyond) to take data a little more seriously. It’s wonderful to see how grandiose these regulations can be, yet each was ignited by a spark from a fellow human like us who simply took privacy at its base level: as a constitutional right. In fact, in a wink, before the CCPA’s convoluted terms were even fully deciphered, along came a corollary, the California Privacy Rights Act, which carved out a subcategory of sensitive personal information (remember GDPR).
Privacy, which we just discussed as a basic entitlement, has gone beyond that: it has become a new guiding light for brand positioning. Hey, the popular email client, matches this philosophy (discounting all the other controversial outcomes). At its very core, Hey promises a refreshing email experience, built squarely on the privacy it instills for every user, from choosing one’s inbox-sitters to going untracked by external digital senders. The ‘protected’ capability has itself turned into an armor, the raison d’être of the brand. So, does this mean everyone who uses Hey has their privacy protected, with no more direct tracking? Well, yes, it looks like it. But how is the ecosystem going to balance this new cycle? That’s a ballgame the future will have to play out.
Privacy and data protection suddenly seem to be hotcakes: everyone wants a piece, but no one truly knows what the flavor is like. That’s exactly why everyone wants one in the first place, to devour it and make it theirs. This reminds me of the recent moment when JioMeet launched and drew flak for its user-interface rip-off. That apart, a sizable chunk of people were actively scouring the website to see whether its encryption and security practices were better than those of the alternatives in the space. Some actively shared messages about how they were waiting to learn more about the privacy and security aspects of this newly launched product, even while the UI veneer was being debated. It isn’t the first time users and tech enthusiasts have united over this, as we saw with the one-person shows that led to the birth of regulations. The recent iOS 14 keynote brought in data protection in the form of app privacy, letting users control how an app may use and share their data, and making developers explicitly self-declare what data the app collects from users (imagine the nutrition charts on packets of eatables, as Craig Federighi put it). Hmm, straightforward, yeah?
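The opt-in model described above, where nothing is tracked until the user explicitly says yes, reduces to a simple pattern any product team can adopt. Here is a minimal, hypothetical sketch in Python; the `AnalyticsClient` class and its method names are illustrative inventions, not Apple’s actual API:

```python
# Hypothetical consent-gated analytics: events are recorded only
# after the user has explicitly opted in to tracking.
class AnalyticsClient:
    def __init__(self):
        self.consented = False  # default: no tracking, mirroring opt-in models
        self.events = []

    def grant_consent(self):
        """Called only after the user explicitly opts in."""
        self.consented = True

    def track(self, event_name):
        """Record an event if the user opted in; otherwise drop it."""
        if not self.consented:
            return False  # silently dropped, nothing leaves the device
        self.events.append(event_name)
        return True

client = AnalyticsClient()
client.track("app_open")   # dropped: no consent yet
client.grant_consent()
client.track("app_open")   # recorded
print(client.events)       # ['app_open']
```

The design choice worth noting is the default: consent starts as `False`, so forgetting to ask the user fails safe (no data collected) rather than failing open.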
In fact, there’s more to why it’s crucial to let users decide and define their product or app experience and usage tracking, regulations aside. What do companies and product professionals do with the data, even sensitive data, that’s collected or tracked? A case in point is a 2010 incident involving Amazon’s abandoned-cart catalogue emails. As much as the company drew a high volume of sales and revenue from personalized emails that track users’ choices, their clicks, and the interests they feed into their profiles, one incident proved detrimental. When a customer who had browsed for lubricants under the sexual-wellness category received an auto-personalized email from the brand with further suggestions, things came undone: the person was thoroughly agitated and embarrassed, and felt it was a privacy intrusion. It caused a stir inside Amazon, with category managers arguing that such data-driven personalized emails fetch good returns, and that this was a singular case, the recommended gels and lubricants being available in common drug stores too. But in the end, Bezos’ unwavering dedication to customers’ preferences and emotions won out: he canceled all the emails that relied on tracking sensitive data, especially the ones customers wouldn’t approve of. (A different story stitched itself together in 2020, with him having to testify before Congress over a lack of candor on digital data.)
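The engineering takeaway from the Amazon episode is that a personalization pipeline can exclude sensitive categories before any email is ever generated. A minimal sketch of that idea, where the category names and the shape of the `candidates` data are hypothetical illustrations, not Amazon’s actual system:

```python
# Hypothetical filter: suppress abandoned-cart reminders that would
# reveal browsing in sensitive categories. Category names are illustrative.
SENSITIVE_CATEGORIES = {"sexual_wellness", "medical", "personal_care"}

def filter_cart_reminders(candidates):
    """Keep only reminder emails whose items all fall outside
    the sensitive categories; suppress the rest entirely."""
    safe = []
    for email in candidates:
        if any(item["category"] in SENSITIVE_CATEGORIES
               for item in email["items"]):
            continue  # drop the whole email rather than redact one item
        safe.append(email)
    return safe

candidates = [
    {"user": "a", "items": [{"category": "books"}]},
    {"user": "b", "items": [{"category": "sexual_wellness"}]},
]
print([e["user"] for e in filter_cart_reminders(candidates)])  # ['a']
```

Suppressing the entire email, rather than quietly swapping in a blander product, matches the spirit of the decision described above: if the trigger is sensitive data, the safest personalization is none at all.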
In hindsight, privacy is championed as a fundamental human right of any digital user, and security is the promise digital brands take up to serve and honor that right. But increasingly, national governments play their own role in this brand-user ecosystem, not just by sanctioning laws but through unique anecdotes of involvement.
The concept of nationalized platforms has been seeing the light for the last couple of years. Dealing predominantly with monopolized social markets, most platforms and product-ecosystem brands have invested thought (and money) in releasing variants of their services better suited to different nationalities. It’s also worth considering how some nations are imposing constraints on usage and censoring data on online platforms. Beyond this, many brands launch beta programs specific to certain nationalities and their economies before releasing to the entire world. What one needs to realize is how a brand born in one country can have most of its users in countries other than its own. A simple case: WhatsApp, with its parent company in the US, counts India and Brazil as its two largest user bases. The WhatsApp Pay option was tested (and later suspended) first in Brazil, before it could be rolled out elsewhere. Now extend this to every brand. The result? We are dealing with data from citizens across multiple nations, under different governments, laws, regulations, and hence cultures. So government involvement is becoming more than incidental in many instances. The recent ban of a bevy of apps by the Indian government, once again in the interest of national security and citizen data protection, adds to the highlight.
As I think back, in a world with a gale of privacy measures (tracking-free experiences, minimal data collection, consented business engagement), there’s also a wave of robust future mechanisms to watch for, like privacy-preserving open data systems across nations and further micro-segmented regulations. The future of privacy-driven marketing, products, platforms, and customer experiences looks promising: something built on goodwill, the belief that relationships work out, and trust in moving forward.
And our précis writing continues, with revolving role-plays like dumb charades at a kitty party: users and citizens, products, companies, and governments, constantly guessing (and sometimes second-guessing too!), playing right on data. Yonder lies a riveting journey. Buckle up!