Privacy and the dodo bird
At TM Forum Live! (in May in Nice, France) Amdocs’ Dr. Gadi Solotorevsky will take part in the panel debate on The reality of security & privacy concerns. Here, he looks at the reality of privacy in the digital world.
Privacy is like the dodo bird – it existed in the past, but today it’s long gone. And while everybody agrees that privacy is extremely important, we happily give it up in exchange for a navigation, social or search app – or even a flashlight app. We gave up our privacy the same way that Manhattan was allegedly bought from the Native Americans for a mere 60 guilders in 1626 (roughly $1,000 in 2006 values). We could argue about whether we sold our privacy at a fair price or too cheaply, but the fact remains that most of us have chosen to sell it to companies (and continue to sell it each day) in exchange for new apps, nicer GUIs (graphical user interfaces) and so on.
So instead of privacy, perhaps we should talk about trust – whether we can trust the entities that have our private information to use it wisely, according to our contractual agreements and implicit expectations.
The contractual agreement sets some boundaries, such as whether a company can sell our data to third parties or spam us with advertising, but most people don’t bother to read these (lengthy) contracts – even though many of them are quite open, giving companies a lot of flexibility to exploit our data. The implicit expectations are our own expectations about what will be done with the data and whether it will benefit us. Listening to our conversations and sending us targeted advertising might be acceptable to some, but using the same information to update our bank about financial problems (even if permitted under the general contract) won’t meet most people’s expectations. And even though companies that breach the contractual agreement or the implicit expectations can find themselves sanctioned by legal entities and/or social media, that still doesn’t guarantee that they won’t breach our trust, intentionally or unintentionally.
Trust is also about trusting those who “acquire” our private data to keep it safe – protected from hackers and from leaks.
So today when I sell/give away my privacy, I ask myself: do I trust the “buying” entity? (Personally, I would trust Google with my data more than an anonymous flashlight app developer). And when possible – and if it makes sense – I prefer to pay with money, instead of with my privacy.
Citizens of the IoT era
But this is the reality of being a citizen of the digital/IoT era – we’re forced to interact with more and more entities and vendors that are collecting more and more data about us. We are even welcoming sensors, microphones and cameras into our houses (from Amazon’s Alexa to Mattel’s Hello Barbie smart doll) which create even more privacy concerns. But again, the question isn’t whether or not we have privacy, but rather whether we trust companies like Amazon and Mattel with our personal private information. Unfortunately, it’s impossible to check and trust each of the devices/services/vendors – there are now simply far too many.
So here’s the big opportunity for the big players: convince me that I can trust you, and provide an extensive ecosystem of devices/services/vendors for which you are willing to vouch.
These players can be the Googles, Amazons, Microsofts and Facebooks of today, but they can also be the communications service providers (CSPs). The competitive differentiator will be who can create trust and provide a large ecosystem.
Trust and CSPs
To create trust, CSPs need to verify that the digital ecosystem is functioning flawlessly, and that it’s protected from classic fraud and security attacks as well as from ever-changing threats. To do this in a dynamic, adaptive way that keeps up with the fast pace of change, it isn’t enough to rely only on human analysts and supporting systems from the last decade – CSPs need to use advanced artificial intelligence (AI), machine learning and robotics to ensure flawless operations and to continuously adapt the protection of the data. This is a key reason why we’re seeing these technologies used more and more in the cyber-fraud protection and revenue-assurance domains, and you can see an example of using behavior analysis to protect citizens as part of the TM Forum Connected Citizen: Life in a Green, Clean, Smart City Catalyst.
By the way, what is the price of your privacy – is it less than the cost of a flashlight app?
TM Forum Live! takes place May 15-18 in Nice, France. Find out more at www.tmforumlive.org
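To give a flavor of what behavior analysis means in practice, here is a minimal, purely illustrative sketch: it flags an account whose daily usage deviates sharply from its own historical baseline, one of the simplest signals a fraud-detection system might use. The function name, data and threshold are hypothetical, not taken from any vendor’s product.

```python
# Toy behavior-analysis check: flag usage that deviates sharply from an
# account's own historical baseline (a possible fraud signal).
# All names, numbers and thresholds here are illustrative assumptions.
from statistics import mean, stdev

def is_anomalous(history, today, threshold=3.0):
    """Return True if today's usage is more than `threshold` standard
    deviations away from the account's historical mean."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return today != mu  # flat history: any change is a deviation
    return abs(today - mu) / sigma > threshold

# An account that normally makes ~100 calls a day suddenly makes 900:
baseline = [98, 102, 97, 105, 99, 101, 100]
print(is_anomalous(baseline, 900))  # flagged as suspicious
print(is_anomalous(baseline, 103))  # within normal variation
```

Real systems layer far richer models (and machine learning) on top of this idea, and crucially must keep re-learning the baseline as behavior changes – which is exactly the “continuously adapt” point above.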