So I walk into my local pub and the landlord calls out “hey Dave! Your usual?” I acknowledge him with a smile and nonchalantly walk up to the bar as he pours my drink, secretly overjoyed that I have finally achieved such status and recognition (although I barely spare a moment’s thought for the years of patronage and resulting family neglect that have afforded me such privilege).
That kind of personalised service is something we as consumers strive to experience (come on, we all have a secret fantasy of playing Norm in our own local bar where “everyone knows your name”), and service providers have long chased the dream of creating that sense of “home”: we know you, we know what you like; relax, you’re amongst friends here. (Don’t believe me? Watch any airline advert from the last 10 years and you’ll know exactly what I mean.)
So what, then, if I walk into a different bar in an unfamiliar town and the landlord does the same thing: “hey Dave! Your usual?” Do I offer him the same smile and nonchalance? Of course not; I turn around and run out of the bar, screaming in terror at the indignity of the invasion of my privacy.
But why should I be freaked out by that? After all, the landlord in the second bar has as much interest in offering me a personalised service as the landlord in the first. What’s different is my _expectation_. If I had whiled away the hours on http://www.makeminemylocal.com supplying landlords across the country with my photograph and drinking preferences so they could offer such a service, then I might reasonably expect such a friendly welcome; the fact that it is not expected is what freaks me out.
The lesson here for us as consumers (and for us as technology providers) is “no surprises”: if the consumer is (reasonably) surprised by the service or the usage of their data, then as a provider you’ve probably got it wrong. You can tell me all you like that a specific bit of information about me is public information, but if it doesn’t feel like it to me, then I’m going to have a hard time when somebody I wasn’t expecting uses it. That expectation is almost as important as the permission to use the data. In the first bar, I’m OK with the data attribute “my favourite beer” being used. In the second bar, when it gets used, I am unnerved not just because I never gave permission, but equally because I wasn’t expecting it.
I think this difference between the roles of reasonable expectation and permission is often overlooked, and it will potentially catch us out as our culture (and our expectations about reasonable use) evolves. We live in an increasingly personalised world, and our comfort with the mechanics of how that world is created keeps growing. We are freaked out initially by the “filter bubble”, but then realise that, used properly (and transparently), it is a vital resource if we are to stand a chance of sorting the wheat from the chaff in a modern (big data) world.
I am reminded of a similar example from our recent past that shows how these evolutions can happen. Do you remember when caller ID first appeared on our landline phones at home? I do, mostly because I was incensed at the thought of _my_ number being displayed to whoever I called, even though I had requested to be “ex-directory”. Fast forward a few years and you will find me refusing to answer the phone when the number is unknown or not recognised. I no longer care about my number being displayed because my expectations have evolved to appreciate the value that the service provides.
But this is not just about always adapting and evolving to meet new developments and privacy boundaries. Crucially, there needs to be some constructive tension to ensure that this evolution neither goes too far too quickly nor becomes unbalanced in terms of the value to the corporation versus the consumer. Given the complexity of the issue (and the difference context makes in the usage of the data in question), the law alone is not enough; we need to ensure that a place exists where consumers, regulators, privacy advocates (like Privacy International, Big Brother Watch and others) and technology providers can come together to debate, collectively and constructively, the best way forward for all involved. I talked about this recently at an event on Location Privacy, and was reliably informed that there are at least five different places (and counting) where this debate can and does happen. That is good, but it needs to be better and more focused if we are to provide the best outcome for all of the stakeholders involved.
We all have a part to play in making sure this dialogue continues to happen – why don’t you join us?