Posts Tagged ‘privacy’

Big Data, the Machines and You

Friday, September 14th, 2012

Ah, Big Data, the old IT bandwagon rides again eh?  Who’s with me?  Yeah, you and every other IT consultancy in town.

The thing is, beyond the hyperbole (and of course the ridiculous notion that data can be big or small or even mid-sized) incredible things are beginning to happen with data that affect the products and services we use, how we innovate and even how we understand the world around us.

More and more we are using big data services to help us in our personal lives: they recommend our purchases, answer our search queries, even translate our languages and, every day, through the beauty and wonder that is machine learning, they get _better_.

Access to more and more data, combined with technological advances like the cloud, which provides seemingly limitless storage and compute power, means we are finally able to start harnessing the incredible power and potential on offer – not just to society at a broad level, or to huge organisations like Microsoft, Google and Facebook, but to every individual and every single business.

As with all such major advancements, we’ll face our fair share of challenges too. Some will be technical – we’re still looking for the needle, but now it’s in a billion haystacks. Some will be cultural – how do you ensure that data is accessible and of sufficient quality? And some will be just plain hard – in a world of data and machine learning, what happens when the algorithms take over?

As it turns out, bandwagon or no, big data is crucial to our respective success.  Don’t believe me?  Well, why not waste 30 minutes of your life listening to me trying to convince you?  This is a presentation I gave at this year’s Turing Festival trying to make exactly that point.  (You can also download the slides here).

Like it or not, the world of big data is here – it is now up to us to figure out how to make best use of it.

(n.b. Thanks and appropriate respect go to GetAmbition and Interactive Scotland for both organising the event and creating the video and supporting collateral).

Preparing Our Future – The Need for Critical Thinking

Thursday, May 31st, 2012

There has been much discussion in the UK recently about the importance of getting the right approach to the role of technology in schools.  Many have used this as the opportunity to reinforce the need for greater emphasis on the STEM fields (Science, Technology, Engineering and Mathematics) with further focus being given to the need to create a new generation of “kids who code”.  Whilst this on its own is an incredibly important initiative, it is vitally important to continue to remind ourselves that it is still just a subset of the overall duty of care we have as technologists to ensure that every single aspect of society is empowered by technology.  Yes that means having great software, and as such brilliant computer scientists, but more importantly it means ensuring that every single member of society knows how to make the best use of technology whatever their societal role – this is our modern equivalent of a “PC on every desk”.

To achieve this we need to get beyond teaching the “tools” and start teaching the “skills” that will make all the difference for the workforce of the future. In particular it requires that our children and every other member of our society are equipped with the cognitive capability and skills that enable them to harness the incredible potential that technology brings to us. It is no longer just a case of “feeding” them with the basic tools that will become obsolete tomorrow, but instead teaching them to “fish” in a growing digital pool.

Within our brave new digital world, one of the most important skills we must learn is “critical thinking”, a concept that, rather incredibly, dates back to Socrates over 2,000 years ago. After being “recently” updated for a modern society by many great scholars in the 20th century, it provides a powerful framework for our internet age.

Every single day, we are bombarded by millions of signals of data, information and content, and every single day the quantity of information we are exposed to grows exponentially.  We are still looking for the needle; it’s just that now it’s in a billion haystacks.

Critical thinking is about “reflective” rather than “routine” thought; it’s the process of “active, persistent and careful consideration” of the credibility and conclusions of supposed knowledge or information.

Most of us use critical thinking every day and, most of the time, we are barely aware of it.  Every time we read a newspaper article, watch a documentary or look something up on Wikipedia we are aware of a whole range of biases, influences and emotions that may interfere with the validity, accuracy and overall conclusions of the content and, if we’re doing our job properly, we take all of that into account as we parse the information, reflect on it, draw in a range of other context, and ultimately use it to draw conclusions and make decisions.

Fortunately for us, we’ve had years of practice and experimentation to get this right. In this new digital age, however, children and young people have access to an incredible world of information but have yet to develop the skills to know how to deal with it.

From an early age, we need to ensure that children using the internet are able to draw upon critical thinking skills to:

Search efficiently and effectively – depending not solely on the search engine’s view of relevancy but being able to navigate and adjust the query to ensure the most appropriate results.

Distinguish kinds of sources and analyse a source’s validity and reliability – from basic differentiation of primary vs secondary sources through to deconstructing domain names and URLs to learn more context about the source.

Make a habit of cross checking facts, even from reliable sources – we know from experience that even “authorities” can mislead and experts make mistakes so wherever possible there must be independent confirmation of the facts.

Conscientiously and properly attribute the words and ideas of others – the internet has made plagiarism a lot easier but, thankfully, also easier to spot. Students need to know the basic rules about when and how to quote others’ words and how to properly attribute the ideas that are not their own.

Stay safe on the internet – these are some of the most important skills of all, from not giving out personal information through to taking care about the kind of conversations they enter into on-line, staying safe is absolutely paramount. 

Interact with others online honestly, respectfully, fairly and clearly – the anonymity, immediacy and lack of proximity presented by the internet can lead to anti-social behaviour, sometimes with devastating consequences. Learning how to speak honestly, fairly, and with respect, clarity and brevity along with understanding why this is important in a society, especially a democracy, is crucial.

(Note: More detail on each of these areas, as well as lesson ideas for different ages of students can be found in the “Critical Thinking” white paper we published in 2010)

So, as we prepare to wind down this educational year and pause over the summer to think about the role of ICT in the new school year in September, please let’s keep a firm focus on ensuring that, as well as being brilliant at coding, our future citizens (and workforce) are equipped with all the necessary skills to make the very best of all that technology will have to offer them.

Evolving Our Expectations of Privacy

Friday, May 4th, 2012

So I walk into my local pub, the landlord calls out “hey Dave! Your usual?” – I acknowledge him with a smile and nonchalantly walk up to the bar as he pours my drink; I am secretly overjoyed that I have finally achieved such status and recognition (although I barely spare a moment’s thought for the years of patronage and resulting family neglect that have afforded me such privilege.)

That kind of personalised service is something we as consumers strive to experience (come on, we all have a secret fantasy of playing Norm in our own local bar where “everyone knows your name”) and service providers have long chased the dream of creating that sense of “home”, where we know you, we know what you like, so relax – you’re amongst friends here. (Don’t believe me? Watch any airline advert from the last 10 years and you’ll know exactly what I mean).

So what if, then, I walk into a different bar, in an unfamiliar town, and the landlord does the same thing: “hey Dave! Your usual?” Do I offer him the same smile and nonchalance? Of course not; I turn around and run out of the bar, screaming in terror at the indignity of the invasion of my privacy.

But why should I be freaked out by that? After all, the landlord in the second bar has as much interest in offering me the personalised service as the landlord in the first pub. But what’s different is my _expectation_. If I had whiled away the hours on http://www.makeminemylocal.com furnishing landlords across the country with my photograph and drinking preferences so they could offer such a service, then I might reasonably expect such a friendly welcome; it is the fact that it is not expected that freaks me out.

The lesson here for us as consumers (and for us as technology providers) is “no surprises” – if the consumer is (reasonably) surprised by the service or the usage of their data, then as a provider, you’ve probably got it wrong. You can tell me all you like that a specific bit of information about me is public information, but if it doesn’t feel like it to me then I’m going to have a hard time when somebody I wasn’t expecting uses it. That expectation is almost as important as the permission to use the data. In the first bar, I’m OK with the data attribute “my favourite beer” being used. In the second bar, when it gets used I am unnerved not just because I never gave permission, but equally because I wasn’t expecting it.

I think this difference between the role of reasonable expectation and permission is often overlooked and will potentially catch us out as our culture (and our expectations about reasonable use) evolves. We live in an increasingly personalised world, and our comfort with the mechanics of how that world is created grows ever greater: we are freaked out initially by the “filter bubble” but then realise that actually, used properly (and transparently), it is a vital resource if we are to stand a chance of sorting the wheat from the chaff in a modern (big data) world.

I am reminded of a similar example from our recent past that shows how these evolutions can happen.  Do you remember when caller ID first appeared on our landline phones at home?  I do, mostly because I was incensed at the thought of _my_ number being displayed to whoever I called, even though I had requested to be “ex-directory”.  Fast forward a few years and you will find me refusing to answer the phone when the number is unknown or not recognised.  I no longer care about my number being displayed because my expectations have evolved to appreciate the value that the service provides.

But this is not just about always adapting or evolving to new developments and privacy boundaries.  Crucially, there needs to be some constructive tension to ensure that this evolution neither goes too far too quickly nor becomes unbalanced in terms of the value to the corporation versus the consumer. Given the complexity of the issue (and the difference context makes in the usage of the data in question) the law alone is not enough to do this; we need to ensure that a place exists where consumers, regulators, privacy advocates (like Privacy International, Big Brother Watch and others) and technology providers can come together to collectively and constructively debate the best way forward for all involved. I talked about this recently at an event on Location Privacy, and was reliably informed that there are at least five different places (and counting) where this debate can and does happen. This is good, but it needs to be better and more focused if we are to provide the best outcome for all of the stakeholders involved.

We all have a part to play in making sure this dialogue continues to happen – why don’t you join us?

Inside Google’s Big Tent

Friday, May 20th, 2011

I spent a day this week inside Google’s “Big Tent” – essentially a high profile event on privacy, hosted by Google, Privacy International and Index on Censorship, with an audience of the very cream of the British digital elite (and me).

I learnt a lot of things, of which I’ll share the detail in a moment, but first I thought you should know the headlines:

  1. Eric Schmidt likes Chrome – he says it’s safe and fast.
  2. The Right Honourable Jeremy Hunt, UK Minister for Culture, Media and Sport (and responsible for this country’s legislation around internet use) says the government’s priorities for the internet are speed and mobile.
  3. In other news, the Pope _is_ Catholic and bears _do_ defecate in the woods.

I mean seriously, is that the best we can do when it comes to pushing the boundaries of thought leadership around privacy in the digital society?  Thankfully, the audience was mostly cynical hacks and privacy activists – you can imagine how well those points were received.

Anyhow, with that out of the way, there was in fact an incredible discussion throughout the day on a wide range of local and global topics around privacy and free speech; what follows are the (admittedly blinkered) takeaways from the discussion that I want to explore further.

  1. It is clear that the law cannot keep pace with changes in technology. If I had a buck for every time someone on a panel said “technology has made an ass of the law” I would have precisely $16.73.  Although this point was universally agreed, there seemed to be no clear way forward to address it.  Simon Davies from Privacy International had a particularly pragmatic solution – do nothing – effectively let it happen and let them learn. (The context for that point was the discussion around super-injunctions and Twitter in the UK).
  2. Organisation vs the individual. The focus remains on what the “organisation” can do to improve an individual’s privacy. Despite pushing from the audience (advocates from Mydex et al in particular) there was little interest in a discussion around what it would mean to put the individual in full control of their information.
  3. Collation vs Publication. There was still a desire to focus on the search engine’s role in collating the content (i.e. the index) vs the actual publisher of the content. I’m wondering why this point is so hard for people outside the industry to grasp.  (see 4 below).
  4. Search is not the internet. Google’s Drummond put this well – “It’s a search engine, not the internet” – but the conversation never followed suit. We should have been pushing Jeremy Hunt on the legal changes and leadership required from government, i.e. you tell us which content we should remove and we’ll do it; the best example being religious extremist content – you want us to remove it, but you won’t tell us what is and what isn’t? Go fish. (My words).
  5. The “Right to be Forgotten” is a jingoistic phrase that not many understand.  Common (mis)perception is that it should allow me control over anything about me on the internet.  This of course forgets that such control conflicts with free speech.  Where we need to move on this discussion is towards an understanding that individuals should have the right to remove data _they_ have posted about themselves, but not data that _others_ have posted about them.
  6. Privacy Boundaries.  We established at least three clear boundaries around privacy that need to be explored further: Privacy vs Innovation (consensus was that privacy has _never_ impeded innovation), Privacy vs Free Speech (what’s private to you may be free speech to me – who decides?), and Privacy vs Public Interest (are super-injunctions an expensive waste of time in a digital age?).

Like Max Boyce always said, “I know ‘cause I was there” – but what did _you_ think?

Teens that Tweet – I can haz privacy

Wednesday, April 27th, 2011

As we speak more and more about how social networks and associated media affect the lives of our children and younger generations in general, we often make the assumption that younger people care less about their privacy than older generations. What I think is interesting about this is the presumption that their different view on personal privacy is _worse_ than the standard established by ourselves. (Arrogance of the present anyone?)

Now I don’t doubt that we have to do much more to help people (young and old) to better understand the consequences of public communication – this is usually the point where someone will bring up the inevitable friendly warning about prospective employers screening candidates via their Facebook escapades, but that notwithstanding, it’s important to dig a little deeper around this issue as the reality is much more interesting.

This article from Danah Boyd and Alice Marwick starts to show that the reality of how younger people think about and deal with their individual privacy is more about having your cake _and_ eating it.

My theory is that younger generations are much more binary about elements of their personal lives that they share versus keep private or within a very close circle of friends.

They may have a broader list of personal data “elements” they are willing to share with the world, but they maintain fierce control over a smaller subset of their personal identity that they will only share with their inner circle of their closest friends.

The truth is, younger people are very adept at managing what stays inside the private circle and what gets broadcast outside, often using complicated obfuscation techniques – encrypting private messages in public conversations using language that no parent could ever penetrate.

The other thing to remember is that there’s really nothing new about the view that younger generations have a looser definition of what they are willing to broadcast to the world. Since the beginning of time, young people have been more public about their personal likes and dislikes as a means of establishing their identity in their society. As we become more confident in our identities we lose the desire to be so promiscuous with the elements of our identity and settle into the shoes we were destined to wear.

For my own example, having reached a certain age, I no longer feel compelled to tell the world I am The Men They Couldn’t Hang’s biggest fan by wearing t-shirts and other paraphernalia emblazoned with their logo, or boring people in the pub with how much I know about BMW motorbikes and beer (OK, scratch that last one). Frankly, my identity is established and I am free to live in my size 12s and worry more about other things (like life, death and taxes).

So understanding this, what do we need to do? Well, as technology providers, we need to provide an open and transparent means of letting individuals (young and old) establish and maintain a firm boundary between public and private, with the understanding that the line will be different for every single individual and will change based on the context of what they are doing at any given point in time. Failure to do that will only result in more embarrassing headlines about unintended personal data breaches and a continued lack of trust in how we use technology effectively in our personal and professional lives.

Oh, and by the way, if you really are worried about prospective employers judging you on your Facebook feed, worry not, these days you can probably turn the tables by looking them up first…

Social Signals and Search

Tuesday, February 15th, 2011

Last week, I was lucky enough to get some time and present at one of the Social Media Week London events. It was a great opportunity to meet with a diverse range of people and organisations, all looking for better ways to use Social Media across their business and their lives.

It was timely too, as we’ve been doing a lot of work about the importance of Social, especially when it comes to search.

What’s key in all of this, is the understanding that Social search is absolutely _not_ what it says on the tin – this is not just about “finding people” and searching Twitter archives, but is in fact much more about how you can use the power of the social “signal” to make searching a much better, more trusted experience.
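
To make the idea above concrete, here is a minimal sketch in Python. It is purely illustrative: the field names, scores and weight are my own assumptions, not any real engine's ranking formula. The point is simply that a "social signal" (how much a page is shared or recommended within your network) can be blended with traditional relevance to change what surfaces first.

```python
# Illustrative sketch: blend a core relevance score with a "social signal".
# All field names, scores and weights here are hypothetical.

def social_rerank(results, social_weight=0.3):
    """Re-rank results by mixing base relevance with a social score.

    Each result carries 'relevance' (0..1, from the core engine) and
    'social' (0..1, e.g. how strongly the page is recommended within
    the searcher's own network).
    """
    def blended(r):
        return (1 - social_weight) * r["relevance"] + social_weight * r["social"]

    return sorted(results, key=blended, reverse=True)

results = [
    {"url": "a.example", "relevance": 0.9, "social": 0.1},
    {"url": "b.example", "relevance": 0.8, "social": 0.9},
]
# The heavily recommended page overtakes the marginally more "relevant" one.
ranked = social_rerank(results)
```

Even in a toy like this, the interesting design question is the weight: how much should your friends' behaviour be allowed to override raw relevance, and how transparently should that trade-off be shown to the searcher?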

Watch the presentation to find out why:

http://www.theenvisioners.com/wp-content/uploads/podcasts/SocialSignal.flv

The presentation is also available as a handy download for your favourite mobile device – Social Signals in Search (MP4)

You can find the slides here – Social Signals and Search (Slides)

Searching for a smarter internet

Tuesday, January 25th, 2011

Regular readers will know I’ve been absent for a few months; there are some boring reasons for that and some, well, rather more interesting ones too.

The truth of it is that I’ve decided to put my money where my mouth has been for the last few years and joined our consumer business focusing directly on the potential and importance of “search” in our digital world.

It ticks all the boxes for me: it is firmly rooted in all the “consumerisation” hyperbole I’ve been spouting but, most importantly, I firmly believe it’s a crucial area whose time has yet to come. The delay in posts has simply been because I needed some time to “find my voice” in this brave new world.

For me, search is essentially the UI for the internet, the means by which we extract value from all that the internet has to offer a digital society.  The trouble is, like the web, it’s based on an evolving (and increasingly outdated) metaphor and, as a result, I think we’ve all got a long way to go before we can really start to realise all of the potential on offer.

I was reminded of just how far we all have to go by this sign, posted in the ICT lab at my son’s (primary) school. Now look, I totally understand why this is there, but to me it’s more evidence that we’re missing something rather fundamental – why doesn’t the search engine _know_ that the people using it are aged between 5 and 11? Why isn’t it smart enough to understand that and adjust the results accordingly?

The answer of course is complicated, but within it lies a conversation I hope to explore with you about semantic language, user intent and relevance – fundamentally, about how we can turn this blunt tool into something much sharper without sacrificing our fundamental digital rights like privacy.
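
As a purely hypothetical sketch of what "knowing the user" might mean (the field names and age thresholds below are invented for illustration, not any real engine's behaviour), a context-aware engine could check each result against one piece of context it holds about the searcher, such as their age:

```python
# Toy illustration: adjust search results using one piece of user
# context (age). Field names and thresholds are invented for this example.

def filter_for_age(results, user_age):
    """Keep only results whose minimum suitable age fits the user."""
    return [r for r in results if r["min_age"] <= user_age]

results = [
    {"url": "encyclopedia.example", "min_age": 5},
    {"url": "forum.example", "min_age": 18},
]
# A primary-school searcher only sees the age-appropriate result,
# with no warning sign on the wall required.
child_results = filter_for_age(results, user_age=8)
```

The hard part, of course, is everything this sketch assumes away: how the engine learns the context in the first place, and how it does so without trampling the privacy it is meant to respect.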

Search needs to be the best way to leverage the knowledge that exists on the internet, across multiple mediums and a vast ocean of data – this is no easy task but the good news is, I think we are well on the way.  We need to stop thinking about the task-oriented nature of the web (remember, HTML is built on a book metaphor) and start thinking about how we incorporate all aspects of our digital lives to create far better relevance – getting beyond “10 blue links” and into a far richer service that is truly representative of the internet and the potential it offers a modern society.

We’ll explore all of these areas over the coming months and I hope you’ll join me and get involved in the conversation.

Privacy By Design

Wednesday, June 23rd, 2010

Yesterday, we launched HealthVault in the UK; in some ways I think it is one of the most interesting (and perhaps most significant) products we’ve had for some time.

Not just interesting and significant in the context of the product itself, but more because of the approach to privacy that has been taken throughout the development of the platform.

For the uninitiated, HealthVault is simply a cloud-based application platform that allows people to develop rich, UI-based applications that feed off an individual’s secure and private datastore (in this context, applications that focus on “wellness”).

HealthVault is unique because it puts the individual in control of their health information: they have full visibility of what data is being consumed, by whom, and by which applications and, more importantly, in every decision they make about which apps to use or who to share their data with, the user is made explicitly aware of what data is required.

What is important in this approach is that the platform was developed using a series of key principles that were there when we started – we didn’t create the code and then “bolt” privacy on, as so often happens.

Those principles were simply:

  1. The record you create is controlled by you.
  2. You decide what goes into your record.
  3. You decide who can see and use your information on a case-by-case basis.
  4. Your information cannot be used for commercial purposes unless you explicitly tell us it may.
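
To illustrate how those principles might translate into code (this is a hypothetical sketch of the idea of case-by-case control, not HealthVault’s actual API or data model), a record could track explicit per-application grants and refuse any read that lacks one:

```python
# Hypothetical sketch of "privacy by design" as a data structure:
# the record owner adds data and grants each application access to
# specific data types; every read is checked against those grants.
# Not HealthVault's real API -- all names here are invented.

class HealthRecord:
    def __init__(self, owner):
        self.owner = owner
        self.data = {}      # data type -> value, added only by the owner
        self.grants = {}    # app name -> set of permitted data types

    def add(self, data_type, value):
        """Principle 2: the owner decides what goes into the record."""
        self.data[data_type] = value

    def grant(self, app, data_types):
        """Principle 3: sharing is decided case by case, per app."""
        self.grants.setdefault(app, set()).update(data_types)

    def read(self, app, data_type):
        """Any access outside an explicit grant is refused."""
        if data_type not in self.grants.get(app, set()):
            raise PermissionError(f"{app} has no grant for {data_type}")
        return self.data[data_type]

record = HealthRecord("dave")
record.add("weight", 82)
record.add("blood_pressure", "120/80")
record.grant("fitness_app", {"weight"})
# fitness_app can now read "weight", but any attempt to read
# "blood_pressure" raises PermissionError.
```

The key design point the sketch tries to capture is that the permission check lives in the platform, not in the goodwill of each application – which is what “not bolting privacy on afterwards” means in practice.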

Privacy isn’t a binary problem and there is no single answer, but we can’t afford to ignore this key area; we need to listen to (and engage with) the experts – organisations like BigBrotherWatch, Privacy International and NO2ID are excellent examples of people who are actively engaged in privacy discussions across the board in an attempt to help us all do a better job of getting this right.

Sure, there’s more to it than this, but the point I’m trying to make is that privacy is going to be the “killer” topic in IT for the next few years (if you don’t believe me, ask Mark Zuckerberg ;-) ).  Our collective success in addressing it properly will only come if we work together to understand the issues and build on the above principles to make it stick.

Cloud Computing – What’s the Point?

Tuesday, October 27th, 2009

http://www.podtrac.com/pts/redirect.flv/www.theenvisioners.com/wp-content/uploads/podcasts/Episode4.flv

Back in the Summer, Matt Deacon asked if I’d like to give a presentation on the subject of Cloud Computing to an Architect forum he was planning in the UK for September. I said “yes” immediately because I was getting increasingly frustrated with all the hyperbole about Cloud Computing being “the Future of IT” when all that was really being discussed was cost containment and greater agility, and frankly I wanted to prove that there really was more to it than that.

So, several weeks passed, the deadline loomed, and I set out to prove my theory that Cloud Computing would enable some significant outcomes that would transform society’s use of technology. Take a look to see how I got on…

You can download the webcast here (right click and “save as”) or click here to subscribe to the Envisioners podcasts on iTunes.

This presentation uses the superb Productivity Future Vision video generated by Microsoft’s Office Labs team. You can find this video (and get the background and more detail) here…

Finally, you can also download the slides I used here – like everything on this site, they’re available for use under a Creative Commons license, so feel free to use them if they’re helpful to you, but please respect the copyright of the image authors (see last slide in the deck) and ensure you are licensed properly for their use.

6 Themes for IT’s Future

Friday, August 14th, 2009

http://www.podtrac.com/pts/redirect.flv/www.theenvisioners.com/wp-content/uploads/podcasts/Episode3.flv

Wow, time flies.  It’s been a busy and slightly ugly Summer, but enough of that; it’s finally time for us to bring you the presentation I made at this year’s Architect Insight Conference back in May.   A particularly important event for me, as it marked the first public release of the 6 key themes we’ve been working on for the last 12 months or so.

This presentation walks you through the 6 key themes that are the foundation of all of the challenges (and opportunities) we face in helping move the value of technology in our society even further forward and why, in some cases, our initial perceptions of them are not always correct.

Sit back (remind yourself what Summers _used_ to be like) and enjoy…

You can download the webcast here (right click and “save as”) or click here to subscribe to the Envisioners podcasts on iTunes.