Elizabeth Denham’s speech at the Institute of Directors Digital Summit on 17 October 2017.
Thank you Jeremy for that introduction. And thank you for inviting me here today. It’s been an inspiring morning hearing about the achievements and opportunities in the digital world.
I don’t need to tell you how the world has changed in just a generation. We have a digital infrastructure that was unimaginable when the Data Protection Act was forged 20 years ago.
Whether you were into BoyZone, Britney Spears or the Beautiful South, I doubt you ever imagined you’d have a computer in your pocket allowing you to download your favourite tracks wherever and whenever you wanted.
Well, organisations – like yours – are looking for ways to exploit new technologies and enrich their services.
The fuel propelling these advances is big data – vast datasets that are constantly and rapidly being added to. And what exactly makes up these datasets? Very often it is personal data. The online form you filled in for that car insurance quote.
The statistics your fitness tracker generated from a run. The sensors you passed when walking into the local shopping centre.
In this world of big data, AI and machine learning, my office is more relevant than ever. You only need to read the industry-led report on AI released on Sunday to see how the ICO needs to work to keep up with technology. Our AI and Big Data paper recently won a major award at an international conference of global regulators. That is external validation of our status as a tech-savvy regulator.
I oversee legislation that demands fair, accurate and non-discriminatory use of personal data; legislation that also gives me the power to conduct audits, order corrective action and issue fines.
Under the new data protection reforms my office will be working hard to improve standards in the use of personal data through the implementation of privacy seals and certification schemes.
Many of you will know that we’re currently investigating how political campaigners and social media platforms use data analytics to target potential voters with bespoke adverts or information.
My office’s investigation is ongoing, but this much is clear: these tools can have a significant impact on people’s privacy and autonomy and it is important that there is greater and genuine transparency about the use of such techniques. We must ensure that people have control over their own data and that the law is upheld.
And as the way we view, handle and utilise personal data changes, the law that protects it must too.
Next year on 25 May – 220 days away to be precise – the General Data Protection Regulation comes into effect. The GDPR places specific new obligations on organisations, for example around reporting data breaches and transferring data across borders.
But the real change for organisations is understanding the new rights for consumers and citizens.
It’s an evolution of the current law and a step change that brings greater accountability, transparency and consumer control.
These are the three pillars of data protection law that will give people agency over their information.
Individuals will have stronger rights to be informed about how organisations use their personal data.
They’ll have the right to request that personal data be deleted or removed if there’s no compelling reason for an organisation to carry on processing it, and new rights around data portability and how they give consent.
GDPR. It’s not quite here yet… but I’ve been asked to talk about what comes next.
Well, what comes next is the Data Protection Bill.
It will put in place one of the final pieces of much needed data protection reform. Effective, modern data protection laws with robust safeguards are central to securing the public’s trust and confidence in the use of personal information within the digital economy, the delivery of public services and the fight against crime.
Matt Hancock has already spoken clearly about that this morning.
So what I want to talk to you about today is not the process by which we oversee data protection in the future, but the spirit behind it.
Because the exact form of legislation may vary the route, but the direction of travel for privacy and data rights remains the same.
When I speak to people – regular people – they aren’t concerned about the details of GDPR or the new Bill or what legislation might follow it.
They want to know – is my personal information safe? Who’s making sure it is? Who’s on my side?
For me, the end game in the data protection field is always about increasing public trust and confidence in how their personal data is used. And I will always stand up for the privacy rights of UK citizens.
That’s what the ICO wants to achieve. It’s our mission.
Our Information Rights Strategic Plan commits us to leading implementation and effective oversight of the data protection reforms.
It commits us to exploring innovative and technologically agile ways of protecting privacy.
It commits us to strengthening transparency and accountability and promoting good information governance.
And it commits us to protecting the public in a digital world.
This matters to me and my office. But I know it matters to you too.
Many times I have quoted this statistic – four out of five UK people do not trust private companies – businesses like yours – with their personal data. I suspect this concerns you as much as it does me.
Innovation in the digital economy relies on consumer trust. Innovation in government relies on citizen trust.
Innovation is at least as much about adoption as it is about invention. This is a good point to consider the Royal Free and Google DeepMind project, where innovation took priority over privacy and, as a result, threatened to undermine trust and confidence.
I’m sure you didn’t miss this. But if you did, let me give you a quick recap.
This summer my office ruled that the Royal Free London NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google DeepMind.
It turned over 1.6 million sensitive medical records as part of a trial to test an alert system for acute kidney injury.
Our investigation found several shortcomings in how the data was handled, including transparency for patients.
There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.
Where were the controls and the transparency? Where were the considerations for data retention? Where were the contracting safeguards?
And, importantly, where was the consideration of equality of access for competing innovators?
Royal Free paid a price but the real value of this story was in the message it sent to others – privacy matters.
It can never be a choice between privacy and innovation – it’s hard to imagine how the end would justify the means.
The two can sit side by side – they must sit side by side. And while it’s not always an easy relationship, it’s up to you to make it work. Because that’s what the law requires and it’s what the public expects.
Data Protection Impact Assessments inform a project and prevent you driving too far down the wrong road. It’s expensive, time-consuming and reputationally damaging to have to travel back and retrofit controls.
So, yes – be creative. Be cutting edge. Pioneer. But don’t sacrifice people’s legally enshrined fundamental privacy rights in the name of innovation.
And you know the ICO is a champion of innovation.
Earlier this year we launched a Grants Programme to promote and support independent, innovative research and solutions that are focused on privacy by design.
There will be a number of grants awarded each year of between £20,000 and £100,000.
The 2017 grants awards will be announced very soon. We’ve managed to whittle the applications down from over 100 to four. That was a tough task. Watch this space.
We’re also looking at how we might engage more deeply with companies as they seek to implement novel business processes that impact personal privacy, and how we can build a regulatory safe space, or sandbox, where companies can test their ideas, services and business models.
So what’s next after GDPR? I expect more of the same. This isn’t like preparing for Y2K. It’s evolution not revolution. And it’s an opportunity.
Those organisations which thrive in the changing environment will be the ones that look at the handling of personal information with a mindset that appreciates what citizens and consumers want and expect.
That means moving away from looking at data protection as a tick box compliance exercise, to making a commitment to manage data sensitively and ethically.
When you commit, compliance will follow.
If you haven’t started preparing for the reforms, it’s not too late. There’s plenty of guidance on our website and the GDPR workshop session which follows my slot will provide practical advice.
I am also happy to announce today that on 1 November we will launch a dedicated service for small enterprises, based around our telephone helpline.
This is part of a package of resources and support for small and micro-sized businesses in the UK.
This room – and indeed this platform – is groaning under the weight of innovation.
It’s inspiring to see and hear how digital advances can make our lives easier, richer and support businesses to run more efficiently and thrive.
And I guess you might feel a bit like I’m raining on your parade here with all this talk of regulation and consumer rights. But I am the UK regulator and my job is to protect the privacy of UK citizens. I’m not asking you to cancel the parade – far from it. But it’s advisable you take an umbrella.
This post was originally published here: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/10/institute-of-directors-digital-summit/