Where’s my data? The problematic legacy of data-driven campaigning

 

GINA: Hello, welcome, everyone.

Thank you for joining us today. We are waiting for attendees to make their way into the main Zoom space.

Hello and welcome to the latest event from the Minderoo Centre for Technology and Democracy at the University of Cambridge. I am Gina Neff and I am the director of the Minderoo Centre for Technology and Democracy. Before I introduce today's guests I want to let you know that this event is being professionally live captioned by Jen. If you would like live captioning you can select it using the Zoom toolbar at the bottom of your screen. Additionally there are StreamText captions for the event. This is a fully adjustable live transcription of the event in your browser. If you want to open it, we are sharing the link to the StreamText now. We will also have a transcript made available online after today's event.

 

So before we begin, a few points of housekeeping. The event is being recorded by Zoom and streamed to an audience on the platform. By attending the event you are giving your consent to be included in the recording. This recording will be available on the Minderoo Centre for Technology and Democracy and CRASSH websites shortly after the event. Our guests tonight will speak to us for around 30‑40 minutes and then we will open up the discussion and take questions from you, the audience. These questions can be asked through the Q&A function on Zoom. We will share those questions during the discussion, so please don't place questions in the chat, use that Q&A tab.

 

We would very much appreciate it if you could complete a short feedback questionnaire after the session. The link to that will be sent via Eventbrite. Please follow us and tag us on Twitter and other social media platforms @MCTDCambridge, for example if you are live tweeting tonight's event. So with little further ado, I am really excited for tonight's event. At MCTD, the Minderoo Centre for Technology and Democracy, we produce critical research to really reimagine society's relationship to digital technologies through evidence‑based work that leads towards policy change. Our research is anchored in creating ways to build capacity and in how we as a society can hold tech systems to account in order to create better futures for people, societies and the planet. So with the next UK General Election on the horizon, what role does citizens' data play in those elections?

 

Tonight's event will explore the role of data‑driven campaigns and the legacy of data‑driven campaigning in the UK. This follows the recent publication in the journal Media and Communication of research by our research associate, Julia Rone, who joins us tonight. Julia's article explores public participation in decisions about campaign data both during and after referendum campaigns. Throughout tonight's event we will examine the ramifications for citizens of data practices in previous UK elections and referendums, explore this theme of public participation in decision‑making on campaign data, and examine what challenges we might expect in the near future.

 

So, for this discussion, I am delighted to be joined by three panellists.

First I would like to introduce Declan McDowell‑Naylor. Declan is a Senior Policy Officer at the Information Commissioner's Office. With experience in technology policy and regulation, Declan has a focus on data protection and privacy, and he holds a PhD in political science. Also with us this evening, or today, depending on your time zone, is Bethany Shiner. Bethany is a senior lecturer in law at Middlesex University and a qualified non‑practising solicitor advocate. Before joining Middlesex she was a review lawyer at Public Interest Lawyers. Her specialisms include judicial review, the Human Rights Act 1998 and the use of data in political digital campaigns.

 

Finally, Julia Rone. Julia is a research associate at the Minderoo Centre for Technology and Democracy. She has spent the last decade doing research on politics and the utopias and dystopias of digital media. She has focussed in particular on sovereignty conflicts around Brexit and free trade agreements such as CETA and TTIP. As mentioned, Julia recently published an article in Media and Communication entitled 'Beyond Brexit: Public Participation in Decision‑Making on Campaign Data During and After Referendum Campaigns'. The research raises questions of who should have a say in how people's data are used in referendum campaigns and afterwards, and it makes the case for democratising such decisions. I am now going to hand the floor over to Julia to introduce her paper and the topic for tonight's discussion.

 

JULIA: Thank you very much, Gina. I will try not to speak for too long, and I want to start by giving some background. This paper is part of a special issue on referendum campaigns in the digital age. I was invited to be part of this special issue mainly because of my previous work on the Brexit referendum and discourses of sovereignty during this referendum. One of the things that really struck me, and the reason I wanted to write this paper, was that despite so much discussion about taking back control and all of this rhetoric we remember painfully from this period, there has been very little attention to what happens to citizens' data during the referendum. One of the most shocking things for me was reading the story of the collapse of the People's Vote campaign. It was a huge campaign; I attended one of their protests with thousands of people. At some point this campaign completely unravelled and all the data of the participants in the campaign, as well as donations etc, were suddenly left in the hands of a private organisation, with very little opportunity for citizens to have input into what happens with this data.

 

So for me this was the initial provocation: when discussing popular sovereignty we rarely apply this concept also to data and digital technology more generally. I am working also on trying to democratise decision‑making on digital infrastructure, but there has been surprisingly little work on democratic participation when it comes to data and especially data used in political campaigns. So this is kind of the broad background of the paper.

 

I am going to very briefly present the main arguments I make. My presentation is intended fully as a provocation, because, precisely due to this lack of, I would say, extensive academic research on participation in decision-making on campaign data, it seems to me almost utopian to talk about this. Hopefully it isn't and we will hear from our guests whether there is some possibility.

 

So the two main points with which I start this paper are the following:

 

First, there has been extensive focus on the role of Cambridge Analytica as a company in the Brexit referendum and the way this company supposedly influenced the results. I was reading a fantastic paper while preparing for today's presentation, and I will briefly read a sentence from it which is extremely indicative of the way we have discussed the Brexit referendum. It sounds uncontroversial, and hopefully I will create some doubt about it. The sentence is the following:

 

'The events related to Cambridge Analytica reveal the attractions and perils of these types of political practices in the framework of the US Presidential election and the Brexit referendum.'

 

This is an almost standard sentence with which a lot of articles on campaigning in the digital age start. It is taken as given that the Brexit referendum was very much influenced by Cambridge Analytica. One of the things I do at the start of my paper is argue that this has not been the case. There have been official investigations showing that the company's role, if anything, was limited and did not decide the referendum. And yet it seems that academic research, media coverage and the official investigations produced by regulatory bodies are talking at cross purposes. There is very little dialogue between them.

 

So one of the points I want to make in the article is that actually beyond these slightly overblown theories of psychological profiling and targeted ads and secret dark operations what happened during the Brexit campaign and in its aftermath in the years between 2016 and 2020 was a much more trivial story of misusing data, abusing data and engaging in a series of different malpractices when it comes to citizen data. It's not as scary. We can't speak of democracy being hijacked. Beyond these fears of hijacking, what we have witnessed is a much more trivial, but I would argue fundamental problem of lack of participation in deciding how citizen data is used in campaigns.

 

Second, I argue in the paper that most of the academic research we have on the Brexit referendum focuses on the campaigns that led up to the referendum. Yet we know much less about what happened afterwards, in campaigns such as the already mentioned People's Vote, Leave Means Leave, etc. What I want to argue in this paper is that these supposedly not-so-fatal malpractices, for which in the end the campaigns were fined, are actually very problematic. I am just going to give several brief examples of such malpractices. So what do I mean?

One example is Arron Banks, from the unofficial Leave campaign, who passed data to Cambridge Analytica.

In the end, he did not hire the company, so this part of the story is in brackets, but what I find astonishing is the ability of a campaigner to pass party data to a private company without any opportunity for party members to A) know about the transaction, or B) influence the transaction. Another example: Leave.EU received ads from Eldon Insurance, the company of Arron Banks, and vice versa, Eldon customers received materials promoting a vote to Leave. Again, a highly problematic mixing of private and public, of commercial advertising and the political. Another example: the Lib Dems sold data for £100,000. This has also been a long, protracted scandal in which it was difficult to establish what data was sold and whether it was in some way extended, expanded, etc. But again a very controversial case of a party selling its members' data to a campaign, without any oversight or accountability.

 

And finally, what I already mentioned at the beginning: the collapse of the People's Vote campaign. Basically what happened was that the Britain Stronger in Europe campaign was relaunched after the referendum as Open Britain. The organisation was chaired by a famous PR figure, Roland Rudd. Rudd hired two specialists, Tom Baldwin and James McGrory, who organised this whole movement for a People's Vote, which was one of the biggest campaigns in UK history and was responsible for some of the biggest protests we have seen since the Iraq war protests. This campaign completely unravelled; both Tom Baldwin and James McGrory were fired. Most supporters of the campaign could not influence what would happen with their data. In the end, the organisation that kept the data was later renamed, I think to Democracy Unleashed. I need to check the exact name.

 

Again we see, over and over, this kind of transfer of data from political parties to referendum campaigns and to different third-party actors who reuse this data, with very little possibility for citizens to influence this or participate in decisions about it. Which, I think, also calls into question this whole notion of referendum campaigns, because what we see is a constant flow of data from election campaigns to referendum campaigns, etc. Now, there have been different regulatory attempts, and I am going to finish and wrap up soon, to counteract these highly problematic developments.

The problem for me is that most of the regulatory calls we have seen focus on a framework of data protection that is very well known from the European GDPR, for example. A framework that focuses on individual rights and, of course, on the ability to have data flow freely between different markets.

 

So we have an individual fundamental rights justification and this kind of commercial justification, as Maja Brkan notes in a recent article. What we don't have is an account of the ways democracy is affected by digital campaigning. Transparency has been another very important buzzword, and there has been fantastic research by Katharine Dommett, who shows how it appears again and again in regulatory documents, yet there is very little precision in defining what type of data we want to be transparent and how it will be accessible for citizens.

 

Both paradigms of course are useful; they are currently being used and they are the dominant paradigms of regulation. What we completely miss is a focus on the broader democratic implications of this profoundly undemocratic handling of citizens' data during campaigns. What I argue for in this article, and this is my provocation, with which I am going to leave you in a second, is that we need more citizen participation in deciding for what purposes data is collected and what kind of data political parties and third-party campaigners can buy and collect. Would we like them to get data from Facebook? Very interesting research in the Australian context shows that citizens are highly uncomfortable with these data purchases. Knowing they are taking place is not going to solve the fundamental problem of citizens not wanting their Facebook data used for political advertising. So I argue we need to include more citizens' voices in deciding what data is acceptable to use, how to use it, in what context, etc. This can happen through different means. We can use different democratic innovations such as citizen panels, focus groups and public consultations as ways of involving the public when deciding how to regulate campaign data.

 

Alternatively we can try to think of even more utopian ways of what kind of things people are comfortable with. There have been very interesting attempts to think about data, data privacy and rights in more collective ways. Most of this research has focussed on the platform economy. We have almost no thinking about public participation when it comes to campaign data. There are many problems with this, I will not enter into them, I am certain we will discuss them in the following discussion and questions and answers. But, on my side I just want to make this provocation and argue that we need more public participation in deciding on campaign data use. I would be very happy to hear whether this is feasible at all legally and what is the opinion of the Regulators on this. So thank you very much.

GINA: Thank you, Julia. I am sure many of you on the call with us today have questions; please get your questions into the discussion. I am inviting Declan and Bethany to join us on screen, on camera. Bethany, I hope, there you are. And thank you, Julia, for that great provocation, for that great food for thought. We are going to jump in and think about the ramifications for citizens of these data practices in previous elections and referendums. What does this mean for people in the UK? Bethany, you have thought quite a bit about some of the big consequences of this data economy for election campaigns. Can you tell us more about that?

BETHANY: Yeah. I mean, it's hugely complex and it's almost difficult to isolate the issue of data from other issues to do with electoral campaigning, whether that is a General Election, a local election or a referendum. The issues are deeply intertwined and they cover various different bodies of legislation, regulation and guidance, specifically the Data Protection Act 2018, various different parts of electoral law and of course the guidance that comes from the regulatory bodies, the Electoral Commission and the ICO. So all of these areas at once touch on this issue, whilst also failing to grasp and fully regulate the use of data, and whilst also failing to allow data subjects, to use the language of data protection law, but really citizens, individuals, to see, understand and have any agency over how information about them is used for the purpose of informing them during an election.

Persuading them. Contacting them. All of these things.

So these are the broad issues. Gina, I don't know how long you want me to talk for?  I could go on. But I don't want to take up too much space.

But that is certainly one overall problem: we have various different bodies who are all meant to be in the same space, and yet none of them can tackle not just the issue that Julia raised but all of the related issues that go to democratic participation. I will leave it there.

GINA: That is a nice transition to Declan, who I know has written about public participation in the decision‑making around campaign data. And what the opportunities and what the limits are there. So Declan, would you like to use this opportunity to jump in?

DECLAN: I was very grateful to be included this evening, because Julia's paper and this panel touch on three things which I am very interested in: public participation in technology and data policy per se; data protection, obviously, now that I work for the ICO; and election campaigning as well, which is something I did some research on when I was an academic. The point about data subjects and citizens is really interesting, because parties have access to electoral data, so we are talking about a very wide range of data subjects in that scenario. In terms of the implications for citizens, any area of election campaigning that is not compliant with data protection is therefore not fulfilling the rights of data subjects. So where we see the things that Julia was talking about, questionable behaviour, behaviour that perhaps needs to be looked at, then we are talking about the violation of people's rights. I think that is a really important thing to draw attention to, because it then goes to the broader points around democratic harms, which Julia talks about in the paper, and how those are tackled. That is where I think the conversation gets very difficult and complicated, because there are lots of things to disentangle.

 

So if someone's right is violated, you could make the case that that is a democratic harm. We might say it's a slightly different harm. In those scenarios, the action that can be taken to protect those rights is mitigating or preventing harm. And this can be done in quite parochial and, relatively speaking, trivial ways: DPIAs, for example. Doing a DPIA is quite a dull mechanism, but it is an important one; what DPIAs are designed to do is get a data controller thinking about the way they are using that data. Then you have, I think, the more interesting end of the scale, which is this argument that Julia is making about participation. That gets us beyond data protection in terms of compliance and more into the realm of data ethics. Kate's arguments are great about what transparency means in practice and what the mechanisms for transparency are, and participation is potentially one of those, because if you take the view that public participation in the use of data is aligned with Article 5 of the GDPR in terms of transparency, and if the outcome of participation is to produce transparency, then that is a GDPR-compliant outcome. It's also one you could argue has a democratic benefit at the same time. One thing I am trying to do is close the gap between data protection and positive democratic outcomes, if that makes sense.

 

This is the final point. I think democratic participation is in itself an exceptionally complex area, particularly when it comes to tech and policy, and this was the focus of my PhD. There is extensive STS literature which explores this. I think in the modern context there are two sides to it. One is the intent to actually do it; that is one of the arguments, and that is where we are in data ethics: there is nothing in legislation, nothing legally, which says you have to do participation. So it really falls on controllers and processors, on parties and citizens and everyone else, to inculcate and do it.

 

Then there are existing mechanisms for democratic participation; they do exist, but the argument I would carry forward is that they are generally insufficient, because they take the consultancy route: you get some consultants in to do workshops, and the value of it is less than it could be. I think that is the other way to look at it. I will end there; I have covered a lot of ground in my opening salvo.

GINA: Both great salvos, thank you. So, looking forward, what challenges might we expect to see around campaign data in the near future, say in 2024, as we ramp up for a General Election in the UK? What might be some of those challenges? I open it to all three of you.

 

BETHANY: I will jump in. I think the challenges in many ways will be the same as what we have seen before, because there has just not been enough reform and there has not been enough reflection, if you like. A huge amount of interest was generated following the Cambridge Analytica scandal, which we know is just one tiny pinprick of the activity that occurs in this area. Since Cambridge Analytica this business model has continued, because the use of data, the gathering of data, the analysis of data and the selling of data is a vast industry, largely unregulated and hugely global. And there have been recent insights into how the different companies engaging in these types of practices are working.

 

There was an investigation and a story in the Guardian. These are hyped-up practices, described as black operations, where we have the use of individual personal data, but we also have that matched with other practices: disinformation, bots, hacking and of course old-fashioned substitution. The aim of this is to artificially and coercively shape electoral outcomes. And there is nothing wrong with the electorate being communicated with. There is nothing wrong with my local Labour councillor, or Conservative campaign HQ, knowing where I live, knowing my age and my gender and any of these things.

 

The problem is that the knowledge being gathered is becoming deeper, but also that the practices surrounding it are, I am not sure what word to use; as Julia said in her article, the use of the word transparent or non-transparent is often not specific enough. But they do nothing to enhance trust, let's say.

We have a historic and very old model for elections and democratic participation, and that model hasn't accounted for these forms of political campaigning. There is often a discussion around the consent of data subjects, and consent is really important, and consent goes to trust, but it's not the complete answer. I don't think there is one silver bullet. I don't think transparency is a silver bullet and I don't necessarily think consent is a silver bullet, because this is a web of complexity. The issue is not only whether data is being used against us in ways that we can't see or have any control over. That is certainly a massive issue, but there is also a related issue, which is that all of this erodes trust, and it has the potential to erode trust in the electoral process itself.

That is reflected in recent research the Electoral Commission has done. The Electoral Commission ran recent polls, and one of the questions was specifically about this. The Electoral Commission also commissioned a focus group report, where a number of people were put in a room to discuss all of these issues. I found reading that report very insightful, and it comes out time and time again: the question of trust, the feeling of being deceived, and the feeling of having our basic agency, our political agency, taken away and used against us.

 

The last point I would like to make, as I don't want to talk too much, is that there have been ongoing attempts to address this. It is hugely complex and not just about the use of data. But one issue which has fallen out of the Cambridge Analytica story is the question of foreign interference. The ISC published its report on the possibility of Russian interference; the Government refused to investigate that further, and there is an ongoing matter before the European Court of Human Rights. Just as an update: the Government were written to by the Court and asked to address the petitioners' concerns and questions about why an investigation wasn't opened. The potential for Russian interference is another thing that is eroding trust; it's eroding people's sense of agency, making their individual agency count for a lot less than it should.

GINA: That is a really good point and I want to make sure that we have time, because we have got some questions in from the audience. I want to give Declan a chance to jump in on what kind of challenges we might expect around campaign data in the near future.

DECLAN: I wrote a little list. This is partly based on some research I did a couple of years ago. I think the area of consultancies and data brokers is a really important place to look, because that is where we are looking at data being shared. That is very specific to data-driven campaigns, but I think it is really a broader issue of data processing as well. So that is one area where I think there needs to be a lot of attention: scrutinising and paying attention to that whole market for consultancy and data sharing. There is also the use of in-house tools and platforms by parties, which seems to be an increasing trend. The Labour Party, for example, has tools like Doorstep, and you can see this in the work on digital parties, for example, parties as a platform, if you like. That involves a lot of processing of voter data, often derived from canvassing, and I think there are significant data governance challenges there, particularly if you are using people to collect data who don't have GDPR training; that creates a data governance risk.

 

Targeted advertising: I think we are aware of the issues that exist there. One interesting thing is that the new changes being put in place by, for example, Apple basically take a whole swathe of wealthy iPhone users out of the equation when it comes to targeted advertising, which I would hypothesise will change the techniques we might see in the next election; that would be an interesting thing to look at. Machine learning: I think Bethany was alluding to this, just the depth and scope of information that is required for that.

 

There is also a really interesting trade-off with machine learning, in that the less private the data is, the better the model. So you have to find the right balance between utility and privacy. Campaigns want to win, so they will steer towards the utility side of that, and that creates incentives that might not be compliant. Then, attached to that, there is the capacity now for inferred data. This was in a report that came out of the ICO's work. The data points are multiplying massively: we have metadata, biometric data, and all of this can be collected and, using machine learning, used to infer things about people that they didn't expect, which would violate the principle of what data subjects expect their data to be used for. If you are using a smart watch you are not necessarily thinking this is going to be used in a campaign, but it might be.

 

So I think there are really important challenges there about purpose limitation, about what we are using this data for, and that links back to the consultancies and the data sharing. And just on that point, I will pass over to Julia, but I think the trust issue Bethany raised is really important. It's definitely a good one to discuss on the panel.

I will leave it there and hopefully it answers the questions.

 

JULIA: If I could very quickly jump in. I really liked what Bethany said at the beginning about having a lot of regulation, legislation and guidance that touches the issue but fails to grasp it. This is my fear when we look at the current frameworks for data protection, because, for example, both consent and the legal grounds for processing in election campaigns fail to grasp the core democratic problems. First, because we all know how we give consent. I think the example of a smart watch is a fantastic one. We don't always have the meaningful consent that the legislation suggests we would be able to give.

 

Similarly legal grounds for processing seems to me a bit too vague in allowing too many malpractices. So these challenges are there.

I would agree and disagree with some of the things Bethany mentioned. I do agree that what we have is a huge problem of trust. I also think we tend to overestimate the capacity of political parties to influence voters. There are huge potential risks, but there has been interesting empirical research within the UK and comparative research across countries; within the Italian context, Cecilia Biancalana has done some really interesting research on digital campaigning. The real practice is quite often not as sinister and horrible as we imagine. But I do agree that the big problem we have is this very strong narrative of fear, lack of trust and potential dangers, which are there, that makes people really doubt the democratic process. And here I come back to my main point: I do believe that having citizens involved in discussions a bit more, rather than this somewhat dry, legalistic approach, would hopefully help enhance trust. I am not sure whether it would really help; it might also undermine it further and lead to polarisation, but I think it's worth trying.

DECLAN: I have two sentences to add to that. The ICO's view is very much aligned with the academic research, such as Kate Dommett's work: the tools are much more mundane.

 

Secondly, I like this point about exploring participation, and I want to say that I am behind that view. I don't think there is a compelling argument to say we shouldn't have democratic participation in data; that would be an undemocratic society. That is a broadly agreeable point. I think the difficulty is in the implementation, and that is really where the heart of the issue lies. So it's slightly more than two sentences, but it was intended as two sentences.

GINA: That is a good point to turn to a question from Kate. She asks: I am interested in whether you think it's possible to identify universal public preferences for standards around data. Is it possible, for example, that individuals from minority or persecuted groups have different views towards data? For example, women are often more cautious than men. How could these different preferences be reconciled?

JULIA: It's a fantastic question and I think it confirms beautifully what Declan just said: the problem with this idea is the problem of implementation. I would argue that, first, we have preferences differing from country to country, which has been very clearly researched and proven so far, and indeed within a country we have very different preferences among different groups of the population. One thing I would change in the paper is to focus more on democratic participation rather than citizen participation only, since that notion is a bit more inclusive. Of course, when it comes to voting this is a citizen's right, but I would agree completely that different groups have very different preferences, and then maybe we can try to think of more targeted approaches. I don't think we need a universal, one-size-fits-all approach. We can try, for example, to have different representative bodies for different groups, and have people share as much data as they feel comfortable sharing. Maybe someone would not mind having their smart watch data shared with a political party; others might find it highly problematic. It is a complicated question, but I think it should not prevent us from thinking about different possible solutions. To think about implementation, we need to first think about democratic participation, and this has been so incredibly absent from the legislation and the existing approaches. Once we start thinking about implementation we will discover a lot of things that can go wrong, but I think it needs to be at least on the agenda. That is my short answer, well, long answer!

DECLAN: There is broad agreement. One thing I would push back on is legislating for participation, because I am not sure how it would work, for various reasons: Parliament would have to adjudicate on, de facto, what democracy is and what public participation is, and I don't think that would be a workable legislative process with the parties involved. The more realistic territory is case-by-case best practice, potentially with the backing of certain organisations. I am not sure the ICO is that organisation, because it's not really in its remit, but there are parliamentary bodies, third-sector organisations and academics with a fantastic capability to put these ideas forward. As an approximate example, you can look at things like the data charters that Camden Council is developing.

The reason I emphasise the case-by-case approach is that the applications of data, and how they may end up in an election campaign, are varied. If you take the smart watch example, the use case for participation there would be very different from the basic one around the electoral register and how people expect that data to be used. So it's difficult: because the sources and touch points of data collection are so varied and wide, we will never have a consensus, and even if you did get something through in legislation it would probably be uselessly broad. Again it would go back to a data ethics question. Kate's question is fantastic, as usual, because it gets to the knottiness of a democratic process: when you start to take into account different groups' views, that is a problem of participation, of democratic participation per se, and one of the things you would have to understand to push this forward. But it doesn't detract from the point that participation is generally a good thing and, I think, well aligned with GDPR in terms of transparency, though I know there are different views on that. I will leave it there.

GINA: We have another view on participation in the form of a question from the audience, from Alex. Alex notes there is a tension between people having a keen sense of unhappiness with how their data is being used by political and business actors, and the idea that they will participate in processes around data governance, which gets close to data trusts at scale, whose utility hasn't been borne out by quite a few of the practical trials or the longer-term civic technology interventions. So Alex asks: how does the panel think about the gap between these aims, and sustaining public enthusiasm and participation, and the accompanying risks of overrepresentation of, bluntly, data and policy nerds in this space? So we talk a good talk, but how would that work? Anyone want to take up Alex's question?

DECLAN: That gap is well established in the science and technology literature, to go to that point. It also connects to Kate's point: people with an education would have a better knowledge of GDPR and more access to the process. That is not unusual when we talk about tech, because technical expertise buys you more access to a conversation or a participation process. That is a wicked problem, and I am not sure we can come up with a solution now, but I know there are some answers in the science and technology literature that could directly address it.

JULIA: I will be very quick. Referring also to what Declan commented on, there are probably things that are more general and that can be decided, such as whether people are okay with having different data sets and different parts of their personality merged, for example Facebook data and data from wearable devices. One approach would be to have opt-ins: when people register they can be informed about the possibility of this data being used, and then depending on their preferences they can go for it or not. This does not solve the problem of some people being more aware of the dangers and some people being less aware, but these more general approaches might allow us to prevent some of the worst practices that we are currently observing, from which everyone in this ecosystem profits apart from citizens, as Bethany has shown in her research. In terms of maintaining interest, that is a fantastic question and I do agree that some people might be overrepresented. One way to address this, just an idea, would not necessarily be governance, but rather more consultation processes at the moment of devising the legislation, where people are, for example, drawn by lot. There are all kinds of attempts to address the very reasonable critiques that exist against democratic participation.

So we can try to somehow balance this kind of overrepresentation of educated people, tech nerds and so on. It's not going to be easy, and I would argue that one of the important steps in this direction is to increase public understanding of technology beyond scandals and media hyperbole, which is something that our centre is trying to work on and improve. It's not easy, but I think these are questions that are absolutely valid, but addressable, hopefully.

GINA: Please, Bethany.

BETHANY: It's okay we can move on if you were going to move on to the next question.

GINA: We have just a few minutes left, so I wanted to give each of you the chance to pipe in on what needs to change. What needs to change in the electoral landscape? Maybe, Bethany, we start with you?

BETHANY: Okay, I think a lot of this comes down to the fact that we don't have clear laws, the laws aren't being understood by the people who need to understand them, and they are not being enforced properly. We wouldn't need to have a conversation about democratic participation in the collection, use, selling and then misuse of data if these things were in place. I don't think that most people, except the data nerds, would find much interest in being involved in something like this; the question has only been raised because of this failure in the regulation, in the laws and in the enforcement. So that is what needs to change. Data ethics are wonderful, but we have laws in place and they are not working. The selling of data is something we could really focus on more: the regulation and oversight of the sale of data. When data is sold, what is that data? If it shouldn't have been sold, we need to know what the data was. There have been some changes to the landscape, as I mentioned, such as the digital imprint rules for elections and so on. And I think there could be a lot more fleshing out of the reasonable expectations of data subjects in the ICO guidance. I know the ICO does refer to it in its guidance, but I don't think there is enough clarity on what a reasonable expectation might be. Of course this is well developed in public law and in grounds for judicial review, so maybe there is something there.

GINA: Just so we have time to give everyone a final word, Declan, what needs to change?

DECLAN: There are lots of things. I promise this isn't going to be a spiel out of the corporate handbook, but I think there are some interesting things in the ICO's 2018 policy recommendations about regulators and governments working together to understand what data is being used. I think that is a viable area of exploration.

One thing I didn't see when I was doing my research was parties rolling out transparency features in relation to what they were doing. A lot of other areas of the digital economy are miles ahead of parties when it comes to transparency features, and I think there are direct imports from those areas which parties could rely on, just as easy wins in terms of increasing transparency for people at a low level. Without getting into the other things, and to change tack a little, I think the media narrative could do with coming back down to earth in terms of dealing with the day-to-day aspects of what is happening in political parties, as opposed to the more highfalutin scary stories. I think that would be a valuable change that would improve citizens' understanding. That is not a criticism of the media; I am making a general point about the media narrative and communication. I will leave it there.

GINA: And Julia, final point.

JULIA: I agree, and I think we as academics also have a responsibility to contribute to a more balanced narrative. My final point would be a reference to an interesting chapter by Colin Bennett, who talks about the party cartel in Canada that resists introducing more stringent data regulations because the status quo is in the parties' interest. I would argue we need to focus more on party actors, not only on the few rotten apples, and encourage them to implement the transparency rules that Declan referred to. I think this points to the broader question of the politics of law-making.

I agree with Bethany that one of the biggest problems is that we have laws that we don't understand, that are distant and that are not implemented, but I do expect that this will be politicised, and this public politicisation and debate might, at least for a while, spark interest and a push towards a better ecosystem. We have seen such politicisation in the field of copyright, which was considered super technical. Maybe if we see the same with how data is used in elections and referendum campaigns, we will see a more concerted push in time towards a better system, even if people disengage afterwards and don't care.

GINA: I don't know if the note of people being disengaged and not caring is the one we want to end on, but we will give Julia the last word. I want to thank all of you for being with us today.

I want to just promote a few of the upcoming events we have at the centre. The Cambridge Festival is around the corner. Please join John Naughton, Marco Caballero and others for a discussion on the right to repair and why it is vital to take back control of the things that we own. That will be on Tuesday 28th of March at 5:30pm, in person in Lee Hall at Wolfson College in Cambridge. For another in-person event in May, after the Easter holiday, please join us to welcome Meredith Broussard to discuss the ways in which technology and inequality intersect. She will be discussing her book, launched today, 'More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech', from MIT Press. That event is on the 9th of May at 5:00pm, in person at Jesus College.

Additionally, today our affiliate, Dr Alexa Hegarty, has released a book, 'Still Life With Bones'. Sorry, I just lost the script on this. It's a brilliant book. Buy it, read it. The book is a haunting account of violence, grief and forensic anthropology in Latin America. In it she works with forensic teams and victims' families to investigate crimes against humanity, exploring what science can tell us about the lives of the dead. Please follow us on social media @MCTDCambridge, on Twitter, Facebook and other platforms, and you can find out more information about these events and others on our website.

Again, thank you all very much for joining us today. We look forward to seeing you at future events.