Prof Gina Neff speaks at Data for Policy 2024 on shifting mindsets around the value of data and building capacity for open data use and governance

This year’s Data for Policy conference, themed ‘Decoding the Future: Trustworthy Governance with AI?’, convened global experts from academia, industry, and policy to explore how AI can foster trustworthy decision-making and governance.

Our Executive Director Professor Gina Neff joined the ‘At a time of rapid advances in AI, are we instead entering a data winter?’ panel on Day Two (10 July) of the conference, alongside Sonia Cooper (Assistant General Counsel at the Open Innovation Team at Microsoft), Prof Elena Simperl (Director of Research at the Open Data Institute), Barbara Ubaldi (Head of the Digital Government and Data Unit at the OECD), and panel chair Stefaan Verhulst (Co-Founder of The GovLab at the NYU Tandon School of Engineering).

The ‘data winter’ panel considered the current state of the open data ecosystem, how it has been affected by generative AI, and possible ways forward to advance data openness. The conversation centred around three main issues: the impact of data accessibility on AI development, public and private sector data commitments, and the democratisation of data through generative AI.

On the topic of data accessibility and AI development, Prof Neff discussed her provocation – published in a recent WIRED article on the digital dark age – that we risk a new AI winter if inadequate data access continues. She explained how the pulling up of API drawbridges by social media platforms, the lack of sharing by commercial actors, and the difficulty of building out meaningful access to public data are setting us on a bleak path. At a time when enthusiasm about AI is high, data access problems are slowing the delivery of AI’s potential benefits to society.

The panel discussed the need for a shift in the mindset around data, highlighting two particularly unhelpful narratives: 1) data is the new oil, and 2) data can simply be sprinkled on problems. The former has driven commercial actors to assert the inherent value of data, rather than recognising that data is most valuable when it connects (i.e. across people or datasets) and when it is made into meaning. The latter ignores the significant work required to make data useful in specific contexts and for specific problems. An AI winter will be all the more certain if we continue to cling to these approaches to data.

On public and private sector data commitments, Prof Neff focused on the need to build capacity for using data in the age of AI. She emphasised not only increasing skills and data literacy but also creating the engineering for the future, so that, echoing a point fellow panellist Prof Simperl made, the “plumbing” for data analysis does not need to be built from scratch every time. Seattle’s One Seattle Data Strategy, for example, makes data sharing part of the city’s governance mandate, and as a result the city thinks more carefully about what data to collect and maintain. Openness and privacy do not need to be opposites, and strategic approaches to open data can point the way to alternatives to surveillance capitalism. Rather than hoovering up data in case it proves useful, governments and companies should adopt better data minimisation approaches.

Finally, on the subject of democratisation, Prof Neff concluded that more data in people’s hands can transform how they ask questions, connect to others, and view the world. Convening in spaces like Data for Policy and the Data & Policy community can ensure that we keep building toward that optimistic future where data is truly accessible and used responsibly by governments and private actors.