Minderoo Centre for Technology & Democracy

Annual Report 2023-2024


 

March 2024

Minderoo Centre for Technology and Democracy

The Minderoo Centre for Technology and Democracy is an independent team of academic researchers at the University of Cambridge, who are radically rethinking the power relationships between digital technologies, society and our planet.

www.mctd.ac.uk

 

Table of Contents

Foreword
Our Year in Numbers
Meet the Team
Key Achievements
Policy Engagement
AI Fringe
AI and Employment Partnership
AI in the Street
The Technology and Democracy Conference
Publications & Report Launches
Overview of Projects
Spotlight on Researchers
Academic Publications
Events
Media Engagement
TV & Radio
Podcasts
Op-Eds and Letters
News Articles and Local Press
Governance
CRASSH

 

Foreword

Prof. Gina Neff, Executive Director

The Minderoo Centre for Technology & Democracy was founded to shift the balance of power around digital technologies. Our founding initiatives on work, trust, and the environment anchor our research, policy engagement and commitment to expanding the public’s capacity to hold the tech sector to account for the decisions that affect us all.

This year we saw global conversations about artificial intelligence and the kinds of changes that these technologies might bring for people, communities, and the planet. The expansion of our work over the last year shows how the stakes have increased. We engaged a broad set of stakeholders and policy makers and consulted directly with decision makers and regulators, helping to get the right voices to the tables where decisions about digital technologies are made. It has been our busiest year, and our team has grown to meet new challenges and opportunities.

This report covers our fiscal reporting period April 2023 to March 2024. Over this time, we increased the support for our research through projects with UK Research and Innovation and the Economic and Social Research Council. We continue to partner with civil society groups like the International Committee of the Red Cross and the Trades Union Congress. Our work to date prepares us to lead in three key areas. First, we will work to ensure that AI technologies are developed, deployed and used responsibly, and that people affected by automated technologies can meaningfully participate in decisions about them. Second, we will continue to partner with communities and researchers to ensure a healthy digital public sphere, which is especially important as democracies around the world hold elections this year. Third, we will work to ensure that the infrastructure for our digital future does not come at the cost of climate goals.

Looking forward, we will continue to strengthen the links between our research and our policy engagement, working to connect, convene, and contribute to the conversations about our digital futures. We will continue to bring our expertise and support education and capacity building to make digital technologies work for societies. Joining together to build a just digital future has never been more urgent.

 

Our Year in Numbers

·     49,000 Website views

·     17,000 Website visitors

·     4,000+ Followers on social media

·     4,000+ Attendances at our hosted and participatory events

·     10 Core team members

·     30 Affiliates

 

Meet the Team

Gina Neff
Executive Director

Louise Hickman
Research Associate

Hugo Leal
Research Associate

Julia Rone
Research Associate

Hunter Vaughan
Senior Research Associate

Stefanie Felsberger
Research Associate

Timothy Charlton-Czaplicki
Research Associate

Ann Kristin Glenster
Senior Policy Advisor

Irene Galandra
Centre Coordinator

Tom Lacy
Policy and Impact Manager

Molly Hayhurst
PA to the Executive Director

Christine Adams
Research Project Administrator

Affiliate Network
We have grown and nurtured our global affiliate network. We currently connect 30 affiliate researchers from across continents and disciplines, at different stages of their careers, fostering collaboration, hosting regular reading groups, and connecting across events and projects.

 

Key Achievements

Policy Engagement

Over the last year we have doubled down on our commitment to translate our research into effective and impactful policy engagement.

Through our Generative AI policy brief, our workshops, and our written evidence submissions, we have put more energy than ever before into directly engaging policy stakeholders and government agencies. Coupled with our continued programme of thematic reports, this creates a consistent campaign front.

We continued to work with stakeholders to campaign for better researcher access to online platforms’ data. We coordinated a letter, co-signed by 20 high-profile academics and campaigners, to the Secretary of State, strongly urging the Government to back specific amendments to the Online Safety Bill and calling for stronger regulations to guarantee access to data to study the online world.

In January, Gina Neff took part in a two-stage civil society consultation with the Secretary of State and Department for Science, Innovation and Technology staff on the Government’s AI White Paper, and Ann Kristin Glenster joined partners at the University of Cambridge to meet the Shadow Secretary of State to discuss Cambridge’s capacity in this area.

We are a member of PICTFOR, the All-Party Parliamentary Group on Internet, Communications and Technology, and have submitted written evidence to a range of bodies including the UK Information Commissioner’s Office, the US Consumer Financial Protection Bureau, and the United Nations.

 

AI Fringe

In November, while the UK Government hosted big tech and international representatives at the AI Safety Summit at Bletchley Park, we were involved in a wider gathering of civil society, academia, and industry: the AI Fringe. For the Centre, the Fringe provided a locus of activity for stakeholder engagement, media work, and events.

During the AI Fringe, Gina Neff represented the Minderoo Centre for Technology and Democracy in interviews on BBC News and Canada’s CBC News (discussing the Prime Minister’s speech on AI regulation), LBC News (about the challenges AI is presenting in the here and now), Times Radio (discussing the summit as a whole), and BBC Radio 4 (on AI-generated abusive images). Gina also recorded an episode of BBC Newscast on the risks of AI.

In addition, Gina moderated a panel discussion on AI and the future of work, appeared on a separate panel on the same theme the following day, and attended dozens of stakeholder and networking meetings with diverse contacts from across the globe.

In February, RAI UK published the AI Fringe report, bringing together views garnered during the week of activity from diverse perspectives.

 

AI and Employment Partnership

We have partnered with the Trades Union Congress (TUC) on a project to draft legislation protecting workers’ rights in the age of AI.

We co-convened an expert advisory committee consisting of legislators, academics, unions, and NGOs to advise on the drafting of an AI and Employment Bill. The Bill was commissioned by the TUC and drafted by expert lawyers. Gina Neff co-chaired the advisory committee, which met three times over the last year, and staff from the Minderoo Centre for Technology and Democracy provided the secretariat to the project, coordinating committee members and bringing together written contributions.

We will support the TUC’s launch event in April, and will continue to partner with the TUC to campaign on this vital issue.

 

Looking ahead, further key activities for 2024 include:

 

AI in the Street

Louise Hickman is a co-investigator on the AHRC’s BRAID grant, AI in the Street: Scoping everyday observatories for public engagement with connected and automated urban environments. As part of this multi-partner grant with the University of Warwick, University of Edinburgh and University of Cambridge, Louise will explore divergences between principles of responsible AI and the messy reality of AI as encountered in the street, in the form of automated vehicles and surveillance infrastructure. The aim is to ground understandings of AI in lived experiences.

 

The Technology and Democracy Conference

In April 2024, we will host a conference of around 120 high-level invited attendees from across disciplines to discuss the interplay of technology, democracy, and our planet.

The Technology and Democracy Conference builds on four years of work at the Centre, and will allow us to bring together our impressive network of stakeholders to move the conversation forward on how technology should be governed, and for whom.

 

Publications & Report Launches

We have published an ambitious series of policy interventions over the last year:

 

Policy Brief: Generative AI
Co-published with the Bennett Institute for Public Policy and AI@Cam and aimed directly at policy makers, this report outlines policy levers the UK could use now to become a global leader in applying safe, responsible and ethical AI.

Deceptive Design: Workshop Report
This interim report by Ann Kristin Glenster outlines a workshop with diverse stakeholders at the Nobel Prize Summit, summarises the problems and harms associated with deceptive online design, and proposes a framework for solutions.

Written evidence to US CFPB: dark patterns and online consumers
Continuing our work on deceptive design, our evidence to the US Consumer Financial Protection Bureau urges regulators to look again at how companies extract personal data for market profiling and financial gain.

Evidence submitted to the Global Digital Compact
Our response to the UN Secretary General’s call for evidence on the Global Digital Compact brought together input from 17 researchers at the University of Cambridge to explore key principles for connecting all people to the internet.

The Global Digital Compact and International Law
Building from our written evidence, this report explores key takeaways from a workshop we co-convened with the Lauterpacht Centre for International Law on the legal implications of the Compact’s mission.

Written evidence to ICO: Legal Basis for Web Scraping
The UK Information Commissioner’s Office called for evidence in the first of a series of consultations on regulating AI. Our written evidence urges clarification on the legal basis for large-scale web scraping.

Letter to the Secretary of State: Researcher Access to Social Media Data
Following on work from previous years and a workshop hosted in Cambridge, we coordinated a letter co-signed by 20 high-profile academics calling for stronger legislative commitments to guarantee researchers’ access to data from social media sites.

 

We continued to chart the way forward for the kinds of digital futures we would like to see:

Sustainable Digitalisation: Ensuring a Sustainable Digital Future for UK Film and Television
MCTD’s Hunter Vaughan partnered with Pietari Kääpä to explore the environmental and social sustainability impacts of digital media practices.

Securing the Metaverse: Addressing Harms in Extended Reality
MCTD affiliate Shannon Pierson called for better regulation and platform governance to protect users of Virtual and Extended Reality systems.

A Distinct Absence: Why Crypto and Web3 Technologies Need Environmental, Social and Governance Frameworks
In collaboration with VentureESG, MCTD affiliates Johannes Lenhard and Cessiah Lopez called for the adoption of environmental, social and governance (ESG) frameworks by venture capitalists investing in Web3 and crypto spaces.

 

Overview of Projects

Responsible AI UK

Gina Neff is Chair of the Strategy Group for Responsible AI UK (RAI UK).

RAI UK is a £31 million consortium backed by UK Research and Innovation (UKRI) to create and fund a research and innovation ecosystem for responsible and trustworthy AI in the UK and beyond.

The Strategy Group works closely with the Leadership and Delivery Teams to ensure that the programme leverages the partnerships, outputs, and research strengths of existing research council investments, and that it is inclusive of the wider research community.

As Chair of the Strategy Group, Gina advised RAI UK as it funded £1.8 million of acceleration projects, £300,000 of international partnership projects, and a soon-to-be-announced set of ‘keystone’ projects. RAI UK was also a founding partner of the AI Fringe (see Key Achievements, above) and continues to bring together stakeholders from across the UK and the world.

 

Digital Good Network

The Minderoo Centre for Technology and Democracy has joined the Digital Good Network (DGN). This £4 million, multidisciplinary and cross-sectoral network explores how to ensure that digital technologies benefit people, society and the economy.

The Digital Good Network is funded by the Economic and Social Research Council (ESRC), part of UKRI. With a management team led by the University of Sheffield, DGN has partners including BBC R&D and Birmingham Museums Trust.

Staff at the Minderoo Centre for Technology and Democracy are particularly involved with the creation of a ‘Digital Good Index’, a product bringing together the work of DGN partners to create a usable resource for assessing the ‘good’ in digital products and services.

 

AI4TRUST

AI4TRUST is a Horizon Europe consortium of 17 organisations led by Fondazione Bruno Kessler. The project aims to combat misinformation and disinformation across Europe by creating trust-based tools that integrate AI-based technology with the work of human fact-checkers.

Last year, we announced that Gina Neff and Hugo Leal had joined the project. Since then, we have taken on another researcher to work full-time on the project: Stefanie Felsberger.

Researchers from the Minderoo Centre for Technology and Democracy are leading policy and ‘human-centred AI’ research for the consortium, delivering a programme of work that will help AI4TRUST build a tool that journalists and policy makers can use in their work.

 

Humanitarian Action Programme

The International Committee of the Red Cross (ICRC) has funded the Humanitarian Action Programme.

Led at Cambridge by Gina Neff, the Humanitarian Action Programme is a home for new research that is:

·      Studying how digital transformation and new technologies are impacting future humanitarian action, as well as communities affected by humanitarian emergencies.

·      Examining the impact of new technologies on the capacity of humanitarian organisations to continue serving affected communities.

We have hired a full-time researcher for this project: Timothy Charlton-Czaplicki.

 

Spotlight on Researchers

Louise Hickman’s most significant outcome this year stems from her time as Principal Investigator on a Digit Innovation Fund grant for her project The Digitalisation of Access Work: fiction to policy recommendations. Her research investigated the perspectives of access workers who support disabled people in the workforce, to better understand their technical expertise, skills, and practices. She also examined how the increasing digitalisation of access work has changed working conditions for disabled workers, and the potential consequences for non-disabled employees.

For example, the use of cloud transcription services during online meetings can benefit disabled and deaf individuals, but this recent change often overlooks data protection and consent issues. Furthermore, the limited availability of access workers to support employees in the workplace has far-reaching implications for their peers. Together with co-investigator Hannah Wallis (Curator at Grand Union), Louise co-led a workshop with Jenny Chamarette from the University of Reading and Jo Lindsay Walton from the University of Sussex, both of whom have backgrounds in near-future fiction and critical disability studies.

The resulting work of fiction, “Criptörök”, delves into the intersection of Mechanical Turk work and critical disability studies. This pairing of the disabled body with the discussion of click work served as a productive means to address the technical expertise of access work from a distance, providing anonymity for those who participated in the initial stage of the project. The text is now available at the Grand Union Gallery in Birmingham.

In 2023 Julia Rone won a DAAD grant and co-organised (with Hunter Vaughan and MCTD affiliate Sebastián Lehuedé) a workshop on the Democratic Politics of Digital Sovereignty, which brought leading policy scholars from Germany together with UK-based researchers on media and digital infrastructures to discuss a common academic and policy agenda.

In 2024, in collaboration with Eli Gateva (University of Oxford) and Emilija Tudzarovska (Czech Academy of Sciences), Julia won UACES network funding to build a network engaging in comparative East-West research in Europe. At the opening conference of the network, Julia is convening a panel on the digital transition in Europe from a comparative perspective.

With a European Policy grant from the University of Warwick, Hunter Vaughan and co-PI and project co-director Pietari Kääpä completed a report, Sustainable Digitalisation: Ensuring a sustainable digital future for UK film and television, and held both an in-person strategising soft launch in Cambridge (5 October, with 20 academic and industry experts in attendance) and a larger online event.

Response to this report has been very strong, indicating both support and the need for far more research, visibility, and engagement on the intersections between digital growth and environmental and social sustainability in the field. Based on this work, Hunter has been invited to give multiple talks in the UK and US, including the 2024 annual Benson Lecture at Iowa State University.

The Sustainable Subsea Networks project also grew and delivered major outputs over this period. Its final report was released in December 2023 and circulated widely, going well beyond the originally promised deliverables to include a best-practices assessment and map, a carbon footprinting model, a renewable energy feasibility study, and a survey of policy and law. The team also secured additional interest and support to develop metrics across the marine, manufacturing, and cable landing station operations of the global subsea sector.

Hugo Leal focused on online misinformation and disinformation. The bulk of his activities were developed in the context of the AI4TRUST project, “AI-based technologies for trustworthy solutions against disinformation”. Hugo is a co-investigator in the work package on “Human-Centred Interpretation, Explainability, and Policy”, where he has led the social scientific input, particularly on the most appropriate theoretical and analytical frameworks to conceptualise, identify and counter online misinformation and disinformation. Hugo co-wrote the report on the “Social dynamics of misinformation” and has been actively involved in the design of the methodological toolbox that will underpin the entire project.

These activities have been complemented by a range of academic engagements on related topics, including participation in the international conference “Countering Foreign Information Manipulation and Interference”, sponsored by the Delegation of the European Union to the UK; participation in the international symposium “The Language of Fake News” at Senate House, London, organised by the School of Advanced Study at Royal Holloway, University of London; and serving as a juror at the first Cambridge Misinformation Hackathon.

Hugo also made a substantial contribution to the cross-Cambridge evidence submission to the United Nations Global Digital Compact.

Stefanie Felsberger joined the Minderoo Centre for Technology and Democracy as a Research Associate working on the AI4TRUST project. Building on her PhD research, which analysed how users of period tracking applications engage and work with these technologies, Stefanie has recently started qualitative fieldwork with the different stakeholder groups that AI4TRUST targets: policy makers, fact-checkers, journalists, academics and scientists. She conducted ethnographic fieldwork with a leading European fact-checking organisation to understand fact-checkers’ existing strategies to counter mis/disinformation and their view of the gaps between the tools and strategies currently available to counter disinformation.

Her work contributed to a report published through the AI4TRUST consortium that mapped the social dynamics of mis/dis/malinformation (November 2023). Stefanie was invited to present her work and research at a workshop on “Building resilience to future emergencies and disinformation through adult information literacy” organised by the Royal Society, Wikimedia, and the BBC in November 2023.

In September 2023, Timothy Charlton-Czaplicki took up the position as Research Associate in the Humanitarian Action Programme.

Since September, Timothy has worked closely with a team at the ICRC Delegation for Cyberspace in Luxembourg to establish how digital technologies are impacting neutral, impartial and independent humanitarian aid in conflict situations. A research strategy, collaboratively developed over the past months, will guide the project’s initiatives and research outputs going forward, with an initial report scheduled for summer 2024.

Timothy represented the Minderoo Centre for Technology and Democracy at the January 2024 Symposium on Cybersecurity and Data Protection in Humanitarian Action, a conference of key stakeholders from humanitarian organisations, governments, data protection authorities, cybersecurity agencies, the private sector, civil society and academia.

 

Academic Publications

The Politics of Data Infrastructure Contestation: Perspectives for Future Research
J Rone, Journal of Environmental Media 3 (2), 207-214.

Interconnected Realities: The Hybrid Dynamics of Far-Right Online and Offline Mobilisation
M Hoffmann, J Rone, Contemporary Germany and the Fourth Wave of Far-Right Politics, 57-75.

‘Stop the Pact!’ The Foreign Policy Impact of the Far-Right Campaigning Against the Global Compact for Migration
J Rone, M Fielitz, Geopolitics, 1-24.

Rethinking Affordances
P Nagy, G Neff, The SAGE Handbook of Human–Machine Communication, 273.

ICT Environmentalism and the Sustainability Language-Game
H Vaughan, A Pasek, N Silcox, N Starosielski, Journal of Language and Politics, 2023. https://doi.org/10.1075/jlp.22125.vau

Footprinting the Network: Subsea Cables and the Challenges of Environmental Assessment of Internet Infrastructure
N Starosielski, H Vaughan, A Pasek, N Silcox, Routledge Handbook of Ecomedia Studies, Routledge, September 2023.

Environmental Media Management: Overcoming the Responsibility Deficit
P Kääpä, H Vaughan, Routledge Handbook of Ecomedia Studies, Routledge, September 2023.

 

Events

We led a busy programme of public events, directly hosting around 800 attendees over the last year, including:

 

Report Launch
A distinct absence: why crypto and web3 technologies need environmental, social, and governance frameworks
(online webinar and in-person stakeholder round-table)

Public Talk
Is ‘artificial’ intelligent? Understanding human intelligence in the AI age, with Sheila Hayman

Report Launch
Towards a Sustainable Future in UK Film and Television
(online webinar and in-person stakeholder round-table)

Public Talk
Meredith Broussard: Confronting race, gender, and ability bias in tech

Public Talk
AI and the Attention Economy: What tech companies need to disclose, with Tim O’Reilly

Public Talk
How are technologies impacting human rights?

Public Talk
Where’s my data? The problematic legacy of data-driven campaigning

Interactive Public Workshop
How can Democracies reclaim control of digital technologies?
(in collaboration with DAAD Cambridge)

Online Public Workshop
Digitalisation of Access Work: From Fiction to Policy

Report Launch
Securing the Metaverse: Addressing harms in extended reality

Public Talk
The Right to Repair, with John Naughton

 

As Executive Director, Professor Gina Neff represented the Centre at an ambitious programme of public events, reaching thousands of audience members, including:

IFG DataBites: Getting Things Done with Data in Government

Shaping an Ethical Future for British Jobs (TechUK Digital Ethics Summit 2023)

AI Fringe events - including the IFOW’s panel on AI and the future of work

Royal Society Science X AI Safety - Expect the Unexpected

VivaTech: (When) Will Generative AI Take Over Your Job?

Perspectives of AI Risk in Light of the UK’s AI Safety Summit (GWU Data Governance Hub)

Ethics of Artificial Intelligence - Dilemmas and Solutions (Thomson Reuters Foundation Trust Conference 2023)

Making AI work for us: Gina Neff for the Alexander von Humboldt Institute

 

Media Engagement

TV & Radio

As the profile of our centre grows, Gina Neff has appeared on several live news broadcasts, including:

LBC Andrew Marr Show
In the context of the Bletchley AI Safety Summit, Gina Neff spoke to Andrew Marr on LBC about the challenges AI tools pose here and now, for example to our jobs and our families; challenges that were not necessarily being addressed at the summit.

Canada: CBC Morning Live
The AI Safety Summit was also international news: Gina Neff spoke to CBC about the different kinds of risks AI is posing, and how addressing those risks can help facilitate responsible innovation.

BBC Radio 4: Woman’s Hour - AI and Child Sexual Abuse
AI is already increasing child sexual abuse imagery online, and as many more people now have access to AI tools, we are seeing the democratisation of these harms. Gina joined Jessica Creighton and Emma Hardy from the Internet Watch Foundation to examine this troubling trend.

BBC Radio 4: Woman’s Hour - Online trolling against women
In February, Tortoise Media published a major investigation revealing hundreds of AI-generated bots had been deployed on social media to troll Amber Heard. Gina joined Tortoise’s Alexi Mostrous, arguing that platforms have a responsibility to put tools in people’s hands to help combat this kind of abuse when it happens.

BBC Radio 4: The Briefing Room - Will the Online Safety Bill Succeed?
As the UK’s Online Safety Bill passed through Parliament, Gina Neff joined an expert panel on The Briefing Room to discuss the legislation’s scope and approach, and whether it would be effective in its final form.

 

Podcasts

Our work is also increasingly featured in longer-form podcast interviews:

BBC Newscast
AI: Nothing to ‘Lose Sleep Over’?

BBC Newscast
How is AI Threatening Elections?

The Nowhere Office
Gina Neff discusses AI’s impact on the future of work

AI Ireland
Broad-ranging discussion with Gina Neff recorded at the British Ambassador’s residence in Dublin

The Good Robot
Louise Hickman on the politics of captioning

Cultural Studies
Hunter Vaughan on Internet Infrastructure and the carbon footprint of digital media

 

Op-Eds and Letters

We have authored a number of letters and op-eds to support our campaigning on the issues that matter to us:

Gina Neff for Wired
Are we entering a New Digital Dark Age?

Letter in The Times
Pressing need for secure and safe AI systems

Letter in the Telegraph
The key to a more effective Online Safety Bill

 

News Articles and Local Press
Over the last year our team and research have been featured in a range of news outlets, from the national papers to specialist and local outlets:

BBC News
What guardrails are needed for AI?

BBC News
AI hurts ability to keep democracy alive

BBC News
Gina Neff quoted in article: “Rishi Sunak says AI has threats and risks”

Deutsche Welle
AI: Which rules do the top tech moguls want?

The Guardian
‘The challenges are real’: TUC taskforce to examine AI threat to workers’ rights (TUC collaboration)

The Canary
Workers and employers are crying out for certainty over how AI should be used in the workplace (TUC collaboration)

Wired
Police Use of Face Recognition is Sweeping the UK

Open Access Government
AI legislation is vital towards empowering the UK economy (AI policy brief)

Eurekalert!
UK needs AI legislation to create trust so companies can ‘plug AI into British economy’ – report (AI policy brief)

Innovation News Network
The UK must leverage generative AI for real-world applications (AI policy brief)

Information Age
“Meaningful” AI regulation needed from UK government, says report (AI policy brief)

The Mirage (Australia)
UK Urged to Enact AI Legislation for Economic Boost (AI policy brief)

HR Grapevine
Unions criticise Government for leaving workers out of AI summit

Women's Health Magazine
How worried should you be about period tracking apps?

 

Governance

Advisory Board

Steven Connor
Director of Research of the Digital Futures Institute, King’s College London

Diane Coyle
Bennett Professor of Public Policy, University of Cambridge

Richard Danbury
Senior Lecturer in Journalism, City, University of London

Sheila Hayman
BAFTA and BAFTA Fulbright-winning documentary filmmaker; Director’s Fellow, MIT Media Lab

John Naughton
Emeritus Professor of the Public Understanding of Technology, Open University; Director, Wolfson Press Fellowship Programme; Technology columnist for the Observer (Chair of the Advisory Board)

Rasmus Kleis Nielsen
Director of the Reuters Institute for the Study of Journalism and Professor of Political Communication at the University of Oxford

Emma McDonald
Acting Executive Director of Impact Missions, Minderoo Foundation

 

CRASSH

The Minderoo Centre for Technology and Democracy is hosted within CRASSH (Centre for Research in the Arts, Social Sciences and Humanities).

CRASSH was established in 2001 with the objective of creating interdisciplinary dialogue across the many departments and faculties of the School of Arts and Humanities and the School of Humanities and Social Sciences, and of forging connections with science subjects. Since then, CRASSH has grown into one of the largest humanities institutes in the world, with a global reputation for excellence.

CRASSH serves to draw together disciplinary perspectives in Cambridge and to disseminate new ideas to audiences across Europe and beyond.

 

 

Minderoo Centre for Technology and Democracy

 

Address:
Alison Richard Building, 7 West Road, Cambridge, CB3 9DT

 

www.mctd.ac.uk

minderoo@crassh.cam.ac.uk

 

Report design by CHRIS PRINTED THIS