What is existential risk from AI, and what do we do about it?
The development of AI has been incredibly fast over the last decade. We seem to be able to keep up less and less, while the abilities of AI will soon outpace our own. What do we need to do to make sure that AI will not become an existential risk? What is the latest on this, and what are decision-makers willing to do? And what is their responsibility to act?
Book your spot for free at Pakhuis de Zwijger!
About the speakers
Connor Leahy is CEO and founder of Conjecture, a London-based company aimed at making AI existentially safe by doing technical research. Before founding Conjecture, Connor partially reverse-engineered GPT-2, a precursor to the models behind ChatGPT, from his bedroom. He is known to speak frankly about difficult questions in AI and has been invited to comment on AI existential risk by CNN and many other media outlets. Connor Leahy will participate remotely.
Hind Dekker-Abdulaziz is a Member of Parliament for D66. She is a member of the digital affairs committee and recently filed a parliamentary motion calling for more scientific research on AI safety, which was unanimously accepted.
Mark Brakel is Director of Policy at the Future of Life Institute, one of the world’s leading nonprofits working on the governance of AI. He is a former Dutch diplomat and has previously briefed the UN in Geneva, the European Parliament and the defence committee of the German parliament. Mark has extensive experience in practical policy-making at the EU level, notably with the draft Artificial Intelligence Act.
Joep Meindertsma is one of the founders of PauseAI, a movement campaigning for an AI pause. Their protests in Brussels and London were covered by Politico, TIME, Euronews, and the BBC, and a Dutch protest is upcoming. Before becoming a full-time activist, Joep was the CEO of a Dutch IT company.
This event will take place at Pakhuis de Zwijger, Piet Heinkade 179, Amsterdam. It starts at 19:30 and will be held in English.