Interviews

One of our main goals is to facilitate conversations between those concerned about potential risks from advanced AI systems and technical experts. To that end, we conducted 97 interviews with AI researchers about their perspectives on current AI and the future of the field, with a focus on risks from advanced systems. This collection includes anonymized transcripts, a quantitative analysis of the most common perspectives, and an academic talk discussing preliminary findings.

Interactive Walkthrough

In our interviews with AI researchers, some of the core questions focused on risks from advanced AI systems. To explore the interview questions, common responses from AI researchers, and potential counterarguments, we created an interactive walkthrough. You are encouraged to explore your own perspectives; at the conclusion, your agreements and disagreements will be displayed so that you can compare your views with those of other users of the site.


Resources and Getting Involved

Interested in learning more? Our Resources page has further reading, both for ML researchers and for the general public.

Concerned about potential risks from advanced AI systems? We have recommendations for what you can do to help. Technical research on AI alignment is especially needed, and we would be happy to talk with you about these opportunities.


About Us

AI Risk Discussions (AIRD) was developed by Larchwood, a project of Players Philanthropy Fund, a Maryland charitable trust recognized by the IRS as a tax-exempt public charity under Section 501(c)(3) of the Internal Revenue Code (Federal Tax ID: 27-6601178). We aim to facilitate discussion and evaluation of potential risks from advanced AI. Our focus is on soliciting and engaging with expert perspectives on the arguments, and on providing resources for stakeholders. This project is led by Dr. Vael Gates, with many other contributors, most prominently Lukas Trötzmüller (interactive walkthrough), Maheen Shermohammed (quantitative analysis), Zi Cheng (Sam) Huang (interview tagging), and Michael Keenan (website development).