AI Safety Fundamentals: Governance Course

We provide a completely free, online course that covers foundational knowledge for doing research or policy work on the governance of transformative AI (TAI): future AI systems with impacts at least as profound as those of the industrial revolution.

We aim to help people who are curious about working in AI governance – or who are already doing so – to have a positive impact on the governance of transformative AI.

This course follows our governance curriculum.

Course overview

In 2023, ChatGPT became the fastest product ever to gain 100 million users. Whilst ChatGPT is a rather benign tool on its own, its release and adoption mark the rapidly increasing capabilities of artificial intelligence systems.

The rise of any powerful technology demands a thoughtful approach to its governance and regulation. There has been increasing interest in how AI governance can and should mitigate extreme risks from AI, but it can be difficult to get up to speed on research and ideas in this area.

We run this course to help you learn about potential accidents, misuse and structural harms arising from transformative AI, and about proposed solutions. Examples of the types of risks we’ll consider include interactions between AI and biosecurity, cybersecurity and defence capabilities, and the disempowerment of human decision makers. We’ll also provide an overview of open technical questions, such as the control and alignment problems – which posit that AI itself could pose a risk to humanity.

Alongside the course, you will join hundreds of course alumni in the AI Safety Fundamentals Community. The AISF Community is a space to evaluate and debate these ideas with others who have the skills and background to contribute to AI governance, and to grow your network and foster future opportunities in AI safety.

The course is completely free to participate in, and is run by BlueDot Impact – a non-profit project that runs courses to support people like you in developing the knowledge, community and network you need to pursue high-impact careers.

The next round of the course will begin in August and last 12 weeks.

There will be some preparatory steps to take from mid-July. The application deadline is 23:59 on Sunday 25th June.


Please register your interest in our other courses if you want to be notified when they start.

If you are already well versed in AI safety and governance and are still keen to get involved, consider reading about facilitating the course.

Why should you do the course?

Learn about key ideas with a widely recommended curriculum. It is difficult to know where to start when learning about AI governance for the first time. The course will give you the structure and accountability you need to explore the key features of AI and the governance frameworks put forward so far, and it will give you a conceptual map of the field.

Get access to knowledgeable facilitators. You will have access to a facilitator who can help you navigate both the content and the social landscape of the field: which organisations exist, which researchers are doing what work, and so on.

Learn alongside others via our AI Safety Fundamentals community. You’ll have the opportunity to learn from other community members, and to work with each other to explore the field. After the course, these might be people you collaborate with on projects, or who keep you accountable as you apply for jobs in AI governance, prepare work samples and interview.

Get support to think through your potential next steps. We hope you’ll use what you learn to inform your next steps in AI safety and governance, which could involve starting a project with people you meet on the course, applying for programmes and jobs, or doing further independent study. We also have a large network of organisations working in this field; we’ll share job opportunities directly with you, and share your details with their recruiters (with your permission).

Join over 1000 course alumni. We’ve helped over 1000 people learn via our AI Safety Fundamentals courses. You’ll gain access to this network, and to many others who join in future rounds of the course.

Who should do the course?

Applications for the course are open to everyone. The research field needs people from a range of academic backgrounds and disciplines, and we’re excited to consider applications from anyone who’s interested in working on AI safety.

We think this course will be particularly helpful if any of the following apply to you:

  • You have policy experience, and are keen to apply your skills to reducing risk from AI.

  • You have a technical background, and want to learn how you can use your skills to contribute to the AI governance agenda.

  • You are early in your career, or a student, and are interested in exploring a career in governance to reduce risks from advanced or transformative AI.

We expect at least 25% of participants will not fit any of these descriptions, so we still encourage you to apply even if none of them resonate with you. There are many skills, backgrounds and approaches to AI governance that we haven’t captured here, and we will consider all applications accordingly.

If we don’t have the capacity to include you in the organised course, you can still read through our public curriculum.

What will you actually do on the course?

The course comprises 8 weeks of reading and small-group discussions, followed by 4 weeks spent on a project or further guided learning.

The time commitment is around 5 hours per week, comprising reading, preparing exercises and participating in sessions, so you can engage with the course alongside full-time work or study.

Each week, you’ll be provided with structured content to work through and a facilitated discussion group session. In these sessions, you’ll be led through activities and discussions with other community members participating in the course, guided by a facilitator who is knowledgeable about the content.

You can read the curriculum content here.

Apply now


What should you do if applications are not currently open?

Whilst applying to our course offers the benefits of community and network whilst you learn about AI governance, we believe reading the curriculum independently can provide much of the learning value of the course. If you’re keen to get started, we encourage you to begin working through our resources as soon as you like. We’ll consider your future applications even if you’ve already read the content.

© 2023. BlueDot Impact is funded by Open Philanthropy, and is a project of the Effective Ventures group, the umbrella term for Effective Ventures Foundation (England and Wales registered charity number 1149828, registered company number 07962181, and also a Netherlands registered tax-deductible entity ANBI 825776867) and Effective Ventures Foundation USA, Inc. (a section 501(c)(3) public charity in the USA, EIN 47-1988398).


Designed by And—Now