AI Alignment Course
A 12-week online course covering a range of technical AI alignment research agendas.
AI systems are rapidly becoming more capable and more general. Despite AI’s potential to radically improve human society, there are still open questions about how we build AI systems that are controllable, aligned with our intentions and interpretable.
You can help develop the field of AI safety by working on answers to these questions.
The AI Alignment course is designed to introduce the key concepts in AI safety and alignment, and will give you space to engage with, evaluate and debate these ideas. You’ll meet others who are excited to help mitigate risks from future AI systems, and explore opportunities for your next steps in the field.
The course is run by BlueDot Impact – a non-profit that supports people to develop the knowledge, skills and connections they need to pursue a high-impact career.
Why do the course
You’ll learn about the foundational arguments. It is difficult to know where to start when trying to learn about AI safety for the first time. The programme will give you the structure and accountability you need to explore a wide variety of alignment research agendas, and give you a rich conceptual map of the field.
Your learning is facilitated by experts. Your facilitator will help you navigate the course content, develop your own views on each topic, and foster constructive debate between you and fellow participants.
You’ll be learning alongside others. Your cohort will be made up of people who are similarly new to AI safety, but who will bring a wealth of different expertise and perspectives to your discussions. Many participants form long-lasting and meaningful connections that support them in taking their first steps in the field.
You’ll be supported to take your next steps. This could involve doing further work on your end-of-course project, applying for programmes and jobs, or doing further independent study. We maintain relationships with a large network of organisations and will share opportunities with you. Additionally, with your permission, we can help make connections between you and recruiters at top organisations.
Who this course is for
We think this course will be particularly helpful if:
- You have machine learning experience and are interested in pivoting to technical AI safety research.
- You have a technical background, such as studying a STEM subject, and are keen to work on technical AI safety research.
- You are managing or supporting technical AI safety researchers and understanding the alignment landscape would make you more effective in your role.
- You are a student seriously considering a career in technical AI safety to reduce risk from advanced AI.
If none of these sound like you but you’re still interested in technical AI safety research, we encourage you to apply anyway. The research field needs people from a range of backgrounds and disciplines, and we can’t capture all of them in this list.
What this course is not
This course might not be right for you if you are looking for:
- A course to teach you general programming, machine learning or AI skills. Our resources page lists a number of courses and textbooks that can help with this. Note that these skills are not hard prerequisites to taking our AI alignment course.
- A course that teaches general ML engineers common techniques for making systems safer. Instead, this course is for people involved or interested in technical AI alignment research, e.g. investigating novel methods for making AI systems safe.
- A course that covers all possible AI risks and ethical concerns. Instead, our course primarily focuses on catastrophic risks from future AI systems. That said, many of the methods that target catastrophic risks can also be applied to other areas of AI safety.
- A course for government policymakers and related stakeholders to learn about AI governance proposals. Our AI Governance course is likely a much better match.
Course structure
The course comprises 8 weeks of reading and small-group discussions, followed by a 4-week capstone project. The time commitment is around 5 hours per week, so you can engage with the course alongside full-time work or study.
The first week is particularly useful if you have a limited background in ML. If you’re already knowledgeable about ML, it’s a good opportunity to make sure you have the specific knowledge you need to engage with the arguments later in the programme.
This is where you’ll work through the main course curriculum, designed with AI safety experts at OpenAI and the University of Cambridge. We plan to update this curriculum in late February 2024.
Each week involves 2-3 hours of readings and independent exercises, plus a 2-hour live session (via video call, which we’ll arrange at a time that suits you).
The live sessions are where you work through activities with your cohort of around 5 other participants. These sessions are facilitated by an expert in AI safety, who can help you navigate the field and answer questions.
If accepted onto the course, we’ll ask for your availability so we can find a time slot that suits you (including evening or weekend sessions). There’s flexibility to change sessions in case your availability changes.
Participants tell us that, compared to studying the curriculum independently, they really value the live cohort sessions: the facilitator helps create an engaging discussion space, the activities are designed to enable effective learning, and you develop deep relationships with highly motivated peers.
The project stage is an opportunity for you to put the knowledge and skills you’ve gained during the first 8 weeks into practice. Previously, top projects have helped participants get jobs at top AI companies or directly influence government policy.
It’s up to you what idea you’d like to work on. If you’d like suggestions, we’ll have a list of projects we think would be valuable, based on our experience and connections with field experts.
Once you’ve got your idea, we’ll introduce you to a cohort of people working on similar projects. You’ll work on your project independently, and can put as much time as you’d like into it. Each week you’ll review your progress and receive feedback in 1-2 hour facilitated online discussions with your cohort.
At the end of the project stage, you’ll share your project with other people on the course. And of course, you’ll be able to talk about your experience completing this project when applying for future opportunities.
After submitting a project, you’ll receive a certificate of completion. Here’s an example.
The expected dates for the next iterations of the course are:
Early 2024 course:
- Applications close: 7 February 2024
- Application decisions made by: 16 February 2024
- Course starts: 3 March 2024
- Course ends: 2 June 2024
Mid 2024 course:
- Applications close: 3 June 2024
- Application decisions made by: 14 June 2024
- Course starts: 1 July 2024
- Course ends: 23 September 2024
These dates are our current best estimates as to when we will be running the courses, and are subject to change.
We expect to run a version of the course roughly every 3-4 months.
How to apply
Apply through our online application form; you should receive an email confirmation a few minutes after submitting your application.
We make most application decisions within 1-2 weeks of the application deadline. Do keep an eye on your emails during this time, as we’ll need you to confirm your place if you’re accepted. All legitimate emails regarding the course will come from @bluedot.org.
If you have any questions about applying for the course, do contact us.
We take a holistic approach to evaluating applications that considers a number of factors we think are important to getting the most from the course.
That said, there are some general tips that can help ensure you’re putting your best foot forward:
- When talking about projects you’ve done, focus on your specific contributions to those projects, rather than explaining details about the projects themselves.
- If you include any links in your application, make sure the link is correct and can be accessed publicly. Opening the link in an Incognito or Private Browsing window is a good way to test this. It’s also often helpful to give us a summary of the highlights or takeaways we should get from what you’ve linked us to.
- Before you hit ‘Submit’, review your answers and double check they are answering the question asked. Additionally, make sure your contact information (especially your email address) is correct.
To take part in the course, you’ll need:
- Availability of 5+ hours per week to study AI alignment
- A reliable internet connection and webcam (built-in is fine) to join video calls
- English language skills sufficient to constructively engage in live sessions on technical AI topics
There is no mandatory payment for this course. At the end of the course, you will have the option to pay an amount that you are comfortable with and that you feel reflects the value the course has brought you.
BlueDot Impact Ltd, the organisation that runs this course, is a non-profit based in the UK and is entirely philanthropically funded. This course costs us roughly £300 per participant to run, and any payment you make would be used to subsidise places for future participants on our courses.
Any other questions?
If you’re not sure whether to apply, we recommend that you put in an application. If you have any other questions do contact us!