Astra Fellowship Empirical Stream — Call for Applications

Application deadline: May 3, 2026 (11:59 PM, Anywhere on Earth)

You can fill out our expression of interest form here (~1 minute) to receive deadline reminders and stay up to date on announcements, such as the addition of new mentors.

Program Overview

Astra is a fully funded, in-person program that pairs senior mentors with talented fellows to conduct research and engineering on AI safety. The program runs from September 2026 - February 2027.

Over 80% of Astra's first cohort are now working full-time in AI safety roles at organizations such as OpenAI, Anthropic, Google DeepMind, Redwood Research, METR, the Center for AI Standards and Innovation, and the UK AI Security Institute.

What Fellows Work On

Fellows will work on machine learning research in core technical safety topics, such as AI alignment, AI control, model evaluations, and scalable oversight. We recommend applicants review partner organizations for more specific information on the mentors’ research agendas. To see the full list of mentors, please visit our website.

Previous Astra fellows have released research including work on capabilities elicitation and an independent model evaluation of Kimi 2.5.

At the start of the program, empirical fellows are paired with mentors and pitched projects by their host organizations, then work with their mentors to choose a research project to pursue.

Program Details

  • Time Commitment: Full-time
  • Stipend / Compensation: $8,400/month for the duration of the program
  • Compute Budget: Up to $15,000/month
  • Mentorship: Weekly meetings with a senior expert in the field.
  • Research Management: Weekly meetings with a research manager.
  • Location: In-person at Constellation Institute in Berkeley, CA. A secondary research hub is available for UK-based fellows at the London Initiative for Safe AI (LISA). We will also consider remote options in rare cases.
  • Visa Support: We support J-1 visas, as well as F-1 visas via CPT (full-time only) or OPT; CPT/OPT requires DSO approval.
  • Career Support: A talent mobilization team to support transition into full time roles after the fellowship.
  • Startup Incubation Services: Operational and financial support for fellows launching new projects or organizations (business operations, communications, hiring, fundraising, and more).
  • Community & Network: Access to a world-leading AI safety convening space including 300+ weekly visits from experts in the field, shared meals, weekly seminars, and workshops.

What We're Looking For

  • Strong coding proficiency with Python
  • Agency and proactivity: the kind of person who takes ownership of ambiguous problems and resolves them independently
  • Basic knowledge of machine learning
  • Basic knowledge of AI safety
  • Ability to learn quickly and iterate rapidly based on new results

Some mentors prefer fellows who have advanced knowledge of AI safety and machine learning, as well as strong research taste.

How to Apply

Application Materials

  • Resume
  • References (at least two). We encourage applicants to include a reference from the Constellation network when possible.
  • (Optional) Links to your CV, personal website, or written samples (such as previous papers you’ve published)

Application Process

  • Stage 1: Initial application screening
  • Stage 2: Coding test and reference check
  • Stage 3: Work tests
  • Stage 4: Interview

Contact Constellation

astra@constellation.org