Astra Fellowship Strategy and Governance Stream — Call for Applications

Application deadline: May 3, 2026 (11:59 PM, Anywhere on Earth)

You can fill out our expression of interest form here (~1 minute) to receive deadline reminders and stay up to date on announcements, such as the addition of new mentors.

Program Overview

Astra is a fully funded, in-person program that pairs senior mentors with talented fellows to conduct research on AI safety. The program runs from September 2026 through February 2027. During that time, Astra fellows will develop and pursue research projects to reduce catastrophic risks from AI.

The strategy and governance stream is searching for applicants who have strong familiarity with catastrophic risks from AI, are highly agentic, and are capable of working on difficult and unscoped problems. Some mentors have additional preferences for fellows, such as having a strong background in policy or technical expertise.

Over 80% of Astra's first cohort are now working full-time in AI safety roles. Previous fellows in the Strategy and Governance stream have accomplished impressive work, such as co-founding the AI Futures Project (authors of AI 2027), briefing senior international stakeholders on AI verification, and contextualizing trends in AI progress.

What Fellows Work On

Most fellows will work with a mentor on topics related to the mentor’s research priorities. Mentor organizations include Forethought, AI Policy Network, and Coefficient Giving. We are also excited to work with independent mentors who are starting new organizations or scoping out novel research directions in AI safety. To see the entire list of mentors, please visit our website.

We expect a small number of fellows to pursue independent strategy research on a topic we think is exceptionally important in AI safety. We have a high bar for accepting fellows who do independent strategy research. Topics we’re especially excited about supporting research on include macrostrategy for AI safety, prevention of concentration of power, exploration of better-futures scenarios, and work to implement AI strategy in government or frontier AI companies.

Program Details

  • Time Commitment: Full-time.
  • Stipend / Compensation: $8,400/month for the duration of the program.
  • Travel Budget: $2,000 to attend relevant workshops and conferences.
  • Mentorship: Weekly meetings with a senior expert in the field.
  • Research Management: Weekly meetings with a research manager.
  • Location: In-person at Constellation Institute in Berkeley, CA. A secondary research hub is available for UK-based fellows at the London Initiative for Safe AI (LISA). We will also consider remote options in rare cases.
  • Visa Support: We support J-1 visas, as well as F-1 visas with CPT (full-time only) or OPT; DSO approval is required.
  • Career Support: A talent mobilization team to support your transition into full-time roles after the fellowship.
  • Startup Incubation Services: Operational and financial support for fellows launching new projects or organizations (business operations, communications, hiring, fundraising, and more).
  • Community & Network: Access to a world-leading AI safety convening space including 300+ weekly visits from experts in the field, shared meals, weekly seminars, and workshops.

What We're Looking For

We expect the strongest applicants to have deep familiarity with catastrophic risks from AI, communicate clearly, reason about complex issues, and work autonomously.

Preferred Qualifications

  • Experience in macrostrategy research, grantmaking, or an equivalent role involving cause prioritization.
  • The ability to set and pursue research agendas.
  • Career experience in a domain complementary to strategy research, such as policy or empirical AI work.
  • A deep understanding of key stakeholders in the AI ecosystem, such as frontier AI companies, governments, or investors.

How to Apply

Application Materials

  • Resume
  • References (two minimum). We encourage applicants to include a reference who is part of the Constellation network when possible.
  • (Highly recommended) Links to writing samples, such as blog posts or papers you’ve published

Application Process

  • Stage 1: Initial application screening
  • Stage 2: Mentor-specific screening (e.g., work tests or requests for writing samples) and reference check
  • Stage 3: Interview

Contact Constellation

astra@constellation.org