Amid the accelerating capabilities of models like o3 and the falling cost of frontier performance from models such as DeepSeek-R1, Transformative AI, technology capable of "precipitating a transition comparable to (or more significant than) the agricultural or industrial revolution", is approaching swiftly. To ensure good outcomes, it is crucial to develop a coherent narrative about the future of AGI, one that integrates technological progress with political developments in the context of an AGI arms race.
XLab AI Strategy is a 7-week focus group that aims to develop a shared understanding of the crux questions of Transformative AI. Along the way, we will do forecasting informed by scaling laws, threat modelling, and a tabletop wargame (premise: we reach AGI in 2027), among other activities. By the end, you will write your own AGI takeoff scenario.
The fellowship will run from week 2 to week 8, meeting for 1.5 hours each week; schedule TBA.
We are looking for 10-12 UChicago students who have thought seriously about the future of AGI across many dimensions: technology, economics, ethics, political science, philosophy, and more. Technical experience is not required, though some intuition for neural networks will be helpful. We will pay close attention to your writing.
Apply here by Monday, March 24 (week 1). Contact Jo at jojiao@uchicago.edu if you have any questions.
Week 1: Forecasting. Explore various Epoch AI papers and engage in interactive forecasting exercises.
- Situational awareness and scaling analysis
- Interactive workshop on AI forecasting

Week 2: Threat models. Deep dive into deceptive alignment and its implications.
- Critical analysis of deceptive alignment risks
- Different ways that AI risk could manifest
- Explore concrete scenarios and encourage constructive debate

Week 3: Model organisms. Examine model organisms and their role in alignment research.
- The case for a new pillar of alignment research
- Analysis of how AI models may fake alignment
- Come up with your own ideas for model organisms & critiques

Week 4: Wargame, part 1. Participate in an immersive AGI tabletop wargame simulation.
- Content to be announced

Week 5: Wargame, part 2. Continue and conclude the AGI tabletop wargame simulation.
- Content to be announced

Week 6: Takeoff dynamics. Analyze unipolar vs. bipolar takeoff scenarios and AGI realism.
- A critique from an Oxford international relations scholar
- Analysis of strategic implications

Week 7: Your scenario. Synthesize your learning by writing your own AGI takeoff scenario.
- Example scenario for inspiration