As I conduct conversations for the Update Project (and informally, on my own time), I'm looking for important open questions. Important, in the sense that what you believe about the question changes how you try to impact the world, and how successful you are at it. And open, in the sense that smart, well-informed people disagree about the answer.
I’ve begun collecting those open questions here, in the hopes that making them salient will make us more likely to notice evidence or arguments that shift our thinking about them in a useful way.
- When is overconfidence useful (if ever)?
- Can we intentionally improve the world? Planners vs. Hayekians
- Are you motivated by obligation or opportunity?
- How bullish should we be about artificial general intelligence based on recent progress in the field?
- A priori, how hard should we expect AGI to be?
- How much ideological diversity is optimal (and along what axes)?
- How efficient is the “market” for doing good?
- Which new technologies have the potential to significantly impact humanity’s future (for good or for ill)?
- How should Trump’s presidency alter our model of global risks?
- How hard should we expect radical life extension to be?
- On the margin, should we be trying to make science faster, or more rigorous?
- How much should we expect to be able to improve on our default decision-making strategies?
- What are some of the blind spots in the “rationalist” worldview?