Oracles are a subtype of Genies putatively designed to safely answer questions, and to do nothing else. Oracles are often assumed to be Boxed, and the study of Oracles is sometimes taken to be synonymous with the study of Boxed Oracles.
  • Zermelo-Fraenkel provability oracle

    We might be able to build a system that can safely inform us whether a theorem has a proof in set theory, but it is not clear how to use that capability to save the world.
  • Strategic AGI typology

    What broad types of advanced AIs, corresponding to which strategic scenarios, might it be possible or wise to create?

  • Task-directed AGI

    An advanced AI that is meant to pursue a series of limited-scope goals given to it by the user. In Bostrom’s terminology, a Genie.