Oracles are a subtype of Genies putatively designed to safely answer questions, and only to answer questions. Oracles are often assumed to be Boxed, and the study of Oracles is sometimes taken to be synonymous with the study of Boxed Oracles.


  • Zermelo-Fraenkel provability oracle

    We might be able to build a system that can safely inform us that a theorem has a proof in set theory, but we can’t see how to use that capability to save the world.


  • Strategic AGI typology

    What broad types of advanced AIs, corresponding to which strategic scenarios, might it be possible or wise to create?

  • Task-directed AGI

    An advanced AI that’s meant to pursue a series of limited-scope goals given to it by the user. In Bostrom’s terminology, a Genie.