Oracle
Oracles are a subtype of Genie, putatively designed to safely answer questions, and to do nothing besides answer questions. Oracles are often assumed to be Boxed, and the study of Oracles is sometimes taken to be synonymous with the study of Boxed Oracles.
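As a rough sketch of the intended interface (illustrative only; the `Oracle` class and `answer` method are hypothetical names, not drawn from any concrete proposal), the defining constraint is that the system’s sole output channel is the answer it returns:

```python
from abc import ABC, abstractmethod

class Oracle(ABC):
    """Illustrative sketch of the Oracle concept: the system's only
    output channel is the return value of answer(); it has no other
    actuators and takes no actions besides producing that answer."""

    @abstractmethod
    def answer(self, question: str) -> str:
        """Map a submitted question to an answer, with no side effects."""
        raise NotImplementedError
```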
Children:
- Zermelo-Fraenkel provability oracle
We might be able to build a system that can safely inform us whether a theorem has a proof in set theory, but we can’t see how to use that capability to save the world. (A minimal sketch of such an interface appears after this list.)
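As a minimal sketch of what such a system’s interface might look like (the function name, the `Verdict` type, and the `search_budget` parameter are all hypothetical, introduced here for illustration), note that the output is deliberately narrowed to a single verdict per query. Because provability in ZF is only semidecidable, a bounded search can confirm that a proof exists but can never certify that none does:

```python
from enum import Enum

class Verdict(Enum):
    PROVABLE = "provable"  # a ZF proof of the statement was found
    UNKNOWN = "unknown"    # search exhausted its budget; says nothing more

def zf_provability_oracle(statement: str, search_budget: int) -> Verdict:
    """Hypothetical interface, not an existing system. The output is a
    single verdict: a deliberately narrow channel. Since ZF provability
    is only semidecidable, a bounded proof search can return PROVABLE
    but can never certify unprovability; hence UNKNOWN, not REFUTED."""
    raise NotImplementedError("stand-in for the imagined proof search")
```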
Parents:
- Strategic AGI typology
What broad types of advanced AIs, corresponding to which strategic scenarios, might it be possible or wise to create?
- Task-directed AGI
An advanced AI that’s meant to pursue a series of limited-scope goals given it by the user. In Bostrom’s terminology, a Genie.
Comments:
The term “oracle” has a very specific meaning in recursion theory and computational complexity theory: a black box that can instantly compute a (possibly uncomputable) decision problem or function, not a system that answers questions in natural language. I know this page is just describing Bostrom’s term, and maybe I’m being nitpicky, but doesn’t this kind of terminology signal a misunderstanding of CS concepts?
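For contrast, here is a minimal sketch of “oracle” in the recursion-theoretic sense the comment describes: a black box that decides membership in some fixed (possibly uncomputable) set in a single step, which an algorithm may query but not inspect. The names below are illustrative:

```python
from typing import Callable

# A recursion-theoretic "oracle": a black box deciding membership in
# some fixed set A in one step. Here A is the halting set.
HaltingOracle = Callable[[str, str], bool]  # (program, input) -> halts?

def co_halting(program: str, inp: str, halts: HaltingOracle) -> bool:
    """Relative to a halting oracle, the complement of the halting
    problem -- undecidable on its own -- is decided with one query.
    This relative computability, not natural-language question
    answering, is what the complexity-theoretic term denotes."""
    return not halts(program, inp)
```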