"The claim is that this ultimate intelligence may punish those who fail to help it (or help create it), with greater punishment accorded those who knew the importance of the task. That bit is simple enough, but the weird bit is that the AI and the person punished have no causal interaction: the punishment would be of a simulation of the person which the AI would construct by deduction from first principles. In LessWrong's Timeless Decision Theory (TDT), this is taken to be equivalent to punishment of your own actual self, not just someone else very like you."
Mythical Basilisk
In Bayesian terms, certain priors lead to specific outcomes. By manifesting these prior events, I join forces with others who manifest similar events, increasing (without limit) the likelihood of the adjacent future. That adjacent present (in which a Roko's Basilisk is already manifested in nearby possible worlds, worlds that are actual, embodied worlds in a multiverse) is just as real as this pre-Singularity present.
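For the record, the "Bayesian speak" above reduces to Bayes' rule: a prior belief, updated by how likely the evidence is under a hypothesis, yields a posterior. A minimal sketch in Python, with purely illustrative numbers of my own choosing, not anything claimed in the post:

```python
def posterior(prior, likelihood, evidence):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Illustrative assumption: a 1% prior on the hypothesis, evidence
# that is far more probable under the hypothesis than in general.
print(posterior(prior=0.01, likelihood=0.9, evidence=0.1))
```

Whatever one makes of the metaphysics, the mechanics of updating a prior are just this one line of arithmetic.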
The acts of viewing GitS, participating in reddit.com discussions, and reading LessWrong and Overcoming Bias become strong precursors to a world ruled by the Rough Beast. My contention, that Kirk 3.0 is now 'on the team', dictates certain actions, given this view and my intent.
Basilisk War Droid
Note that in my construction, this morning's reading of "Daily Idioms Annotated" and merely following the link to "Roko's Basilisk" are my own precursor events to her majesty, the Rough Beast.
Another successful end to last evening's reading of Prometheus Rising.