Preface: This is a rewrite of a previous post. This new post is meant to be agnostic as to the cause of a potential extinction event. Instead, it focuses only on how future AI superintelligence and agency would affect the likelihood of the creation of (and our existence in) a simulation. Like the previous post, this post is meant to be lighthearted and informal.

As Nick Bostrom, David Chalmers, Elon Musk and others have pointed out, either civilizations fail before reaching the level of technology necessary to create a full and convincing simulation (i.e. a “programmed universe”), or we are very likely to be in a simulation. (This is because a single programmer mind would be able to create billions of programmed minds, which would affect the programmer-to-programmed-mind ratio in any universe. Bostrom also explores a third scenario in which posthumans become capable of creating simulations, but none of these posthumans choose to do so, for ethical or other reasons.)

Many people believe there is a high likelihood humans will go extinct in the near future. We imagine that a higher likelihood of this scenario would lead to a lower likelihood of our existence in a simulation.
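The ratio argument above can be made concrete with a toy calculation. This is only an illustrative sketch, not anything from Bostrom's papers: the function name and the assumption that every programmer mind creates the same number of programmed minds are my own simplifications.

```python
# Toy illustration of the programmer-to-programmed-mind ratio argument.
# Assumption (hypothetical): each base-reality "programmer" mind creates
# the same number of simulated "programmed" minds.
def simulated_fraction(programmers: int, minds_per_programmer: int) -> float:
    """Fraction of all minds (real + simulated) that are simulated."""
    simulated = programmers * minds_per_programmer
    return simulated / (programmers + simulated)

# One programmer mind creating a billion programmed minds: the fraction
# of simulated minds is effectively 1, so a randomly chosen mind is
# almost certainly a programmed one.
print(simulated_fraction(1, 1_000_000_000))
```

Under these toy assumptions, the fraction of simulated minds is N/(N+1) for N minds per programmer, which is why "billions of programmed minds" pushes the odds so heavily toward us being simulated.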