Hacker News

Spacetime as a Neural Network

9 points by bisonbear ago | 5 comments

FloorEgg

I've been studying astrophysics as a hobby for years, and I'm going to riff off your framing because it seems compatible with the way I've been thinking about some of the issues with Lambda-CDM.

Speed of light = speed of causality = computational constraint of the simulation

Wave function = preference

Point Particles = choice

Gravity = computational lag

During interactions, when preference -> choice, if one dimension of the interaction takes longer to compute than another, the simulation favors the delayed dimension instead of waiting. (If this could be formalized it could integrate GR with quantum mechanics, but that requires new math that's currently over my head.)

Dark matter = hidden (emergent) complexity via long distance coupling, where force drops off from 1/r^2 to 1/r (fitting MOND).
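The 1/r claim is what ties this to MOND-style flat rotation curves. A minimal numerical sketch (my own illustration, not the commenter's; units chosen so GM = 1) of why a force falling off as 1/r gives a constant orbital speed, while Newtonian 1/r^2 gives a speed that declines with radius:

```python
import math

# For a circular orbit, centripetal balance gives v^2 / r = F / m, so:
#   F = GM/r^2  ->  v = sqrt(GM/r)   (speed falls off with radius)
#   F = GM/r    ->  v = sqrt(GM)     (speed independent of radius: a flat curve)
GM = 1.0  # toy units, GM = 1

def v_newtonian(r):
    """Circular-orbit speed under the usual 1/r^2 force law."""
    return math.sqrt(GM / r)

def v_one_over_r(r):
    """Circular-orbit speed under a hypothetical 1/r force law."""
    return math.sqrt(GM)  # independent of r

for r in [1.0, 4.0, 9.0]:
    print(f"r={r}: newtonian v={v_newtonian(r):.3f}, 1/r-law v={v_one_over_r(r):.3f}")
```

Under the 1/r law the printed speed stays at 1.000 for every radius, which is the flat-rotation-curve behavior usually attributed to dark matter.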

This assumes that emergent complexity with more highly coupled degrees of preference is harder to compute than decoupled wave functions. It also assumes the universe is extremely computationally efficient but has limits.

I've been trying to imagine an experiment that could falsify this. Lots of ideas but none are good enough to share yet.

I think the biggest issue with this theory is the current interpretation of the CMB, which I'm still trying to deeply understand.

bisonbear

I posted this to reddit and got a bunch of great additional reading recs from the community there! https://www.reddit.com/r/ArtificialInteligence/comments/1pyr...

barrenko | root | parent

You are the post author? What would you say led you to be interested in this?

bisonbear | root | parent

Yep! Honestly it was just a random rabbit hole - I recently read Lee Smolin's Time Reborn (highly recommend, btw, super fascinating read) and was curious about what his more recent work covers, which led me to The Autodidactic Universe paper. With the AI hype train at full steam, the paper felt newly relevant, especially as it seems we're starting to hit plateaus in model intelligence and are looking to other areas (e.g. world models) for further advancement in the field.

barrenko | root | parent

Looking forward to reading this.