COSMOLOGY: With over 1e83 atoms in the universe, how much of the universe can be remembered?

I’ve often thought that the universe itself is incapable of storing a perfect memory of anything more than a tiny fraction of its contents–some laughably small percentage of its whole. Even with an amazingly efficient system, how much information could really be stored about what’s transpired?

By this I mean: I’d like to know, for example, which oceans a molecule of water in my glass has been immersed in, which algae used it as part of their metabolic processes, which comet deposited it onto the proto-Earth, which nuclear furnace generated the oxygen that went into its formation, etc.

I’d like to know this for as much of the universe as possible, in as much detail as possible. How much of the universe would be required, and how much information could you store?

Here are my thoughts on how to solve this. Considering:

  • There are some 1e83 atoms in the universe. Describing an atom at a macro level would mean storing its position and velocity over time, sampled at arbitrarily small intervals–or perhaps only recording changes in velocity when it accelerates (but even then you’d be discarding true information about its precise movements from heat and possibly inscrutable stochastic processes, if that information could even be obtained at that scale without destructively altering the atom as it goes about its normal business).

    The atom itself isn’t even atomic, as there are quarks and other strange particles, but you get the idea of “atom” for the thought experiment.

  • You’d need some multiple of atoms to describe this information–very likely already a few orders of magnitude of atoms per recorded atom.

    This would be a reading and writing mechanism, again a few orders of magnitude per atom, that somehow encodes information about the ones you’re observing. (E.g., computer memory requires some 1e25 atoms (500 g of silicon) to manipulate 8e12 bits of information (1 TB)–criminally inefficient.)

  • You’d need motive forces that would be so inclined as to construct this system, themselves constituting overhead rather than a storage mechanism. E.g., powerful creatures with space thumbs.
  • You’d need to have constructed the system in such a way that it could transmit or collate that information despite the expansion and entropy of the universe, and be willing to discard or localize data (accessible locally but not from other parts of the universe) that simply couldn’t overcome cosmological horizons.
  • You’d need a non-destructive, or at least not particularly invasive, method of recording the state of each atom even when it’s squeezed in among its siblings (and we’ll assume such a thing is possible, although it almost certainly isn’t).
  • In a Gödel-esque puzzle, as I vaguely understand the term from GEB, the system itself would necessarily fail to encode information about its own history.
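The storage-overhead point above reduces to a one-line ratio. A quick sketch, using only the figures quoted (roughly 1e25 atoms of silicon to hold 1 TB):

```python
# Back-of-envelope: how many atoms does today's storage spend per bit?
# Figures from the text: ~1e25 atoms (~500 g of silicon) per 1 TB.
atoms_per_tb = 1e25
bits_per_tb = 8e12  # 1 TB = 8e12 bits

atoms_per_bit = atoms_per_tb / bits_per_tb
print(f"{atoms_per_bit:.2e} atoms per bit")  # 1.25e+12
```

So by the numbers in the example, current memory burns about a trillion atoms for every bit it remembers–before counting the read/write machinery around it.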

Thus, in the best-case scenario, the encoding inefficiency means you could record the state of only some small subsection of the universe–and even that would require dedicating the entire rest of the universe to the task, which is presumably unlikely.
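To put a toy number on “some small subsection”: suppose the whole universe is turned into an archive of itself. Under two loudly hypothetical parameters–that a usable history costs on the order of 1e3 bits per recorded atom, and a given storage density in atoms per bit–the recordable fraction falls out of simple division. Neither parameter is a measurement; both are illustrations:

```python
# Toy model: if the universe archives itself, what fraction of its atoms
# can the archive describe? All parameters are hypothetical illustrations.

ATOMS_IN_UNIVERSE = 1e83  # figure used in the text

def recordable_fraction(bits_per_recorded_atom, storage_atoms_per_bit):
    """Fraction of atoms describable when all remaining atoms store the record."""
    # Atoms of archive spent per atom described:
    overhead = bits_per_recorded_atom * storage_atoms_per_bit
    return 1 / (1 + overhead)

# Current-technology ratio from the text: ~1e25 atoms per 1 TB (8e12 bits)
current = recordable_fraction(1e3, 1e25 / 8e12)
# Idealized limit: a single storage atom per bit
ideal = recordable_fraction(1e3, 1.0)

print(f"current tech:  {current:.1e} of the universe")  # 8.0e-16
print(f"one atom/bit:  {ideal:.1e} of the universe")    # 1.0e-03
print(f"atoms covered at one atom/bit: {ideal * ATOMS_IN_UNIVERSE:.1e}")
```

Even in the idealized one-atom-per-bit case, the archive covers roughly a thousandth of everything; at anything like current densities, the fraction is vanishingly small–consistent with the “laughably small fraction” intuition above.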

ASTRONOMY: The search for our solar system’s ninth planet

Could the strange orbits of small, distant objects in our solar system lead us to a big discovery? Planetary astronomer Mike Brown proposes the existence of a new, giant planet lurking in the far reaches of our solar system — and shows us how traces of its presence might already be staring us in the face.