Blowing The Information Balloon And Beyond
The speed of light in vacuum, 299,792,458 m/s, is the upper limit on the speed of any entity in this Universe. Even though faster speeds may be conceivable (in the form of the hypothetical tachyons), they are not accounted for by the Standard Model of particle physics, and therefore all objects or ideas described in this discussion will conform to the limits set by that model.
Light is commonly used as a carrier of information between two points, and on our planet its propagation is often guided by subterranean fibre-optic cables. The information it carries therefore cannot reach the target location before the beam of light does, irrespective of any transmission errors. There is one caveat: inside a dense medium, light propagates at c/n (where n is the refractive index of the medium), so another information carrier, such as a sufficiently fast charged particle, can outpace the light, emitting Cherenkov radiation in the process. This effect has been put to use at the LHC in the form of the RICH (Ring Imaging Cherenkov) detectors of the LHCb experiment.
Now, it is common knowledge that the universe is expanding – as evidenced by the redshift of light from distant galaxies and corroborated by the cosmic microwave background – from a state of lower volume to a state of higher volume. This fact can be paraphrased as the Universe moving from a smaller past to a larger future. Therefore, the information of the past was contained in a smaller volume than the information of the future is going to be contained in.
In 1981, Jacob Bekenstein, a theoretical physicist, established that there is an upper limit to the amount of information that can be contained in a finite region of space which has a finite amount of energy (referred to as the Bekenstein bound). Keeping in mind that the Universe is expanding, let us assume the existence of a finite space, S. The fraction of universal space occupied by S is given by dividing its volume by the volume of the Universe.
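As a rough numerical illustration of the bound (a sketch, not part of the original argument): expressed in bits, the Bekenstein bound reads I ≤ 2πRE/(ħc ln 2) for a sphere of radius R enclosing total energy E. The example values below (1 kg of mass-energy in a 1 m sphere) are chosen purely for illustration.

```python
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 299_792_458.0       # speed of light in vacuum, m/s

def bekenstein_bound_bits(radius_m: float, energy_j: float) -> float:
    """Upper limit (in bits) on the information content of a sphere
    of radius `radius_m` enclosing total energy `energy_j`:
        I <= 2 * pi * R * E / (hbar * c * ln 2)
    """
    return 2.0 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# Illustrative example: 1 kg of mass-energy (E = m * c^2) inside a 1 m sphere
energy = 1.0 * C**2
print(f"{bekenstein_bound_bits(1.0, energy):.3e} bits")  # on the order of 1e43 bits
```

For a 1 kg, 1 m system this comes out to roughly 2.6 × 10⁴³ bits, vastly more than any present-day storage medium of that size holds, which is why the bound only bites in extreme regimes such as black holes.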
Now, it is unnecessary to know the absolute value of the total volume, since it is constantly increasing; we are not interested in absolute numbers but only in the rate at which this fraction decreases. A simple place to begin would be to analyse the Kepler problem in general relativity, summarized by the following equation for the angle of deflection during gravitational lensing.
θ = 4GM / (rc²)
(G – universal constant of gravitation; M – mass of the lensing object towards which light is deflected; c – speed of light in vacuum; r – distance of closest approach of the light ray to the lens)
Over a given time period, the values of θ can be computed to give the rate of change of the distance between the light source and the lens. Further, assuming that the mass M is "carried" by the volume S, and knowing an initial deflection angle θ₀, the rate of change of the fraction can be calculated to a fair approximation, which in turn estimates the rate of change of the amount of information that can be contained by the Universe (the "raisin bread" model of expansion).
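As a sanity check on the deflection formula above (the solar mass and radius used below are standard reference figures, not values from this post), one can recover Eddington's classic result of roughly 1.75 arcseconds of bending for starlight grazing the Sun:

```python
import math

G = 6.67430e-11    # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0  # speed of light in vacuum, m/s

def deflection_angle(mass_kg: float, r_m: float) -> float:
    """Weak-field deflection angle theta = 4GM / (r c^2), in radians,
    for a light ray passing a mass `mass_kg` at closest distance `r_m`."""
    return 4.0 * G * mass_kg / (r_m * C**2)

# Light grazing the Sun (standard solar mass and radius)
M_SUN = 1.989e30   # kg
R_SUN = 6.957e8    # m
theta = deflection_angle(M_SUN, R_SUN)
print(f"{math.degrees(theta) * 3600:.2f} arcsec")  # ~1.75 arcsec
```

The agreement with the 1919 eclipse measurements is what first confirmed this general-relativistic prediction, so it makes a convenient reference point before applying the formula to more distant lenses.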
On another note: is it possible to "produce" quantities of such information so as to require, as a corollary of the Bekenstein bound, more space to contain it? Or would the information be destroyed – and how? Or would the process of such information "creation" itself be limited by certain factors governed by the laws of physics – and what could those laws be?
- Traipsing With Physics (enderanimate.wordpress.com)