The basis of information theory is now well-established. Following the approach of Brillouin [3], if P denotes the number of states in a system, then the information memory capacity (denoted by I) in 'bits' is defined to be

$$I = \log_2 P$$
where, if a problem is considered with N different independent selections, each corresponding to a binary choice (0 or 1), the total number of possibilities is

$$P = 2^N$$
and so the information is

$$I = \log_2 2^N = N \ \text{bits.}$$
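For example, N = 8 independent binary selections give P = 2^8 = 256 possible states, and hence I = log2(256) = 8 bits.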
Alternatively, the entropy function of statistical thermodynamics is given by

$$S = k \ln P$$
where k is Boltzmann's constant.
It follows that, for the above expression for P,

$$S = k \ln 2^N = Nk \ln 2 = (k \ln 2)\, I$$
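Numerically, each bit of memory capacity therefore corresponds to an entropy of k ln 2 ≈ 9.57 × 10^-24 J/K.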
Further, it may be noted that the first and second laws of thermodynamics may be combined into the equation

$$dU = T\,dS + d'W$$
where dU denotes the change in internal energy, T the absolute temperature and d'W the work done on or by the system. Since dS = (k ln 2) dI from the relation above, in terms of memory capacity this becomes

$$dU = kT \ln 2\; dI + d'W$$
and it is seen immediately that the energy required to add one bit of memory to the system is given by

$$\left(\frac{\partial U}{\partial I}\right)_{W} = kT \ln 2$$
where the partial derivative is evaluated with the work term held constant.
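As a numerical illustration (a minimal Python sketch added here, not part of the original text; the function name and the choice of T = 300 K are assumptions made for the example), this quantity is easily evaluated:

```python
import math

# Boltzmann's constant in J/K (CODATA 2018 exact value)
K_BOLTZMANN = 1.380649e-23

def energy_per_bit(temperature_kelvin: float) -> float:
    """Energy kT ln 2, in joules, required to add one bit of memory
    to a system at the given absolute temperature."""
    return K_BOLTZMANN * temperature_kelvin * math.log(2)

# At room temperature, T = 300 K, the bound is about 2.87e-21 J per bit.
print(f"{energy_per_bit(300.0):.3e} J")
```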
It might be noted that heat capacity is necessarily a positive quantity [5] and, therefore, this last equation leads to the realisation [4] that a program written using N bits of system memory dissipates energy of at least

$$NkT \ln 2$$
As noted previously, this constitutes a lower bound, imposed by the second law of thermodynamics, on the energy dissipated by an irreversible classical computation.
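For a sense of scale (an illustrative figure, not taken from the source), a program using 10^9 bits at T = 300 K would dissipate at least 10^9 × 2.87 × 10^-21 J ≈ 2.9 × 10^-12 J: a tiny amount in absolute terms, but strictly greater than zero.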
This brief introduction to some of the basic ideas of information theory and the link with statistical thermodynamics provides one part of the basis for the promotion of the idea that water possesses memory. The second part derives from a detailed study of some of the properties of water itself.