Thursday 16 February 2017

The problem of Information and Ergodicity

Information is not ergodic: the average surprisingness of an entire message is not the same as the average surprisingness of a section of that message. So how does this affect the way we use the Shannon equations? What makes information non-ergodic is the continual transformation of the games that we play when we communicate. A surprise in one game is different from a surprise in another.
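
A minimal sketch of this point, assuming a toy message made of two segments with different symbol statistics (the segments stand in for different games): the average surprisal measured over the whole message is not simply the mean of the averages measured within each segment.

```python
from collections import Counter
from math import log2

def avg_surprisal(symbols):
    """Average surprisal of a sequence, using the sequence's own symbol
    frequencies as the probability model (i.e. an estimate of its Shannon
    entropy in bits per symbol)."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

segment_a = list("aaaaabbbbb")   # one "game": two symbols, equiprobable
segment_b = list("cccccccccd")   # another "game": heavily skewed
whole_message = segment_a + segment_b

print("segment A:", avg_surprisal(segment_a))      # 1.0 bits/symbol
print("segment B:", avg_surprisal(segment_b))      # ~0.47 bits/symbol
print("whole    :", avg_surprisal(whole_message))  # ~1.73 bits/symbol, not the mean of the two
```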

The transformation from one game to another is a key part of Nigel Howard's metagame theory, where a shift from a game with one set of rules to a metagame of that game is prompted by a "paradox of rationality" - basically, a crisis. It is a bit like shifting up a level of bifurcation. Maybe there's even a shift up from one energy level to another (which makes the connection to Schroedinger and Kauffman). What's interesting is the cause of the shift.

In Howard, the crisis - his "paradox" of rationality - means that only the jump to the metagame can resolve the contradiction of the current game. The shift is a redescription of existing descriptions in new terms.

My examples for this are all emotional in some way - the experience of "crisis" is very real, but I think Luhmann is right that these things are social-systemic, not psychological. So, for example, in music the climax of the Liebestod at the end of Wagner's Tristan is a moment when constraints of pitch, harmony, rhythm, etc. all coincide. It has a curious homology to orgasm. In intellectual life, Koestler's idea of "bisociation" is also a synergy of multiple descriptions in a similar form. Luhmann's 'interpenetration' is another example, as is Schutz's 'intersubjectivity'. Bateson's levels of learning and double-bind are further examples. Politically, there are some obvious examples of "regime change" at the moment - changing the game most obviously!

The result of this kind of process is that a distinction is made between the old game and the new one: a boundary is produced. On one side of the boundary, there is a degree of entropy in the number of descriptions; on the other, there is a degree of synergy between those descriptions. It is these processes of continual game-change which are non-ergodic.

Whilst I doubt whether we should make Shannon entropy calculations across different games, Shannon is useful for counting within a single game - that would indicate how close a "regime change" might be. Shannon mutual information is itself a kind of game between sender and receiver. I think it would also be worth counting "game changes" - that seems do-able to me - the boundary-markers are the moments of synergy. [My physics colleague Peter Rowlands mentioned to me that he thought that the things Boltzmann actually counted in statistical thermodynamics were 'bifurcations'. I don't fully understand what he means, but my intuitive reaction is that it is a similar thing.]
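
As a hedged illustration of the "counting within a single game" idea, here is a sketch of Shannon mutual information between sender and receiver, computed from an invented joint count table (the symbols and counts are purely hypothetical, chosen only to show the calculation):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint counts: rows = symbol sent, columns = symbol received.
joint_counts = [
    [40, 10],   # 'x0' sent: mostly received as 'y0'
    [5,  45],   # 'x1' sent: mostly received as 'y1'
]
total = sum(sum(row) for row in joint_counts)
p_joint = [c / total for row in joint_counts for c in row]
p_sent = [sum(row) / total for row in joint_counts]
p_received = [sum(col) / total for col in zip(*joint_counts)]

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mutual_information = entropy(p_sent) + entropy(p_received) - entropy(p_joint)
print("I(sender; receiver) =", round(mutual_information, 3), "bits")  # ~0.397
```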

In Stuart Kauffman's "Investigations" he goes into quite a lot of detail about these transformation processes. It has woolly edges, but "Investigations" is a powerful book - there's some value in it (I didn't like it at first). Kauffman's introduction to Ulanowicz's book "A Third Window" is quite revealing in terms of mapping the space between his ideas and those which centre more around Shannon. Both lines of thought have powerful contributions to make.
