
Re: [Phys-l] entropy +- disorder +- complexity



A word of warning:

Beware of the word "universal" when used in this context, such as
"universal" Turing machine or "universal" probability measure.

In this field, the word has a highly technical definition that
is quite counterintuitive. If you stick closely to the technical
definition, then most of what they say about it is true. However,
this notion of "universal" does not mean all-purpose or even
general-purpose.

This is a huge trap for the unwary.

Let's be clear: A "universal" probability measure is *not* an
all-purpose probability measure.

As a familiar example, Lempel-Ziv is a lossless and therefore
universal compressor. It works well for ordinary text files
("zip") ... but it is not practical (by itself) for compressing
image files. We use other compressors for that (png, jpeg,
mpeg).
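To make this concrete, here is a small sketch (my own illustration,
not from the original discussion) using Python's zlib, which is
built on the same LZ77 family of ideas. It accepts *any* input
without loss, but it only shrinks inputs that match its model;
random-looking bytes, which stand in here for already-compressed
image data, actually grow slightly:

```python
import os
import zlib

# Repetitive text matches the LZ model well and compresses hard.
text = b"the quick brown fox jumps over the lazy dog " * 100

# Random bytes stand in for image/compressed data: lossless
# coding still *works* on them, it just cannot shrink them.
noise = os.urandom(len(text))

for label, data in [("text", text), ("noise", noise)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{label}: compressed to {ratio:.2f} of original size")
```

The point is that "handles every input" (universal) and "performs
well on every input" are very different guarantees.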

Suppose we have two probability distributions A and B, both of
which are "universal" in the technical sense, i.e. both are
nowhere zero.

The difference between A and B can be thought of as a /bias/.
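As a toy illustration (the particular priors are my own invention),
take two distributions over the positive integers, both nowhere
zero and hence both "universal" in the technical sense. The bias
at any outcome n is just the log-ratio of the two, measured in
bits; it is finite everywhere but need not be small anywhere:

```python
import math

def prior_a(n):
    # geometric prior with gentle decay; nowhere zero, sums to 1
    return 0.5 ** n

def prior_b(n):
    # much steeper decay; also nowhere zero, also sums to 1
    return 15 * (1 / 16) ** n

# The bias between A and B at outcome n, in bits.  It is finite
# for every n, but grows without bound as n grows.
for n in (1, 5, 10):
    bias_bits = math.log2(prior_a(n) / prior_b(n))
    print(n, round(bias_bits, 1))
```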

1) There are some questions for which the difference between
A and B does not affect the answer. For instance, given an
asymptotically large amount of data, one can overwhelm any
finite amount of bias and converge to the "right" answer.

2) There are some questions for which the difference between
A and B matters a great deal. For example, the practical
question of /how much/ data you need in order to overwhelm
the bias is in this category.

Universality guarantees that the bias is finite, but it does
not guarantee that it is small.
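A back-of-the-envelope sketch of that distinction (the numbers are
hypothetical, chosen only to illustrate): suppose the prior is
biased against the true hypothesis by some number of bits, and each
observation supplies a fixed amount of evidence in favor of the
truth. The posterior eventually flips no matter how large the bias
is, so long as it is finite (category 1), but the amount of data
required scales directly with the bias (category 2):

```python
import math

def observations_needed(bias_bits, evidence_bits_per_obs):
    """Observations required before accumulated evidence
    outweighs a prior bias of bias_bits bits."""
    # finite bias  ->  some finite amount of data suffices ...
    # ... but that amount grows in proportion to the bias
    return math.ceil(bias_bits / evidence_bits_per_obs)

# Each observation worth 0.5 bits of evidence:
for bias in (10, 100, 1000):
    print(bias, observations_needed(bias, 0.5))
# -> 10 20, 100 200, 1000 2000
```

Asymptotically all three cases converge; practically, they are very
different experiments.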

There are, alas, some "authorities" (folks who have written
books on the topic) who focus on category (1) and more-or-less
refuse to acknowledge the existence of category (2). If you
mention a practical example in category (2) they bring up an
example from the other category and talk right past you.