Hi all-
I have downloaded John's essay on Entropy. His little exercises
implementing information-theoretic concepts are fun and amusing; somewhat
similar exercises may be found in Liu's book <Elements of Discrete
Mathematics> (McGraw-Hill, 2d Ed. 1977), from which I tried to teach a
course to uncomprehending juniors in about 1986, at a place called
Elmhurst College, in Illinois.
I have a problem with Dr. Denker's discussion of entropy -
(unfortunately, my download of his posting does not contain page numbers).
In his exercise on the ``cup game'' he writes that ``where there are N
cups and B blocks (to be hidden under the cups), there are N^{B} (LaTeX
notation) possible states.'' (OK, no problem so far.)
        Then he writes, in the following paragraph, that ``if we are
smart,'' the number of guesses ``is at worst'' (i.e., the minimum number
of guesses) the smallest exponent E of 2 such that 2^{E} >= N^{B}.
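For concreteness, here is a small sketch of the bound as I read it (the
function name and the integer-arithmetic trick for the ceiling are mine,
not Dr. Denker's):

```python
def min_questions(n_cups, n_blocks):
    """Smallest E with 2**E >= n_cups**n_blocks (the stated bound).

    (states - 1).bit_length() computes ceil(log2(states)) exactly,
    avoiding floating-point log.
    """
    states = n_cups ** n_blocks
    return (states - 1).bit_length()

print(min_questions(3, 1))  # 2, for the 3-cup, 1-object case below
print(min_questions(4, 2))  # 4, since 4**2 = 16 = 2**4
```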
        This statement cannot be correct. If there is 1 object and 2
cups, then the prediction is 2^{E} >= 2^{1} = 2 -> I need two questions
to decide which cup. But I can make the decision with one question,
``Is it cup A?''.
Again, if there are 3 cups and one object, then the prediction
is that 2^{E}>=3^{1}=3. But if I label the cups ABC, then two questions
suffice to locate the cup holding the object: ``Is the object in one of
the cups A or B?''. In the worst case, the answer is Yes, in which case
I ask, ``Is the object in cup A?'' Either answer locates the object.
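The halving strategy above can be sketched in a few lines of Python (a
toy of my own, assuming one object and labeled cups; each loop iteration
is one yes/no question of the form ``is it in the first half of the
remaining cups?''):

```python
def locate(cups, hidden):
    """Return (cup, questions_asked) for one object hidden under a cup."""
    questions = 0
    lo, hi = 0, len(cups)              # the object is in cups[lo:hi]
    while hi - lo > 1:
        mid = (lo + hi) // 2
        questions += 1                 # ask: "is it in cups[lo:mid]?"
        if cups.index(hidden) < mid:   # the answer is Yes
            hi = mid
        else:                          # the answer is No
            lo = mid
    return cups[lo], questions

print(locate(list("ABC"), "C"))  # ('C', 2): two questions suffice
```

For 3 cups, every hiding place is found in at most two questions, in
agreement with the hand count above.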
        I haven't thought this through. Maybe the counting just needs a
trivial correction.
Regards,
Jack
--
"Trust me. I have a lot of experience at this."
General Custer's unremembered message to his men,
just before leading them into the Little Big Horn Valley