Making decisions using information entropy
1. Making decisions using information entropy
For the introduction, see Entropy function.
The information entropy function is defined as follows.
2. Entropy Formula
The following Shannon entropy formula expresses how the entropy of a set of symbols is calculated.
S = -Σi Pi log2(Pi)
S is the entropy.
Pi is the probability of a given symbol i appearing in a sequence of symbols.
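As a quick illustration of the formula, here is a minimal Python sketch (the function name entropy and the example distributions are just for illustration):

import math

def entropy(probabilities):
    # Shannon entropy: S = -sum of Pi * log2(Pi), skipping zero probabilities
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))                # 1.0 bit: a fair coin flip
print(entropy([0.25, 0.25, 0.25, 0.25])) # 2.0 bits: 4 equally likely symbols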
3. Twenty questions
Most people have played the game of twenty questions.
Here is a simplified version.
Think of a number from 1 to 1,000,000. I will guess that number in 20 yes-no questions.
4. More simplified version
Here is a more simplified version.
Think of a number from 1 to 16. I will guess that number in 4 yes-no questions.
How can you do it? Can you explain why it works?
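One way is to halve the remaining range with each question. Here is a minimal Python sketch of that strategy (the names guess_number and answer_is_le are illustrative, not from the original page):

def guess_number(low, high, answer_is_le):
    # Ask "is your number <= mid?" and keep the half the answer selects.
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1
        if answer_is_le(mid):
            high = mid        # yes: the number is in [low, mid]
        else:
            low = mid + 1     # no: the number is in [mid + 1, high]
    return low, questions

secret = 13
print(guess_number(1, 16, lambda m: secret <= m))   # (13, 4)

Each question halves the range of candidates (16, 8, 4, 2, 1), so any number from 1 to 16 is found in exactly 4 questions.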
5. Algorithms
In the study of algorithms, one learns how to use divide-and-conquer problem-solving strategies for a problem space of size n.
Dividing the space in two at each step, a binary division of the problem, leads to a solution of order log2 n, or O(log2 n).
Dividing the space of size n into parts of size 1 and n-1 leads to a solution of order n, or O(n).
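A small Python sketch comparing the two growth rates (the function names are made up for illustration):

def halving_steps(n):
    # Each yes-no question leaves at most half of the candidates,
    # rounded up, so this counts ceil(log2(n)) steps.
    steps = 0
    while n > 1:
        n = (n + 1) // 2
        steps += 1
    return steps

def one_by_one_steps(n):
    # Worst case when each question eliminates a single candidate.
    return n

for n in (16, 1000000):
    print(n, halving_steps(n), one_by_one_steps(n))
# 16 4 16
# 1000000 20 1000000

This matches the games above: 4 questions for 1 to 16, and 20 questions for 1 to 1,000,000.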
6. Question game
For guessing a number from 1 to 16, or 2^4, the following hold.
An O(n) solution requires 16 guesses to obtain the solution (in the worst case) and 8.5 guesses on average. Why?
Note: If someone does not remember the numbers guessed, it can take longer, on average.
An O(log2 n) solution requires exactly 4 guesses. Why?
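The 8.5 average can be checked directly in Python, assuming each of the 16 numbers is equally likely and guesses are made one number at a time:

# Guessing 1, 2, ..., 16 in order finds a uniformly chosen number
# on guess k with probability 1/16, so the average is the mean of 1..16.
print(sum(range(1, 17)) / 16)   # 8.5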
7. Entropy of splitting lists
Here is a way to create partitioned lists of numbers by applying list to a lazy range.
The following program shows the entropy of each possible partition of a list of 16 elements into two non-empty lists.
8. Python program
Here is the Python code [#1]
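What follows is a minimal sketch of such a program, assuming each of the 16 numbers is equally likely; the names entropy, xs, and splits are illustrative, not necessarily those of the original listing.

import math

def entropy(probabilities):
    # Shannon entropy: S = -sum of Pi * log2(Pi)
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

xs = list(range(1, 17))    # list on a lazy range: [1, 2, ..., 16]
splits = [(xs[:k], xs[k:]) for k in range(1, len(xs))]

for left, right in splits:
    # A part holding m equally likely numbers has entropy log2(m).
    e_left = entropy([1 / len(left)] * len(left))
    e_right = entropy([1 / len(right)] * len(right))
    print(len(left), len(right),
          round(e_left, 3), round(e_right, 3), round(max(e_left, e_right), 3))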
The output of the Python code lists, for each split, the entropy of each part; the maximum of the two entropies is smallest for the split in the middle.
9. Result
The result is that the best place to partition the list into two non-empty lists, in the sense of minimizing the maximum of the two entropies, is in the middle. That is, the best split divides the list into two equal-sized lists.
10. End of page