Making decisions using information entropy
by RS  admin@robinsnyder.com


1. Making decisions using information entropy
For the introduction, see Entropy function.

The information entropy function is defined as follows.

2. Entropy Formula
The following Shannon entropy formula expresses how entropy for a set of symbols is calculated.

H = -\sum_{i=1}^{n} p_i \log_2 p_i

where p_i is the probability of the i-th symbol.
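
As a concrete check, the formula can be evaluated directly in Python. The sketch below is written for this note (the function name entropy and the use of math.log2 are choices made here, not code from the original page).

import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)),
    # treating zero-probability symbols as contributing nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))      # a fair coin: 1.0 bit
print(entropy([1 / 16] * 16))   # 16 equally likely numbers: 4.0 bits

A fair coin carries 1 bit per toss, and a set of 16 equally likely numbers carries log2(16) = 4 bits, which is the connection exploited in the guessing games below.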

3. Twenty questions
Most people have played the game of twenty questions.

Here is a simplified version.

Think of a number from 1 to 1,000,000. I will guess that number in 20 yes-no questions.

4. More simplified version
Here is a more simplified version.

Think of a number from 1 to 16. I will guess that number in 4 yes-no questions.

How can you do it? Can you explain why it works?
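
One way (a sketch written for this note, not code given on the original page) is to halve the remaining range with each question.

def guess_number(low=1, high=16):
    # Repeatedly ask whether the number lies above the midpoint of the
    # remaining range; each answer halves the range that is left.
    questions = 0
    while low < high:
        mid = (low + high) // 2
        answer = input("Is your number greater than " + str(mid) + "? (y/n) ")
        questions += 1
        if answer.strip().lower().startswith("y"):
            low = mid + 1
        else:
            high = mid
    print("Your number is", low, "- found in", questions, "questions.")

guess_number()

Each answer rules out half of the remaining candidates, so 4 questions handle 16 numbers, and 20 questions handle 1,000,000, since 2^20 = 1,048,576.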

5. Algorithms
In the study of algorithms, one learns how to use divide-and-conquer problem-solving strategies for a problem space of size n: repeatedly halving the space locates an item in about log2(n) steps.

6. Question game
For guessing a number from 1 to 16, that is, from a set of 2^4 possibilities, the following holds.
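
The key fact (restated here as an assumption about what was intended, not quoted from the original) is that a sequence of 4 yes-no answers can distinguish at most 2^4 = 16 outcomes, so 4 questions are exactly enough for the numbers 1 to 16.

from itertools import product

# A game of 4 yes-no questions produces one of 2**4 answer patterns,
# and each pattern can identify at most one number.
patterns = list(product("yn", repeat=4))
print(len(patterns))    # 16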

7. Entropy of splitting lists
Here is a way to create partitioned lists of numbers by calling list() on a lazy range.

The following program shows the entropy of each possible partition of a list of 16 elements into two non-empty lists.

8. Python program
Here is the Python code [#1]
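
A minimal sketch of such a program, assuming each sublist is treated as a uniform distribution over its elements (the function name entropy and the exact output format are choices made for this sketch), might look like this.

import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

numbers = list(range(1, 17))    # the lazy range realized as a list

# Cut the list at every possible position into two non-empty sublists.
for split in range(1, len(numbers)):
    left, right = numbers[:split], numbers[split:]
    # A sublist of k equally likely numbers has entropy log2(k) bits.
    left_bits = entropy([1 / len(left)] * len(left))
    right_bits = entropy([1 / len(right)] * len(right))
    print(split, round(left_bits, 3), round(right_bits, 3),
          "max =", round(max(left_bits, right_bits), 3))

Under this reading, the maximum of the two entropies is smallest (3 bits on each side, since log2(8) = 3) when the list is cut exactly in the middle, which matches the result below.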

Here is the output of the Python code.


9. Result
The result is that the best place to partition the list into two non-empty lists, in the sense of minimizing the maximum of the two entropies, is in the middle. That is, the best split divides the list into two equal-sized lists.

10. End of page
