Question
Measuring Information Part 1: Throughout this course we will be looking at how to measure how much information is in random events. These exercises are based on Shannon’s Information Theory. To find out who Claude Shannon was and how he solved problems go here:
http://www.businessinsider.com/engineer-claude-shannon-problem-solving-process-2017-7
Consider any event. Some happen with near certainty, like the rising of the sun; others, like flipping a coin, have a 50/50 outcome; and some, like a Nor'easter snowstorm, are rare. For any event x, we can assign a probability of that event happening, p(x). For a fair coin flip, p(heads) = 0.5.

The surprise of an event is defined as s(x) = -log2(p(x)). The units of surprise are bits.

Write a program that asks for the probability of an event and computes and outputs the surprise of that event. You will need to use the cmath library to access the log function. Note that log_n(y) = log(y)/log(n), where log is the natural log function.

Run your program and record the output for the following probabilities: p(x) = 0.0001, p(x) = .0001, p(x) = 0.01, p(x) = 0.1, p(x) = 0.5, p(x) = 1. Note that for p(x) = 0.5 the surprise is 1 bit, meaning it takes one yes/no question to determine, for example, whether a coin is heads or tails. For p(x) = 1 the surprise is 0, since the event happens with certainty.
Explanation / Answer
#include <stdio.h>
#include <math.h>

int main(void) {
    double px, surprise;
    printf("Enter p(x): ");
    scanf("%lf", &px);
    /* Surprise in bits: s(x) = -log2(p(x)); log2() is in math.h (C99) */
    surprise = -log2(px);
    printf(" Surprise = %lf bits\n", surprise);
    return 0;
}
Input-Output
Enter p(x): 0.0001
Surprise = 13.287712 bits
Enter p(x): .0001
Surprise = 13.287712 bits
Enter p(x): 0.01
Surprise = 6.643856 bits
Enter p(x): 0.1
Surprise = 3.321928 bits
Enter p(x): 0.5
Surprise = 1.000000 bits
Enter p(x): 1
Surprise = 0.000000 bits