I'm learning about Shannon entropy and I am seeking to minimize the average code length $\bar{l}$ (the number of dots and dashes that make up an English letter in Morse code). For a code assigning each letter of the alphabet a codeword, we can calculate the average code length as

$$ \bar{l} = \sum_{u \in U} \Pr(u) \cdot l(u) $$

Am I correct? Where:

- $u$ is a given letter from the universe of letters $U$
- $\Pr(u)$ is the probability of the letter occurring
- $l(u)$ is the number of dots and dashes (bits) used to represent this letter

At this point it seems to me that what we have is

$$ \bar{l} = \sum_{u \in U} \left( \Pr(u) \cdot l(u) \right) $$

but it's being explained like so:

$$ \bar{l} = \left( \sum_{u \in U} \Pr(u) \right) \cdot l(u) $$

Which is it? The second one doesn't bind the $u$ in $l(u)$ to anything.
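To make the grouping concrete, here is a minimal Python sketch of the first interpretation, where the summation index $u$ binds both factors. The letter frequencies below are illustrative placeholders, not real English frequencies; the Morse lengths for these five letters are the standard ones.

```python
# l(u): number of dots and dashes in the standard Morse codeword
morse_length = {"E": 1, "T": 1, "A": 2, "I": 2, "N": 2}

# Pr(u): hypothetical letter probabilities over this toy universe (sum to 1)
prob = {"E": 0.35, "T": 0.25, "A": 0.20, "I": 0.12, "N": 0.08}

# First interpretation: u ranges over U and appears in BOTH Pr(u) and l(u)
avg_len = sum(prob[u] * morse_length[u] for u in morse_length)
print(f"average code length = {avg_len:.2f} symbols")
```

The second interpretation cannot even be written this way: once the sum over $u$ closes, $u$ is out of scope, so the trailing $l(u)$ has no value to refer to.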
Fri, 07 Aug 2020 14:26 GMT