Contents
- 2. entropy(p1, p2, ..., pn) = -p1 log p1 - p2 log p2 - ... - pn log pn
- 3. The second law of thermodynamics states that for any irreversible process, entropy always increases. Entropy is a measure of disorder.
- 5. Entropy can be seen as a measure of the quality of energy: low-entropy sources of energy are high quality and can be converted into useful work, while high-entropy energy is degraded.
- 6. Claude Shannon transferred some of these ideas to the world of information processing. Information is what resolves uncertainty.
- 7. At the extreme of no information are random numbers. Of course, data may only look random.
- 8. A collection of random numbers has maximum entropy
- 9. Shannon defined the entropy H (Greek capital letter eta) of a discrete variable X with possible values {x1, ..., xn} as H(X) = -Σ P(xi) log2 P(xi), where P(xi) is the probability of value xi.
- 10. Suppose the event of rolling a 6-sided die. The values of variable X are {1, 2, 3, 4, 5, 6} and the probability of obtaining each value is 1/6, so H(X) = log2 6 ≈ 2.585 bits (worked out in the first sketch after this list).
- 13. entropy_outlook([2,3], [4,0], [3,2]) = (5/14) * 0.971 + (4/14) * 0.0 + (5/14) * 0.971 = 0.693 bits
- 14. gain(outlook) = 0.940 - 0.693 = 0.247 bits of information. gain(temperature) = 0.029 bits, gain(humidity) = 0.152 bits (see the information-gain sketch below the list).
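
The numbers above are easy to verify. The following is not part of the original slides, just a minimal Python sketch of the Shannon formula from slides 2 and 9, checked against the die example on slide 10 and the 0.940-bit class entropy of the 14-instance weather data set used as the baseline on slide 14; the function name and the data values are illustrative assumptions.

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits; terms with p == 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair six-sided die from slide 10: each outcome has probability 1/6.
die = [1 / 6] * 6
print(entropy(die))           # ~2.585 bits

# Class distribution of the 14-instance weather data (9 "play", 5 "don't play"),
# which gives the 0.940 bits used as the baseline on slide 14.
print(entropy([9 / 14, 5 / 14]))  # ~0.940 bits
```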
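A second sketch, again an assumption rather than code from the slides, reproduces the outlook numbers from slides 13 and 14: the weighted entropy of the sunny/overcast/rainy splits and the resulting information gain of about 0.247 bits.

```python
import math

def entropy_from_counts(counts):
    """Entropy (in bits) of a class distribution given raw counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, splits):
    """gain = entropy(parent) - weighted average entropy of the child nodes."""
    total = sum(parent_counts)
    weighted = sum(sum(split) / total * entropy_from_counts(split) for split in splits)
    return entropy_from_counts(parent_counts) - weighted

# Outlook splits the 14 instances into sunny [2,3], overcast [4,0] and rainy [3,2]
# (counts of "play" / "don't play"), reproducing the numbers on slides 13-14.
outlook = [[2, 3], [4, 0], [3, 2]]
print(information_gain([9, 5], outlook))  # ~0.247 bits
```

Because overcast days always lead to "play", that branch contributes zero entropy, which is why outlook gives the largest gain and is chosen as the root attribute of the decision tree.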