INTRODUCTION
I have written about Claude Shannon, who used Markov chains in information theory. See http://www.kinberg.net/wordpress/stellan/equations/#informationtheory
.
Conditional probability
A conditional probability is the probability of an event given that another event has occurred; in other words, the events are dependent.
https://www.mathsisfun.com/data/probability-events-conditional.html
Conditional dependence can be illustrated with a Bayesian network.
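The defining formula is P(A|B) = P(A and B) / P(B). A minimal sketch in Python (the rain/traffic events and their probabilities are my own toy numbers, not taken from the linked page):

```python
# Toy events: A = traffic, B = rain (numbers are illustrative assumptions).
p_rain = 0.3               # P(rain)
p_rain_and_traffic = 0.24  # P(rain and traffic)

# Conditional probability: P(traffic | rain) = P(rain and traffic) / P(rain)
p_traffic_given_rain = p_rain_and_traffic / p_rain
print(round(p_traffic_given_rain, 10))  # 0.8
```

Since 0.8 is larger than any plausible unconditional traffic probability here, knowing that it rains changes the probability of traffic, which is exactly what makes the two events dependent.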
The origins of Markov chains are explained by Khan Academy, starting from the old idea that everything in the world is governed by precise ratios and a constant law of change.
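A Markov chain moves between states using only the current state's transition probabilities. A minimal simulation sketch (the two weather states and their probabilities are my own illustration, not from the Khan Academy video):

```python
import random

# Transition probabilities P(next state | current state) -- assumed numbers.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Sample the next state given the current one."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

random.seed(0)
state = "sunny"
counts = {"sunny": 0, "rainy": 0}
for _ in range(100_000):
    state = step(state)
    counts[state] += 1

# The long-run fraction of time in each state approaches the stationary
# distribution, here (5/6, 1/6) regardless of the starting state.
print(counts["sunny"] / 100_000)
```

This "memorylessness" (the next step depends only on the current state, not the full history) is the property Markov introduced, and it is what Shannon later exploited to model sequences of letters in English text.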
.
Shannon information theory
See http://www.kinberg.net/wordpress/stellan/equations/#informationtheory
I don't know why Edward Witten chose to start with a short introduction to communication theory (Shannon's theory) at "Theoretical Physics 2018: From Qubits to Spacetime".
I presume it is important to know about this in order to understand the quantum mechanics and the aspects of general relativity he continued with at the conference.
I decided to take a look at C. E. Shannon's 1948 paper "A Mathematical Theory of Communication".
Shannon was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory". He also wrote "An Algebra for Theoretical Genetics".[12]
The video with English text is here.
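The central quantity in Shannon's 1948 paper is the entropy of a probability distribution, H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch (the example distributions are my own):

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))              # 1.0  -- a fair coin carries one bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0  -- four equally likely symbols
```

Entropy is maximal when all outcomes are equally likely and drops as the distribution becomes more predictable, which is why the statistical dependencies in English text (modeled by Shannon with Markov chains) reduce the number of bits needed per letter.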
.
Conclusion
Proba