you did alchemy?
(i don't know the mathematical definition of entropy. i learnt about it in a general chemistry class, and this was basically what was said. it seems too frivolous to be mathematically defined.)
How is entropy defined mathematically? This is a question I have had ever since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?
There are at least two types of entropy: thermodynamic entropy and Shannon's information entropy. You will find the expressions for both on the Wikipedia page for entropy, the first under "Microscopic definition of entropy" and both under "Entropy and information theory".
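If I remember those Wikipedia sections right, the two standard expressions are the Boltzmann form and the Shannon form:

S = k_B ln W        (thermodynamic: k_B is Boltzmann's constant, W the number of microstates)
H = -sum_i p_i log2 p_i   (information: p_i is the probability of the i-th symbol, H in bits)

Note how similar they look: if all W microstates are equally likely, p_i = 1/W, Shannon's formula reduces to H = log2 W, which is Boltzmann's expression up to the choice of constant and logarithm base.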
RonL
I never heard of Shannon's Information Entropy.... Information entropy - Wikipedia, the free encyclopedia
Good stuff!
Shannon's information entropy is one of the most brilliant constructions in communication theory, and his channel capacity theorem that builds on it is very powerful too. When communication engineers design their codes, they know they can never cross what is known as the Shannon limit.
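To make that concrete, here is a quick sketch (plain Python, function names are my own) of Shannon entropy and the capacity of a binary symmetric channel, C = 1 - H(p). That capacity is the Shannon limit for this particular channel: no code can reliably transmit above it.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity in bits/use of a binary symmetric channel with crossover probability p."""
    return 1 - shannon_entropy([p, 1 - p])

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A noiseless channel (p = 0) has capacity 1 bit per use;
# at p = 0.5 the output is independent of the input and capacity drops to 0.
print(bsc_capacity(0.0))
print(bsc_capacity(0.5))
```

The practical reading of the limit: for a channel with, say, 11% bit flips, bsc_capacity(0.11) is about 0.5, so you need at least two channel uses per data bit no matter how clever your error-correcting code is.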
Shannon : Electrical Engineering :: Newton : Physics
Anyway, the point of my post is that there is a nice story about why it's called entropy and its connection with physical entropy:
Originally Posted by Wiki
Von Neumann, you are cunning !!!!