1. ## Entropy

How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?

2. Originally Posted by ThePerfectHacker
How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?
you did alchemy?

(i don't know the mathematical definition of entropy. i learnt about it in a general chemistry class, and this was basically what was said. it seems too frivolous to be mathematically defined.)

3. Originally Posted by ThePerfectHacker
How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?
There are at least two types of entropy: thermodynamic entropy and Shannon's information entropy. You will find the expressions for both on Wikipedia; both are covered under "Entropy and information theory".
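
For reference, a quick sketch of the standard textbook forms of both (see the Wikipedia articles for the full story):

$$S = k_B \ln W$$

is Boltzmann's thermodynamic entropy, where $W$ is the number of microstates of the system and $k_B \approx 1.38 \times 10^{-23}\,\mathrm{J/K}$ is Boltzmann's constant, and

$$H(X) = -\sum_i p_i \log_2 p_i$$

is Shannon's information entropy in bits, where the $p_i$ are the probabilities of the possible messages. The Gibbs form of the thermodynamic entropy, $S = -k_B \sum_i p_i \ln p_i$, makes the formal connection between the two plain.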

RonL

4. Originally Posted by CaptainBlack
There are at least two types of entropy: thermodynamic entropy and Shannon's information entropy. You will find the expressions for both on Wikipedia; both are covered under "Entropy and information theory".

RonL
I had never heard of Shannon's information entropy... Information entropy - Wikipedia, the free encyclopedia

Good stuff!

5. Originally Posted by colby2152
I had never heard of Shannon's information entropy... Information entropy - Wikipedia, the free encyclopedia

Good stuff!
Shannon's information entropy is one of the most brilliant constructions in communication theory, and his channel capacity theorem, which built on it, is very powerful too. When communication engineers design their codes, they know they can never cross what is known as Shannon's limit.
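
To put a formula behind "Shannon's limit": a rough sketch, for the standard case of a band-limited channel with additive white Gaussian noise, is the Shannon–Hartley theorem,

$$C = B \log_2\!\left(1 + \frac{S}{N}\right),$$

where $C$ is the highest rate (in bits per second) at which reliable communication is possible, $B$ is the bandwidth in hertz, and $S/N$ is the ratio of signal power to noise power. No code, however clever, can reliably beat $C$.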

Shannon is to electrical engineering what Newton is to physics.

Anyway, the point of my post is that there is a nice story about why it is called entropy, and about its connection with physical entropy:
Originally Posted by Wiki
My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

* Scientific American, 1971, volume 225, page 180
* Explaining why he named his uncertainty function "entropy".

Von Neumann, you are cunning!