Math Help - Entropy

  1. #1
    ThePerfectHacker
    Global Moderator

    Joined
    Nov 2005
    From
    New York City
    Posts
    10,616
    Thanks
    9

    Entropy

    How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition of entropy, but since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?

  2. #2
    Jhevon
    is up to his old tricks again!
    Joined
    Feb 2007
    From
    New York, USA
    Posts
    11,663
    Thanks
    3
    Quote Originally Posted by ThePerfectHacker
    How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition of entropy, but since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?
    you did alchemy?

    (i don't know the mathematical definition of entropy. i learnt about it in a general chemistry class, and this was basically what was said. it seems too frivolous to be mathematically defined.)

  3. #3
    CaptainBlack
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by ThePerfectHacker
    How is entropy defined mathematically? This is a question I have had since 11th grade. When we were doing Alchemy in high school, I remember the teacher saying that there is a mathematical definition of entropy, but since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?
    There are at least two types of entropy: thermodynamic entropy and Shannon's
    information entropy. You will find the expressions for both on the Wikipedia
    page for entropy, the first under "Microscopic definition of entropy" and
    both under "Entropy and information theory".
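
    In brief (sketching the standard formulas from memory, not quoting that page):
    the microscopic (Boltzmann) definition of thermodynamic entropy is

        S = k_B ln(W),

    where k_B is Boltzmann's constant and W is the number of microstates consistent
    with the given macrostate, while Shannon's information entropy of a discrete
    random variable X taking values with probabilities p_1, ..., p_n is

        H(X) = - sum_i p_i log2(p_i)    (in bits, when the log is base 2).

    A tiny Python illustration of the Shannon formula (a hypothetical snippet, just
    to make the definition concrete):
    Code:
    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum p_i * log2(p_i), skipping p_i = 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin -> 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin -> about 0.47 bits
    Up to the constant k_B and the base of the logarithm, the Gibbs form of the
    thermodynamic entropy, S = -k_B sum_i p_i ln(p_i), has exactly Shannon's shape,
    which is the connection mentioned above.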

    RonL

  4. #4
    colby2152
    GAMMA Mathematics
    Joined
    Nov 2007
    From
    Alexandria, VA
    Posts
    1,172
    Awards
    1
    Quote Originally Posted by CaptainBlack
    There are at least two types of entropy: thermodynamic entropy and Shannon's
    information entropy. You will find the expressions for both on the Wikipedia
    page for entropy, the first under "Microscopic definition of entropy" and
    both under "Entropy and information theory".

    RonL
    I had never heard of Shannon's information entropy... Information entropy - Wikipedia, the free encyclopedia

    Good stuff!

  5. #5
    Isomorphism
    Lord of certain Rings
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by colby2152
    I had never heard of Shannon's information entropy... Information entropy - Wikipedia, the free encyclopedia

    Good stuff!
    Shannon's information entropy is one of the most brilliant constructions in communication theory, and his subsequent channel capacity theorem is very powerful too. When communication engineers design their codes, they know they can never cross what is known as the Shannon limit.
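
    (For the record, and again sketching from memory: for a band-limited channel with
    additive white Gaussian noise, that limit is the Shannon-Hartley capacity

        C = B log2(1 + S/N)    bits per second,

    where B is the bandwidth in hertz and S/N is the signal-to-noise power ratio;
    no code with rate above C can achieve arbitrarily small error probability.)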

    Shannon is to electrical engineering what Newton is to physics.

    Anyway, the point of my post is that there is a nice story about why it is called entropy and about its connection with physical entropy:
    Quote Originally Posted by Wiki
    My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

    * Scientific American, 1971, volume 225, page 180
    * Shannon, explaining why he named his uncertainty function "entropy".

    Von Neumann, you are cunning!
