
Math Help - Converting a Variance calculation to a 1 to 100 scale (need some help)

  1. #1
    Newbie
    Joined
    Jun 2012
    From
    Draper, UT
    Posts
    1

    Question Converting a Variance calculation to a 1 to 100 scale (need some help)

    I need some help figuring out the following math problem (hopefully this is the right place for this question):


    We have a survey that asks respondents to rate certain criteria. These criteria need to be changed at regular intervals based on several factors, one of which is the variance we're seeing for each criterion. To make this process easier I want to create a score based on variance. This score will tell the user at a glance whether the variance is okay or not.


    We will calculate the variance for a criterion based on a scale that can change from one survey to another (survey scale can vary - e.g. 0 to 10, 1 to 10, 0 to 7, etc.) and then I want to convert the calculated variance for that criterion to a 1 to 100 scale. I want the maximum variance possible to be 100 on the scale and the minimum variance (zero) to be 1 on the scale.


    If the thing we are measuring has a small variance, we may not want to continue measuring it, and it would show up as a low number on the 1 to 100 scale. For example, I could set a rule that if the variance score falls below 50, we evaluate that criterion to decide whether we should continue measuring it.


    If the thing we are measuring has a large variance, we do want to continue measuring it, and the variance score would be a large number on the 1 to 100 scale.


    The point of converting the variance to a 1 to 100 scale is so that I can quickly and uniformly communicate whether we have good variance or poor variance, and whether the users need to take action based on what we're seeing.


    Hopefully I've done an adequate job in communicating what I'm trying to do here. Can anyone help me with the math for this problem?


    Thanks for your help!

  2. #2
    mfb
    Junior Member
    Joined
    Jun 2012
    From
    Germany
    Posts
    72
    Thanks
    5

    Re: Converting a Variance calculation to a 1 to 100 scale (need some help)

    The minimal variance is 0, and you get the maximal variance when half of the respondents pick the minimum and half pick the maximum. On a scale from a to b, that maximal variance is approximately (b-a)^2/4.
    You can convert a variance to a score simply by multiplying it by 400/(b-a)^2, which maps zero variance to 0 and the maximum to 100. I would expect most of the values to fall below 50. If necessary, the variance -> score mapping could be modified to produce more values above 50 without changing their order.
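    The mapping above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original posts; it assumes the population variance (divide by n) is used, so that an even 50/50 split between the endpoints hits the theoretical maximum of (b-a)^2/4 exactly. The function name `variance_score` is hypothetical.

    ```python
    import statistics

    def variance_score(ratings, lo, hi):
        """Map the variance of ratings on a [lo, hi] survey scale to 0..100.

        100 corresponds to the maximum possible variance (half the
        responses at lo, half at hi); 0 corresponds to no variance.
        """
        # Population variance; its maximum on a lo..hi scale is (hi - lo)**2 / 4,
        # so multiplying by 400 / (hi - lo)**2 rescales it to 0..100.
        var = statistics.pvariance(ratings)
        return 400 * var / (hi - lo) ** 2

    # Example on a 0-to-10 survey scale:
    print(variance_score([0, 0, 10, 10], 0, 10))  # maximal spread -> 100.0
    print(variance_score([5, 5, 5, 5], 0, 10))    # no spread -> 0.0
    ```

    Note that this maps zero variance to 0 rather than 1; if the score must start at 1 as in the original question, the result can be rescaled to 1 + 99 * var / max_var without changing the ordering.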

