Originally Posted by **stevenjs**

Greetings,

Let me begin by saying I am not a mathematician, nor even a very good arithmetician. However, I am tasked with explaining to a group of humanities faculty how a piece of software calculates grades based on a method alternately described as "weighted average" or "percentage weighting."

To keep this as simple as possible, let me pose the following two examples of this grading method.

A teacher has two components of student assessment: a quiz worth 20% of the final grade and an exam worth 80%. However, he has his own reasons for wishing to assign a point value of 400 to the 20% quiz and 20 to the 80% exam. If a student earns all 400 points on the quiz, he gets a grade of 100, or A+. If he earns all 20 points on the exam, he likewise gets a grade of 100, or A+, and again the same 100, or A+, for the course's final grade.

The reason I use a large point value for the quiz and a small one for the exam is precisely that I do not want the final grade determined by total points earned as a percentage of total possible points, but rather by the percentage weighting assigned to each of the two tests.

Needless to say, a student who gets 370 out of 400 on the quiz and 16 out of 20 on the exam will get a final grade lower than 100, or A+. That grade, however, must exactly equal the final grade he would have received if I had assigned 20 points to the 20% quiz and 80 points to the 80% exam, for a convenient 100-point course total.
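To pin down what the numbers should come out to, here is a small Python sketch of the behavior I am describing (the function is my own illustration of the intended result, not the software's actual code): each component's score is turned into a fraction of its own point total before the weights are applied, so the point totals themselves drop out.

```python
def weighted_grade(components):
    """Final grade on a 0-100 scale from (earned, possible, weight) triples.

    Each score is first converted to a fraction of its own point total,
    then scaled by its percentage weight, so the sizes of the point
    totals never influence the result. Weights are assumed to sum to 1.
    """
    return 100 * sum(weight * earned / possible
                     for earned, possible, weight in components)

# Example one: 370/400 on the 20% quiz, 16/20 on the 80% exam.
grade_a = weighted_grade([(370, 400, 0.20), (16, 20, 0.80)])

# The same performance under the "convenient" 20- and 80-point scheme.
grade_b = weighted_grade([(18.5, 20, 0.20), (64, 80, 0.80)])

print(round(grade_a, 2), round(grade_b, 2))  # 82.5 82.5
```

Both point schemes give the same final grade of 82.5, which is exactly the equality I need the faculty to see.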

Example two: three components, a quiz of 30 points worth 5%, another quiz of 70 points worth 15%, and an exam of 10 points worth the remaining 80%.

Can anyone express this percentage weighting as a formula? I am admittedly algebraically challenged and have filled several sheets of paper trying to figure this out on my own, to no avail.

Any light you can shed on this computation will be greatly appreciated.

regards,

stevenjs

____________________________

"I am but an egg."

--Stranger in a Strange Land