A weighted mean for the following test scores

4/10

8/15

19/31

3/5

2/5

5/5

21/28

72/100

is the sum of the values times their weights divided by the sum of the weights, so

(4*10+8*15+19*31+....+72*100)/(10+15+31+....+100)

which comes to 8587/199 = 43.15 (to two decimal places)
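Just to double-check the arithmetic, here is a quick Python sketch of exactly the sum above (using each test's maximum as its weight, as in my formula):

```python
from fractions import Fraction

# (score, max) for each test; the max doubles as the weight
scores = [(4, 10), (8, 15), (19, 31), (3, 5), (2, 5), (5, 5), (21, 28), (72, 100)]

numer = sum(s * m for s, m in scores)   # sum of value * weight
denom = sum(m for _, m in scores)       # sum of weights
mean = Fraction(numer, denom)           # 8587/199
print(float(mean))                      # ≈ 43.1508
```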

and if the first term changes from 4/10 to 6/10 then the new weighted mean would be

(6*10+8*15+19*31+....+72*100)/(10+15+31+....+100) = 43.25 to two decimal places.

The change between the two is therefore about 0.1 (exactly 20/199 ≈ 0.1005).
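The same sketch, repeated with the first score raised from 4 to 6 (nothing here beyond restating the two sums above):

```python
from fractions import Fraction

weights = [10, 15, 31, 5, 5, 5, 28, 100]
old_scores = [4, 8, 19, 3, 2, 5, 21, 72]
new_scores = [6, 8, 19, 3, 2, 5, 21, 72]   # first test raised from 4/10 to 6/10

def wmean(scores, weights):
    """Weighted mean: sum of value*weight over sum of weights, as an exact fraction."""
    return Fraction(sum(s * w for s, w in zip(scores, weights)), sum(weights))

old = wmean(old_scores, weights)   # 8587/199
new = wmean(new_scores, weights)   # 8607/199
print(float(new - old))            # 20/199 ≈ 0.1005
```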

The question I'm trying to answer is how much of a change will there be in the final result? My options are

"2 percent", "2 percentage points", "0.6 percent" and "0.6 percentage points"

I'm not sure what the difference is between "percentage points" and "percent". I also can't figure out a way to get any of the above answers (2 percent, 2 percentage points, etc.).

If I set all the fractions equal to 1 (i.e. 100 percent on every test) the weighted mean comes out at just over 61, which means the weighted mean is itself not a percentage. If I convert the scores to percentages of that maximum I get 43.15/61.03, which is 70.70 percent (71 if rounded to the nearest percent), and 43.25/61.03, which is 70.87 percent (ALSO 71 if rounded to the nearest percent). The percent difference between those two is about 0.23 percent according to what I have.

If I take 43.15 and scale it up by 2 percent I get 43.15*(1.02) = 44.01, so that's not right, but if I scale up 43.15 by 0.6 percent I get 43.15*1.006 = 43.41, which is also wrong. The score has actually been scaled by a factor of about 1.00233, because 1.00233*43.15 ≈ 43.25.
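For what it's worth, the two readings I've tried can be computed side by side (plain floats, using the "every score 100 percent" maximum of just over 61 from above):

```python
old, new = 8587 / 199, 8607 / 199   # the two weighted means
top = 12145 / 199                   # the mean when every score is 100 percent

# reading 1: absolute difference between the two results expressed
# as percentages of the maximum possible mean
old_pct = 100 * old / top           # ≈ 70.70
new_pct = 100 * new / top           # ≈ 70.87
print(new_pct - old_pct)            # ≈ 0.165

# reading 2: relative change of the mean itself, in percent
print(100 * (new - old) / old)      # ≈ 0.233
```

Neither reading comes out at 2 or 0.6, which is what has me stuck.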

I must have misunderstood something here. Can anyone help? Is my weighted mean of 43.15 wrong, or is it the percentage change that I'm doing wrong?

thanks