Measuring Cup Percentage Accuracy
I don't want this to turn into a debate about volume vs. weight measurements, thank you. It is strictly a math problem.
I have a measuring cup which is marked "240 ml". We know that 1 ml of water weighs 1 gram. I have a Jennings CJ4000 scale which I like. We also know that 1 U.S. nickel weighs 5 grams. On my scale, 20 U.S. nickels weigh 100 grams, so I'm satisfied that my scale is accurate.
I take my 240 ml measuring cup and fill it to the mark with water. I weigh the water on the scale and it weighs 232 grams. My question is: how do I state the error as a percentage?
Do I use the actual measurement as the numerator or the denominator of a fraction, i.e. 232 / 240 (the actual weight over the weight it is supposed to be), or should it be 240 / 232, with the measured weight as the denominator?
232 / 240 gives 0.96667. I subtract this number from 1 and get 0.03333. Multiplying by 100 gives 3.333, so I have a 3.33% error.
240 / 232 gives 1.03448. I subtract 1 and multiply by 100, for 3.448, or about a 3.45% error.
My question is, which is the correct way to do this? Do I have a 3.33% error or a 3.45% error?
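To make the two candidate calculations concrete, here is a small Python sketch of both formulas exactly as described above (the variable names `nominal` and `measured` are my own labels):

```python
nominal = 240.0   # marked volume in ml, i.e. the "supposed to be" weight in grams
measured = 232.0  # weight of the water actually observed on the scale

# Candidate 1: error relative to the nominal (marked) value
error_vs_nominal = (1 - measured / nominal) * 100
print(f"{error_vs_nominal:.2f}%")   # 3.33%

# Candidate 2: error relative to the measured value
error_vs_measured = (nominal / measured - 1) * 100
print(f"{error_vs_measured:.2f}%")  # 3.45%
```

The two results differ because each divides the same 8-gram shortfall by a different reference quantity.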
Now suppose the measuring cup held 120 grams of water. 120/240 gives 0.5. 1 minus 0.5 gives 0.5. Multiplying by 100 gives a 50% error.
240/120 = 2. Subtract 1 and multiply by 100 and we have a 100% error, which can't be right.
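The same sketch applied to the hypothetical 120-gram fill shows how far the two conventions diverge when the discrepancy is large:

```python
nominal = 240.0   # marked volume in ml / grams of water
measured = 120.0  # hypothetical half-full result

print((1 - measured / nominal) * 100)  # 50.0  -> a 50% error
print((nominal / measured - 1) * 100)  # 100.0 -> a 100% error
```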
So it seems the correct way to do this is: actual measurement over the "supposed to be" measurement, i.e. the nominal value goes in the denominator. What do we think?