April 8, 2009 - 1:33pm

## Measuring small amounts and spoon scales

How accurate do you think inexpensive Salter-type digital scales are at measuring amounts between 2 and 14 grams? Does anyone use a digital spoon scale? I'm trying to figure out if I ought to buy one of those spoon scales or if my Salter, which supposedly weighs down to increments of 1 gram, is good enough.

--Pamela

I don't know this particular scale, but what I am going to say applies to all.

The first thing to understand is what "accurate to 1 gram" actually means: does the scale display weights in 1g increments but measure more accurately (simply rounding to the nearest gram), or does it have a measurement error of up to 1g? I have yet to see a manufacturer that specifies which it is.

Let's assume (the best possible outcome) that it is accurate to 0.5g, in other words, that it is rounding to the nearest gram. That means that when you see 1g displayed, the actual weight might be anywhere between 0.5 and 1.5g. The relative error corresponding to that is 50% - 33%. If you see 100g, however, the relative error will only be 0.5% - 0.49%. In other words, the accuracy of this type of display improves as the amount being weighed increases.
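That rounding arithmetic can be sketched in a few lines of Python (the function name and sample weights here are mine, not the poster's, and assume a scale that honestly rounds to the nearest increment):

```python
def relative_error_range(displayed, resolution=1.0):
    """Relative error bounds for a scale that rounds to `resolution` grams.

    The display can be off by at most half an increment, so the worst case
    is half/displayed (relative to the displayed value), and the best case
    is half/(displayed + half) (relative to the largest possible true weight).
    """
    half = resolution / 2.0
    return half / displayed, half / (displayed + half)

for d in (1, 2, 14, 100):
    worst, best = relative_error_range(d)
    print(f"{d:>3} g displayed: relative error {worst:.1%} down to {best:.1%}")
```

Running this reproduces the figures above: 50% - 33% at 1g, but only about 0.5% at 100g.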

The second question is the quality and consistency of the sensor itself. All digital scales use one or more sensors. Some sensors are affected by ambient conditions (temperature, moisture), and some change over time. These introduce errors into the measurement. Some of these errors introduce an "offset" (a constant-size error); others may be multiplicative, meaning the error grows or shrinks with the amount weighed. Some self-zeroing scales automatically compensate for the offset. Finally, a sensor's response is often not quite linear across the whole range, which causes errors of different sizes across the usable range.

If you want to know how your scale behaves, you will need to get a set of certified weights and weigh them, in different combinations, and possibly in different conditions, to create a graph. Personally, for my baking, I don't care, as the error rapidly diminishes for amounts over 10g. Where it sometimes matters is for salt and yeast when making small doughs. An error in yeast is not fatal; it shortens or lengthens the desired proofing time (with slightly less flavor if shortened), something I can usually afford to deal with. If you accidentally double the salt, proofing will also be retarded, and the taste might be a little salty. Still, I doubt the product will be inedible.

As to your original question, and with the assumptions above, a 1g readout scale with assumed accurate measurements across the 2-14g range would have errors ranging from 25% (at 2g) to 3.6% (at 14g). If we assume some additional inaccuracies due to the other factors described, this could easily be 40% around the 2g measurement.

Now here is a trick I use to improve my odds. Say I need 2g of salt. I definitely do not want too much; a little too little is OK. So, zero (TARE) the scale and start adding salt slowly, watching for the display to flip from 0 to 1. If you did this slowly, at that moment you are likely at only about 0.5g. Eyeball that amount. Keep going and watch for the display to flip from 1 to 2 (which happens at about 1.5g actual). If you stop here, you likely have 1.5g and your error will be -25%. Now add 50% to 100% of the eyeballed amount. You will now have between 1.75g and 2g, with a maximum error of 12.5%. Because the error is negative, your bread will not be too salty! If the amount needed ends closer to .5, you will be even more accurate. As you do this more often you get better at it, and your typical error might be 10% or less.

One caveat: some scales' self-zeroing circuitry is active all the time (as opposed to just after power-on), and slowly adding an ingredient will cause the scale to keep zeroing itself. To prevent that, place another weight on the scale while you do this and remove it when done (and look at the relative change in the display rather than the absolute value).
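Under the assumptions in that trick (display flips at the half-gram boundaries, eyeballed half-increment), the resulting range and worst-case error work out like this; this is my sketch of the arithmetic, not something from the original post:

```python
def flip_trick(target=2.0, resolution=1.0):
    """Final weight range and worst-case error for the 'watch the flip' trick."""
    at_flip = target - resolution / 2.0   # actual weight when display flips to target
    half = resolution / 2.0               # the eyeballed first half-increment
    low = at_flip + 0.5 * half            # after adding 50% of the eyeballed amount
    high = at_flip + 1.0 * half           # after adding 100% of it
    worst = (target - low) / target       # the error is always on the low side
    return low, high, worst

low, high, worst = flip_trick()
print(f"you end up with {low:.2f}-{high:.2f} g; worst error -{worst:.1%}")
```

For a 2g target this gives 1.75-2.00g, i.e. at worst -12.5%, matching the numbers above.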

Another trick is to measure 8g instead of 2, visually divide it into 4 portions, use 1 portion and throw the other 3 back. This is often more accurate than trying to measure the 2g directly.
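The gain from that trick is easy to quantify (a sketch under the same half-gram display-error assumption as above; it ignores the small extra error from eyeballing the split):

```python
def scale_up_and_divide(target=2.0, multiple=4, display_error=0.5):
    """Range of one portion after weighing `multiple`x the target amount
    and splitting the pile evenly into `multiple` portions."""
    measured = target * multiple
    return (measured - display_error) / multiple, (measured + display_error) / multiple

low, high = scale_up_and_divide()
print(f"each 2 g portion is really {low:.3f}-{high:.3f} g, "
      f"versus 1.5-2.5 g when weighing 2 g directly")
```

The half-gram display uncertainty gets divided by four along with the pile, so the 2g portion lands between 1.875 and 2.125g.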

If this is too much work for you, consider getting a second, more precise scale that is accurate at smaller increments, and use that for the small-amount ingredients. Be aware that, since you cannot measure your flour etc. with those scales, you end up transferring the salt/yeast etc. from a smaller measuring container to the larger one. Depending on moisture level etc., a small amount of the ingredient may stay behind in the smaller container, negating some or all of your efforts.

Overall, don't sweat it. Consider a typical 1.5lb (681g) loaf with a 2% ingredient. That 2% will be 13.62g, and your typical error will be less than 5%. Make a double batch of dough and the error is less than 2.5%. This is why professional bakers never worry about this: their scales are accurate to 1 or 5g, but they measure large amounts. The same 2% yeast in a 30lb French bread dough will be about 156g, so with a 1g scale the error will be 0.3%; even with a 5g-accurate scale, the error will be around 1.6%. To them accuracy is important because it affects their schedule, but a 2% error in yeast will have a negligible effect on it.
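Those batch-size figures follow from the same worst-case formula; this little check (mine, not the poster's) reproduces them:

```python
def worst_relative_error(amount_g, resolution_g):
    """Worst-case relative error when the display rounds to resolution_g grams."""
    return (resolution_g / 2.0) / amount_g

print(f"home loaf, 13.62 g on a 1 g scale: {worst_relative_error(13.62, 1):.1%}")
print(f"30 lb batch, 156 g on a 1 g scale: {worst_relative_error(156, 1):.2%}")
print(f"30 lb batch, 156 g on a 5 g scale: {worst_relative_error(156, 5):.1%}")
```

The home loaf's 2% ingredient comes out under 4% error, and the professional batch under 2% even on a coarse 5g scale.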

I am adding things like salt, yeast, baking powder, etc. slowly and do watch the scale flip to the next increment. What you say about relative error makes perfect sense. I took statistics but just didn't think about the error in terms of total weight. But I was also worried about adding rapidly vs. slowly; I could do a test on salt and see if I get the same amount regardless of whether it is added slowly or rapidly. Mostly I'm not worried about things like salt or yeast, but stuff like baking powder and soda esp. when a recipe calls for something like 2 grams. Of course I could use a measuring spoon, but if I really wanted to know, what about these scale spoons? I think they are more popular in Europe, but just catching on here. I suppose I could look to see if they say anything about accuracy.

I could also get a weight set and an old double beam scale--my dad had one of those when I was growing up.

Thanks a lot for your reply; it really gives me a lot of information.

--Pamela

I have never used them because, as I wrote, I don't care enough. The engineer in me says that they probably only work accurately when held very close to horizontal, though. The other comment I would make is like the one above about residue remaining on the spoon when you drop the measured amount into the mixing bowl. Moisture can easily cause that, and if you really desire to be that accurate, maybe you care.

I don't see why you would care more about the accuracy of baking soda/powder than of yeast or salt. Not enough means a dense product; a little too much probably doesn't hurt. Remember that even for 2g in a 500g batter for a quick bread, a 0.5g error could leave you with 1.5 or 2.5g. Since you don't want to err on the low side, go to the "flip" and add a little more. I still believe your error will be very small, and if you push it to the "too much" side, you will not get a dense product and the excess will not be enough to taste.

If I were you, I would not bother with the expensive spoon gadget and would just get a more accurate scale. I'd get something like this: http://www.oldwillknottscales.com/my-weigh-ibalance-300.aspx. It looks similar to the 7000DX that I use for all my other baking. Both are also useful for counting pills, weighing envelopes for postal charges, etc. The i300 weighs up to 300g, so put your container (mixing bowl) on it, TARE, add the small ingredients with precision, then move to the 7000DX (or 7001DX) and add the big ingredients. Done, with precision.

Looks like it would be a lot simpler and more accurate to use teaspoons for these small amounts. Jean P. (VA)

Yes, I could get a more accurate scale, and probably will at some point. But not because it would make my baking better--like you say, the amount of error is probably not important. Such a purchase would be purely made to satisfy something in my nature that desires absurd accuracy. I know it is ridiculous, but I just can't help it.

I always tare everything too.

Another factor concerns recipe testing. If I'm testing a recipe, then I want to make sure I'm testing it accurately. But again, I agree, I'm being a bit ridiculous.

Maybe some of the European TLFers would chime in and let me know if they use spoon scales.

--Pamela

Hi Pamela,

The easiest way to handle small amounts is by measuring much larger amounts with your scale, then dividing. For instance, weigh 1 cup of salt in grams, then divide by 48 (there are 48 teaspoons in a cup) and now you know how much 1 tsp of salt weighs. Do the same with sugar, baking soda, or whatever, and you can make yourself a chart or engrave it on your teaspoon or tablespoon.

As you know, your scale is proportionately more accurate at a larger weight, so you'd be pretty darn close with your teaspoon measurement even if your cup measurement is 5 or 10g off.

After that, you don't need to use your scale for small stuff since it's easier to measure that in volume anyway.

-Mark
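Mark's cup-to-teaspoon division also divides the scale's error by 48, which is why it works so well; here is a quick sketch of that (the 292g cup of salt and the 10g error are made-up example numbers, not measurements from the thread):

```python
TSP_PER_CUP = 48  # US customary: 16 tbsp per cup x 3 tsp per tbsp

def per_teaspoon(cup_weight_g, cup_error_g):
    """Per-teaspoon weight, and error, derived from one cup-sized weighing."""
    return cup_weight_g / TSP_PER_CUP, cup_error_g / TSP_PER_CUP

# Hypothetical numbers: a cup of salt read as 292 g, off by as much as 10 g.
w, e = per_teaspoon(292, 10)
print(f"1 tsp ~= {w:.2f} g, give or take {e:.2f} g")
```

Even a 10g error on the whole cup shrinks to about 0.2g per teaspoon, as Mark says.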

Thanks, Mark. I see that is probably the way to go. I'll perform some experiments so I can be more certain of what I'm getting.

--Pamela

For many commonly used ingredients, the necessary information to convert weight to volume and vice versa is available as part of my Dough Calculator Spreadsheet. Much of the required data came from the USDA, but some was derived in exactly the manner described by MCS.

Thanks for putting this link out again. I'll take a look at what it can do for me.

--Pamela

If I were going to purchase a scale to weigh out very light ingredients for baking, I would not be looking at electronic digital scales (postage scales?), whether run by batteries or by a DC transformer that plugs into a wall outlet and connects to the scale by a cord. Instead I would make the one-time investment in the Ohaus Model 310-00 Dial-O-Gram scale. It is fully mechanical, never needs batteries or an electrical outlet, has a 310g maximum capacity, is very accurate, and most importantly it measures down to 0.01g, or 1/100 of a gram.

http://www.amazingscales.com/ohaus_dial_o_gram_scale.asp

Shop around for the best price. I just used this site because it had the best image of the scale.

Bruce

That's a very cool scale, Bruce. Only problem is where I'd put it!

--Pamela

The 310 model looks like the type of scale I used in high school. I'll bet getting a used one would be cheap.

Here is one on ebay for $10. I want one now! It's even nearby!

Ohaus 310 on ebay

Thanks for the link. --Pamela

I've never found a conversion chart that gave the weight of a smidgen accurately. Stay tuned! ;-)

David

David, I bet it is 1 gram! --Pamela

I have spoons that are actually labeled "dash", "pinch" and I think "smidgen." They came with my more standard set, and I don't think I've ever used them.

Home cheesemakers use these smaller measuring spoons to measure out cultures. Ricki Carroll sells them on her website http://www.cheesemaking.com

The weight of a smidgen depends on what you put in it. Lead (not that we would use that for any food) would weigh many times more than yeast. A smidgen (and its brethren) are volume measures, most commonly defined as fractions of a teaspoon (a smidgen is usually taken as 1/32 tsp).

Using MCS's approach to determine the weight equivalent of, say, 1 cup, you can then apply the appropriate divisions to come up with weights for a dash, smidgen, and pinch. BTW, there are measuring spoons available for these amounts.

...that a smidgen was a volume measurement and not a weight measurement. It sounds so imprecise. Actually, if it's 1/32 of a tsp, then it's very precise since there would be 32x48=1536 of them in a cup.

-Mark

Whether a scale has an accuracy of .1% or 1% really doesn't matter that much. Remember that the error is not unique to a specific ingredient; every ingredient weighed on the same scale should have the same error factor. So the ratio of error between ingredients should be similar enough to have no real adverse effect on the final result. If you're truly worried about accuracy, remember to weigh all your ingredients on the same scale. If you're REALLY concerned, try a digital reloading scale:

http://www.ableammo.com/catalog/product_info.php?products_id=109320

or a dampened beam scale:

http://www.midwayusa.com/eproductpage.exe/showproduct?saleitemid=605320&t=11082005

An error would not be specific to the ingredient. Thanks for mentioning that.

--Pamela

As I mentioned before, the relative error is affected by the quality of the scale, the environment, and the amount being weighed. The latter matters because we are talking about relative error.

The argument that, because we use the same scale, all errors are the same for all ingredients is only true for the absolute error. Yes, we might have a 0.5g error in yeast as well as in flour. In a typical recipe where the amount of flour used is about 50x the amount of yeast, that makes the relative error about 50x as large for the yeast as for the flour. If you need 0.5g of yeast for a biga and you end up putting in 1 or 1.5g, you'd be amazed at the difference in the result. For flour, 500 or 501g makes no real difference.

Yes, with an error factor of .5g per gram, your calculations work. But such an assumption for error results in an exponential result. Most of the scales I've used include a linear error factor. The degree of error is therefore expressed as a percentage of any total. That means an error of 5% +/- at one end of the spectrum translates to an error of 5% +/- at the other end. So a gram of yeast would calculate within a range of .95 grams to 1.05 grams. 500 grams of flour would calculate between 475 grams and 525 grams. Those are not significant differences in any but the most stringent laboratory standards.

That's incorrect for two reasons. First, the error can be not only relative but also absolute; that is, every measurement could be off by some constant value. That would have a larger effect on smaller weights. Second, a small relative error can result in a large difference in the displayed value. Say you have put exactly 2.50g of salt on a scale that cannot show decimals. Then a 1% relative error can make the displayed value 2g or 3g, a much larger difference.

The terms absolute error and relative error can coexist just fine. Absolute error in a measurement refers to the difference between the actual (known) value and the measured value. Relative error divides the absolute error by either the known value or the indicated value to get a percentage. There is also a separate set of these for the displayed value. Digital scales (unlike their analog counterparts) then have to turn their measured value into a display, and thus there is a possible interaction between measurement error and display error. Not every measurement error, in every instance, results in an error in the displayed value.

If a measurement is off by some constant the absolute error would be the same, no matter how much you are weighing, but since relative error requires dividing that by the amount weighed, relative error goes down as you weigh more.

For most scales we can buy we don't know the measurement error because it is not published. What we do know is the display error, so it is most meaningful to work with that.

The example you give is true, and would be even for a relative error of 0.000001% (or any non-zero value). It is a contrived scenario, though, because any scale would be equally bad at it. What is of interest to the user is the probability that the indicated value is wrong. If you put on a random weight known to be between 2 and 3 grams (average 2.5), and the relative error is 1%, only weights between 2.5±0.025 might give the wrong value; all the rest would read correctly. The probability of a correct result is thus (1-2*0.025) / 1 = 95%, and thus 5% for a wrong result. And this is the same for measuring between 200 and 201 grams.

The significance of the error is better expressed with the displayed value error. In your example the displayed value is off by 0.5 and that represents 0.5 / 2.5 = 20%. This is the number, as I explained, that goes down as the amount to be weighed (displayed) goes up. For 200.5g it would be 0.5 / 200.5 = 0.25%.

I don't think you read carefully what I wrote, which explains why you think it to be meaningless. I never said that absolute error cannot occur simultaneously with relative error, which is, by the way, always calculated based on the real value, not the measured one. Moreover, one simply can't exist without the other, since they are just different representations of the same thing. It was wrong of me to use the word "relative" in the sense I did, and had I known you'd decide to get on my case I wouldn't have done it. And yes, as I mentioned originally, relative error will be larger for smaller weights.

As far as my example goes, I think it is you who are wrong. First, you take completely different things, errors based on the precision of the scale and the accuracy of the measurement, and lump them together with proportional systematic error, that is, every measurement being off by some constant factor. If you wish, for the sake of conversation, we can call either one "relative error", even though the second one really isn't, but only one, not both. It goes without saying, of course, that absolute error is different from constant systematic error, that is, offset.

So let's consider another contrived but not impossible situation. There is 2.45g on the scale. Will there be a difference in the displayed value if the proportional error is +2% vs. +3%? You bet it will: in one case the scale will show 2g, in the other 3g. The relative error will be very high and fairly similar (0.18 and 0.22), but the important part is that the measured value can be both smaller and larger than the real value, and depending on the ingredient being weighed and the actions of the user, this can lead to not insignificant differences. For example, if we're talking about 2.45g of yeast when the recipe calls for 3g, the user may either see the desired value and stop, or feel compelled to add more yeast, in which case as much as 0.9g could be added.

Now, on the other hand, if we have 245g, the proportional error would still be +2% or +3%, but because the absolute error is now larger than the precision of the scale, it will be the same in sign, which will greatly reduce the difference in corrective action. Of course, this is pure speculation since 1) household digital scales are much more precise than 2%, I checked, and 2) even expensive top-loading balances that can go to three decimal points are notoriously inaccurate where such small amounts are concerned, and are hardly ever used for such purposes in a professional setting. I could go on, but frankly I find the topic boring and only got riled up because you implied that I am ignorant.

Never meant to imply that, so I apologize. I never stated that you are ignorant, merely that I have a different opinion as to the practical value of your argument. If disagreeing means the other party is ignorant, you have just declared me ignorant. You also accuse me of careless reading, while at the same time admitting that you used the word "relative" incorrectly, which is precisely why I disagreed with you in the first place.

I think you are overreacting, making unjustified assumptions about me and my intentions. Clearly our contribution to this thread is diminishing in value to all concerned, and I think it best if we close this topic.

The last few times I made a mix that called for 2/10 of a gram of yeast or some other small amount, I used a method taught to me by Mike Avery. (By the way, where IS that guy?)

I start with a small pharmacy pill vial and measure out 10 grams of flour. To this I add 1 gram of yeast. Mix well and keep in the freezer.

When you need .2 grams of yeast, simply weigh out 2 grams of the flour/yeast mixture and it will be close enough for all purposes. The additional flour can be compensated for if you are neurotic.
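The arithmetic behind that pre-mix is worth a quick sketch (mine, not Mike Avery's): since the mix is 1 part yeast in 11 parts total, 2g of mix actually carries a bit less than 0.2g of yeast, which is what the "close enough" above refers to.

```python
def yeast_delivered(mix_g, flour_g=10.0, yeast_g=1.0):
    """Grams of yeast in `mix_g` of a pre-mix made from
    flour_g of flour blended with yeast_g of yeast."""
    return mix_g * yeast_g / (flour_g + yeast_g)

y = yeast_delivered(2.0)
print(f"2 g of mix carries {y:.3f} g yeast plus {2.0 - y:.3f} g extra flour")
print(f"for exactly 0.2 g yeast, weigh out {0.2 * 11:.1f} g of mix")
```

Weighing 2.2g of mix instead of 2g would hit the 0.2g target exactly, though the difference hardly matters in practice.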

This doesn't really speak to Pamela's question about accuracy between 2 and 14 grams. I do know that adding small amounts, trying to creep up on a number, will sometimes result in adding more than you think you are measuring. Most people think of digital instruments as accurate because they see exactly what the number is. The truth, however, is that an analog clock/scale tells you more, since the needle shows exactly where the time/weight is between increments, with no rounding up or down.

The problem with the balance scales above mentioned is that they generally can't handle a 1000 gram scoop of flour. You need to find an old hardware store scale used for weighing nails. I'll bet that would work just fine.

Eric

My preferred method of weighing small amounts is to measure a large amount, but in a way exactly opposite to the one suggested by Mark. What I do is put an empty bowl on the scale, tare, and start adding an ingredient using a measuring spoon, for example adding dry yeast with a quarter-teaspoon. When I have measured 10 or 20 spoons, I write down the weight, empty the bowl, tare, and repeat. Then I can divide that value by 10 (or whatever) and get a much more precise weight for one unit. Of course, this requires keeping records.
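This averaging method divides the scale's display error by the number of spoonfuls; a quick sketch (the 20-spoonful, 21g run is a made-up example, not a measurement from the thread):

```python
def per_unit(total_g, units):
    """Average weight of one spoonful, from a bulk weighing of many."""
    return total_g / units

# Hypothetical run: 20 quarter-teaspoons of dry yeast reading 21 g total.
print(f"one 1/4 tsp ~= {per_unit(21, 20):.3f} g")
print(f"a 0.5 g display error on the total is only "
      f"{per_unit(0.5, 20):.4f} g per spoonful")
```

A half-gram uncertainty on the 21g total shrinks to a few hundredths of a gram per spoonful.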

Pamela,

I posted this once before, but it seems relevant to your question. According to the US Mint, coins have the following weights --

Penny: 2.500 grams

Nickel: 5.000 g

Dime: 2.268 g

Quarter: 5.670 g.

So you can check your scale's accuracy by weighing coins. Put a nickel on your scale, and see if it registers 5 grams. Add a dime, and see if it goes up to 7, but not 8.

Also try it with a bowl on the scale, because sometimes scales are more accurate at heavier weights. Add the bowl, tare, and then add coins.

Mare
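Mare's coin check can be generalized to any pile of coins; this sketch (mine, assuming the scale rounds honestly to its resolution) predicts what a 1g scale should display:

```python
# US Mint weights for circulating coins, in grams.
US_COIN_G = {"penny": 2.500, "nickel": 5.000, "dime": 2.268, "quarter": 5.670}

def expected_reading(coins, resolution_g=1.0):
    """True weight of a pile of coins, and what a correctly rounding
    scale with the given resolution should display for it."""
    true = sum(US_COIN_G[c] for c in coins)
    return true, round(true / resolution_g) * resolution_g

true, shown = expected_reading(["nickel", "dime"])
print(f"nickel + dime: truly {true:.3f} g, a 1 g scale should show {shown:.0f} g")
```

The nickel-plus-dime pile is really 7.268g, so a healthy 1g scale should show 7, not 8, exactly as Mare describes.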

Thanks, Mare, for this information. I'll check it out on my scale. This is great.

--Pamela

My scale's accurate according to the coin weight. I just finished weighing the coins. Thanks again Mare for such a simple solution.

--Pamela

I just weighed a new penny on a laboratory balance. I got 2.48 grams. My balance will resolve .01 grams. If I want less than that, I resort to the addict's method with a glass plate and a razor blade.

All this back and forth has me wondering again; which is heavier, a pound of feathers or a pound of lead?

Sorry, I just HAD to do it!

Hi All, I read many threads whilst sitting here trying to improve my knowledge of all things bread, and I have to say, generally I find this site extremely good, for which I am grateful. However, this thread leaves me baffled; it's a simple loaf, not the next Apollo moonshot. Of course accuracy helps maintain consistency, but if we take this to its ultimate conclusion, we won't have time to bake the loaf, let alone eat it. Think back to the times of yore: digital scales were not invented, or for that matter even needed. Nuff said.

kindest regards

Patrick

From Sunny Sussex

Generally I have found the precise weight is needed only with gunpowder and poisons. Otherwise a pinch and a dab do rather nicely.