How to Accurately Weigh Ingredients


Weighing is essential for food processing to ensure ingredients meet recipe specifications and quality requirements. There are several ways to reduce errors during the weighing process, such as comparing an unknown substance to a precision mass standard and using the weighing-by-difference method.
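The weighing-by-difference method mentioned above can be sketched in a few lines: weigh the container, add the sample, weigh again, and take the difference so any constant container or tare error cancels. This is a minimal illustration; the readings are made-up example values, not output from a real balance.

```python
def mass_by_difference(container_g: float, container_plus_sample_g: float) -> float:
    """Return the sample mass as the difference of two balance readings.

    Any constant offset (tare error, container mass) appears in both
    readings and cancels in the subtraction.
    """
    return container_plus_sample_g - container_g


# Illustrative readings: empty container, then container plus sample.
mass = mass_by_difference(container_g=52.314, container_plus_sample_g=77.559)
print(f"sample mass: {mass:.3f} g")  # 25.245 g
```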

Analytical balances should be exercised before use by repeatedly placing a weight roughly equal to the expected load on the pan and removing it. This helps stabilize the instrument and improves repeatability.

Measurement of Mass

One method of measurement involves comparing or substituting the unknown object with a mass standard of known calibration value. This method (also known as comparison or substitution weighing) eliminates the error caused by built-in weights, reduces disturbances during the measurement, and requires no adjustment of dial settings. However, environmental factors such as differences in air density, temperature, and evaporation of water can still cause significant errors.
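The substitution idea can be shown numerically: the unknown is replaced on the pan by calibrated standards until the balance indicates (nearly) the same reading, so errors in the balance's built-in weights cancel. A minimal sketch, with made-up standard and residual values:

```python
def substitution_mass(standards_g: list[float], residual_reading_g: float) -> float:
    """Mass of the unknown = sum of the substituted calibration standards
    plus the small residual difference the balance still indicates."""
    return sum(standards_g) + residual_reading_g


# Unknown balanced by 100 g + 20 g + 5 g standards, with a 0.012 g residual:
print(substitution_mass([100.0, 20.0, 5.0], 0.012))  # 125.012
```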

Another method is to estimate mass indirectly by displacement: submerge the sample, measure the volume of water it displaces, and multiply that volume by the sample's known density to obtain an approximate mass. Unfortunately, this method is prone to errors from factors such as temperature-dependent water density, trapped air bubbles, surface-tension effects at the meniscus, and water absorption by porous samples.
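A short sketch of the displacement estimate described above: displaced volume multiplied by a known sample density gives an approximate mass. The density value here is an assumed example, not a measured property.

```python
def mass_from_displacement(displaced_ml: float, sample_density_g_per_ml: float) -> float:
    """Approximate mass = displaced volume x sample density,
    assuming the sample is fully submerged and does not absorb water."""
    return displaced_ml * sample_density_g_per_ml


# 24.0 mL displaced by a sample with an assumed density of 1.05 g/mL:
print(mass_from_displacement(24.0, 1.05))  # 25.2 g
```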

Measurement of Weight

Weighing involves determining the force exerted by gravity on an object and is measured by a balance. A balance uses several load cells that support (or suspend) a weigh vessel or platform. Each load cell sends an electrical signal proportional to the weight of the material on it to a junction box, where the signals are summed and sent via a single cable to a weight controller that converts them to a weight reading.
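The junction-box summation above can be sketched as follows. Each cell's signal (simplified here to millivolts) is summed and scaled by one calibration factor to produce the weight reading; the signal values and the `kg_per_mv` factor are illustrative assumptions, not real calibration data.

```python
def total_weight_kg(cell_signals_mv: list[float], kg_per_mv: float) -> float:
    """Sum the individual load-cell signals and convert the total
    to a single weight reading, as a junction box + controller would."""
    return sum(cell_signals_mv) * kg_per_mv


# Four cells under a vessel, each carrying part of the load:
signals = [2.51, 2.48, 2.53, 2.48]               # mV from each cell
print(total_weight_kg(signals, kg_per_mv=50.0))  # 500.0 kg
```

Summing before conversion is why the load cells must carry all of the weight: any load taken by rigid piping never reaches the signals being summed.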

Vibration from process equipment and ambient noise may affect weighing systems. To help compensate for these effects, a weighing system should be located in a draft-free location on a solid bench, and it should have built-in calibration weights to periodically maintain accuracy.

Weighing by comparison to a known calibration standard, also called substitution or comparison weighing, is the most accurate way to calibrate a balance. It reduces the errors caused by a drifting dial setting and allows faster, more accurate measurement of small loads that are otherwise difficult to read precisely.

Measurement of Volume

Volume is the amount of three-dimensional space occupied by a shape or object. Unlike area, which is measured in square units, volume is expressed in cubic units. The SI unit for volume is the cubic meter (m³), a coherent derived unit of the International System of Units (SI); the liter (one cubic decimeter) and the milliliter (one cubic centimeter) are non-SI units accepted for use with the SI.
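The unit relationships can be made concrete with a small conversion sketch (1 m³ = 1000 L, 1 L = 1000 mL):

```python
CUBIC_M_TO_LITERS = 1000.0  # 1 m^3 = 1000 dm^3 = 1000 L
LITERS_TO_ML = 1000.0       # 1 L = 1000 cm^3 = 1000 mL


def cubic_meters_to_ml(volume_m3: float) -> float:
    """Convert a volume in cubic meters to milliliters."""
    return volume_m3 * CUBIC_M_TO_LITERS * LITERS_TO_ML


print(cubic_meters_to_ml(0.002))  # 2000.0 mL
```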

For irregularly shaped products, such as bread or confectionery, it is possible to measure volume without immersing them in liquid. This is achieved using the seed displacement method. For liquids, such as water, the graduated cylinder technique can be used to measure volume directly.
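The seed displacement calculation reduces to a subtraction: a loaf's volume equals the seed volume it displaces from a container of known capacity. A minimal sketch with illustrative numbers:

```python
def loaf_volume_ml(container_ml: float, seed_with_loaf_ml: float) -> float:
    """Volume displaced by the loaf = container capacity minus the
    volume of seed that still fits alongside the loaf."""
    return container_ml - seed_with_loaf_ml


# 4000 mL container; only 2350 mL of seed fits once the loaf is inside:
print(loaf_volume_ml(container_ml=4000.0, seed_with_loaf_ml=2350.0))  # 1650.0 mL
```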

For level or inventory measurements, weighing technology is the most accurate way to go. Ingredients that foam or settle unevenly, stratified layers with different dielectric constants, poor reflectivity, vessel shape and size, and bridging and rat-holing can all affect measurement accuracy, but weight is unaffected by these factors.

Measurement of Temperature

Measurement of temperature during the weighing process helps ensure proper and consistent handling. For accurate weighing, the system’s load cells must support all of the weight that needs to be measured. Rigid conduit connections or piping on a weigh vessel can support only part of the total load, degrading the accuracy of the weighing system.

To avoid shock loading that can damage the weighing system, control material flow onto the scale with a feeder or other device. Also, install the weighing system in an area with low levels of air currents and vibration.

To read a thermometer, look for the numbers and the scale of black lines, with each long line representing one degree of temperature. The four shorter lines between each pair of long lines divide the degree into 0.2-degree steps. The scale of black lines also shows the range of temperatures the instrument is capable of measuring. The thermometer should be calibrated before use.
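Reading such a scale can be expressed as simple arithmetic: the last whole-degree line plus 0.2 degrees for each short line the liquid column has passed. A small sketch with an illustrative reading:

```python
def thermometer_reading(last_long_line_deg: float, short_lines_past: int) -> float:
    """Temperature = last whole-degree (long) line plus 0.2 degrees
    for each short graduation line the column has passed."""
    return last_long_line_deg + 0.2 * short_lines_past


# Column sits two short lines above the 37-degree mark:
print(thermometer_reading(37.0, 2))  # 37.4
```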
