## Role of Estimates of Average (Median) Requirements

For many of the uses given for reference values, it is statistically important not to depend on an allowance that would cover the needs of everyone (and thus might include a safety factor added to some adequate level of intake) but rather to apply estimates of the average requirement for the group of interest. For most nutrients, with iron a notable exception, it can be assumed that nutrient requirements are symmetrically distributed in a population of similar people (Figure 3). Some individuals will have higher requirements than other similar individuals because of genetics and other factors, so a median requirement can be determined: consumption of a nutrient at that level would be adequate for half of the individuals in the group but inadequate for the other half. If this distribution of requirements is symmetrical, then the median and the mean requirement are the same.

Figure 2 Probability that a specified usual iron intake (mg/day) would be inadequate to meet the needs of a randomly selected menstruating woman. (Used with permission, G. Beaton, 1994)

Figure 3 Probability distribution of individual nutrient requirements (increasing intake on the horizontal axis).
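The median = mean property of a symmetric requirement distribution, and the fact that intake at the median requirement leaves half the group with unmet needs, can be illustrated with a small simulation. The mean of 100 units and SD of 10 below are hypothetical numbers chosen only for illustration, not actual reference values:

```python
import random
import statistics

# Simulate a symmetric (normal) distribution of individual requirements.
random.seed(1)
requirements = [random.gauss(100, 10) for _ in range(100_000)]

mean_req = statistics.mean(requirements)
median_req = statistics.median(requirements)

# For a symmetric distribution, the mean and median nearly coincide.
print(f"mean requirement:   {mean_req:.2f}")
print(f"median requirement: {median_req:.2f}")

# If everyone consumed exactly the median requirement, the intake would
# be adequate for about half the group and inadequate for the other half.
share_inadequate = sum(r > median_req for r in requirements) / len(requirements)
print(f"share with unmet needs at median intake: {share_inadequate:.2%}")
```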

### Why Have an Estimated Average Requirement?

There are two main reasons to have an Estimated Average Requirement (EAR): to serve as the basis for establishing the recommended intake for an individual and to assess the adequacy of intakes of similar population groups. The concept of establishing an average requirement, and of assuming that the requirements of individuals in a population of similar people are symmetrically (or normally) distributed, is not new. Conceptually, it has served as the ideal basis for recommended intakes during the past few decades; however, it was rigorously applied only on rare occasions. The RDA has been conceptually defined in the United States as the lowest amount of a nutrient that, in the judgment of the Food and Nutrition Board, meets the known nutritional needs of almost all of the population (or subgroup). It has also been defined more mathematically as the mean requirement plus two standard deviations (SD), an amount that would meet the requirements of 97-98% of the population to which it is applied.

The Dietary Reference Intake (DRI) process, a joint effort of the United States and Canada, retained the term RDA, limiting its use to serving as the goal for intake when planning diets for individuals and standardizing the method by which it is established. It is defined as RDA = EAR + 2SD_EAR. When data on variation in the requirements for a specific nutrient are lacking, the coefficient of variation of requirements is assumed to be approximately 10%. This assumed variation is derived from the variation seen in basal metabolic rate among individuals and from the variation seen in protein requirements, protein being the nutrient whose variability has been most studied.
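As a sketch, the formula RDA = EAR + 2SD_EAR with the default 10% coefficient-of-variation assumption can be expressed in a few lines (the EAR of 75 mg/day used in the example is the vitamin C value for men discussed below; the function name is illustrative):

```python
def rda_from_ear(ear: float, cv: float = 0.10) -> float:
    """Return the RDA given an EAR and a coefficient of variation.

    Implements RDA = EAR + 2 * SD_EAR, where SD_EAR = cv * EAR.
    With the default 10% CV, this reduces to RDA = 1.2 * EAR.
    """
    sd = cv * ear
    return ear + 2 * sd

# Example: a vitamin C EAR of 75 mg/day yields an RDA of 90 mg/day.
print(rda_from_ear(75))
```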

Figure 4 Using the EAR to estimate the prevalence of inadequacy in a population from the distribution of nutrient intakes (usual intake, amount/day).

It has been demonstrated statistically that the prevalence of inadequacy in a population whose requirements are symmetrically distributed can be estimated by comparing its intakes to the EAR for that nutrient in the same (or a similar) population (Figure 4). Thus, in the DRI process, when evaluating vitamin C requirements, experimental data from a clinical study indicated that the average intake needed by men to achieve 70% white blood cell ascorbate saturation (the chosen indicator) was approximately 75 mg/day, and the EAR was set at 75 mg/day. This value can be applied to the intakes of other populations of men with similar characteristics to determine the percentage of the population whose intakes may be inadequate based on this criterion of adequacy (Figure 5).

To use this method to assess the adequacy of intakes of population groups, other basic statistical assumptions should be met. First, an individual's requirement for a nutrient must be statistically independent of that individual's intake of the nutrient (this does not hold for nutrients such as total energy or water, because people eat or drink in response to their need for energy or water). Second, the variation (distribution) in nutrient intakes in the population group must be greater than the variation in requirements for the nutrient (this is almost always the case, except when everyone in the group consumes the same foods in the same amounts, leaving little variability in intake). If these two assumptions are met, along with the symmetry mentioned previously, then the EAR can be used as the cut point for adequacy in other similar populations (as shown in Figure 4). This is called the EAR cut-point method.

Figure 5 Vitamin C intake data (food and supplements) from NHANES III for men and women; using the EAR to determine the expected prevalence of inadequacy.
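The cut-point logic can be sketched with a simulation that satisfies the stated assumptions: symmetric requirements, intake independent of requirement, and intake varying more than requirement. The intake mean and SD below are hypothetical illustrations, not survey values; only the 75 mg/day EAR echoes the vitamin C example above:

```python
import random

random.seed(42)
N = 200_000
EAR = 75.0          # e.g., the vitamin C EAR for men, mg/day
REQ_SD = 7.5        # 10% CV in requirements
INTAKE_MEAN = 85.0  # hypothetical population mean intake
INTAKE_SD = 20.0    # intake variation exceeds requirement variation

# Requirements and intakes drawn independently (assumption 1).
requirements = [random.gauss(EAR, REQ_SD) for _ in range(N)]
intakes = [random.gauss(INTAKE_MEAN, INTAKE_SD) for _ in range(N)]

# True prevalence: individuals whose intake falls below their own requirement.
true_prev = sum(i < r for i, r in zip(intakes, requirements)) / N

# Cut-point estimate: the share of the group with intake below the EAR.
cut_point_prev = sum(i < EAR for i in intakes) / N

print(f"true prevalence of inadequacy: {true_prev:.3f}")
print(f"EAR cut-point estimate:        {cut_point_prev:.3f}")
```

When the assumptions hold, the two figures agree closely, which is why the share of a population consuming less than the EAR can stand in for the prevalence of inadequacy without knowing any individual's requirement.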

Because the RDA has in the past been misused by policymakers and scientists alike as a tool to assess the adequacy of group intakes, some have argued that scientific panels of experts should provide only EARs and not recommended intakes, since the only use of a recommended intake is to provide guidance to the individual, and health professionals can easily derive recommended intakes from average requirements. However, the concept of RDAs in the United States and RNIs in Canada has been so widely accepted by the general population that withholding RDAs (and, by extension, recommended intakes such as Adequate Intakes (AIs) where data are lacking) would result in more misguided actions than would providing them along with instructions for their specific and only use: to plan diets for the individual.

### Adequate Intake: Used When an EAR Cannot Be Determined

Whereas for many nutrients enough data exist to establish the level of intake at which half of the individuals in a group would be inadequate based on the chosen criterion, for some nutrients the necessary data are conflicting or lacking. To give guidance to users of nutrient reference values, it is still necessary to provide quantitative numbers. To further differentiate the appropriate uses of the RDA, the DRI framework therefore provides an additional category of recommended intake for use in planning diets for individuals: the Adequate Intake (AI). This is a level considered adequate for all members of the group and thus may overestimate the needs of many, if not all. Statistically, it cannot be used as if it were an EAR to assess adequacy; it does, however, provide guidance on how much an individual should consume. In some cases, it is derived from the average intake of a population in which inadequacy appears to be nonexistent based on a review of available indicators or criteria (as is the case for vitamin K).
