The indicator amino acid oxidation (IAAO) method arose from work on the amino acid requirements of neonatal pigs (Kim et al., 1983). Although the IAAO method is based on measurements of amino acid oxidation, it uses the carbon catabolism of a nonlimiting amino acid (the indicator amino acid) as a carbon analogue of nitrogen balance. The reasoning is that when a single indispensable amino acid is provided below its requirement, it alone limits the retention of the other, nonlimiting amino acids in body protein. These other amino acids, including the indicator amino acid, are then in nutritional excess and are oxidized (Zello et al., 1995). When the
intake of the test amino acid is zero, protein synthesis is minimal and oxidation of the indicator is maximal. As the intake of the test amino acid increases, protein retention increases and oxidation of the indicator amino acid falls until the requirement level of the test amino acid is reached, beyond which indicator oxidation remains low and essentially constant. The data are then analyzed to obtain an estimate of the intersection of the linear and constant portions of the relationship (the breakpoint).
The IAAO method has some advantages over the direct amino acid oxidation (DAAO) and carbon balance methods and has been validated in growing piglets by comparison with estimates based on growth and body composition (Kim et al., 1983). The first advantage is that the metabolic restrictions on carbon dioxide release apply only to the indicator amino acid. Thus amino acids such as threonine, whose peculiar metabolism makes them problematic in the DAAO method, can be studied. Second, the pool size of the indicator amino acid does not change radically as the intake of the test amino acid is varied. Thus, to some extent, potential problems of compartmentation are minimized and, in principle, the method does not require estimates of the turnover of the indicator amino acid.
However, the IAAO method also has several limitations as it has been applied. First, like the DAAO approach, it has only been used in the fed state and the extent to which the fasting-state oxidation of the indicator amino acid is altered by the status of the limiting amino acid has not been determined. Second, the dependence of the result on the amount of total protein given during the isotope infusion has not been established. Third, the choice of the best indicator is still under study so that data obtained with the method are dependent on the assumption of the general applicability of the indicator amino acids (phenylalanine and lysine) that have been used most frequently.
Classical nitrogen balance studies in humans show that it takes 7 to 10 days for urinary nitrogen to equilibrate in adults placed on a protein-free diet (Rand et al., 1976). On the other hand, most (about 90 percent) of the adaptation in leucine kinetics has been shown to be complete within 24 hours (Motil et al., 1994). Zello and coworkers (1990) studied the effect of 2- to 8-day adaptation periods to either 4.2 or 14 mg/kg/d of phenylalanine on rates of phenylalanine oxidation at phenylalanine intakes of 5, 7, 10, 14, 21, 28, or 60 mg/kg/d, with about 4 hours of adaptation to each test level. These investigators were unable to show any effect of prior adaptation to the two different phenylalanine intakes on the rates of phenylalanine oxidation across the changing phenylalanine intakes. Clearly, from this study, adaptations in amino acid metabolism take place much more quickly than adaptations in urinary nitrogen excretion and are (at least for leucine [Motil et al., 1994]) virtually complete within 24 hours.
The most satisfactory statistical models for determining amino acid requirements use regression to define the population mean and variance. For the regression models to work, a range of intakes (particularly at the low end) has to be fed. In practical terms, this has greatly hampered studies in infants, children, and other vulnerable groups. On the other hand, if the individual needs to be on a low or even zero intake of the test amino acid for only about 8 hours, then it becomes feasible to study indispensable amino acid requirements in these and other vulnerable groups.
Such a minimally invasive indicator oxidation model has been developed (Bross et al., 1998) and applied to determine tyrosine requirements in children with phenylketonuria (Bross et al., 2000). In this model the oxidation study is conducted after only 6 hours of adaptation to the level of the test amino acid, which is administered every 30 minutes.
For amino acid oxidation measurements, two-phase linear crossover regression analysis was introduced during the validation of indicator amino acid oxidation in piglets (Kim et al., 1983). This approach was later applied in humans in a direct oxidation study (Zello et al., 1990) and in indicator oxidation studies (Bross et al., 2000; Zello et al., 1993). The technique permits a precise determination of the breakpoint, which serves as the estimate of the amino acid requirement, the Estimated Average Requirement (EAR).
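The two-phase linear crossover fit can be sketched numerically. The example below is a minimal illustration under stated assumptions, not the published analysis: it simulates hypothetical indicator oxidation data with a known breakpoint (all values and units are invented for the demonstration) and recovers the breakpoint by fitting a descending-line/plateau model. For each candidate breakpoint the model is linear in its remaining parameters, so ordinary least squares suffices, and the candidate with the smallest residual sum of squares is chosen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: indicator amino acid oxidation (y, arbitrary
# units) at graded test amino acid intakes (x, mg/kg/d). The true breakpoint
# (the requirement estimate) is set at 14 mg/kg/d.
x = np.array([5, 7, 10, 14, 21, 28, 60], dtype=float)
true_bp, true_plateau, true_slope = 14.0, 2.0, 0.5
y = true_plateau + true_slope * np.maximum(true_bp - x, 0.0)
y += rng.normal(0.0, 0.05, size=x.size)  # measurement noise

def fit_breakpoint(x, y, candidates):
    """Two-phase linear (descending line / plateau) crossover fit.

    For a fixed candidate breakpoint b, the model
        y = plateau + slope * max(b - x, 0)
    is linear in (plateau, slope), so it is fitted by ordinary least
    squares; the candidate with the smallest SSE is returned.
    """
    best = None
    for b in candidates:
        X = np.column_stack([np.ones_like(x), np.maximum(b - x, 0.0)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((X @ coef - y) ** 2))
        if best is None or sse < best[0]:
            best = (sse, b, coef)
    _, b, (plat, slp) = best
    return b, plat, slp

bp, plat, slp = fit_breakpoint(x, y, np.linspace(6.0, 40.0, 341))
print(f"breakpoint ~ {bp:.1f} mg/kg/d, plateau ~ {plat:.2f}")
```

With these simulated data the fitted breakpoint lands close to the true value of 14 mg/kg/d. In a real analysis the breakpoint's confidence interval, not just its point estimate, would also be reported.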
As pointed out above, the drawbacks of the indicator method are the short period of measurement, conducted in the fed state only, and the lack of a period of adaptation to the test diets. To avoid these drawbacks, a 24-hour indicator method has been developed (Kurpad et al., 2001a) that takes advantage of the strengths of the indicator approach as well as a 24-hour measurement period covering both feeding and fasting. On theoretical grounds, this method has advantages over other methods for estimating amino acid requirements, and it is the method chosen here for estimating amino acid requirements where data are available.