

4.3 Summary of dietary reference values and discussion of use of reference values for comparison with exposure

IOM/NASEM, WHO, NNR and EFSA have all established dietary reference values for iodine, and all of them recommend a daily iodine intake of 150 µg/day for adults.

However, based on various arguments, the recommendations for pregnant women vary between 175 µg/day (NNR Project Group, 2012) and 250 µg/day (WHO, 2007), and for lactating women between 200 µg/day (EFSA, 2014; NNR Project Group, 2012) and 290 µg/day (IOM, 2001).

The recommendations for children and adolescents are in the range of 90 to 120 µg/day, depending on age. In addition to recommended intakes, average requirements (AR or EAR), i.e. intake values estimated to meet the requirement of half of the healthy individuals in a population, have been established. In the Nordic Nutrition Recommendations (NNR Project Group, 2012), the AR for adults was set to 100 µg iodine per day, but no values were set for children or adolescents. IOM (2001) set the estimated average requirement for children and adolescents aged 1 to 18 years at between 65 and 95 µg/day. The EFSA Panel (2014) concluded that there was insufficient evidence to derive an average requirement (AR).


The VKM project group responsible for the present report discussed which of the dietary reference values is most appropriate for comparison with the levels of iodine exposure described in chapter 6, including which reference value in the low intake range carries a likelihood of inadequacy (and risks related to inadequacy) similar to the likelihood of excessive intake (and risks related to excess) that the UL represents in the high intake range. The project group decided to use the EAR and the UL as comparison values, which is in line with the recent proposal for harmonised dietary reference values from WHO, FAO and NASEM, where the AR and the UL are considered the core values for evaluating population intakes (Allen et al., 2019).

The AR/EAR is the primary reference value for evaluating nutrient intakes, while the RI, LI and UL can be used as complementary values (NNR Project Group, 2012). Comparing population intakes with the RI would overestimate the prevalence of inadequacy, as the RI is set to cover the requirements of almost all individuals.

The EAR is defined as the intake estimated to meet the requirement of approximately half of the healthy individuals in a life-stage and gender group (i.e., the median requirement). The RI is derived by adding two standard deviations to the EAR (illustrated below) and is defined as the average long-term intake of a nutrient estimated to meet the requirement of, and maintain good nutritional status in, almost all healthy individuals in a group. The range of optimal iodine intake is narrow. Using the EAR rather than the RI as the lower cut-off for acceptable iodine intake in the iodization scenarios therefore allows greater flexibility: a distribution of iodine intakes that prevents deficiency can be achieved without pushing the intake distribution towards the higher end and thereby increasing the probability of exceeding the UL.
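As an illustration only (the standard deviation below is back-calculated from the adult values and is not stated in the cited references), the relation between EAR and RI can be written as

\[
\text{RI} = \text{EAR} + 2\,\text{SD}_{\text{requirement}}.
\]

If this relation is applied literally to the adult values (EAR = 100 µg/day, RI = 150 µg/day), it would correspond to

\[
\text{SD}_{\text{requirement}} = \frac{\text{RI} - \text{EAR}}{2} = \frac{150 - 100}{2} = 25\ \mu\text{g/day},
\]

i.e., an assumed coefficient of variation of about 25% around the adult EAR.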

A simplified approach referred to as the EAR cut-point method can, under certain assumptions, be used to estimate the prevalence of inadequate intake directly as the proportion of the population with intakes below the EAR (IOM, 2001). This method does not rely on a known requirement distribution, but the requirement distribution should be symmetrical around the EAR, the variance of intakes should be greater than the variance of requirements, and intakes and requirements should be independent (i.e., have a low correlation), implying that individuals with high requirements do not tend to have higher intakes. The assumptions related to requirements are difficult to verify in practice, but we have not come across literature or data indicating that these assumptions do not hold for iodine.
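As a minimal sketch of the EAR cut-point calculation (the intake numbers below are hypothetical and serve only to show the arithmetic; the adult EAR of 100 µg/day is taken from Table 4.3-1):

```python
import numpy as np

def ear_cutpoint_prevalence(usual_intakes, ear):
    """EAR cut-point method: estimate the prevalence of inadequate intake
    as the proportion of usual (long-term) intakes below the EAR."""
    usual_intakes = np.asarray(usual_intakes, dtype=float)
    return float(np.mean(usual_intakes < ear))

# Hypothetical usual iodine intakes (µg/day) for five adults, compared
# with the adult EAR of 100 µg/day: two of the five fall below the EAR.
print(ear_cutpoint_prevalence([80, 95, 120, 150, 210], ear=100))  # 0.4
```

Note that the method applies to distributions of usual (long-term) intakes; if observed short-term intakes with large day-to-day variation are used directly, the spread of the distribution, and hence the proportions in the tails, will be exaggerated.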

In the iodization scenarios, the prevalence of inadequate iodine intake considered acceptable is ultimately a matter of judgement, but 2 to 3 percent has been used when planning diets for groups. According to the EAR cut-point method, 2.5% of the population will then have intakes below the EAR, and intake will be adequate for the >97.5% of the population with intakes above the EAR (IOM, 2003). While we present percentages above the EAR in text and tables in this benefit and risk assessment, it should be kept in mind that the uncertainty in the estimated iodization scenarios is likely to be larger than a few percentage points.

Generally, the UL is the maximum level of total chronic daily intake judged to be unlikely to pose a risk of adverse health effects; in the case of iodine, the UL for adults is the maximum daily intake at which changes in TSH are unlikely to occur (SCF, 2002).

Based on the arguments above, the project group decided to compare the exposures in chapter 6 with the AR/EAR for adults from NNR (2012) and the EAR for children and adolescents from IOM (2001) as the cut-off for acceptable intake at the lower end, and with the ULs established by SCF (2002) as the cut-off at the upper end of iodine intakes. Even though these cut-offs are not fully comparable, they were considered the most appropriate dietary reference values for comparison with the iodine exposures. The health consequences of intakes below the EAR or above the UL are elaborated on in the following chapters (5, 6, 8 and 10).

Table 4.3-1 Overview of values for comparison used to evaluate the iodine intake estimates in chapter 7.

                     Adults (18-70 years)   13-year-olds   9-year-olds   4-year-olds   2-year-olds   1-year-olds
EAR/AR (µg/day)      100                    73             73            65            65            65
UL (µg/day)          600                    450            300           250           200           200
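As a sketch of how the comparison values in Table 4.3-1 could be applied (the intake distribution below is simulated and purely illustrative; it is not taken from the intake estimates in chapter 7):

```python
import numpy as np

# Comparison values from Table 4.3-1 (µg/day).
REFERENCE_VALUES = {
    "adults (18-70 years)": {"ear": 100, "ul": 600},
    "13-year-olds": {"ear": 73, "ul": 450},
    "9-year-olds": {"ear": 73, "ul": 300},
    "4-year-olds": {"ear": 65, "ul": 250},
    "2-year-olds": {"ear": 65, "ul": 200},
    "1-year-olds": {"ear": 65, "ul": 200},
}

def evaluate_intake_distribution(usual_intakes, group):
    """Return the proportion of usual intakes below the EAR and above the UL
    for the given age group, using the comparison values in Table 4.3-1."""
    ref = REFERENCE_VALUES[group]
    intakes = np.asarray(usual_intakes, dtype=float)
    return {
        "proportion_below_EAR": float(np.mean(intakes < ref["ear"])),
        "proportion_above_UL": float(np.mean(intakes > ref["ul"])),
    }

# Purely illustrative: a simulated log-normal distribution of usual iodine
# intakes (µg/day) for adults, evaluated against the adult EAR and UL.
rng = np.random.default_rng(seed=1)
simulated_adult_intakes = rng.lognormal(mean=np.log(150), sigma=0.4, size=10_000)
print(evaluate_intake_distribution(simulated_adult_intakes, "adults (18-70 years)"))
```

In line with the discussion above, an iodization scenario could then be judged acceptable when the proportion below the EAR is small (e.g., around 2.5%) while the proportion above the UL remains low.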

A review of literature published after 2002, undertaken to re-evaluate the existing ULs, is given in chapter 6 and is discussed and summarised in section 6.2.3.


5 Systematic literature review and