1. Introduction
Recently, criticisms of memory-based dietary assessment methods have spurred conversation about the context in which a diet assessment is made. Some dietary assessment methods seek to understand whole-food consumption, while others quantify macro- and micronutrients, quantities of phytonutrients, and other active dietary components [1]. While there is real concern about the validity of self-reported dietary intake measurements, there are few truly objective dietary-measurement techniques, making valid and reliable self-reported dietary measurements a necessity for nutrition research [1]. One suggestion for researchers interested in specific nutrient adequacy is to use a “method that captures frequency of consumption and amounts consumed of all foods and beverages contributing the dietary component of interest with a high degree of validity” [1]. The “gold standard” for this approach is multiple-pass 24 h diet recalls taken on several days. However, this method is limited by memory lapses over details, reporter bias, and the difficulty of obtaining repeated recalls and direct histories from patients and participants. These barriers make highly detailed dietary recall assessment difficult in some areas of population-based research, particularly in programs that require frequent monitoring for nutrient-based interventions.
Food pattern modeling is an emerging dietary assessment method that has been used for community-based nutritional trials and the development of dietary guidelines [2,3,4]. In food pattern modeling, consumed foods are categorized or grouped by their nutrient content, and consumption patterns are used to estimate nutrients of interest or dietary adequacy [2]. Unlike food frequency questionnaires or multiple-pass dietary recall methods, food pattern modeling characterizes the foods consumed rather than the quantity or frequency of consumption. Advantages of this method include simplicity, reliability, and cost-effectiveness compared to 24 h diet recalls [3]. Despite this, food pattern modeling has not been rigorously validated against 24 h recall methods for predicting nutrient adequacy in clinical trials, and its retrospective validity against biochemical or anthropometric measures has not been explored.
While studies have used food pattern modeling to determine dietary iron adequacy [4,5], this assessment method has not been validated against gold-standard methods. The adequacy of dietary assessment methods for iron may be particularly important to investigate. Iron deficiency is one of the most common micronutrient deficiencies globally [6] and is the target of many local, national, and research-based nutrition intervention programs. Some simple assessment methods for iron intake, such as food frequency questionnaires [7], have been validated. However, food frequency questionnaires thus far have focused on heme proteins and may not be applicable to populations consuming diverse or primarily non-heme-iron-based diets. Further, it is not well understood whether food patterns rich in phytate [8], dietary iron [9], phytate–iron ratios [10], or ascorbic acid [11], all of which have been linked to the regulation of iron bioavailability from food, can be evaluated through food pattern modeling. It is also not known how food pattern modeling for these nutrients compares to more intensive diet recall methods in predicting outcomes for known hematological markers of iron adequacy, such as hemoglobin, ferritin, or acute iron absorption.
The utility and validity of food pattern modeling depend on first understanding whether it can predict hematological iron-status indices as well as intensive memory-based recall methods can. The objective of this study was to compare the diagnostic accuracy of food pattern modeling with 24 h diet recall methods for predicting hematological indices of iron status.
2. Materials and Methods
2.1. Sample Size
Sample size was calculated using a paired equivalence test, assuming differences of <15% between dietary assessment methods in hemoglobin, ferritin, or acute iron absorption, a correlation of at least 85% between methods, and 80% power, based on hematological outcomes among participants enrolled on a rolling basis. The largest of the three sample size calculations, 21 paired tests for ferritin, was used for the analysis.
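As a rough illustration of the paired equivalence (two one-sided tests, TOST) power logic described above, a simulation sketch follows. The ferritin mean and SD, the normality assumption, and all other inputs are assumed values for illustration, not the study's actual parameters:

```python
# Simulation-based power estimate for a paired equivalence (TOST) design.
# Assumed inputs (NOT from the study): ferritin mean 30 ng/mL, SD 15 ng/mL,
# between-method correlation 0.85, equivalence margin 15% of the mean.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def tost_power(n, mean=30.0, sd=15.0, rho=0.85, margin_frac=0.15,
               alpha=0.05, n_sims=500):
    """Estimate power of a paired TOST by Monte Carlo simulation."""
    margin = margin_frac * mean
    cov = [[sd**2, rho * sd**2], [rho * sd**2, sd**2]]
    crit = stats.t.ppf(1 - alpha, df=n - 1)
    hits = 0
    for _ in range(n_sims):
        x = rng.multivariate_normal([mean, mean], cov, size=n)
        d = x[:, 0] - x[:, 1]                 # paired method differences
        se = d.std(ddof=1) / np.sqrt(n)
        t_lo = (d.mean() + margin) / se       # H0: true diff <= -margin
        t_hi = (d.mean() - margin) / se       # H0: true diff >= +margin
        if t_lo > crit and t_hi < -crit:      # both one-sided tests reject
            hits += 1
    return hits / n_sims

print(tost_power(21))  # estimated power under these assumed parameters
```

Under different assumed means, SDs, and margins the required n changes substantially, which is why the study used the largest of its three nutrient-specific calculations.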
2.2. Study Recruitment, Inclusion Criteria, and Study Preparation
The study protocol was approved by the Institutional Review Board at Wichita State University (#4396). In brief, fifty-six women responded and were screened by email on a rolling basis against the inclusion criteria listed below. Before participation, an informed consent document was signed, and all study procedures, risks, and benefits were explained verbally and in writing. Inclusion criteria required that participants were premenopausal, aged 18–35 y, nonobese (BMI ≤ 30.0 kg/m2), had no history of oral or gastrointestinal disease, were moderate (≤14 g alcohol per day) or non-consumers of alcohol, and were nontobacco users. Participants were excluded if they were pregnant, were breastfeeding, had blood or gastrointestinal disorders affecting iron absorption, or were taking medication that would impair iron bioavailability. Participants were asked to stop iron-containing medications and supplements seven days prior to study activities. Of the 56 participants contacted, 30 met the inclusion criteria and were enrolled. After study completion, only 27 participants had the three completed 24 h dietary recalls needed for analysis.
2.3. Hematological Data Collection
To evaluate dietary recall methods against hematological outcomes, each participant completed an iron-rich meal challenge to assess iron status and bioavailability. Meal challenges followed a format described previously [12,13]. Prior to the meal, two separate blood samples were collected by venipuncture into a 5-mL serum separator tube and a 3-mL ethylenediaminetetraacetic acid (EDTA) evacuated tube to measure serum iron (by spectrophotometry), ferritin (by immunoassay, sensitivity 0.1 ng/mL), and a complete blood count, including hemoglobin (g/dL) [12,13]. After the blood draw, the challenge meal was administered. All meals consisted of a 95-g bagel with 12 g sugar-free strawberry jam (half sprinkled with 18 mg iron as ferrous sulfate), a 90-g banana, and 355 mL of water. Blood samples were drawn at 210 min post-meal in a 2-timepoint draw to estimate the percentage of maximum iron recovery (% max iron absorption) as described previously [12,13,14]. After study activities were completed, blood and serum were sent to Quest Diagnostics for analysis within 24 h. Serum iron data were used to calculate acute iron absorption for the bioavailability analysis, with calculations completed as described previously [12,13,14].
2.4. Dietary Analysis
Participants completed three 24 h dietary recalls via the Automated Self-Administered 24-Hour Recall (ASA24) on the three days prior to the test meal. Dietary data were extracted, and means were calculated for four nutrients of interest related to hematological outcomes: iron, phytate, the phytate–iron ratio, and ascorbic acid [15]. To determine nutrient density for food pattern modeling, each variable of interest was assigned a dichotomous categorical value (“high” or “low”) according to parameters from previous research [4]. During this process, an electronic spreadsheet (Microsoft Excel) was used to review all individual food items from the dietary assessment data for each participant. Based on the extracted data, individual food items from the dietary recalls were coded into categories based on average nutrient content for each variable of interest per 100-g serving. Using previously defined parameters, the cutoff point between high and low phytate–iron ratios was 1 [16,17]. Food iron and ascorbic acid content were categorized using cutoff values of 0.35 mg/100 g [4,5,18,19,20] and 24 mg/100 g [4,19,20,21,22], respectively. Phytate was categorized using a cutoff point of 50 mg/100 g [19,23,24] (Table 1).
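The cutoff-based coding described above can be sketched as follows. The food values shown, and the treatment of values falling exactly at a cutoff, are illustrative assumptions rather than study data:

```python
# Illustrative coding of a single food item into "high"/"low" categories using
# the per-100 g cutoffs described in the text: phytate-iron ratio 1,
# iron 0.35 mg/100 g, ascorbic acid 24 mg/100 g, phytate 50 mg/100 g.
# Tie handling (>= cutoff counted as "high") is an assumption.
CUTOFFS = {
    "iron": 0.35,               # mg/100 g
    "ascorbic_acid": 24.0,      # mg/100 g
    "phytate": 50.0,            # mg/100 g
    "phytate_iron_ratio": 1.0,  # unitless
}

def code_food(nutrients_per_100g):
    """Return a 'high'/'low' category for each variable of interest."""
    return {var: ("high" if nutrients_per_100g[var] >= cut else "low")
            for var, cut in CUTOFFS.items()}

# Hypothetical nutrient profile for cooked lentils (example values only).
lentils = {"iron": 3.3, "ascorbic_acid": 1.5, "phytate": 360.0,
           "phytate_iron_ratio": 9.2}
print(code_food(lentils))
# {'iron': 'high', 'ascorbic_acid': 'low', 'phytate': 'high',
#  'phytate_iron_ratio': 'high'}
```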
A comprehensive dietary nutrient density profile was created for each participant based on the mean variable-nutrient density of all foods consumed. Using the categorized food items for each nutrient, a sum score for the overall food pattern was tabulated by taking the total number of consumed foods categorized as having high nutrient density for each variable, divided by the total number of foods consumed:
Food pattern modeling estimated nutrient density (for each variable) = (number of consumed foods categorized as high nutrient density)/(total number of foods consumed)
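The sum-score tabulation above can be illustrated with a minimal sketch; the food codings here are hypothetical, not study data:

```python
# Fraction of a participant's consumed foods coded "high" for one variable,
# as described in the text. Codings below are hypothetical examples.
def food_pattern_score(coded_foods, variable):
    """Sum score: count of 'high' foods / total foods consumed."""
    high = sum(1 for food in coded_foods if food[variable] == "high")
    return high / len(coded_foods)

coded = [{"iron": "high"}, {"iron": "low"},
         {"iron": "high"}, {"iron": "low"}]
print(food_pattern_score(coded, "iron"))  # 0.5
```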
To construct mean dietary intake variables for comparison with food pattern modeling, estimated iron (in milligrams) and ascorbic acid (in milligrams) intakes were extracted, and means were calculated from the three 24 h dietary recalls collected during the study. Food intake logs were downloaded from the ASA24 for manual calculation of phytate and phytate–iron ratios. During this process, an electronic spreadsheet (Microsoft Excel) was used to review all dietary data for each participant. Food items were referenced from United States Department of Agriculture (USDA) tables and entered into an electronic spreadsheet; phytate amounts were calculated and averaged for each recall [25]. The phytate–iron molar ratio was calculated using the equation:
phytate–iron molar ratio = (phytate (mg)/660)/(iron (mg)/55.85)
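The molar-ratio calculation can be sketched as follows, assuming the molecular weights conventionally used for phytate–iron ratios (phytic acid ≈ 660 g/mol, iron ≈ 55.85 g/mol); the study's exact constants are not shown in the text, so these are assumptions. The intake values are hypothetical:

```python
# Phytate-iron molar ratio from mass intakes, using conventional molecular
# weights (assumed, not taken from the study).
PHYTATE_MW = 660.0   # g/mol, phytic acid
IRON_MW = 55.85      # g/mol, elemental iron

def phytate_iron_molar_ratio(phytate_mg, iron_mg):
    """Convert each mass intake to millimoles, then take the ratio."""
    return (phytate_mg / PHYTATE_MW) / (iron_mg / IRON_MW)

# Hypothetical daily intakes: 800 mg phytate, 12 mg iron.
print(round(phytate_iron_molar_ratio(800, 12), 2))  # 5.64 (above the cutoff of 1)
```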
To compare mean dietary estimates against food pattern modeling variables for hematological outcomes, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of each method were calculated. Similarity of the methods was compared using receiver operating characteristic (ROC) curves [26]. The ROC analysis compared continuous hematological variables (hemoglobin, ferritin, and acute iron absorption) against estimated mean nutrient intake and food pattern modeling variables. Curves were created to estimate likelihood ratios for each assessment type by setting discrete values within the continuous hematological indices to form dichotomous comparisons for analysis.
ROC curves were established using points between the nadir and peak values for hemoglobin, ferritin, and iron absorption among participants. Hemoglobin values ranged from 11.0 to 14.0 g/dL at 0.5 g/dL intervals; ferritin values ranged from 5 to 55 ng/mL at 5 ng/mL intervals; and acute iron absorption values ranged from 0 to 50% at 5% intervals. ROC parameters were chosen according to participant outcomes. Quartiles for each dietary assessment method variable were calculated to plot ROC curves according to previous studies [4,27]; the values are reported in Table 2.
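The ROC construction described above can be sketched as follows; the hemoglobin values and dietary scores below are fabricated purely for illustration, and the variable names are hypothetical:

```python
# Sketch of the ROC construction: dichotomize the hematological marker at
# each discrete cutoff, dichotomize the dietary-method score at its
# quartiles, and collect sensitivity/specificity pairs. Data are simulated.
import numpy as np

def sens_spec(truth, predicted):
    tp = np.sum(truth & predicted)
    tn = np.sum(~truth & ~predicted)
    fp = np.sum(~truth & predicted)
    fn = np.sum(truth & ~predicted)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

rng = np.random.default_rng(0)
hemoglobin = rng.normal(12.5, 1.0, 27)                   # g/dL, simulated
diet_score = hemoglobin * 0.1 + rng.normal(0, 0.2, 27)   # correlated score

roc_points = []
for hb_cut in np.arange(11.0, 14.5, 0.5):                # 0.5 g/dL intervals
    truth = hemoglobin >= hb_cut
    for q in np.percentile(diet_score, [25, 50, 75]):    # quartile cutoffs
        predicted = diet_score >= q
        roc_points.append(sens_spec(truth, predicted))

print(len(roc_points))  # 21: 7 hemoglobin cutoffs x 3 quartile cutoffs
```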
2.5. Statistical Analysis
Data were analyzed using SAS statistical software (SAS Studio version 3.8), statistical significance was set at p < 0.05, and data are presented as mean ± SD. Before analysis, all data were checked for normality and for homogeneity of variance by Levene’s test.
Fisher’s exact test was used to compare dietary assessment methods against hematological outcomes. From the contingency table analysis, sensitivity, specificity, PPV, and NPV were calculated for each nutrient at each hematological cutoff value. Mean dietary estimate and food pattern modeling sensitivity and specificity were plotted for ROC curve analysis. The calculated response elements for each dietary analysis method were compared by a paired Student’s t-test, using p < 0.05 as the cutoff for significant differences between methods.
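The 2×2 analysis can be sketched with SciPy's `fisher_exact`; the cell counts below are fabricated for illustration and do not come from the study:

```python
# Fisher's exact test on a 2x2 table of dietary category (high/low) vs.
# hematological cutoff (above/below), plus the diagnostic measures derived
# from the same cells. Counts are hypothetical, not study data.
from scipy.stats import fisher_exact

tp, fp, fn, tn = 9, 3, 4, 11   # fabricated cell counts

odds_ratio, p_value = fisher_exact([[tp, fp], [fn, tn]])

sensitivity = tp / (tp + fn)   # true positives among truly positive
specificity = tn / (tn + fp)   # true negatives among truly negative
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value

print(f"p={p_value:.3f}, sens={sensitivity:.2f}, spec={specificity:.2f}, "
      f"ppv={ppv:.2f}, npv={npv:.2f}")
```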
4. Discussion
The objective of this study was to pilot the diagnostic accuracy of food pattern modeling against 24 h diet recall methods. In this study, food pattern modeling was used to predict hematological indices from four variables: iron, ascorbic acid, phytate, and phytate–iron ratios. These variables were chosen for their known impact on hematological indices of iron status and for their common focus in community nutrition research and programming [3,4,5,6,7,8,9,10,11,12,13,14,17,18,19,21,22,23,24,27].
Food pattern modeling has been used elsewhere to estimate community-level nutrient intake through modeling methods like the present approach [29,30,31,32]. However, to our knowledge, this is the first study to compare the effectiveness of food pattern modeling against repeated multiple-pass memory-based dietary assessment using both acute and chronic hematological measurements of nutrient status. It is also the first study to compare food pattern modeling and 24 h dietary recall for nutrients with a known relationship to iron status.
Diagnostic measurements, including sensitivity, specificity, PPV, NPV, and ROC curves, did not differ significantly between food pattern modeling and mean dietary intake estimates for markers of long-term iron status, including hemoglobin and ferritin. However, food pattern modeling was statistically inferior to mean dietary intake estimates of iron and ascorbic acid intake for predicting acute iron absorption. This suggests that, while food pattern modeling may be non-inferior to 24 h dietary recall methods for long-term markers of iron status, it may be inadequate for predicting acute markers of iron absorption. It has been suggested, however, that dietary iron intake may be poorly predicted by single-meal iron bioavailability studies compared to long-term markers of iron status [3,12,13,33], raising the question of whether the improved diagnostic capability of 24 h dietary recalls is needed in community-based dietary assessment. In addition, the mean difference in diagnostic accuracy between assessment methods was less than twenty percent for all nutrient variables and hematological outcomes, often trending in favor of food pattern modeling. This suggests non-inferiority of food pattern modeling to 24 h recall assessment in this study.
Several limitations to the generalizability of this study outline the challenges of adopting food pattern modeling as a widely available dietary assessment method. As a proof-of-concept pilot study, hematological outcomes were correlated with two dietary assessment types approximated from three 24 h dietary recalls. However, the study was conducted in a homogeneous population that largely was not iron deficient. While the women studied may constitute a group sensitive to changes in iron status and absorption, the results cannot be generalized to other populations of interest, such as children, pregnant women, or largely anemic cohorts. In addition, the modeling was based on overall iron intake for predicting hematological outcomes. Future studies may benefit from addressing nuances in nutrient bioavailability during model building; one such exploration may be to consider differences in modeling between heme and non-heme iron sources. Further, the sample size was small. Future studies should explore differences between methods using larger samples with geographic diversity.
Despite its definition in the Dietary Guidelines for Americans [2], there is no standardized approach to food pattern modeling as a form of dietary or nutrient assessment. In addition to the methodology of the present study, other studies have employed a variety of models and methods with characteristics that meet the definition of food pattern modeling [3,4,6,7,8,30,31,32]. For food pattern modeling to be used broadly, standardized methodology and further validation should be pursued for research and programming moving forward.
Finally, this study aimed to model differences between assessment methods for four nutrient variables known to impact iron absorption and status; however, modeling for diverse nutrients in diverse dietary settings may reveal limitations of food pattern modeling. It remains unclear how food pattern modeling can be standardized for populations and programs wishing to use this method without complicated modeling efforts. Future studies should outline guidelines and research gaps for researchers and programs intending to use food pattern modeling to estimate intake for a wide variety of macronutrients and micronutrients.
5. Conclusions
Despite its limitations, this proof-of-concept pilot study suggests that food pattern modeling may be as effective as dietary recalls for community health programming and nutrition research. Further, the study was completed in participants consuming a wide diversity of foods compared to some populations globally. The outcomes suggest that food pattern modeling for iron-relevant nutrients was as diagnostically accurate as detailed memory-based recall assessment in predicting hematological markers of iron status. With standardized guidelines and further research, food pattern modeling could become a simple, readily usable dietary assessment method for research and community health.