4.1. Descriptive Statistics
Cow longevity is a very complex and important trait in breeding programs [
3,
4,
30]. In general, the economic profitability of cattle production increases with longevity, as culling decisions become based on the cows' productivity rather than on involuntary reasons [
4,
31,
32]. However, the improvement of longevity through genetic selection is suboptimal due to several factors. First, the late expression of the phenotype (i.e., end of life) considerably increases the generation interval, which reduces genetic progress per time unit. Moreover, the lack of information regarding the culling reasons can potentially bias the genetic and genomic evaluations, as some animals do not have the opportunity to fully express their genetic merit for longevity. In order to evaluate the impact of different culling reasons on the estimation of genetic parameters for longevity and avoid the interference of censored data in the results, only animals that had culling information were kept in this study. Nonetheless, we recognize the need to evaluate the impact of censored data in subsequent genetic analyses of longevity.
As suggested by Jamrozik et al. [
13], decisions on removal of a cow from the herd may involve several reasons and not all of them are usually reported in the dataset. Most animals analyzed in this study were from the culling group of miscellaneous, which included both cows culled before 15 years of age without specific reasons and cows sold as commercial animals (
Table 1). Thus, as suggested by Rózańska-Zawieja et al. [
15], it is advisable to provide more specific culling reasons in the farm recording programs. Only a small proportion of animals were culled due to disease and structural problems, which suggests that the methods used for prevention and control of diseases, as well as selection against structural problems (e.g., feet conformation), have been effective in the North American Angus population. For both the American Angus Association [
33] and Canadian Angus Association [
34], expected progeny differences (EPDs) are predicted for claw set and foot angle in a joint analysis. Selecting for adequate claw set and foot angle can reduce the incidence of lameness [
35], one of the main reasons for early culling in beef cattle [
36,
This is in agreement with Vargas et al. [
38], who commented that locomotion disorders can lead to several productive and reproductive losses.
The high proportion of animals culled due to fertility-related issues found in this study is in agreement with Rózańska-Zawieja et al. [
15], who reported that reproductive disorders were the most common culling reason for animals from Brahman, Hereford, and Angus breeds raised in Poland. Similarly, Koeck et al. [
39] reported that about 25% of Holstein cows were culled due to reproductive problems. Our study also indicated that cows with fertility issues were culled at a younger age than cows culled due to other reasons, as was also reported by Morales et al. [
40]. In this context, Burris and Priode [
41] showed that cows calving late in a breeding season are usually culled sooner. Moreover, Cushman et al. [
42] and Damiran et al. [
32] showed that this effect is even stronger for heifers: heifers that calve later at their first calving fail to remain in the herd as long as heifers that calve earlier (within the first 21 days of the breeding season). Furthermore, culling cows that did not become pregnant in a breeding season is a reasonable strategy used by several farmers to reduce future economic losses [
3].
The number of calvings per cow found in our study (
Figure 1) is in agreement with Brzáková et al. [
11], who found that 22% of the beef cows from the Czech Republic had only one calving. Animals from the group of natural death tended to die older than animals from the other groups; however, the average culling age found in our study for this group (9.75 years) was lower than the averages reported by Rózańska-Zawieja et al. [
15] for beef cattle from the United States and Canada (i.e., 12.70 and 13.00 years, respectively). Only ~10% of the animals included in this study were culled due to performance, which reinforces the importance of genetically evaluating longevity in North American Angus cattle (
Table 1).
4.2. Model Comparison
The statistical models used can influence the predictive performance of EBVs. Therefore, model definition is a crucial step in genetic and genomic evaluations. Longevity traits were first evaluated using non-linear proportional hazard models [
43,
44]. In summary, hazard models easily account for censored data (i.e., animals without culling information) and allow the inclusion of time-dependent environmental effects [
44]. However, their main disadvantage is that they only allow the estimation of a single genetic effect per animal over its whole life [
43]. Thus, in order to avoid the complexity of hazard models and predict EBVs for all ages, Veerkamp et al. [
43] proposed that a RRM can be used for genetic analyses of longevity related traits. Nowadays, RRM seems to be the optimal choice to genetically evaluate longevity over time [
16,
17,
19]. The predictive performance of RRM relies on how well the model fits the data, which is strongly related to the type and order of polynomials used [
18,
19]. Several studies have reported that Legendre orthogonal polynomials are preferred for genetic analysis of a variety of traits compared to other types of polynomials [
45,
46]. However, the polynomial order considered optimal tends to be population- and trait-specific.
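To make the comparison among the tested models concrete, the sketch below (an illustration only, not the software used in this study) builds the row of Legendre covariates for one age. The standardization of ages to [-1, 1], the normalization constants, and the assumption that LEGk denotes a k-th order polynomial (k + 1 regression coefficients) are our own illustrative choices.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariates(age, age_min=2.0, age_max=15.0, order=4):
    """Row of normalized Legendre covariates for one age in a RRM.

    Ages are standardized to [-1, 1]; column j holds the j-th
    normalized Legendre polynomial evaluated at that point.
    """
    x = -1.0 + 2.0 * (age - age_min) / (age_max - age_min)
    p = legendre.legvander(np.atleast_1d(x), order)  # P_0..P_order at x
    norm = np.sqrt((2.0 * np.arange(order + 1) + 1.0) / 2.0)
    return (p * norm).ravel()

# LEG2, LEG3, and LEG4 differ only in the number of covariates per row
row_leg4 = legendre_covariates(6.0, order=4)  # 5 covariates
row_leg2 = legendre_covariates(6.0, order=2)  # 3 covariates
print(row_leg4.round(3))
```

With more covariates per row, LEG4 can track more complex trajectories of the additive genetic and permanent environmental effects over ages, at the cost of more (co)variance parameters to estimate.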
Specifically for longevity-related traits, different orders of Legendre polynomials have been reported as optimal. For instance, third-order Legendre polynomials were chosen to genetically evaluate stayability to consecutive calvings in Canadian Simmental [
13]. On the other hand, Plaengkaeo et al. [
47], who tested different Legendre polynomial orders to evaluate longevity in swine, concluded that second-order Legendre polynomials should be used for genetic evaluations. Moreover, Haile-Mariam and Pryce [
48] found that fitting only the intercept was more adequate for analyzing longevity in Australian Holstein cattle. However, the authors decided to use first-order Legendre polynomials in order to study the association between longevity and other traits (such as production, fertility, and type traits) over time [
48]. In our study, three different polynomial orders were evaluated (i.e., LEG2, LEG3, and LEG4), and the most parameterized model (LEG4) outperformed the simpler ones for all groups of culling reasons and longevity definitions (
Supplementary Table S1). Thus, the improvement in the quality of fit seems to compensate for the increase in model complexity in the genetic analysis of longevity in North American Angus cattle.
Assuming homogeneity of residual variance might not be realistic for genetic analysis of longevity, as it indicates that the variance due to non-explained effects remains constant over time. For this reason, after choosing the optimal polynomial order, LEG4 models assuming homogeneity and heterogeneity of residual variance were compared. Testing the polynomial order before the number of classes used for the residual variance is a common practice reported in the scientific literature [
49]. As expected, assuming heterogeneous residual variance improved the model fit for all groups of culling reasons when FL was the analyzed trait. However, for TL, RRM assuming heterogeneous residual variance did not improve the model fit in two groups of culling reasons, structural problems and disease, which might be related to the smaller number of observations for these groups and the consequent increase in model complexity when fitting heterogeneous residual variance. In this context, especially for TL, the smaller number of observations might have reduced the phenotypic variability in adjacent ages, as records assumed for a specific age are likely more related to the previous age under TL than under FLa and FLb (i.e., TL does not consider calving information in its definition). This might have decreased the need to account for heterogeneous residual variance in the RRM used for the genetic evaluations of the groups of structural problems and disease under the TL definition.
The higher residual variances estimated for FLa compared to TL and FLb indicate a worse model fit for this longevity definition, which is likely because uncertain information is inaccurately assumed to be certain under the FLa definition (i.e., the code 0 is used in two situations: after the cow was culled, or when the cow did not record a calf at the specific age). In this study, only one option of heterogeneous residual variance was tested (i.e., 14 classes), which greatly increases the model's complexity. One option to reduce the number of residual variance classes is to group similar classes together. Several methods can be used to group different classes of residual variance, such as self-organizing maps [
50], change point [
51], and visual inspection [
52]. Testing different numbers of classes for the residual variance should be considered in subsequent studies in order to simplify the RRM.
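As a simplified illustration of how the 14 residual-variance classes could be reduced, the sketch below greedily merges adjacent age classes with similar estimated variances; this stands in for the change-point or visual-inspection approaches cited above, and the variance values are hypothetical.

```python
import numpy as np

def merge_adjacent_classes(variances, tol=0.05):
    """Greedily merge adjacent residual-variance classes whose
    estimates differ by less than `tol` in relative terms.
    Returns one class label per original age class."""
    labels = [0]
    for prev, cur in zip(variances[:-1], variances[1:]):
        similar = abs(cur - prev) / max(prev, 1e-12) < tol
        labels.append(labels[-1] if similar else labels[-1] + 1)
    return np.array(labels)

# hypothetical residual variances for the 14 age classes (2-15 years)
res_var = np.array([0.21, 0.215, 0.218, 0.26, 0.262, 0.263, 0.264,
                    0.31, 0.312, 0.313, 0.40, 0.41, 0.55, 0.56])
labels = merge_adjacent_classes(res_var, tol=0.05)
print(labels)  # fewer distinct classes than the original 14
```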
4.3. Genetic Parameters
Heritabilities. Genetic analyses of longevity (or longevity-related traits, such as stayability and survival) are becoming popular in livestock breeding research [
11,
13,
53]. However, to the best of our knowledge, there are no studies evaluating the impact of different culling reasons on the estimation of variance components and genetic parameters for longevity. In this study, different heritability estimates were obtained for each group of culling reasons (
Table 3 and
Figure 3). For instance, higher heritability estimates were obtained for animals that died due to structural problems and disease, intermediate heritabilities were obtained for natural death, and lower heritabilities were obtained for performance, fertility, and miscellaneous. These results are due to the larger additive genetic variances observed for the groups of structural problems and disease (
Figure S5,
Supplementary Material), and the larger permanent environmental variances estimated for the groups of fertility, performance, and miscellaneous (
Figure S6,
Supplementary Material). Similar herd-year-season variances were estimated for all groups of culling reasons (
Figure S7,
Supplementary Material). Heritabilities estimated combining all groups of culling reasons were similar to the heritabilities estimated for the groups of performance, fertility, and miscellaneous. These findings indicate that combining all culling reasons for genetic and genomic evaluations without accounting for their genetic differences will likely weaken the genetic progress for longevity. Moreover, the heritabilities estimated in this study suggest that direct genetic selection for longevity, regardless of the group of culling reasons used, will result in improved longevity of North American Angus. Thus, combined with management strategies, including longevity in the breeding goals has the potential to reduce involuntary culling in the herds, which can minimize financial losses.
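Under a RRM, the age-specific variances behind these heritabilities are obtained from the estimated (co)variance matrices of the random regression coefficients. The sketch below illustrates the calculation for one age; the covariate row and all (co)variance values are hypothetical, while the variance partition (additive genetic, permanent environmental, herd-year-season, residual) follows the effects described in this study.

```python
import numpy as np

def age_heritability(phi, K_a, K_pe, var_hys, var_e):
    """Heritability at one age from random regression output.

    phi    : Legendre covariate row for that age
    K_a    : (co)variance matrix of the additive genetic coefficients
    K_pe   : (co)variance matrix of the permanent environmental coefficients
    var_hys, var_e : herd-year-season and residual variances
    """
    var_a = phi @ K_a @ phi    # additive genetic variance at this age
    var_pe = phi @ K_pe @ phi  # permanent environmental variance
    return var_a / (var_a + var_pe + var_hys + var_e)

# hypothetical coefficient (co)variances (intercept, linear, quadratic)
K_a = np.array([[0.10, 0.02, 0.00],
                [0.02, 0.04, 0.01],
                [0.00, 0.01, 0.02]])
K_pe = np.array([[0.30, 0.05, 0.00],
                 [0.05, 0.10, 0.02],
                 [0.00, 0.02, 0.05]])
phi = np.array([0.71, -0.47, 0.12])  # covariate row for one age
h2 = age_heritability(phi, K_a, K_pe, var_hys=0.05, var_e=0.25)
print(round(h2, 3))
```

Evaluating the same expression over a grid of ages yields heritability trajectories such as those in Figure 3: larger additive coefficient (co)variances (as estimated for structural problems and disease) raise the curve, while larger permanent environmental (co)variances (fertility, performance, miscellaneous) lower it.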
The main goal of using FL instead of TL in the genetic evaluation for longevity is to increase the probability that cows will not only remain in the herd but also produce one calf per year, which ensures an economic return for beef cattle producers [
54,
55,
56]. In general, the average heritabilities estimated for FLb tended to be higher than those estimated for FLa and TL, indicating that greater genetic progress is expected through direct selection for FLb. These results differ from those reported by Morales et al. [
40], in which similar heritability estimates for length of true life (0.14) and length of productive life (0.14) were observed in the Retinta beef cattle breed. Even though there are similarities between the concepts used by the authors to define the traits and the TL and FLb definitions used in our study, the differences in the results are likely explained by the different statistical models used (Weibull proportional hazard vs. linear RRM).
Brzáková et al. [
11] compared the use of two longevity definitions (i.e., probability of cow reappearance in the next parity, and the number of calvings at ages of 6.5, 7.5, 12.5, and 13.3 years) for the genetic evaluation of a beef cattle population in the Czech Republic, using single- and multiple-trait linear models. Although the statistical models and the longevity definitions used in their study are not conceptually the same as in our study, the authors reported heritability estimates close to the ones found in our study when considering the group combining all culling reasons under the FLa definition (heritabilities ranging from 0.09 to 0.13). Similar heritabilities (ranging from 0.09 to 0.16) were also estimated for stayability (defined as the probability that a cow had calved at least three times before 6.3 years) in Brazilian Nellore cattle [
57]. On the other hand, higher heritability estimates (ranging from 0.18 to 0.25) were found for stayability (defined as stayability to calving, from 2 to 6 years) in Hereford cattle, using a traditional linear model [
14]. Using an approach based on RRM similar to the one presented in our study, Jamrozik et al. [
13] performed a genetic evaluation for stayability to consecutive calvings (defined similarly to the FLb definition used in our study) in Canadian Simmentals. The heritabilities estimated by the aforementioned authors were, in general, similar to the heritabilities estimated in our study for the culling group of natural death under the FLb definition (heritabilities estimated by them ranged from 0.13 to 0.35). Specifically for Aberdeen Angus, Roughsedge et al. [
10] estimated a heritability of 0.13 for lifespan. Lifespan was defined by the authors to reflect the parity that was expected to be reached using average survival probabilities from parity to parity in the population [
10]. For South African Angus cattle, heritabilities estimated using an animal threshold model ranged from 0.18 to 0.20 for stayability defined as the probability that a cow remained in the herd from 4 to 8 years of age [
55].
In general, most peaks of heritability observed for the different longevity definitions analyzed in our study occurred between 4 and 7 years, which indicates that performing selection within this interval can help to accelerate the genetic gain for longevity. This is in agreement with Brzáková et al. [
11], who reported that the highest heritabilities were estimated at 6.5 years. On the other hand, the highest heritability reported by Jamrozik et al. [
13] was estimated at 2 years. For the majority of groups of culling reasons and longevity definitions analyzed in our study, smaller heritabilities were obtained at 2 years and over 12 years compared to the intermediate age categories. These smaller heritabilities at the extremes are likely due to the lower genetic variability observed at these ages (
Figure S5,
Supplementary Material), which is a consequence of the quality control performed (i.e., all cows were required to have their first calving before 30 months of age), selection process, and reduced phenotypic variability (
Figure S3,
Supplementary Material). In this regard, the sudden increase in the heritability estimates observed at high ages specifically for the group of disease under the FLa definition (
Figure 3b), and the groups of structural problems and performance under the FLb definition (
Figure 3c) is likely due to a poor fit of the high-order polynomial at these points when using heterogeneous residual variance. Moreover, various studies in the literature have reported unstable heritability estimates at the extremes of the curve for several traits when using RRM based on high-order Legendre polynomials [
50,
58,
59].
Genetic correlations between ages. Cattle longevity traits have traditionally been evaluated at 6 years of age, which explicitly ignores records from cows that are not yet 6 years old or that are still alive beyond this age [
17]. In our study, a comprehensive range of ages was evaluated using RRM (i.e., 2 to 15 years of age), which enabled us to use all the available information to identify the best time periods to perform selection for increased longevity. In general, our findings showed that higher genetic correlations were observed between closer ages, and that the magnitude of the correlations decreased as the gap between ages increased (
Figure 4). These findings are as expected with a RRM and in agreement with the ones reported by Jamrozik et al. [
13], who studied stayability to consecutive calvings using RRM in Canadian Simmental cattle.
The low average genetic correlations estimated when considering all ages (i.e., from 2 to 15 years;
Table 4) are due to the negative genetic correlations found for ages greater than 12 years. These negative correlations are likely due to the reduced number of records in these age categories, a consequence of the selection process, combined with the instability of variance components at the extremes of the curve generated by the poor fit of high-order Legendre polynomials [
50,
58,
59]. Regardless of this issue, Sánchez-Castro et al. [
17] reported that the inclusion of older age records (e.g., 7 and 12 years) increases the EBV stability for stayability measured at the traditional 6 years old in Angus cows. In addition, the authors commented that accuracies (calculated according to the guidelines of the BIF [
60]) obtained using RRM including the additional ages were higher than accuracies obtained with the RRM that only used data up to 6 years of age. Similarly, Bohmanova et al. [
61] suggested that EBV accuracies increase when additional records are incorporated into the RRM. Thus, even though caution is advised when evaluating animals for longevity at ages greater than 12 years, the inclusion of these records can be beneficial for the genetic and genomic evaluations of North American Angus cattle.
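The pattern discussed above (high correlations between adjacent ages, decaying as the gap widens) follows directly from the covariance function implied by the RRM. A minimal sketch, with a hypothetical additive genetic coefficient (co)variance matrix and covariate rows:

```python
import numpy as np

def genetic_correlation(phi1, phi2, K_a):
    """Genetic correlation between two ages under a RRM:
    cov(a1, a2) / sqrt(var(a1) * var(a2))."""
    cov = phi1 @ K_a @ phi2
    return cov / np.sqrt((phi1 @ K_a @ phi1) * (phi2 @ K_a @ phi2))

# hypothetical additive genetic coefficient (co)variance matrix
K_a = np.array([[0.10, 0.02, 0.00],
                [0.02, 0.04, 0.01],
                [0.00, 0.01, 0.02]])
phi_4y = np.array([0.71, -0.65, 0.30])  # covariate row at ~4 years
phi_6y = np.array([0.71, -0.47, 0.12])  # covariate row at ~6 years
print(round(genetic_correlation(phi_4y, phi_6y, K_a), 3))
```

Covariate rows for distant ages point in increasingly different directions, so the implied correlation drops; with high-order polynomials and few records at the oldest ages, it can even turn negative, as observed beyond 12 years.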
The genetic correlations estimated for ages between 3 and 12 years (
Table 4) indicate that longevity up to 12 years of age can be well predicted using any time point within this interval. In this context, using EBVs predicted at early ages, such as 4 or 5 years, can help to shorten the generation interval for longevity, as well as accelerate genetic gain due to the higher heritabilities estimated at these ages (
Figure 3). Similar findings were also reported in other studies. For instance, Jamrozik et al. [
13] reported genetic correlations ranging from 0.74 to 0.99 between 3 and 8 years of age. Brzáková et al. [
11] found that productive longevity measured at 7.5 years of age is a good indicator of longevity measured at 13.3 years (genetic correlations reported by the authors were above 0.84). In addition, Venot [
62] reported high genetic correlations (0.95 for Charolais and 0.92 for Aubrac) for the number of calvings estimated between 6.5 and 12.5 years of age, while evaluating length of productive life in beef cows.
Finding measures of longevity taken earlier in life is paramount, as true longevity is not known until the end of a cow’s life [
11,
17]. However, it is important to highlight that the genetic correlations estimated between 2 years and older ages were, in general, substantially lower than the genetic correlations estimated between 3 or 4 years and the other ages (
Figure 4). These findings might be related to the fact that longevity at 2 years of age is likely more strongly associated with the analyzed culling reasons than longevity at older ages, as suggested by Cushman et al. [
42] and Damiran et al. [
32]. Therefore, it is advisable to avoid using EBVs predicted at 2 years of age to select animals when the goal is to improve longevity at older ages.
Genetic correlations between culling reasons. Even though culling reasons are currently reported by farmers for some dairy and beef cattle breeds, to the best of our knowledge, this information has not been used in genetic and genomic evaluations of longevity around the world [
63,
64]. The main reason for this is the complexity of accounting for multiple culling reasons in the statistical models [
13]. Nonetheless, identifying the culling reasons allows the recognition of the proportion of voluntary and involuntary culling in the herd [
9,
15], which can influence management decisions. In addition, identifying the impact of the culling reasons on the estimation of variance components can contribute to more accurate EBVs and accelerate genetic progress for longevity.
In general, the genetic correlations estimated between the different culling groups were low (
Table 5), indicating that longevity traits based on different culling reasons are genetically different. These low genetic correlations support the different heritabilities estimated for each culling reason (as previously discussed in the
Heritabilities topic). Furthermore, these findings suggest that combining all culling reasons can have a negative impact on the selection program, as the heritabilities estimated when considering all culling reasons together were low. However, genetically evaluating multiple longevity traits (e.g., one for each culling reason) can be challenging, as EBVs for various traits would be generated. Thus, one strategy would be to develop a selection sub-index [
65,
66] for longevity, where different weights would be applied to each longevity indicator. In this context, greater weights can be used for more prevalent culling reasons (i.e., culling reasons with higher economic impact), such as fertility.
The highest average genetic correlations were found between the groups of animals culled due to fertility and performance (
Table 5), which might be a consequence of the strong relationship between these groups of traits. For instance, a recent study performed by Pardo et al. [
67] showed a strong positive genetic correlation (0.98 ± 0.01) between fertility (i.e., age at first calving) and performance (i.e., pre-weaning average daily gain) in a beef cattle population composed of Angus, Hereford, and their crossbreds raised in Argentina. In addition, positive and favorable genetic correlations were estimated between the number of calves at 4.4 years of age and weight gain from weaning to yearling (0.42 ± 0.04) in Nellore cattle [
68]. Speculations regarding the genetic correlations estimated between the other pairs of culling reasons can be made, albeit to a lesser extent. For instance, the genetic correlation estimated between the groups of structural problems and disease might be related to the impact of inadequate claw set and foot angle on the incidence of lameness [
35]. However, in order to validate these findings and clarify the genetic relationship between longevity traits based on different culling reasons over time, single-step genome-wide association studies based on RRM [
69,
70] and multiple-trait analyses considering longevity and other recorded traits (such as heifer pregnancy, mature cow size, claw and foot angle) could be employed.
Longevity-related traits have been found to be genetically correlated to several other traits. For instance, Valente et al. [
71] reported negative genetic correlations ranging from −0.03 ± 0.11 (between stayability and flight speed) to −0.24 ± 0.16 (between stayability and crush score) in Nellore cattle. Stronger genetic correlations were found by Martínez-Velázquez et al. [
72] between scrotal circumference and stayability (0.76 ± 0.04) and between heifer fertility and stayability (0.57 ± 0.07) in a population composed of Charolais, Charbray, and Charolais–Zebu crosses. Costa et al. [
57] reported genetic correlations between age at first calving and stayability ranging from −0.23 to −0.51, depending on the statistical model used for the analysis (linear-threshold, penalty-threshold, modified penalty-threshold, and linear-threshold-threshold model) in Nellore cattle.
Genetic correlations between longevity definitions. Even though the definitions used to describe longevity and longevity-related traits are still not standardized in the literature, most studies in beef cattle seem to have preferred the term stayability [
12,
13,
14] to describe longevity. However, the definitions used to describe stayability are, sometimes, very similar to the definitions used in studies for other species, which have used the term longevity [
9,
48,
73]. Similar definitions for longevity were also found in the literature under the terms survival [
16,
47] and productive life [
30,
40,
62]. For simplicity, the term longevity was used in our study to describe both TL and FL. Thus, TL was compared to two definitions of FL, which either include or do not include missing records for cows without calving information at a specified age (FLb and FLa, respectively). The higher genetic correlations observed between TL and FLb, compared to those between TL and FLa and between FLa and FLb, suggest that the death information (represented by the code 0 in this study) has a higher impact than the calving information (code 1) on the genetic evaluation of longevity. In addition, the average genetic correlations estimated between the different longevity definitions in our study indicate that the definition used in the breeding program can impact selection decisions (
Table 6). The magnitude of this impact (i.e., the proportion of commonly selected sires) is discussed in detail in the “
Impact of longevity definition in the selection” topic.
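The practical difference between the three definitions can be illustrated with the yearly codes assigned to one hypothetical cow culled at 8 years that calved at 3, 4, and 6 years. The coding below is our reading of the definitions (TL is assumed to code survival; FLa codes 0 both after culling and for a missed calving; FLb leaves a missed calving of a live cow as missing) and is meant only as an illustration.

```python
# one hypothetical cow: culled at 8 years, calved at 3, 4, and 6 years
ages = list(range(2, 16))  # age categories from 2 to 15 years
calvings = {3, 4, 6}
culling_age = 8

# TL: 1 while the cow is alive, 0 afterwards (no calving information)
tl = [1 if age <= culling_age else 0 for age in ages]
# FLa: 1 = calved; 0 = no calf recorded OR cow already culled
fla = [1 if age in calvings else 0 for age in ages]
# FLb: 1 = calved; missing (None) = alive but no calf; 0 = culled
flb = [1 if age in calvings else (None if age <= culling_age else 0)
       for age in ages]

print(tl)
print(fla)
print(flb)
```

Under FLa, the code 0 conflates death with a missed calving, consistent with the worse fit discussed earlier; under FLb, the death information (0) is retained while uncertain calving records become missing.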
The average genetic correlations estimated between the different longevity definitions in our study corroborate the ones reported by Martinez et al. [
14], who evaluated stayability to six ages (from 1 to 6 years) and stayability to calving and weaning (both from the 2nd to the 6th) in Hereford cows. The authors reported moderate genetic correlations between stayability to six ages and the other definitions (from 0.51 to 0.57) and high genetic correlations between stayability to calving and stayability to weaning (0.86), suggesting possible re-ranking of sires depending on the trait definition. On the other hand, Morales et al. [
40] estimated high EBV correlations (above 0.96) for length of true life, length of productive life, and number of calvings in the Retinta breed. The authors recommended the use of number of calvings for subsequent genetic and genomic evaluations in the same population due to its higher heritability and easier data access compared to the other traits [
40]. Brzáková et al. [
11] compared the use of two longevity definitions (i.e., probability of cow reappearance in the next parity and the number of calvings at different ages) using a beef cattle population from the Czech Republic. Even though the authors did not evaluate the genetic correlation between the two longevity definitions, they concluded that evaluating the number of calvings (mainly at 7.5 and 13.3 years of age) is preferred to avoid keeping cows that do not produce one calf per year. Similar to our study, Brzáková et al. [
11] commented that there was enough additive genetic variance for all traits analyzed.
Comparing the average genetic correlations estimated for the longevity definitions between the different culling reasons, lower genetic correlations tended to be observed for the culling group of fertility (
Table 6). These lower correlations are likely related to the longevity definitions used, which differ mainly regarding the use of calving information. Furthermore, as expected, similar average genetic correlations were estimated for the other culling groups, considering either all ages (i.e., from 2 to 15 years) or ages within the interval of 3 to 12 years (
Table 6).
4.4. Impact of Longevity Definition in the Selection Schemes
In order to facilitate the comparison of the impact of different longevity definitions on the breeding program, the sires’ EBVs were expressed in terms of EDL (only for sires with a minimum of 5 daughters with longevity data). Using EDL to better understand the response to selection has been routine in genetic and genomic evaluations of several functional traits in both beef and dairy cattle [
29,
63]. A higher proportion of commonly-selected sires was observed between TL and FLb (
Table 7), which can be explained by the higher genetic correlation estimated between these longevity definitions (
Table 6). Likewise, the average EDL predicted for TL and FLb were more similar than EDL predicted for TL and FLa, and FLa and FLb (
Table 9,
Table 10 and
Table 11). However, larger standard deviations were found for the average EDL predicted using the FLb definition compared to the TL and FLa definitions, which is a consequence of the greater genetic variability found for this longevity definition for the majority of groups of culling reasons (
Figure S5,
Supplementary Material). The greater dispersion of EDL (as well as sires’ EBVs) using FLb is favorable for selection, as it can increase the genetic gain per time unit [
74]. Moreover, our findings suggest that different sires would be selected based on each longevity definition (
Table 7).
The average differences between the top and bottom sires calculated for all longevity definitions (
Table 9,
Table 10 and
Table 11) suggest that daughters sired by the top 1% of bulls are expected to remain in the herd about twice as long as daughters sired by the bottom 1%. Greater average differences between the top and bottom sires tended to be found for the culling group of performance (
Table 9,
Table 10 and
Table 11), which suggests that the ongoing selection for performance has impacted the longevity trait in North American Angus cattle. For instance, performance traits (e.g., accumulated productivity) have been found to be highly correlated (0.86 ± 0.03) with longevity-related traits in Nellore cattle [
75]. Thus, selecting animals for improved genetic performance might have also contributed to increasing the lifetime of Angus cows in the herd. However, it is important to highlight that multiple-trait analyses including longevity and performance data in Angus cattle are required to validate this hypothesis. The smallest average differences found for the group of natural death compared to the other culling groups might indicate that no effective direct selection has been performed for longevity in North American Angus cattle (
Table 9,
Table 10 and
Table 11). This finding might be related to the fact that the American Angus Association [
33] and the Canadian Angus Association [
34] currently do not perform genetic evaluations for longevity traits.
As expected, selecting animals at an older age increased the prediction accuracy (
Table 8, and
Figures S9 and S10 in the
Supplementary Material), which is likely related to the higher genetic correlation found between adjacent ages (
Table 4). However, the average improvement in prediction accuracy relative to the previous age was highest when selection was performed at 3 or 4 years of age. This suggests that selecting animals at 4 years of age might be more efficient (in terms of the correlation between expected and observed values) in the long term. However, as genetic parameters are usually population-specific, it is advisable to test the optimal age for longevity selection in other populations. The similar average prediction accuracies calculated for the daughters’ average culling age and the proportion of daughters alive at 6 years suggest that the decision to use one or the other EDL would rely exclusively on the ultimate breeding goal. Nonetheless, in most cases, the FLb definition tended to yield slightly higher accuracies than TL and FLa, which indicates that the inclusion of FLb in breeding programs might have advantages (e.g., higher genetic gain) over the others. Prediction accuracies estimated in our study ranged from low to moderate, as a consequence of the estimated heritabilities. These prediction accuracies can be further improved if adjusted for the accuracy of the EBV, or if genomic information is included in the analysis [
76,
77]. In this context, Ramos et al. [
12] suggested that genomic information is always required to ensure high accuracies for longevity-related traits at early ages, because these traits are usually characterized as late-measured and sex-restricted traits.