Comparison of Bone Quality Among Winter Endurance Athletes with and Without Risk Factors for Relative Energy Deficiency in Sport (REDs): A Cross-Sectional Study

The objective of this study was to investigate differences in bone quality in winter endurance athletes classified as either low-risk or at-risk for REDs. Forty-four participants were recruited (M = 18; F = 26). Bone quality was assessed at the distal radius and tibia using high-resolution peripheral quantitative computed tomography (HR-pQCT), and at the hip and spine using dual-energy X-ray absorptiometry (DXA). Finite element analysis was used to estimate bone strength. Participants were grouped using modified criteria from the REDs Clinical Assessment Tool Version 1. Fourteen participants (M = 3; F = 11) were classified as at-risk for REDs (≥ 3 risk factors). Measured with HR-pQCT, cortical bone area (radius) and bone strength (radius and tibia) were 6.8%, 13.1%, and 10.3% lower, respectively (p = 0.025, p = 0.033, p = 0.027), in at-risk compared with low-risk participants. Using DXA, femoral neck areal bone density was 9.4% lower in at-risk compared with low-risk participants (p = 0.005). At-risk male participants had 21.9% lower femoral neck areal bone density (via DXA) than low-risk males (p = 0.020), with no significant differences observed in females. Overall, 33.3% of athletes were at-risk for REDs and had lower bone quality than low-risk athletes.
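As a minimal sketch only (not from the paper), the grouping rule and the reported between-group percent differences can be illustrated as follows; the threshold of ≥ 3 risk factors reflects the study's modified REDs CAT V1 criterion, while all variable names and data values are hypothetical.

    # Illustrative sketch: REDs risk classification and percent difference between group means.
    # Assumes a simple count of REDs risk factors per athlete; data are invented for demonstration.
    from statistics import mean

    RISK_THRESHOLD = 3  # >= 3 risk factors classified athletes as at-risk in this study

    def classify(risk_factor_count: int) -> str:
        """Return 'at-risk' when an athlete has >= 3 REDs risk factors, else 'low-risk'."""
        return "at-risk" if risk_factor_count >= RISK_THRESHOLD else "low-risk"

    def percent_lower(at_risk_values: list[float], low_risk_values: list[float]) -> float:
        """Percent by which the at-risk group mean is lower than the low-risk group mean."""
        low_mean = mean(low_risk_values)
        return (low_mean - mean(at_risk_values)) / low_mean * 100

    # Hypothetical femoral neck areal BMD values (g/cm^2), for demonstration only.
    low_risk_abmd = [1.02, 0.98, 1.05, 1.00]
    at_risk_abmd = [0.91, 0.89, 0.94]

    print(classify(4))                                            # 'at-risk'
    print(round(percent_lower(at_risk_abmd, low_risk_abmd), 1))  # % lower in the at-risk group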
Source: Calcified Tissue International