Mapping Multi-Modal Brain Connectome for Brain Disorder Diagnosis via Cross-Modal Mutual Learning

The study of the multi-modal brain connectome has recently seen tremendous growth and has facilitated the diagnosis of brain disorders. In this paradigm, functional and structural networks, e.g., functional and structural connectivity derived from fMRI and DTI, interact with each other but are not necessarily linearly related. Accordingly, leveraging their complementary information for brain connectome analysis remains a great challenge. Graph Neural Networks (GNNs) have been widely applied to the fusion of multi-modal brain connectomes; however, most existing GNN methods fail to couple inter-modal relationships. In this regard, we propose a Cross-modal Graph Neural Network (Cross-GNN) that captures inter-modal dependencies through dynamic graph learning and mutual learning. Specifically, inter-modal representations are attentively coupled into a compositional space for reasoning about inter-modal dependencies. Additionally, we investigate mutual learning in explicit and implicit ways: (1) cross-modal representations are obtained explicitly by cross-embedding based on the inter-modal correspondence matrix, and (2) we propose a cross-modal distillation method that implicitly regularizes latent representations with cross-modal semantic contexts. We carry out statistical analysis on the attentively learned correspondence matrices to evaluate inter-modal relationships for associating disease biomarkers. Our extensive experiments on three datasets demonstrate the superiority of the proposed method.
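
To make the two mutual-learning ideas concrete, below is a minimal PyTorch sketch, not the authors' implementation: it assumes each modality's GNN backbone has already produced node embeddings of shape (num_nodes, dim) for the functional and structural connectomes, derives a correspondence matrix from a scaled dot-product affinity, uses it for explicit cross-embedding, and adds a symmetric KL-based distillation term as the implicit regularizer (the paper's exact formulations may differ).

```python
# Hedged sketch of cross-embedding and cross-modal distillation.
# Assumed inputs: h_f, h_s are (N, D) node embeddings from the
# functional (fMRI) and structural (DTI) GNN branches, respectively.
import torch
import torch.nn.functional as F

def cross_embed(h_f, h_s, temperature=1.0):
    """Explicit mutual learning via an inter-modal correspondence matrix.

    Returns cross-modal embeddings plus the (N, N) matrix C, whose entry
    C[i, j] scores how strongly functional node i attends to structural
    node j (a plausible attention form; the paper's may differ).
    """
    # Correspondence matrix from scaled dot-product affinity.
    scale = temperature * h_f.size(-1) ** 0.5
    C = torch.softmax(h_f @ h_s.t() / scale, dim=-1)
    # Cross-embedding: each modality is re-expressed through the other.
    h_f_cross = C @ h_s        # functional nodes attend to structural ones
    h_s_cross = C.t() @ h_f    # and vice versa
    return h_f_cross, h_s_cross, C

def distill_loss(h_f, h_s, tau=2.0):
    """Implicit mutual learning: a symmetric KL divergence that nudges each
    modality's latent distribution toward the other's (soft-label style)."""
    loss_fs = F.kl_div(F.log_softmax(h_f / tau, dim=-1),
                       F.softmax(h_s / tau, dim=-1),
                       reduction="batchmean")
    loss_sf = F.kl_div(F.log_softmax(h_s / tau, dim=-1),
                       F.softmax(h_f / tau, dim=-1),
                       reduction="batchmean")
    return 0.5 * (loss_fs + loss_sf)

# Usage with random stand-in embeddings (e.g., 90 ROIs, 64-d features):
h_f, h_s = torch.randn(90, 64), torch.randn(90, 64)
hf_x, hs_x, C = cross_embed(h_f, h_s)
loss = distill_loss(hf_x, hs_x)
```

The learned matrix C is also what the statistical analysis in the abstract would operate on, since its entries expose which inter-modal node pairings the model relies on.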
Source: IEEE Transactions on Medical Imaging - Category: Biomedical Engineering Source Type: research