Spatiotemporal integration of molecular and anatomical data in virtual reality using semantic mapping

Jung Soh1, Andrei L Turinsky1, Quang M Trinh1, Jasmine Chang1, Ajay Sabhaney1, et al

1Sun Center of Excellence for Visual Genomics, University of Calgary, Calgary, AB, Canada; 2Department of Biological Sciences, University of Alberta, Edmonton, AB, Canada

Abstract

We have developed a computational framework for the spatiotemporal integration of molecular and anatomical datasets in a virtual reality environment. Using two case studies, involving gene expression data and pharmacokinetic data respectively, we demonstrate how existing knowledge bases for molecular data can be semantically mapped onto a standardized anatomical context of the human body. Our data mapping methodology uses ontological representations of heterogeneous biomedical datasets and an ontology reasoner to create complex semantic descriptions of biomedical processes. This framework provides a means to systematically combine an increasing amount of biomedical imaging and numerical data into spatiotemporally coherent graphical representations. Our work enables medical researchers with different expertise to simulate complex phenomena visually and to develop insights through the use of shared data, thus paving the way for pathological inference, developmental pattern discovery, and biomedical hypothesis testing.

Keywords: anatomical atlas, gene expression, pharmacokinetics, biomedical data integration, CAVEman, virtual reality in medicine