Directed Information and Its Biomedical Applications
Abstract
Information theory, founded by C. E. Shannon, rests on entropic measures that quantify the abstract concept of information carried by random variables. When the interest lies in the relationships among multiple random processes, directed information, introduced by Massey in 1990, extends these entropic measures to capture the direction of information flow among the processes, thereby enabling causality inference.
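For concreteness, Massey's directed information from a process $X^n = (X_1, \dots, X_n)$ to a process $Y^n = (Y_1, \dots, Y_n)$ can be written as a sum of conditional mutual information terms:

$$
I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),
$$

where, unlike the symmetric mutual information $I(X^n; Y^n)$, each term conditions only on the past of $Y$ and the past and present of $X$, making the quantity directional.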
Theoretically, directed information is defined on the joint or conditional probability distributions of two random processes. In practice, with observed time-series data, estimating directed information depends on accurate estimates of the joint probability distributions of the processes. We explore existing directed information estimators, generally based on the Context Tree Weighting method and the asymptotic equipartition property, to infer directed interactive relationships among the components of neural activity modeled mathematically as Poisson processes. The outcome validates causal relationships among a set of neurons. Although challenges remain, particularly finding a computationally efficient way to distinguish indirect relationships from direct interactions, directed information estimation can effectively uncover causality within a network.
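To illustrate the plug-in estimation idea (a deliberately simplified sketch, not the thesis's CTW-based estimator), the snippet below estimates a directed-information rate from discrete time series under an order-1 Markov approximation, i.e., the conditional mutual information $I(X_t; Y_t \mid Y_{t-1})$ computed from empirical counts. The function name `directed_info_rate` is hypothetical:

```python
from collections import Counter
from math import log2

def directed_info_rate(x, y):
    """Plug-in estimate (in bits) of the directed information rate from x
    to y, approximated under an order-1 Markov assumption by the
    conditional mutual information I(X_t; Y_t | Y_{t-1}) evaluated on
    empirical distributions."""
    n = len(x)
    assert len(y) == n and n >= 2
    # Empirical counts over the triples (y_{t-1}, x_t, y_t) and marginals.
    cxyz = Counter((y[t - 1], x[t], y[t]) for t in range(1, n))
    cz = Counter(y[t - 1] for t in range(1, n))
    cxz = Counter((y[t - 1], x[t]) for t in range(1, n))
    cyz = Counter((y[t - 1], y[t]) for t in range(1, n))
    total = n - 1
    cmi = 0.0
    for (z, a, b), c in cxyz.items():
        # I = sum p(z,a,b) log[ p(z,a,b) p(z) / (p(z,a) p(z,b)) ];
        # the 1/total factors cancel inside the logarithm.
        cmi += (c / total) * log2(c * cz[z] / (cxz[(z, a)] * cyz[(z, b)]))
    return cmi
```

When `y` simply copies `x`, the estimate approaches the entropy rate of `x` (one bit for fair coin flips), whereas for two independent sequences it stays near zero; this sign of asymmetry is what makes directed information usable for inferring the direction of influence. A real estimator must also correct for the positive bias of plug-in estimates and handle longer contexts, which is where CTW-based methods come in.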
Citation
Wan, Qing (2016). Directed Information and Its Biomedical Applications. Master's thesis, Texas A&M University. Available electronically from https://hdl.handle.net/1969.1/156978.