Asymptotic rates of the information transfer ratio
Keywords: information processing; parallel systems
Information processing occurs when a system preserves the aspects of its input related to what the input represents while removing other aspects. To describe a system's information-processing capability, input and output must be compared in a way that is invariant to how the signals represent information. The Kullback-Leibler distance, an information-theoretic measure that obeys the data processing theorem, is calculated on the input and on the output separately, and the two distances are compared to obtain the information transfer ratio. We consider the special case in which the input serves several parallel systems and show that this configuration can represent the input information without loss. We also derive bounds on the asymptotic rates at which the loss decreases as more parallel systems are added and show that the rate depends on the input distribution.
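The quantity described above can be illustrated numerically. The sketch below is a hypothetical example, not taken from the paper: it models each parallel system as a binary symmetric channel `W` (crossover probability `eps` chosen arbitrarily), feeds a common input to `n` independent copies, and computes the information transfer ratio as the KL distance between the two joint output distributions divided by the KL distance between the two input distributions. The data processing theorem guarantees the ratio is at most 1, and adding parallel copies should drive it toward 1, consistent with the lossless-representation claim.

```python
import itertools
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p || q) in nats, skipping zero-mass terms."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical setup: two input distributions on a binary alphabet and a
# binary symmetric channel W[x, y] standing in for one of the parallel systems.
p_in = np.array([0.7, 0.3])
q_in = np.array([0.4, 0.6])
eps = 0.1
W = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

def output_dist(p, n):
    """Joint distribution of (Y_1, ..., Y_n) when X ~ p drives n parallel
    copies of W; the Y_i are conditionally independent given X."""
    probs = []
    for ys in itertools.product([0, 1], repeat=n):
        probs.append(sum(p[x] * np.prod([W[x, y] for y in ys])
                         for x in (0, 1)))
    return np.array(probs)

d_in = kl(p_in, q_in)
ratios = []
for n in (1, 2, 4, 8):
    d_out = kl(output_dist(p_in, n), output_dist(q_in, n))
    ratios.append(d_out / d_in)
    print(f"n = {n}: information transfer ratio = {d_out / d_in:.4f}")
```

The printed ratios are nondecreasing in `n` and bounded by 1: each added copy contributes an extra conditionally independent observation of the input, so marginalizing it away can only shrink the output KL distance. The asymptotic rate at which `1 - ratio` shrinks is what the paper bounds.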