## Info

classes is not equal to the number of underlying signals, inefficiencies are likely to occur. However, in biology it is not always possible to optimize this, especially if some of the signals are unknown. The mathematical objects under the control of the biologist are Y, P, R, and D. Since the rest of this book is mostly about Y, and in most cases y, a single biomarker, this chapter is mostly about P, R, and D. In other words, the biologist can optimize the q's only when D is specified. The one exception is when the experiment can be designed so that the π's are known and individuals can be selected to make them equally likely.

A few calculations from information theory are presented here. For applications below, S₁ will represent the "normal" state and D₁ will represent the "normal" classification. The others will represent "abnormal" or pathological states and classifications. Pharmacological or other therapeutic effects will be considered as abnormal states except when a cure results (i.e., the subject reverts back to the normal state).

The idea of entropy comes from thermodynamics, and it has been shown that thermodynamic entropy and Shannon entropy (information) are related. There are three primary types of Shannon entropy (average uncertainty): the entropy of the source,

$$H(S) = -\sum_{k} p_k \log_2 p_k,$$

the entropy in the receiver,

$$H(D) = -\sum_{i} r_i \log_2 r_i,$$

and the communication system entropy,

$$H(S,D) = -\sum_{k} \sum_{i} p_k q_{ki} \log_2 (p_k q_{ki}).$$

The base 2 logarithm is used here because the units of information are in bits (binary digits). With an ideal, noise-free communication channel, H(S) = H(D) = H(S,D). This means that q_{ii} = 1 and q_{ij} = 0 for all i ≠ j.
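The three entropies and the noise-free identity H(S) = H(D) = H(S,D) can be sketched numerically. The probability values below are made up purely for illustration; the function name is ours, not from the source:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source probabilities p_k for three signals.
p = [0.7, 0.2, 0.1]

# Ideal, noise-free channel: each signal S_k is always classified as D_k,
# i.e., the conditional classification probabilities are q_kk = 1, q_ki = 0.
q = [[1.0 if i == k else 0.0 for i in range(3)] for k in range(3)]

# Receiver probabilities r_i = sum_k p_k * q_ki.
r = [sum(p[k] * q[k][i] for k in range(3)) for i in range(3)]

H_S = shannon_entropy(p)                                  # source entropy
H_D = shannon_entropy(r)                                  # receiver entropy
H_SD = shannon_entropy(p[k] * q[k][i]                     # system entropy
                       for k in range(3) for i in range(3))

print(H_S, H_D, H_SD)  # all three agree when the channel is noise-free
```

Introducing any misclassification (q_ki > 0 for i ≠ k) breaks the equality: H(S,D) then exceeds H(S).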

TABLE 2 Biological Channel Capacity (Bits) by the Number of Possible Signals Transmitted and the Probability of an Abnormal Signal Being Sent

Probability of an Abnormal Signal (1 − π₁)


Signals (k)
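For the source layout assumed in Table 2 (one normal signal with probability π₁ and k − 1 equally likely abnormal signals over a noiseless channel, so that capacity equals the source entropy), the entropy reduces to H = −π₁ log₂ π₁ − (1 − π₁) log₂[(1 − π₁)/(k − 1)]. A sketch of that reduction, with the function name as an assumption:

```python
import math

def source_entropy(k, pi1):
    """Entropy (bits) of a source with k signals, where the normal signal S1
    has probability pi1 and the k - 1 abnormal signals are equally likely.
    Requires k >= 2 and 0 < pi1 < 1."""
    p_abn = (1 - pi1) / (k - 1)          # probability of each abnormal signal
    return -pi1 * math.log2(pi1) - (k - 1) * p_abn * math.log2(p_abn)

# Two equally likely signals carry exactly 1 bit.
print(source_entropy(2, 0.5))

# Adding more possible signals at the same abnormal probability raises the entropy.
print(source_entropy(4, 0.5))
```

With π₁ = 1/k all signals are equally likely and the entropy reaches its maximum of log₂ k bits, which is the pattern Table 2 tabulates across k and (1 − π₁).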