The Open Signal Processing Journal

2011, 4 : 1-18
Published online 2011 April 19. DOI: 10.2174/1876825301104010001
Publisher ID: TOSIGPJ-4-1

Constrained Signals: A General Theory of Information Content and Detection

Mark M. Stecker
Joan C. Edwards Marshall University School of Medicine, 1600 Medical Center Blvd, Suite G500, Huntington WV, 25705, USA.

ABSTRACT

In this paper, a general theory of signals characterized by probabilistic constraints is developed. As in previous work [10], the theoretical development employs Lagrange multipliers to implement the constraints and the maximum entropy principle to generate the most likely probability distribution function consistent with the constraints. The method of computing the probability distribution functions is similar to that used in computing partition functions in statistical mechanics. Simple cases in which exact analytic solutions for the maximum entropy distribution functions and the entropy exist are studied and their implications discussed. The application of this technique to the problem of signal detection is explored both theoretically and with simulations. It is demonstrated that the method can readily classify signals governed by different constraint distributions as long as the mean values of the constraints for the two distributions differ. Classifying signals governed by constraint distributions that differ in shape but not in mean value is much more difficult. Some solutions to this problem and extensions of the method are discussed.
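To make the approach described above concrete, the following is a minimal numerical sketch (not the paper's own code) of the maximum entropy principle with a single Lagrange multiplier: over a discrete set of states, the most likely distribution consistent with a mean-value constraint has the Gibbs form p_i = exp(-λ x_i)/Z, where Z is the partition function, and λ is tuned so that the constraint is satisfied. The function name, the bisection search, and the bracket limits are illustrative choices, not part of the paper.

```python
import numpy as np

def max_entropy_dist(x, target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution over states x subject to <x> = target_mean.

    The solution has the statistical-mechanics form p_i = exp(-lam*x_i)/Z,
    with Z the partition function; lam is the Lagrange multiplier enforcing
    the constraint, found here by bisection (the constrained mean is a
    monotone decreasing function of lam).
    """
    x = np.asarray(x, dtype=float)

    def mean_at(lam):
        w = np.exp(-lam * (x - x.min()))  # shift exponent for stability
        return (w * x).sum() / w.sum()

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid  # mean too high -> need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (x - x.min()))
    p = w / w.sum()
    entropy = -(p * np.log(p)).sum()
    return p, lam, entropy
```

For example, over the four states {0, 1, 2, 3} an unconstrained mean of 1.5 recovers the uniform distribution (λ = 0, entropy ln 4), while any other target mean yields a tilted, lower-entropy distribution.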

Keywords:

Entropy, signal detection, statistical mechanics, receiver operating characteristics.