On minimizing distortion and relative entropy

M. P. Friedlander, M. R. Gupta
IEEE Transactions on Information Theory, 52(1):238–245, 2006



A common approach to estimating a probability mass function \(w\), given a prior \(q\) and moment constraints \(Aw\le b\), is to minimize the relative entropy between \(w\) and \(q\) subject to the linear constraints. In that case, the solution \(w\) is known to have exponential form. We consider the case in which the linear constraints are noisy, uncertain, infeasible, or otherwise “soft.” A solution can then be obtained by minimizing both the relative entropy and the violation of the constraints \(Aw\le b\); a penalty parameter \(\sigma\) weights the relative importance of these two objectives. We show that this penalty formulation also yields a solution \(w\) with exponential form. If the distortion is based on an \(\ell_p\) norm, then the exponential form of \(w\) is shown to have exponential-decay parameters that are bounded as a function of \(\sigma\). We also state conditions under which the solution \(w\) to the penalty formulation achieves zero distortion, so that the moment constraints hold exactly. These properties are useful for choosing penalty parameters, evaluating the impact of chosen penalty parameters, and proving properties of methods that use such penalty formulations.
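The penalty formulation described above can be sketched numerically. The snippet below is a toy illustration only, not the authors' method: it minimizes the relative entropy \(D(w\|q)\) plus \(\sigma\) times an \(\ell_1\) distortion on the violation of \(Aw\le b\), over the probability simplex, using a general-purpose solver. The matrix `A`, vector `b`, and penalty weight `sigma` are arbitrary illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5
q = np.full(n, 1.0 / n)           # uniform prior pmf
A = rng.standard_normal((2, n))   # toy moment-constraint matrix (illustrative)
b = A @ q - 0.1                   # deliberately violated at w = q, so constraints are "soft"
sigma = 10.0                      # penalty weight on distortion (illustrative)

def objective(w):
    # relative entropy D(w || q) plus sigma * l1-norm of the constraint violation
    kl = np.sum(w * np.log(w / q))
    violation = np.maximum(A @ w - b, 0.0)
    return kl + sigma * np.sum(violation)

# w must remain a pmf: nonnegative entries summing to one
cons = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
bounds = [(1e-9, 1.0)] * n

res = minimize(objective, q, bounds=bounds, constraints=cons)
w = res.x  # penalized solution; trades off entropy against distortion
```

Increasing `sigma` drives the violation toward zero (recovering the hard-constrained problem when it is feasible), while small `sigma` keeps `w` close to the prior `q`.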


@article{FriedlanderGupta2006,
  Author = {M. P. Friedlander and M. R. Gupta},
  Journal = {IEEE Transactions on Information Theory},
  Number = 1,
  Pages = {238--245},
  Title = {On minimizing distortion and relative entropy},
  Volume = 52,
  Year = 2006,
  Doi = {10.1109/TIT.2005.860448}
}