We present a family of relatively simple and unified lower bounds on the capacity of the Gaussian channel under a set of pointwise additive input constraints. Specifically, the admissible channel input vectors $\bx = (x_1, \ldots, x_n)$ must satisfy $k$ additive cost constraints of the form $\sum_{i=1}^n \phi_j(x_i) \le n \Gamma_j$, $j = 1,2,\ldots,k$, enforced pointwise for every $\bx$ rather than merely in expectation. More generally, we also consider cost functions that depend on a sliding window of fixed length $m$, namely, $\sum_{i=m}^n \phi_j(x_i, x_{i-1}, \ldots, x_{i-m+1}) \le n \Gamma_j$, $j = 1,2,\ldots,k$, a formulation that naturally accommodates correlation constraints as well as a broad range of other constraints of practical relevance. We propose two classes of lower bounds, derived via two methodologies, both of which rely on the exact evaluation of the volume exponent associated with the set of input vectors satisfying the given constraints. This evaluation exploits extensions of the method of types to continuous alphabets, the saddle-point method of integration, and basic tools from large deviations theory. The first class of bounds is obtained via the entropy power inequality (EPI) and therefore applies exclusively to continuous-valued inputs. The second class, by contrast, is more general and applies to discrete input alphabets as well; it is based on a direct manipulation of the mutual information and yields tighter bounds, albeit at the cost of greater technical complexity. Numerical examples illustrating both types of bounds are provided, and several extensions and refinements are discussed.
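As a concrete sanity check of the volume-exponent methodology (a standard illustrative instance under assumed notation, not a result quoted from the body of the paper), specialize to a single constraint, $k = 1$, with the classical power cost $\phi(x) = x^2$ and $\Gamma_1 = P$. The admissible set is then the $n$-ball of radius $\sqrt{nP}$, whose volume exponent follows from Stirling's formula,
% illustrative special case; Gamma(.) below denotes the Euler gamma function
\begin{align*}
\frac{1}{n} \ln \mathrm{Vol}\Big\{ \bx : \sum_{i=1}^n x_i^2 \le nP \Big\}
= \frac{1}{n} \ln \frac{\pi^{n/2} (nP)^{n/2}}{\Gamma(n/2+1)}
\;\xrightarrow[n\to\infty]{}\; \frac{1}{2} \ln(2\pi e P).
\end{align*}
Taking $X^n$ uniform on this ball, so that $h(X^n)$ equals the logarithm of this volume, and applying the EPI to $Y^n = X^n + Z^n$ with $Z^n \sim \mathcal{N}(0, \sigma^2 I_n)$ independent of $X^n$, yields
\begin{align*}
\frac{1}{n} I(X^n; Y^n) = \frac{1}{n}\big[ h(Y^n) - h(Z^n) \big]
\ge \frac{1}{2} \ln \frac{e^{2h(X^n)/n} + 2\pi e \sigma^2}{2\pi e \sigma^2}
\;\xrightarrow[n\to\infty]{}\; \frac{1}{2} \ln\Big( 1 + \frac{P}{\sigma^2} \Big),
\end{align*}
which coincides with the Gaussian channel capacity; in this classical special case, the EPI-based lower bound is therefore tight.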