References

Textbooks and expository articles on minimax theory


Lecture 3, Wed Jan 25
In the 1990s, Donoho and Johnstone made remarkable contributions to non-parametric function estimation under Gaussian noise. Their techniques for deriving minimax lower bounds are different from the one covered in class and are more classical, but certainly important. For a comprehensive treatment, see the draft of the book
For a textbook treatment of the classic minimax theory, see

Lecture 4, Mon Jan 30
For a statement and proof of Fano's lemma (a standard form of the inequality is recalled below), see page 39 of
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 1999.
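As a reminder (this follows the statement in Cover and Thomas; the notation here is ours, and logarithms are natural), Fano's inequality bounds the error probability $P_e = \mathbb{P}(\hat{X} \neq X)$ of any estimator $\hat{X}$ of a random variable $X$ taking values in a finite set $\mathcal{X}$, given an observation $Y$:
\[
H(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr) \;\ge\; H(X \mid Y),
\]
where $H(P_e)$ is the binary entropy of $P_e$. When $X$ is uniform over $M$ hypotheses, this yields the corollary commonly used for minimax lower bounds:
\[
\mathbb{P}(\hat{X} \neq X) \;\ge\; 1 - \frac{I(X;Y) + \log 2}{\log M}.
\]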

Lecture 6, Mon Feb 6
For more on mutual information and differential entropy of Gaussian variates (the two basic formulas are recalled after the references below), see
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 1999.
For lower bounds on the model selection problem in sparse linear regression, see
  • M. J. Wainwright. Information-theoretic bounds on sparsity recovery in the high-dimensional and noisy setting. IEEE Trans. Info. Theory, 55:5728–5741, 2009.
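For convenience, the two standard Gaussian formulas (stated in nats; see Cover and Thomas for proofs) are the following. For $X \sim \mathcal{N}(\mu, \sigma^2)$, the differential entropy is
\[
h(X) = \frac{1}{2}\log\bigl(2\pi e \sigma^2\bigr),
\]
and for jointly Gaussian $(X, Y)$ with correlation coefficient $\rho$, the mutual information is
\[
I(X;Y) = -\frac{1}{2}\log\bigl(1 - \rho^2\bigr).
\]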

Lecture 9, Mon Feb 20
The sparse Varshamov-Gilbert Lemma (a representative statement is recalled below) and the examples covered in class were taken from the reference
  • G. Raskutti, M. J. Wainwright and B. Yu. Minimax rates of estimation for high-dimensional linear regression over $\ell_q$-balls. IEEE Trans. Info. Theory, 57(10):6976–6994, 2011.
Other relevant references for the problem of deriving minimax lower bounds for sparse high-dimensional regression are
For other versions of the sparse Varshamov-Gilbert Lemma, see
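Here is one representative form of the lemma; the exact constants and the admissible range of $s$ vary across references, so this should be read as a sketch rather than the precise version used in class. For integers $s \le d/2$ (say), there exist $\omega^{(1)}, \dots, \omega^{(M)} \in \{0,1\}^d$, each with at most $s$ nonzero coordinates, such that the pairwise Hamming distances satisfy
\[
\rho\bigl(\omega^{(i)}, \omega^{(j)}\bigr) \;\ge\; \frac{s}{2} \quad \text{for all } i \neq j,
\qquad\text{and}\qquad
\log M \;\ge\; c\, s \log\Bigl(1 + \frac{d}{s}\Bigr)
\]
for a numerical constant $c > 0$. Combined with Fano's lemma, this packing set is what produces the $s\log(d/s)$ factor in minimax lower bounds for sparse linear regression.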