Course 048703, Spring 2008: Noise Removal - An Information Theoretic View
• Lectures:
Here are the dates for the 12 lectures (all taking place 16:30-18:30), including the class locations:
– Lecture 1: Monday, May 19 (Slides of introductory lecture)
– Lecture 2: Sunday, May 25, room 354 (scribing by Oren Zeitlin)
– Lecture 3: Monday, May 26, room 351
– Lecture 4: Sunday, June 1, *room 1007*
– Lecture 5: Monday, June 2, room 351
– Lecture 6: *Tuesday*, June 10, room 354 (note that Sunday and Monday of that week are vacation days due to Shavuot, and that Tuesday will follow the Sunday schedule)
– Lecture 7: Sunday, June 15, room 354 (scribing by Renata Goldman)
– Lecture 8: Monday, June 16, room 351
– Lecture 9: Sunday, June 22, room 354
– Lecture 10: Monday, June 23, room 351
– Lecture 11: Sunday, June 29, room 354
– Lecture 12: Monday, June 30, room 351
• Staff details:
– Lecturer: Tsachy Weissman, email: tsachy@ee.technion.ac.il, phone: 4732
Office hours: Thursdays 13:30-14:30 and 15:30-16:30, or by appointment
– T.A.: We might get one eventually
• References:
– Material will be drawn primarily from papers in http://www.stanford.edu/~slansel/tutorial/publications.htm, with particular focus on papers in http://www.stanford.edu/~slansel/tutorial/DUDE.htm
– Background reading may occasionally be recommended from the following classical sources:
o T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 1991.
o I. Csiszar and J. Korner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Akademiai Kiado, 1981.
o R. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
o T. Berger, Rate Distortion Theory, Prentice-Hall, 1971.
o R. M. Gray, Source Coding Theory, Kluwer, 1990.
o R. M. Gray, Entropy and Information Theory, Springer-Verlag, 1990. (Available online.)
o R. M. Gray, Probability, Random Processes, and Ergodic Properties, Springer-Verlag, 1988.
o L. Devroye, L. Gyorfi and G. Lugosi, A Probabilistic Theory of Pattern Recognition, Springer, 1996.
o N. Cesa-Bianchi and G. Lugosi, Prediction, Learning, and Games, Cambridge University Press, 2006.
o L. Devroye, A Course in Density Estimation, Birkhauser, 1987.
o L. Devroye and G. Lugosi, Combinatorial Methods in Density Estimation, Springer, 2001.
o A. Dembo and O. Zeitouni, Large Deviations Techniques and Applications, Springer, 1998.
o L. Breiman, Probability, SIAM, 1992.
o R. Durrett, Probability: Theory and Examples, Duxbury Press, 1996.
o K. Petersen, Ergodic Theory, Cambridge University Press, 1983.
• Course requirements:
50% Lecture report: Each enrolled student will be assigned a 2-hour lecture to scribe. The write-up should be an expansion rather than a summary of the lecture, elaborating where needed on issues that were merely skimmed over, filling in mathematical details that were skipped, etc.
1. The report should be prepared in English, in LaTeX.
2. The report will be due 2 weeks after the lecture.
3. I will give you some feedback on the report 2 weeks after you submit it.
4. The final version of the report will be due 2 weeks after you receive my feedback.
5. The final grade on the report will be the average of your grades on the versions submitted in steps 2 and 4.
50% Exercise sheet: As we go through the material during the lectures, we will pose problems, questions, and exercises every once in a while. These will be collected into an exercise sheet that will be handed out before the end of the semester. You will be required to choose a subset of the questions (the size of which will be specified), to be completed independently (no teamwork) and handed in by a date that will be specified, some time after the end of the semester.
• Course Description:
– Introduction and overview [1 lecture]
– Discrete Denoising [3 lectures]:
o Fundamental performance limits
o Optimal but non-universal schemes:
§ Bayes-optimal schemes
§ Example: forward-backward recursions for noise-corrupted Markov processes
o Universal denoising:
§ Discrete Universal DEnoiser (DUDE): construction and performance analysis (an illustrative code sketch appears at the end of this page)
§ Variants of DUDE
§ Performance boosts for the non-asymptotic regime
§ Theory and algorithms for non-stationary data
– Compression-based denoising [2.5 lectures]:
o Lossy compression preliminaries:
§ Rate distortion theory for ergodic processes
§ Shannon lower bound
§ Empirical distribution of rate distortion codes
§ Universal lossy source coding:
1. Yang-Kieffer codes
2. Lossy compression via Markov chain Monte Carlo
o Indirect rate distortion theory
o Universal denoising via lossy compression
– Denoising of analogue data (via approaches developed in the discrete case) [2 lectures]:
o (A bit about) wavelet thresholding techniques (pros and cons)
o Universal denoising:
§ Kernel density estimation-based techniques
§ Context quantization-based techniques
– Sequential decision making in the presence of noise [2.5 lectures]:
o Noise-free case:
§ Prediction with expert advice
§ Lossless coding preliminaries
§ Prediction via compression
§ Ziv-Lempel (LZ) predictor
o Noisy prediction
o Filtering (sequential estimation):
§ Compound sequential decision problem
§ Filtering via prediction
§ LZ filtering
– Applications [remaining time]:
o Image denoising
o Data compression and communications:
§ Lossy source coding with decoder side information (the Wyner-Ziv problem)
§ Decoding of systematically encoded redundant data
[time permitting; probably not this short semester:]
Suggestions and/or requests for specific topics will be welcome.
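• Illustrative sketch (referenced from the Discrete Denoising outline above): the DUDE admits a compact two-pass implementation, and the Python code below is offered only as a toy illustration of that idea, not as the reference implementation from the course papers. The function name dude, the context length k, the channel matrix Pi, the loss matrix Lambda, and the choice to leave the first and last k symbols untouched are all assumptions made for this sketch; the toy experiment at the bottom (a slowly varying binary signal through a binary symmetric channel) is likewise made up for demonstration purposes.

import numpy as np
from collections import defaultdict

def dude(z, k, Pi, Lambda):
    # Toy two-pass DUDE sketch (illustrative only).
    # z      : 1-D integer array, noisy sequence over the alphabet {0, ..., A-1}
    # k      : one-sided context length
    # Pi     : A x A channel matrix, Pi[x, y] = P(noisy symbol y | clean symbol x)
    # Lambda : A x A loss matrix, Lambda[x, xhat] = loss of reconstructing x as xhat
    n, A = len(z), Pi.shape[0]
    Pi_inv_T = np.linalg.inv(Pi).T

    # Pass 1: for every two-sided k-context, count how often each symbol occurs in it.
    counts = defaultdict(lambda: np.zeros(A))
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        counts[ctx][z[i]] += 1

    # Pass 2: at each interior position, pick the reconstruction minimizing the
    # estimated expected loss m^T Pi^{-T} (lambda_xhat .* pi_{z_i}).
    xhat = np.array(z, copy=True)          # boundary symbols are left as observed
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + k + 1]))
        m = counts[ctx]
        pi_z = Pi[:, z[i]]                 # channel column of the observed symbol
        scores = [m @ Pi_inv_T @ (Lambda[:, a] * pi_z) for a in range(A)]
        xhat[i] = int(np.argmin(scores))
    return xhat

# Toy usage: a slowly varying binary signal corrupted by a binary symmetric channel.
rng = np.random.default_rng(0)
x = np.zeros(10000, dtype=int)
for i in range(1, len(x)):
    x[i] = x[i - 1] if rng.random() < 0.95 else 1 - x[i - 1]
delta = 0.1
z = np.where(rng.random(len(x)) < delta, 1 - x, x)      # BSC(delta) corruption
Pi = np.array([[1 - delta, delta], [delta, 1 - delta]])
Lam = np.array([[0, 1], [1, 0]])                         # Hamming loss
xhat = dude(z, k=2, Pi=Pi, Lambda=Lam)
print("error rate before:", (z != x).mean(), "after:", (xhat != x).mean())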