Below are short descriptions of courses that I teach often.

Undergraduate courses

  • ENG EC 381: Probability in Electrical and Computer Engineering. Introduction to modeling uncertainty in electrical and computer systems. Experiments, models, and probabilities. Discrete and continuous random variables. Reliability models for circuits. Probability distributions. Moments and expectations. Random vectors. Functions of random variables. Sums of random variables and limit theorems. Signal detection and estimation. Basic stochastic processes. Discrete-time Markov chains. State diagrams. Applications to statistical modeling and interpretation of experimental data in computer, communication, and optical systems. (A small illustrative Markov-chain sketch appears after this list.)
  • ENG EC 401: Signals and Systems. Continuous-time and discrete-time signals and systems. Convolution sum, convolution integral. Linearity, time-invariance, causality, and stability of systems. Frequency domain analysis of signals and systems. Filtering, sampling, and modulation. Laplace transform, z-transform, pole-zero plots. Linear feedback systems. Cannot be taken for credit in addition to ENG BE 401. (A small convolution-sum sketch appears after this list.)
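
As a small, purely illustrative example of the discrete-time Markov chains introduced in ENG EC 381, the Python sketch below simulates a two-state chain with made-up transition probabilities and compares the empirical state occupancy with the stationary distribution.

    import numpy as np

    # Hypothetical two-state chain; row i holds P(next state | current state i).
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    rng = np.random.default_rng(0)

    def simulate(P, steps, start=0):
        """Simulate the chain and return the sequence of visited states."""
        states = [start]
        for _ in range(steps - 1):
            states.append(rng.choice(len(P), p=P[states[-1]]))
        return np.array(states)

    x = simulate(P, 100_000)
    print("empirical occupancy:    ", np.bincount(x) / len(x))

    # The stationary distribution solves pi = pi P, i.e. it is the left
    # eigenvector of P associated with eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    print("stationary distribution:", pi / pi.sum())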
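
Similarly, for ENG EC 401, the snippet below evaluates the discrete-time convolution sum y[n] = sum_k x[k] h[n-k] directly for a toy input and impulse response (both made up) and checks the result against numpy.convolve.

    import numpy as np

    # Toy finite-length signals, chosen only for illustration.
    x = np.array([1.0, 2.0, 3.0])     # input signal x[n]
    h = np.array([1.0, 0.5, 0.25])    # impulse response h[n]

    # Direct evaluation of the convolution sum y[n] = sum_k x[k] * h[n - k].
    y = np.zeros(len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]

    print(y)                  # direct sum
    print(np.convolve(x, h))  # library routine, for comparison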

Graduate courses

  • ENG EC 503: Learning from Data. Introductory course in statistical learning covering the basic theory, algorithms, and applications. This course focuses on the following major classes of supervised and unsupervised learning problems: classification, regression, density estimation, clustering, and dimensionality reduction. Generative and discriminative data models and the associated parametric and non-parametric learning algorithms are studied in a unified way within both frequentist and Bayesian settings. A variety of contemporary applications are explored through homework assignments and a project. Requires a solid foundation in probability, linear algebra, and multivariate calculus, as well as good computer programming skills. (A small classification sketch appears after this list.)
  • ENG EC 505: Stochastic Processes. (Prereq: undergraduate probability, linear algebra, and signals and systems: ENG EC 401 and CAS MA 142 or equivalent, and either ENG EC 381 or ENG EK 500 or equivalent). Introduction to discrete- and continuous-time random processes. Correlation and power spectral density functions. Linear systems driven by random processes. Optimum detection and estimation. Bayesian, Wiener, and Kalman filtering. (A small Kalman-filtering sketch appears after this list.)
  • ENG EC 515: Digital Communication. (Prereq: ENG EC 381 and ENG EC 401, or equivalent undergraduate probability and signals and systems). The modern field of digital communication was pioneered by Claude E. Shannon in 1948. Digital communication systems have now become the basic workhorses behind the information age. Examples include wired and wireless phone systems, digital data storage systems, cable modems, digital subscriber loop technology, etc. This course is an introduction to the fundamental principles underlying the design and analysis of digital communication systems. A partial list of topics covered in this course includes the following: optimum receiver principles with focus on additive Gaussian noise channels and signal-space concepts; efficient signaling for message sequences with focus on basic digital modulation and demodulation techniques and their performance analysis, notions of symbol and bit rate, symbol and bit error probability, and asymptotic power and bandwidth efficiency; real passband additive Gaussian noise channels and their equivalent complex baseband representation; efficient signaling for message sequences over general bandlimited additive Gaussian noise channels with focus on signal design and equalization methods to combat intersymbol interference; noncoherent and diversity communication systems to combat channel fading; introduction to modern error correction codes. (A small bit-error-rate simulation sketch appears after this list.)
  • ENG EC 517: Introduction to Information Theory. (Prereq: ENG EC 381 or equivalent undergraduate probability). Discrete memoryless stationary sources and channels; Information measures on discrete and continuous alphabets and their properties: entropy, conditional entropy, relative entropy, mutual information, differential entropy; Elementary constrained convex optimization; Fundamental information inequalities: data-processing and Fano’s; Block source coding with outage: weak law of large numbers, entropically typical sequences and typical sets, asymptotic equipartition property; Block channel coding with and without cost constraints: jointly typical sequences, channel capacity, random coding, Shannon’s channel coding theorem, introduction to practical linear block codes; Rate-distortion theory: Shannon’s block source coding theorem relative to a fidelity criterion; Source and channel coding for Gaussian sources and channels and parallel Gaussian sources and channels (water-filling and reverse water-filling); Shannon’s source-channel separation theorem for point-to-point communication; Lossless data compression: Kraft’s inequality, Shannon’s lossless source coding theorem, variable-length source codes including Huffman, Shannon-Fano-Elias, and arithmetic codes; Applications; Mini course project. (A small sketch computing these information measures appears after this list.)
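
As a toy illustration of the classification problems studied in ENG EC 503, the sketch below fits a simple generative classifier (nearest estimated class mean, the plug-in rule for equal-covariance Gaussian classes with equal priors) to synthetic two-dimensional data; the class means and sample sizes are made up for the example.

    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic two-class data (made-up Gaussian class-conditional densities).
    n = 500
    X0 = rng.normal(loc=[-1.0, 0.0], scale=1.0, size=(n, 2))
    X1 = rng.normal(loc=[+1.0, 0.0], scale=1.0, size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * n + [1] * n)

    # Plug-in generative classifier: estimate each class mean from the data and
    # assign each point to the class with the nearest estimated mean.
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    pred = (np.linalg.norm(X - mu1, axis=1) < np.linalg.norm(X - mu0, axis=1)).astype(int)
    print("training accuracy:", np.mean(pred == y))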
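
For ENG EC 505, here is a minimal sketch of scalar Kalman filtering for a made-up model: a random-walk state observed in additive noise. The process and measurement noise variances are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical scalar model: state x[k] = x[k-1] + w[k], observation y[k] = x[k] + v[k].
    q, r, n = 0.01, 1.0, 200                                # process var, measurement var, steps
    x = np.cumsum(rng.normal(scale=np.sqrt(q), size=n))     # true state trajectory
    y = x + rng.normal(scale=np.sqrt(r), size=n)            # noisy measurements

    xhat, P = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for yk in y:
        P = P + q               # predict: random walk keeps the mean, inflates the variance
        K = P / (P + r)         # Kalman gain
        xhat = xhat + K * (yk - xhat)   # update with the measurement
        P = (1 - K) * P
        estimates.append(xhat)

    err_raw = np.mean((y - x) ** 2)
    err_kf = np.mean((np.array(estimates) - x) ** 2)
    print(f"measurement MSE {err_raw:.3f}, Kalman filter MSE {err_kf:.3f}")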
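
For ENG EC 515, the following sketch estimates the bit error probability of antipodal (BPSK) signaling over an additive white Gaussian noise channel by Monte Carlo simulation and compares it with the closed-form Q(sqrt(2 Eb/N0)) expression; the Eb/N0 values and the number of simulated bits are arbitrary.

    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(1)

    def bpsk_ber(ebn0_db, nbits=200_000):
        """Monte Carlo estimate of the BPSK bit error probability over AWGN."""
        ebn0 = 10 ** (ebn0_db / 10)
        bits = rng.integers(0, 2, nbits)
        s = 2 * bits - 1                                   # map {0, 1} -> {-1, +1}, Eb = 1
        noise = rng.normal(scale=np.sqrt(1 / (2 * ebn0)), size=nbits)
        decisions = (s + noise) > 0                        # minimum-distance (sign) detector
        return np.mean(decisions != bits)

    for ebn0_db in (0, 2, 4, 6, 8):
        theory = 0.5 * erfc(sqrt(10 ** (ebn0_db / 10)))    # Q(sqrt(2 Eb/N0))
        print(f"{ebn0_db} dB: simulated {bpsk_ber(ebn0_db):.4f}, theory {theory:.4f}")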
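
Finally, for ENG EC 517, the sketch below computes the entropy H(X), conditional entropy H(X|Y), and mutual information I(X;Y) for a binary symmetric channel with a made-up crossover probability and a uniform input, and checks the mutual information against the closed-form value 1 - h(p), where h is the binary entropy function.

    import numpy as np

    def h2(p):
        """Binary entropy function h(p), in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    # Binary symmetric channel with made-up crossover probability p and a
    # uniform input X; joint[x, y] = P(X = x, Y = y).
    p = 0.1
    joint = np.array([[0.5 * (1 - p), 0.5 * p],
                      [0.5 * p, 0.5 * (1 - p)]])

    px = joint.sum(axis=1)                          # marginal of X
    py = joint.sum(axis=0)                          # marginal of Y

    H_X = -np.sum(px * np.log2(px))                 # entropy H(X)
    H_XY = -np.sum(joint * np.log2(joint))          # joint entropy H(X, Y)
    H_X_given_Y = H_XY + np.sum(py * np.log2(py))   # H(X|Y) = H(X,Y) - H(Y)
    I_XY = H_X - H_X_given_Y                        # mutual information I(X; Y)

    print(f"H(X) = {H_X:.4f} bits, H(X|Y) = {H_X_given_Y:.4f} bits")
    print(f"I(X;Y) = {I_XY:.4f} bits, 1 - h(p) = {1 - h2(p):.4f} bits")

For a uniform input, this mutual information equals the capacity of the binary symmetric channel, 1 - h(p).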