
Digital Signal Processing with Python Programming

Maurice Charbit



This book addresses the fundamental bases of statistical inference. We shall presume throughout that readers have a good working knowledge of the Python® language and of the basic elements of digital signal processing.

The most recent version is Python® 3.x, but many people are still working with Python® 2.x; all the code provided in this book works with both versions. Spyder® is a useful open-source integrated development environment (IDE) for programming in the Python® language. Briefly, we suggest using the Anaconda Python distribution, which includes both Python® and Spyder®.
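Since the code is meant to run under both interpreter families, the usual idiom (a sketch of common practice, not a listing from the book) is to import the Python 3 semantics at the top of each script:

```python
# Import Python 3 behavior so the same script runs under Python 2.x and 3.x:
# - division: "/" is true division (3/2 == 1.5, not 1)
# - print_function: print(...) is a function, not a statement
from __future__ import division, print_function

x = 3 / 2
print(x)  # 1.5 under both Python 2.x and 3.x
```

With these two imports, the division and printing behavior of a script is identical under both interpreter versions.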

Most of the examples given in this book use the following modules: NumPy, which provides powerful numerical array objects; SciPy, with high-level data processing routines such as optimization, regression and interpolation; and Matplotlib, for plotting curves, histograms, box-and-whisker plots, etc. See the list of useful functions on p. xiii.
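As a small taster of this toolchain (a sketch with example values of our choosing), the snippet below builds a NumPy array, calls a high-level SciPy optimization routine, and indicates where a Matplotlib plot would follow:

```python
import numpy as np
from scipy import optimize

# NumPy array object: vectorized arithmetic without explicit loops
t = np.linspace(0.0, 1.0, 11)   # 11 samples on [0, 1]
x = t**2 - t                    # elementwise polynomial

# SciPy high-level routine: minimize f(u) = u^2 - u on [0, 1];
# the minimum is at u = 0.5 with value -0.25
res = optimize.minimize_scalar(lambda u: u**2 - u,
                               bounds=(0.0, 1.0), method='bounded')

# Plotting would follow with Matplotlib, e.g.:
#   import matplotlib.pyplot as plt
#   plt.plot(t, x); plt.show()
```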

A brief outline of the contents of the book is given below.

Useful maths

In the first chapter, a short review of probability theory is presented, focusing on conditional probability, the projection theorem and random variable transformation. A number of statistical results are also presented, including the law of large numbers and the central limit theorem.
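Both limit theorems are easy to observe numerically. The following sketch (our own illustration, with an arbitrary fixed seed) checks that the empirical mean of uniform draws approaches the true mean, and that standardized sample means have unit standard deviation, as the central limit theorem predicts:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

# Law of large numbers: the empirical mean of n i.i.d. Uniform(0,1)
# draws approaches the true mean 1/2 as n grows.
n = 100000
empirical_mean = rng.random(n).mean()

# Central limit theorem: the mean of 500 Uniform(0,1) draws, centered
# at 1/2 and scaled by its standard deviation sqrt(1/12)/sqrt(500),
# is approximately standard normal; check over 2000 repetitions.
m = 2000
z = (rng.random((m, 500)).mean(axis=1) - 0.5) / (np.sqrt(1 / 12) / np.sqrt(500))
```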

Statistical inferences

The second chapter is devoted to statistical inference. Statistical inference consists of deducing features of interest from a set of observations, at a given level of confidence. This covers a variety of techniques. In this chapter, we mainly focus on hypothesis testing, regression analysis, parameter estimation and the determination of confidence intervals. Key notions include the Cramer–Rao bound, the Neyman–Pearson theorem, likelihood ratio tests, the least squares method for linear models, the method of moments and the maximum likelihood approach. The least squares method is a standard approach in regression analysis, and it is discussed in detail.
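To make the least squares idea concrete, here is a minimal sketch (the model, noise level and variable names are our own, not taken from the chapter) that fits a straight line to noisy data with numpy.linalg.lstsq:

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear model y = a*t + b + noise; ordinary least squares estimates
# (a, b) by minimizing the sum of squared residuals.
a_true, b_true = 2.0, -1.0
t = np.linspace(0.0, 1.0, 200)
y = a_true * t + b_true + 0.05 * rng.standard_normal(t.size)

# Design matrix: one column for the slope, one column of ones for the
# intercept; lstsq solves min ||H @ theta - y||^2 over theta.
H = np.column_stack([t, np.ones_like(t)])
theta, residuals, rank, sv = np.linalg.lstsq(H, y, rcond=None)
a_hat, b_hat = theta
```

With 200 samples and small noise, the estimates land close to the true slope and intercept.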

Inferences on HMM

In many problems, the variables of interest are only partially observed. Hidden Markov models (HMM) are well suited to this kind of problem. Their applications cover a wide range of fields, such as speech processing, handwriting recognition, DNA analysis, and monitoring and control. HMM inference raises several issues. The key algorithms are the well-known Kalman filter, the Baum–Welch algorithm and the Viterbi algorithm, to list only the most famous ones.
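As a preview of what the Viterbi algorithm computes, here is a compact sketch for a discrete HMM (a standard textbook formulation in our own notation, not the book's implementation): it finds the most likely hidden state path by dynamic programming over log-probabilities.

```python
import numpy as np

def viterbi(obs, A, B, pi):
    """Most likely hidden state path for a discrete HMM.

    obs : sequence of observation indices
    A   : (K, K) transition matrix, A[i, j] = P(state j | state i)
    B   : (K, M) emission matrix, B[i, o] = P(obs o | state i)
    pi  : (K,) initial state distribution
    """
    K, T = A.shape[0], len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])   # best log-prob per state at t=0
    back = np.zeros((T, K), dtype=int)         # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)     # scores[i, j]: go from i to j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]                # backtrack the best path
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Tiny two-state example: emissions are well separated, so the most
# likely path simply follows the observations.
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.1, 0.9]])
pi = np.array([0.5, 0.5])
path = viterbi([0, 0, 1, 1, 1], A, B, pi)
```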

Monte-Carlo methods

Monte-Carlo methods refer to a broad class of algorithms that serve to compute quantities of interest. Typically, these quantities are integrals, i.e. expectations of a given function. The key idea is to use random sequences, instead of deterministic ones, to achieve this result. The main issues are, first, the choice of the most appropriate random mechanism and, second, how to generate such a mechanism. In Chapter 4, the acceptance–rejection method, the Metropolis–Hastings algorithm, the Gibbs sampler, the importance sampling method, etc., are presented.
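The acceptance–rejection idea can be sketched in a few lines (our own toy target density, not an example from the chapter): draw from an easy proposal, then keep each draw with a probability proportional to the ratio of target to proposal densities.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target density p(x) = 1.5 * x**2 on [-1, 1]; proposal q is
# Uniform(-1, 1), i.e. q(x) = 1/2, and p(x) <= M * q(x) with M = 3.
def sample_target(n):
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0)
        u = rng.uniform(0.0, 1.0)
        # accept x with probability p(x) / (M * q(x)) = x**2
        if u <= x**2:
            out.append(x)
    return np.array(out)

s = sample_target(20000)
# Sanity check against the true moments: E[X] = 0 and
# E[X^2] = integral of 1.5 * x**4 over [-1, 1] = 0.6.
```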


October 2016

Notations and Abbreviations

∅ empty set
𝟙A(x) indicator function of the set A: equal to 1 if x ∈ A and to 0 otherwise
(a,b] = {x : a < x ≤ b}
δ(t) Dirac delta function
Re(z) real part of z
Im(z) imaginary part of z
i or j = √−1
IN identity matrix of size N
A* complex conjugate of A
AT transpose of A
AH transpose-conjugate of A
A−1 inverse matrix of A
A# pseudo-inverse matrix of A
r.v./rv random variable
ℙ probability measure
ℙθ probability measure indexed by θ
𝔼{X} expectation of X
𝔼θ{X} expectation of X under ℙθ
Xc = X − 𝔼{X} zero-mean random variable
var(X) = 𝔼{|Xc|²} variance of X
cov(X, Y) = 𝔼{Xc Yc*} covariance of (X, Y)
cov(X) = cov(X, X) = var(X) variance of X
𝔼{X|Y} conditional expectation of X given Y
a →d b a converges in distribution to b
a →P b a converges in probability to b
a →a.s. b a converges almost surely to b
d.o.f. degree of freedom
ARMA AutoRegressive Moving Average
AUC Area Under the ROC curve
c.d.f. Cumulative Distribution Function
CRB Cramer–Rao Bound
EM Expectation Maximization
GLRT Generalized Likelihood Ratio Test
GEM Generalized Expectation Maximization
GMM Gaussian Mixture Model
HMM Hidden Markov Model
i.i.d./iid independent and identically distributed
LDA Linear Discriminant Analysis
MC Monte-Carlo
MLE Maximum Likelihood Estimator
MME Moment Method Estimator
MSE Mean Square Error
OLS Ordinary Least Squares
PCA Principal Component Analysis
p.d.f. Probability Density Function
ROC Receiver Operating Characteristic
SNR Signal to Noise Ratio
WLS Weighted Least Squares

A Few Functions of Python®

To get function documentation, use .__doc__ (e.g. print(range.__doc__)), use help (e.g. help(zeros) or help('def')), or, in IPython/Spyder®, append ? (e.g. range.count?).



From numpy:

From numpy.linalg:

From numpy.random:

From scipy:

(for the random distributions, use the methods .pdf, .cdf, .isf, .ppf, etc.)
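As an illustration of these distribution methods (a sketch using scipy.stats.norm, the standard normal distribution):

```python
from scipy.stats import norm

# .pdf: density; .cdf: cumulative distribution function;
# .ppf: quantile function (inverse of the cdf);
# .isf: inverse survival function (inverse of 1 - cdf).
p = norm.cdf(0.0)      # P(X <= 0) = 0.5 for the standard normal
q = norm.ppf(0.975)    # about 1.96, the usual 95% two-sided quantile
r = norm.isf(0.025)    # the same value, via the survival function
```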

From scipy.linalg:

From matplotlib.pyplot:


From sympy: