Convolutions often arise when you are trying all possible ways of doing things that add up to k, for a large range of values of k, or when sliding a mask or pattern p over a sequence s and computing a dot product at each position.
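
As a concrete instance of "all possible ways that add up to k": the number of ways two dice can sum to each total is exactly a convolution of their face-count vectors. Below is a minimal Python sketch; the convolve helper is ours (a direct quadratic convolution for clarity, not the fast FFT-based version this section is about).

```python
# Ways two six-sided dice sum to each total: the convolution of the
# face-count vectors. A direct O(n*m) convolution for illustration;
# the "fast" version would use an FFT instead.

def convolve(a, b):
    # c[k] = sum of a[i] * b[j] over all pairs with i + j = k
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

die = [1] * 6                # one way to roll each face 1..6
ways = convolve(die, die)    # ways[k] = number of ways to roll a total of k + 2
print(ways)                  # [1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]
```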

Integer Multiplication

We can interpret integers as polynomials in any base b. For example, 632 = 6·b^2 + 3·b^1 + 2·b^0, where b = 10. Polynomial multiplication behaves like integer multiplication without carrying.

There are two different ways we can use fast polynomial multiplication (done with convolution) to deal with integers.

  1. First, we can explicitly perform the carrying operation on the product polynomial, adding the carry ⌊c_i / b⌋ to the next coefficient c_{i+1} and then replacing c_i with c_i mod b.
  2. Alternatively, we could compute the product polynomial and then evaluate it at x = b to get the integer product A · B. With fast convolution, either way gives us an even faster multiplication algorithm than Karatsuba’s algorithm, running in O(n log n).
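
The first option can be sketched as follows. The multiply helper is illustrative (not from the text), and a direct quadratic convolution stands in for the fast FFT-based one:

```python
# Integer multiplication as polynomial multiplication plus explicit carrying.
# A direct convolution is used for clarity; fast convolution (FFT) would make
# the whole algorithm O(n log n), as described above.

def convolve(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

def multiply(x, y, b=10):
    # Digit vectors, least significant digit first: 632 -> [2, 3, 6]
    dx = [int(d) for d in reversed(str(x))]
    dy = [int(d) for d in reversed(str(y))]
    c = convolve(dx, dy)          # the product polynomial, before carrying
    c.append(0)                   # room for a final carry
    for i in range(len(c) - 1):
        c[i + 1] += c[i] // b     # add the carry to the next coefficient...
        c[i] %= b                 # ...and keep only the digit c[i] mod b
    return int("".join(map(str, reversed(c))))

print(multiply(632, 21))          # 13272
```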

Cross-Correlation

For two time series A and B, the cross-correlation function measures the similarity as a function of the shift or displacement of one relative to the other.

Perhaps people buy a product on average k days after seeing an advertisement for it. Then there should be high correlation between sales and advertising expenditures lagged by k days. This cross-correlation function can be computed as

C[k] = Σ_j A[j] · B[j + k]

Note that the dot product here is computed over forward shifts of B, instead of the backward shifts in the original definition of a convolution. But we can still use fast convolution to compute this: simply input the reversed sequence B^R instead of B.
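
A sketch of this reversal trick in Python, with made-up lagged data. The cross_correlate helper and the series names are illustrative, and a direct convolution again stands in for the fast version:

```python
# Cross-correlation C[k] = sum_j A[j] * B[j + k], computed by convolving
# A with the reversed sequence B. Entries of B outside the series count as 0.

def convolve(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

def cross_correlate(a, b):
    # Returns C[k] for shifts k = -(len(a) - 1), ..., len(b) - 1
    return convolve(a, b[::-1])[::-1]

ads = [1, 0, 0, 2, 0, 0]          # advertising expenditures per day
sales = [0, 0, 1, 0, 0, 2]        # sales, lagging the ads by two days
corr = cross_correlate(ads, sales)
lag = corr.index(max(corr)) - (len(ads) - 1)   # best-correlated shift
print(lag)                        # 2
```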

Moving Average Filters

Often we are tasked with smoothing time series data by averaging over a window. Perhaps we want C[i] = 0.25·B[i−1] + 0.5·B[i] + 0.25·B[i+1] over all positions i. This is just another convolution, where A is the vector of weights within the window around B[i].
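
For instance, a 0.25/0.50/0.25 window can be applied with the same convolution routine. A sketch with made-up data (the trimming convention at the boundaries is one common choice, not the only one):

```python
# Moving-average smoothing as a convolution with the weight vector
# [0.25, 0.5, 0.25]. These weights are symmetric, so reversing them
# (as a true convolution requires) changes nothing here.

def convolve(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

weights = [0.25, 0.5, 0.25]
data = [0, 0, 4, 0, 0, 8, 8, 8]
# The full convolution has len(data) + 2 entries; trimming one from each
# end aligns entry i with data[i] (positions past the ends count as zero).
smoothed = convolve(data, weights)[1:-1]
print(smoothed)                   # [0.0, 1.0, 2.0, 1.0, 2.0, 6.0, 8.0, 6.0]
```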

String Matching

Recall the problem of substring pattern matching. We are given a text string t and a pattern string p, and seek to identify all locations in t where p may be found. For t = abaababa and p = aba, we can find p in t starting at positions 1, 4, and 6.

The naive algorithm for this problem works by sliding the length-m pattern p over each of the n − m + 1 possible starting points in the length-n text. This sliding window approach is suggestive of being a convolution with the reversed pattern p^R, as shown in the figure below.

Can we solve string matching in O(n log n) by using fast convolution? The answer is yes! Suppose our strings have an alphabet of size α. We can represent each character by a binary vector of length α having exactly one non-zero bit (one-hot encoding). Say a = 10 and b = 01 for the alphabet {a, b}. Then we can encode the strings t and p above as

t = 10 01 10 10 01 10 01 10 and p = 10 01 10

The dot product over a window will be m (the length of the pattern p) at an even-numbered position of t if and only if p starts at that position in the text. So fast convolution can identify all locations of p in t in O(n log n) time.

  • We only care about even-numbered positions because each letter is encoded as 2 bits, either 01 or 10.
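
The whole pipeline can be sketched as follows. The one_hot and find_pattern names are illustrative, a direct convolution stands in for the fast one, and positions are reported 1-indexed to match the text:

```python
# String matching via convolution: one-hot encode text and pattern
# (a = 10, b = 01), convolve the text bits with the reversed pattern bits,
# and report even bit positions where the window dot product equals len(p).

def convolve(a, b):
    c = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            c[i + j] += x * y
    return c

def one_hot(s, alphabet="ab"):
    bits = []
    for ch in s:
        vec = [0] * len(alphabet)
        vec[alphabet.index(ch)] = 1   # exactly one non-zero bit per character
        bits.extend(vec)
    return bits

def find_pattern(t, p):
    T, P = one_hot(t), one_hot(p)
    scores = convolve(T, P[::-1])     # scores[i + len(P) - 1] is the dot
    matches = []                      #   product of T[i:i+len(P)] with P
    for i in range(0, len(T) - len(P) + 1, 2):    # even bit positions only
        if scores[i + len(P) - 1] == len(p):      # full agreement with p
            matches.append(i // 2 + 1)            # bit -> 1-indexed character
    return matches

print(find_pattern("abaababa", "aba"))            # [1, 4, 6]
```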

Implementation here: https://github.com/k78ma/implementations/blob/main/misc/string-match-conv.py