"The Convergence Cliff"

The Convergence Cliff

Define a sequence where each term is a weighted average of all previous terms. The weights can change at each step — nonstationary averaging. Does the sequence converge?
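The setup is easy to play with numerically. Below is a minimal sketch, not taken from the paper: each new term is a weighted average of the full history, and the weight profile is free to change from step to step. The specific weight choices (`uniform`, `alternating`) are illustrative assumptions, not the envelope studied by the authors.

```python
def averaged_sequence(n_steps, weight_fn, x0=0.0, x1=1.0):
    """Build x_0, x_1, ... where each new term is a weighted average of
    all previous terms. weight_fn(n) returns nonnegative weights
    w[0..n-1] for step n; they are normalized here so every term is a
    genuine average (weights sum to 1)."""
    xs = [x0, x1]  # two seed values, so the averages have something to mix
    for n in range(2, n_steps):
        w = weight_fn(n)
        total = sum(w)
        xs.append(sum(wk * xk for wk, xk in zip(w, xs)) / total)
    return xs

# Uniform weights (plain running average): settles immediately.
uniform = averaged_sequence(50, lambda n: [1.0] * n)

# Nonstationary weights that swing emphasis between early and late
# terms at alternate steps -- the averaging damps less cleanly.
def alternating(n):
    return [(k + 1.0) if n % 2 == 0 else (n - k) for k in range(n)]

swinging = averaged_sequence(50, alternating)
```

Because every term is a convex combination of earlier terms, the whole sequence stays inside the interval spanned by the seeds; the question is only whether it settles to a single point.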

Lepsveridze and Mossel find a sharp threshold. When the weights are bounded by a logarithmic envelope parameterized by alpha and beta, convergence holds if and only if alpha + beta/2 <= 1. At or below the threshold, convergence is guaranteed; above it, convergence can fail.

The central claim: the transition between convergent and divergent behavior is not gradual. There is no regime of “sometimes converges, sometimes doesn’t”; the threshold is sharp. On one side, every sequence whose weights respect the envelope converges. On the other, counterexamples exist.

This is surprising because the operation is averaging — a process that intuitively smooths, dampens, and stabilizes. But nonstationary weights introduce enough freedom that the averaging can fail to settle. The failure mode is not oscillation or divergence in the usual sense — it’s that the weighted lookback can keep pulling the sequence in different directions indefinitely.

In the fixed-shape regime (where the weights are drawn from a fixed density on (0,1) rescaled at each step), convergence holds under mild regularity. The pathological behavior requires the weights to adapt their shape as n grows. It’s the nonstationarity, not the averaging, that breaks convergence.
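A sketch of what “fixed shape, rescaled at each step” can mean, under my own reading of that phrase: sample a single fixed density on (0,1) at the relative positions k/n of the history, so the weight profile keeps the same shape while stretching over a longer past. The tent density here is an illustrative choice, not one from the paper.

```python
def fixed_shape_sequence(n_steps, density, x0=0.0, x1=1.0):
    """Weighted-average sequence in a fixed-shape regime: step-n weights
    come from one fixed density on (0,1), sampled at relative positions
    of the history, so the profile is merely rescaled as n grows."""
    xs = [x0, x1]
    for n in range(2, n_steps):
        w = [density((k + 0.5) / n) for k in range(n)]  # midpoint sampling
        total = sum(w)
        xs.append(sum(wk * xk for wk, xk in zip(w, xs)) / total)
    return xs

# A fixed tent-shaped density: most weight on the middle of the history.
tent = fixed_shape_sequence(200, lambda u: min(u, 1.0 - u))

# Successive differences shrink as n grows: the fixed-shape averaging
# settles, in line with the "mild regularity" claim.
diffs = [abs(b - a) for a, b in zip(tent[-20:], tent[-19:])]
```

The contrast with the sketch above is the point: here the weight profile cannot adapt its shape as n grows, and numerically the sequence calms down, whereas shape-changing weights are what leave room for the pathological behavior.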

A gentle process with a sharp boundary.
