The impact of variability in the measured parameter is rarely considered when designing clinical protocols for optimization of atrioventricular (AV) or interventricular (VV) delay in cardiac resynchronization therapy (CRT). In this article, we approach this question quantitatively using mathematical simulation in which the true optimum is known, and examine the practical implications using some real measurements. We calculated the performance of any optimization process that selects the pacing setting maximizing an underlying signal, such as flow or pressure, in the presence of overlying random variability (noise). If signal and noise are of equal size, for a 5-choice optimization (60, 100, 140, 180, 220 ms), replicate AV delay optima are rarely identical but rather scattered, with a standard deviation of 45 ms. This scatter was overwhelmingly determined (ρ = −0.975, P < 0.001) by Information Content, Signal/(Signal + Noise), an expression of the signal-to-noise ratio. Averaging multiple replicates improves information content. In real clinical data at resting heart rate, information content is often only 0.2–0.3; elevated pacing rates can raise information content above 0.5. Low information content (e.g. <0.5) causes gross overestimation of the optimization-induced increment in VTI, a high false-positive rate of apparent change in optimum between visits, and very wide confidence intervals for an individual patient's optimum. AV and VV optimization by selecting the setting showing maximum cardiac function can only be accurate if information content is high. Simple steps to reduce noise, such as averaging multiple replicates, or to increase signal, such as increasing heart rate, can improve the information content, and therefore the viability, of any optimization process.
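The pick-the-maximum optimization described above can be sketched in a short simulation. This is a minimal illustrative model, not the article's actual code: the shape of the underlying haemodynamic curve, its true optimum at 140 ms, and the way "signal size" is quantified (as the spread of the true curve across the tested settings) are all assumptions made for demonstration. It shows how, with noise equal in size to the signal (information content 0.5), replicate optima scatter across the candidate AV delays rather than reproducing a single answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate AV delays (ms) from the 5-choice optimization in the text
av_delays = np.array([60, 100, 140, 180, 220])

# Hypothetical underlying response curve (arbitrary units) with a
# true optimum at 140 ms; shape and scale are illustrative assumptions
signal = 1.0 - ((av_delays - 140) / 160.0) ** 2

def noisy_optimum(noise_sd):
    """One replicate optimization: measure each setting once with
    random noise and pick the delay with the highest measured value."""
    measured = signal + rng.normal(0.0, noise_sd, size=signal.size)
    return av_delays[np.argmax(measured)]

# Take "signal" size as the SD of the true curve across the tested
# settings; noise of equal size then gives information content 0.5
signal_size = signal.std()
noise_size = signal_size
optima = np.array([noisy_optimum(noise_size) for _ in range(5000)])

information_content = signal_size / (signal_size + noise_size)
print("SD of replicate optima (ms):", round(float(optima.std()), 1))
print("Information content:", information_content)
```

Averaging n replicates before picking the maximum shrinks the effective noise by a factor of √n, which is why averaging raises information content in the simulation just as the text describes.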