Enhancing Nutt-Based Time-to-Digital Converter Performance with Internal Systematic Averaging

When time intervals need to be measured precisely, a time-to-digital converter (TDC) often consists of sophisticated, multilevel, sub-gate delay structures. The resolution improvement is rewarding only until integral nonlinearity (INL) and random jitter begin to limit the measurement performance; INL can then be minimized with calibration techniques and result post-processing. A TDC architecture based on a counter and timing-signal interpolation (the Nutt method) makes it possible to measure long time intervals precisely, and it also offers an effective means of improving precision by averaging. Traditional averaging, however, demands several successive measurements, which increases the measurement time and power consumption. It is shown here that by using several interpolators that are sampled homogeneously over the clock period, the effects of limited resolution, interpolation nonlinearities, and random noise can be markedly reduced. The designed CMOS TDC, utilizing an internal systematic sampling technique, achieves 3.0 ps rms single-shot precision without any additional calibration or nonlinearity correction.

Jansson J.-P., Keränen P., Jahromi S., Kostamovaara J.

Publication type:
A1 Journal article – refereed

Place of publication:
IEEE Transactions on Instrumentation and Measurement

Keywords:
CMOS averaging, delay-locked loop, integral nonlinearity, jitter, Nutt method, quantization error, TDC, time interval measurement, time-to-digital converter


Full citation:
J.-P. Jansson, P. Keränen, S. Jahromi and J. Kostamovaara, “Enhancing Nutt-Based Time-to-Digital Converter Performance with Internal Systematic Averaging,” in IEEE Transactions on Instrumentation and Measurement. doi: 10.1109/TIM.2019.2932156
