Jitter & Control

July 7, 2011

Jitter is the difference, or variation, in time between when something is ‘supposed to happen’ and when it ‘actually happens’, as measured against a reference clock source. The deviation can be in terms of amplitude, phase timing, or the width of the signal pulse. Another definition is “the period frequency displacement of the signal from its ideal location.”
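
As a rough illustration of the timing sense of that definition, the short sketch below compares observed event times against an ideal reference clock. The 10 ms nominal period and the arrival timestamps are made-up values for illustration only:

    import statistics

    NOMINAL_PERIOD = 0.010  # seconds: when each event is 'supposed to happen'

    # Observed arrival times of a periodic event (hypothetical capture)
    arrivals = [0.0000, 0.0101, 0.0199, 0.0302, 0.0398, 0.0500]

    # Deviation of each event from its ideal position in time
    ideal = [i * NOMINAL_PERIOD for i in range(len(arrivals))]
    deviations = [a - t for a, t in zip(arrivals, ideal)]

    peak_to_peak = max(deviations) - min(deviations)
    rms = statistics.pstdev(deviations)

    print(f"peak-to-peak jitter: {peak_to_peak * 1e6:.0f} us")
    print(f"RMS jitter:          {rms * 1e6:.0f} us")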

The amount of allowable jitter depends greatly on the application, and some jitter is present in every network, from USB connections through Ethernet systems and, of course, fieldbus applications. Provided your communications protocol is robust enough, jitter often shows up as an increase in the number of retransmissions on the network.

The most common causes of jitter are electromagnetic interference (EMI) and crosstalk with other signals. Once again, this reaffirms the importance of proper network design and installation.

If retransmissions were the only consequence of jitter, it might be less of an issue, at least until data losses reached the point where the retransmission traffic itself increases overall network load and cuts into the available bandwidth.
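
To put rough numbers on that, the sketch below estimates the extra load retransmissions impose, assuming (as a simplification) that each transmission of a frame fails independently with probability p, so a frame needs 1 / (1 - p) transmissions on average:

    # Back-of-the-envelope estimate of how retransmissions inflate load.
    # Assumes independent failures with probability p per transmission.
    def load_multiplier(p: float) -> float:
        """Expected transmissions per delivered frame."""
        return 1.0 / (1.0 - p)

    for p in (0.01, 0.05, 0.10, 0.20):
        print(f"{p:4.0%} retransmission rate -> "
              f"{(load_multiplier(p) - 1):.1%} extra bandwidth consumed")

At a 1% retransmission rate the overhead is barely 1%, but at 20% the network is carrying 25% more traffic to deliver the same useful data.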

However, because we are involved in real-time control, determinism, being “on time all the time”, is the basis on which control systems operate. Basic regulatory control can be affected by as little as 3% jitter, making the system unstable. Jitter may not sound like much, but then again it does not take much to make a difference.
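
One way to get a feel for this before deployment is to simulate the loop with a jittered sample period. The sketch below is a toy model, not taken from any specific system: it runs a PI loop against a first-order plant, once with a perfect period and once with ±3% uniform jitter, and compares the RMS tracking error. The plant, gains, and setpoint are illustrative assumptions:

    import math
    import random

    DT = 0.01            # nominal control period, s
    TAU = 0.05           # plant time constant, s (assumed)
    KP, KI = 4.0, 20.0   # PI gains tuned assuming the nominal period

    def rms_error(jitter: float, steps: int = 5000, seed: int = 1) -> float:
        rng = random.Random(seed)
        t = x = integral = sq_sum = 0.0
        for _ in range(steps):
            # each cycle fires early or late by up to +/-jitter of DT
            dt = DT * (1.0 + rng.uniform(-jitter, jitter))
            r = math.sin(2.0 * math.pi * 0.5 * t)  # 0.5 Hz setpoint
            e = r - x
            integral += e * DT                     # controller still assumes DT
            u = KP * e + KI * integral
            x += dt * (u - x) / TAU                # first-order plant, Euler step
            t += dt
            sq_sum += e * e
        return math.sqrt(sq_sum / steps)

    print(f"RMS tracking error, no jitter:    {rms_error(0.0):.5f}")
    print(f"RMS tracking error, +/-3% jitter: {rms_error(0.03):.5f}")

In a well-damped toy loop like this the degradation is modest; loops tuned closer to their stability margin, or using derivative action computed from the assumed sample time, are considerably more sensitive to timing variation.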