In this month's HFN they test the JitterBug. The very limited measurements included the eye pattern. PM claims that the rise time decreases from 22ns to 15ns (which seems difficult to grasp, given that it is only a filter, not a regenerator). I don't know if anyone else has seen this, but on purely visual inspection of the print I cannot see it at all. The two eye patterns look almost identical. Surely a 1/3 reduction in rise time would look obvious. Am I being thick?
From here:
"Rise and fall time measurements for USB compliance
Mandate: Required
Effective Date: August, 2007
There has always been a problem accurately measuring rise and fall times, especially on high speed devices. The measurement of interest is the edge rate, or slew rate, during the state change time. To help improve accuracy of the measurement, the USB-IF is standardizing on one test fixture for high-speed signal quality.
Aside from the fixturing and probes used to take the measurements, major contributors to the inaccuracies in these measurements are the shape of the edge, noise on the signal and the method of calculating the 10% and 90% points as defined in Sections 7.1.2.1 and 7.1.2.2 of the USB 2.0 Specification.
A waveform with slow corners (see sample eye diagram below) will result in a measured rise time that is slower than the actual edge rate would indicate. Also a small change in the position of the 10% and 90% points due to noise on the signal, etc., can cause a relatively large change in the measured rise time."
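The 10%/90% measurement the USB-IF text describes can be sketched in a few lines. This is my own illustration, not the compliance tool's code; the function name, waveform, and levels are invented for the example. It also shows the noise sensitivity they warn about: one bad sample near a threshold shifts the measured figure.

```python
def rise_time_10_90(samples, dt, low, high):
    """10%-90% rise time of a rising edge.

    samples: waveform values taken at a fixed interval dt (seconds).
    low/high: settled logic levels used to place the thresholds.
    """
    def crossing(threshold):
        # First sample pair straddling the threshold, linearly interpolated.
        for i in range(1, len(samples)):
            if samples[i - 1] < threshold <= samples[i]:
                frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
                return (i - 1 + frac) * dt
        raise ValueError("threshold never crossed")

    span = high - low
    return crossing(low + 0.9 * span) - crossing(low + 0.1 * span)

# Ideal linear 0 -> 400 mV edge over 20 ns, sampled every 0.5 ns:
dt = 0.5e-9
edge = [min(400.0, 10.0 * i) for i in range(80)]
clean = rise_time_10_90(edge, dt, 0.0, 400.0)  # 16 ns (the middle 80% of 20 ns)

# A single 15 mV noise spike near the 10% point moves that crossing:
noisy = list(edge)
noisy[3] += 15.0
print(clean, rise_time_10_90(noisy, dt, 0.0, 400.0))
```

Here one noisy sample changes the answer by about 0.6ns on a 16ns edge, which is exactly the "small change in the position of the 10% and 90% points" effect being described.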
USB 2.0 HS defines the fastest allowed edge rate as 100ps, with a warning given at 300ps, so any edge faster than 100ps will fail. It doesn't specify the slowest edge rate, but the maximum interpacket delay times give an idea of the timescales involved: "So the maximum interpacket delay of a host's response to a device is <= 264 bit times + 60ns, which equates to 610ns or 292 bit times."
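The quoted arithmetic checks out if you work it through at the high-speed bit rate (480 Mbit/s, so one bit time is about 2.083ns); the snippet below is just that sum, nothing more:

```python
# USB 2.0 high speed runs at 480 Mbit/s, so one bit time is ~2.083 ns.
bit_time_ns = 1e9 / 480e6

# Quoted maximum interpacket delay: 264 bit times + 60 ns.
max_delay_ns = 264 * bit_time_ns + 60

print(round(max_delay_ns))                 # 610 ns
print(int(max_delay_ns / bit_time_ns))     # 292 whole bit times
```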