Today I am going to show you what jitter looks like on a capable oscilloscope and explain a few things which are not so commonly known. This already starts with the means used to observe jitter, of which I found that it should be done through analogue (scope) means.
Side note : When done through digital means, it is so much subject to the "programs" used to convert the digital signal(s) to the analogue screen for observation that as many mistakes can be made there as, say, XXHighEnd could improve SQ over the years by means of all the (infinite) interpretations needed to do it really well.
In addition, the (ADC) sampling rate of the digital scope has to be so high in order to resolve today's "femtosecond" oscillators that such a scope actually does not exist. Mind you, no matter that it's said that it does. So, if today's highest sampling rate of digital scopes is 160GS/s (giga samples per second), then the interpretable bandwidth is to be considered around 1/3 of that, thus something like 53GHz, and the remainder is a matter of "time". I mean, when the scope can measure 53GHz "cycles per second", then the resolution of that comes down to 1/53,000,000,000 of a second, which is equal to 0.00000000001887 seconds = ~19ps (picoseconds). This is still 1000 times too coarse for the time resolution needed (think 19fs (femtoseconds)), while the analyzers concerned still claim they "can do it".
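To make that arithmetic explicit, here it is as a small sketch (plain Python; the 160GS/s figure and the 1/3 rule of thumb are just the ones from above) :

```python
# Digital scope: from sample rate to usable bandwidth to time resolution.
SAMPLE_RATE = 160e9                  # 160GS/s, today's highest
usable_bw = SAMPLE_RATE / 3          # rule of thumb from above: ~1/3 -> ~53GHz
time_resolution = 1 / usable_bw      # ~18.9ps

print(f"usable bandwidth : {usable_bw / 1e9:.0f} GHz")
print(f"time resolution  : {time_resolution * 1e12:.1f} ps")   # ~1000x coarser than 19fs
```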
But so many tricks are applied (already at the ADC side, to achieve the high sampling rate) that in my personal opinion nothing much can be trusted. Or at least results will vary vastly depending on the analyzer used.
Also think about the oscillators in the (ADC) analyzer itself and how they would need to be rated ... It just can not exist, not even for the money such an analyzer costs (which is multi 100K to begin with).
Remember, none of this prevents the digital analyzers from coming up with jitter data after all, but regarding reliability - it is just not worth it. Remember : my opinion and judgement.
With an analogue scope the only problem is that it does not exist any more. Old technology. So, maybe analogue scopes are still being made in the lower bandwidth range (like under 100MHz), but at the moment digital broke through (say 15 years ago), while maybe one manufacturer was working on higher bandwidth analogue scopes, it was considered that digital was the (high tech) way to go, and development of analogue scopes stopped.
And mind you, if I observed correctly, extremely high bandwidth analogue scopes are so complex to design/manufacture that actually only two such (1GHz) scopes ever existed : the Tektronix 7104 from the 1970s and the Iwatsu TS-81000 from 2002.
For comparison, what a 1GHz analogue scope can do compares with what a digital scope of 300GS/s would be able to do for the time domain - theoretically (theoretically for the reasons described above).
Side note again : It is not so that digital analyzers are totally incapable because they can't meet the necessary specs regarding the time domain; digital analyzers can do more, because the time domain can be derived from the frequency domain (which is what "digital" can do). But as said, it is full of tricks, and such analyzers cost a large house anyway.
This is how an analogue scope "works" and outperforms the digital scope without hassle (I mean, what you see is just real) :
The 1GHz we are talking about is not about 1 giga samples per second or something - it is about the number of "sweeps" (say scans) the scope can perform per time unit. This is defined as 1GHz for the whole (horizontal) screen, or more literally : 10 divisions per ns (nanosecond).
1GHz = 1/1,000,000,000 = 0.000000001 second = 1ns.
In other words : when the whole screen is written in 1ns, one division is written in 1ns/10 = 100ps (picoseconds).
or :
One division contains 10GHz of time resolution (10 x 1GHz).
In practice this is 200ps per division (don't ask me why, but it can be related to Nyquist stuff after all, and there is math behind this which is somewhat beyond me), with the further notice that the time cursor has a resolution of at least 1/60th of a division. Now 200ps/60 = 3.33ps, and so the time resolution which can be observed is 3.33ps.
Also notice that for observing the time domain the screen's resolution is important, which in this case is 800 pixels for the (horizontal) time domain (480 vertical). Thus, 80 pixels are available per division, which is sufficient for the 60 time cursor steps to begin with, but also sufficient for seeing the full resolution by eye. Thus, the full resolution which can be observed is 200ps / 80 = 2.5ps.
2.5ps is thus 200ps divided by 80, and hence 80 times the 5GHz per division which can be derived from the above (1/200ps = 5GHz). This means that 400GHz of time resolution is available for observation.
This 400GHz can still be compared to the 53GHz mentioned in the beginning of this post.
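For who wants to verify the whole chain of numbers, here it is in a small sketch (plain Python; all figures are the ones from the text above, nothing is measured) :

```python
# Rough arithmetic behind the analogue scope's time resolution, as described above.

SWEEP_TIME_S = 1e-9            # whole screen written in 1ns (the "1GHz" sweep)
DIVISIONS = 10                 # horizontal divisions on the screen

per_div_theory = SWEEP_TIME_S / DIVISIONS        # 100ps per division
per_div_actual = 200e-12                         # 200ps per division in practice

cursor_steps = 60                                # time cursor: 1/60th of a division
cursor_res = per_div_actual / cursor_steps       # ~3.33ps

pixels_per_div = 800 // DIVISIONS                # 800 horizontal pixels -> 80 per division
pixel_res = per_div_actual / pixels_per_div      # 2.5ps

print(f"theoretical per division : {per_div_theory * 1e12:.0f} ps")
print(f"cursor resolution        : {cursor_res * 1e12:.2f} ps")
print(f"pixel resolution         : {pixel_res * 1e12:.1f} ps")
print(f"equivalent 'bandwidth'   : {1 / pixel_res / 1e9:.0f} GHz")   # the 400GHz
```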
To keep in mind : this is only about observing the time domain or performing math directly on it (the latter a digital scope can do and an analogue scope can not).
Now let's see what we can do with this, with the notice that I never saw pictures of what I will present anywhere else. This is obvious, because chances are very small that people in the same field use such an analogue 1GHz scope.
(and yes, I got myself a 1GHz Iwatsu)
To keep in mind : Such a scope is not an "analyzer" as such and analysis goes by means of the eye.
Let's first look at an "analysis" of jitter distribution (as how it is called) of a digital scope (courtesy Rohde & Schwarz) :
Watch the more horizontal beam, which is the "analysed" representation of the signal of an oscillator, and how the lighter blue colour denotes more intensified jitter. Read this as : the light blue colour denotes where the signal occurs most. This is how the middle of that beam is the most intensified, because it is the signal itself, while the deviations (jitter) from that signal occur less often and so are less intensified.
If you look at the plot in dark blue at the bottom of the screen, you see how the deviation of the signal can be plotted, and you see two major side beams. The right hand one is the higher, and you can see it as the secondary light blue beam within the horizontal main beam. The left hand beam in the plot at the bottom is lower in intensity, and it represents the dark blue beam within the horizontal beam.
Lastly, you can see a second side beam on the left hand side, and this is the more orange colour in the horizontal beam.
Hard to read is the 51.2ns of deviation from the perceived optimal signal, which shows on top of the right hand side beam at the bottom. This means that there's a frequently occurring jitter of 51.2ns towards one side of the optimal signal; if the left hand side beam were also to show 51.2ns (but this is not readable), it would mean that peak to peak jitter is 102.4ns for this more intensified (more frequently occurring) deviation. Mind you, only for these more intensified parts, because you can see there is jitter beyond that - it extends to the yellow (least intensified) beams in the horizontal beam. With some visual interpretation you can see that this spans around 400ns, just because the distance is around twice that to the light blue and dark blue beams shown in the horizontal beam.
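For who wonders what such a "distribution" plot at the bottom actually is : mathematically it is nothing but a histogram of each edge's deviation from its ideal position (Time Interval Error). A minimal sketch of the idea (Python with numpy; the clock and the 51.2ns side beam are simulated purely for illustration, so don't read real values into this) :

```python
import numpy as np

# A simulated clock with period T; 30% of the edges carry an extra +51.2ns
# deviation, mimicking the secondary "side beam" described above.
T = 1e-6                                   # ideal period: 1us (1MHz clock)
rng = np.random.default_rng(0)
n = 100_000
ideal = np.arange(n) * T
jitter = rng.normal(0, 20e-9, n) + rng.choice([0.0, 51.2e-9], n, p=[0.7, 0.3])
edges = ideal + jitter

# Time Interval Error: deviation of each edge from its ideal position.
tie = edges - ideal

# The scope's bottom plot is simply this histogram.
counts, bin_edges = np.histogram(tie, bins=200)
centers = (bin_edges[:-1] + bin_edges[1:]) / 2
print(f"most occurring deviation : {centers[counts.argmax()] * 1e9:6.1f} ns")  # main beam
print(f"peak to peak jitter      : {(tie.max() - tie.min()) * 1e9:6.1f} ns")
print(f"RMS jitter               : {tie.std() * 1e9:6.1f} ns")
```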
Got that ?
Then let's now look at the analogue scope and what we can make of that :
This shows a 1kHz signal, and you actually see exactly the same. One difference of course - it is one colour only, and the more frequently occurring deviation (jitter) shows by the more intensified beams;
You can see that the optimal signal - now simply and "honestly" derived from the most occurring signal, without any math - is in the middle, and that at equal spacing more occurring jitter shows. As in the (first) example picture, you can see that the distribution to the left and to the right of the signal is not equal; to the right the deviation is more intensified and less spread, and to the left it is more spread (wide) and (thus) shows more vaguely. With some experience you could see that the deviation to the right and to the left is in itself equal, but that the deviation to the left is more "wobbling" in itself.
Notice that "to the right" means "late" and "to the left" means "early". So we have jitter that makes our samples come out later than intended and also more early than intended. And, the
distribution of that is not equal.
Mind this word "distribution" because in the end a lot is about that (for Sound Quality).
If you look at the last picture and now concentrate on the middle of the beam (the intended signal), you can see there too a more intensified and a less intensified part. So, don't think this is just the "phosphor" of the screen dying out or something - it is just jitter again and, there too, more deviated to the left than to the right, and again the early samples are more "distributed" than the late ones.
Lastly for this picture, notice that the parts between the middle beam and the outside ones are completely black ...
But not so here !
What happened ?
While the previous picture shows a 1kHz test signal, the last one shows normal music playing.
Aha.
What insiders will right away tell us is that this is thus "data dependent" jitter. Well, don't believe everything they say (and decide whether you believe me or not), but as we already know there is more going on than data dependent jitter as it formally exists (without explaining what that exactly is at this time) - and that is I2S dependent jitter. This too I won't explain, plus it has been sorted out by "us ourselves" (with Nick in the base).
Is it really I2S dependent jitter ? Maybe, maybe not, but it is dependent on things beyond what is (formally) known. Look :
This time again music is playing, but now it has been attenuated to -144dBFS. Isn't that interesting ...
If you compare this picture with the previous one, you will be able to see how "volume" plainly creates jitter. All you need to do for that is see that the main beam has almost vanished, in that almost all of it has been spread into deviation.
But not so with the last picture !
Side note : Can you already see how "jitter analysis" (from reviews or whatever other means) needs context that is never provided (because never recognized) ? This is how I love this ... Something new again.
Something new to work with.
Here you see the same but now attenuated to -120dBFS.
See ? it already starts to happen.
In the big picture we must see this in the context of a software player (by PC) and how it can influence jitter. Now we must be careful with the interpretation of the remainder of this post, because it states that this *always* happens. However, this is an empirical finding in itself, and it has been derived only from so many people's various D/A converters, with plainly nobody ever telling that XXHighEnd's influence is not there. Thus, the other way around : it always does, and this means that for the remainder of this post what I say/claim is true in general for everybody and everywhere (DACs).
Still this does not prove it 100%, so please keep this in mind.
What you saw so far has been taken from your fine Phasure NOS1 D/A converter as almost all of you have it (the USB version I mean). Thus what I showed will show from your NOS1 just the same. Now notice : this has been taken from the 22.xxxMHz oscillator output, and the oscillator is rated at 200fs. But look at the picture again, see the two vertical yellow ("cursor") lines, and see that really 210ps of peak to peak jitter comes from it.
yeah, well, what to say.
Lousy after all ? Maybe, but that doesn't tell at all that other DACs do better. It is all a matter of how the interpretation is done, and all I personally can tell is that whatever the Stereophiles etc. come up with is by far superseded by me, hence by the NOS1, if I apply the same type of measurements. Apparently that doesn't tell much, or too much is made vague along the way. And/but mind you, a first real jitter measurement by a DAC manufacturer himself I still have to see. IOW, showing oscillator measurements doesn't tell a thing at all. I hope this is clear now; only when this is done in the physical DAC (like I just showed) does it show reality.
And of course no Stereophile is going to put measurement probes on oscillator legs.
On to something new again (for the world and as far as I can tell) :
What you see here is the top of the oscillator's wave cycle (but with the 1 giga scans per second again), and you can clearly recognize that what's shown there resembles 100% what was shown before. The difference ? This is the "level" of the output voltage and not the time domain as such. Now, with the notice that the vertical scale is 50mV per division, the "peak to peak" level variation shows something like 100mV (on a total of 3.3V peak to peak);
It is not said that the same happens to our audio samples (I did not check that), but it can be expected. Thus, this is again the oscillator output, and "something" influences not only the time stability but also the level stability (output voltage stability). And since the "jitter" distribution looks the same, its source will be the same (whatever that source is).
Back to XXHighEnd influence ...
... Notice that the maximum output level of the PCM1704 chip is 1.2mA of current (which is converted to voltage and amplified). You are entitled to see 1.2mA as totally nothing to begin with, but notice that this is the maximum output, or in other words what it outputs at 0dBFS (no digital attenuation). Can you now please divide 1.2mA by 16,777,216 to learn the minimum output at 24 bits of attenuation, or -144dBFS ? And also divide 1.2mA by 1,048,576, which implies 20 bits and equals -120dBFS ?
... and look back at the above pictures which compared the -144dBFS and -120dBFS situations, and see that it makes a difference in jitter ?
And see that whatever your calculators came up with for such super low amperage is THUS so easily influenced by all other unpronounceably tiny current draws ?
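If you don't feel like grabbing the calculator, here is the same in a few lines of plain Python (only the 1.2mA full scale figure comes from the above) :

```python
# PCM1704 output current at full scale versus deep digital attenuation.
FULL_SCALE_A = 1.2e-3                  # 1.2mA at 0dBFS

at_minus_144 = FULL_SCALE_A / 2**24    # 24 bits of attenuation = -144dBFS
at_minus_120 = FULL_SCALE_A / 2**20    # 20 bits of attenuation = -120dBFS

print(f"-144dBFS : {at_minus_144 * 1e12:.1f} pA")   # ~71.5 pA
print(f"-120dBFS : {at_minus_120 * 1e9:.2f} nA")    # ~1.14 nA
```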
So that is how it works, and it will also work for the level influence. Thus we don't deal with jitter in the time domain only, but also with jitter in the level domain (you might call this the frequency domain, but I am not sure). Or in other words : when two samples with the exact same calculated level are presented to the D/A conversion, both will vary, because "something" influences it.
Generally this "something" can be called your DAC and all that happens in there in order to convert the digital samples to analogue output. And that is A LOT.
But let's try to improve on this; what we can measure can be improved;
Aha. Well, the step between what you saw in the previous pictures and this one reads like a 1 minute change, but take it that it really was two years. Even more actually, because some know I have been working on it forever.
Anyway now it shows 13ps;
Watch out : this 13ps is measured peak to peak, while general "one number" jitter figures are expressed in RMS, and the conversion also depends on the distribution; we can't do the real math from looking at a beam like this, but take it that for the RMS figure the 13 can very roughly be divided by 8 (so RMS jitter would be something like 1.6ps). Remember, "very roughly", and it can be higher or lower.
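For who wants to play with that rule of thumb, a small sketch (Python with numpy; the divide-by-8 assumes roughly Gaussian jitter, where the observed peak to peak spans about 8 sigma - an assumption, as said) :

```python
import numpy as np

# Rule of thumb: for ~Gaussian jitter, observed peak to peak covers ~8 sigma.
pp_observed = 13e-12                    # 13ps peak to peak, read off the screen
rms_estimate = pp_observed / 8          # ~1.6ps RMS

# Sanity check against a simulated Gaussian distribution; note that with a
# million samples the extremes stretch a bit beyond 8 sigma - hence "very roughly".
rng = np.random.default_rng(1)
samples = rng.normal(0, rms_estimate, 1_000_000)
print(f"RMS estimate           : {rms_estimate * 1e12:.2f} ps")
print(f"simulated peak to peak : {(samples.max() - samples.min()) * 1e12:.1f} ps")
```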
But is it 13ps, or is it actually the "analyzer limit" ? It is the latter. How ? Well, it is just the width of the beam of the signal, and although I am able to minimize the beam more than you see, it would come down to showing "no beam" if less width were to be shown (those familiar with scopes will readily understand).
So I can only show the width of the beam, and as long as no more intensified parts show (which they do not), it shows "no jitter".
Of course there is jitter (because no oscillator exists without jitter), but at least it will be under 13ps. And, if all was done 100% right, it will be at the 200fs rating of the oscillator.
Back to the beginning of this post and how digital analyzers may be able to dig out more (for various types of jitter, I mean) : let's say that for myself I don't need to have dug out more, once nothing "is there" anyway. As said, in the end this is not true, because down at the femtosecond level something similar will be seen, but whether that is useful - I don't think so (also see the end of this post).
For our fun look here :
The scope is a so-called storage oscilloscope, which means that it can store the once-passed wave forms and keep on showing them on the screen. You can see this by the somewhat different colour of blue when it is used. So, this picture as well as the previous one used this feature, and it was "all open". This means that it forever keeps on showing the once-captured output, which in digital terms is named "phosphor technology" and which resembles the afterglow of old Tektronix oscilloscopes.
This feature comes in handy when analyzing jitter by this visual means, because a "once occurring" deviation will be captured and shown, no matter that it is not occurring any more.
So what happened to the last situation shown ?
I hopped up and down on the floor;
While jitter will inherently be under 13ps, I induced 36ps of jitter incurred by vibration; you can't see it here, but one time the beam would jump to the right, the other time to the left, and after a few minutes of exercise on the floor this came out of it.
So be careful about what sound pressure (or speaker vibration) itself can do - not much different from the vast importance this has for turntables.
But didn't I put up a couple of posts on a nice Friday evening on how damping of the DAC (protecting it from vibrations) is important ? Right.
To understand this better you need to read to the end, so you can see how much lower than this "analyzer limit" the jitter really will be, BUT that hopping on the floor is still capable of making 36ps of it. Relatively this only emphasizes the importance of good damping (read : should jitter be down to e.g. 200fs while vibration of the floor makes that 36ps, then the difference is a lot).
Here you can see that the "jitter" on the voltage output level of the oscillator has vanished as well.
Please notice that this is not 100% comparable to the other pictures, because the beam is intensified too much, plus this shot was saved as PNG (compression, which you can clearly see), while the others were saved as lossless BMP. But the general idea is clear - no side bands visible anywhere any more.
What technically happened is the 100% isolation from the (USB) interface. This has been accomplished by a means that is *not* detrimental to jitter, and that is the big trick. There's not even long term jitter involved, like a PLL would imply (just saying); I am pretty sure that whatever I did has not been done before ...
Secondly (but a prerequisite), the lowest possible supply noise has been applied to all parts everywhere. Think tens of mV before (which also is very good) to less than 2uV (RMS !) now.
As a bonus, here's the eye diagram at a sampling rate of 352800 samples per second (352.8kHz) and with the (phosphor) intensity fully open for 15 minutes. It looks the very best (Google for pictures of eye diagrams).
Notice : In itself the eye diagram only tells about the headroom available for errors in the D/A conversion. See it like this : when there's so much jitter that there's no opening left (the opening is the part between the crossing lines), there are no continuous "signal edges" left to (always) trigger the output clocking on.
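For who wants to see how such an eye diagram comes about : it is nothing more than the signal folded back onto itself at (multiples of) the unit interval, so that all bit transitions overlay - exactly what the open phosphor does by itself. A minimal sketch (Python with numpy and matplotlib; the bit stream, noise and numbers are made up for illustration) :

```python
import numpy as np
import matplotlib.pyplot as plt

# A made-up serial bit stream, heavily oversampled.
rng = np.random.default_rng(2)
oversample = 64                            # points per unit interval (bit period)
bits = rng.integers(0, 2, 2000).astype(float)
wave = np.repeat(bits, oversample)

# A touch of band-limiting plus noise, so the edges get a finite width.
wave = np.convolve(wave, np.ones(8) / 8, mode="same")
wave += rng.normal(0, 0.02, wave.size)

# Fold the waveform back onto itself every two unit intervals:
# the overlay of all those snippets IS the eye diagram.
span = 2 * oversample
n_traces = wave.size // span
traces = wave[: n_traces * span].reshape(n_traces, span)
t = np.arange(span) / oversample           # time axis in unit intervals

for trace in traces:                       # low alpha mimics the phosphor build-up
    plt.plot(t, trace, color="blue", alpha=0.02)
plt.xlabel("time (unit intervals)")
plt.ylabel("level")
plt.title("eye diagram (simulated)")
plt.show()
```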
Questions ? Shoot. Recognize that I sprouted cr*p somewhere ? Say it.
But indeed, there is no audible XXHighEnd influence any more ...
Peter