But the point is that if you transfer data, not audio, via USB there is error correction, which will compensate for such effects; otherwise you would not be able to copy a picture or a program onto a USB stick, for example. It depends on the USB interface and how it communicates with the PC. I am no expert on this. Peter, can you explain it? What would this digital audio look like, and how is it handled compared to data? In which cases do we have a digital audio signal (no error correction), and when do we have data, if we listen to music from a PC?
Of course error correction is important - *if* there are errors. But IMO this is looking for things which normally don't occur anyway. Similar to normal CD reading (by a CD player), which may read erroneously and, because of the lack of time to re-read (infinitely, like EAC does), will spit out the errors (beyond what the inherent error correction can repair). So, do we care? Yes, if the result is ticks etc. (and this surely happens!); no (again IMO) when we think it will degrade the sound in general. I say this is nonsense, because it would imply that the level (volume) is slightly changed (meaning the least significant bits are read wrongly, or twisted, or made up), which is not a logical thing at all. When errors are passed on, anything can be wrong: when the most significant bit is read wrongly while the current (momentary) level is low, there will be a tick, and when the level is high there will equally be a tick. Everything in between exists too, but it is hard to imagine that one wrong audio word would be audible (merely think of a few hundred in a row).
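To make the MSB-vs-LSB point concrete, here is a small sketch (the sample value is just a made-up illustration) that flips a single bit of a 16-bit PCM sample and compares the damage:

```python
# Flip one bit of a 16-bit two's-complement PCM sample, to compare the damage
# an LSB error does versus an MSB error. The sample value is illustrative.
def flip_bit(sample: int, bit: int) -> int:
    # Work on the raw 16-bit pattern, then return a signed value again.
    pattern = (sample & 0xFFFF) ^ (1 << bit)
    return pattern - 0x10000 if pattern & 0x8000 else pattern

quiet = 100                          # a low-level signal moment
print(flip_bit(quiet, 0) - quiet)    # LSB error: off by 1 - inaudible
print(flip_bit(quiet, 15) - quiet)   # MSB error: off by 32768 - a loud tick
```

So a wrong least significant bit is a level change nobody can hear, while a wrong most significant bit throws the sample clear across the range - hence the tick.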
Normal isochronous USB is the same: no error correction. But what's the real problem?
Which leads us to an asynchronous connection, which indeed is about error-corrected normal data. The conclusion must be no different: what can it help? Oh, it can help the other way around: too many errors will consume the bandwidth progressively, and *now* you have ticks and glitches and gaps (the data cannot be delivered in time, no matter the buffer used with async).
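For what it's worth, the error detection on such data transfers boils down to a checksum over each packet: receiver recomputes, compares, and a mismatch triggers a retry (which is exactly what eats bandwidth when errors pile up). A minimal sketch - illustrative only, not a real USB implementation, though the generator polynomial below is the one the USB data CRC uses:

```python
# Reflected CRC-16 sketch. The polynomial x^16 + x^15 + x^2 + 1 (0xA001 reflected)
# is the one used for the USB data CRC; everything else here is illustrative.
def crc16(data: bytes, poly: int = 0xA001, init: int = 0xFFFF) -> int:
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ poly if crc & 1 else crc >> 1
    return crc

packet = b"some audio samples"
sent_crc = crc16(packet)

# Receiver side: recompute and compare; on a bulk transfer a mismatch means "resend".
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]   # one bit flipped in transit
print(crc16(packet) == sent_crc)       # True  -> packet accepted
print(crc16(corrupted) == sent_crc)    # False -> packet would be retried
```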
So, done. Remember, my view.
But my view is also that this is all about jitter. Ok, I won't be the only one. *Now* things change, and they change from so many angles that the number of characters allowed in one post will be too few to complete it.
Jitter is about timing errors, which are most easily looked at as wow from a turntable. The frequency can be as low as one revolution of an LP, or lower - but also almost infinitely higher.
The audio data - now looked at in the "audio stream" realm - is supposed to pass through the converters (D/A but also A/D) in a 100% constant fashion. We don't need to look at 0's and 1's to understand that the samples (like 44100 per second) must be let loose in the most even fashion, or otherwise strange things would happen to the frequencies. Just keep thinking of wow from the turntable ...
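A quick sketch of what uneven "letting loose" does: sample a sine at jittered instants instead of perfectly even ones, and compare against the ideal samples (the jitter figure here is deliberately exaggerated for illustration - real clocks are in the picosecond range):

```python
import math, random

# Sample a 10 kHz sine at 44100 Hz, once with ideal timing and once with the
# sample instants jittered. Numbers are exaggerated for illustration.
random.seed(1)
fs, f = 44100, 10_000
jitter_rms = 1e-9   # 1 ns RMS timing error - far worse than a decent clock

ideal   = [math.sin(2 * math.pi * f * n / fs) for n in range(1000)]
jittery = [math.sin(2 * math.pi * f * (n / fs + random.gauss(0, jitter_rms)))
           for n in range(1000)]

worst = max(abs(a - b) for a, b in zip(ideal, jittery))
print(f"worst-case sample error: {worst:.2e}")   # scales with 2*pi*f*jitter
```

The bits themselves are all still "correct" - only the moments they were taken at shifted - yet the reconstructed values are no longer the ones an even clock would have produced.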
The "frequency" (!) at which the samples are let loose is determined by some clock. Let's not make it difficult, and let's imagine the clock to be right next to the D/A chips.
The clock is (or can be) a crystal, and although it has its own resonance (frequency), it is driven by current. Or voltage if you want, but Ohm's law will tell you it is always about both, at least when one can change.
Although it won't be the 100% truth (or at least not in all cases), it is easiest now to think that the voltage level can change through influences on the current. So, hammer on the current somewhere in the same circuitry, and the voltage (and/or current) that drives our must-be-stable clock will have an impact on the stability of that clock. Yes, we are talking about picoseconds or less these days, as the impact of the time difference on a certain frequency. Let the clock run faster and the frequency we want to hear gets higher. If that were all, it would be no problem, except that our music track might last 5 seconds less than officially intended. But it is about the
stability: roughly said, the clock outputting a different frequency of her pulses within that picosecond dimension.
Now high frequencies (like 10KHz) will be disturbed.
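A back-of-envelope calculation shows why the high frequencies suffer first: the error a timing offset causes is proportional to the signal's slope, and thus to its frequency. Illustrative numbers only:

```python
import math

# How a timing error converts into an amplitude error: the steepest slope of a
# full-scale sine at frequency f is 2*pi*f (full-scale units per second), so a
# timing error dt shifts the sampled value by up to 2*pi*f*dt.
f  = 10_000      # 10 kHz tone
dt = 100e-12     # 100 ps of jitter (illustrative figure)

err = 2 * math.pi * f * dt    # worst-case amplitude error, full scale = 1.0
lsb = 1 / 2**15               # one 16-bit LSB, full scale = 1.0
print(f"error = {err:.2e} of full scale = {err / lsb:.3f} LSB")
```

With these numbers the error at 10KHz already reaches a fifth of a 16-bit LSB; at 100Hz the same 100 ps would matter a hundred times less.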
Because many things happen inside an audio chain (I extend this way outside the DAC on purpose), all kinds of "self frequencies" hammer on that supply for the clock. It could be the 50/60Hz mains supply, it can be a voltage regulator (of any random high speed) - and there can be a hundred of those regulators in one device - it can be that other clock in there (like a base 48KHz vs. a base 44.1KHz), it can be a receiver chip - it can be so many things.
All these frequencies, also depending on their individual levels, interact. They interact into a multitude of additional (!) new frequencies. Now it will not be hard to imagine that these may come together (in one peak) once per 10 seconds, 1 minute, 3 minutes or even days. Now we are talking about real wow, and it really exists. It is mighty audible as well, once you have heard the same material without it. For example, a nylon string guitar can sound so straight. But with jitter you wouldn't know what is wrong, because it just sounds good, and you wouldn't know how the artist manipulated the strings. Btw, a jazz guitar the same.
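How can peaks come together only once per minutes or days? When two interfering frequencies are almost equal, their peaks line up at the difference (beat) frequency, and the closer they are, the longer the cycle. The frequencies below are made-up illustrations:

```python
# Two almost-equal interfering frequencies align at the difference (beat)
# frequency. Values are made up for illustration.
f1 = 100.000        # Hz, e.g. some regulator's switching residue
f2 = 100.003        # Hz, another source, almost but not quite the same

beat = abs(f1 - f2)      # Hz
period = 1 / beat        # seconds between moments the peaks coincide
print(f"peaks coincide every {period:.0f} seconds")   # ~333 s, over five minutes
```

With more than two sources the alignments get correspondingly rarer and more irregular, which is why the effect can seem to come and go over days.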
Since there's all this talk about "computer noise" sneaking into our DACs, I should mention noise as that other reason with an impact on jitter. There is not much difference with the layout before, because noise is again about hammering frequencies onto our signal. In the end there *is* no difference, and I say that even white noise does not exist. Maybe at the molecule level, but we will never get to see *that*, because there's always other sh*t around first.
When this "noise" gets into our clock supply, there is not much difference in the impact, although there may be a difference in the current fluctuation. So, the outline before merely suggested that the current was not stable, and thus something like a clock needing current is not able to run stably (think of a car which continuously holds back and surges forward again), while with noise coming from outside, the voltage peaks will not be stable (think of a lamp dimming and brightening at the frequency of the mains). Now it is a matter of stuff relying on voltage peaks (and all digital does!), and *now* we have the problem of slopes (of voltage waves) arriving earlier and later, just because noise (illegal peaks) rides on them. And hey, here is mr. jitter again! (this time *behind* the clock, the square wave coming from it having its slopes changed).
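The slopes-arriving-earlier-or-later effect fits in one formula: the time shift is the noise voltage divided by the edge's slew rate. Illustrative numbers again:

```python
# A logic edge with finite slope crosses its threshold earlier or later when
# noise rides on it: dt = v_noise / slew_rate. Numbers are illustrative.
slew_rate = 1e9      # a 1 V/ns edge, i.e. 1e9 V/s
v_noise   = 10e-3    # 10 mV of noise riding on the line

dt = v_noise / slew_rate
print(f"edge shifts by {dt * 1e12:.0f} ps")   # 10 ps of jitter from 10 mV of noise
```

So even millivolts of rubbish on a fast edge land us straight back in the picosecond dimension from before.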
Now, on to our subject (wasn't that USB?): all we need to know is that it is the worst. It creates 8KHz spikes, because that is the frequency at which USB transmits its packets of data. If you look at it, it's only 20dB or so (I forgot) above the normal noise floor. But since I claim that noise rides ON the signal, I say it sounds like sh*t. And it really does. And for those who recall why I was reluctant to go any USB route ... now you know.
This noise not only rides on the signal, it influences all the angles I talked about above. So, not only the analogue signal - because you'll see it back as a modulated signal on top of that audio signal - but long before that it influenced jitter by all means thinkable. This is also why USB DACs don't measure well on jitter at all (maybe by now this has changed, I don't follow everything).
By now you will also be able to imagine how any computer noise will sneak into any interface, USB foremost, because that packet-sending current surge starts in the PC. But the other way around as well: anything that happens in the PC will influence that signal itself. So now we have the 8KHz with a superimposed pile of other frequencies, and from there the mess is unpredictable, but it *is* a mess.
And by now you'll also start to understand how software like XXHighEnd can explicitly influence all this ...
And this is not by changing any bits.
Peter