So Alain, here we go ...
You mentioned that, as a hobby, most tweaks would not work anymore, and in a sense this is a dream come true for those who prefer to "plug it in and forget it". Still, for a guy like me, there are some aspects of this that I could not easily grasp.
Add to this that SQ will increase again, and you will not only make my day but my months, my years... and my ears.
But some notions have escaped me (this is an understatement) in the process. I had thought that noise (born from all the situations that occur inside a PC) takes any path that is easy for it to take. So, since the PC is connected to the NOS1 with a USB cable, the data line would already be carrying noise that has nothing to do with the signal itself. And even if someone were using an optical device (like Adnaco), wouldn't the noise "within" the signal also reach the DAC?
To first answer Mani's question : Yes, it is totally galvanically isolated.
But e.g. the Adnaco is that too ...
Now Alain, your perception of the data already carrying the noise is in itself correct. But I don't think this is what I myself ever brought forward as "the reason". Of course I have seen many of you debating these kinds of issues, and I have not intervened on all occasions, simply because it was not relevant. Still, to hopefully keep it in an understandable realm, first about that:
When noise ends up in that "data" (so think of a plus and a ground carrying the data, no matter whether it travels through a glass fibre), it would only be about deterioration of the "digital" signal. Digital signal: for your comfort, think of 0V and 2V going On and Off, where On means a "bit up" and Off means a bit not-up (down).
All that happens when this is noisy is that the moment at which the bit is seen to be up or down varies somewhat. Of course everybody right away says "jitter !!", but this is not so with genuine computer data. Oh, the jitter is there, but it is not important, because a relatively huge time window is allowed for detecting whether the bit is up or down, and although the edges are jittery, interpreting the state of the bit is just not affected by it. It doesn't do a thing - unless you see your Word document show different characters when there is more lag, etc. But it doesn't.
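To see why, here is a little sketch (just my illustration; the bit period and jitter numbers are arbitrary) of a receiver sampling each bit in the middle of its time slot:

```python
import random

random.seed(1)

BIT_PERIOD = 100.0   # arbitrary time units per bit slot
JITTER = 10.0        # each edge wobbles by up to +/- 10 units

data = [1, 0, 1, 1, 0, 0, 1, 0]

# Edge times of the transmitted signal, each displaced by noise-induced jitter.
edges = [i * BIT_PERIOD + random.uniform(-JITTER, JITTER)
         for i in range(len(data) + 1)]

def sample(t):
    """Return the bit value seen at time t: find which jittered cell t falls in."""
    for i in range(len(data)):
        if edges[i] <= t < edges[i + 1]:
            return data[i]
    return None

# The receiver samples in the middle of each nominal bit slot.
received = [sample(i * BIT_PERIOD + BIT_PERIOD / 2) for i in range(len(data))]

print(received == data)  # True: jittered edges, yet the data is intact
```

Even with the edges wobbling, the mid-slot sample always lands well inside the right bit cell, so the data comes out untouched.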
When this is an electrically connected signal, the noise as such is actually (also) in the ground. It is *there* where things go "wrong", because the ground is connected to everything. Again for your comfort (thus simply said): when the noise is in the ground, it will also affect an oscillator connected to that same ground. And NOW it implies jitter, because if the clock cannot put its bits up and down in a stable fashion, this affects the audio whose "output speed" it is controlling.
(btw, we don't talk "bits" when it is about feeding oscillators with power, but it doesn't matter, because in the end ALL is analogue signal).
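As a toy illustration of that (all numbers assumed, nothing like a real oscillator): let the ground/supply noise modulate the period of an oscillator a tiny bit each cycle, and watch its edges drift away from where a clean clock would put them:

```python
import random

random.seed(3)

NOMINAL_PERIOD = 100.0   # oscillator period, arbitrary time units
SENSITIVITY = 0.5        # period shift per unit of supply/ground noise (assumed)

clean_edge = noisy_edge = 0.0
worst = 0.0
for _ in range(1000):
    clean_edge += NOMINAL_PERIOD
    noise = random.uniform(-1.0, 1.0)             # noise on the ground/supply
    noisy_edge += NOMINAL_PERIOD + SENSITIVITY * noise
    worst = max(worst, abs(noisy_edge - clean_edge))

print(worst > 0.0)  # True: the edges no longer line up - that timing error IS jitter
```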
So, "when electrically connected" eh ? And what about the galvanically isolated glass connection then ?
It is here where it does not matter whether you see noise in the signal as real or not, because it is now about "the other end" (the DAC), where super noise is implied by the means that converts the data back to electrical. It is very simple: all "draw" of current, no matter how minor, implies noise. Notice though that "noise" as such is just peaks (or rather dips) in the supply. And it again deteriorates the signal, just like "for an oscillator" (signal = supply here).
On a side note, I think I never talked about this exactly either, because it is sufficient to know that the noise is there anyway (thus also with glass) and that it deteriorates things. All I did (and fairly explicitly) was promise that it would not help (glass isolation for the USB interface), right?
So again, at that other end some receiver is "pumping", and at each stroke it creates noise. That a power brick may then inject noise of yet another type on top of that is something else, but by then it is already too late.
With the notice that I won't tell all because I like to keep some secrets (next Swenson article coming up of course) ... the galvanic isolation in itself is not so difficult to implement. And I am sure that more DAC manufacturers claim it. It is only that we must realize that ALL isolation implies jitter. The one type more of it than the other, but 100% sure all do. This is because they imply (almost explicitly "create") lag, and that lag is not by any definition 100.000000000% constant. So it can be 99.99999999% constant, but that soon implies over 100ps of jitter (and rather at the ns level when not careful). This counts for digital isolation means, but even more for analogue. The latter because a transformer as analogue means is not linear, and it thus depends on "frequency". And several different frequencies exist. But also think about "click-through", like how a magnet suddenly hooks in. This also counts for transformers (signal comes up, pushes and ... ploop - through !). So here too: undefined, varying lag (= jitter).
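To put some illustrative arithmetic behind that (the lag figure here is just an assumed example, not a measured one): jitter from an isolator is simply its lag times the fraction by which that lag fails to repeat.

```python
# Illustrative arithmetic only; lag and repeatability are assumed example values.
lag_ns = 10.0        # suppose the isolator adds 10 ns of propagation lag
constancy = 0.99     # and that lag is only 99% repeatable from edge to edge

jitter_ps = lag_ns * (1.0 - constancy) * 1000.0   # ns -> ps

print(round(jitter_ps, 1))  # -> 100.0 (ps of jitter from 1% variation on 10 ns)
```

The point being: the bigger the lag and the less repeatable it is, the more jitter, and analogue means (transformers) are the least repeatable of all.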
With galvanic isolation - at least the type we apply - the ground potential does not need to be the same on either side. From this you can already see that the signal is actually re-generated. So we have a noisy part on one side (USB, via glass or not) and a super silent environment - to the best of our means - on the other side. Meanwhile the "captured" signal is literally re-generated. This could be seen as the part where the noise disappears: a signal deteriorated by noise comes in, while a clean signal comes out on the other side. It is as clean as the re-generator can make it, and here starts the necessity of an ultra low noise supply. But this is obvious, I think.
What should be obvious by now as well is that whatever noise patterns are implied by USB, or by the software and PC in front of that, just cannot have an influence. It is a totally new signal, only "triggered" by the old one (bits up remain up, bits down the same).
But now jittery because of the isolation means itself.
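The re-generation itself can be sketched like this (again only an illustration with arbitrary numbers): a flip-flop-like element samples the noisy incoming stream on the ticks of a clean local clock, so the output changes only on those clean ticks and the incoming edge jitter is thrown away:

```python
import random

random.seed(2)

CLEAN_PERIOD = 10.0   # period of the clean local clock (arbitrary units)

# Incoming bit stream whose edges arrive with noise-induced jitter.
bits = [1, 0, 1, 1, 0, 1, 0, 0]
arrival = [i * CLEAN_PERIOD + random.uniform(-2.0, 2.0) for i in range(len(bits))]

def level_at(t):
    """Input level at time t: the last bit whose jittered arrival is <= t."""
    level = bits[0]
    for b, a in zip(bits, arrival):
        if a <= t:
            level = b
    return level

# Re-generation: sample the input on each clean clock tick (mid-period here).
ticks = [i * CLEAN_PERIOD + CLEAN_PERIOD / 2 for i in range(len(bits))]
regenerated = [level_at(t) for t in ticks]

print(regenerated == bits)  # True: same data, but now timed by the clean clock
```

The output carries the jitter of the clean clock only - which is exactly why that clock (and its supply) must be so quiet, and also why the isolation means itself becomes the remaining jitter source.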
Following this logic (probably from my confused understanding), it made sense to reduce all that could generate this noise in a PC (minimize the OS, stop many services, underclock the CPU, play unattended, avoid networking, even use a linear PSU, etc.).
If I recall, there is inherent noise related to the USB protocol, and if there is also noise along the signal path, how do your modifications stop this from altering the DAC?
So that should be clear(ed) now.
How the jitter is cleared is something else, and any self-respecting DAC manufacturer who reads this will say something like "easy". Well, let's say that if that were true, you would have had it in your own NOS1 right from the start. That is, assuming that I already tried to figure it out before the first one went out, and that I really tried to think of everything and all, and still could not find a solution. So I'm dumb. But hey, look at the whole world, which thinks that the glass solution will also work. Or before that, that asynchronous USB will do the job to begin with. So indeed, we know more, and I hope I know a bit more too than just being dumb;
I think I also talked about other developments of the past two years (into physically working products) which contained the solution as well. Such a solution in particular could have been applied by any DAC manufacturer with some real sitting back, thinking and being smart. Still, that too doesn't happen, or, as said earlier on, I should have had reports about XXHighEnd not doing a thing. Or what about "Rankin" expressions like "I don't understand it either and jitter keeps on leaking through, no matter what I do". But still, such a fairly common solution could have been applied. Not this one. This one is so sneaky that even sneaky me could hardly come up with it;
It takes knowledge of why it is needed in the first place (think XXHighEnd development, where it all starts) and of what would be useful and especially what not. It also requires explicit thinking about what will have influence and what cannot. It needs quite explicit knowledge of where to be in the electrical chain and how to turn that upside down when needed. It especially needs the experience (and now knowledge) of how even a stupid Juli@ card can do 705600 while it officially only does 192000 (I say 705600 and not 768000 because there was a bandwidth-limiting chip somewhere). And still I did not write drivers for that. It definitely requires knowledge of how the USB protocol operates in the first place, or better, how audio is "controlled" by it. In the end it needs so many tricks that I regard myself as the only one who already ran into all those individual tricks and only needed to combine them. That too requires some twirly thinking, but I can do that. Just grab a beer.
In the end it also needs the equipment to prove a few things, and at least for me that has been the worst part. But OK now.
This all is not about galvanic isolation; it is about jitter. And about how you want to turn that down to the extreme without having mil-spec oscillators in ovens requiring 2 Amperes just for the heat to begin with. This, accidentally, came forth from the DSD development, where I simply forced myself to make the jitter lower because the sampling rate there is higher - just to keep the jitter performance equal to what we are used to (whatever that was). This in itself was a year of throughput. And still, that was the general means everybody could have come up with, and still not the very best.
So I'll end with this :
Any normal means would make use of a PLL (Phase Locked Loop), with the key issue of keeping that whole lot out of the noise, because noise again implies jitter. The PCB for that alone? Quite complicated, and around twice as large as your current DAC board. But it would have worked. There would be pairs of voltage-tunable oscillators (a PLL is not *that* complicated), with the downside that such VCXOs are not the very best.
Here? Here there are only the normal audio oscillators, and they are as fixed as can be, and with that the best.
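For those curious: a PLL really is not *that* complicated. Here is a toy software model of one (all gains and frequencies are assumed toy values, of course nothing like the real hardware) - a phase detector compares reference and VCO, and a loop filter steers the VCO until it locks:

```python
# Toy PLL model: all values are arbitrary illustration numbers.
ref_freq = 1.00           # reference frequency (arbitrary units)
vco_base = 0.95           # free-running VCO frequency: starts 5% off
kp, ki = 0.5, 0.05        # loop-filter gains (assumed)
dt = 0.01                 # simulation time step

ref_phase = vco_phase = integral = 0.0
for _ in range(20000):
    ref_phase += ref_freq * dt
    error = ref_phase - vco_phase                      # phase detector
    integral += error * dt                             # loop filter (integral part)
    vco_freq = vco_base + kp * error + ki * integral   # steer the VCO
    vco_phase += vco_freq * dt

print(abs(vco_freq - ref_freq) < 1e-3)  # True: the VCO has locked to the reference
```

The catch, as said above, is not the loop itself - it is that the whole loop must live outside the noise, and that voltage-tunable oscillators are by nature not the quietest. Hence: fixed oscillators here.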
I can guarantee that there is not a single means "available" that hammers upon the oscillator(s) or draws current from the same supply line, with one small exception: the FPGA through which everything passes to clock out the samples. Sadly I can't find jitter figures for FPGAs, so that's a nice laugh. Or not, because I can't measure it either. But maybe a new finding: nobody can, unless heavy analogue filtering is in order. So, briefly, about this (I guess) new-to-the-world finding:
No matter what means are used at the outputs (!) of a DAC, jitter cannot be measured there. Why? Well, because the stepping that digital implies always varies over the wave cycle, and no analyzer will be able to trigger on that with 100.000000% even time constancy. Do it via the frequency domain: same story. So THD will vary just because the distortion varies, and the distortion varies because the triggering can't be constant. And want to know some fun? When decent digital filtering is applied, the samples for the high (!) frequencies DO end up at constant positions, THD looks better because of that compared to lower frequencies (huhhh ???), and now jitter will also measure better. Now that.
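The "constant positions" part is easy to verify yourself (the frequencies below are just illustrative): the sample steps land at the same spots within every wave cycle only when the sample rate divided by the test frequency is a whole number - at any other frequency, the steps drift through the cycle and nothing is there to trigger on consistently.

```python
fs = 44100.0  # sample rate (illustrative)

def steps_repeat(f, tol=1e-9):
    """True when the sample positions within the wave cycle repeat each cycle,
    i.e. when fs / f is (within tolerance) a whole number of samples."""
    samples_per_cycle = fs / f
    return abs(samples_per_cycle - round(samples_per_cycle)) < tol

print(steps_repeat(441.0))   # True : exactly 100 samples per cycle, fixed steps
print(steps_repeat(1000.0))  # False: 44.1 samples per cycle, steps drift
```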
So yes, give me a new toy like that extreme high bandwidth analogue scope and it doesn't take minutes to discover such a thing. This, while it is so logical ... (now that I know it). But do that with a digital analyzer, and it will apply some math, make something of it, and then plot something for you.
Well, didn't I tell you.
And never ever have I read anything about this, so ... new again.
But it sadly means that no jitter can be measured at the output of a DAC except through capacitors, and thus through the very means of digital filtering which influences exactly that - so it won't say a thing about the real jitter.
The same of course goes for my story about the FPGA being behind the oscillator(s), but I can always hop over to the bit clock pin of the D/A chip. But later.
Hey Alain, was this somewhat of a suitable answer?
Best regards and thanks for the 100% justified good questions and pose,
Peter