

MQA pt II

However, as a 'moral' point, I regard some of the current use of patents to be wrong, just as I regard copyright details, like extending coverage of written works for such long periods, to be unreasonable, etc.

I would agree with this, actually. The copyright extension was unnecessary and, in my view, wrong. As a pro-European, it pains me to admit that this change came out of the EU in Brussels; the UK previously had a significantly shorter period.

Some of the worst offenders in terms of abusing the system are the pharmaceutical companies who indulge in the practice of ever-greening original patents with follow ups of dubious validity. I spent many a happy year helping generic companies design around these or have them invalidated.
 
And in this case the answer seems to be no, because MQA uses crypto to authenticate the decoder, to prevent open-source advocates from doing to them what was done to HDCD (that's what Bob said publicly recently). A clean-room decoder will not handshake with the code. I guess that can be broken too, but that seems legally dubious...
This doesn't make sense. MQA as deployed today uses cryptography only to "authenticate" files. The public keys for this are known (https://code.videolan.org/mansr/mqa/-/blob/master/mqa-keys.c). Not that they are essential anyway. A decoder could simply ignore the authentication bits.
 
Not that they are essential anyway. A decoder could simply ignore the authentication bits.

I think that this is what Auralic's MQA decoder does.

Incidentally, they refer to it as "capable of decoding MQA" and they have not been sued, as far as I'm aware.
 
More holidays awarded. I'm going to need one soon...

I'm not going to lock the thread at this point as discussion is good. Cyclic arguments and ad hom are not.
 
EDIT:

It would be a pity to lock the thread now. Once the hurly burly's done, I would suggest that the most pertinent posts of the two threads be gathered in a condensed reference thread on MQA. It would be a shame to have the best posts drowned in the widening gyres of irrelevant tit for tat posts.

-----------------------------

On ASR, Golden One has published quite a long explanation of what has transpired between him and ASR. Muddied waters.

It would be great to have Golden One get in touch with Jim to discuss the 2L files and more. But I suppose he is kept very busy dealing with all the wanted and unwanted attention his videos are getting.

I am delighted that there may be a possibility to use the proprietary and compatible Auralic decoding process to unfold MQA (here: https://us.auralic.com/pages/auralic-vs-drm). From there anything's possible. Also, Archimago, Golden One and Jim are slowly unfolding the truth of what MQA really is. It is a bit like a thriller, with the added benefit that what started as informed conjecture is slowly turning out to be very close to the truth.

Still, I am baffled by the calumnies and vigour involved when some people try to defend MQA as a "lossless" file. It is neither "mathematically lossless" nor "perceptually lossless" but rather some third type of lossless. Whatever it is, it is certainly not lossy. The story also goes that MQA never claimed the MQA codec to be (mathematically) lossless, not even when it said "lossless" in their original patent and on their web page (FAQs) until this year. Lossless as distinct from lossy.

Jim is ridiculed for his methodical work because it contained a few typos. Anyone who starts posting about MQA because they are interested in getting to the bottom of what MQA is becomes fair game for hair-splitting and belittlement. If MQA is what it says on the tin, then go public and prove it to the masses and audiophiles alike. Let MQA be examined in its folded and unfolded states. All this secrecy and disruption walks like a duck and quacks like a duck, but are MQA files real ducks after all?

Soon we will know, I hope, when Jim, Golden One or someone else manages to examine and analyse the whole MQA process from original master file - via various filter treatments - to MQA encoding, to MQA decoding and more.
 
It would be a pity to lock the thread now. Once the hurly burly's done, I would suggest that the most pertinent posts of the two threads be gathered in a condensed reference thread on MQA. It would be a shame to have the best posts drowned in the widening gyres of irrelevant tit for tat posts.

This thread has been a hamster wheel for weeks. Very little new meat has been fed into the goulash in that time. At a certain stage it needs to be put out of its misery.

But I would say that for all the heated and not so heated words that have passed, this from @March Audio summarises everything anyone needs to know about the CON argument succinctly. What is the PRO argument in three sentences?

Any rational, objective, subjective or user benefit analysis concludes MQA is a solution looking for a problem. It's obviously a land grab attempting to monetise the music distribution and hardware chain. Technically it just damages the recorded signal. There isn't anything about it that benefits the consumer. Nothing, Nada, zip.
 
Is it possible to have MQA de-blurring applied to plain 16/44.1, with no MQA metadata (authentication and MQA signalling), just for playback on a generic DAC?
 
Is it possible to have MQA de-blurring applied to plain 16/44.1,

If with blurring you mean the time-symmetric ('acausal') shape of the impulse response of the 22.05kHz anti-aliasing filter that is used in the production of CDs, and if you accept that this blurring is a real issue (and in my view there is not a shred of evidence for this), then you have to look at Meridian's top-flight CD player, the 808.2, to see how Bob Stuart tackled the problem there.

Here is the Stereophile test https://www.stereophile.com/cdplaye...re_reference_cd_playerpreamplifier/index.html, but sadly JA omitted the more interesting things from his measurement regime. PM did better at Hifi News, but you have to register to access the data (June 2008): http://www.milleraudioresearch.com/avtech/index.html.

So what did Meridian do to get rid of that pesky (or not) filter pre-ringing that is embedded in nearly all CDs?

They use a minimum-phase low-pass filter in their oversampler, cutting in lower than 22kHz. Minimum phase means that it has non-linear phase distortion, but also that it has no pre-ringing, only post-ringing. By cutting below 22kHz it removes the ringing artefacts of the 22kHz AA filter used in the majority of CDs' production, but of course it imposes its own, different, artefacts. Meridian named this an 'apodising' filter, borrowing from optics, where a light path is more heavily attenuated as one moves towards the rim of the lens.

Anyone can do to their CD signals what Meridian did at the replay stage, and anyone can do the same as part of the recording process instead. All one needs is a tool like the fully configurable resampler in iZotope, and these days probably a zillion other software packages. Set it for a high-order low-pass at 20kHz or so, minimum phase, and let rip.
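The 'set it for a high-order low-pass, minimum phase' recipe above can be sketched in a few lines. This is only an illustration with scipy (the tap count and cutoff are my own illustrative choices, not Meridian's actual design): design a conventional linear-phase low-pass, convert it to minimum phase, and check where the impulse response peaks. Energy before the peak is pre-ringing; the minimum-phase version has its peak right at the start.

```python
import numpy as np
from scipy.signal import firwin, minimum_phase

fs = 44100                       # CD sample rate
# High-order linear-phase low-pass, turning over around 20 kHz
h_lin = firwin(401, 20000, fs=fs)
# Convert to minimum phase: approximately the same magnitude response,
# but all the ringing moved to *after* the main tap
h_min = minimum_phase(h_lin)

# Linear phase: peak in the middle (everything before it is pre-ringing).
# Minimum phase: peak at the start, so no pre-ringing.
print(np.argmax(np.abs(h_lin)))   # 200, the centre of the filter
print(np.argmax(np.abs(h_min)))   # near 0
```

Any resampler that offers a minimum-phase mode is doing something of this flavour internally.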

Salient detail 1: with this style of filter Meridian did nothing other than emulate the late-70s/early-80s analogue anti-aliasing filters of digital recording systems like the Sony PCM-1630, the same filters that everyone railed against at the time. Moreover, Meridian were not first: one of the filters in the Marantz CD-7 did pretty much the same, only Marantz made no brouhaha about it.

Salient detail 2: Ayre reacted to Meridian's filter by astutely pointing out that it still had plenty of post-ringing, proposing a much shorter filter, also minimum phase, and deploying that in their products. Ayre named theirs 'apodising' too, not grasping that due to its shortness it would lack the stopband rejection to actually cut out the recording-side ringing.

This brings us back to the use of MQA at CD rates ...

First it must be made clear that MQA-for-CD-rate is an afterthought. It was never part of the original MQA idea, the spark that made Craven and Stuart devise a method for packing a high-resolution (i.e. 88.2kHz or more) signal into something more or less compliant with base-rate (i.e. 44.1kHz or 48kHz) PCM. So baseline CD is not MQA's core. It was added to the MQA story, presumably in order to infect more music and extract more money. After all, the high-res market and music pool is very limited, and shareholder value must be created.

When I ran my tests of MQA a couple of years ago, using Tidal, a Meridian Explorer2, and digital captures of raw MQA data as well as Tidal-unfolded data, it became clear that for baseline CD rate MQA does not use the Meridian 808.2 approach. What MQA does is much closer to the Ayre leaky-filter style, i.e. clearly suboptimal in each and every respect. I theorised about the why of this back then, but as it was a long time ago I have forgotten much. Maybe I'll revisit my old files. Maybe not.

Now looking today at GO's baseline files I wonder what is going on. But it is hard to see, as GO's masters clearly overloaded the MQA encoder, yielding a veritable dog's breakfast.
 
Is it possible to have MQA de-blurring applied to plain 16/44.1, with no MQA metadata (authentication and MQA signalling), just for playback on a generic DAC?
I think it's essential to entertain the possibility that "de-blurring" may owe more to good marketing than to good engineering.

I see a lot of "technical" discussion in social media that seems to me to be fundamentally mis-directed, when social media are more often used these days for persuasion than enlightenment.
 
I asked because I was listening to an album available as HiFi on Tidal, with no equivalent Master version of it, and my observation is that the sound is MQA-like.

That was my first impression.
 
So what did Meridian do to get rid of that pesky (or not) filter pre-ringing that is embedded in nearly all CDs?


Anyone can do to their CD signals what Meridian did at the replay stage, and anyone can do the same as part of the recording process instead. All one needs is a tool like the fully configurable resampler in iZotope, and these days probably a zillion other software packages. Set it for a high-order low-pass at 20kHz or so, minimum phase, and let rip.


This brings us back to the use of MQA at CD rates ...

First it must be made clear that MQA-for-CD-rate is an afterthought. It was never part of the original MQA idea, the spark that made Craven and Stuart devise a method for packing a high-resolution (i.e. 88.2kHz or more) signal into something more or less compliant with base-rate (i.e. 44.1kHz or 48kHz) PCM. So baseline CD is not MQA's core. It was added to the MQA story, presumably in order to infect more music and extract more money. After all, the high-res market and music pool is very limited, and shareholder value must be created.

When I ran my tests of MQA a couple of years ago, using Tidal, a Meridian Explorer2, and digital captures of raw MQA data as well as Tidal-unfolded data, it became clear that for baseline CD rate MQA does not use the Meridian 808.2 approach. What MQA does is much closer to the Ayre leaky-filter style, i.e. clearly suboptimal in each and every respect. I theorised about the why of this back then, but as it was a long time ago I have forgotten much. Maybe I'll revisit my old files. Maybe not.

Now looking today at GO's baseline files I wonder what is going on. But it is hard to see, as GO's masters clearly overloaded the MQA encoder, yielding a veritable dog's breakfast.

I'll add some comments, etc...

Firstly, as my 'page 1' on this points out, it is more common than people might assume for the content of a CD to be low-pass filtered with a turnover around 20kHz. This means that if they used a textbook modified-sinc, that's largely what you get *even if* you use a fancy 'short' filter. Convolution rulz! :)
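The convolution point is easy to show numerically. A tiny sketch (scipy/numpy assumed; both filters are my own illustrative stand-ins): cascade a long, sharp production-style filter with a short, leaky replay filter, and the combined impulse response is dominated by the long one, because the responses convolve.

```python
import numpy as np
from scipy.signal import firwin

fs = 44100
h_production = firwin(511, 20000, fs=fs)   # long, sharp production-side filter
h_replay = firwin(15, 21000, fs=fs)        # short, leaky replay filter
combined = np.convolve(h_production, h_replay)

# The combined response is as long as both together: the short replay
# filter cannot shorten ringing already baked into the content.
print(len(combined))        # 511 + 15 - 1 = 525
```

In other words, a fancy short replay filter only changes the result if the content was not already filtered hard at the recording end.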

FWIW I'd be interested in knowing more about what you did wrt your later comments copied above. I'm still looking to get an MQA DAC and capture its output with an ADC, but am currently hitting the snag that, since I use Linux, I need to know before I buy that the device *is* USB Audio Class compliant. If it ain't, I can't use it. All my software runs on Linux or RO, which need class compliance.

The last point - and the 'justifications' from MQA - have prompted some thought wrt the basic presumptions behind what 'Music' must be like. I've started to get a possible handle on this which might formally show their model is flawed, but I'm still pondering.

However some hints may stimulate others to think about this.

1) The long-term stats of the fluctuations of traffic flows in/out of a large city have a form similar to the distribution of galaxies.

2) The coastline of Britain is infinitely long.

3) The usual measurement trick of integrating to get an average with improved SNR fails when you encounter 1/f^n noise.

Now consider the statistical analysis of 'music' over many examples...
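Hint 3 is easy to demonstrate numerically. A minimal sketch (numpy assumed; the FFT-shaped noise generator and block sizes are my own illustrative choices): for white noise, the variance of an N-sample average falls as 1/N, but for 1/f noise it falls far more slowly, so longer averaging buys almost nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_over_f(n):
    """Generate 1/f ('pink') noise by shaping white noise in the FFT domain."""
    spec = np.fft.rfft(rng.standard_normal(n))
    f = np.fft.rfftfreq(n)
    f[0] = np.inf                    # zero out DC rather than divide by zero
    x = np.fft.irfft(spec / np.sqrt(f), n)
    return x / x.std()               # normalise to unit variance

def var_of_block_means(x, block):
    """Variance of the means of consecutive blocks of length `block`."""
    blocks = x[: len(x) // block * block].reshape(-1, block)
    return blocks.mean(axis=1).var()

n = 1 << 20
white = rng.standard_normal(n)
pink = one_over_f(n)

for block in (64, 4096):
    print(block, var_of_block_means(white, block), var_of_block_means(pink, block))
# White: the variance drops by roughly the 64x block-length ratio.
# Pink: far less improvement, because the low-frequency power survives averaging.
```

The same effect is why long-term statistics of signals with 1/f-like spectra (arguably including music) resist the usual 'average it out' instinct.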

BTW geometry also comes into this. So the much fabled 'magic triangles' may have their uses. :)

All that said, I'm currently working on more mundane aspects like filter analysis and inversion. But the above are more long term thoughts. GO's results remain useful - to some extent *because* they hit the encoder so hard. His 44k version is also useful because it *didn't*.

As usual: Do not hold yer breath. And there will be typoos... 8-]
 
I asked because I was listening to an album available as HiFi on Tidal, with no equivalent Master version of it, and my observation is that the sound is MQA-like.

That was my first impression.

This can be hard to tell at present. You may find a spectrum shows a 20kHz low-pass filter shape, but that may mean it is a filtered version of the MQA version, not a plain downconversion. However, if there is no sign of such a filter, then it could be 'full width' plain LPCM, or LPCM with added MQA low enough not to show an HF noise 'bump'. (These may show up on a longer spectrum average, though, if the amount of MQA data is more than trivial.)
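The 'longer spectrum average' above is just Welch averaging over the whole track. A minimal sketch (numpy/scipy assumed; synthetic noise stands in for a captured track here, and the filter is an illustrative stand-in for a production low-pass): averaging many FFT segments pushes the noise floor of the estimate down far enough to make a 20 kHz shelf, or a gentle HF bump, clearly visible.

```python
import numpy as np
from scipy.signal import welch, firwin, lfilter

def average_spectrum_db(x, fs, nperseg=65536):
    """Long-run averaged power spectrum in dB (Welch's method)."""
    f, pxx = welch(x, fs=fs, nperseg=nperseg)
    return f, 10 * np.log10(pxx + 1e-30)

# Synthetic stand-in for a track: white noise sharply low-passed at 20 kHz,
# as one might see in a file filtered down from an MQA version.
fs = 44100
rng = np.random.default_rng(1)
x = rng.standard_normal(fs * 30)                 # 30 'seconds' of noise
x = lfilter(firwin(801, 20000, fs=fs), 1.0, x)

f, db = average_spectrum_db(x, fs)
# The averaged level above 21.5 kHz sits far below the passband level,
# revealing the 20 kHz filter shape.
print(db[f < 19000].mean() - db[f > 21500].mean())
```

On a real capture, one would load the decoded PCM into `x` and look for either the filter shelf or a rising noise bump above ~18 kHz.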
 

Meridian named this an 'apodising' filter, borrowing from optics, where a light path is more heavily attenuated as one moves towards the rim of the lens.

BTW I'm not sure they can claim to have 'invented' that label. It just means the filter removes Gibbs-like behaviour, and it was a common term in areas like FT spectrometry long before audio CD. My first uses of it were in ye olde Concorde Eclipse project when I did the electronics for the interferometer.
(first page of 3 here if interested:

http://jcgl.orpheusweb.co.uk/history/concorde/ChaseTheSun.html

...and yes, I look(ed) awful. :) )
 
Jim, I did this in Feb 2017 and apparently I did not take many notes, or I just deleted them after losing interest.

I captured the MQA-unfolded output from Tidal in the digital domain with Audacity. I also captured the fully MQA-decoded output of the Explorer2 using a Tascam recorder running at 192kHz (a DV-RA1000; these days I use the more handy DA-3000). But a warning with the latter approach: as you know, nearly all audio ADC chips these days are delta-sigma and have shaped quantisation noise rising rapidly above 20kHz. This of course obscures exactly that which one wants to investigate in an MQA capture!

It is much cleaner to capture the first unfold in the digital domain, and then add oversampling with MQA-rendering filtering oneself afterwards. Mans published the render filters that are used in the wild.
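The 'do the rendering yourself' step amounts to oversampling the captured first unfold with an FIR of one's choosing. A sketch (numpy/scipy assumed; `render_fir` below is a placeholder low-pass I made up, NOT one of the published MQA rendering filters, and the test tone is synthetic):

```python
import numpy as np
from scipy.signal import upfirdn, firwin

def render(x, fir, up=2):
    """Oversample x by `up` and apply the FIR in one polyphase step.
    The `up` gain factor compensates for the energy lost to zero-stuffing."""
    return upfirdn(up * np.asarray(fir), x, up=up)

# Placeholder coefficients standing in for a real rendering filter
render_fir = firwin(32, 0.45)

# Synthetic 1 kHz tone standing in for a captured 88.2 kHz first unfold
unfold = np.sin(2 * np.pi * 1000 * np.arange(8820) / 88200)
out = render(unfold, render_fir)
print(len(out))          # roughly 2x the input length, plus the filter tail
```

With the actual published coefficients substituted for `render_fir`, this reproduces the rendering stage entirely in the digital domain, sidestepping the ADC's shaped noise.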


Yes, Britain's coastline is infinitely long. But the border of Colorado is not.

If magic triangles are good, then magic hexagrams might be better.
 
(OT)

My first uses of it were in ye olde Concorde Eclipse project when I did the electronics for the interferometer.

You will remember that I told you that we visited that very Concorde three years ago!
 

