This may be so (I certainly agree that record companies were 'supposed' to use RIAA), but given that using the 'wrong' EQ adds distortion to an already distorted signal, and given that as hi-fi nerds we're alert to distortion, it's not a huge leap to conclude that if a non-RIAA curve sounds more natural (instrumental tonality being the most obvious victim), it's a decent bet that that's the appropriate EQ curve. We can't know for sure, but there are other things we set by ear (VTA, VTF), so it's not without precedent.

If solo violin sounds screechy and thin, I know it doesn't sound like that in real life, or on other records. I could blame the pressing, the mastering, or the system; but if a different EQ curve fixes it, isn't that the likely culprit? I've heard a couple of preamps/phono stages that offer different curves (Decca, Columbia, and RIAA, most obviously) and it's not a subtle difference, IME.
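For anyone curious about the scale of the thing: the standard RIAA playback (de-emphasis) curve is defined by three time constants, 3180 µs, 318 µs and 75 µs, and a quick sketch shows how much gain is in play across the band. This isn't from the post above, just a rough illustration (the function name `riaa_gain_db` is mine); alternative curves like Decca ffrr and Columbia use different constants, which is why a mismatch shifts tonal balance audibly:

```python
import math

# RIAA playback time constants, in seconds (3180 us, 318 us, 75 us).
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_gain_db(f, ref_hz=1000.0):
    """RIAA playback gain in dB at frequency f, normalised to 0 dB at ref_hz."""
    def mag_db(freq):
        w = 2 * math.pi * freq
        # One zero (318 us) over two poles (3180 us, 75 us):
        # bass boost below ~500 Hz, treble cut above ~2122 Hz.
        mag = math.sqrt(1 + (w * T2) ** 2) / (
            math.sqrt(1 + (w * T1) ** 2) * math.sqrt(1 + (w * T3) ** 2)
        )
        return 20 * math.log10(mag)
    return mag_db(f) - mag_db(ref_hz)

# Relative to 1 kHz, the curve spans roughly +19.3 dB at 20 Hz
# down to about -19.6 dB at 20 kHz.
for f in (20, 100, 1000, 10000, 20000):
    print(f"{f:>6} Hz: {riaa_gain_db(f):+6.2f} dB")
```

With almost 40 dB of total swing across the audible band, even a few dB of divergence between the cutting curve and the playback curve is well within the audible range, which squares with the "not a subtle thing" observation above.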