aptX Lossless Bluetooth audio is coming

From a technical perspective this is really interesting, but given the enormous number of variables, including the wetware (aka the person at the ultimate end of the chain), I have doubts about real-world measurable/tangible benefits. In other words, we seem to be approaching the territory of "oxygen-free" speaker cables, or even "Corning" versus generic glass in optical cables.

And for that matter, I'd much rather they focus on reliability and power consumption, as things like interference-caused skips and dropouts, as well as faster battery drain, are real-world issues for far more people.

Hands on with AptX Lossless, the new tech promising CD-quality audio over Bluetooth - The Verge

This would be great, but I'm fully expecting Apple not to implement this, just like they never implemented the prior aptX audio protocol, even though it was vastly better than standard Bluetooth audio.

Side note: for the first time in months, yesterday I sat down and played an actual CD (gasp) using a decent receiver ("amp") and some decent speakers, and it was… nice. Like "I should do this more often" kind of nice. I've been using AirPods Pro, which are great, but there's something different about 'room scale' sound.


Well, of course they counter with their own ALAC alternative, arguing, for instance, that it's more efficient data-wise.

Though I do remember being at a "shoot out" several years ago at CES where Klipsch was talking about their first Bluetooth audio products, and the net was that virtually no one present could reliably identify differences between audio streams above 256 kbps, regardless of whether it was AAC or MP3.

And thus my reference to oxygen-free speaker cables.


Fair enough, though the bigger improvement for me was probably upgrading those 128 kbps MP3s I was using. :joy:

For me as well. I was an early iTunes adopter, going back to the days when the default was 128 kbps and files still had DRM, and I signed up for iTunes Match specifically because it would "upgrade" any files in your personal library below 256 kbps to 256 kbps versions.

That being said, I can't reliably tell the difference between 256 kbps AAC and ALAC.

OTOH audio and video perception and processing is incredibly complex and also personal, so I don’t doubt others who say they can tell a difference.

To use myself as an example: while I can't parse differences above 256 kbps in audio, I seem to have a specific idiot-savant skill when it comes to displays, in that I can fairly reliably tell you whether a display panel is OLED, IPS, VA, or TN with just a quick glance at it.

To this day I'm not exactly sure how I have this skill, but my former boss liked to use me as a party trick at trade shows, having me identify panel types.

Obviously, there is something specific/weird in my visual acuity. And the downside is that I also immediately see things like panel flaws or incorrectly calibrated displays, and it drives me nuts. My wife vowed never to go TV shopping with me again…

You have wetware in the equation. Human eyes have objective (variable) limits but the brain can add an amazing amount with training/experience.

I have the same kind of thing going on with display resolution. Decades of working in the graphic arts field, especially on presses themselves, trained me to instantly detect line art and text jaggies, color dot mis-registration, etc. So I pay the price for "useless" 2K or 4K screens on laptop-sized devices, because lower resolution often bothers me. (I can usually live with it for games or video.)


Haha, I went through the same iTunes Match rigmarole. One eye opener (ear opener) is that a given bit depth may seem plenty high for smoothly representing a loud sine wave, but for very low sound levels (e.g. a quiet passage of only a few string instruments) that same bit depth may be woefully low for properly reproducing those much smaller sound waves. I'm sure these days there's dynamic range adaptation or something like that, but early me didn't understand that softer sounds could be hard to reproduce in a given audio format. Same thing with faithfully reproducing brightness levels in a very dark scene.
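The quiet-passage point can be sketched numerically: a full-scale sine uses all ~65,000 levels of a 16-bit quantizer, while a sine at -60 dBFS only spans about 33 of them, so the relative quantization error is far larger. A minimal illustration (the 48 kHz sample rate, 440 Hz tone, and amplitudes are arbitrary example values, not anything from a specific codec):

```python
import math

BIT_DEPTH = 16
LEVELS = 2 ** (BIT_DEPTH - 1)  # 32768 quantization steps per polarity

def quantize(x):
    """Round a sample in [-1.0, 1.0] to the nearest 16-bit level."""
    return round(x * LEVELS) / LEVELS

def snr_db(amplitude, rate=48000, freq=440, n=48000):
    """Signal-to-quantization-noise ratio (dB) of a quantized sine."""
    sig = err = 0.0
    for i in range(n):
        s = amplitude * math.sin(2 * math.pi * freq * i / rate)
        e = quantize(s) - s   # quantization error for this sample
        sig += s * s
        err += e * e
    return 10 * math.log10(sig / err)

loud = snr_db(1.0)    # full-scale sine: roughly 98 dB SNR
quiet = snr_db(0.001) # -60 dBFS sine: roughly 38 dB SNR, audibly grainier
```

Every 6 dB of level you give up costs about one effective bit, which is why quiet passages (and dark scenes, for brightness) are where fixed-depth quantization shows its seams.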

Ahhhhhhh - the joy of having poor eyesight and hearing - I am often satisfied with devices that would cripple real experts like @Desertlap and @Dellaster . I was even satisfied with WQHD for a good 5 years or more after I mustered up $700 for the first one. Of course 4k has since spoiled me, and I do like my Beats Fit Pro, but will never be able to muster up the courage to be a real videophile/audiophile…


Yeah, a great example of that now is HDR, which is IMHO a total cluster**** generally (though on the PC side it's orders of magnitude worse right now): it's both poorly implemented from a technical standpoint and widely misunderstood.

E.g., The Witcher on Netflix is held up by many over at AVS Forum as a crucible for good versus bad HDR processing, when in fact it's very poorly done, unless the intent was to serve as a torture test that virtually no display handles well.
