Hi-resolution audio, especially for streaming. The general idea is that listening to digital audio files with a greater bit depth and sample rate than CD (24-bit/192 kHz vs. 16-bit/44.1 kHz) translates to better-sounding audio, but in practice that isn’t the case.
For a detailed breakdown as to why, there’s a great explanation here. But in summary, the CD format was chosen precisely because its bit depth and sample rate cover the full spectrum of human hearing.
So while “hi-res” audio does contain a lot more information (which, incidentally, means it uses up significantly more data/storage space and costs more money), our ears aren’t capable of hearing it in the first place. Certain people may try to argue otherwise based on their own subjective experience, but to that I say “the placebo effect is a helluva drug.”
Conversely, low-res audio clearly sounds like trash.
Up to a certain point, yes. AAC / Ogg Vorbis / Opus at 192 kbps and above sounds just as good as FLAC in a blind test, though. Even with good equipment.
Yeah, I’m thinking of circa 2000 MP3s. 128k was the good stuff and lower was still common.
Back when a 4 minute song was like 1.5MB so you could fit more music on your 256MB mp3 player because you could not afford an iPod.
Oh yeah. 128k rips from back then were rough. MP3 has gotten somewhat better since then, to be fair. V0/V1 VBR is still perfectly fine to listen to; it’s just not as efficient as the newer codecs.
Yeah they do, although CBR performs noticeably worse than VBR with LAME MP3. As I mentioned elsewhere, MP3 at V0 or V1 VBR sounds just as good as the above. I just personally haven’t used MP3 for years because the newer codecs are more efficient.
All of this is very true, but this is the one point I really disagree with.
We’re in an era where a good-quality rip of a movie can be almost 50 gigabytes by itself. That means for every terabyte of storage, I can store just 20 movies of that size.
Don’t even get me started on television series and how big those can balloon with the same kind of encoding.
An entire collection of FLACs, thousands of albums’ worth, is still less than 500 gigabytes total; in other words, half a terabyte. (My personal collection, anyway.)
I mean, the average size of one of my FLAC albums is around 200-300 megabytes. Even with the larger “hi-res” FLAC files, you’re still nowhere near as obscenely big as movie and television files.
Sure, it takes up more space than an MP3 or a FLAC properly encoded to CD standards (my preferred choice, for the reasons outlined above), but realistically the extra space is negligible compared to other types of media.
Storage, and the energy to operate it, has become incredibly cheap, especially when you’re dealing with smaller files like this.
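To put rough numbers on that comparison, here is a quick sketch. The 50 GB movie and 200-300 MB album figures are the estimates from this thread, not measurements:

```python
# Back-of-the-envelope storage math for the sizes discussed above.
TB_IN_GB = 1000        # decimal terabyte, as drive vendors count it
movie_gb = 50          # one high-quality Blu-ray-sized movie rip
album_mb = 250         # midpoint of the 200-300 MB FLAC album estimate

movies_per_tb = TB_IN_GB // movie_gb
albums_per_tb = (TB_IN_GB * 1000) // album_mb

print(movies_per_tb)   # 20 movies fill a terabyte
print(albums_per_tb)   # 4000 CD-quality FLAC albums fit in the same space
```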
This is true, especially if you are storing files locally. However, even compared to “CD quality” FLAC, a 24/192 album is still going to be around three times larger (around 1GB per album) to download. If everyone switched over to streaming hi-res audio tomorrow, there would be a noticeable jump in worldwide Internet traffic.
I’m personally not OK with the idea of bandwidth usage jumping up over 3x (and even more compared to lossy streaming) for no discernible benefit.
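The roughly-3x download figure is plausible from raw PCM bitrates alone; here’s a quick sketch. The note about compression is an assumption about typical FLAC behaviour, not a measurement:

```python
def pcm_kbps(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> float:
    """Raw, uncompressed PCM bitrate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

cd = pcm_kbps(44_100, 16)      # 1411.2 kbps: CD "Red Book" audio
hires = pcm_kbps(192_000, 24)  # 9216.0 kbps: 24-bit/192 kHz

print(round(hires / cd, 1))    # 6.5x more raw data per second

# FLAC narrows that gap (ultrasonic content tends to compress well),
# which is how real hi-res albums end up "only" ~3x the size of
# CD-quality FLAC downloads.
```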
An extremely reasonable position to take! Even if the increase in energy usage is negligible locally, at scale those small chunks of energy use add up to a much larger one, especially once you include moving all that data across countless networks.
I always bring this up with automobiles: manual roll-up windows versus power windows. Sure, power windows use an extremely small amount of energy, but add up the energy spent by every car’s power windows over the life of every car that has them, and suddenly it’s no longer a small number. Very wasteful, imho.
A 50 GB Blu-ray rip is one that is not re-encoded; that’s a straight rip from the disc.
50 GB for simple dual-layer discs. You can theoretically reach 100 GB with triple-layer discs. The largest BDRip I have is 90 GB, for the Super Mario Bros. Movie.
Edit: UHD Blu-ray only supports dual- and triple-layer discs, not quad. Quad-layer discs do exist, though, with up to 128 GB of capacity.
I’ve always kinda wondered about this. I’m not an audio guy and really can’t tell the difference between most of the standards. That said, I definitely remember tons and tons of ‘experts’ telling me that no one can tell the difference between 720p and 1080p TV at a typical couch distance. And I absolutely could, and so could many of the people I know. I can also tell the difference between 1080p and 4K at the same distances.
So I’m curious if there’s just a natural variance in an individual’s ability to hear and audiophiles just have a better than average range that does exceed CD quality?
Similar to this, I can tell the difference between 30 fps and 60 fps, but not 60 to 120, yet some people swear they can. Which I believe; I just know that I can’t. It seems like these guidelines are averages rather than hard biological limits.
It’s a fair question. Human hearing ability is a spectrum like anything else. However, when it comes to discerning differences in audio quality, the vast, vast majority of people cannot reliably tell high-bitrate lossy from lossless in a double-blind test. And that includes audiophiles with equipment worth thousands of dollars.
Of the tiny minority who can consistently distinguish the two, most can only tell by listening very closely for the particular characteristics of the encoder format, which takes a highly trained ear and a lot of practice.
The blind aspect is important because side-by-side comparisons (be they different audio formats, or 60fps vs 120fps video) are highly unreliable because people will generally subconsciously prefer the one they know is supposed to be better.
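For anyone curious what “reliably” means here: ABX results are usually checked with a binomial calculation of how likely a given score is from guessing alone. A minimal sketch (the 12-of-16 and 9-of-16 trial counts are illustrative, not from any particular study):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Chance of getting at least `correct` out of `trials` ABX rounds
    right by pure guessing (p = 0.5 per round)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(round(abx_p_value(12, 16), 3))  # 0.038: unlikely to be pure luck
print(round(abx_p_value(9, 16), 3))   # 0.402: proves nothing at all
```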
I think hi-res is for professional work. If you’re going to process, modify, mix, or distort the audio in a studio, you probably want the higher bit depth and sample rate to start with, in case you amplify or distort something and end up with an unintended artefact that is audible to humans. But the output can be downsampled back to human-hearing levels before final broadcast.
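The bit-depth half of this is easy to quantify: each PCM bit contributes roughly 6 dB of dynamic range, which is why 24-bit gives engineers mixing headroom even though 16-bit already exceeds what playback needs. A quick sketch:

```python
def dynamic_range_db(bits: int) -> float:
    """Approximate dynamic range of linear PCM: about 6.02 dB per bit."""
    return 6.02 * bits

print(round(dynamic_range_db(16)))  # 96 dB: already covers quiet room to pain threshold
print(round(dynamic_range_db(24)))  # 144 dB: headroom for mixing, not for listening
```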
Of course, if a marketing person finds out there is such a thing as “professional quality”... See also “military spec”, “aerospace grade”.
Yeah, to expand on this: in professional settings you’ll want a higher sampling frequency so you don’t end up with, e.g., aliasing, but for consumer use anything beyond a 44.1–48 kHz sampling rate is pretty much pointless.
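To illustrate the aliasing point: any component above half the sample rate folds back down into the captured band. A minimal sketch of where a pure tone lands after sampling:

```python
def alias_frequency(f_hz: int, sample_rate_hz: int) -> int:
    """Apparent frequency of a pure tone at f_hz after sampling,
    folding around multiples of the sample rate."""
    f = f_hz % sample_rate_hz
    return min(f, sample_rate_hz - f)

# A 30 kHz ultrasonic component captured at 48 kHz folds down to an
# audible 18 kHz tone; at a 96 kHz studio rate it stays where it is.
print(alias_frequency(30_000, 48_000))  # 18000
print(alias_frequency(30_000, 96_000))  # 30000
```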
I think this is a case where certain people simply can’t hear the difference.
I collect video game and movie soundtracks, and the main difference I can hear between 320 kbps and a FLAC in the 1000 kbps range is not straight-up “clarity”, in the sense that an instrument sounds “clearer”, but rather the spacing: the ability to discern where instruments come from is much better in a hi-res file with some decent wired headphones (my pair is $200). All of this likely doesn’t matter much, though, when most users stream via Spotify, which sounds worse than my 320 kbps local files, and people use Bluetooth headphones at lower bitrates because they lack support for better codecs like aptX and LDAC.
It’s for all the pets at home hearing the same audio, now with the original insects and birds outside and the mice in the walls.
True. There’s something to be said for pleasuring any passing bats who might be in the vicinity.
What genres are your bats listening to?
Goth.
Right you are, but don’t go telling everyone, or I won’t be able to quietly download my lossless albums from Tidal, Deezer and Qobuz anymore.
A lot of it will depend on your output device; cheap headphones will wreck audio quality.
I remember the bad old days when .mp3 files for streaming were often 128 kbps (or less!); I could absolutely hear audio artifacts on those, and it got significantly worse at lower bitrates. 320 kbps, though, is both fairly small and, to my ears, indistinguishable from any lossless format.
All you really need to know is the Nyquist frequency of human hearing. That’s a good breakdown for audiophiles, I’m sure, but it is broadly as simple as the Nyquist frequency.
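That argument in code form: a sample rate of fs can represent frequencies up to fs/2 (the Nyquist frequency), and the ~20 kHz figure below is the commonly cited ceiling of young, healthy human hearing:

```python
HEARING_LIMIT_HZ = 20_000  # commonly cited upper bound of human hearing

def nyquist_hz(sample_rate_hz: float) -> float:
    """Highest frequency a given sample rate can represent."""
    return sample_rate_hz / 2

for rate in (44_100, 48_000, 192_000):
    print(rate, nyquist_hz(rate), nyquist_hz(rate) >= HEARING_LIMIT_HZ)
# CD's 44.1 kHz already clears the audible band (22050 Hz > 20000 Hz);
# 192 kHz only adds frequencies nobody can hear.
```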