I identified them with 100% accuracy. The author said they can't, but to me the MP3 versions have noticeable high-frequency artifacts that make the recording sound slightly less clear. Using Sony XM5s.
Acoustic guitar and drums are a good signal: lower quality just sounds hollow/spacey. The most obvious A/B was the Gamma Ray sample, imo (with mid-range Beyer headphones, wired). It's easiest to tell with recordings you know well; for me Steely Dan is a good reference. I rip to FLAC for archiving even though 320 or 250+ VBR is probably 'close enough' unless I'm scrutinizing.
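For anyone who wants to test their own ears, here's a minimal blind A/B sketch in Python (file names are hypothetical; assumes ffmpeg is on your PATH, used to decode both files to WAV so the extension doesn't give the answer away):

```python
import random
import subprocess

# Hypothetical inputs: the same track as lossless FLAC and 320 kbps MP3.
sources = {"lossless": "track.flac", "lossy": "track_320.mp3"}

labels = ["sample_A.wav", "sample_B.wav"]
random.shuffle(labels)

key = {}
for out, (kind, src) in zip(labels, sources.items()):
    # Decode both to WAV so the container format doesn't give the game away.
    subprocess.run(["ffmpeg", "-y", "-i", src, out], check=True)
    key[out] = kind

# Stash the answer key; only look after you've made your guesses.
with open("answer_key.txt", "w") as f:
    f.writelines(f"{name}: {kind}\n" for name, kind in sorted(key.items()))
```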
> I rip to FLAC for archiving even though 320 or 250+ VBR is probably 'close enough' unless I'm scrutinizing.
MP3 is fundamentally flawed and has audible artifacts no matter what the bitrate is. If you use a newer codec (AAC or Opus) you probably won't notice anything.
Part of that might be that you're using them wireless, because then you're compressing the audio twice, which amplifies the artifacts (MP3 -> Bluetooth codec).
The high-frequency "swishiness" is the usual giveaway.
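If you want to roughly reproduce that double hop at home, something like this works (assumes ffmpeg is installed; the Bluetooth leg is only approximated with a second lossy encode, since the real SBC/AAC hop happens in the radio link):

```python
import subprocess

SRC = "track.flac"  # hypothetical lossless source

def run(*args):
    subprocess.run(["ffmpeg", "-y", *args], check=True)

# Hop 1: the file you actually have, e.g. a 320 kbps MP3.
run("-i", SRC, "-c:a", "libmp3lame", "-b:a", "320k", "hop1.mp3")

# Hop 2: stand-in for the Bluetooth link with a second lossy encode.
# (Real BT uses SBC/AAC/LDAC in the radio link; AAC at 256k is an approximation.)
run("-i", "hop1.mp3", "-c:a", "aac", "-b:a", "256k", "hop2.m4a")

# Decode both back to WAV and A/B them against the original.
run("-i", "hop1.mp3", "single_compressed.wav")
run("-i", "hop2.m4a", "double_compressed.wav")
```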
But sadly today most popular music is ruined beyond repair with dynamic compression, not data compression. The craven stupidity of the loudness war may be unequaled in the history of art, and yet even the artists often don't seem to understand what the problem is. You see legendary artists complaining about modern sound quality (Dylan, Neil Young, and so forth) but then cheerleading for absurd sampling rates and bit depth. NO. That isn't the problem. I have 45-RPM records that sound better than their "lossless," "remastered" incarnations on streaming services.
The biggest problem in popular music (and I would say this probably pervades everything but classical at this point) is dynamic compression.
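If you want to put a rough number on it, crest factor (peak level minus RMS level) is a crude proxy for dynamics. This is not the TT DR meter, just a sketch, and it assumes numpy and soundfile are installed:

```python
import numpy as np
import soundfile as sf

def crest_factor_db(path):
    """Rough dynamics proxy: peak level minus RMS level, in dB."""
    audio, _ = sf.read(path, always_2d=True)
    mono = audio.mean(axis=1)
    peak = np.max(np.abs(mono))
    rms = np.sqrt(np.mean(mono ** 2))
    return 20 * np.log10(peak / rms)

# A heavily limited modern master often lands well under ~10 dB here,
# while a dynamic early-'90s master can sit in the mid-teens.
print(f"{crest_factor_db('track.wav'):.1f} dB")
```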
Today “loudness” is an aesthetic choice and good mixers and producers know how to craft a record that is both loud and of good sonic quality.
There is a place for both dynamic records (in the sense of classical or old jazz records) and contemporary loudness aesthetic.
Can inexperienced producers/mixers do a hack job trying to emulate the loud mixes of pros? Yes. The difference comes down to taste and ability to execute with minimal sonic tradeoffs.
Source: I have a long history producing, mixing, and mastering records and work among Grammy winners regularly. Very much in the dirt on contemporary records.
From my own observations and from opinions I've read from industry people, the early '90s were the peak for mastering quality. Digital was well understood but wasn't yet being abused.
Listen to the original pressings of songs like "Creep." That guitar noise punched through because there were still dynamics back then. Music was fun to listen to, especially with headphones. The soundscape of an album sometimes led me to give music a second chance that I might not have bothered with if it didn't sound so good.
Now, even very catchy music is tiresome and quickly abandoned because of dynamic compression. It's fatiguing (if not grating) to listen to. Yes, there are a few exceptions here and there. "Gives You Hell" by the All-American Rejects comes to mind. But in general music sounds like ass now. Take Coldplay... regardless of what you think of the content, this music should sound great. But it's sonically dull trash.
The thing about mastering is that unless you're a part of the production team and get to hear the before/after you'll almost never know what the mastering engineer's contribution actually was. Done well, their role is invisible.
Mastering engineers work with the record they receive from the mixer. It's entirely possible that a smashed (over-limited) record was handed to them by the mixer and approved by the artist; in that case the ME's hands are usually tied. They work with what they receive.
Likewise, the mixer may receive a reference mix (from the producer) that is smashed. The mixer has far more ability to influence the sonics than the ME (waaay more), but they too can have their hands tied if the artist is really attached to the vibe of that rough producer mix.
Professional mixers and ME's are well aware of the negative effects of the loudness wars. It's well understood by any working professional today. Ultimately the buck stops with the record's producer and the artist. They're the ones seeing the project through from beginning to end.
The difference, between a "loud" record that sounds like lifeless trash and a "loud" record crafted with skill, taste, and intention that has depth and impact, falls on them. As I said, amazing "loud" records do exist when all stages of the record's production team are aligned. But it requires restraint and taste from the production team and the artist.
---
You're not wrong that something changed around the mid-'90s. Until the late '80s, records were being mixed primarily for vinyl. The limitations of the medium (namely, the needle would skip out of the groove if you tried to print a loud or bass-heavy mix) kept the loudness in check. You simply COULDN'T make a record that loud; the limitation acted like a speed bump. But perceptual loudness has been an objective of recording engineers since the dawn of recording.
What happened is that in the '90s, digital tools (particularly digital limiting) in combination with digital playback media (CDs) opened the door to squeezing greater loudness and new sonic aesthetics out of records. And these tools have been abused and mixes over-cooked. In some cases that abuse may be the objective.
Today we're well aware of the trade-offs and to some artists it just doesn't matter. They WANT it smashed. It ultimately comes down to restraint, taste, and good technical know-how to get a flavor of loudness that doesn't have too many tradeoffs.
Agreed regarding the audibility of (data-) compressed audio, just put on some classic jazz with trumpets and lots of cymbals and the artifacts are immediately apparent.
Not going to argue with you regarding dynamic compression, but after mastering engineers backed away from the worst excesses of the volume wars in the mid '00s, things have been sounding better to my ears. Dynamic compression can sound good (even in the extreme) if done for artistic effect. For example, here's Beck's "Ramona", where the drums & cymbals have the tar squashed out of them with serious limiting, which to my ears nicely tames the sonics of Joey Waronker's spirited performance while fitting well dynamically into the rest of the song.
https://www.youtube.com/watch?v=e3yZ9OVjzbE
That said, maybe the engineers responsible for some of the worst dynamic squashing could be pressed into TV/film audio service, where in 2026 there are still extreme volume imbalances between on-screen dialogue and everything else (hint: the dialogue isn't loud enough, and everything else, especially crashes and explosions, is wayyy too loud).
Sure, compressing individual elements judiciously is a valid and even necessary choice. But the so-called "remastering" that has ruined our whole pop/rock heritage as represented today on streaming services is a heinous, lazy hack job that spoils people's enjoyment of music... even though they can't put their finger on why.
When I was a little kid, I'd ride my bike to the record store and buy my two or three favorite current songs on 45. I noticed that they didn't sound as "fat" as they did on the radio. So I got an equalizer. But that of course wasn't the answer.
Over time I realized that I liked the sound of the records better. They were more fun to turn up loud. Likewise I realized that the oddly-quiet station on my FM dial (WXRT in Chicago) sounded the best. All because it, like the records, was less dynamically compressed than the other stations.
A huge number of people alive today have never heard good-sounding pop music, which is disgraceful. Near-perfect sound reproduction is within everyone's reach now, but the recordings themselves are ruined before we get them.
It's all even more stupid when you consider that compression could have been (and was) done ON THE PLAYBACK DEVICE. My 1996 Ford CD player has a button on it labeled "Compress."
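That button was presumably doing something in the spirit of this toy feed-forward compressor. This is a sketch of the general technique, not Ford's actual DSP, and it skips the attack/release envelope smoothing a real unit needs (assumes numpy):

```python
import numpy as np

def compress(samples, threshold_db=-20.0, ratio=4.0, makeup_db=6.0):
    """Toy feed-forward compressor: reduce gain above the threshold.

    No attack/release smoothing, which a real compressor needs to
    avoid distortion; this just shows the static gain curve.
    """
    eps = 1e-10  # avoid log(0) on silent samples
    level_db = 20 * np.log10(np.abs(samples) + eps)
    over = np.maximum(level_db - threshold_db, 0.0)
    # Above the threshold, only 1/ratio of the excess level gets through.
    gain_db = -over * (1.0 - 1.0 / ratio) + makeup_db
    return samples * 10 ** (gain_db / 20)
```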
A good point, but if stress is your motivator, it might be better to reframe that and find motivation through something other than stress.
I do wonder if this is in part due to Spotify educating people with their very-much-in-your-face notifications when you set the player to lossless quality mode. They inform you that Bluetooth won't pass the signal with enough fidelity and tell you to go wired.
I don't think many people realized their expensive AirPods/Bose/Sony headphones weren't capable of handling lossless, and they may now feel left out or like they're missing something.
For phones, I think it's just Sony, Asus, and the Chinese brands that support AptX Lossless. Pixels and Samsungs generally don't, since they use Tensor/Exynos instead of Qualcomm Snapdragon SoCs, and Apple definitely doesn't.
The story is even bleaker on the headphone side: Sony prefers their own LDAC codec, so they support that instead of AptX Lossless, a pattern shared by many Asian headphone manufacturers. Many Western brands only support up to AptX HD and AAC, because Apple/Samsung devices have the majority market share. Qualcomm's own site shows only 12 headphones that support AptX Lossless.
Now, my opinion is that LDAC is close enough to lossless that it's probably good enough for Sony and most people (the 1411 kbps of uncompressed 16/44.1 CD audio generally compresses losslessly to under 900 kbps, which is below LDAC's 990 kbps max). Bose does have a headphone that supports AptX Lossless. It's really just the AirPods that are far behind the competition.
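The arithmetic behind those numbers, for anyone who wants to check it (the lossless-compression ratio is a ballpark assumption, not a measurement):

```python
# CD audio: 44,100 samples/s x 16 bits x 2 channels.
cd_bps = 44_100 * 16 * 2
print(cd_bps / 1000)          # 1411.2 kbps uncompressed

# Typical FLAC-style lossless compression lands around 50-65% of that,
# i.e. roughly 700-900 kbps -- under LDAC's 990 kbps ceiling.
print(cd_bps * 0.6 / 1000)    # ~847 kbps as a ballpark
```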
This is a flawed methodology for measuring success.
Solving a case isn't a matter of a single correct search. ALPR is a tool, and a single case could have hundreds of searches associated with it.
As more regulation comes in, as it should, we should get much better auditing data that link each and every search to a specific case. This is evolving quickly at the moment, but ultimately it's up to the public to begin to push for requirements like these.
Currently departments do not necessarily require a case number, as many times a case number has not been created yet.
I think a fairer way to measure success is to look at how much each dollar spent on LE contributes to the whole picture. How much more effective did ALPR make each officer/detective on the force? Generally speaking, these systems are force multipliers and are much more cost-effective than spending on pure headcount. Many departments cannot fill seats even if they wanted to.
Extending that, we don't know whether this prevented costlier or more time-consuming methods of investigation, led to closing cases by arrest or by not pursuing someone found to be innocent, or otherwise increased efficiency by not assigning officers to patrol duties around Flock areas.
I'm 'active threat model' level of anti-surveillance, but it's worthless to try to base anything off such a premature and incomplete picture.
Setting aside the privacy implications (which are obviously very important), it’s like saying “I searched my filesystem and it went through 1,000,000 files. I found the file, but it was 99.999999% ineffective.” So yes, that’s not a valid metric.
Unless they’re saying every failed search is a big problem because of the privacy issues, I guess.
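To make the filesystem analogy concrete (numbers are hypothetical):

```python
files_scanned = 1_000_000
files_found = 1

# The "ineffectiveness" framing treats every non-matching file as a failure.
miss_rate = 1 - files_found / files_scanned
print(f"{miss_rate:.6%} 'ineffective'")   # 99.999900% -- yet the search found the file

# The sensible metric is whether the query succeeded, not the scan count.
searches_run = 1
searches_succeeded = 1
print(searches_succeeded / searches_run)  # 1.0
```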
None of these agencies get your video data without your consent. The feature was designed so they have an easy way to present you with a request for footage.
Unfortunately, some of the information being circulated says the complete opposite.
> None of these agencies get your video data without your consent.
You certainly can't be sure of that. In fact, it is almost certain that these companies provide the data they collect to police and government agencies, often without a warrant.
I'm certain they get your video data without your consent when the agencies have a warrant. I think it's very likely that they won't necessarily require a warrant, either.
Consider the Nancy Guthrie case. The owner wasn't around to give consent, and the camera didn't even have an active subscription, yet law enforcement was still able to recover video from Google's systems.
The only way it could be as you say is if the video was only stored locally without any remote access, or if the video was encrypted with keys only you control. Google clearly is not doing this. I really, really doubt Amazon is.
Yes, for now. But ultimately you have no control or say over these features because you do not own the software or data. You must have pure blind faith that this will be the way it continues to work.
If other people are cool with doing things without any reasons, based on pure trust, that's on them. But that's not gonna be me.
We have variable speed limits on parts of our roads. People commonly exceed the stated limit by 20+ mph since they're used to "full speed" and ignore the instances where it's being reduced.
I welcome our future robot car overlords where all of these problems should in theory be greatly reduced or eliminated entirely.
Where I live it is impossible to exceed the variable speed limit because every other car on the road is doing it.
I agree, a full network of self driving cars which can all move together in a chain will eliminate this problem. I just hope I live long enough to see it.
Rolled oats tend to give me anxiety; I notice the issue with other large amounts of carbs in the AM, too. Anybody else have that issue? I figure it's a glycemic index issue. I don't believe the issue was present with steel-cut.
I can't eat carbs in the morning either; I usually wait until afternoon/early evening before having carbs (and that's the same for rolled oatmeal; sometimes I'll have that for dinner, actually).
Getting an e-bike has gotten me out exercising way more than a regular bike ever did. Being able to dial my effort up and down pushes me further, quite literally, in distance and fitness goals. I'm by no means fit, and one day I did an almost-5-hour, 40-mile ride. I completely used up the battery in that time, and my legs were cooked from the effort. I would never have attempted something like that on a regular bike unless I was fit.