Yeah, I’ve never really heard “weak ETags” come up in any sort of common usage. Honestly, most people skip ETags entirely by embedding content hashes in filenames; that way you sidestep misbehaving proxies serving stale content or dropping headers. With end-to-end TLS that’s rare these days, but I’m sure it still occasionally happens.

And yes, the more robust approach to possibly poorly formatted JSON is to normalize it into the expected format. It’s less about caching and more about ensuring what you serve to your front end is consistent, even if you’re liberal in what inputs you accept. E.g. if someone gives you XML, rather than writing a front end that can handle both XML and JSON, pull the data out of either one and build your own JSON.
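The hash-in-filename trick is just a build step; a minimal sketch (the helper name here is made up, not from any particular bundler):

```python
# Cache busting by fingerprinting: name the asset after a digest of its
# bytes, so any content change yields a new URL and stale caches never match.
import hashlib
from pathlib import Path

def fingerprint_name(filename: str, content: bytes) -> str:
    """Return e.g. 'app.<8-hex-digest>.js' for 'app.js'."""
    digest = hashlib.sha256(content).hexdigest()[:8]
    p = Path(filename)
    return f"{p.stem}.{digest}{p.suffix}"

print(fingerprint_name("app.js", b"console.log('hi');"))
```

Since the URL itself changes whenever the content does, you can serve these files with an effectively infinite cache lifetime and never worry about validation headers at all.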
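The normalize-at-the-edge idea could look something like this sketch, where the record shape (`id`, `name`) is invented for illustration:

```python
# Accept a record as either JSON or XML, always emit one canonical JSON
# shape, so the front end only ever has to understand a single format.
import json
import xml.etree.ElementTree as ET

def normalize_user(raw: str) -> str:
    """Parse a user record from JSON or XML and return canonical JSON."""
    raw = raw.strip()
    if raw.startswith("<"):
        root = ET.fromstring(raw)
        data = {"id": int(root.findtext("id")), "name": root.findtext("name")}
    else:
        loaded = json.loads(raw)
        data = {"id": int(loaded["id"]), "name": loaded["name"]}
    # sort_keys gives byte-identical output for equivalent inputs
    return json.dumps(data, sort_keys=True)

print(normalize_user('<user><id>7</id><name>Ada</name></user>'))
print(normalize_user('{"name": "Ada", "id": "7"}'))
# both print: {"id": 7, "name": "Ada"}
```

The point is that the messiness stays server-side: whatever you’re liberal about on input, the output contract stays fixed.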