The point he's making is: who is actually going to enforce that? If I take something under that license and make changes to it, who is going to know? That's the underlying premise here.
Your point is circular, so let me bring it all around. Suppose I make a 'clean-room' implementation, using an LLM, of software that has a GPL license. How does a court enforce anything if there's no way to know whether my black box used the original software? Does having that software in its training corpus automatically make all of the model's output subject to GPL enforcement? This is essentially the question some courts are attempting to answer right now.
LLMs don't bring anything new to the table here in terms of legal arguments. Look at how people did, and still do, clean-room implementations without any AI involved, and the lengths they go to — typically splitting the work between one team that reads the original and writes a specification, and a second team that implements from the spec without ever seeing the original code.