It really isn’t superior. It’s just the hivemind, annoyed with Plex being stagnant, not open source, etc., that claims it is. At best it has feature parity for some use cases. Don’t get me wrong, it’s neat, but it’s not as polished as Plex.
FWIW, they are cannibalizing their own ads right now with AI summaries, since people will navigate to websites less (in a world where the summaries are useful, which they don’t seem to be at the moment).
That paper has yet to be peer reviewed or released. I think you are jumping to conclusions with that statement. How much can you dilute the data until it breaks again?
Peer review, for all its flaws, is a good minimum bar before a paper is worth taking seriously.
In your original comment you said that model collapse can be easily avoided with this technique, which is notably different from it being mitigated. I’m not saying these findings aren’t useful, just that you are overselling them a bit with this wording.
VPN by Google One shuts down ( 9to5google.com )
Five Men Convicted of Operating Massive, Illegal Streaming Service That Allegedly Had More Content Than Netflix, Hulu, Vudu and Prime Video Combined ( variety.com )
Google Search Is Now a Giant Hallucination ( gizmodo.com )
Google rolled out AI overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.
OpenAI strikes Reddit deal to train its AI on your posts ( www.theverge.com )
Stack Overflow bans users en masse for rebelling against OpenAI partnership — users banned for deleting answers to prevent them being used to train ChatGPT ( www.tomshardware.com )