TikTok’s parent company, ByteDance, has been secretly using OpenAI’s technology to develop its own competing large language model (LLM). “This practice is generally considered a faux pas in the AI world,” writes The Verge’s Alex Heath. “It’s also in direct violation of OpenAI’s terms of service, which state that its model output can’t be used ‘to develop any artificial intelligence models that compete with our products and services.’”
I can’t speak for every jurisdiction, but I’d be hard pressed to see why it wouldn’t be legal in the US, especially under these circumstances. ByteDance is a massive, legally sophisticated corporation, so they can be expected to have fully read and understood the terms and conditions before accepting them. They probably won’t bring a legal challenge, because they know they don’t have a particularly strong legal argument or a sympathetic angle to use.
The issue is that what OpenAI is doing here is blatantly anticompetitive; see the FTC’s guidance on refusal to deal:
https://www.ftc.gov/advice-guidance/competition-guidance/guide-antitrust-laws/single-firm-conduct/refusal-deal
Sorry for the late reply, but this doesn’t really seem like it would come close to triggering any of the US’s neutered antitrust enforcement. OpenAI doesn’t have a monopoly position to abuse, since there are other large firms offering LLMs that see reasonable amounts of usage. This clause is more an effort to stop reverse engineering than to stifle anyone trying to build an LLM.
I doubt it is clear-cut enough to draw enforcement in any case. However, that does not mean the clause is unenforceable.
Such a ban is easy to circumvent. Eventually, the only option MS has is to sue. Then what?
Why would the clause be unenforceable? It doesn’t violate any general principle of contract law. If you intentionally breach terms that don’t violate any existing body of law and don’t run counter to the public interest, a court will have no problem enforcing them against you. They probably wouldn’t sue you or me in our individual capacity if we circumvented the ban; there’s a much greater chance of recovery if they go after a company that is pretty clearly using their service in bad faith. If ByteDance wanted to use OpenAI’s LLM to train its own, it could have negotiated such a license.