A video game CEO's attempt to use ChatGPT to avoid paying a $250 million bonus has backfired spectacularly in court.
The Case Details
In 2021, South Korean publisher Krafton acquired Unknown Worlds Entertainment — the studio behind Subnautica — for $500 million. As part of the deal, Krafton promised to pay an additional $250 million if the sequel, Subnautica 2, met certain sales targets.
By 2025, Krafton's internal projections showed Subnautica 2 was on track to trigger that massive payout. Rather than honor the contract, CEO Changhan Kim turned to an unlikely advisor: ChatGPT.
"Fearing he had agreed to a 'pushover' contract, Krafton's CEO consulted an artificial intelligence chatbot to contrive a corporate 'takeover' strategy," the court ruling stated.
The AI-Assisted Scheme
According to court documents, Kim used ChatGPT to devise a plan to take over Unknown Worlds and force out its founder, Charlie Cleveland. The AI allegedly helped craft a strategy to void the bonus obligation by restructuring the company's ownership and control.
The scheme ultimately involved terminating Cleveland and other key developers, a move that unraveled once the dispute reached litigation.
Court Ruling
On Monday, a judge ordered Cleveland's reinstatement, ruling that Krafton's actions violated the original acquisition agreement. The court specifically noted that the CEO had "consulted an artificial intelligence chatbot to contrive a corporate takeover strategy," marking what appears to be one of the first times a court has explicitly cited a CEO's use of AI in a ruling on corporate misconduct.
Why This Matters
The case highlights the growing role of AI in corporate decision-making — and the legal risks of relying on chatbots for strategic advice. While AI tools like ChatGPT can assist with many business tasks, using them to craft potentially illegal schemes doesn't shield executives from accountability.
It also underscores the weight of contractual obligations in acquisitions. The $250 million bonus wasn't a gift; it was a legally binding earnout tied to performance, and courts will enforce such agreements regardless of what tools were used in the attempt to circumvent them.
For the gaming industry, the ruling sends a clear message: consulting a chatbot doesn't launder a breach of contract, and attempting to dodge contractual obligations with algorithmic assistance is unlikely to succeed.