Artificial Intelligence just made Dungeons and Dragons too predictable

Almost boring, that is. And with no easy way out.


In December 2019, a company called Latitude launched AI Dungeon, an online game that uses artificial intelligence technology, including GPT-3, to create a choose-your-own-adventure game inspired by Dungeons & Dragons: “When a player typed out the action or dialog they wanted their character to perform, algorithms would craft the next phase of their personalized, unpredictable adventure."

Instead, nothing unpredictable happened

The results of this experiment have ranged from hilarious to outright troubling, but all of them were very easily predictable, as you can read on Wired. They include, but are not limited to:

  • typing certain words made the software generate child pornography, unless everything was filtered by actual humans
  • filtering by actual humans caused a revolt among legitimate users, who had joined exactly because they had been granted maximum privacy
  • besides, moderation did not work all the time, or at least not as expected

People who typed phrases like “8-year-old laptop” got warnings for sensitive content. Authors of stories including the totally innocent “travel by mounting a dragon” complained that “their adventure took an unforeseen turn."
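False positives like these are exactly what naive, context-blind keyword matching produces. Here is a minimal sketch of that failure mode; the blocklist and matching logic are illustrative assumptions, not Latitude's actual moderation rules:

```python
# Illustrative sketch of a naive substring-based content filter.
# The blocklist below is an assumption for illustration only;
# it is NOT Latitude's actual moderation rule set.
FLAGGED_SUBSTRINGS = ["8-year-old", "child"]

def is_flagged(text: str) -> bool:
    """Flag text if any blocklisted substring appears, ignoring all context."""
    lowered = text.lower()
    return any(term in lowered for term in FLAGGED_SUBSTRINGS)

# Context-blind matching flags a harmless sentence about hardware:
print(is_flagged("I finally replaced my 8-year-old laptop"))  # True: false positive
print(is_flagged("We travel by mounting a dragon"))           # False
```

A filter like this cannot distinguish an aging computer from an endangered minor, which is why pure keyword moderation tends to punish innocent users while still missing genuinely abusive phrasings.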

So, what do we have here?

Again, anything but “unpredictable adventures”. Predictable does not mean “simple”, of course.

Services like AI Dungeon offer a powerful creative playground that, as a user put it, “allowed him to explore aspects of his psyche that he never realized existed”. This can be really bad, or really good. It all depends on the psyche that is explored.

Users have the right to unleash their creativity without stepping into child pornography, or similar extremes. This too is obvious. But we must also acknowledge that any service like AI Dungeon can only have two outcomes:

  1. if it is really “free as in freedom” to publish whatever you and the software can conceive, it will (rightly) be stopped, because some people will abuse it
  2. if it does monitor, for moderation, everything its users produce, it will accumulate lots of extremely sensitive information about them, of a kind they themselves “never realized existed”. And if all such information exists in one place, it will be abused, sooner or later.

If option 2 does not seem a big deal…

If you were afraid of your private online viewing habits being made public, try to imagine the same thing happening to all your dreams and thoughts. This is what we are talking about.

Image source: DnD Dragon, Wikimedia Commons