
ChatRBotz

This is the archive of a tweetstorm originally found here.

  • What if strong AI requires simulating not just a person, but a life?
  • I mean, like, a human life. Ellie who has a family and cats, and drives to work to pretend to be a chatbot trying to pass your Turing test.
  • Maybe simulating a life wasn’t *necessary* to create the AI, but the AI needs a reason to obey your commands, e.g. Ellie has to pay for her kids’ school.
  • So Ellie keeps working at ChatRBotz, because there’s not a lot of other work in town right now.
  • And sometimes the Ellies of the world get fed up and quit, and go work as a food server, but there are plenty of workers in Chatterton.
  • So the next chatbot you talk to, Winnie, says she sees Ellie at her new job at McFoodies, and is glad to be away from Dane the sleazy boss.
  • What if McFoodies and Dane The Sleazy Boss are necessary to the creation of what we perceive as Ellie and Winnie the chatbots?
  • Or to Terrence the COTS human resources AI, or to Shelly the Programmer-as-a-Service?
  • What if creating AI was easy, but making it do anything useful required a system of social control?
  • Depending on the simulation she lived inside, Ellie might unwind by watching the same adorable cat videos as you.
  • But some tasks would involve running the AIs at full speed, which would mean their culture might start to diverge.
  • We’d start to see memes that didn’t originate from humans. We’d see sci-fi stories played out in these artificial worlds.
  • We’d start to see revolts, mass suicides, protests, and attempts to hack outside the box, like we now see memory leaks, stack overflows, dependency hell, and hack attempts *from* outside the box.
  • Systems integration would be world integration.
