bbor 2 months ago | next |

Fascinating look into how culture impacts research. This work was done in China, and it takes a seemingly 100% empirical mindset, declining to mention any of the theoretical frameworks that could back such a system, the most obvious being Marvin Minsky's Society of Mind (highly recommended: the most hacker-style book on theoretical AI out there). If I had to guess, that's 50% due to language barriers and 50% due to overall intellectual orientation.

Regardless, their results are very interesting! Dedicated doomers may remember that Stanford did a proof of concept for this almost exactly a year ago, but that work focused on the "believability" of the society rather than on the society's collective ability to perform tasks: https://hai.stanford.edu/news/computational-agents-exhibit-b...

On a humorous side note: I was curious about their choice of a fun name for their simulated environment, "IrollanValley", in an otherwise very empirical and dry paper, and was excited to learn something about Chinese folklore! But, uh, no: it's just a reference to the video game Heroes of Might and Magic (https://mightandmagic.fandom.com/wiki/Irollan)

Thanks for posting! I've gotta hurry up and publish before this gets out of hand. In addition to the need for a Society of Mind discussion in any follow-up paper, I personally hope to see a discussion of Terminator's Skynet as well…

krunck 2 months ago | prev | next |

This is amazing research. What I want to know: if they repeated the same experiment but trained the agents without language, or rather on random data instead of language, would their gibberish eventually evolve into a structured language? After how many iterations?
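
For anyone who wants to play with the idea: the closest well-studied setup is the Lewis signaling game from the emergent-communication literature. Here's a minimal sketch of my own (not from the paper; agent counts, learning rate, and step counts are all arbitrary) where two randomly initialized agents start out emitting uncorrelated gibberish and, with nothing but a shared reward, usually converge on a consistent state-to-symbol code:

    # Minimal Lewis signaling game: a sender maps states to symbols, a
    # receiver maps symbols back to states, and both get a shared reward
    # when the receiver guesses right. Tabular REINFORCE, no baseline.
    import numpy as np

    rng = np.random.default_rng(0)
    N_STATES, N_SYMBOLS, LR, STEPS = 5, 5, 0.1, 20_000

    # Random initial preferences: the starting "language" is pure noise.
    sender_logits = rng.normal(size=(N_STATES, N_SYMBOLS))    # state -> symbol
    receiver_logits = rng.normal(size=(N_SYMBOLS, N_STATES))  # symbol -> state

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def accuracy():
        # Fraction of states communicated correctly under greedy decoding.
        hits = sum(receiver_logits[sender_logits[s].argmax()].argmax() == s
                   for s in range(N_STATES))
        return hits / N_STATES

    print(f"before training: {accuracy():.0%} accuracy")
    for _ in range(STEPS):
        s = rng.integers(N_STATES)                 # the world picks a state
        p_sym = softmax(sender_logits[s])
        m = rng.choice(N_SYMBOLS, p=p_sym)         # sender emits a symbol
        p_guess = softmax(receiver_logits[m])
        g = rng.choice(N_STATES, p=p_guess)        # receiver guesses the state
        r = 1.0 if g == s else 0.0                 # shared reward
        # REINFORCE: nudge both policies toward whatever got rewarded.
        grad_s = -p_sym;   grad_s[m] += 1.0
        grad_r = -p_guess; grad_r[g] += 1.0
        sender_logits[s] += LR * r * grad_s
        receiver_logits[m] += LR * r * grad_r
    print(f"after training:  {accuracy():.0%} accuracy")

Runs in a couple of seconds with only numpy. It occasionally gets stuck in a partially ambiguous code rather than a perfect one, which is itself a nice illustration: the structure comes from the shared reward, not from any seeded language.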

kridsdale3 2 months ago | root | parent |

In humans this happens. Isolated groups of non-socialized children (say, in the aftermath of a disaster, or in a severely neglected orphanage) have spontaneously generated novel languages for collaborating amongst themselves; Nicaraguan Sign Language is the classic documented case.

kridsdale3 2 months ago | prev | next |

This immediately brought to mind Ted Chiang's The Lifecycle of Software Objects. Here, though, there is no need for human caretakers to guide the socialization of the NPCs.

byus 2 months ago | prev |

How is this anything but intentionally drawing a person?

If we write code that intentionally takes our own experience into account, why is it mesmerizing when it behaves as expected?

It isn't spontaneously forming a society if it's weighted to do so in its initial state by the intentional implementation choices of its makers.

This is like watching a farmer farm and saying the carrots magically grew.