• 0 Posts
  • 22 Comments
Joined 1 year ago
Cake day: July 8th, 2023

  • Wasn’t this guy just in the news for kicking the infrastructure bill (which was effectively meant to be our try at national-level climate action) repeatedly in the balls? He cut climate and infrastructure spending roughly in half, and then held the whole thing up at the end when his clients/constituents presumably realized that one of the provisions would jeopardize the pipeline deal they’d been working on (which, surprise, happened to run through some frontline communities, etc etc).

  • What they’re getting at (one thing, anyway) is that “indistinguishable to the model” and “the same” are two very different things.

    IIRC, one possibility is that LLMs learning from one another make such incremental shifts in what counts as “acceptable” or “normal” language that, over time, noticeable linguistic drift accumulates without the models themselves ever registering it.

    As this continues, the phenomenon becomes a positive feedback loop: the gap keeps widening, still undetected because the training data itself is what’s degrading, until the models basically “collapse” in effectiveness.

    So even if their output is indistinguishable now, how the tech is used (I guess?) will determine whether or not a self-destructive LLM echo chamber is produced.
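    Since the mechanism here is basically the “model collapse” feedback loop, here’s a tiny toy sketch of my own (not from the linked article, and the numbers are arbitrary): pretend the “model” is just a Gaussian fit to its training data, and each new generation is trained only on samples drawn from the previous generation’s fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: the "human" data the first model is trained on.
data = rng.normal(loc=0.0, scale=1.0, size=50)

for gen in range(20):
    # "Training" = estimating the distribution of whatever data this
    # generation sees (a Gaussian fit stands in for the LLM here).
    mu, sigma = data.mean(), data.std()
    print(f"gen {gen:2d}: mean={mu:+.3f}  std={sigma:.3f}")

    # The next generation trains only on this model's own output.
    # Finite sampling plus re-fitting compounds small errors each round,
    # so the fitted spread tends to shrink and the mean wanders.
    data = rng.normal(loc=mu, scale=sigma, size=50)
```

    Run it and the fitted spread tends to shrink while the mean drifts: each generation looks fine to the one before it, but the distribution quietly walks away from the original data, which is the statistical version of the echo chamber above.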