cross-posted from: https://lemmy.world/post/44699253

This is clearly a sign that the product failed to draw in enough customers and its viability was overhyped.

Hopefully, it is the start of the AI bubble bursting.

  • Sims@lemmy.ml · 1 month ago

    I think it was built on the original transformer architecture, and as such it took a shitload of compute and was slooow, so I guess they picked the wrong architecture and had to scrap the whole strategy. Huge loss… That also points to the US not having enough cheap compute (compute = combined memory and processing), likely due to a lack of electricity. Lovely… Die Saltman, die, and go go China!
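
    To put rough numbers on that (a back-of-the-envelope sketch; the token counts, patch sizes, and model dim below are made-up assumptions, not Sora's actual config): vanilla self-attention costs roughly 4·n²·d FLOPs per layer, and a patchified video turns into far more tokens than a text prompt does.

    ```python
    # Back-of-envelope FLOPs for one vanilla self-attention layer.
    # All numbers are illustrative assumptions, not Sora's real config.

    def attention_flops(num_tokens: int, d_model: int) -> int:
        # QK^T and (softmax @ V) each cost ~2 * n^2 * d multiply-adds.
        return 4 * num_tokens**2 * d_model

    # Hypothetical 10 s, 24 fps, 256x256 clip cut into 16x16 spatial patches
    # and 4-frame temporal patches: (256/16)^2 * (240/4) tokens.
    video_tokens = (256 // 16) ** 2 * (240 // 4)  # 15,360 tokens
    text_tokens = 2048                            # a typical prompt, for scale

    for name, n in [("text prompt", text_tokens), ("short video", video_tokens)]:
        print(f"{name}: {n} tokens -> {attention_flops(n, 4096):.2e} FLOPs/layer")
    ```

    Even that short, low-res clip costs ~56x the attention FLOPs of the text prompt, per layer, which is why a quadratic architecture hurts so much for video.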

    • brucethemoose@lemmy.world · 1 month ago · edited

      Go go China!

      Bops the tankie.

      Like, I have a Chinese LLM loaded right this second and follow them closely, but holy moly. Curb your enthusiasm.

      Anyway, OpenAI has plenty of compute to train a Sora 2 if they want, but apparently they don’t. My guess is some combination of:

      • They couldn’t figure out a more efficient architecture, as you speculated. I buy that. OpenAI’s development is way more conservative than you’d think, and video generation is inherently compute-intensive, especially if Sora 1 is the baseline.

      • …Maybe they looked at the metrics, saw that Sora is mostly used for spam, scams, or worse, and pulled the plug for liability reasons?

      • They’re focusing on short-term profitability, as other commenters mentioned.