• jmcs@discuss.tchncs.de · 1 day ago

    Before we even get close to having this discussion, we would need an AI capable of experiencing things and developing an individual identity. And that runs completely counter to the goals of the corporations that develop AIs, because they want something that can be mass deployed, centralised, and as predictable as possible - i.e. not individual agents capable of experience.

    If we ever have a truly sentient AI, it’s not going to be designed by Google, OpenAI, or DeepMind.

    • Pennomi@lemmy.world · 1 day ago

      Yep, an AI can’t really experience anything if it never updates its weights during each interaction.

      Training is simply too slow for AI to be properly intelligent. When someone cracks that problem, I believe AGI is on the horizon.
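
      A minimal sketch of that distinction, assuming PyTorch (the model, data, and loss below are placeholders, not anything any lab actually ships): a deployed chat model only ever runs the frozen path, while "learning from each interaction" would mean something like the online step afterwards.

          import torch
          import torch.nn as nn

          model = nn.Linear(16, 16)     # stand-in for a real network
          x = torch.randn(1, 16)        # stand-in for one "interaction"
          target = torch.randn(1, 16)

          # How current chat models work: weights are frozen at inference time.
          with torch.no_grad():
              y = model(x)              # this call changes nothing in the model

          # What per-interaction learning would look like: one online gradient step.
          optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
          loss = nn.functional.mse_loss(model(x), target)
          optimizer.zero_grad()
          loss.backward()
          optimizer.step()              # the weights now reflect this interaction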

        • Pennomi@lemmy.world · 19 hours ago

          Artificial General Intelligence, or basically something that can properly adapt to whatever situation it’s put into. AGI isn’t necessarily smart, but it is very flexible and can learn from experience the way a person can.