One of the stupidest things I’ve heard.
I'd also like to attach the Bluesky repost where I found this:
https://bsky.app/profile/leyawn.bsky.social/post/3lnldekgtik27
Before we even get close to having this discussion, we would need an AI capable of experiencing things and developing an individual identity. And this runs completely opposite to the goals of the corporations that develop AIs, because they want something that can be mass deployed, centralised, and as predictable as possible - i.e. not individual agents capable of experience.
If we ever have a truly sentient AI, it’s not going to be designed by Google, OpenAI, or DeepMind.
Yep, an AI can’t really experience anything if it never updates its weights during each interaction.
Training is simply too slow for AI to be properly intelligent. When someone cracks that problem, I believe AGI is on the horizon.
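The distinction being made here - weights frozen at deployment versus weights that change with every interaction - can be sketched with a toy online-learning loop. This is a minimal illustration, not how any real LLM works: all names, the linear model, and the update rule are assumptions chosen for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(2)   # model weights, updated after every "interaction"
lr = 0.1          # learning rate (illustrative value)

def interact(w, x, y, lr):
    """One interaction: predict, observe feedback, update weights.

    A deployed frozen model would do only the prediction step;
    the gradient update is what makes the learning 'online'.
    """
    pred = w @ x
    err = pred - y
    grad = err * x          # gradient of squared error w.r.t. w
    return w - lr * grad

# Simulate a stream of interactions against hidden target weights.
true_w = np.array([2.0, -1.0])
for _ in range(500):
    x = rng.normal(size=2)
    y = true_w @ x
    w = interact(w, x, y, lr)

print(np.round(w, 2))
```

After enough interactions the weights drift toward the target, which is the "learning from experience" the thread is gesturing at - each exchange leaves a trace, unlike a model trained once and then frozen.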
AGI?
Artificial General Intelligence, or basically something that can properly adapt to whatever situation it’s put into. AGI isn’t necessarily smart, but it is very flexible and can learn from experience like a person can.