LLMs are the worst they’ll ever be today. In 90 days they’ll be better. In a year even better.
Maybe they won’t make it to “real” AGI, but they are terrific and useful even if they don’t. And maybe they’ll help us make progress with a different path to AGI.
So much complaining about them, but this is an exciting time to be alive. We are likely to experience real AGI in our lifetimes!
I wish things were slightly less exciting!