
AI's Growing Pains: Will We Hit a Wall Soon?

OpenAI, USA | Saturday, November 16, 2024
For years, the AI world has been excited by how quickly new models have improved, and some assumed that pace would continue indefinitely. Lately, though, there's growing concern that large language models may be hitting a ceiling: they may not keep improving just by applying the same old methods at a larger scale.

Reports from inside OpenAI suggest its next big model, code-named Orion, isn't delivering the same leap over GPT-4 that GPT-4 delivered over its predecessor. On some tasks, it's reportedly no better at all. Ilya Sutskever, a co-founder of OpenAI, agrees that the era of simply throwing more data and compute at models may be over.

What's the problem? New models appear to be running out of good new data to train on. The best text from the internet and books may already have been used, which could slow how quickly models learn new things.

So, what now? It may be time to make AI smarter rather than just bigger. That could mean finding new kinds of data or entirely new ways to train models. In other words, it's back to the drawing board.
