By Sharon Goldman — venturebeat.com — These are the 5 biggest AI stories I’m waiting for:
1. GPT-4
ChatGPT is so 2022, don’t you think? The hype around OpenAI’s chatbot “research preview,” released on November 30, has barely peaked, but the noisy speculation around what’s coming next — GPT-4 — is like the sound of millions of Swifties waiting for Taylor’s next album to drop. If expert predictions and OpenAI’s cryptic tweets are correct, early to mid-2023 will be when GPT-4 — with more parameters and trained on more data — makes its debut and “minds will be blown.” It will still be filled with the untrustworthy “plausible BS” of ChatGPT and GPT-3, but it will possibly be multi-modal — able to work with images, text and other data. It has been less than three years since GPT-3 was released, and only two since the first DALL-E research paper was published. When it comes to the pace of innovation for large language models in 2023, many are saying “buckle up.”
2. The EU AI Act
AI technology may be rapidly advancing, but so is AI regulation. While a variety of state-based AI-related bills have been passed in the U.S., it is larger government regulation — in the form of the EU AI Act — that everyone is waiting for. On December 6, the EU AI Act progressed one step towards becoming law when the Council of the EU adopted its amendments to the draft act, opening the door for the European Parliament to “finalize their common position.” The EU AI Act, according to Avi Gesser, partner at Debevoise & Plimpton and co-chair of the firm’s Cybersecurity, Privacy and Artificial Intelligence Practice Group, is attempting to put together a risk-based regime to address the highest-risk outcomes of artificial intelligence. As with the GDPR, it will be an example of a comprehensive European law coming into effect and slowly trickling into various state and sector-specific laws in the U.S., he recently told VentureBeat. Boston Consulting Group calls the EU AI Act “one of the first broad-ranging regulatory frameworks on AI” and expects it to be enacted into law in 2023. Since it will apply whenever business is done with any EU citizen, regardless of location, it will likely affect nearly every enterprise.
3. The battle for search
Last week, the New York Times called ChatGPT a “code red” for Google’s search business. And in mid-December, You.com announced it had opened up its search platform to generative AI apps. Then, on Christmas Eve, You.com debuted YouChat, which it called “Conversational AI with citations and real-time data, right in your search bar.” To me, this all adds up to what could be a real battle for the future of search in 2023 — I’m already munching on popcorn waiting for Google’s next move. As I wrote recently, Google handles billions of searches every single day — so it isn’t going anywhere anytime soon. But perhaps ChatGPT — and even You.com — is just the beginning of new, imaginative thinking around the future of AI and search. And as Alex Kantrowitz told Axios recently, Google may have to make a move: “It’s game time for Google,” he said. “I don’t think it can sit on the sidelines for too long.”
4. Open source vs closed AI
I’m fascinated by the ongoing discussion around open source and closed AI. With the rise of Hugging Face’s open source model development (the company reached a $2 billion valuation in May), Stable Diffusion’s big summer splash into the text-to-image space, and the first open source copyright lawsuit targeting GitHub Copilot, open source AI had a big, influential year in 2022. That will certainly continue in 2023, but I’m most interested in how it compares to the evolution of closed source AI models. After all, OpenAI shifted to closed source and is now on the brink of releasing GPT-4, arguably the most eagerly anticipated AI model ever — which is certainly a competitive advantage, right? On the other hand, MIT Technology Review predicts “an open-source revolution has begun to match, and sometimes surpass, what the richest labs are doing.” Sasha Luccioni, research scientist at Hugging Face, agreed and added that open source AI is more ethical. She tweeted last week that open sourcing AI models “makes it easier to find and analyze ethical issues, as opposed to keeping them closed source and saying ‘trust me, we are filtering all the bad stuff out.’”
5. Is AI running out of training data and computing power?
Will 2023 be the start of an AI age of creative conservation when it comes to data and compute? The compute costs of ChatGPT, according to OpenAI CEO Sam Altman, are “eye-watering,” while IBM says that we’re running out of computing power altogether: AI models are “growing exponentially,” but the hardware to train and run them hasn’t advanced as quickly. Meanwhile, a research paper claims that “data typically used for training language models may be used up in the near future—as early as 2026.” I’m eager to see how this plays out in the coming year. Will big ultimately not equal better when it comes to data and compute? Will new AI chips designed for deep learning models change the game? Will synthetic data be the answer to the training problem? I’ve got my popcorn ready for this one, too.
Wishing you all a happy new year! I’ll be back in my temporary beachfront “office” on January 2. Until then, enjoy the last week of 2022 and here’s to a happy, healthy new year. As a reminder, I’m on Twitter at @sharongoldman and can be reached at sharon.goldman@venturebeat.com.