Artificial Intelligence in 2024 – Time to Shift Gears
The 2023 headlines might lead you to think AI is something new. In fact, not much new happened in 2023 other than awareness and hype. So, consider this: perhaps 2024 is the year to shift from hype to results.
The origins of AI are far from new. The notion is as old as the computer itself. Charles Babbage (1791-1871) designed the first computers. His “difference engine” was closer to what we would call a “calculator,” but his later Analytical Engine was conceived as a true general-purpose machine. Though Babbage was never able to build a “real” computer, his work included considerations of what we would now call “Artificial Intelligence.”
Today most of our computers use the von Neumann architecture, described by John von Neumann (and others) in 1945. Like Babbage, von Neumann was interested in how to make machines “think.” Along with Alan Turing and others, he laid the foundations for modern AI.
Since then, AI has seen peaks of enthusiasm, followed by valleys of unmet promises. These cycles have been called AI Summers and AI Winters. Today, we are in the third AI Summer. This time around, we may enjoy a rather long summer. Finally, AI has become generally useful across a range of use cases.
Last year it was fun to play with ChatGPT. But most organizations found it too unreliable for “real” work.
Last year, consumer AI recommendation systems spread beyond Netflix and Amazon. However, high error rates still left these kinds of AI unusable for most other purposes (and some would argue often unusable at Netflix and Amazon, too).
And, last year, we saw plenty of drama: OpenAI leadership turmoil and predictions about the end of civilization, White House proposals and an Executive Order for AI regulation, as well as other headline-generating AI events.
So, at Lone Star, we suggest it’s time to shift gears. Here are four AI topics we’d like to suggest you consider for 2024.
First – Get ready for regulation
Despite the recent US Executive Order, there are few prospects for passing any AI laws in Congress. But various agencies will be looking for ways to align their rule-making with President Biden’s order, so some of this will soon be in force in the US. And at some point, perhaps after the 2024 elections, other regulations may become law.
In the EU, things are further along. Just before Christmas, European Parliament negotiators hammered out consensus language for an “AI Act.” Just as the EU was first to formulate comprehensive data-privacy law with GDPR, the AI Act will lay out principles for the use of AI within the EU.
The EU approach has been called “anti-innovation” in contrast to the UK, whose policies are deemed “a pro-innovation approach to AI regulation.” But either way, there will be regulation in the UK as well.
Perhaps the least stringent approach to AI among major economies is seen in Japan. But regulation is coming to Japan, too.
Looking at history, it’s interesting to see what happened with regulation during the first and second AI Summers. In those eras, many innovators simply changed their semantics. When “Expert Systems” were deemed a Cold War risk and placed on the U.S. ITAR export control list, that didn’t stop AI innovation. It only stopped innovators from calling their AI “Expert Systems.”
So, we expect years of uncertainty and semantic confusion as AI regulations take hold.
Second – Look for new forms of Generative AI
ChatGPT made large language models the public face of Generative AI in 2023. But there are many other forms of Generative AI. Some are “narrow” and only apply to one topic, but these are often “deep.” They can far outperform their general-purpose cousins.
Here is what you get when DeepAI is prompted to generate an image for “an AI generator for guided missile design”:
At Lone Star, one of our Generative AIs can actually automate the first-order design of aero systems, like guided missiles. Our system does not generate cherubs, but it doesn’t get confused about the number of fingers (or control fins) either.
We also expect to see a great deal of news about “Orchestration.” While it may not be a new “form” of GenAI, more than $1B has been invested in firms that are leading “orchestration projects” to integrate and manage generative AI solutions, sometimes blending multiple models. Orchestration will “feel” like a new kind of GenAI because it will offer solutions, unlike the systems that made the 2023 headlines.
One goal of orchestration will be the creation of narrow GenAI LLMs, which will “know” a great deal about specific topics and (hopefully) be more reliable than general-purpose LLMs.
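To make the idea concrete, here is a minimal sketch of what an orchestration layer might look like: a router that sends each prompt either to a narrow, domain-specific model or to a general-purpose LLM. The model functions and keyword routing below are hypothetical placeholders for illustration, not any particular vendor’s API.

```python
# A minimal sketch of GenAI "orchestration": route each prompt to the most
# suitable model, falling back to a general-purpose LLM when no narrow
# model claims the topic. Model names and the keyword router are hypothetical.
from typing import Callable, Dict

ModelFn = Callable[[str], str]

def narrow_aero_model(prompt: str) -> str:
    # Stand-in for a domain-specific model trained only on aero design data.
    return f"[aero-model] answer to: {prompt}"

def general_llm(prompt: str) -> str:
    # Stand-in for a general-purpose LLM.
    return f"[general-llm] answer to: {prompt}"

# The orchestrator's routing table: topic keywords -> narrow model.
ROUTES: Dict[str, ModelFn] = {
    "missile": narrow_aero_model,
    "airframe": narrow_aero_model,
}

def orchestrate(prompt: str) -> str:
    """Dispatch to a narrow model when the topic matches, else to the general LLM."""
    lowered = prompt.lower()
    for keyword, model in ROUTES.items():
        if keyword in lowered:
            return model(prompt)
    return general_llm(prompt)

if __name__ == "__main__":
    print(orchestrate("Estimate control-fin sizing for a small missile"))
    print(orchestrate("Summarize this quarterly report"))
```

In real orchestration stacks, the routing step is often itself a model, and the orchestrator may also handle guardrails, retries, and blending of several models’ outputs.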
Third – Look at the emergence of Synthetic Data Generators
The lack of clean, labeled data is hindering AI deployment in many use cases. Nowhere is this more acute than in defense applications.
So, in 2024 expect to see the maturation of SDGs – Synthetic Data Generators. As Forrest Gump said, “That’s all we have to say about that.” But look for Lone Star to have a lot more to say later this year.
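For readers who want a feel for the concept anyway, here is a toy sketch of the idea behind synthetic data generation: fit a simple statistical model to a small set of real, labeled samples, then draw as many new labeled samples as you need. The two-class Gaussian setup is an illustrative assumption only, not Lone Star’s actual SDG approach.

```python
# A toy synthetic data generator (SDG): when labeled real data is scarce,
# draw new labeled samples from a simple statistical model fit to each class.
import random
from typing import List, Tuple

def fit_class_stats(samples: List[float]) -> Tuple[float, float]:
    """Estimate mean and standard deviation of a small set of real samples."""
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / max(len(samples) - 1, 1)
    return mean, var ** 0.5

def generate_synthetic(samples: List[float], label: str, n: int) -> List[Tuple[float, str]]:
    """Draw n new labeled samples from a Gaussian fit to the real samples."""
    mean, std = fit_class_stats(samples)
    return [(random.gauss(mean, std), label) for _ in range(n)]

if __name__ == "__main__":
    # A handful of real, labeled measurements (hypothetical sensor readings).
    real_class_a = [1.0, 1.2, 0.9, 1.1]
    real_class_b = [3.1, 2.9, 3.3]
    synthetic = generate_synthetic(real_class_a, "A", 5) + generate_synthetic(real_class_b, "B", 5)
    for value, label in synthetic:
        print(f"{label}: {value:.2f}")
```

Real SDGs are far more sophisticated (physics-based simulators, generative models, privacy-preserving samplers), but the goal is the same: plentiful, labeled data where real data is scarce.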
Fourth – Look for more buzz about Quantum AI
But the key word here is not “quantum.” The key word is “buzz.” Much of that buzz will be contradictory, and the technology behind it is far from ready to deploy. So, unless you are deeply serious about quantum computing or your firm is an AI company, you can probably ignore this chatter in 2024.
We will see a total eclipse of the sun in Dallas in 2024. We won’t see any practical applications for quantum AI.