What will be the JavaScript of AI?
Heading into the platform era of Natural Language Interaction
I’ve spent the last week or so reviewing older papers about Large Language Models and watching the LinkedIn noise around OpenAI’s moves: open-sourcing some GPT models and releasing GPT-5 (more on that in another post).
What struck me as I looked at the evolution of the AI space over the last two years, and at the current craze for AI agents, is how much the AI world resembles the early days of the World Wide Web. The resemblance isn’t so much in the technical plumbing as in the surplus of ideas about how to work with the new technology and what its possibilities might be.
I pitched my observation to a colleague recently, and we both landed on the notion that what’s missing right now is JavaScript (and I threw in CSS as an add-on to that). Intuitively that feels right: we have plenty of tools but no unified procedural/scripting “glue” for them.
Let’s avoid the simple answer, “Python will fill the role of JavaScript,” because all that really does is substitute one programming language for another. The bigger question is what niche JavaScript filled in the WWW ecosystem, why that role matters in the AI world, and which current candidates might fill it.
What role did JavaScript fill?
I think of JavaScript (and yes, CSS) as the component that turned the browser from a display (think 3270 terminal, my fellow mainframe nerds) into a platform.
Giving content providers fine-grained control over how their content is presented opened the door to all sorts of e-commerce and other interaction opportunities.
A standard target environment (eventually further refined by industry standards and by libraries that abstract away differences among browsers) reduced the complexity and cost of delivering content and applications.
Local processing improved responsiveness and reduced the latency of web applications.
How does that fit with Language Models?
What’s most clear about the Language Model space is that the standards are just beginning to shake out. There have been standards for storing and transmitting models for a while, but even those are still evolving.
Python seems to be the de facto programming language for AI and Language Models, but even so the target environments are still up in the air. Ollama and vLLM both provide useful serving platforms, and there are any number of contenders for edge environments.
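To make the “target environment” question concrete, here is a minimal sketch (my own illustration, not a standard): because both Ollama and vLLM expose OpenAI-compatible endpoints, the same client code can target either one, with only the base URL and model name changing. The local URLs, ports, and model names below are assumptions about a typical local setup, not recommendations.

```python
# Minimal sketch: one client, two interchangeable local backends.
# Assumes the `openai` Python package is installed and that an Ollama or vLLM
# server is already running locally; URLs and model names are placeholders.
from openai import OpenAI

BACKENDS = {
    "ollama": {"base_url": "http://localhost:11434/v1", "model": "llama3.2"},
    "vllm": {"base_url": "http://localhost:8000/v1", "model": "meta-llama/Llama-3.2-3B-Instruct"},
}

def ask(backend: str, prompt: str) -> str:
    cfg = BACKENDS[backend]
    # Local servers running without auth still expect a non-empty API key string.
    client = OpenAI(base_url=cfg["base_url"], api_key="not-needed")
    response = client.chat.completions.create(
        model=cfg["model"],
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("ollama", "In one sentence, what did JavaScript do for the browser?"))
```

That OpenAI-compatible surface is about as close as we currently get to the “standard target environment” the browser eventually became, and for now it is a convention rather than a standard.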
How AI should behave on the edge is the last big hurdle. As small models become more capable and edge devices adapt to them, the need for every interaction to involve a network round trip may become a thing of the past.
For AI decision-makers, picking the “right” model is becoming less and less important. The era of shoveling money into data centers may be ending before they’re even built; the “platform era,” in which models reach end users seamlessly and still under control, will define the next set of winners and losers.