One of the more intriguing technologies demonstrated at this week’s Microsoft Build conference was OpenAI Codex, a machine learning model that translates natural language into code “across more than a dozen programming languages.” In a keynote presentation titled “The Future of AI Development Tools,” Microsoft chief technology officer Kevin Scott said that “Codex lets us use natural language to express our intentions, and the machine takes on the responsibility of translating those intentions into code.”

You heard that right: the machine does the coding for you! This could be the beginning of a paradigm shift in programming. It certainly takes the low-code trend to another level, because now you can (potentially) use AI software to talk an app into existence.

Scott also called Codex “a translator between human imagination and any piece of software with an API.” The implication was that Codex could be used in the building of apps not just by developers, but by general users too. Later in the keynote, Ryan Volum (a Microsoft engineer who works directly for Scott) made this even more explicit. He first demonstrated Codex creating a 3D model of the solar system in Babylon.js, a Microsoft 3D development tool I profiled recently. “With capabilities like these,” Volum remarked, “we envision that models like Codex will empower not just developers, but also creators.”

Babylon.js with Codex.

OpenAI and Microsoft Partnership Bearing Fruit

According to a Microsoft blog post, Codex “descended from GPT-3, OpenAI’s natural language model that was trained on petabytes of language data from the internet.” Codex is a version of GPT-3 that’s been trained on “code from GitHub software repositories and other public sources.” The Codex model is available through the OpenAI API, as well as through Microsoft’s version called Azure OpenAI Service.
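To make that concrete, here is a minimal sketch of what calling a Codex model through the OpenAI Python library’s completion endpoint could look like. The model name `code-davinci-002`, the parameter values, and the prompt format are illustrative assumptions, not details confirmed by Microsoft or OpenAI in the announcements above:

```python
# Sketch: sending a natural-language intention to a Codex-style model
# via the OpenAI API. Model name and parameters are illustrative.
import os


def build_prompt(intent: str, language: str = "Python") -> str:
    """Wrap a plain-English intention as a code-generation prompt."""
    return f"# Language: {language}\n# Task: {intent}\n"


if __name__ == "__main__":
    import openai  # requires: pip install openai

    openai.api_key = os.environ["OPENAI_API_KEY"]
    response = openai.Completion.create(
        engine="code-davinci-002",  # assumed Codex model name
        prompt=build_prompt("print the first ten Fibonacci numbers"),
        max_tokens=150,
        temperature=0,  # deterministic output suits code generation
    )
    print(response.choices[0].text)
```

The pattern is the point: the developer supplies intent in natural language, and the model returns candidate code.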

OpenAI was founded in late 2015 by Elon Musk, Sam Altman and others, and entered into a commercial partnership with Microsoft in July 2019. The most famous OpenAI product so far has been DALL·E 2, the AI-powered image generation service that is all over social media.

But while DALL·E 2 is getting all the buzz, the potential impact of OpenAI’s technology on developers is just as significant.

In his opening keynote at Build, Microsoft CEO Satya Nadella talked about how an enterprise customer called CarMax “used the Azure OpenAI Service to generate new marketing content based on thousands of customer reviews that would have otherwise taken someone years to summarize.” Later at Build, Microsoft announced that the Azure OpenAI Service is “now available in a limited-access preview.”

In its own blog post about Codex, OpenAI stated that “Microsoft’s Azure OpenAI Service provides developers with access to Codex and our other models, like GPT-3 and embeddings, along with enterprise-grade capabilities that are built into Microsoft Azure.”

Updates on GitHub Copilot

Codex has been integrated into GitHub Copilot since last July, and Microsoft provided updates at Build.

Copilot is an extension for Neovim, JetBrains and Visual Studio Code, and according to Microsoft it can “suggest additional lines of code and functions” based on a programmer’s existing code. In addition, developers can “describe what they want to accomplish in natural language, and Copilot will draw on its knowledge base and current context to surface an approach or solution.”
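In practice, that workflow looks something like the following: the developer writes only the natural-language comment, and the assistant proposes the function body. The completion below is hand-written for illustration, not actual Copilot output:

```python
# The developer types this comment describing the intent...
# Parse a "key=value;key2=value2" string into a dictionary.

# ...and an assistant like Copilot might suggest a body such as:
def parse_pairs(text: str) -> dict:
    result = {}
    for pair in text.split(";"):
        if "=" in pair:
            key, _, value = pair.partition("=")
            result[key.strip()] = value.strip()
    return result
```

The developer then reviews, edits, or rejects the suggestion, which is where the trust question below comes in.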

While it was left unsaid by Microsoft, it’s obviously up to individual developers to decide how much they trust Copilot to write code for them.

Microsoft says that GitHub Copilot, released as a technical preview last year, “today suggests about 35% of the code in popular languages like Java and Python.” GitHub Copilot will move to general availability this summer, Microsoft announced at Build.

How to Build Codex Solutions

For those who want to dive into Codex development, Ryan Volum and fellow Microsoft engineer Jon Malsan ran a session at Build showing how to develop apps using Codex. The main technique they showed was called “prompt engineering,” which Volum explained is a way to “coax novel behavior from these models.” But he also discussed other options, such as “fine-tuning,” where you “bring a bunch of data specific to your domain and you retrain the model with it.” While fine-tuning isn’t currently available for Codex, it is a planned future feature.
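A minimal sketch of what “prompt engineering” means in code: rather than retraining the model, you prepend examples of the desired behavior so the model continues the pattern. The example tasks and formatting here are my own illustrative assumptions, not the prompts Volum and Malsan used in their session:

```python
# Sketch of few-shot prompt engineering for a code model: show the
# model a (task -> code) pattern, then ask it to complete a new task.

FEW_SHOT_EXAMPLES = [
    ("Reverse a string", "def reverse(s):\n    return s[::-1]"),
    ("Square a number", "def square(n):\n    return n * n"),
]


def engineer_prompt(task: str) -> str:
    """Build a few-shot prompt demonstrating the desired behavior
    before the new task, which the model is left to complete."""
    parts = []
    for description, code in FEW_SHOT_EXAMPLES:
        parts.append(f"# Task: {description}\n{code}\n")
    parts.append(f"# Task: {task}\n")  # model completes from here
    return "\n".join(parts)
```

Fine-tuning, by contrast, would bake domain-specific examples like these into the model’s weights via retraining, which is why it requires a dataset rather than just a cleverly constructed string.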

A demo of Codex.

You can play around with Codex yourself on the OpenAI Playground. While Codex is still very new for developers, it will be intriguing to see how this new form of AI-assisted programming will progress over the next few years.

Lead image: Kevin Scott at Microsoft Build.