A Tale of Two Demos
Our summer break here at HTC, taken so we could focus on releasing new software to the greater universe, is almost over.
It was the best of times. It was the worst of times. It was, and in fact continues to be, a wild ride of a time for AI: exciting new things are coming out so quickly it is difficult to keep track of them all.
Time to dive back into recent developments in deep learning, AI, generative models, computer graphics, etc. And boy, have there been a lot of new developments over the last two months.
OpenAI just released something really interesting. How useful the current incarnation really is, well, that is best judged by watching these two somewhat different demos.
Let's begin with the slick and enthusiastic OpenAI demo of their new OpenAI Codex automated coding system.
Pretty slick, eh?
But what about that other demo I promised? The boys at Machine Learning Street Talk have a slightly different take on it. Here is their live coding session.
So take what you will from these two somewhat different demos of the OpenAI Codex software in its current incarnation.
Here's a link to the blog post on OpenAI Codex.
1. In some sense, it seems like the hardest part of this whole thing is converting vague speech into the written commands that drive the code generation process. The system needs to suss out what the hell you are actually trying to tell it to do.
Perhaps for professional programmers, a better approach would be to pass it a specific, rigid code specification in some agreed-upon format it has been trained on: a rigid design description of what you want built, which it then builds for you.
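To make the idea concrete, here is a minimal hypothetical sketch of what a "rigid spec" workflow could look like: a fixed, machine-readable spec format that gets expanded deterministically into a prompt for a code model. The spec format and function names are made up for illustration; this is not the actual Codex API.

```python
# Hypothetical sketch: a rigid spec dict expanded into a deterministic
# code-generation prompt. Field names ("name", "params", "returns") are
# an assumed format, not anything Codex actually consumes.

def spec_to_prompt(spec: dict) -> str:
    """Turn a rigid spec into a comment-header prompt plus a function stub."""
    lines = [f"# Function: {spec['name']}"]
    lines += [f"# Param {name}: {desc}" for name, desc in spec["params"].items()]
    lines.append(f"# Returns: {spec['returns']}")
    # The stub's signature is derived from the spec, so there is no
    # ambiguity for the model to "suss out".
    lines.append(f"def {spec['name']}({', '.join(spec['params'])}):")
    return "\n".join(lines)

spec = {
    "name": "moving_average",
    "params": {"values": "list of floats", "window": "positive int window size"},
    "returns": "list of floats, the rolling mean over each window",
}
prompt = spec_to_prompt(spec)
print(prompt)
```

The point of the sketch is the division of labor: the programmer writes the unambiguous high-level description once, and the generation system only has to fill in the low-level body under that stub.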
Food for thought. I would love to use something like this to bang out code drudgery. But I don't want to play 20 questions with the system; I want to tell it exactly what to do with a detailed high-level spec and then let it crank out the detailed low-level code for me.