The Open-Source Wars
How open-source models alter the AI landscape; Get AI to code for you
Welcome to another edition of what we’re determined to make the best damn newsletter in AI. Here we’ll break down AI topics that matter, open your mind to use cases, and keep you ahead of the curve.
Our #1 goal is to be useful. So please shoot us an email 📩 if you have questions or feedback, and especially if you implement something we share!
Here's what we're covering today:
How open-source models are changing the AI landscape
Use ChatGPT and Github Copilot to get AI coding for you
Thoughts on OpenAI's advancements in ChatGPT and GPT-4
... and if someone forwarded this email to you, thank them 😉, and subscribe here!
Let’s get to it! 👇
TODAY'S PERSPECTIVE
Should I use open-source AI models?
As a business, will I have to rely on the “AI Giants” and the services they provide, or will I have options?
These are questions we get asked a lot. And the biggest determining factor here might be the open-source community.
When the code that powers an AI model is open-source, it means that the public has access to see the inner workings of it, critique it, and change the way it works (given the right expertise).
In AI Image Generation, models like Stable Diffusion are open-source and are widely used.
In AI Text Generation, the performance of open-source models has typically lagged roughly six months behind their closed-source counterparts (like OpenAI's ChatGPT). Even the recently leaked LLaMA model (Meta's state-of-the-art text-generation research model) is comparable to GPT-3, not GPT-4.
(And yes, since the leak, you can run a smaller version of it on your own computer.)
If we can get decent performance from open-source AI models, what does that do to the landscape of AI providers?
We probably won't end up with just one AI. We'll have a mix of options, from different closed-source providers (OpenAI, Google, etc.) to open-source alternatives.
We may even have an opportunity to combine them so that we might achieve even better performance on our tasks.
How we might see AI shaping up
We made a pyramid to show 🙂
The best AI model for you really depends on your use case. From our research and conversations, best-in-class performance will likely come from using prompt engineering to get a general AI model working for your use case.
But as your task gets more and more specific, you can achieve better performance by fine-tuning your own versions of open-source AI models.
The jury really is still out on how to approach the Messy Middle. We are watching Foundry (and the performance of its early set of enterprise uses), as well as what people are able to launch and build with or without fine-tuned models.
The good news?
If you are a business with limited resources, writing great prompts can help you leverage AI successfully for your specific use case, and may get you further than pouring money and resources into fine-tuning your own AI model.
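To make "prompt engineering for your use case" concrete, here is a minimal sketch: wrap the user's input in a task-specific template before sending it to a general model. The support-ticket scenario, template wording, and function name below are all hypothetical, invented for illustration; the point is that the specialization lives in the prompt, not in a fine-tuned model.

```python
# Hypothetical example: specializing a general AI model for a support-ticket
# triage use case purely through the prompt, with no fine-tuning.

PROMPT_TEMPLATE = """You are a customer-support assistant for an online store.
Classify the customer's message into exactly one category:
billing, shipping, returns, or other.

Message: {message}
Category:"""

def build_prompt(message: str) -> str:
    """Fill the task-specific template with the customer's message."""
    return PROMPT_TEMPLATE.format(message=message)

prompt = build_prompt("My package never arrived and it's been two weeks.")
print(prompt)

# The resulting string would then be sent to a general-purpose model
# (e.g. via OpenAI's chat completions API); swapping the template swaps
# the "specialization" without touching the model itself.
```

The same pattern scales by adding a few worked examples (few-shot prompting) inside the template, which is usually the next lever to pull before considering fine-tuning.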
USE CASE DEEP DIVE
AI can code for you
In the wake of Microsoft and Google’s Copilot for Work announcements last week, we saw a surge of developers talking about just how useful having a copilot can be.
A whopping 88% of software developers who have been using GitHub Copilot say they are more productive 😮
But community member Alex Jin goes one step further: he uses GitHub's Copilot together with ChatGPT.
“Copilot is like a supercharged autocomplete that can not only finish individual lines of code but can complete entire functions, whereas ChatGPT is like a coworker who can understand high-level context and answer questions about the general approach.”
If you're interested in how Alex pair programs with GitHub Copilot and ChatGPT, check out his full stack here!
LINKS
For your reading list 📚
OpenAI continues to lead the pack...
A VC’s perspective on why the ChatGPT API launch was game-changing (hint: it’s about the cost)
OpenAI released early research on which jobs they see as most at risk of automation by GPT
But OpenAI keeps changing the message cap for GPT-4 users in ChatGPT Plus, and people are not happy
And if you're really nerdy...
If you don't want to use Dalai to run LLaMA, check out this tutorial on how to run llama.cpp yourself
Here's a Twitter thread by Lance Martin walking through, step by step, how he built an app that uses ChatGPT for question-answering over all of Lex Fridman's podcast episodes
Member-only links (more info on joining here!)
Tutorial on how to connect GPT to Google search results using LangChain and Serper.dev (member hub link, must sign in!)
That's all!
We'll see you again on Thursday. Thoughts, feedback and questions are much appreciated - respond here or shoot us a note at [email protected].
... and if someone forwarded this email to you, thank them 😉, and subscribe here!
Cheers,
🪄 The AI Exchange Team