Your AI Apps Got Cheaper & An Upgrade

Three huge new updates to GPT this week; compliance made simple with AI

Welcome to another edition of what we’re determined to make the best damn newsletter in AI. Here we’ll break down AI topics that matter, open your mind to use cases, and keep you ahead of the curve.

Our #1 goal is to be useful. So please shoot us an email 📩 if you have questions or feedback, and especially if you implement something we share!

Here's what we're covering today:

  • GPT-powered apps got three new upgrades this week, and we’re breaking them down

  • Compliance woes got you down? Let AI help take care of it.

  • Consulting firms aren’t pulling any punches with their AI reports

... and if someone forwarded this email to you, thank them 😉, and subscribe here!

Let’s get to it! 👇

TODAY'S PERSPECTIVE

This week was great for GPT-powered AI apps

OpenAI dropped 3 big developments this week: cost reductions, larger context windows, and a new function calling feature. Let’s break it down:

1. AI got cheaper, again!

OpenAI’s embeddings model, which powers all the “chat with your data” and custom AI chatbot apps, is now 75% cheaper. On top of that, the cost of prompts (input tokens) for gpt-3.5-turbo dropped 25% as well.

Both of these will make it even cheaper to build custom AI chatbots and other applications.
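For a rough sense of the savings, here’s a back-of-the-envelope calculation. The per-1K-token prices below are the rates as announced at the time (embeddings dropping from $0.0004 to $0.0001, gpt-3.5-turbo input from $0.002 to $0.0015); always check OpenAI’s pricing page for current numbers, and the 10M-token monthly volume is just an illustrative assumption:

```python
# Back-of-the-envelope monthly savings at an assumed 10M tokens/month.
# Prices are the per-1K-token rates as announced; check OpenAI's
# pricing page for current numbers.

def monthly_cost(tokens: int, price_per_1k: float) -> float:
    """Cost in dollars for a given token volume."""
    return tokens / 1_000 * price_per_1k

TOKENS = 10_000_000  # example monthly volume

embed_old = monthly_cost(TOKENS, 0.0004)   # embeddings, old price
embed_new = monthly_cost(TOKENS, 0.0001)   # 75% cheaper
chat_old  = monthly_cost(TOKENS, 0.002)    # gpt-3.5-turbo input, old price
chat_new  = monthly_cost(TOKENS, 0.0015)   # 25% cheaper

print(f"Embeddings: ${embed_old:.2f} -> ${embed_new:.2f}")
print(f"Chat input: ${chat_old:.2f} -> ${chat_new:.2f}")
```

At that volume, embeddings go from $4 to $1 a month and chat input from $20 to $15 — the kind of drop that makes “chat with your data” features much cheaper to run at scale.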

2. You can now send more data in at a time

The 'context window' — how much text the model can consider at once — is now 16k tokens with the new gpt-3.5-turbo-16k model, four times the previous 4k. This wider 'view' lets the model take in more of your data per request, improving understanding and output relevance. OpenAI also says the updated models have better 'steerability,' meaning they follow instructions more reliably.
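To get a feel for what fits, here’s a minimal sketch of a context-budget check. It uses the common rough heuristic of ~4 characters per token (an approximation only; use a real tokenizer like tiktoken for exact counts), and the model names and window sizes match the announcement:

```python
# Rough check of whether a prompt fits a model's context window.
# Uses the ~4 characters per token heuristic -- an approximation;
# use a real tokenizer (e.g. tiktoken) for exact counts.

CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 4_096,       # original context size
    "gpt-3.5-turbo-16k": 16_384,  # the new 4x window
}

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def fits(text: str, model: str, reply_budget: int = 500) -> bool:
    """True if the prompt fits, leaving room for the model's reply."""
    return estimate_tokens(text) + reply_budget <= CONTEXT_WINDOWS[model]

doc = "word " * 8_000  # ~40k characters, roughly 10k tokens
print(fits(doc, "gpt-3.5-turbo"))      # too big for the 4k window
print(fits(doc, "gpt-3.5-turbo-16k"))  # fits in the new 16k window
```

In practice this is the difference between chunking a long document across many calls and sending it in one shot.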

3. The BIG update - functions!

OpenAI added a new capability called “function calling” to the Chat Completions API. It works much like ChatGPT plugins: you describe your functions to the model, and it figures out which one to call (and with what arguments) to help complete your task.

Why is this so big?

Everyone is obsessed with agents. ChatGPT is great. But what we really want is our super powerful personal assistant, right?

Well, the general consensus is that we’ll get there by giving ChatGPT access to other tools and teaching it how to use them.

Then we’ll interact with ChatGPT, or any other language model or AI chatbot, tell it what we want, and the AI model will go figure out how to do that task.

And this functions update will certainly accelerate how quickly we get there.
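To make the flow concrete, here’s a minimal sketch of the function-calling loop. The schema below is the shape you’d pass as `functions=[...]` to the Chat Completions API; `get_weather` is a hypothetical example tool, and the assistant message is mocked so the snippet runs without an API key (a real app would get it back from `openai.ChatCompletion.create`):

```python
import json

# Sketch of the function-calling flow. The schema is what you'd pass
# as `functions=[...]` to the Chat Completions API; get_weather is a
# hypothetical tool, and the model's reply is mocked here so this
# runs without an API key.

weather_fn = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

def get_weather(city: str) -> str:
    # Stand-in for a real weather API call.
    return f"72F and sunny in {city}"

# Mocked assistant message: the model decided to call get_weather and
# returned its arguments as a JSON string, which is how the API replies.
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": '{"city": "Austin"}',
    },
}

if "function_call" in assistant_message:
    call = assistant_message["function_call"]
    args = json.loads(call["arguments"])  # arguments arrive as JSON text
    result = {"get_weather": get_weather}[call["name"]](**args)
    print(result)  # -> 72F and sunny in Austin
```

In a real app you’d then send `result` back to the model as a `function` role message so it can compose the final answer — that loop is the seed of the “agent” behavior everyone is excited about.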

USE CASE FROM OUR PARTNERS

AI Compliance Matters, and Secureframe is making it easy

Whether you’re using or building AI tools, security and privacy compliance is top of mind — and Secureframe is paving the way.

Security and privacy compliance can be hard to achieve when there are requirements from what feels like hundreds of different frameworks, including SOC 2, HIPAA, and GDPR. Not to mention the headache of ongoing certification and service provider verification.

But with Secureframe’s new Comply AI, security and compliance are automated, letting you spend more time putting AI to work for your business or your customers.

Stop worrying about cloud misconfigurations and remediations, and focus on what you do best.

LINKS

For your reading list 📚

The future of AI and consumer tech…

The future of AI and consulting…

And if you're really nerdy...

That's all!

We'll see you again on Tuesday. Thoughts, feedback and questions are much appreciated - respond here or shoot us a note at [email protected].

... and if someone forwarded this email to you, thank them 😉, and subscribe here!

Cheers,

🪄 The AI Exchange Team