NYT's Stand on AI Data Usage

The NYT, Microsoft, OpenAI legal tussle; ChatGPT's role in achieving your 2024 resolutions

Welcome to another edition of the best damn newsletter in AI.

This free newsletter is designed to keep you ahead of the curve and open your mind to using AI in your work and business.

Digging deep into AI for work or AI Operations? Take a look at our membership.

Our #1 goal is to be useful. So please shoot us an email 📩 if you have questions or feedback, and especially if you implement something we share!

Here's what we're covering today:

  • A landmark legal battle between The New York Times and tech giants over AI's use of media content without permission, echoing the Napster vs Spotify saga

  • A fun way to leverage ChatGPT to help hold you accountable to your 2024 New Year's resolutions

  • The hidden costs of AI reliance on cloud storage

... and if someone forwarded this email to you, thank them 😉, and subscribe here!

Let’s get to it! 👇

TODAY'S PERSPECTIVE

The Napster-Spotify Moment for AI

There’s a new landmark case that could redefine the landscape of AI data usage: The New York Times (NYT) is taking on tech giants Microsoft and OpenAI over the use of NYT articles to train ChatGPT, which the Times claims now competes directly with its own journalism.

What’s interesting is that this legal battle echoes the infamous Napster vs Spotify saga. Napster, if you recall, built a file-sharing service on artists' music without obtaining permission, while Spotify sought permission from rights holders first. The outcome of that battle made it clear which approach was more sustainable.

Media companies like NYT possess some of the most valuable text data in the world. It's this data that companies like OpenAI need to train their AI models.

But should they be allowed to use it without permission?

Training AI models on such data has previously been ruled fair use, but even GitHub's AI code-writing product Copilot adds a check so that generated code doesn't reproduce exact lines of code from its training data.

This almost certainly won't be the last we hear of the issue.

USE CASES

Using ChatGPT Custom Instructions to Hold You Accountable

Do you have your 2024 New Year's resolutions locked and loaded? Just about everyone struggles to keep them.

Here's a fun way to use ChatGPT to hold you accountable.

Many people use ChatGPT to get advice or talk through life decisions. If that's you, then consider adding your New Year's resolutions to your custom instructions!

Here's an example of what you can add:

- My New Year's resolutions are {X}, {Y}, and {Z}. I get easily distracted from them with shiny object syndrome. If at any point in our conversation, you think I am deviating from my resolutions, please bring them up and motivate me to stick to them.

To add this to your custom instructions:

1. Make sure you have a paid ChatGPT Plus account

2. Go to your name in the bottom left > Custom instructions

3. Paste your resolutions prompt into the "How would you like ChatGPT to respond?" box and save
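
If you talk to the models through the OpenAI API rather than the ChatGPT app, you can approximate the same trick by attaching your resolutions as a system message on every request. Below is a minimal sketch, assuming the official `openai` Python package and an `OPENAI_API_KEY` in your environment; the model name and resolutions text are placeholders, not recommendations.

```python
# Minimal sketch: approximate ChatGPT custom instructions via the API
# by sending your resolutions as a system message on every request.
# Assumes: `pip install openai` and OPENAI_API_KEY set in your environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder resolutions -- swap in your own {X}, {Y}, {Z}.
RESOLUTIONS_INSTRUCTION = (
    "My New Year's resolutions are to ship a side project, read 12 books, "
    "and exercise 3x per week. I get easily distracted by shiny object "
    "syndrome. If at any point I seem to be deviating from my resolutions, "
    "bring them up and motivate me to stick to them."
)

def ask(question: str) -> str:
    """Send a question with the resolutions instruction attached."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": RESOLUTIONS_INSTRUCTION},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Should I start learning a fourth programming language this month?"))
```

Unlike custom instructions in the app, a system message only applies to the requests you send this way, so it has to be included on every call.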

LINKS

For your reading list 📚

Speaking of copyright…

And if you’re really nerdy…

That's all!

We'll see you again soon! Thoughts, feedback and questions are much appreciated - respond here or shoot us a note at [email protected].

... and if someone forwarded this email to you, thank them 😉, and subscribe here!

Cheers,

🪄 The AI Exchange Team