
Is OpenAI on the path to being "open"?

+ Mixed results from Gemini Pro


Barun Pandey
May 10, 2025


GM! Welcome to Get Into AI.

I’m your AI news sherpa, guiding you through the mountains of news to get to the good bit.

Here’s what I have for today:

  1. Google's Gemini 2.5 Pro is Acting Weird

  2. OpenAI's "Open Source" Summer Promise

  3. Hugging Face Hands Out H200s Like Candy

First, a word from our sponsors:

Looking for unbiased, fact-based news? Join 1440 today.

Join over 4 million Americans who start their day with 1440 – your daily digest for unbiased, fact-centric news. From politics to sports, we cover it all by analyzing over 100 sources. Our concise, 5-minute read lands in your inbox each morning at no cost. Experience news without the noise; let 1440 help you make up your own mind. Sign up now and invite your friends and family to be part of the informed.

Alright, let’s dive in!

Three major headlines

Three main stories for the day.

1/ Gemini 2.5 Pro Has Lost Its Mind (Literally)

Google's latest Gemini 2.5 Pro update is having what the tech community is calling a "thinking bug". It’s the AI equivalent of forgetting how to tie its shoes after 20,000 tokens.

Users are reporting the model suffers from memory loss, glacially slow processing times (we're talking 1 minute 30 seconds just to think), and chain-of-thought failures that would make a goldfish blush.

Some brave souls are convinced the 1206 Gemini Exp version was peak performance, but Google had to go and nerf it faster than you can say "resource constraints."

Now we're stuck with a model that's as reliable as your friend who promises to help you move but mysteriously develops back problems on moving day.

Translation: If you're using Gemini for anything important, maybe keep a backup plan handy. Or just wait for the inevitable "we fixed it!" update next week.

2/ OpenAI's "Open Source" Model: Coming This Summer

Sam Altman announced that OpenAI will graciously bestow upon us peasants an open-source model this summer.

But wait. There’s a catch (isn't there always?). This model will be about as cutting-edge as last year's iPhone, deliberately kept a generation behind their flagship offerings.

The community's reaction? Picture a mix of excited puppies and skeptical cats.

Some are thrilled about any open release, while others are already comparing it to Google's Gemma. We all know what it is: a marketing stunt wrapped in a GitHub repository.

The real kicker? Leadership admitted this strategy is specifically to "retain US competitiveness" and prevent China from catching up too quickly.

Translation: Expect something technically "open source" but with more asterisks than a pharmaceutical ad.

Think of it as OpenAI's version of giving you the recipe for their secret sauce but omitting all the good ingredients.

3/ Hugging Face Upgrades ZeroGPU: From A100s to H200s

In a move that's making cloud providers nervous, Hugging Face just upgraded their ZeroGPU offering from A100s to H200s for all Pro accounts.

The upgrade is live now, but there's a catch (because of course there is): you're limited to 25 minutes of daily GPU usage.

So if you're planning to train the next ChatGPT killer, you might want to look elsewhere.

But for quick experiments and showing off to your friends? This is basically Christmas morning for ML hobbyists.

Translation: Hugging Face just made it ridiculously affordable to play with serious hardware, as long as you don't mind the timer ticking down like you're on a game show.
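Want to actually burn those minutes? In a ZeroGPU Space, the GPU is only attached while a function decorated with `@spaces.GPU` is running, which is how that daily quota gets metered. Here's a minimal, illustrative sketch; the model choice and the 60-second reservation are my assumptions, not anything Hugging Face prescribes:

```python
# Minimal sketch for a Space running on ZeroGPU hardware (Pro account for H200 quota).
# The model and the 60-second duration below are illustrative assumptions.
import gradio as gr
import spaces            # ZeroGPU helper available inside Hugging Face Spaces
import torch
from diffusers import DiffusionPipeline

# Load the pipeline on CPU at startup; the GPU is only attached
# while the decorated function below is executing.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
)

@spaces.GPU(duration=60)  # seconds of GPU time reserved for each call
def generate(prompt: str):
    pipe.to("cuda")       # move to the GPU for the duration of this call
    return pipe(prompt).images[0]

gr.Interface(fn=generate, inputs="text", outputs="image").launch()
```

Everything outside the decorated function runs on CPU, so each call only dips into your daily allowance for as long as the generation actually takes.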

Catch you tomorrow! ✌️

That’s it for today, folks! If you want more, be sure to follow us on Twitter (@BarunBuilds).

🤝 Share Get Into AI with your friends!

Did you like today's issue?

