Naamche


GitHub's new Wikipedia?!

+Meta's new release


Barun Pandey
April 26, 2025

GM! Welcome to Get Into AI.

I’m your AI news sherpa, guiding you through the mountains of news to get to the good bits.

Here’s what I have for today:

  1. DeepWiki: GitHub's New Wikipedia

  2. Meta Releases Perception Encoders

  3. Modded RTX 4090s Boast 48GB Memory

First, a word from our sponsors:

Start learning AI in 2025

Keeping up with AI is hard – we get it!

That’s why over 1M professionals read Superhuman AI to stay ahead.

  • Get daily AI news, tools, and tutorials

  • Learn new AI skills you can use at work in 3 mins a day

  • Become 10X more productive

Alright, let’s dive in!

Three major headlines

1/ DeepWiki: GitHub's New Wikipedia

Cognition (the folks behind Devin) just announced DeepWiki, a free encyclopedia for GitHub repos.

Simply swap "github.com" for "deepwiki.com" in any repo URL (e.g. "https://deepwiki.com/org/repo") and boom – you get a Wikipedia-style breakdown of the library plus a Devin-powered chatbot to help you use it.
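The URL swap is simple enough to script. Here's a minimal sketch (the helper name `to_deepwiki` is ours, not an official tool) that rewrites a GitHub repo URL into its DeepWiki counterpart, assuming the org/repo path carries over unchanged:

```python
def to_deepwiki(github_url: str) -> str:
    """Rewrite a GitHub repo URL to its DeepWiki equivalent.

    Assumes the org/repo path is identical on both sites; only the
    host changes (first occurrence only, so paths stay untouched).
    """
    return github_url.replace("github.com", "deepwiki.com", 1)

print(to_deepwiki("https://github.com/org/repo"))
# https://deepwiki.com/org/repo
```

Handy as a bookmarklet or shell alias if you find yourself doing this often.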

This could be a game-changer for understanding open source projects without spending hours combing through documentation.

Just type a different URL and let DeepWiki do the heavy mental lifting!

2/ Meta Releases Perception Encoders (PE)

Meta just dropped their "swiss army knives for vision" under an Apache 2.0 license.

These image/video encoders handle vision-language and spatial understanding tasks, reportedly outperforming both InternVL3 and Qwen2.5VL.

The PE Core model even beats the latest SigLIP2 on zero-shot image tasks.

The best part? They're releasing "gigantic" video and image datasets alongside the models.

Meta might be late to the open-weights party, but they're making up for it with some awe-inspiring vision tech that's free to use commercially.

3/ Modded RTX 4090s Boast 48GB Memory

AI enthusiasts are drooling over modded 48GB RTX 4090s, especially popular in China.

These souped-up GPUs are selling for around $3,500 and essentially double the VRAM of standard 4090s, making them perfect for running larger AI models locally.

While information about a potential 5090 remains scarce due to tight supplies, these modded cards show there's serious demand for more VRAM in the DIY AI community.

People are literally soldering more memory onto their GPUs rather than waiting for Nvidia to release higher-memory consumer cards!

Catch you tomorrow! ✌️

That’s it for today, folks! If you want more, be sure to follow us on Twitter (@BarunBuilds)

🤝 Share Get Into AI with your friends!
