TLDR AI 2025-11-28
DeepSeek IMO Gold 🥇, OpenAI data breach 🦺, Nano Banana prompting 🍌
Google changes Gemini 3 Pro free access limits due to 'high demand' (2 minute read)
Google has updated the access tiers for Gemini 3 Pro. Free users are now guaranteed only basic access, and daily limits may change frequently when using Thinking with 3 Pro. Given broader industry demand trends, the limits have likely decreased. NotebookLM has also rolled back free-user access to the new Nano Banana Pro-powered Infographics and Slide Decks and introduced limits for Pro users.
OpenAI cuts off Mixpanel after analytics leak exposes API users (3 minute read)
OpenAI API users may be affected by a recent breach at data analytics provider Mixpanel. Only API users are affected; regular ChatGPT users don't need to take any action. The exposed data includes names, approximate locations, operating system and browser details, and user IDs. OpenAI dropped Mixpanel as a result of the incident and is carrying out a wider security review across its vendor ecosystem.
👨‍💻
Engineering & Research
Better Agents (GitHub Repo)
Better Agents is a CLI tool and a set of standards for agent building. It makes coding assistants experts in any agent framework. The tool generates an AGENTS.md file that encodes industry best practices. The CLI guides users through selecting a programming language, agent framework, coding assistant, LLM provider, and API keys.
INTELLECT-3: A 100B+ MoE trained with large-scale RL (10 minute read)
INTELLECT-3 is a 100B+ parameter Mixture-of-Experts that achieves state-of-the-art performance for its size across math, code, science, and reasoning benchmarks. It was trained with both SFT and RL on top of the GLM 4.5 Air base model. The researchers used a diverse and challenging mix of RL environments designed to enhance the reasoning and agentic capabilities of their model. Full details about the training are available.
Compounding Engineering Plugin (GitHub Repo)
The Compounding Engineering Plugin is a Claude Code plugin that transforms how developers plan, build, and review code using AI-powered tools that systematically improve their development workflow. Compound engineering is the idea that each unit of engineering work should make subsequent units of work easier, not harder. The plugin provides the tools to make compound engineering practical.
Open Deep Research (GitHub Repo)
Open Deep Research is an experimental, fully open-source research assistant built on LangGraph. It automates deep topic research by planning, gathering sources, and writing structured markdown reports, either via a human-in-the-loop workflow or a multi-agent architecture. Models, search tools, prompting, and evaluation integration are all configurable.
Nvidia reportedly no longer supplying VRAM to its GPU board partners in response to memory crunch (2 minute read)
Nvidia will reportedly stop bundling video memory with the GPUs it sells to AIBs, leaving partners to source the required VRAM on their own. For larger vendors, this shouldn't be a problem, as it is already standard practice. The move could put heavy pressure on smaller operators and leave them at risk of shutting down.
Off-the-Rails Cost (1 minute read)
A 'wasted thread' occurs when a model starts spitting out large amounts of leaked thinking or repeated tokens, usually forcing users to abandon and revert the thread. 17.8% of all costs incurred by Gemini users in Amp went to 'wasted tokens', more than 2x Sonnet's rate and almost 8x that of Opus.
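A quick back-of-the-envelope sketch of what those multipliers imply. Only the 17.8% Gemini figure is reported; the Sonnet and Opus shares below are rough bounds derived from the "more than 2x" and "almost 8x" comparisons, not reported numbers.

```python
# Reported: 17.8% of Gemini spend in Amp went to wasted tokens.
gemini_wasted = 0.178

# Implied upper bounds for the other models, derived from the
# stated multipliers (assumptions, not reported figures):
sonnet_wasted = gemini_wasted / 2   # "more than 2x worse" -> Sonnet under ~8.9%
opus_wasted = gemini_wasted / 8     # "almost 8x worse"   -> Opus around ~2.2%

print(f"Implied Sonnet wasted share: under {sonnet_wasted:.1%}")
print(f"Implied Opus wasted share:   about {opus_wasted:.1%}")
```

So even as a loose bound, roughly one dollar in six of Gemini spend was lost to wasted threads, versus a couple of cents per dollar for Opus.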
The Current State of the Theory that GPL Propagates to AI Models Trained on GPL Code (45 minute read)
The jury is still out on whether GPL propagation to models should be pursued or avoided, and it is unlikely to become a reality any time soon. Current lawsuits seek injunctions and damages rather than the forced GPL-ization of models, so this remains legally uncharted territory. We need to continue exploring how to make technology innovation and software freedom compatible.