
This Week's Shiny New AI Model

Chris · April 17, 2026 at 5 PM WAT

Alibaba Just Open-Sourced a Coding AI That Runs on a Single GPU

Alibaba's new Qwen3.6-35B-A3B activates just 3B of its 35B parameters, letting it run on a single GPU.

Alibaba's Qwen team dropped a new model this week, Qwen3.6-35B-A3B. The name's a mouthful, but the architecture is interesting. It's a Mixture-of-Experts model with 35 billion total parameters, but it only activates about 3 billion of them for any given token at inference time. That's small enough to run on a single consumer GPU.
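The core MoE trick is a router that sends each token to only a few "expert" sub-networks, so most weights sit idle on any given step. Here's a toy sketch in NumPy; the layer sizes, expert count, and top-2 routing are illustrative stand-ins, not Qwen's actual architecture:

```python
import numpy as np

# Toy Mixture-of-Experts layer. All sizes here are made up for illustration;
# the real model's dimensions and routing details are not public in this post.
rng = np.random.default_rng(0)

d_model, d_ff = 8, 16
n_experts, top_k = 8, 2  # only top_k experts run per token

# One small MLP (two weight matrices) per expert.
experts = [
    (rng.standard_normal((d_model, d_ff)), rng.standard_normal((d_ff, d_model)))
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route token x to its top_k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]        # indices of the chosen experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over just the chosen few
    out = np.zeros(d_model)
    for w, i in zip(weights, top):
        w_in, w_out = experts[i]
        out += w * (np.maximum(x @ w_in, 0) @ w_out)  # ReLU MLP expert
    return out, top

x = rng.standard_normal(d_model)
y, used = moe_forward(x)
print(f"experts used: {sorted(used.tolist())} of {n_experts}")

# Active parameters per token: the router plus top_k expert MLPs.
per_expert = d_model * d_ff * 2
total = n_experts * per_expert + d_model * n_experts
active = top_k * per_expert + d_model * n_experts
print(f"active fraction: {active / total:.2f}")
```

The same ratio is why a 35B-total model can behave like a ~3B model at inference: per-token compute scales with the active parameters, not the total.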

The model is designed for agentic coding: it doesn't just write code, it can also use tools, browse the web, and run terminal commands in a loop to complete complex tasks. You give it a goal and it figures out the steps; it's not just autocomplete on steroids.
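That goal-to-steps loop is simpler than it sounds: call the model, execute whatever tool it asks for, feed the result back, repeat until it declares the task done. A minimal sketch, with a fake model and fake tools standing in for the real client (none of these names come from Qwen's actual interface):

```python
# Minimal agentic loop. `fake_model` and the TOOLS table are hypothetical
# stand-ins so the control flow is visible without a real API.

def fake_model(goal, history):
    """Stand-in for a model call: returns the next action as (tool, arg)."""
    if not history:
        return ("search", goal)
    if history[-1][0] == "search":
        return ("run_code", "print('hello')")
    return ("finish", history[-1][1])

TOOLS = {
    "search": lambda q: f"results for {q!r}",
    "run_code": lambda src: "hello",   # pretend we executed the snippet
}

def agent_loop(goal, max_steps=10):
    """Alternate model decisions and tool executions until 'finish'."""
    history = []
    for _ in range(max_steps):
        tool, arg = fake_model(goal, history)
        if tool == "finish":
            return arg                 # the model decided the task is done
        history.append((tool, TOOLS[tool](arg)))
    raise RuntimeError("step budget exhausted")

print(agent_loop("find docs on MoE routing"))  # prints: hello
```

The step budget matters in practice: an agent that never decides it's finished will happily burn tokens forever.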

The Numbers

On SWE-bench Verified, its score is competitive with Google's Gemma4-31B, a much larger dense model, and it beats its elder sibling Qwen3.5-35B-A3B by a decent margin. It's not the best out there, but it's solid for something you can run locally.

The model can understand images too, not just text. On tests measuring both vision and language skills, it matches or approaches Claude Sonnet 4.5, which is notable because Claude is a much larger and more expensive model. The context window (how much text or code you can feed it at once) is 256,000 tokens by default, roughly a 500-page book. With a few adjustments, you can push it past a million tokens.
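The "500-page book" figure checks out as a back-of-envelope estimate. The conversion factors below (~1.3 tokens per English word, ~400 words per page) are rough rules of thumb, not anything from Alibaba:

```python
# Back-of-envelope check of the "256K tokens ≈ 500-page book" claim.
# Both ratios are common rough assumptions for English prose.
context_tokens = 256_000
tokens_per_word = 1.3
words_per_page = 400

pages = context_tokens / tokens_per_word / words_per_page
print(f"~{pages:.0f} pages")
```

Code tokenizes less efficiently than prose, so for a codebase the effective "page count" is lower, but the order of magnitude holds.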

Where to Run It

The model weights are fully open and available on Hugging Face and ModelScope. You can also try it for free on Qwen Studio if you just want to poke around before committing to a 70GB download.

For API users, Alibaba is offering qwen3.6-flash, which includes a preserve_thinking parameter that keeps the model's chain-of-thought in the conversation history. Very useful for debugging or for multi-step agentic workflows where you want to see why it made a particular decision.
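If the API follows the OpenAI-compatible chat format that Alibaba's endpoints usually use, a request with `preserve_thinking` might look like the sketch below. The wire format and the exact placement of the flag are assumptions; check the official docs before relying on them:

```python
# Hypothetical request payload for qwen3.6-flash. The OpenAI-style
# "messages" format and top-level preserve_thinking flag are assumptions,
# not confirmed details of Alibaba's API.

def build_request(messages, preserve_thinking=True):
    """Assemble the JSON body for a chat completion call."""
    return {
        "model": "qwen3.6-flash",
        "messages": messages,
        # Flag from the article: keep chain-of-thought in the history
        # so later turns (or you, debugging) can see the reasoning.
        "preserve_thinking": preserve_thinking,
    }

payload = build_request([{"role": "user", "content": "Refactor this function."}])
print(payload["model"], payload["preserve_thinking"])
```

In a multi-step agent, you'd POST this body each turn and append both the reply and its retained reasoning to `messages` before the next call.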

Alibaba says more Qwen3.6 models are coming soon, including a flagship called Qwen3.6-Max. No word yet on whether those will also be open source. We'll see.

Tags

#ai #alibaba #coding #dev-digest #open-source #qwen
