Welcome to the first edition of MCP Bits! With OpenAI now on board, I think it is clear MCP will become the standard for LLM tool integration. I love newsletters, and it feels like a good time to start an MCP one to bring subscribers the latest bits from the MCP ecosystem. The goal of the newsletter is to "fine-tune" the latest news, content, and project updates from across the ecosystem.
It was a big week, from the release of a new spec version to OpenAI announcing support. I hope you enjoy reading MCP Bits.
Any feedback? Find MCP Bits on Bluesky.
News & Articles
Deep Dive Into the Future of AI Tooling
Soon, we will likely see specialized MCP clients emerge for business-centric tasks such as customer support, marketing copywriting, design, and image editing, as these fields closely align with AI's strengths in pattern recognition and creative tasks.
OpenAI Announces Support for MCP
The Differential for Modern APIs and Systems
MCP enables vastly more resilient system integrations by acting as a differential between APIs, absorbing changes and inconsistencies that would typically break traditional integrations. With "the prompt is the program," we can build more adaptable, intention-based integrations that focus on what needs to be done rather than how it's technically implemented.
MCP: Simply Explained in Five Minutes
So why was MCP introduced? It solves the problem of needing to manually integrate every API you want to use. If you've ever tried talking to third-party APIs, you know how much of a pain it is to deal with all the different API styles and patterns. MCP solves that by having providers expose one standard interface.
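That "one standard interface" is JSON-RPC 2.0: whatever API a server wraps underneath, clients talk to it with the same message shapes. A minimal sketch of what a `tools/call` request looks like on the wire (the tool name and arguments here are hypothetical):

```python
import json

# Every MCP interaction is a JSON-RPC 2.0 message. A client asking any
# server to invoke a tool sends the same "tools/call" shape, no matter
# which third-party API the server integrates behind the scenes.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",           # hypothetical tool name
        "arguments": {"city": "Berlin"}  # tool-specific arguments
    },
}

wire = json.dumps(request)
print(wire)
```

Because the envelope is identical across servers, a client written once can drive any number of integrations without bespoke glue code for each.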
Calling MCP Servers the Hard Way
Surprisingly, I was unable to find a simple guide to performing this task with the Internet’s own universal API client. Here are a few notes that I made to hopefully help out others who would like to explore this newly emerging protocol.
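Doing it "the hard way" means speaking the protocol by hand. Per the spec, the very first exchange is an `initialize` request followed by an `initialized` notification; a sketch of those two messages, built with nothing but the standard library (client name and protocol version are illustrative):

```python
import json

# The handshake every MCP client performs before anything else:
# first an "initialize" request, then an "initialized" notification.
initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "hard-way-client", "version": "0.1"},
    },
}

# Notifications carry no "id" -- the server sends no response to them.
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

for message in (initialize, initialized):
    print(json.dumps(message))
```

Only after this handshake can the client issue requests like `tools/list` or `tools/call`, which is exactly the sequence you would replay by hand with curl.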
Videos
I’m joined by Ras Mic to explain Model Context Protocol (MCP). Mic breaks down how MCPs essentially standardize how LLMs connect with external tools and services. While LLMs alone can only predict text, connecting them to tools makes them more capable, but this integration has been cumbersome. MCPs create a unified layer that translates between LLMs and services, making it easier to build more powerful AI assistants.
The Model Context Protocol (MCP) has everyone, myself included, fired up because of the way it standardizes connecting tools to LLMs. With it, creating powerful AI agents that can do practically anything for you has become MUCH more accessible.
Project Updates
Specification
The latest spec (2025-03-26) was finalized this week!
Key changes include the new Streamable HTTP transport (which replaces the HTTP+SSE transport from protocol version 2024-11-05) and a new authorization framework.
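With Streamable HTTP, the client POSTs JSON-RPC messages to a single endpoint and tells the server, via the Accept header, that it can take either a plain JSON reply or an SSE stream. A rough sketch of such a request, assembled with the standard library (the endpoint path and message are illustrative):

```python
import json

# Streamable HTTP: one endpoint, client POSTs a JSON-RPC message and
# advertises via Accept that the response may be plain JSON or an SSE
# stream -- the server picks whichever suits the operation.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}
body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})

print("POST /mcp HTTP/1.1")  # single endpoint, path is illustrative
for name, value in headers.items():
    print(f"{name}: {value}")
print()
print(body)
```

The big practical win over the old HTTP+SSE pairing is that simple request/response interactions no longer require holding a long-lived SSE connection open.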
SDKs
SDK developments are moving fast!
TypeScript SDK 1.8.0 released
Python SDK 1.6.0 released
C# SDK 0.1.0-preview2 released
Java SDK 0.8.0 released
Kotlin SDK 0.40 released
rmcp was merged as the official Rust SDK
Is there an official Go SDK coming soon?
Loopwork AI’s Swift SDK merged as the official Swift SDK
Server of the Week
I am starting with one server of the week but will most likely introduce clients too.
Playwright Server - Microsoft released a server that enables browser automation capabilities via Playwright. Simon Willison wrote a quick overview of the integration.
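Most MCP clients are pointed at a server through a JSON config entry. A sketch of what wiring up the Playwright server might look like (the package name and config location vary by client, so check the server's README for the exact incantation):

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```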