I am so excited to announce the Kubernetes Course 2025! It is something I wanted to bring to the community with project-based learning. So I created a small microservices application with a frontend, a game service (yes, you can play Tic-Tac-Toe, Snake, and Rock-Paper-Scissors), a JWT-based authentication service, and a Postgres database for storing the scores and login information. I have covered the most requested topics → CRI, CNI, CSI, kube-proxy, CoreDNS - the whole networking section is detailed out with demos. I really hope you enjoy learning Kubernetes with this course and that it makes things easier for you. The course features the latest Kubernetes version, 1.33, and I am super glad to have Exoscale partnering with us: they are offering $50 worth of FREE credits, no credit card required, so you can try out the entire course for FREE. You can have a production-ready experience at absolutely ZERO cost. IMO this is the best opportunity to learn Kubernetes for free in 2025.
Click here or on the image below to get access to the $50 credit on Exoscale.
There has been a lot happening otherwise, both in the new initiatives I am working on and in the regular AI world. I started a series on the vCluster YouTube channel called vCluster Fridays, where I sit with maintainers and learn about a CNCF project. We have already done one on Kyverno.
AI is moving so fast that many top enterprise leaders are saying it’s already doing 20–50% of the work, an impressive and somewhat concerning number for new hires or those preparing for jobs. While there's still time before AI reaches a level where it can replace certain fields entirely, we are definitely on that journey. The only way to stay ahead is by upskilling, integrating AI into your workflows, and learning to work better with AI.
Awesome Reads (yes, a lot of AI)
Docker MCP Catalog: Finding the Right AI Tools for Your Project - Docker’s new MCP Catalog and Toolkit make it dramatically easier to integrate AI agents with real-world tools by packaging over 100 verified MCP servers as secure, containerized services discoverable via Docker Desktop. With one-click installs, secure credential handling, and native support for LLM clients like Claude and Cursor, developers can now build agent-driven workflows that are portable, scalable, and production-ready, all from the familiar Docker interface.
Introducing Gateway API Inference Extension - The new Gateway API Inference Extension brings model-aware, intelligent routing to Kubernetes for GenAI and LLM workloads, solving challenges traditional load balancers can't handle—like long-running sessions, GPU saturation, and request prioritization. By introducing CRDs like InferencePool and InferenceModel, it enables efficient, scalable, and low-latency AI inference through Kubernetes-native tooling, backed by early benchmarks showing significant latency improvements at high QPS.
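To make the two CRDs concrete, here is a hypothetical sketch of what they look like: an InferencePool selecting the model-server pods and an InferenceModel mapping a model name onto that pool. Field names follow the extension's early alpha API and may differ from the current release, so treat this as illustrative only.

```yaml
# Illustrative sketch; check the extension's docs for the exact schema.
apiVersion: inference.networking.x-k8s.io/v1alpha2
kind: InferencePool
metadata:
  name: llm-pool
spec:
  selector:
    app: vllm-llama3        # model-server pods to route across
  targetPortNumber: 8000
  extensionRef:
    name: endpoint-picker   # the model-aware routing extension
---
apiVersion: inference.networking.x-k8s.io/v1alpha2
kind: InferenceModel
metadata:
  name: chat-model
spec:
  modelName: meta-llama/Llama-3-8B
  criticality: Critical     # drives request prioritization under GPU saturation
  poolRef:
    name: llm-pool
```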
Building agents with OpenAI and Cloudflare’s Agents SDK - Cloudflare has introduced an Agents SDK that provides a persistent, scalable execution layer for AI agents built with frameworks like OpenAI’s Agents SDK. By combining OpenAI's reasoning and planning capabilities with Cloudflare's global infrastructure and Durable Objects, developers can now build intelligent, stateful, and composable agents that live and operate at the edge — enabling advanced use cases like multi-agent systems, human-in-the-loop workflows, and real-world interfaces like phone and WebSocket connections.
Containers are available in public beta for simple, global, and programmable compute - Cloudflare has launched Containers in public beta for paid plan users, enabling developers to seamlessly deploy containerized applications alongside Workers with just a few lines of code—offering global distribution, automatic scaling, and integrated observability. This new feature bridges the gap between lightweight serverless and full container workloads, simplifying deployment workflows and expanding use cases like edge media processing, backend services in any language, and dynamic sandboxing.
MCP vs API - MCP (Model Context Protocol) is a purpose-built wire protocol for AI agents that enforces consistent, deterministic, and bidirectional communication—unlike traditional APIs like REST or OpenAPI, which were designed for human developers and often lead to brittle or error-prone integrations. With features like runtime discovery, local execution via stdio, and simplified tool schemas, MCP enables more reliable and scalable agent behavior, while still complementing existing APIs under the hood.
AI Agents in a Nutshell - AI agents, unlike traditional rule-based apps or chatbots, use LLMs to make autonomous decisions, invoke external tools, and adapt to real-time user preferences—making them ideal for personalized, outcome-driven applications like dynamic commute planners or support assistants. However, building production-grade agents comes with challenges in predictability, stability, and operations, which can be tackled using frameworks like ADK, protocols like MCP, and modular multi-agent architectures.
Build and Deploy a Remote MCP Server to Google Cloud Run in Under 10 Minutes - Google Cloud has published a step-by-step guide to deploy a remote Model Context Protocol (MCP) server to Cloud Run in under 10 minutes, making it easy to expose deterministic tools (like add/subtract) for LLM-based AI agents. By using FastMCP, Cloud Run, and streamable-http transport, developers can build scalable, authenticated, and shareable MCP servers with minimal setup—streamlining tool integration for smarter AI workflows.
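Stripped of the FastMCP and Cloud Run plumbing the guide uses, the core idea is just a name-to-function registry that answers tool-call requests. A stdlib-only sketch of that idea, with the message shapes simplified from MCP's real JSON-RPC protocol:

```python
import json

# Tool registry: roughly what FastMCP's @mcp.tool() decorator builds for you.
TOOLS = {}

def tool(fn):
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Add two numbers (a deterministic tool for an LLM agent)."""
    return a + b

@tool
def subtract(a: int, b: int) -> int:
    """Subtract b from a."""
    return a - b

def handle(request_json: str) -> str:
    # Simplified stand-in for MCP's JSON-RPC "tools/call" method.
    req = json.loads(request_json)
    fn = TOOLS[req["params"]["name"]]
    result = fn(**req["params"]["arguments"])
    return json.dumps({"id": req["id"], "result": result})

# e.g. an LLM client asking the server to run the "add" tool:
resp = handle(json.dumps({
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
}))
```

In the actual guide, FastMCP wraps this registry in the streamable-http transport and Cloud Run provides the scaling and authentication.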
Awesome Resources/Repos
A2A - An open protocol enabling communication and interoperability between opaque agentic applications.
Complete Guide to Build and Deploy an AI Agent with Docker Containers and Python
If you like the resources shared in the newsletter, please do share it with your network and subscribe for FREE.
Do not forget to check out the Kubernetes Course!
Hi Saiyam, is it possible to share your Excalidraw drawings?