The New Stack Podcast

Latest episode

372 episodes

  • The New Stack Podcast

    How AWS Bedrock is shaping Model Context Protocol

    22/04/2026 | 31min
    At the MCP Summit in New York City, AWS’s Luca Chang, a Bedrock team member and MCP specification maintainer, discussed the rapid rise of the Model Context Protocol (MCP) as a standard for connecting AI models and agents to tools and data. He explained that MCP’s development is shaped by a diverse group of maintainers who collaboratively prioritize features, balancing major challenges with smaller enhancements that can unlock creative new capabilities. This breadth of perspectives prevents groupthink but makes prioritization difficult, as many ideas compete for limited bandwidth.

    Chang highlighted the role of large organizations like Amazon in advancing open source projects. AWS contributions such as Tasks and Elicitations emerged from internal efforts to map cloud services to MCP, revealing gaps in the protocol. Rather than contributing for speed, AWS focuses on real customer use cases, contributing only when clear needs arise. Chang also noted growing demand for MCP servers, while expressing caution about overly specialized, agent-specific implementations that could limit broader interoperability.
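    As a concrete illustration of what MCP standardizes: a client asks a server to run a tool with a JSON-RPC 2.0 "tools/call" request. The sketch below builds such a message in Python; the tool name and arguments are invented for illustration, not taken from any AWS server.

```python
import json

# MCP messages are JSON-RPC 2.0. A client invokes a server-side tool with a
# "tools/call" request carrying the tool name and its arguments.
# The "get_weather" tool and its "city" argument below are hypothetical.
def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

message = make_tool_call(1, "get_weather", {"city": "New York"})
print(message)
```

    Because every tool call crosses the wire in this one shape, any MCP client can drive any MCP server — which is the interoperability Chang worries overly agent-specific servers would erode.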

    Learn more from The New Stack about the Model Context Protocol (MCP) becoming a standard for connecting AI models and agents to tools and data:

    Model Context Protocol: A Primer for the Developers

    Beyond the vibe code: The steep mountain MCP must climb to reach production

    https://thenewstack.io/model-context-protocol-evolution/

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
  • The New Stack Podcast

    Why Microsoft is betting on temporary identities to stop autonomous agents from going rogue

    21/04/2026 | 24min
    At KubeCon Europe 2026, Jorge Palma outlined how Microsoft is advancing AI operations across cloud and edge environments. He demonstrated an agent capable of diagnosing, mitigating, and explaining application issues in minutes, highlighting the growing role of agentic operations in Kubernetes.

    Palma emphasized that recent progress in tools like Azure Kubernetes Service and Azure Arc has made edge AI more practical by bridging cloud and on-prem systems. Kubernetes now acts as the unifying layer, while fleet management automates deployments that previously required manual GitOps workflows.

    To address fragmentation in inference engines, Microsoft introduced AI Runway, a standardized API that allows teams to swap underlying engines without changing workflows.

    Security remains a core challenge. Palma advocates for tightly scoped, temporary permissions and policy validation for agents, enforced through tools like the Agent Governance Toolkit. This reflects a broader shift: applying cloud-native principles—portability, abstraction, and policy control—to manage the unpredictable nature of AI workloads.

    Learn more from The New Stack about the latest in advancing AI operations across cloud and edge environments:

    The Future of AI: Hybrid Edge Deployments Are Indispensable

    AI Is Coming to the Edge, but It Will Look Different

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
  • The New Stack Podcast

    As agentic AI explodes, Amazon doubles down on MCP

    16/04/2026 | 24min
    At the MCP Summit in New York City, Clare Liguori of Amazon Web Services discussed the rapid rise of the Model Context Protocol (MCP), now a leading way to connect AI agents with tools and data. Originally developed by Anthropic and later transferred to the Linux Foundation, MCP has seen surging enterprise adoption as agentic AI expands.

    Liguori highlighted her dual role shaping MCP’s evolving specification, including work on integrating webhooks, events, and notifications to support always-on AI agents. AWS has actively contributed features like Tasks and Elicitations and offers managed MCP servers, positioning itself as both contributor and experimental platform for emerging capabilities.

    This collaboration illustrates how corporate involvement can accelerate open-source innovation and adoption. Looking ahead, MCP’s role as connective infrastructure for AI agents is expected to grow, especially as tools become more accessible. With broader adoption of AI development platforms across non-engineering roles, MCP could help extend automation beyond tech teams to businesses of all sizes.

    Learn more from The New Stack about the latest around the Model Context Protocol (MCP):

    MCP: The Missing Link Between AI Agents and APIs

    Beyond the vibe code: The steep mountain MCP must climb to reach production

    MCP is everywhere, but don’t panic. Here’s why your existing APIs still matter.

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
  • The New Stack Podcast

    A year in, Google wants its Axion processors to feel like a scheduling decision

    15/04/2026 | 22min
    At KubeCon Europe, Google Cloud’s Jago Macleod and Abdel Sghiouar argued that adopting Arm for Kubernetes workloads has shifted from a complex migration to a practical, low-friction choice. After a year of production use, Google’s custom Arm-based Axion processors—powering C4A and N4A instances—are positioned as broadly viable for most containerized applications, offering strong gains in performance, cost efficiency, and energy usage compared to x86.

    Rather than requiring a full overhaul, moving to Arm typically involves recompiling containers for a multi-architecture target and gradually rolling out via Kubernetes practices like canary deployments. While edge cases exist, they are relatively uncommon.

    A key enabler is GKE’s compute classes, which allow workloads to express preferences across VM types, turning infrastructure decisions into automated scheduling choices rather than manual provisioning.

    Ultimately, the conversation points to a larger constraint: energy. As AI workloads grow, efficiency—measured in “tokens per watt”—is emerging as the defining metric, with cost savings translating directly into greater compute capacity.

    Learn more from The New Stack about the latest developments around Google’s work with Axion: 

    Arm: See a Demo About Migrating an x86-Based App to ARM64

    Do All Your AI Workloads Actually Require Expensive GPUs? 
    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
  • The New Stack Podcast

    Can you make Kubernetes invisible? Here's why AWS is on a mission to do it.

    14/04/2026 | 23min
    In this episode of The New Stack Makers, Jesse Butler, principal product manager for AWS Elastic Kubernetes Service, shares his vision for simplifying cloud-native computing. Since joining AWS in 2020, Butler has focused on making Kubernetes easier to use, emphasizing open-source as a democratizing force. He highlights the role of the Cloud Native Computing Foundation (CNCF) in standardizing and governing open ecosystems while balancing community-driven innovation with commercial contributions.

    Butler describes Kubernetes as widely adopted—used in production by around 80% of enterprises—yet still overly complex. His goal is to make it “invisible,” much like Linux, by abstracting and consolidating services. He points to projects like Karpenter, which enables real-time node provisioning for efficient scaling; Kro, which simplifies resource orchestration; and Cedar, a flexible policy engine for fine-grained authorization.

    He underscores the importance of open-source contributors, noting their critical yet often underappreciated role. Looking ahead, Butler envisions a future where automation and human collaboration further enhance usability and innovation in open-source software.

    Learn more from The New Stack about the latest around AWS Elastic Kubernetes Service:

    2026 Will Be the Year of Agentic Workloads in Production on Amazon EKS

    Amazon EKS Auto Mode wants to end Kubernetes toil — one node at a time

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.


About The New Stack Podcast

The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software. For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack