
The New Stack Podcast

Latest episode

373 episodes

  • Jim Bugwadia on why finding a Kubernetes problem is only half the battle for Kyverno users

    23/04/2026 | 23min
    Graduating within the CNCF marks a major milestone for an open source project, signaling not just technical maturity but strong governance, security practices, and widespread adoption. Kyverno, a Kubernetes policy engine, reached this stage after five years — becoming only the 35th project to progress from sandbox to graduation. As co-founder Jim Bugwadia explains, incubation reflects production readiness and adoption, while graduation validates the project’s long-term sustainability and governance rigor.

    Originally built to help teams manage Kubernetes complexity through declarative policies, Kyverno has evolved alongside the ecosystem. Its shift to the Kubernetes-native Common Expression Language (CEL) and rising demand driven by AI workloads have expanded its user base beyond regulated industries to mainstream enterprises. With over three billion downloads, it underscores the growing need for automated policy enforcement across development, security, and operations teams.
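The declarative, CEL-based policies described above might look like the following sketch of a Kyverno `ClusterPolicy` (the policy name, label key, and matched resource kind are illustrative assumptions, not from the episode):

```yaml
apiVersion: kyverno.io/v1
kind: ClusterPolicy
metadata:
  name: require-team-label   # illustrative policy name
spec:
  validationFailureAction: Enforce
  rules:
    - name: check-team-label
      match:
        any:
          - resources:
              kinds:
                - Deployment   # illustrative: apply only to Deployments
      validate:
        cel:
          expressions:
            # CEL expression evaluated against the admitted object
            - expression: >-
                has(object.metadata.labels) &&
                'team' in object.metadata.labels
              message: "Deployments must carry a 'team' label."
```

Because the rule is expressed in Kubernetes-native CEL rather than a custom DSL, the same expression style carries over to the API server's built-in ValidatingAdmissionPolicy.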

    Commercially, Nirmata maintains a clear boundary between open source and enterprise offerings, focusing on remediation and advanced management. While only 2–5% of users convert, that small percentage becomes meaningful at Kyverno’s scale.

Learn more from The New Stack about the latest on Kyverno:

    Simplify Kubernetes Security With Kyverno and OPA Gatekeeper

    Using the Kyverno CLI to Write Policy Test Cases

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
  • How AWS Bedrock is shaping Model Context Protocol

    22/04/2026 | 31min
    At the MCP Summit in New York City, AWS’s Luca Chang, a Bedrock team member and MCP specification maintainer, discussed the rapid rise of the Model Context Protocol (MCP) as a standard for connecting AI models and agents to tools and data. He explained that MCP’s development is shaped by a diverse group of maintainers who collaboratively prioritize features, balancing major challenges with smaller enhancements that can unlock creative new capabilities. This breadth of perspectives prevents groupthink but makes prioritization difficult, as many ideas compete for limited bandwidth.

    Chang highlighted the role of large organizations like Amazon in advancing open source projects. AWS contributions such as Tasks and Elicitations emerged from internal efforts to map cloud services to MCP, revealing gaps in the protocol. Rather than contributing for speed, AWS focuses on real customer use cases, contributing only when clear needs arise. Chang also noted growing demand for MCP servers, while expressing caution about overly specialized, agent-specific implementations that could limit broader interoperability.

Learn more from The New Stack about the Model Context Protocol (MCP) becoming a standard for connecting AI models and agents to tools and data:

    Model Context Protocol: A Primer for the Developers

    Beyond the vibe code: The steep mountain MCP must climb to reach production

    https://thenewstack.io/model-context-protocol-evolution/

  • Why Microsoft is betting on temporary identities to stop autonomous agents from going rogue

    21/04/2026 | 24min
At KubeCon Europe 2026, Jorge Palma outlined how Microsoft is advancing AI operations across cloud and edge environments. He demonstrated an agent capable of diagnosing, mitigating, and explaining application issues in minutes, highlighting the growing role of agentic operations in Kubernetes.

Palma emphasized that recent progress in tools like Azure Kubernetes Service and Azure Arc has made edge AI more practical by bridging cloud and on-prem systems. Kubernetes now acts as the unifying layer, while fleet management automates deployments that previously required manual GitOps workflows.

To address fragmentation in inference engines, Microsoft introduced AI Runway, a standardized API that allows teams to swap underlying engines without changing workflows.

    Security remains a core challenge. Palma advocates for tightly scoped, temporary permissions and policy validation for agents, enforced through tools like the Agent Governance Toolkit. This reflects a broader shift: applying cloud-native principles—portability, abstraction, and policy control—to manage the unpredictable nature of AI workloads.

Learn more from The New Stack about advancing AI operations across cloud and edge environments:

    The Future of AI: Hybrid Edge Deployments Are Indispensable

    AI Is Coming to the Edge, but It Will Look Different

  • As agentic AI explodes, Amazon doubles down on MCP

    16/04/2026 | 24min
At the MCP Summit in New York City, Clare Liguori of Amazon Web Services discussed the rapid rise of the Model Context Protocol (MCP), now a leading way to connect AI agents with tools and data. Originally developed by Anthropic and later transferred to the Linux Foundation, MCP has seen surging enterprise adoption as agentic AI expands.

    Liguori highlighted her dual role shaping MCP’s evolving specification, including work on integrating webhooks, events, and notifications to support always-on AI agents. AWS has actively contributed features like Tasks and Elicitations and offers managed MCP servers, positioning itself as both contributor and experimental platform for emerging capabilities.

    This collaboration illustrates how corporate involvement can accelerate open-source innovation and adoption. Looking ahead, MCP’s role as connective infrastructure for AI agents is expected to grow, especially as tools become more accessible. With broader adoption of AI development platforms across non-engineering roles, MCP could help extend automation beyond tech teams to businesses of all sizes.

Learn more from The New Stack about the Model Context Protocol (MCP):

    MCP: The Missing Link Between AI Agents and APIs

    Beyond the vibe code: The steep mountain MCP must climb to reach production

    MCP is everywhere, but don’t panic. Here’s why your existing APIs still matter.

  • A year in, Google wants its Axion processors to feel like a scheduling decision

    15/04/2026 | 22min
    At KubeCon Europe, Google Cloud’s Jago Macleod and Abdel Sghiouar argued that adopting Arm for Kubernetes workloads has shifted from a complex migration to a practical, low-friction choice. After a year of production use, Google’s custom Arm-based Axion processors—powering C4A and N4A instances—are positioned as broadly viable for most containerized applications, offering strong gains in performance, cost efficiency, and energy usage compared to x86.

    Rather than requiring a full overhaul, moving to Arm typically involves recompiling containers for a multi-architecture target and gradually rolling out via Kubernetes practices like canary deployments. While edge cases exist, they are relatively uncommon.

    A key enabler is GKE’s compute classes, which allow workloads to express preferences across VM types, turning infrastructure decisions into automated scheduling choices rather than manual provisioning.
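A GKE custom compute class expressing that kind of preference might be sketched as follows (the class name and exact machine families chosen are illustrative assumptions; C4A is the Axion-based family mentioned in the episode):

```yaml
apiVersion: cloud.google.com/v1
kind: ComputeClass
metadata:
  name: arm-first            # illustrative class name
spec:
  priorities:
    - machineFamily: c4a     # prefer Axion Arm instances
    - machineFamily: n4      # x86 fallback if Arm capacity is unavailable
  whenUnsatisfiable: ScaleUpAnyway
  nodePoolAutoCreation:
    enabled: true
```

A workload opts in with a node selector such as `cloud.google.com/compute-class: arm-first`, turning the Arm-versus-x86 decision into a scheduler-evaluated preference list rather than a manual provisioning step.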

    Ultimately, the conversation points to a larger constraint: energy. As AI workloads grow, efficiency—measured in “tokens per watt”—is emerging as the defining metric, with cost savings translating directly into greater compute capacity.

    Learn more from The New Stack about the latest developments around Google’s work with Axion: 

    Arm: See a Demo About Migrating a x86-Based App to ARM64 

    Do All Your AI Workloads Actually Require Expensive GPUs? 


About The New Stack Podcast

The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software. For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack