The rise of LLM applications creates new infrastructure challenges around GPU resources, model serving, and cost management. This workshop demonstrates how to extend your internal developer platform (IDP) to give AI application teams self-service infrastructure for AI-powered applications and agents. You'll learn to create templates for model serving, implement cost controls for GPU resources, and provide integration patterns for common use cases.
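As a taste of the cost-control theme, here is a minimal sketch of the kind of guardrail a platform team might put in front of self-service GPU requests: reject any request whose projected monthly spend exceeds the team's budget. All names, instance types, and hourly rates below are illustrative assumptions, not part of the workshop material.

```typescript
// Hypothetical hourly on-demand rates (USD); real platforms would pull
// these from a pricing API or an internal cost catalog.
const HOURLY_RATE_USD: Record<string, number> = {
  "g5.xlarge": 1.01,
  "p4d.24xlarge": 32.77,
};

const HOURS_PER_MONTH = 730; // common cloud-billing approximation

// Returns true if the requested GPU capacity fits within the team's
// monthly budget; throws on instance types the catalog doesn't know.
function withinBudget(
  instanceType: string,
  count: number,
  monthlyBudgetUsd: number,
): boolean {
  const rate = HOURLY_RATE_USD[instanceType];
  if (rate === undefined) {
    throw new Error(`Unknown instance type: ${instanceType}`);
  }
  return rate * count * HOURS_PER_MONTH <= monthlyBudgetUsd;
}
```

A self-service template could run a check like this before provisioning, turning a budget policy into an automatic gate rather than a manual review.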
AGENDA
How to create self-service templates for LLM application infrastructure and GPU resources
Security patterns for handling sensitive data in AI-powered applications and API key management
ADDITIONAL INFO
When:
Tuesday, September 9, 2025 · 11:00 a.m. Central Time (US & Canada)
Pulumi’s mission is to democratize the cloud for every engineer.
Its open-source Infrastructure as Code tool lets engineers build infrastructure intuitively on any cloud in the programming language they already use: TypeScript, JavaScript, Python, Go, C#, Java, or YAML.