Runtime and language support
Upsun supports multiple runtimes through container-based deployments, including Python, Node.js, PHP, Ruby, Go, Java, and more. For the complete list of supported runtimes and their versions, see the runtime types reference. Configure your runtime in the `.upsun/config.yaml` file.
The platform builds your application in a container with the specified runtime
and dependencies.
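As a minimal sketch (the application name and Python version below are illustrative), the runtime is selected with the `type` key in `.upsun/config.yaml`:

```yaml
applications:
  agent:                  # application name (example)
    type: "python:3.11"   # any supported runtime:version pair
```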
LLM API integration
You can integrate your AI agent with any LLM API that your chosen runtime supports:
- OpenAI API: Use the official OpenAI client libraries for Python, Node.js, and other languages
- Anthropic Claude API: Use the Anthropic client libraries for Python, Node.js, and other languages
- Google Gemini: Use the Google AI client libraries for Python, Node.js, and other languages
- Azure OpenAI: Use the Azure OpenAI client libraries for Python, Node.js, and other languages
- AWS Bedrock: Use the AWS Bedrock client libraries for Python, Node.js, and other languages
- Other providers: Integrate with any API that provides HTTP endpoints
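Since every provider in the list above ultimately exposes HTTP endpoints, a provider-agnostic sketch is to build the request yourself. The snippet below constructs an OpenAI-style chat-completion request with the standard library; the model name, endpoint, and `OPENAI_API_KEY` variable are illustrative, not Upsun-specific:

```python
import json
import os
import urllib.request


def build_chat_request(prompt: str,
                       model: str = "gpt-4o-mini",
                       endpoint: str = "https://api.openai.com/v1/chat/completions"):
    """Build an HTTP request for an OpenAI-style chat endpoint.

    The model and endpoint are examples; substitute whatever your
    provider documents. The API key is read from the environment
    rather than hard-coded.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
        method="POST",
    )


req = build_chat_request("Summarize this deployment guide.")
print(req.get_method())                 # POST
print(json.loads(req.data)["model"])    # gpt-4o-mini
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) requires a valid key; official client libraries wrap this same pattern with retries and typed responses.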
Environment management
Upsun provides isolated environments for development, testing, and production:
- Branch-based environments: Each Git branch creates a separate environment
- Data isolation: Each environment has its own services and data
- Easy cloning: Clone production data to development environments for testing
- Environment variables: Store API keys and configuration securely using variables
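Variables created with the `env:` prefix are exposed to the running container as ordinary environment variables, so application code reads them the usual way. A small sketch (the variable name is an example) that fails fast at startup if a required key is missing:

```python
import os


def require_variable(name: str) -> str:
    """Fetch a required configuration value, failing fast at startup
    instead of at the first API call."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set for this environment")
    return value


# Example usage: populated by Upsun when the variable was created
# with the `env:` prefix, e.g. env:OPENAI_API_KEY.
# api_key = require_variable("OPENAI_API_KEY")
```

Because each environment has its own variables, a development branch can point at a sandbox key while production uses the live one.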
Tutorials
- Deploy a RAG-based conversational agent with Chainlit: Build a Retrieval-Augmented Generation agent using Chainlit, llama_index, and OpenAI, then deploy it on Upsun. See the Chainlit deployment tutorial.
- Access Documentation contextually via Context7 + MCP: Use the Model Context Protocol to let AI assistants fetch your Upsun documentation in real-time. See the Context7 MCP article.
- Use the Upsun API to automate Agent deployment: Automate deployments, environment management, and configuration through the Upsun API. See the API usage guide.
Configuration example
Here’s a basic configuration for a Python AI agent. For more configuration options, see the complete application reference.
Application Code
For examples of how to implement AI agents with different frameworks and APIs, see the AI and Machine Learning tutorials on DevCenter.
Deploy your Agent
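As an illustrative sketch of such a configuration (the application name, Python version, and start command are assumptions; consult the application reference for the exact schema), a minimal `.upsun/config.yaml` might look like:

```yaml
applications:
  agent:                          # application name (example)
    type: "python:3.11"           # runtime and version (example)
    hooks:
      build: |
        pip install -r requirements.txt
    web:
      commands:
        # Hypothetical entry point; replace with your framework's server.
        start: "python app.py"

routes:
  "https://{default}/":
    type: upstream
    upstream: "agent:http"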
1. Add your code to Git.
2. Set your OpenAI API key as an environment variable using the [CLI](/docs/administration/cli). For more information about setting variables, see [how to set variables](/docs/development/variables/set-variables).
3. Deploy to Upsun.
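The deployment steps above can be sketched from the command line. The `upsun` CLI commands are shown as comments because they require an installed, authenticated CLI and an existing project; the variable name and file names are examples:

```shell
# 1. Commit your application code to Git.
git init -q my-agent && cd my-agent
echo "print('agent placeholder')" > app.py
git add app.py
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Add agent application code"

# 2. Store the API key as an environment variable (example name).
#    With the Upsun CLI this is roughly:
#      upsun variable:create --level environment \
#          --name env:OPENAI_API_KEY --value "sk-..." --sensitive true
# 3. Push the branch to deploy:
#      upsun push

git log --oneline   # shows the commit created in step 1
```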
Key benefits
- Runtime flexibility: Choose the programming language and version that fits your needs
- Service independence: Use any LLM API or external service
- Environment isolation: Test changes safely in separate environments
- Automated deployment: Deploy through Git pushes or API calls
- Scalability: The platform handles load balancing and resource allocation