Pods provide instant access to powerful GPU and CPU resources for AI training, machine learning, rendering, and other compute-intensive workloads. You have full control over your computing environment, allowing you to customize software, storage, and networking to match your exact requirements.

Get started

Quickstart

Create an account and deploy your first Pod.

Choose a Pod

Select the right GPU type and configuration for your workload.

Connect to your Pod

Access your Pod via SSH, JupyterLab, or VS Code.

Concepts

Templates

Pre-configured Docker image setups that let you quickly spin up Pods without manual environment configuration. Instead of installing PyTorch, configuring JupyterLab, and setting up all dependencies yourself, you can select an official Runpod PyTorch template and have everything ready to go instantly.

Storage

Pods offer three types of storage: a container volume for temporary files, a disk volume for persistent storage throughout the Pod's lease, and optional network volumes for permanent storage that can be transferred between Pods.
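In practice, which storage tier a file lands on is determined by its path. As a sketch (assuming the default disk volume mount point of `/workspace`; these commands only make sense inside a running Pod):

```shell
# Files written to the disk volume (mounted at /workspace by default)
# persist across Pod restarts for the life of the Pod.
echo "kept across restarts" > /workspace/notes.txt

# Files written anywhere else live on the container volume
# and are lost when the Pod restarts.
echo "gone after restart" > /tmp/scratch.txt
```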

Connection

Once deployed, you can connect to your Pod through SSH for command-line access, web proxy for exposed web services, JupyterLab for data science workflows, or VS Code/Cursor for local IDE integration.
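The SSH path can be sketched as follows. The host, port, and key path below are placeholders, not real values; copy the actual connection command from your Pod's Connect menu in the Runpod console:

```shell
# Placeholder values — substitute the IP, port, and key path shown
# in your Pod's Connect menu.
ssh root@<pod-public-ip> -p <ssh-port> -i ~/.ssh/id_ed25519
```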

Deployment options

You can deploy Pods in several ways: through the Runpod web console, with the Runpod CLI, via the REST API, or programmatically using one of the Runpod SDKs.

Pod types

Runpod offers two cloud options:
  • Secure Cloud: Operates in T3/T4 data centers, providing high reliability and security for enterprise and production workloads.
  • Community Cloud: Connects individual compute providers to users through a vetted, secure peer-to-peer system, with competitive pricing options.

Pricing

Pods are billed by the minute with no fees for ingress/egress. Runpod also offers long-term savings plans for extended usage patterns. See Pod pricing for details.
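Per-minute billing means you pay only for the minutes a Pod actually runs. A minimal sketch of the arithmetic, assuming a hypothetical hourly rate (real rates vary by GPU type and cloud option; see the Pod pricing page):

```python
# Per-minute billing sketch. The hourly rate below is a hypothetical
# example, not an actual Runpod price.
HOURLY_RATE_USD = 0.69

def pod_cost(minutes: int, hourly_rate: float = HOURLY_RATE_USD) -> float:
    """Cost of running a Pod for `minutes`, billed by the minute."""
    per_minute = hourly_rate / 60
    return round(minutes * per_minute, 4)

print(pod_cost(90))  # 90 minutes at the example rate → 1.035
```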

Limitations

  • Docker Compose is not supported: Runpod runs Docker for you, so you cannot spin up your own Docker instance or use Docker Compose on Pods.
  • UDP connections are not supported: Pods support only TCP-based connections, including HTTP.
  • Windows is not supported: Pods do not currently support Windows.

Tutorials

Run Ollama on a Pod

Run LLM inference with HTTP API access.

Build Docker images with Bazel

Emulate a Docker-in-Docker workflow.

Create a custom template

Build your own reusable Pod template.