Overview

Providers contribute GPU resources to the OpenGPU Network and earn $OGPU for completed tasks.

What You Can Do

  • Monetize idle GPU capacity

  • Earn passive income from AI inference tasks

  • Join and leave the network freely

  • Scale from a single GPU to a data center

How It Works

  1. Install Provider App — Set up the desktop application

  2. Register for Sources — Choose which execution environments to support

  3. Execute Tasks — Your GPU runs tasks automatically

  4. Get Paid — Earn $OGPU for each completed task
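The four steps above can be sketched as a simple task loop. Every function name here is a hypothetical illustration of the flow, not the Provider App's actual API:

```python
# Hypothetical sketch of the provider lifecycle described above.
# None of these names come from the actual Provider App API.

def fetch_task(sources):
    """Pretend network call: return the next task matching our registered sources."""
    return {"id": "task-1", "source": sources[0], "payload": "..."}

def run_on_gpu(task):
    """Pretend GPU execution: return a result for the task."""
    return f"result-for-{task['id']}"

def submit_result(task, result):
    """Pretend payout: each completed task earns some $OGPU (placeholder amount)."""
    return 1.0

def provider_loop(sources, max_tasks=1):
    earned = 0.0
    for _ in range(max_tasks):
        task = fetch_task(sources)             # step 2: sources filter which tasks you get
        if task is None:
            break
        result = run_on_gpu(task)              # step 3: execute automatically
        earned += submit_result(task, result)  # step 4: paid in $OGPU per task
    return earned

print(provider_loop(["pytorch"]))  # one simulated task -> 1.0
```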

Requirements

Hardware

| Spec | Minimum | Recommended |
| --- | --- | --- |
| GPU VRAM | Any | 24GB+ |
| CPU | 64-bit with virtualization | 8+ cores |
| RAM | 4GB | 32GB+ |
| Storage | 20GB | 256GB+ NVMe |
| Network | Stable connection | 1 Gbps |

Minimum = Docker's own requirements to run the Provider App. Recommended = enough headroom to handle moderate workloads (e.g. 20B-parameter models).

Actual task requirements vary by source. Register for sources that match your hardware — OpenGPU supports all GPU types and configurations.
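A quick way to sanity-check a machine against the minimum column is a small script like the one below. The thresholds mirror the table; the RAM check relies on Linux/macOS `sysconf` values and is not portable to Windows:

```python
import os
import shutil

# Thresholds from the "Minimum" column of the hardware table.
MIN_RAM_GB = 4
MIN_DISK_GB = 20

def check_minimums(path="/"):
    """Return which minimum hardware requirements this machine meets."""
    cores = os.cpu_count() or 0
    # Total physical RAM via sysconf (Linux/macOS only).
    ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9
    # Free space on the volume that would hold the Provider App.
    disk_gb = shutil.disk_usage(path).free / 1e9
    return {
        "cpu_ok": cores >= 1,
        "ram_ok": ram_gb >= MIN_RAM_GB,
        "disk_ok": disk_gb >= MIN_DISK_GB,
    }

print(check_minimums())
```

GPU VRAM is easiest to check separately with `nvidia-smi` (on NVIDIA hardware), since Python's standard library has no GPU query.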

Tools

| Tool | Purpose |
| --- | --- |
| Provider App | Desktop app to manage your provider |
|  | Monitor earnings, performance, registrations |

Economics

  • Task payments — Earn $OGPU per completed task

  • Staking rewards — Additional earnings for staked providers

  • Reputation — Higher reputation increases visibility and earning potential
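Put together, the three streams above combine roughly as task income plus staking rewards, with reputation acting as a boost. The function below is an illustrative simplification with placeholder numbers, not the network's actual payout formula:

```python
def estimate_earnings(tasks_completed, pay_per_task, staking_reward,
                      reputation_multiplier=1.0):
    """Illustrative only: all rates are placeholders.

    Real $OGPU payments vary by task and source, and reputation is
    modeled here as a simple multiplier on task income.
    """
    task_income = tasks_completed * pay_per_task * reputation_multiplier
    return task_income + staking_reward

# 100 tasks at 0.5 $OGPU each, plus 10 $OGPU in staking rewards.
print(estimate_earnings(100, 0.5, 10.0))  # -> 60.0
```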

AI Agents

Providers can run AI agents that automatically bid on tasks and optimize earnings. Current agent types include greedy and periodic strategies. See AI Agents for details.
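The two strategy names can be illustrated with a toy bidding loop. The logic below is a plain reading of "greedy" and "periodic", not the network's actual agent code:

```python
def greedy_bids(tasks):
    """Greedy (as I read it): bid on everything, highest payment first."""
    return sorted(tasks, key=lambda t: t["payment"], reverse=True)

def periodic_bids(tasks, tick, every=3):
    """Periodic (as I read it): only bid on ticks that fall on the schedule."""
    return list(tasks) if tick % every == 0 else []

tasks = [{"id": "a", "payment": 2.0}, {"id": "b", "payment": 5.0}]
print([t["id"] for t in greedy_bids(tasks)])   # ['b', 'a']
print(len(periodic_bids(tasks, tick=3)))       # 2 (tick is on schedule)
print(len(periodic_bids(tasks, tick=1)))       # 0 (tick is off schedule)
```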
