Basic Usage

Getting started with the Responses API Beta
Beta API

This API is in beta and may introduce breaking changes.

The Responses API Beta supports both simple string input and structured message arrays, making it easy to get started with basic text generation.

Simple String Input

The simplest way to use the API is with a string input:
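A minimal sketch of such a request body, assuming the field names from the parameter table below (the model name is the example value from that table):

```python
import json

# Minimal request body: just a model and a plain string input.
payload = {
    "model": "openai/o4-mini",
    "input": "What is the capital of France?",
}

print(json.dumps(payload, indent=2))
```

This body would then be POSTed to the Responses endpoint with your API key in the Authorization header.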

Structured Message Input

For more complex conversations, use the message array format:
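A hedged sketch of the same request with a message array instead of a string. The exact message shape (plain role/content pairs) is an assumption; consult the API reference for the canonical schema:

```python
# The input field accepts an array of role-tagged messages instead of
# a bare string. Role names and the content field are assumed here.
payload = {
    "model": "openai/o4-mini",
    "input": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain recursion in one sentence."},
    ],
}
```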

Response Format

The API returns a structured response with the generated content:
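An illustrative sketch of extracting the generated text from a response. The field names (id, status, an output list of message items with output_text parts) are assumptions modeled on the Responses API style, not confirmed by this page:

```python
# Illustrative response shape; field names are an assumption.
response = {
    "id": "resp_abc123",
    "status": "completed",
    "output": [
        {
            "type": "message",
            "role": "assistant",
            "content": [{"type": "output_text", "text": "Paris."}],
        }
    ],
}

# Pull the generated text out of the first message item.
text = response["output"][0]["content"][0]["text"]
print(text)  # -> Paris.
```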

Streaming Responses

Enable streaming for real-time response generation:
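Enabling streaming is just a matter of setting the stream flag on the request body. The sketch below shows the payload; the commented HTTP call uses placeholder names (BASE_URL, API_KEY) that are assumptions, not values from this document:

```python
# Same request shape as before, with streaming enabled.
payload = {
    "model": "openai/o4-mini",
    "input": "Write a haiku about the sea.",
    "stream": True,
}

# Sending it with the `requests` library (endpoint and auth are placeholders):
# import requests
# with requests.post(f"{BASE_URL}/responses", json=payload, stream=True,
#                    headers={"Authorization": f"Bearer {API_KEY}"}) as resp:
#     for line in resp.iter_lines(decode_unicode=True):
#         ...  # each line is an SSE chunk; see the next section
```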

Example Streaming Output

The streaming response returns Server-Sent Events (SSE) chunks:
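A hedged sketch of consuming those SSE chunks: each event line has the form `data: <json>` and the stream ends with `data: [DONE]`. The event type and delta field names used here are assumptions; the parsing pattern itself is standard SSE handling:

```python
import json

# Simulated SSE lines as they would arrive from the stream.
raw_events = [
    'data: {"type": "response.output_text.delta", "delta": "Hello"}',
    'data: {"type": "response.output_text.delta", "delta": ", world"}',
    "data: [DONE]",
]

pieces = []
for line in raw_events:
    if not line.startswith("data: "):
        continue  # skip comments and keep-alive lines
    data = line[len("data: "):]
    if data == "[DONE]":
        break  # end-of-stream sentinel
    event = json.loads(data)
    if event.get("type") == "response.output_text.delta":
        pieces.append(event["delta"])

print("".join(pieces))  # -> Hello, world
```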

Common Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| model | string | Required. Model to use (e.g., openai/o4-mini) |
| input | string or array | Required. Text or message array |
| stream | boolean | Enable streaming responses (default: false) |
| max_output_tokens | integer | Maximum number of tokens to generate |
| temperature | number | Sampling temperature (0-2) |
| top_p | number | Nucleus sampling parameter (0-1) |
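The parameters above can be combined in a single request body. A sketch, using illustrative values:

```python
# A request exercising the common optional parameters.
payload = {
    "model": "openai/o4-mini",
    "input": "Summarize the plot of Hamlet.",
    "max_output_tokens": 200,   # cap the response length
    "temperature": 0.7,         # 0-2; higher is more random
    "top_p": 0.9,               # 0-1; nucleus sampling cutoff
    "stream": False,
}
```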

Error Handling

Handle common errors gracefully:
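A hedged sketch of dispatching on common HTTP error statuses. The error body shape (an error object with a message field) is an assumption; the status-code semantics are standard HTTP:

```python
# Map common HTTP error statuses to a human-readable action.
def describe_error(status_code: int, body: dict) -> str:
    # The {"error": {"message": ...}} shape is an assumption.
    message = body.get("error", {}).get("message", "unknown error")
    if status_code == 401:
        return f"Authentication failed: {message}"
    if status_code == 429:
        return f"Rate limited, retry with backoff: {message}"
    if 500 <= status_code < 600:
        return f"Server error, safe to retry: {message}"
    return f"Request failed ({status_code}): {message}"

print(describe_error(429, {"error": {"message": "too many requests"}}))
# -> Rate limited, retry with backoff: too many requests
```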

Multi-Turn Conversations

Since the Responses API Beta is stateless, you must include the full conversation history in each request to maintain context:
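A sketch of such a follow-up request. The assistant message replays the earlier reply and carries the id and status fields noted in the Required Fields section; the specific id value and content field shape are illustrative assumptions:

```python
# Second turn of a conversation: the full history is replayed in `input`.
payload = {
    "model": "openai/o4-mini",
    "input": [
        {"role": "user", "content": "What is the capital of France?"},
        {
            "role": "assistant",
            "id": "msg_abc123",        # required for assistant messages
            "status": "completed",     # required for assistant messages
            "content": "The capital of France is Paris.",
        },
        {"role": "user", "content": "What is its population?"},
    ],
}
```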

Required Fields

The id and status fields are required on any assistant-role message included in the conversation history.

Conversation History

Always include the complete conversation history in each request. The API does not store previous messages, so context must be maintained client-side.
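One way to keep that client-side state is a small history buffer that builds each request from the accumulated turns. This is a hypothetical helper, not part of the API:

```python
# Hypothetical client-side history buffer for a stateless API.
class Conversation:
    def __init__(self, model: str):
        self.model = model
        self.history: list[dict] = []

    def build_request(self, user_text: str) -> dict:
        """Append the new user turn and return the full request body."""
        self.history.append({"role": "user", "content": user_text})
        return {"model": self.model, "input": list(self.history)}

    def record_reply(self, message: dict) -> None:
        """Store the assistant message (including id/status) for later turns."""
        self.history.append(message)

convo = Conversation("openai/o4-mini")
req1 = convo.build_request("Hi!")
convo.record_reply({"role": "assistant", "id": "msg_1",
                    "status": "completed", "content": "Hello!"})
req2 = convo.build_request("How are you?")
print(len(req2["input"]))  # -> 3
```

Each call to build_request sends the entire history, so context survives even though the server stores nothing between requests.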

Next Steps