📚 API Documentation
Complete guide to using Gonka AI Gateway API
Getting Started
Gonka AI Gateway provides an OpenAI-compatible API with Web3 authentication and token-based billing.
Authentication
All API requests require authentication using an API key in the Authorization header:
Authorization: Bearer sk-your-api-key-here
Getting Your API Key
- Sign in to the dashboard
- Navigate to the "API Keys" section
- Click "Create New API Key"
- Save the key immediately: it won't be shown again!
Base URL
The gateway is served over HTTPS (proxied via Traefik). All API requests should be made to:
https://gonka-gateway.mingles.ai/v1
API Endpoints
Create a chat completion (OpenAI-compatible)
Request
POST /chat/completions
Content-Type: application/json
Authorization: Bearer sk-your-api-key
{
"model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"messages": [
{"role": "user", "content": "Hello!"}
],
"stream": false
}
Response
{
"id": "chatcmpl-123",
"object": "chat.completion",
"created": 1677652288,
"choices": [{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! How can I help you?"
},
"finish_reason": "stop"
}],
"usage": {
"prompt_tokens": 9,
"completion_tokens": 12,
"total_tokens": 21
}
}
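For a non-streaming response, the assistant's reply is at choices[0].message.content. A tiny extraction helper (reply_text is an illustrative name, not part of the API):

```python
def reply_text(completion: dict) -> str:
    """Get the assistant message out of a non-streaming completion response."""
    return completion["choices"][0]["message"]["content"]

# Using the sample response above:
sample = {
    "choices": [{
        "index": 0,
        "message": {"role": "assistant", "content": "Hello! How can I help you?"},
        "finish_reason": "stop",
    }]
}
print(reply_text(sample))  # Hello! How can I help you?
```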
Streaming
To enable streaming, set "stream": true in the request:
{
"model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"messages": [
{"role": "user", "content": "Tell me a story"}
],
"stream": true
}
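With streaming enabled, OpenAI-compatible gateways return Server-Sent Events: each data: line carries a JSON chunk with a delta, and the stream ends with data: [DONE]. A minimal consumer sketch using the third-party requests library; the helper names delta_text and stream_chat are illustrative, not part of the gateway API:

```python
import json

def delta_text(line: str):
    """Extract the text delta from one SSE line; None for keep-alives and [DONE]."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):].strip()
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

def stream_chat(api_key: str, prompt: str) -> None:
    import requests  # third-party: pip install requests
    resp = requests.post(
        "https://gonka-gateway.mingles.ai/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
            "messages": [{"role": "user", "content": prompt}],
            "stream": True,
        },
        stream=True,  # keep the connection open and read chunks as they arrive
    )
    for raw in resp.iter_lines(decode_unicode=True):
        text = delta_text(raw or "")
        if text:
            print(text, end="", flush=True)
```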
List available models
Request
GET /models
Authorization: Bearer sk-your-api-key
Response
{
"object": "list",
"data": [
{
"id": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"object": "model",
"created": 1677610602,
"owned_by": "gonka"
}
]
}
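As a sketch, the same endpoint can be queried from Python; model_ids and list_models below are illustrative helper names, not part of the gateway:

```python
def model_ids(models_response: dict) -> list:
    """Pull just the ids out of a /models response body."""
    return [m["id"] for m in models_response.get("data", [])]

def list_models(api_key: str) -> list:
    import requests  # third-party: pip install requests
    resp = requests.get(
        "https://gonka-gateway.mingles.ai/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    resp.raise_for_status()
    return model_ids(resp.json())
```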
Using with OpenAI Python SDK
The API is fully compatible with the OpenAI Python SDK:
from openai import OpenAI
client = OpenAI(
api_key="sk-your-api-key-here",
base_url="https://gonka-gateway.mingles.ai/v1"
)
response = client.chat.completions.create(
model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
messages=[
{"role": "user", "content": "Hello!"}
]
)
print(response.choices[0].message.content)
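The SDK also supports streaming: pass stream=True and iterate over the returned chunks. A sketch reusing the client created above (stream_reply is an illustrative wrapper, not part of the SDK):

```python
def stream_reply(client, model: str, prompt: str) -> str:
    """Stream a chat completion, printing tokens as they arrive; return the full text."""
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    text = ""
    for chunk in stream:
        delta = chunk.choices[0].delta.content  # None on role/finish chunks
        if delta:
            print(delta, end="", flush=True)
            text += delta
    return text

# usage (with the client from above):
# text = stream_reply(client, "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8", "Tell me a story")
```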
Simple Python Example
Here's a simple example using the third-party requests library:
import requests
url = "https://gonka-gateway.mingles.ai/v1/chat/completions"
headers = {
"Authorization": "Bearer sk-your-api-key-here",
"Content-Type": "application/json"
}
data = {
"model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"messages": [
{"role": "user", "content": "Hello! How are you?"}
]
}
response = requests.post(url, headers=headers, json=data)
result = response.json()
print(result["choices"][0]["message"]["content"])
Using with curl
curl https://gonka-gateway.mingles.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer sk-your-api-key" \
-d '{
"model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"messages": [
{"role": "user", "content": "Hello!"}
]
}'
Using with n8n
Because the gateway is OpenAI-compatible, n8n can typically use it through its built-in OpenAI integration: point the credential's base URL at https://gonka-gateway.mingles.ai/v1 and supply your sk- API key.
Token Billing
Each API request consumes GNK tokens from your balance. Token counts are taken from the usage block of the API response, and your balance is deducted automatically after each request.
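Since billing is derived from the usage block of each response, you can also track consumption client-side; tokens_used below is an illustrative helper, not a gateway API:

```python
def tokens_used(completion: dict) -> int:
    """Read total token usage from a chat completion response body."""
    return completion.get("usage", {}).get("total_tokens", 0)

# Using the usage block from the sample response above:
sample = {"usage": {"prompt_tokens": 9, "completion_tokens": 12, "total_tokens": 21}}
print(tokens_used(sample))  # 21
```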
Checking Your Balance
You can check your balance on the dashboard or via the API:
GET /api/user/balance
Authorization: Bearer sk-your-api-key
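A sketch of the balance check in Python. Note that this endpoint sits under /api rather than /v1; the response shape is not documented here, so the helper simply returns the parsed JSON (balance_url and get_balance are illustrative names):

```python
def balance_url(base: str) -> str:
    """The balance endpoint lives under /api, not /v1."""
    return f"{base.rstrip('/')}/api/user/balance"

def get_balance(api_key: str, base: str = "https://gonka-gateway.mingles.ai") -> dict:
    import requests  # third-party: pip install requests
    resp = requests.get(balance_url(base),
                        headers={"Authorization": f"Bearer {api_key}"})
    resp.raise_for_status()
    return resp.json()
```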
Rate Limits
Currently, there are no rate limits. However, please use the API responsibly.
Support
For issues or questions, please contact support or check the dashboard for your transaction history.
Connect to OpenClaw
Use this guide to connect OpenClaw to the Gonka gateway: configure the provider on your OpenClaw node, then use the OpenClaw Telegram bot commands to switch to the Gonka model and check status.
1. Provider config (OpenClaw node)
Configure the Gonka provider on your OpenClaw node/gateway so the agent can use Gonka models. Edit openclaw.json or your models.json (wherever OpenClaw reads models.providers).
Replace the following placeholders:
- https://gonka-gateway.mingles.ai/v1: your gateway base URL, if different.
- sk-..........: your API key (from registration or the dashboard).
{
"models": {
"providers": {
"gonka": {
"baseUrl": "https://gonka-gateway.mingles.ai/v1",
"apiKey": "sk-..........",
"auth": "api-key",
"api": "openai-completions",
"authHeader": true,
"models": [
{
"id": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"name": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
"api": "openai-completions",
"reasoning": false,
"input": ["text"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
}
]
}
}
}
}
After editing: save the file and restart the OpenClaw gateway/node if it does not reload config automatically.
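As a quick sanity check after editing, a small script can verify that the provider block is present and the placeholder key was replaced (check_gonka_provider is an illustrative helper, not part of OpenClaw):

```python
REQUIRED_KEYS = {"baseUrl", "apiKey", "api", "models"}

def check_gonka_provider(config: dict) -> list:
    """Return a list of problems with the models.providers.gonka block (empty = OK)."""
    provider = config.get("models", {}).get("providers", {}).get("gonka")
    if provider is None:
        return ["missing models.providers.gonka block"]
    problems = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - provider.keys())]
    if str(provider.get("apiKey", "")).startswith("sk-.."):
        problems.append("apiKey is still the placeholder")
    return problems

# usage: check_gonka_provider(json.load(open("openclaw.json")))
```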
2. OpenClaw Telegram bot commands
These commands are used in the OpenClaw Telegram bot chat (not in the gateway config or API). They let you switch to the Gonka model and check that it is active.
/status
In the OpenClaw Telegram bot, send /status to see the current runtime state, including which model is in use.
🦞 OpenClaw 2026.2.15 (3fe22ea)
🧠 Model: gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8 · 🔑 api-key sk-dvg…6GIAPE (models.json)
🧮 Tokens: 18k in / 137 out
📚 Context: 18k/200k (9%) · 🧹 Compactions: 0
🧵 Session: agent:main:main • updated just now
⚙️ Runtime: direct · Think: off · verbose
🪢 Queue: collect (depth 0)
The Gonka model is active when the Model line shows gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8.
/model <provider/model-id>
In the OpenClaw Telegram bot, send /model <provider/model-id> to switch the active model. To use the Gonka Qwen model, send:
/model gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8
Example: send /model gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8 and the agent replies Model set to gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8. Then send /status again to verify that the Model line shows the Gonka model.
3. Summary
| What | Where | Action |
|---|---|---|
| Add Gonka provider | OpenClaw node — openclaw.json or models config | Put the models.providers.gonka block from section 1 into your config. Set baseUrl and apiKey. |
| Switch to Gonka model | OpenClaw Telegram bot | Send: /model gonka/Qwen/Qwen3-235B-A22B-Instruct-2507-FP8 |
| Check current model | OpenClaw Telegram bot | Send: /status — look at the Model line in the reply. |
So: config = on the node; /status and /model = in the OpenClaw Telegram bot chat.