An OpenAI-compatible API for the NPC Fin 32B model, a fine-tuned finance specialist quantized for fast inference.
Send a chat completion request using curl:
```bash
curl https://bottensor.xyz/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-bt-your-key-here" \
  -d '{
    "model": "npc-fin-32b",
    "messages": [
      {"role": "system", "content": "You are NPC Fin, a financial analysis AI."},
      {"role": "user", "content": "Analyze the current BTC market structure."}
    ],
    "temperature": 0.7,
    "max_tokens": 1024
  }'
```

All requests require an API key passed in the `Authorization` header:

```
Authorization: Bearer sk-bt-your-key-here
```

Keys follow the format `sk-bt-<32 hex chars>`. Contact us to obtain a key.
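Since keys have a fixed documented shape, a client can sanity-check them before making a request. A minimal sketch — the `sk-bt-<32 hex chars>` pattern is from the spec above; the helper name is ours:

```python
import re

# Matches the documented key format: "sk-bt-" followed by exactly 32 hex characters.
KEY_PATTERN = re.compile(r"^sk-bt-[0-9a-f]{32}$")

def looks_like_valid_key(key: str) -> bool:
    """Cheap client-side sanity check; the server remains the source of truth."""
    return KEY_PATTERN.fullmatch(key) is not None
```

This only catches obvious typos (truncated or malformed keys); only the server can confirm a key is active.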
| Parameter | Type | Default | Description |
|---|---|---|---|
| model | string | "npc-fin-32b" | Model ID |
| messages | array | required | Array of {role, content} objects |
| temperature | number | 0.7 | Sampling temperature (0-1) |
| max_tokens | number | 1024 | Max tokens to generate (1-4096) |
| stream | boolean | false | Enable SSE streaming |
| top_p | number | 1.0 | Nucleus sampling threshold (0-1) |
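The defaults and ranges in the table can be captured in a small request-builder. This helper is our own illustration, not part of any SDK; it applies the documented defaults and rejects out-of-range values before a request is sent:

```python
def build_chat_payload(messages, model="npc-fin-32b", temperature=0.7,
                       max_tokens=1024, stream=False, top_p=1.0):
    """Build a chat-completions request body using the documented defaults.

    Ranges mirror the parameter table: temperature in [0, 1],
    max_tokens in [1, 4096]. messages is the only required field.
    """
    if not messages:
        raise ValueError("messages is required and must be non-empty")
    if not 0 <= temperature <= 1:
        raise ValueError("temperature must be in [0, 1]")
    if not 1 <= max_tokens <= 4096:
        raise ValueError("max_tokens must be in [1, 4096]")
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "stream": stream,
        "top_p": top_p,
    }
```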
The models endpoint returns a list of available models in OpenAI-compatible format.
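Assuming the endpoint follows the OpenAI convention (`GET <base_url>/models` returning entries in a `data` array — an assumption based on the API's stated compatibility), the model IDs can be fetched and extracted like so:

```python
import json
import urllib.request

def parse_model_ids(body):
    """Extract model IDs from an OpenAI-compatible model list response."""
    # OpenAI-compatible list responses wrap entries in a "data" array.
    return [m["id"] for m in body.get("data", [])]

def list_models(api_key, base_url="https://bottensor.xyz/api/v1"):
    """GET <base_url>/models and return the available model IDs."""
    req = urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))
```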
When rate limited, the API returns HTTP `429 Too Many Requests` with a `Retry-After` header indicating when to retry.
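A client should honor `Retry-After` rather than hammering the endpoint. A minimal sketch (the helper names are ours; this assumes `Retry-After` carries a seconds value, and falls back to exponential backoff when the header is absent):

```python
import time
import urllib.error
import urllib.request

def retry_delay(retry_after, attempt):
    """Seconds to wait: honor Retry-After when present, else back off 1s, 2s, 4s..."""
    return float(retry_after) if retry_after else float(2 ** attempt)

def request_with_retry(req, max_attempts=3):
    """Send a request, retrying on HTTP 429 and re-raising any other error."""
    for attempt in range(max_attempts):
        try:
            return urllib.request.urlopen(req)
        except urllib.error.HTTPError as err:
            if err.code != 429 or attempt == max_attempts - 1:
                raise
            time.sleep(retry_delay(err.headers.get("Retry-After"), attempt))
```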
Set "stream": true to receive tokens as they are generated via SSE.
```bash
curl https://bottensor.xyz/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-bt-your-key" \
  -d '{"model":"npc-fin-32b","messages":[{"role":"user","content":"Explain DCF valuation."}],"stream":true}'
```

Python, using the official `openai` client:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://bottensor.xyz/api/v1",
    api_key="sk-bt-your-key-here"
)

stream = client.chat.completions.create(
    model="npc-fin-32b",
    messages=[
        {"role": "system", "content": "You are NPC Fin, a financial analysis AI."},
        {"role": "user", "content": "Walk me through a DCF for a SaaS company."}
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
print()
```

JavaScript/TypeScript, using the official `openai` package:

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://bottensor.xyz/api/v1",
  apiKey: "sk-bt-your-key-here",
});

const stream = await client.chat.completions.create({
  model: "npc-fin-32b",
  messages: [
    { role: "system", content: "You are NPC Fin, a financial analysis AI." },
    { role: "user", content: "What is the difference between EBITDA and FCF?" }
  ],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
}
console.log();
```

All errors return JSON with the shape:

```
{"error": {"message": "...", "type": "...", "code": N}}
```
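The error payload documented above can be unwrapped defensively. This helper is our own sketch; it returns `None` for any body that does not match the documented shape:

```python
import json

def extract_error(body):
    """Pull (message, type, code) out of an error response body.

    Shape taken from the documented error format; returns None if the
    body is not a recognizable error payload.
    """
    try:
        err = json.loads(body)["error"]
        return err["message"], err["type"], err["code"]
    except (ValueError, KeyError, TypeError):
        return None
```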