DeepSeek Provider Guide

Text generation with DeepSeek-V3 (chat) and DeepSeek-R1 (reasoning) through a single API


Overview

DeepSeek is a Chinese AI research lab offering highly capable open-weight models via a hosted cloud API. NeuroLink wraps their OpenAI-compatible endpoint, giving you access to two model families:

  • deepseek-chat — DeepSeek-V3, a 671B mixture-of-experts model optimised for everyday chat and code tasks. Supports tool calling and structured output.
  • deepseek-reasoner — DeepSeek-R1, a reasoning model that performs extended chain-of-thought before producing an answer. The AI SDK surfaces the reasoning trace separately so you can inspect it.

Key Facts

  • Protocol: OpenAI-compatible (/v1/chat/completions)
  • Default base URL: https://api.deepseek.com
  • Context window: 64K tokens (both models)
  • Vision: Not supported — text-only
  • Streaming: Supported
  • Tool calling: Supported on deepseek-chat; limited on deepseek-reasoner
  • Reasoning trace: deepseek-reasoner exposes reasoning_content (surfaced as reasoning parts in the AI SDK response)
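
Because the endpoint speaks the OpenAI chat-completions protocol, you can also call it directly without NeuroLink. The sketch below only builds a minimal request body in the OpenAI schema; `buildChatRequest` is an illustrative helper, not a NeuroLink or DeepSeek export, and the commented-out `fetch` shows roughly what sending it would look like.

```typescript
// Minimal request body for DeepSeek's OpenAI-compatible
// POST /v1/chat/completions endpoint. `buildChatRequest` is an
// illustrative helper, not part of any library.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  stream = false,
) {
  return { model, messages, stream };
}

const body = buildChatRequest("deepseek-chat", [
  { role: "user", content: "Hello" },
]);

// Sending it would look roughly like:
// await fetch("https://api.deepseek.com/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY}`,
//   },
//   body: JSON.stringify(body),
// });
console.log(JSON.stringify(body));
```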

Quick Start

1. Get an API Key

Sign up at https://platform.deepseek.com and create an API key under API Keys.

2. Configure Environment

Add to your .env file:

# Required
DEEPSEEK_API_KEY=sk-...

# Optional: override the default model (default: deepseek-chat)
DEEPSEEK_MODEL=deepseek-chat

# Optional: override the base URL (default: https://api.deepseek.com)
DEEPSEEK_BASE_URL=https://api.deepseek.com
3. Install NeuroLink

npm install @juspay/neurolink
# or
pnpm add @juspay/neurolink

4. Generate Your First Response

import { NeuroLink } from "@juspay/neurolink";

const ai = new NeuroLink();

const result = await ai.generate({
  provider: "deepseek",
  input: {
    text: "Explain the difference between synchronous and asynchronous programming.",
  },
});

console.log(result.content);

Supported Models

Model ID          | Family      | Context | Tool Calling | Notes
deepseek-chat     | DeepSeek-V3 | 64K     | Yes          | Default; best for chat and code tasks
deepseek-reasoner | DeepSeek-R1 | 64K     | Limited      | Extended reasoning; exposes reasoning trace

Pass any model ID via --model (CLI) or model: (SDK). Only these two models are officially hosted on api.deepseek.com.


SDK Usage

Basic Generation

import { NeuroLink } from "@juspay/neurolink";

const ai = new NeuroLink();

const result = await ai.generate({
  provider: "deepseek",
  input: { text: "Write a TypeScript function to debounce an async function." },
});

console.log(result.content);

Using the Reasoner Model

const result = await ai.generate({
  provider: "deepseek",
  model: "deepseek-reasoner",
  input: { text: "Prove that the square root of 2 is irrational." },
});

// result.content holds the final answer; the reasoning trace is
// surfaced separately from it (see Key Facts above)
console.log(result.content);

Note: deepseek-reasoner has noticeably higher response latency because it performs extended reasoning before answering.
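
If you need the trace and the answer as separate strings, you can split them yourself. The sketch below assumes the raw DeepSeek API message shape, where deepseek-reasoner returns reasoning_content alongside content; the exact field NeuroLink or the AI SDK exposes may differ, so treat this as an illustration, not the library's API.

```typescript
// Sketch: separating the reasoning trace from the final answer.
// Assumes the raw DeepSeek API message shape (`reasoning_content`
// alongside `content`); the NeuroLink response shape may differ.
interface ReasonerMessage {
  content: string;
  reasoning_content?: string;
}

function splitReasoning(msg: ReasonerMessage): {
  answer: string;
  trace: string | null;
} {
  return { answer: msg.content, trace: msg.reasoning_content ?? null };
}

// Example with a mocked message:
const mock: ReasonerMessage = {
  content: "Therefore sqrt(2) is irrational.",
  reasoning_content: "Assume sqrt(2) = p/q in lowest terms...",
};
const { answer, trace } = splitReasoning(mock);
```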

Streaming

import { NeuroLink } from "@juspay/neurolink";

const ai = new NeuroLink();

const stream = await ai.stream({
  provider: "deepseek",
  input: { text: "Explain how B-trees work, step by step." },
});

for await (const chunk of stream.stream) {
  process.stdout.write(chunk);
}

Per-Call Credential Override

Pass credentials at call time to override the instance-level or environment-variable defaults. Useful when routing requests for different users through separate DeepSeek accounts.

const result = await ai.generate({
  provider: "deepseek",
  input: { text: "Hello, world!" },
  credentials: {
    deepseek: {
      apiKey: "sk-user-specific-key",
    },
  },
});

You can also override the base URL per call — useful when pointing at a self-hosted OpenAI-compatible proxy in front of DeepSeek:

const result = await ai.generate({
  provider: "deepseek",
  input: { text: "Hello" },
  credentials: {
    deepseek: {
      apiKey: "sk-...",
      baseURL: "https://my-proxy.example.com/v1",
    },
  },
});

CLI Usage

Basic Commands

# Generate with default model (deepseek-chat)
pnpm run cli generate "What is the halting problem?" --provider deepseek

# Use an alias
pnpm run cli generate "Hello" --provider ds

# Use the reasoning model
pnpm run cli generate "Prove P != NP (attempt)" --provider deepseek --model deepseek-reasoner

# Interactive loop mode
pnpm run cli loop --provider deepseek

Streaming via CLI

The CLI streams output by default when a TTY is attached. No extra flags are required.

pnpm run cli generate "Explain TCP/IP in detail" --provider deepseek --model deepseek-chat

Provider Aliases

The DeepSeek provider can be referenced by any of the following names:

Alias    | Example
deepseek | --provider deepseek
ds       | --provider ds

Configuration Reference

Environment Variable | Required | Default                  | Description
DEEPSEEK_API_KEY     | Yes      | (none)                   | DeepSeek API key (starts with sk-)
DEEPSEEK_MODEL       | No       | deepseek-chat            | Default model to use
DEEPSEEK_BASE_URL    | No       | https://api.deepseek.com | Base URL for the API (override for proxies)

Feature Support Matrix

Feature           | deepseek-chat | deepseek-reasoner
Text generation   | Yes           | Yes
Streaming         | Yes           | Yes
Tool calling      | Yes           | Limited
Structured output | Yes           | Limited
Vision / images   | No            | No
Embeddings        | No            | No
Reasoning trace   | No            | Yes
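
When routing requests in application code, it can help to mirror this matrix as data, for example to fall back to deepseek-chat whenever full tool calling is required. The map and helper below are purely illustrative, not NeuroLink exports.

```typescript
// Illustrative capability map mirroring the feature matrix above
// (not a NeuroLink export).
type Support = "yes" | "limited" | "no";

const capabilities: Record<
  string,
  { tools: Support; structured: Support; reasoningTrace: boolean }
> = {
  "deepseek-chat": { tools: "yes", structured: "yes", reasoningTrace: false },
  "deepseek-reasoner": {
    tools: "limited",
    structured: "limited",
    reasoningTrace: true,
  },
};

// Fall back to deepseek-chat when the workflow needs full tool calling.
function modelForToolWorkflow(requested: string): string {
  return capabilities[requested]?.tools === "yes" ? requested : "deepseek-chat";
}
```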

Troubleshooting

"Invalid DeepSeek API key"

The DEEPSEEK_API_KEY is missing or incorrect.

# Verify the variable is set
echo $DEEPSEEK_API_KEY

# Set it inline
export DEEPSEEK_API_KEY=sk-...

Get or rotate keys at https://platform.deepseek.com/api_keys.

"DeepSeek account has insufficient balance"

Your account credit is exhausted. Top up at https://platform.deepseek.com/usage.

"DeepSeek rate limit exceeded"

Too many requests in a short window. Implement exponential backoff or reduce request concurrency. Rate limits are published in the DeepSeek API docs.
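
A minimal backoff wrapper can look like the sketch below; `withBackoff` is an illustrative helper (not a NeuroLink API), and the retry count and base delay are arbitrary starting points you should tune against DeepSeek's published limits.

```typescript
// Exponential backoff with jitter (illustrative helper, not a
// NeuroLink API). Retries `fn` up to `maxRetries` times, doubling
// the base delay on each attempt.
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 5,
  baseMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err;
      // Full delay = base * 2^attempt, scaled by random jitter in [0.5, 1).
      const delay = baseMs * 2 ** attempt * (0.5 + Math.random() / 2);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Usage sketch:
// const result = await withBackoff(() =>
//   ai.generate({ provider: "deepseek", input: { text: "Hi" } }),
// );
```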

"Model not found"

Only deepseek-chat and deepseek-reasoner are hosted on api.deepseek.com. Check the model name for typos.

Slow responses on deepseek-reasoner

Expected. R1 performs extended chain-of-thought reasoning before producing its final answer, which adds latency proportional to reasoning complexity. Use deepseek-chat for latency-sensitive paths.

Tool calls failing on deepseek-reasoner

DeepSeek documents limited tool support on R1. For tool-heavy workflows, use deepseek-chat.


Need Help? Join the GitHub Discussions or open an issue.