SDK Reference

The NeuroLink SDK provides a TypeScript-first programmatic interface for integrating AI capabilities into your applications.

Overview

The SDK is designed for:

  • Web applications (React, Vue, Svelte, Angular)
  • Backend services (Node.js, Express, Fastify)
  • Serverless functions (Vercel, Netlify, AWS Lambda)
  • Desktop applications (Electron, Tauri)

Quick Start

```typescript
import { NeuroLink } from "@juspay/neurolink";

const neurolink = new NeuroLink();

// Generate text
const result = await neurolink.generate({
  input: { text: "Write a haiku about programming" },
  provider: "google-ai",
});

console.log(result.content);
```

Documentation Sections

  • API Reference — Complete TypeScript API documentation with interfaces, types, and method signatures.
  • Framework Integration — Integration guides for Next.js, SvelteKit, React, Vue, and other popular frameworks.
  • Custom Tools — How to create and register custom tools for enhanced AI capabilities.

Core Architecture

The SDK uses a Factory Pattern architecture that provides:

  • Unified Interface: All providers implement the same AIProvider interface
  • Type Safety: Full TypeScript support with IntelliSense
  • Automatic Fallback: Seamless provider switching on failures
  • Built-in Tools: 6 core tools available across all providers

```typescript
type AIProvider = {
  generate(options: TextGenerationOptions): Promise<EnhancedGenerateResult>;
  stream(options: StreamOptions): Promise<StreamResult>;
  supportsTools(): boolean;
};
```

Configuration

The SDK automatically detects configuration from:

```typescript
// Environment variables
process.env.OPENAI_API_KEY;
process.env.GOOGLE_AI_API_KEY;
process.env.ANTHROPIC_API_KEY;
// ... and more

// Programmatic configuration
const neurolink = new NeuroLink({
  conversationMemory: { enabled: true },
  enableOrchestration: true,
  observability: { langfuse: { enabled: true } },
});
```

Advanced Features

Auto Provider Selection

NeuroLink automatically selects the best available AI provider based on your configuration:

```typescript
import { createBestAIProvider } from "@juspay/neurolink";

// Automatically selects the best available provider
const provider = createBestAIProvider();

const result = await provider.generate({
  input: { text: "Explain quantum computing" },
  maxTokens: 500,
  temperature: 0.7,
});
```

Selection Priority:

  1. OpenAI (most reliable)
  2. Anthropic (high quality)
  3. Google AI Studio (free tier)
  4. Other configured providers
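The priority order above can be pictured as a simple availability walk. The sketch below is hypothetical — `pickBestProvider` is not part of the SDK — and assumes availability is signalled by the presence of each provider's API-key environment variable, as in the Configuration section:

```typescript
// Hypothetical sketch of priority-based selection (not the SDK's actual code).
// Assumes a provider is "available" when its API-key env var is set.
const PROVIDER_PRIORITY: Array<{ name: string; envVar: string }> = [
  { name: "openai", envVar: "OPENAI_API_KEY" },
  { name: "anthropic", envVar: "ANTHROPIC_API_KEY" },
  { name: "google-ai", envVar: "GOOGLE_AI_API_KEY" },
];

function pickBestProvider(env: Record<string, string | undefined>): string | null {
  for (const { name, envVar } of PROVIDER_PRIORITY) {
    if (env[envVar]) {
      return name; // first configured provider in priority order wins
    }
  }
  return null; // nothing configured
}

console.log(pickBestProvider({ GOOGLE_AI_API_KEY: "key" })); // "google-ai"
console.log(pickBestProvider({ OPENAI_API_KEY: "a", ANTHROPIC_API_KEY: "b" })); // "openai"
```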

Custom Priority:

```typescript
import { AIProviderFactory } from "@juspay/neurolink";

// Create with fallback
const { primary, fallback } = await AIProviderFactory.createProviderWithFallback(
  "bedrock", // Prefer Bedrock
  "openai", // Fall back to OpenAI
);
```
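The primary/fallback pair is meant for a try-one-then-the-other pattern. A minimal sketch of that pattern, using mock objects that implement the `generate()` shape from the `AIProvider` type in Core Architecture rather than real providers:

```typescript
// Minimal sketch of the fallback pattern with mock providers
// (illustrative only, not the SDK's internals).
type GenerateOptions = { input: { text: string } };
type GenerateResult = { content: string };

interface Provider {
  generate(options: GenerateOptions): Promise<GenerateResult>;
}

const mockPrimary: Provider = {
  // Simulate an outage on the preferred provider.
  generate: async () => {
    throw new Error("bedrock unavailable");
  },
};

const mockFallback: Provider = {
  generate: async ({ input }) => ({ content: `echo: ${input.text}` }),
};

async function generateWithFallback(
  options: GenerateOptions,
  primary: Provider,
  fallback: Provider,
): Promise<GenerateResult> {
  try {
    return await primary.generate(options);
  } catch {
    return await fallback.generate(options); // switch providers on failure
  }
}

const result = await generateWithFallback(
  { input: { text: "hello" } },
  mockPrimary,
  mockFallback,
);
console.log(result.content); // "echo: hello"
```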

Learn more: Provider Orchestration Guide


Conversation Memory

Automatic context management for multi-turn conversations:

```typescript
const neurolink = new NeuroLink({
  conversationMemory: {
    enabled: true,
    enableSummarization: true,
  },
});

// Multi-turn conversations
const result1 = await neurolink.generate({
  input: { text: "My name is Alice" },
});

const result2 = await neurolink.generate({
  input: { text: "What's my name?" },
  // Remembers previous context via conversation memory
});
// AI responds: "Your name is Alice"
```

Memory Types:

  • In-Memory: Fast, single-instance only
  • Redis: Distributed, persistent across restarts

Features:

  • Automatic context window management
  • Session isolation by ID
  • Export/import conversation history
  • Context summarization for long sessions
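The in-memory variant, session isolation, and context-window trimming can be pictured with a small sketch. This is a hypothetical store written for illustration, not the SDK's implementation:

```typescript
// Illustrative sketch of session-isolated, size-capped conversation memory.
// Not the SDK's implementation — just the idea behind it.
type Message = { role: "user" | "assistant"; text: string };

class InMemoryConversationStore {
  private sessions = new Map<string, Message[]>();

  // maxMessages is a crude stand-in for context-window management.
  constructor(private maxMessages = 20) {}

  append(sessionId: string, message: Message): void {
    const history = this.sessions.get(sessionId) ?? [];
    history.push(message);
    // Drop the oldest turns once the cap is exceeded.
    this.sessions.set(sessionId, history.slice(-this.maxMessages));
  }

  history(sessionId: string): Message[] {
    return this.sessions.get(sessionId) ?? []; // sessions never see each other's turns
  }
}

const store = new InMemoryConversationStore(2);
store.append("alice", { role: "user", text: "My name is Alice" });
store.append("alice", { role: "assistant", text: "Hi Alice!" });
store.append("alice", { role: "user", text: "What's my name?" });

console.log(store.history("alice").length); // 2 — oldest turn trimmed
console.log(store.history("bob").length); // 0 — isolated session
```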


Analytics & Evaluation

```typescript
const result = await neurolink.generate({
  input: { text: "Generate a business proposal" },
  enableAnalytics: true, // Track usage and costs
  enableEvaluation: true, // AI quality scoring
});

console.log(result.analytics); // Usage data
console.log(result.evaluation); // Quality scores
```

Custom Tools

// Register a single tool
```typescript
import { z } from "zod";

// Register a single tool
neurolink.registerTool("weatherLookup", {
  description: "Get current weather for a city",
  parameters: z.object({
    city: z.string(),
    units: z.enum(["celsius", "fahrenheit"]).optional(),
  }),
  execute: async ({ city, units = "celsius" }) => {
    // Your implementation
    return { city, temperature: 22, units, condition: "sunny" };
  },
});

// Register multiple tools - Object format
neurolink.registerTools({
  stockPrice: {
    description: "Get stock price",
    execute: async () => ({ price: 150.25 }),
  },
  calculator: {
    description: "Calculate math",
    execute: async () => ({ result: 42 }),
  },
});

// Register multiple tools - Array format (Lighthouse compatible)
neurolink.registerTools([
  {
    name: "analytics",
    tool: {
      description: "Get analytics data",
      parameters: z.object({
        merchantId: z.string(),
        dateRange: z.string().optional(),
      }),
      execute: async ({ merchantId, dateRange }) => {
        return { data: "analytics result" };
      },
    },
  },
  {
    name: "processor",
    tool: {
      description: "Process payments",
      execute: async () => ({ status: "processed" }),
    },
  },
]);
```
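Once registered, a tool is invoked through its `execute` function when the model requests it. A standalone sketch of that dispatch step, using a plain map in place of the SDK's registry (parameter schemas omitted for brevity — this mirrors the registration shape above, not the SDK's internals):

```typescript
// Standalone sketch of name-based tool dispatch (illustrative only).
type Tool = {
  description: string;
  execute: (args: Record<string, unknown>) => Promise<unknown>;
};

const registry = new Map<string, Tool>();

registry.set("weatherLookup", {
  description: "Get current weather for a city",
  execute: async ({ city }) => ({ city, temperature: 22, condition: "sunny" }),
});

// When the model requests a tool call, look the tool up by name and run it.
async function dispatch(name: string, args: Record<string, unknown>): Promise<unknown> {
  const tool = registry.get(name);
  if (!tool) {
    throw new Error(`Unknown tool: ${name}`);
  }
  return tool.execute(args);
}

console.log(await dispatch("weatherLookup", { city: "Oslo" }));
```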

Context Integration

```typescript
const result = await neurolink.generate({
  input: { text: "Create a summary" },
  context: {
    userId: "123",
    project: "Q1-report",
    department: "sales",
  },
});
```

Framework Examples

```typescript
// app/api/ai/route.ts (Next.js App Router)
import { NeuroLink } from "@juspay/neurolink";

export async function POST(request: Request) {
  const { prompt } = await request.json();
  const neurolink = new NeuroLink();

  const result = await neurolink.generate({
    input: { text: prompt },
    timeout: "2m",
  });

  return Response.json({ text: result.content });
}
```