Using DeepSeek V3 API in Node.js with Vercel AI SDK

DeepSeek V3 represents a significant advancement in the realm of open-source language models, boasting enhanced speed and capabilities. In this tutorial, we'll walk through integrating the DeepSeek V3 API into a Node.js application using the Vercel AI SDK. By the end, you'll have a terminal-based interface that prompts users for input and displays responses from DeepSeek seamlessly.
Why DeepSeek V3?
DeepSeek V3 offers impressive features that make it a compelling choice for developers:
- Speed: Processes up to 60 tokens per second, three times faster than its predecessor.
- Enhanced Capabilities: Improved performance across various benchmarks.
- API Compatibility: Maintains compatibility with OpenAI's API, simplifying integration.
- Open-Source: Fully open-source models and research papers available for the community.
DeepSeek V3 Benchmark
DeepSeek V3 excels in multiple benchmarks, rivaling even some of the most advanced closed-source models. Here's a snapshot of its performance:
| Metric | DeepSeek V3 | DeepSeek V2.5 | Qwen2.5 | Llama3.1 | Claude-3.5 | GPT-4o |
| --- | --- | --- | --- | --- | --- | --- |
| MMLU (EM) | 88.5 | 80.6 | 85.3 | 88.6 | 88.3 | 87.2 |
| MMLU-Redux (EM) | 89.1 | 80.3 | 85.6 | 86.2 | 88.9 | 88.0 |
| DROP (3-shot F1) | 91.6 | 87.8 | 76.7 | 88.7 | 88.3 | 83.7 |
Getting Started
Prerequisites
Before diving in, ensure you have the following:
- Node.js 18+ installed on your machine.
- pnpm as the package manager.
- DeepSeek API Key: Sign up at DeepSeek, navigate to "Access API" after creating your account, and generate a new API key. A minimum balance of $2 is required.
- Vercel AI SDK: We'll utilize this SDK due to DeepSeek's API compatibility with OpenAI.
Setting Up Your Node.js Application
- Create a New Directory:

```bash
mkdir deepseek-nodejs-app
cd deepseek-nodejs-app
pnpm init
```

- Install Dependencies:

Install the necessary packages, including the Vercel AI SDK and the OpenAI provider.

```bash
pnpm add ai @ai-sdk/openai zod dotenv
pnpm add -D @types/node tsx typescript
```

Note: Ensure you're using `ai` version 3.1 or higher for compatibility.
Configuring the API Key
Create a `.env` file in the root of your project to securely store your DeepSeek API key:

```bash
touch .env
```

Edit the `.env` file:

```
DEEPSEEK_API_KEY=your_deepseek_api_key_here
```

Replace `your_deepseek_api_key_here` with the API key obtained from your DeepSeek dashboard.
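Because a missing or empty key only surfaces later as an opaque authentication error on the first request, it can help to fail fast at startup. Below is a minimal sketch; the `requireApiKey` helper is our own, not part of dotenv or the Vercel AI SDK:

```typescript
// Fail fast when the DeepSeek API key is absent, instead of getting an
// opaque 401 on the first request. `requireApiKey` is our own helper,
// not part of dotenv or the Vercel AI SDK.
function requireApiKey(key: string | undefined): string {
  if (!key || key.trim() === '') {
    throw new Error(
      'DEEPSEEK_API_KEY is not set. Add it to your .env file before starting.'
    );
  }
  return key;
}

// Usage after dotenv.config():
// const apiKey = requireApiKey(process.env.DEEPSEEK_API_KEY);
```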
Building the Application
Create an `index.ts` file in the project root and add the following code:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { CoreMessage, streamText } from 'ai';
import dotenv from 'dotenv';
import * as readline from 'node:readline/promises';

dotenv.config();

const terminal = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

const messages: CoreMessage[] = [];

async function main() {
  const openai = createOpenAI({
    apiKey: process.env.DEEPSEEK_API_KEY,
    baseURL: 'https://api.deepseek.com/v1',
  });

  const model = openai('deepseek-chat');

  while (true) {
    const userInput = await terminal.question('You: ');
    messages.push({ role: 'user', content: userInput });

    const result = streamText({
      model,
      messages,
    });

    let fullResponse = '';
    process.stdout.write('\nDeepSeek: ');
    for await (const delta of result.textStream) {
      fullResponse += delta;
      process.stdout.write(delta);
    }
    process.stdout.write('\n\n');

    messages.push({ role: 'assistant', content: fullResponse });
  }
}

main().catch(console.error);
```
Understanding the Code
- Environment Variables: Utilizes `dotenv` to manage environment variables securely.
- Readline Interface: Sets up a terminal interface for user interaction.
- Message History: Maintains a conversation history to provide context to DeepSeek.
- Streaming Responses: Streams DeepSeek's responses to the terminal in real time.
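One practical refinement: because the full `messages` array is sent on every turn, a long conversation will eventually exceed the model's context window. A simple mitigation is to cap the history to the most recent turns. The helper below is our own sketch, not an SDK feature:

```typescript
// Minimal message shape matching the roles used in index.ts; the real
// CoreMessage type from the `ai` package carries additional fields.
type ChatMessage = { role: 'user' | 'assistant'; content: string };

// Keep only the most recent `maxTurns` user/assistant exchanges.
// This helper is our own sketch, not part of the Vercel AI SDK.
function trimHistory(messages: ChatMessage[], maxTurns: number): ChatMessage[] {
  const maxMessages = maxTurns * 2; // one user and one assistant message per turn
  return messages.length <= maxMessages
    ? messages
    : messages.slice(messages.length - maxMessages);
}
```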
Running the Application
Start your application using the following command:
```bash
pnpm tsx index.ts
```
Upon running, you'll see a prompt in your terminal. Enter your queries, and DeepSeek will respond interactively.
Leveraging Vercel AI SDK with DeepSeek
The Vercel AI SDK provides a unified interface to interact with various AI models, including DeepSeek. This abstraction simplifies the integration process, allowing you to switch providers with minimal code changes.
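To illustrate that abstraction, the provider-specific details can be isolated in a small config map, so moving between DeepSeek and another OpenAI-compatible endpoint becomes a one-line change. The config-map shape below is our own convention, not an SDK construct:

```typescript
// Each entry holds the settings passed to createOpenAI() plus a model id.
// This config-map pattern is our own convention, not part of the SDK.
interface ProviderConfig {
  baseURL: string;
  apiKeyEnvVar: string; // name of the env variable holding the key
  model: string;
}

const providers: Record<string, ProviderConfig> = {
  deepseek: {
    baseURL: 'https://api.deepseek.com/v1',
    apiKeyEnvVar: 'DEEPSEEK_API_KEY',
    model: 'deepseek-chat',
  },
  openai: {
    baseURL: 'https://api.openai.com/v1',
    apiKeyEnvVar: 'OPENAI_API_KEY',
    model: 'gpt-4o',
  },
};

// Switching providers is then a single lookup:
const active = providers['deepseek'];
console.log(`Using ${active.model} at ${active.baseURL}`);
```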
Example: Generating Structured Data
Here's how you can generate structured content using DeepSeek with the Vercel AI SDK:
```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import dotenv from 'dotenv';
import { z } from 'zod';

dotenv.config();

const openai = createOpenAI({
  apiKey: process.env.DEEPSEEK_API_KEY,
  baseURL: 'https://api.deepseek.com/v1',
});

const articleSchema = z.object({
  title: z.string(),
  content: z.string(),
  // Define additional schema properties as needed
});

async function createArticle(userPrompt: string) {
  try {
    const model = openai('deepseek-chat');
    const { object } = await generateObject({
      model,
      prompt: `Generate a structured (article/tutorial/news article) based on the following instructions:\n\n${userPrompt}`,
      schema: articleSchema,
    });
    return object;
  } catch (error) {
    console.error('Error generating the structured content:', error);
    throw error;
  }
}
```
This snippet demonstrates generating a structured article based on user instructions, leveraging DeepSeek's capabilities through the Vercel AI SDK.
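When the generated object is passed to other parts of your application, a lightweight runtime check mirroring `articleSchema` can guard code paths where zod isn't available (in the generation path above, zod already validates the object). This standalone type guard is our own sketch:

```typescript
// Plain-TypeScript type guard mirroring the zod articleSchema above.
// Our own sketch; in the generation path zod already validates the object.
interface Article {
  title: string;
  content: string;
}

function isArticle(value: unknown): value is Article {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.title === 'string' && typeof v.content === 'string';
}
```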
Conclusion
Integrating DeepSeek V3 into your Node.js applications using the Vercel AI SDK offers a powerful and efficient way to harness advanced language models. Whether building chatbots, content generators, or other AI-driven tools, DeepSeek provides the speed and performance needed for modern applications.
For more details and updates, visit the DeepSeek Official Website.
Discuss Your Project with Us
We're here to help with your web development needs. Schedule a call to discuss your project and how we can assist you.
Let's find the best solutions for your needs.