Building a Genkit Plugin for Deepseek: A Step-by-Step Guide

Why Deepseek?
Deepseek is an exciting AI model that stands out for being:
- Cost-effective: Designed for low inference costs, though training costs vary with model size and hardware.
- Open-source: Providing transparency and potential customization options for developers.
- Self-hostable: Allowing businesses and researchers to deploy it on their infrastructure for more control over data and privacy.
At Oddbit, we love diving into new technologies by building practical tools that solve real-world gaps. We believe the best way to learn something is to pick it apart and try to build it ourselves.
We're pragmatic that way: when we see something missing, we don't wait for someone else to fill the gap; we build it for ourselves and for others.
One of our latest explorations led us to create a Genkit plugin for Deepseek, making it easier for developers to integrate and use Deepseek within Firebase projects.
Why Build a Dedicated Deepseek Plugin?
Deepseek already works with the OpenAI SDK. So why did we build a dedicated plugin?
Handling Key Differences
While Deepseek follows OpenAI’s API format, it is not fully compatible with all of OpenAI’s features. Some differences include:
- Lack of support for media responses, which limits its multimodal capabilities.
- Variations in available API parameters, meaning certain options present in OpenAI’s API may not be fully supported or behave differently.
Future-Proofing for Rapid Changes
The AI space is evolving rapidly, with companies frequently introducing new features and optimizations to enhance their models. By developing a dedicated plugin, we can:
- Ensure compatibility with Deepseek’s evolving API without being restricted by OpenAI’s SDK updates.
- Leverage Deepseek’s unique capabilities while addressing its limitations to provide a streamlined developer experience.
Enforcing Customization and Constraints
By building a standalone plugin, we can ensure that Deepseek is utilized to its fullest potential while maintaining compatibility with evolving industry standards. This approach allows us to:
- Implement fine-tuned controls tailored to Deepseek-specific use cases, enabling optimized performance and more precise AI interactions.
- Enforce proper API constraints, preventing unsupported requests from leading to unexpected behavior, ensuring reliability and stability.
- Streamline integration and future updates, allowing for easier adoption of new features and improvements without being constrained by external dependencies.
- Enhance security and compliance, making it easier to apply necessary restrictions to meet specific use case requirements.
How We Built the Plugin
Creating the Genkit plugin for Deepseek was straightforward, thanks to the existing OpenAI plugin, which served as a foundation.
Declaring the Models
To begin, we define the models that our Deepseek plugin will support. This includes specifying configurations and constraints that ensure smooth integration with Genkit. The models are declared with essential metadata, such as supported functionalities and configuration schema, ensuring seamless compatibility with the framework.
```typescript
export const DeepSeekConfigSchema = GenerationCommonConfigSchema.extend({
  frequencyPenalty: z.number().min(-2).max(2).optional(),
  logProbs: z.boolean().optional(),
  presencePenalty: z.number().min(-2).max(2).optional(),
  seed: z.number().int().optional(),
  topLogProbs: z.number().int().min(0).max(20).optional(),
  user: z.string().optional(),
});

export const deepseekChat = modelRef({
  name: "deepseek/deepseek-chat",
  info: {
    label: "DeepSeek - Chat",
    supports: {
      media: false,
      output: ["text"],
      multiturn: true,
      systemRole: true,
      tools: false,
    },
  },
  configSchema: DeepSeekConfigSchema,
});

export const deepseekReasoner = modelRef({
  name: "deepseek/deepseek-reasoner",
  info: {
    label: "DeepSeek - Reasoner",
    supports: {
      media: false,
      output: ["text"],
      multiturn: true,
      systemRole: true,
      tools: false,
    },
  },
  configSchema: DeepSeekConfigSchema,
});
```
The `DeepSeekConfigSchema` defines adjustable parameters such as penalties, logging options, and user-specific configurations. These models (`deepseekChat` and `deepseekReasoner`) are then referenced within the plugin to ensure structured and consistent behavior when making requests to Deepseek.
Registering the Models and Setting Up the Runner
In this step, we register the Deepseek models with Genkit and set up the runner that handles requests. By doing this, we ensure Deepseek models are available within our AI framework while leveraging OpenAI’s SDK client for seamless integration.
The `deepseek` function initializes the plugin, fetching the API key and base URL from the provided options or environment variables. The plugin then creates an OpenAI client instance, which acts as the interface for interacting with Deepseek's API.
For each supported Deepseek model, the plugin dynamically registers it using `ai.defineModel()`, pulling configuration details from the `SUPPORTED_DEEPSEEK_MODELS` object. Each model is linked to a dedicated execution function, `deepseekRunner()`, which is responsible for formatting requests and handling responses. This modular approach ensures that new models can be easily added without altering the plugin's core logic.
```typescript
export const deepseek = (options?: PluginOptions) =>
  genkitPlugin("deepseek", async (ai: Genkit) => {
    const apiKey = options?.apiKey || process.env.DEEPSEEK_API_KEY;
    if (!apiKey) {
      throw new Error(
        "Deepseek API key is required. Pass plugin options or set DEEPSEEK_API_KEY environment variable."
      );
    }
    const baseURL = options?.baseURL || process.env.DEEPSEEK_API_URL || "https://api.deepseek.com";
    const client = new OpenAI({ apiKey, baseURL });

    for (const name of Object.keys(SUPPORTED_DEEPSEEK_MODELS)) {
      const model = SUPPORTED_DEEPSEEK_MODELS[name];
      ai.defineModel(
        {
          name: model.name,
          ...model.info,
          configSchema: model.configSchema,
        },
        deepseekRunner(name, client)
      );
    }
  });

export default deepseek;
```
By structuring the plugin this way, we ensure that Deepseek models integrate smoothly with Genkit while maintaining flexibility for future updates and expansions.
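The `PluginOptions` type isn't shown in the snippet above. A plausible shape, together with the fallback chain the initializer applies, might look like the following; this is a sketch inferred from the code above, not the plugin's actual source:

```typescript
// Hypothetical shape of the plugin's options, inferred from the
// initialization code; the published type may include more fields.
interface PluginOptions {
  apiKey?: string;  // falls back to the DEEPSEEK_API_KEY environment variable
  baseURL?: string; // falls back to DEEPSEEK_API_URL, then the public endpoint
}

// Mirrors the fallback chain in the initializer: explicit options win,
// then environment variables, then the default Deepseek endpoint.
function resolveOptions(
  options: PluginOptions | undefined,
  env: Record<string, string | undefined>
) {
  const apiKey = options?.apiKey || env.DEEPSEEK_API_KEY;
  if (!apiKey) {
    throw new Error('Deepseek API key is required.');
  }
  const baseURL = options?.baseURL || env.DEEPSEEK_API_URL || 'https://api.deepseek.com';
  return { apiKey, baseURL };
}

// In the plugin, env would be process.env; here we pass explicit values:
console.log(resolveOptions({ apiKey: 'sk-test' }, {}));
// prints { apiKey: 'sk-test', baseURL: 'https://api.deepseek.com' }
```

Keeping this resolution logic in one place makes the precedence rules easy to test and document.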
Understanding the Deepseek Runner
The Deepseek runner acts as the core execution mechanism that translates a Genkit request into a format compatible with Deepseek’s API. This ensures that Deepseek’s capabilities are correctly leveraged within the Genkit ecosystem.
At a high level, the runner:
- Processes requests: converts a Genkit `GenerateRequest` into a Deepseek API request.
- Handles responses: transforms the API response into a structured Genkit `GenerateResponseData` format.
- Supports configuration options: ensures key parameters such as token limits, penalties, and user-defined settings are properly applied.
- Facilitates streaming or batch responses: determines whether responses should be delivered incrementally (streaming) or as a single batch.
While this gives a general sense of the runner's purpose, the finer details of how it works, including request formatting, role mapping, and data transformations, are best understood by reviewing the source code directly.
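To give a flavor of the role mapping involved, here is a heavily simplified, hypothetical sketch of the request-side conversion. The types are trimmed down for illustration; the real runner works with Genkit's full `GenerateRequest` and the OpenAI SDK's message types, and also handles config mapping, streaming, and finish reasons:

```typescript
// Hypothetical sketch of the request-side conversion a runner performs.
// Types are heavily simplified and not the plugin's actual definitions.
type GenkitMessage = { role: 'system' | 'user' | 'model'; content: { text?: string }[] };
type OpenAIChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

function toOpenAIMessages(messages: GenkitMessage[]): OpenAIChatMessage[] {
  return messages.map((m) => ({
    // Genkit calls the assistant role "model"; the OpenAI format calls it "assistant"
    role: m.role === 'model' ? 'assistant' : m.role,
    // These Deepseek models are text-only, so text parts are simply joined
    content: m.content.map((part) => part.text ?? '').join(''),
  }));
}

console.log(toOpenAIMessages([
  { role: 'system', content: [{ text: 'You are terse.' }] },
  { role: 'user', content: [{ text: 'Hi' }] },
  { role: 'model', content: [{ text: 'Hello.' }] },
]));
```

The resulting array is what a runner would pass to the OpenAI client's chat completions call against Deepseek's endpoint.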
For a deeper dive into the runner implementation, check out the source code in our GitHub repository.
Using the Deepseek Plugin in a Genkit Project
To use the plugin, install it in your Genkit project:
```shell
npm install genkitx-deepseek
```
Then, import and configure it:
```typescript
import { genkit } from 'genkit';
import deepseek, { deepseekChat } from 'genkitx-deepseek';

const ai = genkit({
  plugins: [deepseek({ apiKey: process.env.DEEPSEEK_API_KEY })],
  model: deepseekChat,
});
```
Now, you can generate responses using Deepseek:
```typescript
const response = await ai.generate({
  model: deepseekChat,
  prompt: 'Tell me a joke!',
});

console.log(response);
```
Get Started with Deepseek in Your Genkit Project
We believe Deepseek is a game-changer for developers looking for an affordable and open-source LLM. With our Deepseek Genkit plugin, you can start integrating it into your Firebase apps today!
We are excited to hear from the community about your experiences using this plugin. Whether you’re building new applications, experimenting with different AI use cases, or pushing the boundaries of what Deepseek can do, we’d love to see what you create.
We also welcome contributions in the form of code improvements, testing, and feedback to help refine and enhance the plugin. Let’s collaborate and make this tool even better together!