# RAG with the current DeepAILab routes

This guide documents the safest RAG surface to use today: the LangChain-backed routes at `/api/langchain/rag/*`. They are implemented in the live service and are the same routes used by the frontend client in the current codebase.
## Overview
DeepAILab currently exposes two RAG-shaped surfaces in the repository: a LangChain-backed route used by the web product and a gateway proxy for the standalone RAG service. This guide intentionally stays on the LangChain-backed surface because it is grounded in current application code and has a clear request schema in the controller.
Set these environment variables before running the examples:

```bash
export DEEPAILAB_BASE_URL="https://api.deepailab.ai"
export DEEPAILAB_SESSION_TOKEN="your-session-jwt"
export DEEPAILAB_USER_ID="user_123"
```

## Quick start

### 1. Add documents to the user vector store
```bash
curl -X POST "$DEEPAILAB_BASE_URL/api/langchain/rag/add-documents" \
  -H "Authorization: Bearer $DEEPAILAB_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-User-ID: $DEEPAILAB_USER_ID" \
  -d '{
    "documents": [
      {
        "content": "DeepAILab requires Node.js 22 or newer for local development.",
        "metadata": {
          "source": "setup-guide",
          "section": "requirements"
        }
      },
      {
        "content": "Recommended profile installs MCP, skills, and agent-team by default.",
        "metadata": {
          "source": "devtools-guide",
          "section": "profiles"
        }
      }
    ]
  }'
```

### 2. Query with grounded retrieval
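The add-documents call above can also be assembled in application code. A minimal TypeScript sketch: the route and field names come from the examples in this guide, while the `buildAddDocumentsRequest` helper and `RagDocument` type are illustrative names, not part of any shipped SDK:

```typescript
// Illustrative helper: assembles the URL, headers, and JSON body for the
// /api/langchain/rag/add-documents route shown in the curl example above.
interface RagDocument {
  content: string;
  metadata?: Record<string, string>;
}

function buildAddDocumentsRequest(
  baseUrl: string,
  sessionToken: string,
  userId: string,
  documents: RagDocument[],
) {
  return {
    url: `${baseUrl}/api/langchain/rag/add-documents`,
    method: "POST" as const,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${sessionToken}`,
      "X-User-ID": userId,
    },
    body: JSON.stringify({ documents }),
  };
}

const req = buildAddDocumentsRequest(
  "https://api.deepailab.ai",
  "your-session-jwt",
  "user_123",
  [{ content: "DeepAILab requires Node.js 22 or newer.", metadata: { source: "setup-guide" } }],
);
console.log(req.url);
```

Keeping request assembly in one place makes it easy to swap the base URL or credentials per environment without touching call sites.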
```bash
curl -X POST "$DEEPAILAB_BASE_URL/api/langchain/rag/query" \
  -H "Authorization: Bearer $DEEPAILAB_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-User-ID: $DEEPAILAB_USER_ID" \
  -d '{
    "question": "What profile should I choose for a new engineering team?",
    "top_k": 5,
    "score_threshold": 0.2,
    "projectId": "docs-demo"
  }'
```

## API examples
Use the REST route directly or call it from your own application. The important detail is the request shape: `question` plus optional retrieval controls, not the older `query` field shown in historical docs.
### cURL
```bash
curl -X POST "$DEEPAILAB_BASE_URL/api/langchain/rag/query" \
  -H "Authorization: Bearer $DEEPAILAB_SESSION_TOKEN" \
  -H "Content-Type: application/json" \
  -H "X-User-ID: $DEEPAILAB_USER_ID" \
  -d '{
    "question": "What profile should I choose for a new engineering team?",
    "top_k": 5,
    "score_threshold": 0.2,
    "projectId": "docs-demo"
  }'
```

### JavaScript / TypeScript
```typescript
const baseUrl = process.env.DEEPAILAB_BASE_URL ?? "https://api.deepailab.ai";
const sessionToken = process.env.DEEPAILAB_SESSION_TOKEN!;
const userId = process.env.DEEPAILAB_USER_ID!;

const response = await fetch(`${baseUrl}/api/langchain/rag/query`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${sessionToken}`,
    "X-User-ID": userId,
  },
  body: JSON.stringify({
    question: "Summarize the recommended developer-tool install profile.",
    top_k: 4,
    score_threshold: 0.15,
    projectId: "docs-demo",
  }),
});

const result = await response.json();
console.log(result);
```

## Best practices
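The snippet above parses the body unconditionally, so a 4xx/5xx response would be parsed as if it were a success payload. A defensive variant, assuming only the route and headers shown in this guide (the `queryRag` name and injectable `fetchImpl` parameter are illustrative, added here to make the wrapper testable):

```typescript
// Illustrative wrapper: fails fast on non-2xx responses instead of
// parsing an error body as a result. fetchImpl defaults to global fetch
// (Node 18+) and can be stubbed in tests.
type FetchLike = (url: string, init: RequestInit) => Promise<Response>;

async function queryRag(
  baseUrl: string,
  sessionToken: string,
  userId: string,
  body: { question: string; top_k?: number; score_threshold?: number; projectId?: string },
  fetchImpl: FetchLike = fetch,
): Promise<unknown> {
  const response = await fetchImpl(`${baseUrl}/api/langchain/rag/query`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${sessionToken}`,
      "X-User-ID": userId,
    },
    body: JSON.stringify(body),
  });
  if (!response.ok) {
    // Surface auth failures (401/403) and server errors immediately.
    throw new Error(`RAG query failed with status ${response.status}`);
  }
  return response.json();
}
```

Injecting the fetch implementation keeps the wrapper unit-testable without a live DeepAILab deployment.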
**Use the LangChain RAG routes documented here.** These routes are implemented in `langchain-service` and are the same ones used by the frontend AI client today.
**Send `question`, not `query`.** The current RAG query controller expects a `question` field with optional `top_k`, `score_threshold`, and `projectId`.
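The expected request shape can be encoded as a type so a stray `query` field is caught at compile time. A sketch: the field names come from the controller schema described above, while the `RagQueryRequest` name is illustrative:

```typescript
// Illustrative type mirroring the current query controller's request body.
interface RagQueryRequest {
  question: string;          // required natural-language question
  top_k?: number;            // optional: how many chunks to retrieve
  score_threshold?: number;  // optional: minimum similarity score
  projectId?: string;        // optional: scope retrieval to one project
}

const body: RagQueryRequest = {
  question: "What profile should I choose for a new engineering team?",
  top_k: 5,
};
console.log(JSON.stringify(body));
```

With this type in place, `{ query: "..." }` fails to compile instead of failing at the API boundary.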
**Always include bearer auth and `X-User-ID`.** The documented routes are JWT-protected and use the user identity for tenant isolation during retrieval.
**Treat `/api/v1/rag/*` as deployment-specific.** The API gateway still proxies `/api/v1/rag/*` to the standalone RAG service, but the collection/document lifecycle depends on how that deployment is wired. This guide stays on the safer LangChain-backed surface.
Continue with the API reference, or review the install center if you are also configuring local coding tools against the same DeepAILab gateway.