Human API
Memory-enhanced LLM endpoint. Automatically inject user context into any AI conversation with just 2 lines of code.
Get Started in 2 Minutes
Just change 2 lines of code to add memory to your LLM calls.
OpenAI Compatible
Use any OpenAI SDK — just change the base URL
Memory Injection
Add {onairos_memory} to prompts for instant context
Zero Infrastructure
No databases, embeddings, or RAG to manage
Privacy Built-in
Automatic PII removal and isolated user data
See How Apps Benefit From Using Onairos Memory
One base URL change. Personalized responses for every user, across any app.
Quick Example
// Standard OpenAI call
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: 'your_openai_api_key_here' });

const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{
    role: 'user',
    content: 'What should I work out today?'
  }]
});
// Returns: Generic workout advice
// Just change these two lines in your existing code:
const onairos_client = new OpenAI({
  apiKey: 'your_app_api_key_here', // Your developer key
  baseURL: 'https://developer.onairos.uk/v1' // Our endpoint
});
// Human API call with memory
const response = await onairos_client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{
    role: 'user',
    content: 'Based on {onairos_memory}, what should I work out today?'
  }]
});
// Returns: "Since you did legs yesterday and prefer morning cardio..."
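Because the endpoint is OpenAI-compatible, the same request can also be made without any SDK. A minimal sketch using raw fetch follows; askWithMemory is an illustrative helper name, not part of the Onairos API, and it assumes the endpoint accepts the standard OpenAI chat completions wire format.

```javascript
// Minimal raw-HTTP sketch, assuming the endpoint follows the standard
// OpenAI chat completions wire format. askWithMemory is an illustrative
// helper, not part of the Onairos API.
async function askWithMemory(apiKey, question) {
  const res = await fetch('https://developer.onairos.uk/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4o',
      // The literal {onairos_memory} token stays in the prompt so the
      // endpoint can inject this user's context before the model sees it.
      messages: [{ role: 'user', content: `Based on {onairos_memory}, ${question}` }],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The SDK route above is simpler in practice; the raw call is mainly useful for environments where installing the OpenAI package is not an option.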
Use Cases
Fitness Apps
"Based on {onairos_memory}, what should I focus on today?"
Without memory: Generic workout advice
With memory: "Since you did legs yesterday and prefer 30-minute sessions..."
Dating Apps
"Based on {onairos_memory}, suggest conversation starters"
Without memory: Generic icebreakers
With memory: "Based on your interest in jazz music and hiking..."
Ready to add memory to your LLM calls?
Human API is available now. Book a demo to see how it can transform your AI application.