# Quick Start

Choose your runtime and start making AI calls in minutes.

## Prerequisites

- An API key from any supported provider (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `DEEPSEEK_API_KEY`)
- The AI-Protocol repository (automatically fetched from GitHub if not local)
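Before running the examples, it can help to verify that one of those keys is actually set, so failures show up as a clear message rather than a provider auth error. A minimal sketch (the `available_keys` helper is illustrative, not part of ai-lib):

```python
import os

# Environment variables used by the examples on this page.
PROVIDER_KEYS = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "DEEPSEEK_API_KEY")

def available_keys(env=None):
    """Return the provider keys that are set to a non-empty value."""
    env = os.environ if env is None else env
    return [key for key in PROVIDER_KEYS if env.get(key)]

# Usage: fail fast before creating a client.
# if not available_keys():
#     raise SystemExit(f"Set one of: {', '.join(PROVIDER_KEYS)}")
```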
## Rust

### 1. Add the dependency

```toml
[dependencies]
ai-lib = "0.6.6"
tokio = { version = "1", features = ["full"] }
```

### 2. Set your API key

```bash
export ANTHROPIC_API_KEY="your-key-here"
```

### 3. Write your first program

```rust
use ai_lib::{AiClient, StreamingEvent};
use futures::StreamExt;

#[tokio::main]
async fn main() -> ai_lib::Result<()> {
    // Create client — protocol manifest is loaded automatically
    let client = AiClient::from_model("anthropic/claude-3-5-sonnet").await?;

    // Streaming chat
    let mut stream = client.chat()
        .user("What is AI-Protocol?")
        .temperature(0.7)
        .max_tokens(500)
        .stream()
        .execute_stream()
        .await?;

    while let Some(event) = stream.next().await {
        match event? {
            StreamingEvent::ContentDelta { text, .. } => print!("{text}"),
            StreamingEvent::StreamEnd { .. } => println!(),
            _ => {}
        }
    }
    Ok(())
}
```

### 4. Run

```bash
cargo run
```

## Python
### 1. Install the package

```bash
pip install ai-lib-python
```

### 2. Set your API key

```bash
export ANTHROPIC_API_KEY="your-key-here"
```

### 3. Write your first script

```python
import asyncio
from ai_lib_python import AiClient

async def main():
    # Create client — protocol manifest loaded automatically
    client = await AiClient.create("anthropic/claude-3-5-sonnet")

    # Streaming chat
    async for event in (
        client.chat()
        .user("What is AI-Protocol?")
        .temperature(0.7)
        .max_tokens(500)
        .stream()
    ):
        if event.is_content_delta:
            print(event.as_content_delta.text, end="")
    print()

asyncio.run(main())
```

### 4. Run

```bash
python main.py
```

## Switching Providers
The magic of AI-Lib: change one string to switch providers.

```rust
// Rust — just change the model ID
let client = AiClient::from_model("openai/gpt-4o").await?;
let client = AiClient::from_model("deepseek/deepseek-chat").await?;
let client = AiClient::from_model("gemini/gemini-2.0-flash").await?;
```

```python
# Python — same thing
client = await AiClient.create("openai/gpt-4o")
client = await AiClient.create("deepseek/deepseek-chat")
client = await AiClient.create("gemini/gemini-2.0-flash")
```

No code changes needed. The protocol manifest handles the endpoint, auth, parameter mapping, and streaming format for each provider.
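Because switching is just a string change, the model ID can come from configuration instead of code. A minimal sketch in Python (the `AI_MODEL` variable and `resolve_model` helper are illustrative assumptions, not part of ai-lib):

```python
import os

# Fallback model when no override is configured.
DEFAULT_MODEL = "anthropic/claude-3-5-sonnet"

def resolve_model(env=None):
    """Pick the model ID from an AI_MODEL environment variable, else a default."""
    env = os.environ if env is None else env
    return env.get("AI_MODEL") or DEFAULT_MODEL

# Usage: AI_MODEL=openai/gpt-4o python main.py
# client = await AiClient.create(resolve_model())
```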
## Next Steps

- Ecosystem Architecture — How the pieces fit together
- Chat Completions Guide — Detailed chat API usage
- Function Calling — Tool use and function calling
- Rust SDK Details — Deep dive into Rust
- Python SDK Details — Deep dive into Python