OpenAI Agents

The API powering most production AI applications in 2025.

OpenAI's APIs, the Assistants API v2 in particular, are the starting point for most developers building real AI applications. If you want to go from "I've used ChatGPT" to "I've deployed a production AI feature", the OpenAI API is where you begin. It handles the hard parts — token streaming, context windows, tool calling — so you can focus on building your product.
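The token-streaming piece, for example, comes down to concatenating content deltas as chunks arrive. A minimal sketch of that loop — the chunk shape loosely mirrors the Chat Completions streaming format, but the dicts here are hand-written stand-ins, since a live stream needs an API key:

```python
# Sketch: assembling a streamed reply from content deltas.
# These chunk dicts are stand-ins for a live streaming response.

def collect_stream(chunks):
    """Concatenate the content deltas from a stream of chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

fake_stream = [
    {"choices": [{"delta": {"role": "assistant"}}]},  # first chunk: role only
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},                     # final chunk: no content
]

print(collect_stream(fake_stream))  # prints: Hello, world
```

In a real app you'd render each delta as it arrives rather than buffering the whole reply, but the accumulation logic is the same.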

In this track, I cover the Assistants API v2 (Threads, Runs, and the new File Search), function calling with structured outputs, context management strategies, and a complete project: an AI math tutor with a Code Interpreter. These aren't toy examples — they're the patterns I use in production.
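To make the function-calling pattern concrete: you describe your tools to the API as JSON Schema, and when the model decides to call one, your code routes the call to a real function. A minimal sketch of the local half of that loop — `get_weather` is a hypothetical tool, and the `tool_call` dict is a hand-written stand-in for what a Run returns:

```python
import json

def get_weather(city: str) -> str:
    # Hypothetical tool; a real one would query a weather service.
    return f"Sunny in {city}"

# The schema you would pass to the API when creating the assistant.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Map tool names to the Python functions that implement them.
REGISTRY = {"get_weather": get_weather}

def dispatch(tool_call):
    """Route a model-issued tool call to the matching Python function."""
    fn = REGISTRY[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])  # arguments arrive as a JSON string
    return fn(**args)

tool_call = {"function": {"name": "get_weather",
                          "arguments": '{"city": "Oslo"}'}}
print(dispatch(tool_call))  # prints: Sunny in Oslo
```

In production you'd submit the return value back to the Run as a tool output so the model can continue; the registry-plus-dispatch shape stays the same.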

By the end of this track, you'll understand how LLMs actually work (not just how to call them), how to give your agent tools it can use to take real-world actions, and how to manage long conversations without hitting context limits. Whether you're building a customer support bot, a coding assistant, or a data analysis tool, these fundamentals apply.
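The simplest context-management strategy is a sliding window: always keep the system prompt, then keep as many of the most recent messages as fit the budget. A sketch under a rough assumption — tokens are approximated as characters divided by four, where a real implementation would use a tokenizer such as tiktoken:

```python
# Sketch: sliding-window context trimming.
# Token counts are a rough chars/4 heuristic, not a real tokenizer.

def rough_tokens(message):
    return len(message["content"]) // 4 + 4  # +4 for role/formatting overhead

def trim_history(messages, budget):
    """Keep the system prompt plus the most recent messages under budget."""
    system, rest = messages[0], messages[1:]
    kept, used = [], rough_tokens(system)
    for msg in reversed(rest):          # walk newest-first
        cost = rough_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return [system] + kept[::-1]        # restore chronological order

history = [{"role": "system", "content": "You are a helpful tutor."}]
history += [{"role": "user", "content": f"question {i} " * 20} for i in range(50)]
trimmed = trim_history(history, budget=500)
assert trimmed[0]["role"] == "system"   # system prompt always survives
assert len(trimmed) < len(history)      # older messages were dropped
```

Dropping whole messages from the oldest end keeps conversations coherent; fancier strategies (summarizing the dropped turns, or retrieval over them) build on this same skeleton.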

📚 Learning Path

  1. Understanding LLMs from first principles
  2. Assistants API and thread management
  3. Function calling and tool use
  4. Context management strategies
  5. Build: AI Math Tutor with Code Interpreter

5 Guides in This Track

Understanding LLMs

How models think and reason.

Assistants API v2

Threads, Runs, and File Search.

Tools & Function Calling

Giving your agent hands to act.

Context Management

Handling memory limits.

Project: Math Tutor

Build a tutor with Code Interpreter.
