Overview
Dive into a comprehensive tutorial on implementing AI function calling in Rust with the OpenAI API, using an approach that also transfers to other language models such as Mistral, Mixtral, Llama, and Gemini. Follow along with six coded examples, starting from basic project setup and progressing through a simple chat implementation, tool calls, tool responses, and successive refactorings. Explore concurrent OpenAI requests with tokio's JoinSet, plus bonus content on rpc_router state (i.e., resources). Access the accompanying GitHub repository for hands-on practice and use the Rust10x extension for a smoother development workflow. Gain practical insight into production-level Rust coding for AI applications through this nearly two-hour instructional video.
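As a taste of what the examples build toward (this is not the video's code), here is a minimal sketch of a Chat Completions request that advertises a single tool, assuming reqwest (with its json feature), serde_json, and tokio as dependencies and an OPENAI_API_KEY environment variable; the get_weather tool, the model name, and the question are placeholders:

```rust
// Illustrative sketch only: one chat-completions request that advertises one tool.
// Assumed Cargo.toml dependencies: tokio (macros, rt-multi-thread), reqwest (json), serde_json.
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;

    // Tool (function) spec in the shape the Chat Completions API expects.
    let body = json!({
        "model": "gpt-4o-mini", // example model name only
        "messages": [
            { "role": "user", "content": "What is the weather in Paris?" }
        ],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather", // hypothetical tool for illustration
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": { "city": { "type": "string" } },
                    "required": ["city"]
                }
            }
        }]
    });

    let resp: Value = reqwest::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    // If the model decided to call the tool, the call shows up here.
    if let Some(tool_calls) = resp.pointer("/choices/0/message/tool_calls") {
        println!("tool calls: {tool_calls}");
    } else {
        println!("response: {resp}");
    }

    Ok(())
}
```

When the model chooses to call the tool, choices[0].message.tool_calls carries the function name and a JSON-encoded arguments string; the tool-call and tool-response chapters below are about parsing that, executing the function, and feeding the result back as a tool message.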
Syllabus
- Overview
- 00:01:36 C00 - Setup - Cargo.toml / dependencies
- 00:03:50 C00 - Setup - error.rs
- 00:07:11 C01 - Simple Chat
- 00:23:09 C02 - Tool Calls
- 00:34:23 C03 - Tool Responses
- 00:52:11 C04 - Conv refactoring
- 01:03:36 C05.1 - Continue refactor for schema.rs
- 01:14:24 C05.2 - schema.rs and Spec
- 01:31:09 C05.3 - into spec params
- 01:46:50 C06.1 - JoinSet - concurrent OpenAI requests (sketched after this list)
- 01:51:44 C06.2 - Bonus - rpc_router state, i.e., resources
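For C06.1, the concurrency pattern can be sketched with tokio::task::JoinSet roughly as follows; the ask function here is a stand-in for whatever performs the real OpenAI call in the tutorial's code:

```rust
// Illustrative sketch only: fan out several requests concurrently with JoinSet.
use tokio::task::JoinSet;

// Stand-in for the real chat-completions call.
async fn ask(question: String) -> String {
    format!("answer to: {question}")
}

#[tokio::main]
async fn main() {
    let questions = [
        "Why is the sky blue?",
        "Why is the sea salty?",
        "Why is grass green?",
    ];

    let mut set = JoinSet::new();
    for q in questions {
        // Each spawned task runs concurrently on the Tokio runtime.
        set.spawn(ask(q.to_string()));
    }

    // Collect results as tasks finish (completion order, not spawn order).
    while let Some(res) = set.join_next().await {
        match res {
            Ok(answer) => println!("{answer}"),
            Err(err) => eprintln!("task failed: {err}"),
        }
    }
}
```

join_next returns results in completion order rather than spawn order, which suits independent prompts that can be handled as soon as each one finishes.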
Taught by
Jeremy Chone