
Jlama: A Native Java LLM Inference Engine

Devoxx via YouTube

Overview

Explore Jlama, an inference engine that brings native AI capabilities to Java developers. Discover how it runs large language models directly inside the Java ecosystem without requiring GPUs. Learn about Jlama's support for popular open models such as Llama, Gemma, and Mixtral, and its use of the Vector API (incubating in Java 21) for SIMD-accelerated performance. Delve into key features including broad model and tokenizer support and implementations of state-of-the-art techniques such as Flash Attention, Mixture of Experts, and Group Query Attention. Understand how Jlama integrates with the LangChain4j project and complements JVector's Java-native vector search to form a complete AI stack for Java. Gain insight into Jlama's technical details and practical applications, and see a live demonstration of its potential for Java-AI integration.
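As a flavor of one technique mentioned above, the sketch below illustrates the core idea of Group Query Attention in plain Java: several query heads share a single key/value head, shrinking the KV cache. This is a minimal illustration of the concept, not Jlama's actual code; the class and method names are hypothetical.

```java
import java.util.Arrays;

public class GqaSketch {
    // With H query heads and G key/value heads (H divisible by G),
    // query head h reads from kv head h / (H / G), so each group of
    // H/G query heads shares one cached key/value head.
    static int kvHeadFor(int queryHead, int nQueryHeads, int nKvHeads) {
        return queryHead / (nQueryHeads / nKvHeads);
    }

    public static void main(String[] args) {
        int nQueryHeads = 8, nKvHeads = 2;
        int[] mapping = new int[nQueryHeads];
        for (int h = 0; h < nQueryHeads; h++) {
            mapping[h] = kvHeadFor(h, nQueryHeads, nKvHeads);
        }
        // Query heads 0-3 share kv head 0; heads 4-7 share kv head 1.
        System.out.println(Arrays.toString(mapping));
    }
}
```

With 8 query heads and 2 KV heads, the KV cache is a quarter of the multi-head-attention size, which is why GQA matters for CPU-bound inference engines like Jlama.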

Syllabus

Jlama: A Native Java LLM Inference Engine by Jake Luciani

Taught by

Devoxx

