
YouTube

Building Low Latency Microservices & Monoliths in Java Using High Performance Serialization & Messaging

GOTO Conferences via YouTube

Overview

Explore microservices architecture and monoliths in Java through this conference talk from GOTO Chicago 2016. Delve into the importance of latency in microservices, learn when to choose microservices over monoliths, and discover techniques for building high-performance systems. Gain insights on low-latency serialization, messaging, and garbage collection optimization. Follow along as Peter Lawrey, CEO at Higher Frequency Trading Ltd, demonstrates practical examples, discusses component-based design, and explores distributed systems. Understand CPU layouts, thread affinity, and data modeling for optimal performance. Learn about testing strategies, transports like Chronicle Queue, and flow control in Java-based microservices. Acquire valuable knowledge on building efficient, scalable, and responsive Java applications for both microservices and monolithic architectures.
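
To give a flavour of the persisted, low-latency messaging style the talk covers, here is a minimal sketch of appending and reading one message with the open-source Chronicle Queue library (net.openhft.chronicle.queue). It is not taken from the talk; the queue directory name and the order text are made up for illustration.

import net.openhft.chronicle.queue.ChronicleQueue;
import net.openhft.chronicle.queue.ExcerptAppender;
import net.openhft.chronicle.queue.ExcerptTailer;

public class QueueSketch {
    public static void main(String[] args) {
        // Messages are appended to memory-mapped files, so they persist
        // without a broker and can be replayed by other processes.
        try (ChronicleQueue queue = ChronicleQueue.singleBuilder("orders-queue").build()) {
            ExcerptAppender appender = queue.acquireAppender();
            appender.writeText("new-order: AAPL 100 @ 153.25");   // publish one message

            ExcerptTailer tailer = queue.createTailer();
            String message = tailer.readText();                   // consume it in order
            System.out.println("read: " + message);
        }
    }
}

In a microservice built this way, each service reads its inputs from one queue and writes its outputs to another, which also gives a replayable record for testing.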

Syllabus

Introduction
My first computer
Microservices
Web Page Size
Chart
Asynchronous messaging
Low latency definition
Low latency example
Doing less work
Component-based design
Lambda architecture
Putting it all together
Distributed Systems
Single Core Processes
Single Core Example
CPU Layout
Thread Affinity
Challenges
Approach
Example
Data Model
Data Model Example
Mocking Component
Testing Component
Transports
Chronicle Queue
JMH (Java Microbenchmark Harness; see the sketch below the syllabus)
Flow Control
Examples
Products
Summary
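
The JMH syllabus item refers to the OpenJDK Java Microbenchmark Harness, a standard tool for measuring latency figures of the kind discussed in the talk. Below is a minimal, self-contained JMH sketch; the benchmarked string concatenation is only a placeholder workload, not an example from the talk.

import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

@State(Scope.Thread)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
public class LatencyBenchmark {
    int value = 42;

    @Benchmark
    public String formatValue() {
        // JMH calls this method repeatedly and reports average time per call.
        return "value=" + value;
    }

    public static void main(String[] args) throws Exception {
        Options opts = new OptionsBuilder()
                .include(LatencyBenchmark.class.getSimpleName())
                .forks(1)
                .build();
        new Runner(opts).run();
    }
}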

Taught by

GOTO Conferences
