Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Stable Diffusion: Worldbuilding with 3D and AI Using MVDream - Part 1
- 1 Intro
- 2 MVDream: what is it/what does it solve?
- 3 Dataset Overview
- 4 Camera settings explanation
- 5 Multiview perspective
- 6 Multiview 2D code dive
- 7 Camera embedding
- 8 Camera utils
- 9 Setting up MVDream 2D environment
- 10 Trying out the Gradio server
- 11 Start of MVDream-threestudio
- 12 Setting up Docker environment for MVDream-threestudio
- 13 Explaining why the Gradio server for 3D is not usable
- 14 Generating NeRFs through CLI
- 15 Exporting meshes
- 16 Evaluating MVDream mesh fidelity
- 17 Second stage refinement and why I don't recommend it
- 18 Redesign from refinement = unusable
- 19 Showing some other NeRF to 3D mesh objects
- 20 Rendering out a 3D object
- 21 Using 3D renders as ControlNet guides
- 22 Worldbuilding overview context
- 23 Potential room designs
- 24 Potential chair designs
- 25 Generating albedo map texture in ComfyUI
- 26 Using Adobe 3D Sampler to convert albedo into PBR textures
- 27 Quick setup of converted PBR textures
- 28 Using same process to generate metal textures
- 29 Quick overview of using Cube by CSM to convert a picture to a mesh
- 30 Checking refined mesh from Cube
- 31 Closing thoughts
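The CLI workflow covered in chapters 11 through 15 (setting up MVDream-threestudio, generating a NeRF through the CLI, and exporting a mesh) can be sketched roughly as follows. This is a minimal sketch under assumptions, not the instructor's exact commands: the config filename, run/checkpoint paths, and prompt are placeholders, and option keys may differ between MVDream-threestudio versions, so check the repo's `configs/` directory and README before running.

```shell
# Clone the MVDream-threestudio integration (assumed repo location and layout).
git clone https://github.com/bytedance/MVDream-threestudio.git
cd MVDream-threestudio

# Train a NeRF from a text prompt via the CLI instead of the Gradio server.
# Config name and option keys are assumptions; a GPU is required.
python launch.py --config configs/mvdream-sd21.yaml --train --gpu 0 \
    system.prompt_processor.prompt="a wooden chair, studio lighting"

# Export the trained NeRF as a mesh (chapter 15).
# <run-name> is a placeholder for the output directory of the run above.
python launch.py --config configs/mvdream-sd21.yaml --export --gpu 0 \
    resume=outputs/<run-name>/ckpts/last.ckpt \
    system.exporter_type=mesh-exporter
```

The exported mesh can then be inspected and rendered in a 3D tool, as in chapters 16 and 20, before the renders are reused as ControlNet guides.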