Stable Diffusion: Worldbuilding with 3D and AI Using MVDream - Part 1

kasukanra via YouTube

Classroom Contents

  1. Intro
  2. MVDream: what is it/what does it solve?
  3. Dataset Overview
  4. Camera settings explanation
  5. Multiview perspective
  6. Multiview 2D code dive
  7. Camera embedding
  8. Camera utils
  9. Setting up MVDream 2D environment
  10. Trying out the Gradio server
  11. Start of MVDream-threestudio
  12. Setting up Docker environment for MVDream-threestudio
  13. Explaining why the Gradio server for 3D is not usable
  14. Generating NeRFs through CLI (see the command sketch after this list)
  15. Exporting meshes
  16. Evaluating MVDream mesh fidelity
  17. Second stage refinement and why I don't recommend it
  18. Redesign from refinement = unusable
  19. Showing some other NeRF to 3D mesh objects
  20. Rendering out a 3D object
  21. Using 3D renders as ControlNet guides
  22. Worldbuilding overview context
  23. Potential room designs
  24. Potential chair designs
  25. Generating albedo map texture in ComfyUI
  26. Using Adobe 3D Sampler to convert albedo into PBR textures
  27. Quick setup of converted PBR textures
  28. Using same process to generate metal textures
  29. Quick overview of using Cube by CSM to convert a picture to a mesh
  30. Checking refined mesh from Cube
  31. Closing thoughts
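
For chapters 14 and 15, NeRF generation and mesh export in MVDream-threestudio run from the command line rather than the Gradio server. The sketch below assumes the launch interface documented in the public MVDream-threestudio README; the config filename, prompt, and checkpoint path are illustrative placeholders, not taken from the video:

    # Train a NeRF from a text prompt (config and prompt are placeholders)
    python launch.py --config configs/mvdream-sd21.yaml --train --gpu 0 \
        system.prompt_processor.prompt="an astronaut riding a horse"

    # Export the trained NeRF as a mesh (checkpoint path is a placeholder)
    python launch.py --config configs/mvdream-sd21.yaml --export --gpu 0 \
        resume=path/to/trial/ckpts/last.ckpt system.exporter_type=mesh-exporter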
