Overview
Syllabus
Intro
The problem of novel view interpolation
RGB-alpha volume rendering for view synthesis
Neural networks as a continuous shape representation
Neural network replaces large N-d array
Generate views with traditional volume rendering
Sigma parametrization for continuous opacity
Two-pass rendering: coarse
Two-pass rendering: fine
Viewing directions as input
Volume rendering is trivially differentiable
Optimize with gradient descent on rendering loss
NeRF encodes convincing view-dependent effects using directional dependence
NeRF encodes detailed scene geometry
Going forward
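The volume-rendering slides above (sigma parametrization, transmittance, differentiability) boil down to a simple quadrature along each camera ray. A minimal NumPy sketch, with `render_ray` and its argument names chosen here for illustration rather than taken from the lecture:

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """NeRF-style volume rendering quadrature along a single ray.

    sigmas: (N,) volume densities at the sampled points
    colors: (N, 3) RGB values at the sampled points
    deltas: (N,) distances between adjacent samples
    """
    # Per-segment opacity: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance T_i: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # Each sample contributes weight T_i * alpha_i; composite the colors
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)

# Example: a ray passing through a dense red region mid-way
sigmas = np.array([0.0, 0.0, 50.0, 50.0, 0.0])
colors = np.tile([1.0, 0.0, 0.0], (5, 1))
deltas = np.full(5, 0.1)
rgb = render_ray(sigmas, colors, deltas)
```

Because every operation here is smooth, the rendered color is differentiable with respect to the densities and colors, which is what makes gradient descent on a rendering loss possible.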
Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
Input Mapping
Key Points
Using kernel regression to approximate deep networks
NTK: modeling a deep network as kernel regression
Sinusoidal mapping results in a composed stationary NTK
Resulting composed NTK is stationary
No-mapping NTK clearly not stationary
Toy example of stationarity in practice
Modifying mapping manipulates kernel spectrum
Kernel spectrum has a dramatic effect on convergence and generalization
Frequency sampling distribution: bandwidth matters more than shape
Mapping Code
2D Images
3D Shape
Indirect supervision tasks
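The sinusoidal input mapping behind the results above is a random Fourier feature embedding, gamma(v) = [cos(2*pi*B*v), sin(2*pi*B*v)] with Gaussian frequencies B. A minimal sketch, assuming a 2D image-regression setup with a tunable bandwidth `scale` (all names here are illustrative):

```python
import numpy as np

def fourier_features(v, B):
    """Map coordinates v through gamma(v) = [cos(2*pi*Bv), sin(2*pi*Bv)]."""
    proj = 2.0 * np.pi * v @ B.T                       # (..., m) projections
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
scale = 10.0                                # bandwidth sigma; the key hyperparameter
B = scale * rng.standard_normal((256, 2))   # m = 256 random frequencies for 2D input
coords = rng.random((4, 2))                 # four pixel coordinates in [0, 1)^2
feats = fourier_features(coords, B)         # shape (4, 512), fed to the MLP
```

Scaling `B` widens the spectrum of the composed NTK, which is exactly the bandwidth effect the slides emphasize: the sampling distribution's spread matters far more than its exact shape.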
Taught by
Andreas Geiger