Brian Cabral | Publications

Notable Publications

An Integrated 6DoF Video Camera and System Design
Designing a fully integrated 360° video camera supporting 6DoF head motion parallax requires overcoming many technical hurdles, including camera placement, optical design, sensor resolution, system calibration, real-time video capture, depth reconstruction, and real-time novel view synthesis. While there is a large body of work describing various system components, such as multi-view depth estimation, our paper is the first to describe a complete, reproducible system that considers the challenges arising when designing, building, and deploying a full end-to-end 6DoF video camera and playback environment. Our system includes a computational imaging software pipeline supporting online markerless calibration, high-quality reconstruction, and real-time streaming and rendering. Most of our exposition is based on a professional 16-camera configuration, which will be commercially available to film producers. However, our software pipeline is generic and can handle a variety of camera geometries and configurations. The entire calibration and reconstruction software pipeline along with example datasets is open sourced to encourage follow-up research in high-quality 6DoF video reconstruction and rendering.
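
The core operation behind depth-based novel view synthesis in such a system is reprojecting calibrated RGB-D pixels into the viewer's head pose. The sketch below is a minimal, hypothetical illustration of that one step; the intrinsics, transform, and single-point example are illustrative and are not taken from the released pipeline.

```python
# A minimal, hypothetical sketch of depth-based reprojection, the core
# operation behind novel view synthesis from calibrated RGB-D cameras.
# The matrices and the single-point example are illustrative only.
import numpy as np

def reproject(p_src, depth, K_src, K_dst, T_src_to_dst):
    """Map a pixel (u, v) with known depth from a source camera into a
    destination (novel) camera.  K_* are 3x3 intrinsics; T_src_to_dst is a
    4x4 rigid transform from source-camera to destination-camera frame."""
    u, v = p_src
    # Back-project the pixel to a 3D point in the source camera frame.
    ray = np.linalg.inv(K_src) @ np.array([u, v, 1.0])
    X_src = np.append(depth * ray, 1.0)              # homogeneous 3D point
    # Move the point into the destination camera frame.
    X_dst = T_src_to_dst @ X_src
    # Project into the destination image.
    x = K_dst @ X_dst[:3]
    return x[:2] / x[2]

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
T = np.eye(4)
T[0, 3] = 0.1                                        # 10 cm baseline
print(reproject((320, 240), depth=2.0, K_src=K, K_dst=K, T_src_to_dst=T))
```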

Reflection Space Image Based Rendering
We extended the notion of plenoptic, image-based rendering into reflection space by merging sphere maps and radially isotropic BRDFs into a series of reflection maps, one defined at each vertex of a geodesic sphere enclosing the object. We introduced the notion of spherical barycentric interpolation to blend adjacent reflection maps and obtain intermediate reflection maps. This paper was the first to generalize barycentric interpolation onto a spherical manifold.
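
The sketch below illustrates one way to form spherical barycentric weights, assuming area-based weights over a spherical triangle of geodesic vertices; the paper's exact construction may differ, and the per-vertex reflection maps are stood in for by scalar values.

```python
# A minimal sketch of spherical barycentric blending, assuming area-based
# weights over a spherical triangle; reflection-map texels are replaced by
# scalar per-vertex values for brevity.
import numpy as np

def spherical_area(a, b, c):
    """Solid angle of the spherical triangle (a, b, c), unit vectors,
    via the Van Oosterom & Strackee formula."""
    num = abs(np.dot(a, np.cross(b, c)))
    den = 1.0 + np.dot(a, b) + np.dot(a, c) + np.dot(b, c)
    return 2.0 * np.arctan2(num, den)

def spherical_barycentric(d, v0, v1, v2):
    """Weights for direction d inside the spherical triangle (v0, v1, v2):
    each weight is proportional to the area of the sub-triangle opposite
    its vertex, normalized to sum to one."""
    w = np.array([spherical_area(d, v1, v2),
                  spherical_area(d, v0, v2),
                  spherical_area(d, v0, v1)])
    return w / w.sum()

# Blend the three adjacent reflection maps (here just scalars) for a
# reflection direction d inside one geodesic face.
v0, v1, v2 = np.eye(3)                        # vertices of one geodesic face
d = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)  # query reflection direction
maps = np.array([0.2, 0.5, 0.9])              # per-vertex reflection values
w = spherical_barycentric(d, v0, v1, v2)
print(w, w @ maps)
```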

Efficient Bump Mapping Hardware
We recognized that bump mapping can be performed in "flat" tangent space (the tangent fiber bundle) and is thus amenable to computation in a plane of pixel processors. This work led directly to the first pixel shader compiler for graphics hardware.
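
A minimal sketch of the idea, assuming a height-field bump map and a simple Lambertian term: the light is moved into the tangent frame once, and all per-pixel work then happens in "flat" tangent space. It illustrates the principle, not the hardware design in the paper.

```python
# A minimal sketch of tangent-space bump-mapped diffuse shading.
import numpy as np

def bump_shade(height, light_ws, T, B, N, strength=1.0):
    """Shade a patch whose height field `height` (H x W) defines the bumps.
    T, B, N are the surface tangent, bitangent, and normal in world space;
    light_ws is the world-space light direction."""
    # Move the light into tangent space once; everything else per pixel
    # happens in the "flat" tangent frame.
    tbn = np.stack([T, B, N])                      # rows: tangent frame axes
    L = tbn @ (light_ws / np.linalg.norm(light_ws))
    # Finite-difference the height field to get the perturbed normal,
    # which in tangent space is simply (-dh/du, -dh/dv, 1), normalized.
    dhdu = np.gradient(height, axis=1) * strength
    dhdv = np.gradient(height, axis=0) * strength
    n = np.dstack([-dhdu, -dhdv, np.ones_like(height)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    # Per-pixel Lambertian term: N' . L, clamped to zero.
    return np.clip(n @ L, 0.0, None)

h = np.sin(np.linspace(0, 4 * np.pi, 64))[None, :] * np.ones((64, 1))
img = bump_shade(h, light_ws=np.array([0.3, 0.0, 1.0]),
                 T=np.array([1.0, 0, 0]), B=np.array([0, 1.0, 0]),
                 N=np.array([0, 0, 1.0]))
print(img.shape, img.min(), img.max())
```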

Accelerated Volume Rendering and Tomographic Reconstruction Using Texture Mapping Hardware
We introduced the notion of using texture mapping hardware to perform volume rendering. The real touchstone of this paper, however, was the insight that the fan- and cone-beam back projection used in tomographic reconstruction (i.e., inverting the Radon transform via the central projection slice theorem) is essentially the same operation as the forward projection used in volume rendering, and thus could use the same texture mapping hardware with a projective interpolation term.
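
The sketch below illustrates the duality for the simpler parallel-beam case (the paper treats fan and cone beams): forward projection integrates along rays, and unfiltered back projection smears each 1-D projection back along the same rays. The resampling in both directions is exactly what texture mapping hardware accelerates; the function names and toy phantom are illustrative.

```python
# A minimal numpy/scipy sketch of forward/back-projection duality for the
# parallel-beam case.  Forward projection integrates along rays (image
# columns after rotation); unfiltered back projection smears each 1-D
# projection back along the same rays and accumulates.
import numpy as np
from scipy.ndimage import rotate

def forward_project(vol_slice, angles):
    """Parallel-beam sinogram: one 1-D projection per angle."""
    return np.array([rotate(vol_slice, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])

def back_project(sinogram, angles, size):
    """Unfiltered back projection (the adjoint of forward_project)."""
    recon = np.zeros((size, size))
    for proj, a in zip(sinogram, angles):
        smear = np.tile(proj, (size, 1))           # constant along each ray
        recon += rotate(smear, -a, reshape=False, order=1)
    return recon

phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sino = forward_project(phantom, angles)
recon = back_project(sino, angles, 64)
print(sino.shape, recon.shape)
```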

Imaging Vector Fields Using Line Integral Convolution
Line Integral Convolution (LIC) is a vector-field visualization imaging operator. It takes a vector field, an image, and a 1-D convolution kernel and convolves the image along the field's streamlines with the kernel. It represents a seminal work in this arena, spawning substantive follow-on research and a number of Ph.D. theses based directly on this technique.
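
A minimal LIC sketch, using a box kernel and fixed-step Euler streamline tracing for brevity; the actual operator admits arbitrary 1-D kernels and more careful integration.

```python
# A minimal LIC sketch: convolve a white-noise texture along streamlines of
# a 2-D vector field with a box kernel.
import numpy as np

def lic(vx, vy, noise, length=15, step=0.5):
    h, w = noise.shape
    out = np.zeros_like(noise)
    mag = np.hypot(vx, vy) + 1e-8
    ux, uy = vx / mag, vy / mag                    # unit vector field
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):               # forward and backward
                px, py = float(x), float(y)
                for _ in range(length):
                    i, j = int(round(py)) % h, int(round(px)) % w
                    total += noise[i, j]
                    count += 1
                    px += sign * step * ux[i, j]
                    py += sign * step * uy[i, j]
            out[y, x] = total / count              # box-kernel average
    return out

ys, xs = np.mgrid[0:64, 0:64]
vx, vy = -(ys - 32.0), (xs - 32.0)                 # circulating field
img = lic(vx, vy, np.random.rand(64, 64))
print(img.shape)
```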

Bidirectional Reflection Functions From Surface Bump Maps
In this paper we used micro-facet simulations to build up BRDF sample points. The sampled data was then fit using spherical harmonics (orthogonal wave basis functions defined on a spherical manifold). It was early work on BRDFs, introducing the notion of approximating BRDFs with spherical harmonics and the use of micro-facet simulations. It became the basis of my master's thesis. It was also the first to bring ANSI-standard illumination notation to the computer graphics community.
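
A minimal sketch of the fitting step, assuming low-order real spherical harmonics and a least-squares fit; the toy "reflectance" lobe stands in for the micro-facet simulation data and is not from the paper.

```python
# A minimal sketch of fitting sampled directional data with low-order real
# spherical harmonics via least squares.
import numpy as np

def real_sh_basis(d):
    """Real spherical harmonics up to degree 2 evaluated at unit vectors d
    (N x 3); returns an N x 9 design matrix."""
    x, y, z = d[:, 0], d[:, 1], d[:, 2]
    one = np.ones_like(x)
    return np.stack([0.282095 * one,
                     0.488603 * y, 0.488603 * z, 0.488603 * x,
                     1.092548 * x * y, 1.092548 * y * z,
                     0.315392 * (3 * z**2 - 1),
                     1.092548 * x * z,
                     0.546274 * (x**2 - y**2)], axis=1)

rng = np.random.default_rng(0)
dirs = rng.normal(size=(2000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = np.clip(dirs[:, 2], 0, None) ** 2        # toy "reflectance" lobe
coeffs, *_ = np.linalg.lstsq(real_sh_basis(dirs), samples, rcond=None)
approx = real_sh_basis(dirs) @ coeffs              # reconstructed values
print(coeffs.round(3), float(np.abs(approx - samples).mean()))
```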


© 2010-present Brian K. Cabral. All Rights Reserved.