PhBRML: Physically Based Rendering Modeling Language
Status, planning and unresolved issues
Note: some of this documentation is outdated. The main ideas are
still valid, however.
Note: Confidential working document, do not distribute
(and do not distribute anything else in this library until it is in a more final state).
Goal
Making VRML'97 usable for physically based rendering, including
simple image based rendering and augmented reality applications.
- Physically based rendering: obtain the ultimate realistic images of
virtual scenes by (approximately) solving the equations from physics
that describe light transport in the scene.
Illustration: two images of the same scene: left, rendered
with common-practice rendering techniques (the OpenGL model) in VR; right, the
same scene rendered with bidirectional path tracing (which currently gives the
best images of general environments).
- Needed inputs: geometry of the scene + physically based description of
light sources and light-matter interaction, both at surfaces (surface
scattering) and volumetric (transparency and participating media).
Only the former is available in VRML'97. We extend VRML'97
to express the latter as well.
- For an extensive list of what we want to be able to describe, see
Glassner "Principles of Digital Image Synthesis" chapters 11-15.
Motivation
- Need for a powerful, flexible and open 3D model file format for
physics based rendering. We will need it for testing and demonstrating
our future physically based rendering research ideas. Current file formats
do not fulfil our needs.
- Promote physically based rendering for VR applications by extending the
standard language for describing virtual worlds: VRML'97. Global illumination
research aims at a more realistic representation of virtual worlds than
possible with the simple visual representation model available in VRML.
PhB rendering will eventually do much to enhance the feeling of immersion
in VR applications.
- VR applications usually require real-time rendering first and foremost.
The current state of the art does not yet allow real-time physically
based rendering. The ability to describe virtual worlds with real light
sources and appearance will stimulate global illumination research in this
direction.
- For applications in e.g. architecture,
reasonable-time rather than real-time rendering suffices.
Reasonable-time rendering is already available at this time.
Rumour has it that real-time physics based rendering of models of
reasonable size for VR applications is only a couple of years away.
We will be ready to use it.
- VRML'97 is a powerful, flexible and open file format for the
exchange of 3D model data. It supports key-frame animation, scripting,
user interaction, integrates well with other media types (audio, ...).
AND: VRML'97 has a well described extension mechanism. Extending VRML'97 is
far easier than hacking together a new file format of our own.
Previous work
Concepts
- A description of physically based light emission and scattering
characteristics involves functions of position, direction(s),
wavelength, and time. Our job: making it easy to express this position,
direction, wavelength and time dependence for as wide a class of light
scattering and emission models as possible.
- Appearance = surface emission and scattering (EDF and BSDF) +
volume emission and scattering (participating media: isotropic emission,
general phase function) + geometry distortions
(bump- and displacement mapping):
- Surfaces: homogeneous, 2D and 3D textured, layered (lacquered surfaces,
human skin, plant tissue, ...);
- Media: homogeneous, 3D textured;
- Bump- and displacement maps: 2D and 3D.
- Position dependence of inhomogeneous surfaces and media
is expressed by means of 2D and 3D texture maps.
The standard VRML'97 2D texture nodes (image, pixel and movie) are extended
with a procedural 2D texturing node and 3D texturing nodes. Texture map
values are used as weights for mixing surfaces or media components that
can be of any surface or medium type listed above, including inhomogeneous
surfaces or media again.
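As a sketch of this mixing mechanism, an inhomogeneous surface might look as
follows. All node and field names here are hypothetical illustrations, not the
actual names from the PhBRML node reference:

```vrml
# Hypothetical sketch: a 2D texture map supplies per-point weights for
# mixing two component surfaces; each component may itself be any
# surface type, including another inhomogeneous surface.
TexturedSurface {
  weightMap ImageTexture { url "rust-mask.png" }
  components [
    DiffuseSurface { reflectance 0.60 0.25 0.10 }   # where weight = 1: rust
    PhongSurface   { specularity 0.80 0.80 0.80     # where weight = 0: bare metal
                     exponent 40.0 }
  ]
}
```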
- Homogeneous EDFs, BSDFs and phase functions are expressed
as linear combinations of spectral basis functions: sums of products
of directional distributions times spectra times weights. The basis functions
do not need to be independent: the same mechanism makes it easy to
express e.g. a modified Phong reflection model as the sum of a diffuse and a
specular component.
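In formula form, the decomposition reads as follows (the symbols are ours, for
illustration only; they are not part of the PhBRML specification):

```latex
% A homogeneous scattering (or emission) function as a sum of products
% of directional distributions D_k, spectra s_k and weights w_k:
f(\omega_i \to \omega_o, \lambda)
   = \sum_k w_k \, s_k(\lambda) \, D_k(\omega_i \to \omega_o)
% Example: a modified Phong BRDF as a diffuse plus a specular component,
% with D_1 = 1/\pi and D_2 = \frac{n+2}{2\pi}\cos^n\alpha, where \alpha
% is the angle between \omega_o and the mirror direction of \omega_i:
f_r(\omega_i \to \omega_o, \lambda)
   = k_d(\lambda)\,\frac{1}{\pi}
   + k_s(\lambda)\,\frac{n+2}{2\pi}\,\cos^n\alpha
```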
- Directional distributions can be specified in a variety of ways,
including tabulated samples and scripts (procedural distributions).
A small number of popular distributions are built in, including
directional light fields for image based rendering and augmented reality.
Other distributions will be provided as procedural distributions (student
exercise):
- Emitters: directional distribution of EDFs. Types:
diffuse, Phong-like, sampled isotropic (sampled intensity
values versus angle w.r.t. axis of symmetry), directional light field
(given as a texture), procedural. Future plans: IES light source description
files and/or similar, ...
- Scatterers: directional distribution of BSDFs. Types:
diffuse reflector/refractor, modified Phong reflector/refractor,
procedural. Future plans: Fresnel, Cook-Torrance, Poulin-Fournier, HTSG,
Strauss, Ward, Schlick, sampled, ...
- Phase functions: directional distribution of volume scattering functions.
Types: isotropic, procedural. Future plans: sampled, Rayleigh, Murky and Hazy
Mie, Henyey-Greenstein, Schlick, ...
- Wavelength dependence is expressed by means of spectra. A spectrum
is a scalar function of wavelength.
Types: XYZ, Lxy, monochromatic, black body, sampled, tabulated,
procedural + linear combinations.
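Two spectrum nodes might look like this (illustrative node and field names,
assuming regularly spaced, linearly interpolated samples for the sampled type):

```vrml
# Hypothetical sketch of spectrum nodes (names are illustrative):
BlackBodySpectrum { temperature 6500.0 }      # daylight-like spectrum
SampledSpectrum {
  startWavelength 400.0                       # nanometres
  endWavelength   700.0
  values [ 0.10 0.35 0.80 0.95 0.60 0.20 ]   # regularly spaced samples,
}                                             # linearly interpolated
```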
- Additional node types describe
- background radiation (sky illumination, augmented reality, ...)
as a function of incident direction: procedural or expressed by a texture.
- atmosphere: medium outside any object in the scene: e.g. misty air,
underwater scenes, ...
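These scene-level nodes could be sketched as follows (all names hypothetical):

```vrml
# Hypothetical sketch of background radiation and atmosphere nodes:
PhBBackground {
  radiance ImageTexture { url "sky-probe.png" }  # radiance per incident direction
}
Atmosphere {
  medium HomogeneousMedium {          # e.g. misty air outside all objects
    extinction 0.020 0.020 0.030
    albedo     0.80  0.80  0.80
    phaseFunction IsotropicPhase { }
  }
}
```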
- Time dependence is handled using the standard VRML'97 event handling
system with new interpolators for spectra, surfaces and participating
media descriptions. Future plans: geometry distortion interpolators.
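A time-varying spectrum could then be wired up with the standard VRML'97 event
system like this (SpectrumInterpolator and its fields are illustrative names,
not the actual PhBRML ones):

```vrml
# Hypothetical sketch: a light source warming up from dim reddish to
# bright white over five seconds, driven by a standard TimeSensor.
DEF Clock TimeSensor { cycleInterval 5.0 loop TRUE }
DEF WarmUp SpectrumInterpolator {
  key [ 0.0 1.0 ]
  keyValue [
    BlackBodySpectrum { temperature 2000.0 }   # dim, reddish
    BlackBodySpectrum { temperature 6500.0 }   # bright, white
  ]
}
ROUTE Clock.fraction_changed TO WarmUp.set_fraction
```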
Specifications
- VRML'97 is extended using the
EXTERNPROTO
mechanism. That means: define the interface of new VRML'97 scene graph
nodes that will describe physics based appearance and light sources. A
specialised browser (RenderPark, ART) will recognise and use these new node
types. For less fortunate browsers, a default implementation will be developed
that converts the physics based material and light source descriptions as well
as possible into standard VRML materials and light sources.
In short: standard browsers will still be able to process the extended models
while intelligent browsers will also understand the extensions.
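Such an extension declaration could look like this (node name, fields and URL
are hypothetical illustrations of the mechanism, not the actual interface):

```vrml
# Hedged sketch of the EXTERNPROTO extension mechanism: a PhBRML-aware
# browser substitutes its native implementation of the node; any other
# browser fetches the URL, which can supply a default implementation in
# terms of standard VRML'97 materials and light sources.
EXTERNPROTO PhBAppearance [
  field SFNode surface NULL    # EDF + BSDF description
  field SFNode medium  NULL    # interior participating medium
] "http://www.example.org/phbrml/nodes.wrl#PhBAppearance"
```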
- Extensions stick as closely as possible to the semantics of standard
VRML nodes. There are two deviations:
- Procedural textures, spectra and directional distributions
use the same scripting language interface
as the VRML Script node.
That is: arguments are described by eventIn's and return values
by eventOut's. However, unlike
Script nodes, these
procedural nodes do not serve to dynamically modify the world
being interacted with. They do not participate in normal VRML event
processing. There is no way to dynamically change the behaviour of such
procedural description nodes. There is no loss of generality: dynamic medium
and surface changes can be expressed by other means.
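A procedural spectrum node in this style could be sketched as follows (node
name, field names and the Gaussian example function are all hypothetical; only
the Script-like eventIn/eventOut convention is taken from the text above):

```vrml
# Hypothetical sketch: arguments arrive as eventIns, the result leaves
# as an eventOut, as in a Script node, but the node takes no part in
# normal VRML event processing.
ProceduralSpectrum {
  eventIn  SFFloat wavelength   # argument: wavelength in nanometres
  eventOut SFFloat value        # return value: spectral value at it
  url "javascript:
    function wavelength(w) {
      // illustrative: a Gaussian peak around 550 nm
      value = Math.exp(-0.5 * Math.pow((w - 550.0) / 50.0, 2.0));
    }"
}
```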
- New interpolators for spectra, surfaces and
media have a slightly different look than standard VRML interpolator nodes
because spectra, surfaces and media are not VRML field types.
- Only basic building blocks are provided, with some redundancy for
convenience.
More complex yet easy-to-use descriptions can be obtained
by composing the basic building blocks using mechanisms already
present in standard VRML'97: PROTO's and named nodes.
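For instance, a reusable lacquered surface could be composed from basic
building blocks with a plain VRML'97 PROTO (all node and field names below are
illustrative, not taken from the PhBRML node reference):

```vrml
# Hypothetical sketch: a glossy coat over a user-supplied substrate,
# composed with the standard PROTO mechanism.
PROTO Lacquered [
  field SFNode substrate NULL
] {
  LayeredSurface {
    coat PhongSurface { specularity 0.90 0.90 0.90 exponent 60.0 }
    substrate IS substrate
  }
}
# Usage:
Lacquered { substrate DiffuseSurface { reflectance 0.45 0.30 0.15 } }
```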
- PhBRML node reference (needs updating)
Examples
See TEST subdirectory.
Implementation
- The above ideas will be implemented in a VRML'97 parser which I started
writing in my spare time somewhere halfway through 1997. The extended VRML
parser will make it very easy to support the file format in applications such
as RenderPark and ART, as well as e.g. for converting to other file formats
should someone ever wish to do so. The VRML parser will be made
available under the GNU Library General Public License or similar.
- In order to encourage support for the extensions, the parser
will offer a number of utility routines, including
surface and volume emission and scattering evaluation, sampling and
integration, as well as routines for carrying out the transforms needed to
bring the arguments of the aforementioned routines into the right coordinate
frame.
- Sampling routines use multiple importance sampling. With each directional
distribution component of an EDF, BSDF or phase function, a sampling
probability distribution is associated. The probability of drawing a sample
according to a given directional distribution component is proportional to its
associated spectral basis function times a given spectral weight, its intensity
and its spatial weights (texture map values at the spatial position being
considered).
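The component selection probability can be sketched as follows (the notation is
ours, introduced for illustration only):

```latex
% For a scattering function written as
%   f = \sum_k w_k(x)\, c_k\, s_k(\lambda)\, D_k(\omega_i \to \omega_o),
% component k is selected with probability
p_k \;\propto\; w_k(x)\; c_k\; \bar{s}_k\; I_k
% where w_k(x) is the spatial weight (texture map value at position x),
% c_k the given spectral weight, \bar{s}_k the spectral basis function
% (suitably reduced to a scalar), and I_k the intensity of the
% directional distribution D_k. Given k, a direction is drawn from the
% sampling density associated with D_k, and the resulting estimators
% are combined by multiple importance sampling.
```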
Results
Some nice images will be shown here with an explanation of why
they cannot be described in any other existing file format.
(That is basically a restatement of why previous work does not
fulfil our needs.)
Input file (format has changed slightly in the meantime)
Conclusion
- Summary.
- Future work: 4D light fields in full generality.
- Future work: anything we do not manage to finish before the
paper deadline (whenever that may be).
References
- VRML'97 specs.
- References to Radiance, MGF, RenderMan for global illumination, ...
- Tobler's WSCG'98 paper.
- Glassner, "Principles of Digital Image Synthesis".
- Hanrahan, "Layered Surfaces", SIGGRAPH'93
- Veach, "Optimally Combining Sampling Techniques", SIGGRAPH'95
- Meyer, "Spectral Rendering", SIGGRAPH'95 (?)
- Cook, "Shade trees", SIGGRAPH'84 (?)
- ...
This page is maintained by
Philippe Bekaert