PhBRML: Physically Based Rendering Modeling Language
Note: some of this documentation is outdated; the main ideas, however,
remain valid.
Goal: make VRML'97 usable for physically based rendering, including
simple image based rendering and global illumination in mixed
real and virtual environments.
- Physically based rendering: obtain the most realistic images possible of
virtual scenes by (approximately) solving the equations from physics
that describe light transport in the scene.
- Needed inputs: geometry of the scene + physically based description of
light sources and light-matter interaction, both at surfaces (surface
scattering) and in volumes (transparency and participating media).
Only the former is available in VRML'97. We extend VRML'97
to express the latter as well.
- For an extensive list of what we want to be able to describe, see
Glassner "Principles of Digital Image Synthesis" chapters 11-15.
- A description of physically based light emission and scattering
characteristics involves functions of place, direction(s),
wavelength, and time. Our job: making it easy to express this position,
direction, wavelength and time dependence for as wide a class
of light scattering and emission models as possible.
- Appearance = surface emission and scattering (EDF and BSDF) +
volume emission and scattering (participating media: isotropic emission,
general phase function) + geometry distortions
(bump- and displacement mapping):
- Surfaces: homogeneous, 2D and 3D textured, layered (lacquered surfaces,
human skin, plant tissue, ...);
- Media: homogeneous, 3D textured;
- Bump- and displacement maps: 2D and 3D.
- Position dependence of inhomogeneous surfaces and media
is expressed by means of 2D and 3D texture maps.
The standard VRML'97 2D texture nodes (image, pixel and movie) are extended
with a procedural 2D texturing node and 3D texturing nodes. Texture map
values are used as weights for mixing surface or medium components, which
can be of any surface or medium type listed above, again including
inhomogeneous surfaces or media.
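As a minimal Python sketch (not PhBRML syntax, all names made up for illustration), the texture-weight mixing described above is just a convex combination of component values, with the weight taken from the texture:

```python
# Sketch (not part of PhBRML): blending two homogeneous surface
# components with a texture-derived weight w in [0, 1].  The mixed
# value is simply the convex combination of the components.

def mix_components(w, f_a, f_b):
    """Return a function evaluating the texture-weighted mixture."""
    def f(*args):
        return w * f_a(*args) + (1.0 - w) * f_b(*args)
    return f

# Example: two constant ("homogeneous") reflectances, blended 30/70.
bright_diffuse = lambda: 0.8   # hypothetical component A
dark_diffuse   = lambda: 0.2   # hypothetical component B
mixed = mix_components(0.3, bright_diffuse, dark_diffuse)
print(mixed())  # 0.3*0.8 + 0.7*0.2 = 0.38
```

Because the components may themselves be textured mixtures, the same mechanism nests to describe recursively inhomogeneous surfaces or media.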
- Homogeneous EDFs, BSDFs and phase functions are expressed
as linear combinations of spectral basis functions: sums of products
of directional distributions, spectra and weights. The basis functions
need not be independent: the same mechanism makes it easy to
express e.g. a modified Phong reflection model as the sum of a diffuse and
a specular part.
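The modified Phong example can be sketched in Python as a sum of two basis lobes, a diffuse term and an energy-normalized cosine-power term, each with its own weight (the normalization constants follow the common modified-Phong convention; this is an illustration, not PhBRML's evaluation code):

```python
import math

# Sketch: a modified Phong BRDF as the sum of two basis distributions,
# a diffuse lobe and a cosine-power specular lobe, scaled by kd and ks.

def modified_phong(kd, ks, n, cos_alpha):
    """cos_alpha: cosine of the angle between the outgoing direction
    and the mirror reflection of the incoming one."""
    diffuse  = kd / math.pi                                   # basis 1
    specular = (ks * (n + 2.0) / (2.0 * math.pi)
                * max(cos_alpha, 0.0) ** n)                   # basis 2
    return diffuse + specular

# In the mirror direction (cos_alpha = 1) both lobes contribute;
# far from it (cos_alpha <= 0) only the diffuse lobe remains.
print(modified_phong(0.5, 0.5, 20.0, 1.0))
print(modified_phong(0.5, 0.5, 20.0, -0.5))
```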
- Directional distributions can be specified in a variety of ways,
including tabulated samples and scripts (procedural distributions).
A small number of popular distributions are built in, including
directional light fields for image based rendering and augmented reality.
Other distributions will be provided as procedural distributions (student
projects).
- Emitters: directional distribution of EDFs. Types:
diffuse, Phong-like, sampled isotropic (sampled intensity
values versus angle w.r.t. axis of symmetry), directional light field
(given as a texture), procedural. Future plans: IES light source description
files and/or similar, ...
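The "sampled isotropic" emitter type above can be illustrated with a small Python sketch (the sample table and helper name are made up): intensity samples are stored versus the angle to the axis of symmetry, and intermediate angles are linearly interpolated:

```python
# Sketch of a sampled isotropic emitter: piecewise-linear lookup of
# intensity versus the angle (degrees) to the axis of symmetry.

def sampled_intensity(angles, values, angle):
    """angles must be sorted ascending; out-of-range angles clamp."""
    if angle <= angles[0]:
        return values[0]
    if angle >= angles[-1]:
        return values[-1]
    for i in range(1, len(angles)):
        if angle <= angles[i]:
            t = (angle - angles[i - 1]) / (angles[i] - angles[i - 1])
            return (1.0 - t) * values[i - 1] + t * values[i]

# Intensity falls off from the axis (0 deg) to the rim (90 deg).
angles = [0.0, 45.0, 90.0]
values = [100.0, 60.0, 0.0]
print(sampled_intensity(angles, values, 22.5))  # halfway: 80.0
```

IES photometric data files (the future plan mentioned above) carry essentially such angle/intensity tables.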
- Scatterers: directional distribution of BSDFs. Types:
diffuse reflector/refractor, modified Phong reflector/refractor,
procedural. Future plans: Fresnel, Cook-Torrance, Poulin-Fournier, HTSG,
Strauss, Ward, Schlick, sampled, ...
- Phase functions: directional distribution of volume scattering functions.
Types: isotropic, procedural. Future plans: sampled, Rayleigh, Murky and Hazy
Mie, Henyey-Greenstein, Schlick, ...
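As a sketch of one of the phase functions listed as a future plan, here is the Henyey-Greenstein phase function in Python; the anisotropy parameter g in (-1, 1) controls forward versus backward scattering, and g = 0 reduces to the isotropic phase function 1/(4*pi):

```python
import math

# Henyey-Greenstein phase function of the scattering angle's cosine.
def henyey_greenstein(g, cos_theta):
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

print(henyey_greenstein(0.0, 0.5))   # isotropic limit: 1/(4*pi)
print(henyey_greenstein(0.8, 1.0))   # strong forward peak
```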
- Wavelength dependence is expressed by means of spectra. A spectrum
is a scalar function of wavelength.
Types: XYZ, Lxy, monochromatic, black body, sampled, tabulated,
procedural + linear combinations.
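The "black body" spectrum type above corresponds to Planck's law, which gives spectral radiance as a function of wavelength and temperature; a Python sketch (physical constants only, no PhBRML node interface implied):

```python
import math

H  = 6.62607015e-34   # Planck constant, J s
C  = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def black_body(wavelength, temperature):
    """Spectral radiance in W / (m^2 sr m); wavelength in m, T in K."""
    a = 2.0 * H * C * C / wavelength ** 5
    b = math.exp(H * C / (wavelength * KB * temperature)) - 1.0
    return a / b

# A 5800 K emitter (roughly solar) peaks near 500 nm, so radiance
# there exceeds the radiance at 800 nm:
print(black_body(500e-9, 5800.0) > black_body(800e-9, 5800.0))  # True
```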
- Additional node types describe
- background radiation (sky illumination, augmented reality, ...)
as a function of incident direction: procedural or expressed by a texture.
- atmosphere: medium outside any object in the scene: e.g. misty air,
underwater scenes, ...
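One plausible way to look up background radiation in a texture, sketched in Python, is a latitude-longitude mapping from an incident direction to texture coordinates; the (u, v) convention below is an assumption for illustration, not something PhBRML prescribes:

```python
import math

# Map a unit direction (x, y, z) to lat-long texture coordinates,
# with v = 0 at the zenith (+z) and v = 1 at the nadir (-z).
def direction_to_latlong(x, y, z):
    """Unit direction -> (u, v) in [0, 1) x [0, 1]."""
    u = (math.atan2(y, x) / (2.0 * math.pi)) % 1.0
    v = math.acos(max(-1.0, min(1.0, z))) / math.pi
    return u, v

print(direction_to_latlong(0.0, 0.0, 1.0))   # straight up: v = 0
print(direction_to_latlong(1.0, 0.0, 0.0))   # horizon along +x: v = 0.5
```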
- Time dependence is handled using the standard VRML'97 event handling
system with new interpolators for spectra, surfaces and participating
media descriptions. Future plans: geometry distortion interpolators.
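As a rough Python sketch of the interpolator semantics (names and tables are illustrative, not the actual node interfaces): like standard VRML'97 interpolators, a spectrum interpolator would map a fraction in [0, 1] through a key/keyValue table, here interpolating sampled spectra component-wise:

```python
# Sketch: keyframe interpolation of sampled spectra, in the spirit of
# VRML'97 key/keyValue interpolator nodes.

def interpolate_spectrum(keys, key_spectra, fraction):
    """keys sorted ascending; each key_spectra entry has equal length."""
    if fraction <= keys[0]:
        return list(key_spectra[0])
    if fraction >= keys[-1]:
        return list(key_spectra[-1])
    for i in range(1, len(keys)):
        if fraction <= keys[i]:
            t = (fraction - keys[i - 1]) / (keys[i] - keys[i - 1])
            return [(1.0 - t) * a + t * b
                    for a, b in zip(key_spectra[i - 1], key_spectra[i])]

# Two keyframe spectra (3 samples each), queried halfway:
keys = [0.0, 1.0]
spectra = [[1.0, 0.5, 0.0], [0.0, 0.5, 1.0]]
print(interpolate_spectrum(keys, spectra, 0.5))  # [0.5, 0.5, 0.5]
```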
- VRML'97 is extended using the standard VRML'97 extension
mechanism. That means: define the interface of new VRML'97 scene graph
nodes that will describe physics based appearance and light sources. A
specialised browser (RenderPark, ART) will recognise and use these new node
types. For less capable browsers, a default implementation will be developed
that converts the physically based material and light source descriptions as
well as possible into standard VRML materials and light sources.
In short: standard browsers will still be able to process the extended models
while intelligent browsers will also understand the extensions.
- Extensions stay as close as possible to the semantics of standard
VRML nodes. There are two deviations:
- Procedural textures, spectra and directional distributions
use the same scripting language interface
as the VRML Script node.
That is: arguments are described by eventIn's and return values
by eventOut's. However, unlike
Script nodes, these
procedural nodes do not serve to dynamically modify the world
being interacted with. They do not participate in normal VRML event
processing. There is no way to dynamically change the behaviour of such
procedural description nodes. There is no loss of generality: dynamic medium
and surface changes can be expressed by other means.
- New interpolators for spectra, surfaces and
media look slightly different from standard VRML interpolator nodes
because spectra, surfaces and media are not VRML field types.
- Only basic building blocks are provided, with some redundancy for
convenience. More complex yet easy-to-use descriptions can be obtained
by composing the basic building blocks using mechanisms already
present in standard VRML'97: PROTO's and named nodes.
- PhBRML node reference (needs updating)