Photo-realistic Rendering and Global Illumination in Computer Graphics
Spring 2012
Visual Appearance
K. H. Ko
School of Mechatronics
Gwangju Institute of Science and Technology
Surface Detail (Simulation of missing surface detail)

• Surface-Detail Polygons
  – Add gross detail through the use of surface-detail polygons to show features on a base polygon.
  – Each surface-detail polygon is coplanar with its base polygon.
  – It does not need to be compared with other polygons during visible-surface determination.
Surface Detail (Simulation of missing surface detail)

• Texture Mapping
  – Map an image, either digitized or synthesized, onto a surface.
  – Two steps (a sketch of the second step follows below):
    ◦ Map the four corners of the pixel onto the surface; the pixel's corner points in the surface's (s,t) coordinate space are mapped into the texture's (u,v) coordinate space.
    ◦ Compute a value for the pixel by summing all texels that lie within the quadrilateral, weighting each by the fraction of the texel that lies within the quadrilateral.
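
A minimal Python sketch of that weighted summation, under a simplifying assumption: the pixel's mapped footprint is treated as an axis-aligned rectangle [u0,u1] × [v0,v1] in texel space rather than a general quadrilateral. The function name and parameters are hypothetical, not from the slides.

```python
# Minimal sketch: area-weighted sum of the texels under a pixel's footprint.
# Simplification: the mapped footprint is an axis-aligned rectangle
# [u0, u1] x [v0, v1] in texel coordinates (the general case is a
# quadrilateral). texture is a 2D list of grayscale texel values.

def filter_footprint(texture, u0, u1, v0, v1):
    area = (u1 - u0) * (v1 - v0)
    total = 0.0
    for j in range(int(v0), int(v1) + 1):        # texel rows touched
        for i in range(int(u0), int(u1) + 1):    # texel columns touched
            # overlap of texel cell [i, i+1] x [j, j+1] with the footprint
            du = min(u1, i + 1) - max(u0, i)
            dv = min(v1, j + 1) - max(v0, j)
            if du > 0 and dv > 0:
                total += texture[j][i] * du * dv  # weight = covered fraction
    return total / area

# Example: a 4x4 checkerboard texture, footprint covering 1.5 x 1.5 texels.
tex = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(filter_footprint(tex, 0.25, 1.75, 0.25, 1.75))  # -> 0.5
```
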
Surface Detail (Simulation of missing surface detail)

• Bump Mapping
  – It simulates slight roughness on a surface by perturbing the surface normal before it is used in the illumination model.
  – We introduce a bump map, which is an array of displacements, each of which can be used to simulate displacing a point on a surface a little above or below that point's actual position.
  – For a surface P = P(t,s), a good approximation of the perturbed normal is

        N' = N + (Bu (N × Pt) − Bv (N × Ps)) / |N|

    where Bu and Bv are the partial derivatives of the selected bump-map entry B with respect to the bump-map parameterization axes u and v, and Pt and Ps are the partial derivatives of the surface.
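
A minimal Python sketch of the formula above, assuming the surface partials Pt and Ps and the bump-map derivatives Bu and Bv are already available at the shaded point; all names are illustrative.

```python
# Minimal sketch of the perturbed-normal approximation
#   N' = N + (Bu (N x Pt) - Bv (N x Ps)) / |N|
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def perturb_normal(N, Pt, Ps, Bu, Bv):
    n_len = math.sqrt(sum(c * c for c in N))
    d1 = cross(N, Pt)                       # term scaled by Bu
    d2 = cross(N, Ps)                       # term scaled by Bv
    # N' should still be normalized before use in the illumination model
    return tuple(n + (Bu * c1 - Bv * c2) / n_len
                 for n, c1, c2 in zip(N, d1, d2))

# Example: flat surface in the xy-plane with a small bump-map gradient.
print(perturb_normal(N=(0.0, 0.0, 1.0),
                     Pt=(1.0, 0.0, 0.0), Ps=(0.0, 1.0, 0.0),
                     Bu=0.1, Bv=0.05))     # -> (0.05, 0.1, 1.0)
```
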
Surface Detail (Simulation of missing surface detail)

• Bump Mapping [figure: example bump-mapped renderings]
Transparency

• Transparency effects
  – The bending of light (refraction)
  – Attenuation of light due to the thickness of the transparent object
  – Reflectivity
  – Transmission changes due to the viewing angle
  – etc.
• These effects are limited in real-time rendering systems, but a little transparency is better than none at all.
Transparency

• How to achieve the transparency effect
  – Screen-door transparency
    ◦ A simple method for giving the illusion of transparency.
    ◦ The idea is to render the transparent polygon with a checkerboard fill pattern: every other pixel of the polygon is rendered, thereby leaving the object behind it partially visible. (A sketch follows below.)
    ◦ In general, the pixels on the screen are close enough together that the checkerboard pattern itself is not visible.
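
A minimal Python sketch of the checkerboard idea, using characters for pixel colors; the buffers are hypothetical stand-ins for a real framebuffer and polygon rasterization.

```python
# Screen-door transparency: every other pixel of the "transparent" polygon
# overwrites the framebuffer in a checkerboard pattern, so the background
# stays visible in the remaining pixels.

W, H = 8, 4
framebuffer = [['B'] * W for _ in range(H)]                    # 'B' = background
polygon_mask = [[x >= 2 for x in range(W)] for _ in range(H)]  # polygon covers x >= 2

for y in range(H):
    for x in range(W):
        if polygon_mask[y][x] and (x + y) % 2 == 0:  # checkerboard fill pattern
            framebuffer[y][x] = 'P'                  # 'P' = polygon color

for row in framebuffer:
    print(''.join(row))
```
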
Transparency

• How to achieve the transparency effect
  – Screen-door transparency: drawbacks
    ◦ A transparent object can be only 50% transparent.
    ◦ Only one transparent object can be convincingly rendered on one area of the screen.
Transparency

• How to achieve the transparency effect
  – Screen-door transparency: advantage
    ◦ Simplicity: transparent objects can be rendered at any time, in any order, and no special hardware is needed.
Transparency

• How to achieve the transparency effect
  – Alpha Blending
    ◦ A method for more general and flexible transparency effects.
    ◦ Blends the transparent object's color with the color of the object behind it.
Transparency

• How to achieve the transparency effect
  – Alpha Blending
    ◦ Alpha is a value describing the degree of opacity of an object for a given pixel.
      – 1.0: the object is opaque and entirely covers the pixel's area of interest.
      – 0.0: the pixel is not obscured at all.
    ◦ When an object is rendered on the screen, an RGB color and a Z-buffer depth are associated with each pixel.
    ◦ Another component, called alpha, can also be generated and, optionally, stored.
Transparency

• How to achieve the transparency effect
  – Alpha Blending
    ◦ To make an object transparent, it is rendered on top of the existing scene with an alpha of less than 1.0.
    ◦ Blending formula (sketched below):

          co = as cs + (1 − as) cd

      – cs: the color of the transparent object (source)
      – as: the object's alpha
      – cd: the pixel color before blending (destination)
      – co: the resulting color due to placing the transparent object over the existing scene
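
A minimal Python sketch of this formula, applied per color channel; color triples in [0,1] are assumed.

```python
# Minimal sketch of the blending formula co = as*cs + (1 - as)*cd.

def blend(src, alpha_s, dst):
    return tuple(alpha_s * cs + (1.0 - alpha_s) * cd
                 for cs, cd in zip(src, dst))

# Example: a 60%-opaque red object over a gray background.
print(blend((1.0, 0.0, 0.0), 0.6, (0.5, 0.5, 0.5)))  # ~ (0.8, 0.2, 0.2)
```
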
Transparency

• How to achieve the transparency effect
  – To render transparent objects properly into a scene requires sorting.
    ◦ First, the opaque objects are rendered.
    ◦ Second, the transparent objects are blended on top of them in back-to-front order.
  – The blending equation is order-dependent.
    ◦ Blending in arbitrary order can produce serious artifacts, as the sketch below demonstrates.
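
A minimal Python sketch demonstrating the order dependence: two 50%-alpha layers over a black background give different results depending on which is blended first. The `blend` helper repeats the formula from the previous sketch.

```python
# Order dependence of alpha blending: back-to-front vs. the wrong order.

def blend(src, alpha_s, dst):
    return tuple(alpha_s * s + (1.0 - alpha_s) * d for s, d in zip(src, dst))

background = (0.0, 0.0, 0.0)
far_color, far_a   = (1.0, 0.0, 0.0), 0.5   # red layer, farther from viewer
near_color, near_a = (0.0, 0.0, 1.0), 0.5   # blue layer, closer to viewer

correct = blend(near_color, near_a, blend(far_color, far_a, background))
wrong   = blend(far_color, far_a, blend(near_color, near_a, background))
print(correct)  # (0.25, 0.0, 0.5): the near blue layer dominates, as expected
print(wrong)    # (0.5, 0.0, 0.25): the far red layer incorrectly dominates
```
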
Transparency

• How to achieve the transparency effect
  – Transparency can be computed using two or more depth buffers and multiple passes (see the sketch after this list).
    ◦ First, a rendering pass is made so that the opaque surfaces' z-depths are in the first Z-buffer.
    ◦ On the second rendering pass, the depth test is modified to accept the surface that is both closer than the z-depth stored in the first buffer and the farthest among such surfaces.
      – This step renders the backmost transparent object into the frame buffer and its z-depths into a second Z-buffer.
      – That Z-buffer is then used to derive the next-closest transparent surface in the next pass.
  – Effective, but slow!
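
A minimal CPU-side Python sketch of the multi-pass idea at a single pixel, with hypothetical (depth, color, alpha) fragment tuples standing in for rendered surfaces (smaller depth = closer); a real implementation would do this with Z-buffers on the GPU, as described above.

```python
# Peel transparent fragments back-to-front: each "pass" selects the farthest
# fragment that is closer than the opaque surface and not yet peeled, then
# blends it over the accumulated color.

def peel_and_blend(fragments, opaque_depth, opaque_color):
    color = opaque_color
    peeled = float('inf')            # plays the role of the second Z-buffer
    while True:
        layer = [f for f in fragments if f[0] < opaque_depth and f[0] < peeled]
        if not layer:
            return color
        depth, src, alpha = max(layer, key=lambda f: f[0])  # farthest remaining
        color = tuple(alpha * s + (1 - alpha) * c for s, c in zip(src, color))
        peeled = depth

frags = [(0.3, (0.0, 0.0, 1.0), 0.5),    # near blue layer
         (0.6, (1.0, 0.0, 0.0), 0.5)]    # far red layer
print(peel_and_blend(frags, 0.9, (0.0, 0.0, 0.0)))  # -> (0.25, 0.0, 0.5)
```
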
Compositing

• The process of blending photographs or synthetic renderings of objects is called compositing.
  – The alpha value at each pixel is stored along with the RGB color value for the object.
    ◦ The alpha channel is called the matte, and shows the silhouette shape of the object.
    ◦ This RGBα image can then be blended with other such elements or against a background.
Compositing

• The most common way to store RGBα images is with premultiplied alphas.
  – The RGB values are multiplied by the alpha value before being stored (a sketch follows below).
  – Image file formats that support alpha include TIFF and PNG.
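
A minimal Python sketch of premultiplication; note how the "over" compositing step then simplifies to co = cs + (1 − as)·cd. Function names are illustrative.

```python
# Premultiplied alpha: RGB is multiplied by alpha before storage, which
# simplifies compositing to  co = cs' + (1 - as) * cd'  (primes denote
# premultiplied colors).

def premultiply(rgb, alpha):
    return tuple(c * alpha for c in rgb) + (alpha,)

def over_premultiplied(src, dst):
    # both src and dst are (r, g, b, a) with premultiplied color
    return tuple(s + (1.0 - src[3]) * d for s, d in zip(src, dst))

fg = premultiply((1.0, 0.0, 0.0), 0.6)   # 60%-opaque red element
bg = premultiply((0.5, 0.5, 0.5), 1.0)   # opaque gray background
print(over_premultiplied(fg, bg))        # -> (0.8, 0.2, 0.2, 1.0)
```
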
Compositing

• Chroma-keying
  – A concept related to the alpha channel.
  – Actors are filmed against a blue, yellow, or green screen and blended with a background.
• Blue-screen matting
  – A particular color is designated to be considered transparent; where it is detected, the background is displayed.
  – One drawback of this scheme is that the object is either entirely opaque or entirely transparent at any pixel: alpha is effectively only 1.0 or 0.0. (See the sketch below.)
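
A minimal Python sketch of this binary matting, assuming a simple per-channel tolerance test against the key color; `tol` is a hypothetical parameter, not from the slides.

```python
# Blue-screen matting: pixels near the key color become fully transparent,
# all others fully opaque -- alpha is only ever 0.0 or 1.0.

def chroma_key_alpha(pixel, key, tol=0.1):
    is_key = all(abs(p - k) <= tol for p, k in zip(pixel, key))
    return 0.0 if is_key else 1.0

blue_screen = (0.0, 0.0, 1.0)
print(chroma_key_alpha((0.05, 0.02, 0.95), blue_screen))  # screen pixel -> 0.0
print(chroma_key_alpha((0.80, 0.60, 0.50), blue_screen))  # actor pixel  -> 1.0
```
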
Fog

• Fog is a simple atmospheric effect that can be added to the final image.
  – It increases the level of realism for outdoor scenes.
  – Since the fog effect increases with the distance from the viewer, it helps the viewer of a scene to determine how far away objects are located.
  – If used properly, it helps to provide smoother culling of objects by the far plane.
  – Fog is often implemented in hardware, so it can be used with little or no additional cost.
Fog

• The color of the fog is denoted cf, and the fog factor is called f ∈ [0,1].
  – The fog factor decreases with the distance from the viewer.
• The final color of the pixel, cp, is determined by blending the shaded surface color cs with the fog color:

      cp = f cs + (1 − f) cf

  – As f decreases, the effect of the fog increases.
Fog

• Linear fog
  – The fog factor decreases linearly with the depth from the viewer:
      f = (zend − zp) / (zend − zstart)
• Exponential fog
  – f = e^(−df·zp)
• Squared exponential fog
  – f = e^(−(df·zp)²)
• df is a parameter used to control the density of the fog, and zp is the depth of the pixel.
• The computed fog factor is clamped to [0,1]. (All three curves are sketched below.)
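
A minimal Python sketch of the three fog-factor curves and the blending equation cp = f·cs + (1 − f)·cf; parameter values are illustrative.

```python
# The three fog-factor curves, each clamped to [0,1], plus the fog blend.
import math

def clamp01(x):
    return max(0.0, min(1.0, x))

def linear_fog(zp, z_start, z_end):
    return clamp01((z_end - zp) / (z_end - z_start))

def exp_fog(zp, df):
    return clamp01(math.exp(-df * zp))

def exp2_fog(zp, df):
    return clamp01(math.exp(-(df * zp) ** 2))

def apply_fog(cs, cf, f):
    # cp = f*cs + (1 - f)*cf : smaller f means heavier fog
    return tuple(f * s + (1.0 - f) * c for s, c in zip(cs, cf))

surface_color, fog_color = (1.0, 0.0, 0.0), (0.5, 0.5, 0.5)
for zp in (1.0, 10.0, 50.0):
    f = linear_fog(zp, z_start=0.0, z_end=100.0)
    print(zp, f, apply_fog(surface_color, fog_color, f))
```
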
Fog

• Tables are sometimes used when implementing these fog functions in hardware accelerators.
• Some assumptions made in real-time systems can affect the quality of the output:
  – Fog can be applied at the vertex level or at the pixel level.
    ◦ Applying it at the vertex level: the fog effect is computed as part of the illumination equation, and the computed color is interpolated across the polygon using Gouraud shading.
    ◦ Pixel-level fog is computed using the depth stored at each pixel.
    ◦ Pixel-level fog gives a better result, all other factors being equal.
  – Usually the distance along the viewing axis is used as the depth for computing the fog effect.
    ◦ One can instead use the true distance from the viewer to the object to compute fog; this is called radial fog, range-based fog, or Euclidean-distance fog.
• The highest-quality fog is generated by using pixel-level radial fog.
Gamma Correction

• Once the pixel values have been computed, we need to display them on a monitor.
  – There is a physical relationship between the voltage input to an electron gun in a Cathode-Ray Tube (CRT) monitor and the light output by the screen:

        I = a(V + ε)^γ

    ◦ V: input voltage
    ◦ a and γ: constants for each monitor
    ◦ ε: the black level (brightness) setting for the monitor
    ◦ I: the intensity generated
  – The gamma value for a particular CRT ranges from about 2.3 to 2.6. The value 2.5 is mostly used, since it is a good average monitor value, but it can be set differently depending on the situation.
Gamma Correction

• There is a nonlinearity in the CRT response curve.
  – The relation of voltage to intensity for an electron gun in a CRT is nonlinear.
  – This causes a problem within the field of computer graphics.
    ◦ Lighting equations compute intensity values that have a linear relationship to each other: a computed value of 0.5 is expected to appear half as bright as 1.0.
    ◦ Due to the nonlinearity, this expected effect may not be obtained.
    ◦ To ensure that the computed values are perceived correctly relative to each other, gamma correction is necessary.
Gamma Correction

• Assume that the black level is zero. The computed color component ci is converted by

      c = ci^(1/γ)

  for display by the CRT (sketched below).
  – Ex. With a gamma of 2.2 and ci = 0.5, the gamma-corrected c is 0.73. So if the electron-gun level is set at 0.73, an intensity level of 0.5 is displayed.
  – Computed colors need to be boosted by this equation to be perceived properly with respect to one another when displayed on a CRT monitor.
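
A minimal Python sketch of the conversion, reproducing the example above (black level assumed to be zero):

```python
# Gamma correction with zero black level: c = ci ** (1/gamma).

def gamma_correct(ci, gamma=2.2):
    return ci ** (1.0 / gamma)

print(round(gamma_correct(0.5), 2))  # -> 0.73, as in the example above
```
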
Gamma Correction

• Gamma correction is important to real-time graphics. It affects:
  – Cross-platform compatibility
  – Color fidelity, consistency, and interpolation
  – Dithering
  – Line and edge antialiasing quality
  – Alpha blending and compositing
  – Texturing
Gamma Correction

• Cross-platform compatibility
  – Gamma affects all images displayed, not just scene renderings.
  – If gamma correction is ignored, models authored and rendered on, say, an SGI machine will display differently when moved to a Macintosh or a PC.
    ◦ This issue instantly affects any images or models made available on a web server.
      – Some sites have employed the strategy of attempting to detect the platform of the client requesting information and serving up images or models tailored for it.
Gamma Correction

• Color fidelity
  – Without correction, the appearance of a color will differ from its true hue.
• Color consistency
  – Without correction, intensity controls will not work as expected.
    ◦ If a light or material color is changed from (0.5,0.5,0.5) to (1.0,1.0,1.0), the user will expect it to appear twice as bright, but it will not.
• Color interpolation
  – A surface that goes from dark to light will not appear to increase linearly in brightness across its surface; the midtones will appear too dark.
Gamma Correction

• Dithering
  – In dithering, two colors are displayed close together, and the eye combines them and perceives a blend of the two.
  – Lack of gamma correction adversely affects dithering algorithms.
    ◦ Without accounting for gamma, the dithered color can be perceptibly different from the color that is to be represented.
    ◦ Using screen-door transparency will result in a different perceived color from the blended transparency color.
Gamma Correction

• Line and edge antialiasing quality
  – As the use of line and edge antialiasing in real-time rendering increases, gamma's effect on the quality of these techniques will become more noticeable.
  – For example, suppose a polygon edge covers four screen grid cells; the polygon is white and the background is black.
    ◦ Left to right, the cells are covered 1/8, 3/8, 5/8, and 7/8, so we want the pixels to appear with brightnesses 0.125, 0.375, 0.625, and 0.875.
    ◦ If the system has a gamma of 2.2, we need to send values of 0.389, 0.640, 0.808, and 0.941 to the electron guns.
    ◦ Failing to do so means the perceived brightness will not increase linearly: sending 0.125 to the guns results in a perceived relative brightness of only 0.01, while 0.875 is affected somewhat less and is perceived as 0.745. (The sketch below reproduces these numbers.)
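
A minimal Python sketch that reproduces these numbers: the corrected values to send to the guns, and the perceived brightness if the raw coverage were sent instead (perceived ≈ sent^γ).

```python
gamma = 2.2
for coverage in (0.125, 0.375, 0.625, 0.875):
    to_send   = coverage ** (1.0 / gamma)   # gamma-corrected gun value
    perceived = coverage ** gamma           # brightness if sent uncorrected
    print(f"coverage {coverage:.3f}: send {to_send:.3f}, "
          f"uncorrected looks like {perceived:.3f}")
# coverage 0.125: send 0.389, uncorrected looks like 0.010
# coverage 0.375: send 0.640, uncorrected looks like 0.116
# coverage 0.625: send 0.808, uncorrected looks like 0.356
# coverage 0.875: send 0.941, uncorrected looks like 0.745
```
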
Gamma Correction

• Line and edge antialiasing quality
  – This nonlinearity causes an artifact called roping, because the edge looks somewhat like a twisted rope.
Gamma Correction

• Line and edge antialiasing quality
  – Liquid-Crystal Displays (LCDs) often have different voltage/luminance response curves.
    ◦ Because of these different response curves, lines that look antialiased on CRTs may look jagged on LCDs, or vice versa.
Gamma Correction

• Alpha blending and compositing
  – They should be done in a linear space, and the final result should be gamma-corrected.
  – This can lead to difficulties, as pixel values stored in the color buffer are likely to be gamma-corrected already.
    ◦ Bits of accuracy are lost as values are transferred from one computational space to another.
Gamma Correction

• Texturing
  – Images used as textures are normally stored in gamma-corrected form for some particular type of system, e.g., a PC or a Macintosh.
  – When using textures in a synthesized scene, care must be taken to gamma-correct the texture a sum total of only one time.
Gamma Correction

• Hardware for gamma correction
  – If no hardware is available, try to perform gamma correction earlier in the pipeline.
    ◦ Ex. We could gamma-correct the illumination value computed at each vertex, then Gouraud-shade from there. This approach partially solves the cross-platform problem.
  – Another solution is to ignore the gamma-correction problem entirely and not do anything about it.
    ◦ Even if the issues you encounter cannot be fixed, it is important to understand what problems are caused by a lack of gamma correction.
Gamma Correction

• Gamma correction is not a user preference.
• It is something that can be designed into an application to allow cross-platform consistency and to improve image fidelity and rendering-algorithm quality.