© ISO/IEC 2004 — All rights reserved
INTERNATIONAL ORGANIZATION FOR STANDARDIZATION
ORGANISATION INTERNATIONALE DE NORMALISATION
ISO/IEC JTC 1/SC 29/WG 11
CODING OF MOVING PICTURES AND AUDIO
ISO/IEC JTC 1/SC 29/WG 11
N6751
October 2004, Mallorca
Title: ISO/IEC CD 14496-21
Source: SNHC
Status: Approved
Editors: Mikaël Bourges-Sévenier (Mindego Inc.) - Editor, Itaru Kaneko (Waseda Univ.), Vishy Swaminathan (Sun Microsystems)
ISO/IEC JTC 1/SC 29 N
Date: 2004-07-22
ISO/IEC CD 14496-21
ISO/IEC JTC 1/SC 29/WG 11
Secretariat:
Information technology — Coding of audio-visual objects — Part 21:
MPEG-J extensions for rendering
Technologies de l'information — Codage des objets audio-visuels — Partie 21: Extensions MPEG-J pour
rendu
Warning
This document is not an ISO International Standard. It is distributed for review and comment. It is subject to
change without notice and may not be referred to as an International Standard.
Recipients of this draft are invited to submit, with their comments, notification of any relevant patent rights of
which they are aware and to provide supporting documentation.
Copyright notice
This ISO document is a working draft or committee draft and is copyright-protected by ISO. While the
reproduction of working drafts or committee drafts in any form for use by participants in the ISO standards
development process is permitted without prior permission from ISO, neither this document nor any extract
from it may be reproduced, stored or transmitted in any form for any other purpose without prior written
permission from ISO.
Requests for permission to reproduce this document for the purpose of selling it should be addressed as
shown below or to ISO's member body in the country of the requester:
[Indicate the full address, telephone number, fax number, telex number, and electronic mail address, as
appropriate, of the Copyright Manager of the ISO member body responsible for the secretariat of the TC or
SC within the framework of which the working document has been prepared.]
Reproduction for sales purposes may be subject to royalty payments or a licensing agreement.
Violators may be prosecuted.
Contents

Foreword
1     Scope
2     Normative references
3     Symbols and abbreviated terms
4     MPEG-J extensions for rendering
4.1   Introduction
4.2   Architecture
4.3   Sequence of operations
4.4   Extensions to MPEG-J APIs (ISO/IEC 14496-11)
4.4.1 Renderer
4.4.2 Decoder
4.4.3 API specification
4.5   Restrictions
References
Annex A (normative) Normative annex
Annex B (informative) Java binding to OpenGL ES
B.1   Overview
B.2   Design
B.3   Java bindings to EGL design
B.4   MemItem - Native memory wrapper
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are members of
ISO or IEC participate in the development of International Standards through technical committees
established by the respective organization to deal with particular fields of technical activity. ISO and IEC
technical committees collaborate in fields of mutual interest. Other international organizations, governmental
and non-governmental, in liaison with ISO and IEC, also take part in the work. In the field of information
technology, ISO and IEC have established a joint technical committee, ISO/IEC JTC 1.
International Standards are drafted in accordance with the rules given in the ISO/IEC Directives, Part 2.
The main task of the joint technical committee is to prepare International Standards. Draft International
Standards adopted by the joint technical committee are circulated to national bodies for voting. Publication as
an International Standard requires approval by at least 75 % of the national bodies casting a vote.
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent
rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights.
ISO/IEC 14496-21 was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
This second/third/... edition cancels and replaces the first/second/... edition (), [clause(s) / subclause(s) /
table(s) / figure(s) / annex(es)] of which [has / have] been technically revised.
ISO/IEC 14496 consists of the following parts, under the general title Information technology — Coding of
audio-visual objects:

- Part 1: Systems
- Part 2: Visual
- Part 3: Audio
- Part 4: Conformance testing
- Part 5: Reference Software
- Part 6: Delivery Multimedia Integration Framework (DMIF)
- Part 7: Optimized software for MPEG-4 tools
- Part 8: MPEG-4 over IP networks
- Part 9: Reference hardware description
- Part 10: Advanced Video Coding (AVC)
- Part 11: Scene description and Application Engine
- Part 12: ISO Media File Format
- Part 13: IPMP extensions
- Part 14: MP4 File Format
- Part 15: AVC File Format
- Part 16: Animation Framework eXtension (AFX)
- Part 17: Streaming text format
- Part 18: Font compression and streaming
- Part 19: Synthesized texture streaming
- Part 20: Lightweight Scene Representation
- Part 21: MPEG-J extensions for rendering
COMMITTEE DRAFT
ISO/IEC CD 14496-21
Information technology — Coding of audio-visual objects —
Part 21: MPEG-J extensions for rendering
1 Scope

This part of ISO/IEC 14496 specifies MPEG-4 MPEG-J extensions for rendering. These extensions enable Java-based applications to control the rendering and composition of synthetic and natural media in a programmatic manner.
2 Normative references
The following normative documents contain provisions which, through reference in this text, constitute
provisions of this part of ISO/IEC 14496. For dated references, subsequent amendments to, or revisions of,
any of these publications do not apply. However, parties to agreements based on this part of ISO/IEC 14496
are encouraged to investigate the possibility of applying the most recent editions of the normative documents
indicated below. For undated references, the latest edition of the normative document referred to applies.
Members of ISO and IEC maintain registers of currently valid International Standards.
ISO/IEC 14496-1:2001, Information technology — Coding of audio-visual objects — Part 1: Systems
ISO/IEC 14496-11:2003, Information technology — Coding of audio-visual objects — Part 11: Scene Description and
Application Engine
JSR-184, Mobile 3D Graphics
JSR-239, Java bindings to OpenGL ES
For notations used in this document:

ISO/IEC 19501, Information technology — Unified Modeling Language (UML)
UML 2.0 — Object Management Group, http://www.omg.org
ISO/IEC 14750, Information technology — Open Distributed Processing — Interface Definition Language (IDL)
Java language specification — Sun Microsystems, http://java.sun.com
3 Symbols and abbreviated terms

BIFS    BInary Format for Scene
ES      Elementary Stream
OD      Object Descriptor
JCP     Java Community Process
JSR     Java Specification Request
MPEG-J  MPEG-4 Java Application Engine

4 MPEG-J extensions for rendering

4.1 Introduction
In an MPEG-4 terminal, multiple media are composed to create a final image displayed on its screen. These media may be synthetic (e.g. made by a computer, such as vector graphics) or natural (e.g. audio and video captured from a sensor). Composition of visual media to produce the final image is achieved, for each frame, by rendering instructions.
In MPEG-4 Part 11, the scene description, or BIFS, describes rendering and composition operations in a structured manner using a tree or scene graph. The application engine interacts with the scene description to arrange rendering and composition operations based on the application's logic. However, the application has no direct access to rendering or composition operations; rather, the terminal interprets and performs the requested operations.

In this document, MPEG-4 Part 21, the application engine is extended with direct access to rendering and composition operations. This enables applications to optimize the organization of such operations based on their logic and to produce visual effects currently not possible with a descriptive language such as BIFS.

To enable as many applications as possible, this document proposes using the well-known industry computer graphics standard OpenGL, and in particular the ES version targeted at embedded systems. The binding from Java to OpenGL is defined by the JSR-239 expert group and is restricted in this specification for the security and performance of a multimedia terminal. However, some applications may prefer using a lightweight scene graph and possibly a proprietary rasterizer optimized for such graphs. In this case, this document recommends using the specification defined by the JSR-184 expert group. Figure 1 depicts the block organization of systems and APIs in an MPEG-4 terminal using the specification in this document.
[Figure: block diagram. An Application (MPEGlet) is sent as Java to the terminal. The Java layer in the terminal comprises MPEG-J (ISO/IEC 14496-11), other scene graphs (incl. BIFS), JSR-184, and JSR-239. The native layer comprises the MPEG terminal (Systems / Decoders), OpenGL ES, and a proprietary rasterizer.]

Figure 1 - Block diagram of an MPEG-4 Player with MPEG-J extensions for rendering.
The two Java APIs JSR-239 and JSR-184 may be implemented on top of OpenGL ES. However, in some environments, a simpler rasterizer may be used to implement JSR-184. Typically, an MPEGlet or application may define its own scene graph APIs built upon JSR-239, may use an API similar to MPEG-4 Part 11 BIFS built upon JSR-239, or may use the rich and lightweight JSR-184 scene graph API. Through MPEG-J and the extensions defined in this document, an application can interact with other resources in an MPEG terminal.
4.2 Architecture
Figure 2 shows the typical workflow in an MPEG-4 terminal. From left to right, a multiplexed stream is received by the demultiplexer. The demultiplexer splits the stream into elementary streams that are decoded by decoders. MPEG-4 defines various decoders, such as audio, video, and MPEG-J. The MPEG-J decoder receives Java classes and launches MPEGlets, which are similar to Java Applets in the context of an MPEG terminal.

In this specification, an MPEGlet controls rendering and compositing operations by issuing graphic commands to the JSR-239 or JSR-184 APIs. Calls to JSR-239 methods are executed directly on the OpenGL ES rendering context. Calls to JSR-184 may be translated to OpenGL ES or interpreted by a more specialized proprietary renderer (as shown in Figure 1). Using the MPEG-J API defined in ISO/IEC 14496-11, an MPEGlet can control decoders and other resources in the system.
On the native side, software or hardware video decoders output pixel arrays per frame that refresh a texture object in the Renderer's fast texture memory (OpenGL ES may be implemented on a dedicated chip or in software). Using OpenGL calls, texture objects can be accessed and mapped onto 3D surfaces at any time. This enables any type of composition and effect using texture addressing, texture mapping, and blending operations. The same features are available in the JSR-184 API.
[Figure: conceptual workflow. On the native side, a multiplexed stream enters the DeMux; decoding buffers (DB) feed video decoders whose pixel output fills composition buffers in the Renderer (OpenGL ES or proprietary), producing the final image. On the Java side, the MPEGlet (application logic) issues graphic commands through JSR-239/184.]

Figure 2 – Conceptual workflow.
Figure 3 provides a more detailed view of the interaction between decoders, renderer, and an MPEGlet. An MPEGlet can control a decoder via its Decoder interface. The decoder outputs composition buffers, and the MPEGlet can retrieve information about a buffer by querying the Decoder's BufferInfo object. BufferInfo may wrap different types of information depending on the decoder:

- In the case of a video decoder, its output is a byte array of pixels. Since a native decoder outputs this buffer into a native memory area, on the Java side the BufferInfo contains a MemItem (short for Memory Item). This MemItem can be passed as an argument of GL methods for texture operations.
- In the case of a decoder creating a mesh, BufferInfo would contain a vertex array inside a MemItem, and this MemItem can be used as an argument of GL vertex array methods.
- In the case of a BIFS decoder, BufferInfo would contain a reference to an org.iso.mpeg.mpegj.scene.Scene object.
- For other decoders, specific BufferInfos may be defined.
[Figure: on the Java side, the MPEGlet uses the Decoder, BufferInfo, MemItem, Renderer, EGL, and GL interfaces for control; through JNI, the native decoder writes into a native memory area, and graphic operations reach OpenGL ES and the native window.]

Figure 3 – Interaction between decoders, GL renderer, and MPEGlet.
4.3 Sequence of operations

Figure 4 details the sequence of operations to initialize, run, and dispose of an MPEGlet, extending the lifecycle of an MPEGlet as defined in ISO/IEC 14496-11.

- Initialization – The MPEG terminal calls the MPEGlet.init() method. The MPEGlet registers itself with the terminal by creating an MPEGJTerminal. The MPEGJTerminal gives access to the resource manager, which in turn gives access to the Renderer object in this terminal. The MPEGlet can then retrieve the EGL interface and create a rendering context (i.e. a GLContext). If successful, the MPEGlet can now issue graphic operations, i.e. GL commands. Typically, an application will initialize resources it might need later at each frame.
- Run – Since an MPEGlet implements the Runnable interface, MPEGlet.run() is called to run the application. It is during this call that the MPEGlet renders its scene by issuing GL commands, finishing with EGL.eglSwapBuffers().
- Dispose – When the MPEGlet is to be stopped, MPEGlet.stop() is called, followed by MPEGlet.destroy(), so as to stop rendering operations and dispose of any resources created by the MPEGlet.
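The lifecycle above can be sketched as a skeleton MPEGlet. The MPEGlet interface and the terminal classes are defined in ISO/IEC 14496-11; the stub interface and the FRAME_BUDGET constant below are illustrative assumptions so the sketch is self-contained and terminates, with the EGL/GL work reduced to comments.

```java
// Stub of the MPEGlet lifecycle methods named above; the real
// interface is defined in ISO/IEC 14496-11 (org.iso.mpeg.mpegj).
interface MPEGlet extends Runnable {
    void init();
    void stop();
    void destroy();
}

// Skeleton following the Initialization / Run / Dispose sequence of
// Figure 4. FRAME_BUDGET makes the sketch terminate on its own
// (a real MPEGlet runs until the terminal calls stop()).
class SkeletonMPEGlet implements MPEGlet {
    static final int FRAME_BUDGET = 3;   // illustrative only
    volatile boolean stopped;
    boolean initialized, destroyed;
    int framesDrawn;

    public void init() {
        // create MPEGJTerminal, get the Renderer, configure EGL,
        // eglCreateContext(), allocate GL resources ...
        initialized = true;
    }
    public void run() {
        while (!stopped && framesDrawn < FRAME_BUDGET) display();
    }
    void display() {
        // GL commands ... then EGL.eglSwapBuffers()
        framesDrawn++;
    }
    public void stop() { stopped = true; }
    public void destroy() {
        // eglDestroySurface(), eglDestroyContext(), free resources
        destroyed = true;
    }
}
```

A terminal would call init(), start a thread on run(), and later call stop() and destroy() in that order.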
[Figure: sequence diagram between the Terminal, MPEGlet, Renderer, EGL, GL, and Decoder. Initialization: the terminal creates the MPEGlet and calls init(); the MPEGlet creates an MPEGJTerminal, calls getResourceManager().getRenderer(), then getGL()/getEGL(), configures EGL, and calls eglCreateContext(); GL resources may be initialized anywhere after EGL returns a valid context for the Renderer's window surface. Per-frame operations: run() is Runnable.run(), and it is assumed the terminal starts a Thread for the MPEGlet; display() is shown for conceptual purposes only, as the code may be in the run() method itself; each frame issues GL and EGL commands, may call getMediaTime() at any time, and ends with eglSwapBuffers(). Destruction: stop() stops operations; destroy() deallocates resources and calls eglDestroySurface() and eglDestroyContext().]

Figure 4 – Sequence of operations between MPEG terminal, MPEGlet, and JSR-239.
An MPEGlet can query a decoder to get the current media time and to ask about its status, among other operations. Using a decoder's media time, an animation is able to synchronize its clock to the media.

When using JSR-184, calls to GL and EGL are replaced by calls to Graphics3D in immediate mode and by calls to the scene API in retained mode.
4.4 Extensions to MPEG-J APIs (ISO/IEC 14496-11)
Following the architecture depicted in Figure 2, this section describes decoder and renderer extensions to MPEG-J for use with JSR-239 and JSR-184.
4.4.1 Renderer
The MPEG-J Renderer interface (from package org.iso.mpeg.mpegj.resource) is extended with methods that return the dimensions of the terminal's window. Extending the Renderer interface, the GLRenderer and M3GRenderer interfaces provide access to JSR-239 and JSR-184 respectively.

GLRenderer exposes the GL and EGL interfaces. M3GRenderer exposes the Graphics3D object.
[Figure: interface hierarchy. Renderer exposes getWidth(): int and getHeight(): int. GLRenderer extends Renderer and adds getGL(): GL and getEGL(): EGL. M3GRenderer extends Renderer and adds getGraphics3D(): Graphics3D. Proprietary renderers may be added by extending the Renderer interface.]

Figure 5 – Renderer architecture for JSR-239 and JSR-184.
NOTE    The Renderer interface may need additional methods for registration of mouse/keyboard event listeners. However, this is not strictly necessary, as an EventManager object may be defined to manage all events in the system. Adding mouse/keyboard event registration to Renderer would thus be a matter of convenience, since Renderer knows about the terminal's window, where these events come from.
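The hierarchy of Figure 5 can be sketched in plain Java as follows. The empty GL, EGL, and Graphics3D interfaces stand in for the JSR-239/JSR-184 types, and RendererDispatch is a hypothetical helper (not part of this specification) showing how an MPEGlet might select a rendering path from whatever Renderer subtype the terminal hands back.

```java
// Interface hierarchy of Figure 5; GL, EGL, and Graphics3D are empty
// placeholders for the JSR-239/JSR-184 types.
interface GL {}
interface EGL {}
interface Graphics3D {}

interface Renderer {
    int getWidth();
    int getHeight();
}
interface GLRenderer extends Renderer {
    GL getGL();
    EGL getEGL();
}
interface M3GRenderer extends Renderer {
    Graphics3D getGraphics3D();
}

// Hypothetical helper: pick a rendering path by checking which
// sub-interface the terminal's Renderer implements.
class RendererDispatch {
    static String pathFor(Renderer r) {
        if (r instanceof GLRenderer)  return "JSR-239"; // immediate-mode GL path
        if (r instanceof M3GRenderer) return "JSR-184"; // scene-graph path
        return "unknown";
    }
}
```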
4.4.2 Decoder
The Decoder interface extends the MPEG-J MPDecoder interface (from package org.iso.mpeg.mpegj.decoder) to expose the media time and the BufferInfo interface. The media time is given in milliseconds and corresponds to the current position in the stream. This information is useful for synchronizing graphic objects with other streams.

BufferInfo exposes the information an MPEGlet needs to render the output of a decoder. Specific interfaces are provided for the BIFS decoder (BifsInfo interface), video decoders (TexInfo interface), and mesh decoders (MeshInfo interface). Other interfaces may be defined in the future for other media types carried by MPEG systems.

- BifsInfo exposes a Scene object to access the underlying BIFS scene graph.
- TexInfo exposes the video frame dimensions and format. The format is always an OpenGL format supported by glTexImage2D or glCompressedTexImage2D (e.g. GL_RGB, GL_RGBA, and so on), as supported by OpenGL ES.
- MeshInfo provides the number of vertices, the number of components per vertex, and the GL format of the data (e.g. GL_BYTE, GL_FIXED, GL_FLOAT, and so on).

Since video and mesh decoders output a contiguous region of memory (i.e. a byte array), TexInfo and MeshInfo provide access to the MemItem needed by GL methods. The GL specifications call this argument the pointer, hence the getPointer() method used to retrieve MemItems from the MeshInfo and TexInfo interfaces; see Figure 6 below.
[Figure: decoder interface hierarchy. MPDecoder exposes attach(ES_Descriptor): void, detach(): void, getES_Descriptor(): ES_Descriptor, getInstance(): int, getType(): DecoderType, getVendor(): String, isPauseable(): boolean, pause(): void, resume(): void, start(): void, stop(): void. Decoder extends MPDecoder and adds getMediaTime(): long and getCompositionBufferInfo(): BufferInfo. A Decoder has 0..1 BufferInfo; specialized decoders may define more appropriate interfaces. BifsInfo exposes getScene(): Scene. TexInfo exposes getWidth(): int, getHeight(): int, getFormat(): int, getPointer(): MemItem. MeshInfo exposes getNumVertices(): int, getNumComponents(): int, getFormat(): int, getPointer(): MemItem.]

Figure 6 – Decoder architecture.
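As an illustration of how an MPEGlet might use TexInfo before allocating a backing array or calling glTexImage2D, the helper below derives the byte size of a decoded frame from the fields TexInfo exposes. The TexInfo stub, the FrameSize helper, and the 8-bits-per-component assumption are illustrative, not part of this specification; the constant values are taken from the OpenGL headers.

```java
// Illustrative stub of the TexInfo interface from Figure 6
// (getPointer() omitted; MemItem is opaque here).
interface TexInfo {
    int getWidth();
    int getHeight();
    int getFormat();   // an OpenGL ES format such as GL_RGB or GL_RGBA
}

class FrameSize {
    // Constant values from the OpenGL headers.
    static final int GL_RGB  = 0x1907;
    static final int GL_RGBA = 0x1908;

    // Bytes needed for one decoded frame, assuming 8 bits per component.
    static int byteSize(TexInfo info) {
        int bytesPerPixel = (info.getFormat() == GL_RGBA) ? 4 : 3;
        return info.getWidth() * info.getHeight() * bytesPerPixel;
    }
}
```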
NOTE    For mesh info, since getValue() is not possible, it may be necessary to expose bounding volume information for culling, collision, and other applications. But should such decoder information be exposed to the application?

NOTE    For non-power-of-2 textures, it may be useful to know whether the texture has been scaled to a power-of-2 size.

4.5 Restrictions
The design of this specification favors not exposing native handles to memory regions, the terminal window, the graphics context, and so on, for security reasons. This imposes some restrictions on the usage of JSR-239:

- An MPEGlet cannot create a window surface; it must use the terminal's window surface. This is strictly enforced in eglCreateWindowSurface(), which ignores the native window parameter; that parameter should therefore be null.
- Native memory regions are not accessible from the Java side, i.e. there is no MemItem.getValue() method. This is not necessary with OpenGL, and an MPEGlet may maintain a backing array if it needs to dynamically update texture arrays or vertex arrays.

Note also that, per the OpenGL ES specification, only one MPEGlet at a time can access the rendering context. Therefore, special care must be taken in a multithreaded environment, such as multiple MPEGlets running in parallel, an MPEGlet spawning multiple threads, or an MPEGlet listening to AWT events (e.g. mouse events, keyboard events, and so on). In those cases, proper synchronization mechanisms must be used to protect information shared among threads, including the Renderer.
If an MPEGlet attempts to create a new rendering context while an existing MPEGlet already holds one, no rendering context is returned (i.e. a null value is returned).
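As a minimal illustration of the synchronization requirement above, the sketch below serializes all rendering-context access through one shared lock object. GuardedRenderer and its methods are hypothetical names, not part of this specification; the specification only requires that access be synchronized, not this particular lock discipline.

```java
// Hypothetical sketch: one shared monitor serializes every block of
// rendering-context access between the render thread and listeners.
class GuardedRenderer {
    private final Object glLock = new Object();
    private int framesDrawn;   // stands in for mutable GL state

    // Called from the MPEGlet's run() loop.
    void drawFrame() {
        synchronized (glLock) {
            // GL commands ... eglSwapBuffers()
            framesDrawn++;
        }
    }
    // Called from, e.g., a keyboard listener on another thread.
    void onKeyEvent() {
        synchronized (glLock) {
            // safe to touch rendering state here as well
            framesDrawn++;
        }
    }
    int frames() {
        synchronized (glLock) { return framesDrawn; }
    }
}
```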
4.6 Example
GL gl;
EGL egl;
EGLSurface drawSurface;
EGLDisplay display;
EGLContext context;
boolean stopped;

/**
 * Implements the MPEGlet.init() method.
 * Initializes resources needed by the application, in particular EGL.
 */
void init()
{
    MPEGJTerminal terminal = new MPEGJTerminal(this);
    GLRenderer renderer = (GLRenderer) terminal.getResourceManager().getRenderer();
    gl = renderer.getGL();
    egl = renderer.getEGL();
    try {
        /* get an EGL display connection */
        display = egl.eglGetDisplay(EGL.EGL_DEFAULT_DISPLAY);
        /* initialize the EGL display connection */
        egl.eglInitialize(display, null, null);
        /* get an appropriate EGL frame buffer configuration */
        int attribute_list[] = {
            EGL.EGL_RED_SIZE, 8,
            EGL.EGL_GREEN_SIZE, 8,
            EGL.EGL_BLUE_SIZE, 8,
            EGL.EGL_ALPHA_SIZE, 8,
            EGL.EGL_DEPTH_SIZE, 24,
            EGL.EGL_NONE,
        };
        EGLConfig configs[] = new EGLConfig[1];
        int numConfigs[] = { 0 };
        egl.eglChooseConfig(display,
                            attribute_list,
                            configs,
                            configs.length,
                            numConfigs);
        /* create an EGL rendering context */
        context = egl.eglCreateContext(display,
                                       configs[0],
                                       EGL.EGL_NO_CONTEXT,
                                       null);
        /* create an EGL window surface
         *
         * Note: there is no native window passed to this method as in the
         * EGL specification, because only the terminal's window can be used.
         */
        drawSurface = egl.eglCreateWindowSurface(display, configs[0], null);
        /* connect the context to the surface */
        egl.eglMakeCurrent(display, drawSurface, drawSurface, context);
    }
    // catch EGL errors
    catch (EGLException ex) {
        …
    }
}

/**
 * Implements Runnable.run().
 */
public void run()
{
    while (!stopped)
        display();
}

/**
 * Renders a frame.
 */
void display()
{
    try {
        // draw the scene with any GL commands using the GL interface
        // swap back and front buffers
        egl.eglSwapBuffers(display, drawSurface);
    }
    catch (GLException glex)
    { … }
    catch (EGLException eglex)
    { … }
}

/**
 * Shuts down the application.
 * Implements MPEGlet.stop().
 */
void stop()
{
    // stop rendering
    stopped = true;
    …
}

/**
 * Destroys all allocated resources.
 * Implements MPEGlet.destroy().
 */
void destroy()
{
    // clean up EGL
    egl.eglDestroySurface(display, drawSurface);
    egl.eglDestroyContext(display, context);
    // free application resources
    …
}
References

[1] OpenGL ES reference manual. http://www.khronos.org/opengles/documentation/opengles1_0/html/index.html
[2] JSR-239 Java binding to OpenGL ES. http://www.jcp.org/en/jsr/detail?id=239
[3] JSR-184 Mobile 3D Graphics specification. http://jcp.org/aboutJava/communityprocess/final/jsr184/index.html
Annex A
(normative)
Java binding to OpenGL ES
A.1 Overview
This annex describes how the Java binding to OpenGL ES has been defined from the OpenGL ES C specification (available at http://www.khronos.org).
NOTE    This annex is kept in this CD since JSR-239 has no draft specification at this time, and the specification below has been proposed for their consideration. Therefore, it is expected that the normative reference to JSR-239 will suffice and that this annex will disappear at the FCD or FDIS stage.
A.2 Design
It is assumed that the system provides a Renderer class (such as one implementing the org.iso.mpeg.mpegj.resource.Renderer interface) from which an application can retrieve the EGL and GL interfaces.

- EGL – exposes all EGL window system methods and constants
- GL – exposes all OpenGL ES methods and constants
[Figure: the Renderer object exposes the EGL and GL interfaces.]

Figure 1 – Renderer object exposes EGL and GL interfaces.
The naming convention from native to Java methods is straightforward: it is a one-to-one mapping with the rules given in Table 1.
Table 1 – C to Java type conversion rules.

C type                                      →  Java type
#define GL_constant 0x1234                  →  public static final int GL_constant = 0x1234;
const                                       →  (dropped)
GLAPI                                       →  (dropped)
APIENTRY                                    →  (dropped)
GLenum                                      →  int
GLboolean                                   →  boolean
GLbitfield                                  →  int
GLbyte                                      →  byte
GLshort                                     →  short
GLint                                       →  int
GLsizei                                     →  int
GLubyte                                     →  byte
GLushort                                    →  short
GLuint                                      →  int
GLfloat                                     →  float
GLclampf                                    →  float
GLvoid                                      →  void
GLfixed                                     →  int
GLclampx                                    →  int
EGLBoolean                                  →  boolean
EGLint                                      →  int
void *EGLDisplay                            →  EGLDisplay
void *EGLConfig                             →  EGLConfig
void *EGLSurface                            →  EGLSurface
void *EGLContext                            →  EGLContext
glXXX<type>v(…, GL<type> *params)           →  glXXXv(…, <type>[] params)
void *pointer                               →  MemItem pointer
&pointer[offset]                            →  MemItem pointer, int offset
glGetIntegerv(GLenum pname, GLint *params)
    →  glGetIntegerv(int pname, int[] params)  (* see note below about state query methods)
GLAPI const GLubyte * APIENTRY glGetString (GLenum name)
    →  String glGetString (int name);
GLAPI void APIENTRY glGenTextures (GLsizei n, GLuint *textures)
    →  void glGenTextures (int n, int[] textures);
GLAPI void APIENTRY glDeleteTextures (GLsizei n, const GLuint *textures)
    →  void glDeleteTextures (int n, int[] textures);
Texture methods                             →  * see note below
Vertex array methods                        →  * see note below
* Note    The last two rules add a change for all methods that use memory access. As discussed in Section A.4, memory access is provided by MemItem objects that wrap native memory. MemItem could provide an offset attribute to mimic the C call, but we believe it is clearer to add an extra offset parameter to all GL methods using arrays of memory (or pointers to it). Therefore, the following methods have been modified:
Texture methods
GLAPI void APIENTRY glCompressedTexImage2D (GLenum target, GLint level, GLenum internalformat, GLsizei width, GLsizei height, GLint border, GLsizei imageSize, const GLvoid *data);
GLAPI void APIENTRY glCompressedTexSubImage2D (GLenum target, GLint level, GLint xoffset, GLint yoffset, GLsizei width, GLsizei height, GLenum format, GLsizei imageSize, const GLvoid *data);
GLAPI void APIENTRY glReadPixels (GLint x, GLint y, GLsizei width, GLsizei height, GLenum format, GLenum type, GLvoid *pixels);
GLAPI void APIENTRY glTexImage2D (GLenum target, GLint level, GLint internalformat, GLsizei width, GLsizei height, GLint border, GLenum format, GLenum type, const GLvoid *pixels);
GLAPI void APIENTRY glTexSubImage2D (GLenum target, GLint level, GLint xoffset, GLint yoffset, GLsizei width, GLsizei height, GLenum format, GLenum type, const GLvoid *pixels);

Vertex array methods
GLAPI void APIENTRY glColorPointer (GLint size, GLenum type, GLsizei stride, const GLvoid *pointer);
GLAPI void APIENTRY glDrawElements (GLenum mode, GLsizei count, GLenum type, const GLvoid *indices);
GLAPI void APIENTRY glNormalPointer (GLenum type, GLsizei stride, const GLvoid *pointer);
GLAPI void APIENTRY glTexCoordPointer (GLint size, GLenum type, GLsizei stride, const GLvoid *pointer);
GLAPI void APIENTRY glVertexPointer (GLint size, GLenum type, GLsizei stride, const GLvoid *pointer);
The previous methods have been translated to

Texture methods
void glCompressedTexImage2D (int target, int level, int internalformat, int width, int height, int border, int imageSize, MemItem data, int offset);
void glCompressedTexSubImage2D (int target, int level, int xoffset, int yoffset, int width, int height, int format, int imageSize, MemItem data, int offset);
void glReadPixels (int x, int y, int width, int height, int format, int type, MemItem pixels, int offset);
void glTexImage2D (int target, int level, int internalformat, int width, int height, int border, int format, int type, MemItem pixels, int offset);
void glTexSubImage2D (int target, int level, int xoffset, int yoffset, int width, int height, int format, int type, MemItem pixels, int offset);

Vertex array methods
void glColorPointer (int size, int type, int stride, MemItem pointer, int offset);
void glDrawElements (int mode, int count, int type, MemItem indices, int offset);
void glNormalPointer (int type, int stride, MemItem pointer, int offset);
void glTexCoordPointer (int size, int type, int stride, MemItem pointer, int offset);
void glVertexPointer (int size, int type, int stride, MemItem pointer, int offset);
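To illustrate the extra offset parameter, consider one interleaved MemItem holding, per vertex, a 3-component position followed by a 4-component color, both as floats. The class below (a hypothetical helper, not part of the binding) computes the stride and offset that replace C's &pointer[offset] arithmetic.

```java
// Hypothetical layout helper for one interleaved buffer holding,
// per vertex, 3 position floats then 4 color floats.
class InterleavedLayout {
    static final int BYTES_PER_FLOAT = 4;   // sizeof(GLfloat)
    static final int POS_COMPONENTS = 3;
    static final int COLOR_COMPONENTS = 4;

    // Byte distance from one vertex record to the next.
    static int stride() {
        return (POS_COMPONENTS + COLOR_COMPONENTS) * BYTES_PER_FLOAT;
    }
    // Byte offset of the color data within a vertex record.
    static int colorOffset() {
        return POS_COMPONENTS * BYTES_PER_FLOAT;
    }
}
// Where C would pass &buf[12] for the colors, this binding passes the
// MemItem plus the offset:
//   gl.glVertexPointer(3, GL.GL_FLOAT, InterleavedLayout.stride(), buf, 0);
//   gl.glColorPointer (4, GL.GL_FLOAT, InterleavedLayout.stride(), buf,
//                      InterleavedLayout.colorOffset());
```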
State query methods such as glGetIntegerv() are identical to their C counterparts, and the application developer must be careful to allocate the necessary memory for the value queried.

For all methods, if arguments are incorrect or an error occurs on the Java or native side, a GLException is thrown.

NOTE    In the OpenGL specifications, if a method raises an error, a call to glGetError() returns the error code. In Java, a method throws an Exception, which enables better handling of errors when and where they happen, and glGetError() returns the GL error code.
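For example, a query that returns n integers requires the caller to allocate an int array of length at least n, exactly as in C. The StubGL class below is a hypothetical stand-in that only mirrors the glGetIntegerv calling convention; the constant value comes from the GL headers, and the returned size is illustrative.

```java
// Stub mirroring the glGetIntegerv calling convention: the caller
// allocates the params array; the callee fills it in.
class StubGL {
    static final int GL_MAX_TEXTURE_SIZE = 0x0D33; // value from the GL headers

    void glGetIntegerv(int pname, int[] params) {
        // A real implementation reads GL state; 256 here is illustrative.
        if (pname == GL_MAX_TEXTURE_SIZE) params[0] = 256;
    }
}
// Usage: the caller must size the array for the value queried.
//   int[] maxTex = new int[1];
//   gl.glGetIntegerv(StubGL.GL_MAX_TEXTURE_SIZE, maxTex);
```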
A.3 Java bindings to EGL design

The OpenGL ES interface to a native window system (EGL) defines four objects abstracting native display resources:

- EGLDisplay represents the abstract display on which graphics are drawn.
- EGLConfig describes the depth of the color buffer components and the types, quantities, and sizes of the ancillary buffers (i.e. the depth, multisample, and stencil buffers).
- EGLSurfaces are created with respect to an EGLConfig. They can be a window, a pbuffer (offscreen drawing surface), or a pixmap.
- EGLContext defines both client state and server state.

We define exactly the same objects in Java; they wrap information used in the native layer. A user has no access to such information, for security reasons. Moreover, for MPEG terminals, the restrictions on the EGL API apply as described in section 4.5.

The naming conventions are the same as for GL (Table 1), and an EGLException is thrown if an error occurs.
A.4 MemItem - Native memory wrapper

While many OpenGL calls take few parameters, geometry and texture calls carry a large amount of data. Moreover, it is often necessary to share parts of such buffers. While using parts of a buffer is a basic feature in native languages (e.g. C, C++), it is not always available in managed languages such as Java. In fact, pointers to memory addresses are often not exposed.

For security reasons, directly accessing the terminal's memory is dangerous, as malicious code could potentially access vital information within the terminal, thereby crashing it or stealing user information. In order to avoid such scenarios, we wrap native memory areas in an object called MemItem (short for memory item).

The MemItem object is responsible for allocating the native memory areas necessary for the application, putting information into them, and updating that information. In Java Virtual Machine (JVM) 1.4 and higher, the ByteBuffer class enables this. However, embedded systems use older JVM versions and hence do not have ByteBuffers. ByteBuffers provide a generic cross-platform mechanism for wrapping native memory areas, providing a feature referred to as memory pinning. With memory pinning, the location of the buffer is guaranteed not to move as the garbage collector reclaims memory from destroyed objects. The proposed MemItem also provides memory pinning, for any JVM.

A MemItem is a wrapper around a native array of bytes. No access to the native values is given, in order to avoid the JNI performance hit or the memory cost of a backing array on the Java side; the application may maintain a backing array for its own needs. Operations are therefore provided to set values (setValues()) from the Java side into the native array. setValues() with source values from another MemItem enables a native memory transfer from a source native array to a destination native array. The clear() method clears the buffer to a predefined value. Note that a MemItem's native buffer is always initialized to 0 (as if clear(0) had been called). While this mechanism is less generic than Java 1.4's ByteBuffers, it is sufficient for graphic operations with OpenGL.
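The write-only contract described above can be sketched with a plain-Java stand-in for MemItem. The stub below only mimics the documented behaviour (zero-initialized, setValues()/clear(), no read access); a real MemItem wraps pinned native memory, which a byte array merely simulates here.

```java
// Illustrative stand-in for MemItem: a byte[] plays the role of the
// pinned native area. Deliberately no getValue()/getValues() method,
// matching the restriction in section 4.5.
class MemItem {
    private final byte[] nativeArea;   // stands in for native memory

    MemItem(int size) {
        nativeArea = new byte[size];   // initialized to 0, as if clear(0)
    }
    // Copy values from a Java-side backing array into the native area.
    void setValues(int dstOffset, byte[] src, int srcOffset, int length) {
        System.arraycopy(src, srcOffset, nativeArea, dstOffset, length);
    }
    // Clear the whole buffer to one predefined value.
    void clear(byte value) {
        java.util.Arrays.fill(nativeArea, value);
    }
    int size() { return nativeArea.length; }
}
```

An application that must update vertex data dynamically would edit its own backing array and push it with setValues() before the next GL call, as suggested in section 4.5.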