OpenGL Programming/Android GLUT Wrapper


Our wrapper: Making-of

If you plan to write your own OpenGL ES 2.0 application, here are some tips on how the wrapper does it:

Writing C/C++ code for Android[edit]

Android's applications are written in Java, but they can call C/C++ code using JNI (Java Native Interface), which in Android is presented as the NDK (Native Development Kit).

You can either:

  • Write both a Java wrapper and C++ code:
    • Available since Android 1.5
    • The C++ code may interact with an OpenGL ES context created by Java
    • Creating an OpenGL ES 2.0 context (with EGL) directly from C++ requires Android 2.3/Gingerbread/API android-9
    • OpenGL ES 2.0 available since Android 2.0/API android-5
    • Example: NDK's hello-gl2 sample
  • Or rely on the built-in "NativeActivity" Java wrapper, and only write C++ code:
    • Available since Android 2.3/Gingerbread/API android-9
    • Use EGL to create the OpenGL ES context
    • Example: NDK's native-activity sample (it's OpenGL ES 1.x, but can easily be upgraded)

Native Activity details[edit]

Android 2.3/Gingerbread/API android-9 introduces native activities, which allow writing an application without any Java code.

The sample's manifest declares a minimum API level of 8, but it should be 9:

    <uses-sdk android:minSdkVersion="9" />

Also, make sure your manifest has:

<application ...
        android:hasCode="true"

otherwise the application won't start.

Your entry point is the android_main function (instead of the more common main or WinMain). For portability, you can rename it at the preprocessor level using -Dmain=android_main[1].

Build system[edit]

The wrapper is based on the native-activity sample. It uses the 'android_native_app_glue' code that deals with non-blocking Android event processing.

<!-- Android.mk -->
LOCAL_STATIC_LIBRARIES := android_native_app_glue
...
$(call import-module,android/native_app_glue)

Since we don't call the glue code directly (its entry points are callbacks used by Android, not us), android_native_app_glue.o may be stripped by the linker, so let's call its dummy entry point:

    // Make sure glue isn't stripped.
    app_dummy();

It uses OpenGL ES 2.0 (rather than the sample's OpenGL ES 1.X):

<!-- Android.mk -->
LOCAL_LDLIBS    := -llog -landroid -lEGL -lGLESv2

To use GLM, we need to enable the C++ STL:

<!-- Application.mk -->
APP_STL := gnustl_static

and advertise its install location:

<!-- Android.mk -->
LOCAL_CPPFLAGS  := -I/usr/src/glm

We now can declare our source files (tut.cpp):

<!-- Android.mk -->
LOCAL_SRC_FILES := main.c GL/glew.c tut.cpp
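Putting the Android.mk fragments together, the whole file looks roughly like this sketch (the module name matches the project name used below; the GLM path is an assumption for this setup):

```makefile
# Android.mk - consolidated sketch of the fragments above
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := wikibooks-opengl
LOCAL_SRC_FILES := main.c GL/glew.c tut.cpp
LOCAL_CPPFLAGS  := -I/usr/src/glm
LOCAL_LDLIBS    := -llog -landroid -lEGL -lGLESv2
LOCAL_STATIC_LIBRARIES := android_native_app_glue
include $(BUILD_SHARED_LIBRARY)

$(call import-module,android/native_app_glue)
```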


To run the build system:

  • Compile the C/C++ code
ndk-build NDK_DEBUG=1 V=1
  • Prepare the Java build system (only once):
android update project --name wikibooks-opengl --path . --target "android-10"
  • Create the .apk package:
ant debug
  • Install it:
ant installd
# or manually:
adb install -r bin/wikibooks-opengl.apk
  • Clean:
ndk-build clean
ant clean

We included these commands in the wrapper Makefile.
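For reference, a Makefile wrapping these commands could look like this sketch (the target names are our choice, not a convention):

```makefile
# Makefile - sketch wrapping the build commands above
NAME = wikibooks-opengl

all:
	ndk-build NDK_DEBUG=1 V=1
	ant debug

setup:
	android update project --name $(NAME) --path . --target "android-10"

install:
	ant installd

clean:
	ndk-build clean
	ant clean
```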

Creating the OpenGL ES context with EGL[edit]

We need to tell EGL to create an OpenGL ES context with version 2.0 (not 1.x).

Firstly when requesting the available contexts:

    const EGLint attribs[] = {
            ...
	    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_NONE
    };
    ...
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);

Secondly when creating the context:

    static const EGLint ctx_attribs[] = {
      EGL_CONTEXT_CLIENT_VERSION, 2,
      EGL_NONE
    };
    context = eglCreateContext(display, config, EGL_NO_CONTEXT, ctx_attribs);

(In Java code:)

setEGLContextClientVersion(2);
// or in a custom Renderer:
int[] attrib_list = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE };
EGLContext context = egl.eglCreateContext(display, eglConfig, EGL10.EGL_NO_CONTEXT, attrib_list);

It is good practice, though not mandatory, to declare the GLES 2.0 requirement in your AndroidManifest.xml:

<uses-feature android:glEsVersion="0x00020000"></uses-feature>
<uses-sdk android:targetSdkVersion="9" android:minSdkVersion="9"></uses-sdk>

When the user goes to the home screen (or receives a call), your application is paused. When the user comes back to your application, it is unpaused, but the OpenGL context may be lost. In that case, you need to reload all the GPU-side resources (VBOs, textures, etc.). There is an Android event to detect when your application is unpaused.

Similarly, when the user presses the Back button, the activity is destroyed, but the process may still reside in memory and be restarted.

For our wrapper, we considered that GLUT applications are generally not designed to resume the OpenGL context, let alone reset all statically-assigned variables. Consequently, the application just exits completely when the context is lost - just like when the application window is closed on desktops.

Android Events[edit]

Even though we write native code, our application is still started through a Java process, using the android.app.NativeActivity built-in activity. That process is responsible for receiving device events and forwarding them to our app.

Workflow:

  • The Android OS sends an event to the NativeActivity Java process
  • The Java Activity framework calls the appropriate Activity callback functions (e.g. protected void onLowMemory())
  • NativeActivity calls its matching JNI function in android_app_NativeActivity.cpp (e.g. void onLowMemory_native(...))
  • android_app_NativeActivity.cpp calls the matching NativeCode callback in android_native_app_glue.c (e.g. void onLowMemory(...))
  • android_native_app_glue.c writes a message through a C pipe(2) (e.g. APP_CMD_LOW_MEMORY), and returns immediately so that the Java process doesn't get stuck (otherwise the user would be prompted to kill it)
  • in our native app, on a regular basis, we check the event queue and call android_native_app_glue.c's process_cmd (or process_input)
  • back one level up in android_native_app_glue.c, process_cmd executes a pre-event and a post-event generic hook, and in between calls our app's onAppCmd callback
  • back down in our app, the onAppCmd hook (e.g. engine_handle_cmd) processes the event at last!

Resources/Assets[edit]

Android applications typically extract resources (such as shaders or meshes) from their .apk file (which is really a Zip archive).

  • resources are located in res/ sub-folders (e.g. res/layout/); there are Android functions to load them depending on their type
  • assets are located in the assets/ folder and are accessed through a more traditional directory structure

That's not common for GLUT applications, so let's try to make resources available transparently:

  • using a wrapper around fopen/open
    • loaded with LD_PRELOAD, such as zlibc
    • using the kernel ptrace hooks
  • redefining fopen in our .cpp file
  • extracting files beforehand

Using a fopen/open wrapper is tedious to implement, because our application is called through JNI. This means we cannot just execv another application after setting LD_PRELOAD. Instead, we'd need to start a child process, forward it all the Android events, and set up an IPC mechanism to share the android_app and ALooper data structures. ptrace also requires a child process.

Redefining fopen locally would work for C fopen, but not for C++ cout.

Pre-extracting assets requires additional disk space to store the files, but is the most reasonable solution.

Accessing assets[edit]

Developers have been struggling to access resources easily in the NDK:

  • Android API : you can call the Android Java functions through JNI, but getting a file descriptor requires unofficial functions and only works on uncompressed files; using Java buffer operations instead is quite tedious in C/C++
  • libzip : you can easily access the .apk with libzip, though you need to integrate the library in your build system
  • NDK API : in Android 2.3/Gingerbread/API android-9 at last, there is an NDK API to access assets

Let's use the NDK API. It is not transparent for the developer either (no fopen/cout replacement) but is reasonably easy to use.

Slightly trickier is grabbing the AssetManager from Java/JNI in our native activity.

Note: we'll use the slightly simplified C++ syntax for JNI (not the C syntax).

First, our native activity runs in its own thread, so we need to attach that thread to the VM when retrieving the JNI environment in android_main:

    JNIEnv* env = state_param->activity->env;
    JavaVM* vm = state_param->activity->vm;
    vm->AttachCurrentThread(&env, NULL);

Then let's get a handle on our calling NativeActivity instance:

    jclass activityClass = env->GetObjectClass(state_param->activity->clazz);

We also need to decide where to extract the files. We'll use the application's standard cache directory:

    // Get path to cache dir (/data/data/org.wikibooks.OpenGL/cache)
    jmethodID getCacheDir = env->GetMethodID(activityClass, "getCacheDir", "()Ljava/io/File;");
    jobject file = env->CallObjectMethod(state_param->activity->clazz, getCacheDir);
    jclass fileClass = env->FindClass("java/io/File");
    jmethodID getAbsolutePath = env->GetMethodID(fileClass, "getAbsolutePath", "()Ljava/lang/String;");
    jstring jpath = (jstring)env->CallObjectMethod(file, getAbsolutePath);
    const char* app_dir = env->GetStringUTFChars(jpath, NULL);
 
    // chdir in the application cache directory
    LOGI("app_dir: %s", app_dir);
    chdir(app_dir);
    env->ReleaseStringUTFChars(jpath, app_dir);

We now can get the NativeActivity AssetManager:

#include <android/asset_manager.h>
    jobject assetManager = state_param->activity->assetManager;
    AAssetManager* mgr = AAssetManager_fromJava(env, assetManager);

The actual extraction is simple: browse all files and copy them to disk one by one:

    AAssetDir* assetDir = AAssetManager_openDir(mgr, "");
    const char* filename;
    while ((filename = AAssetDir_getNextFileName(assetDir)) != NULL) {
	AAsset* asset = AAssetManager_open(mgr, filename, AASSET_MODE_STREAMING);
	if (asset == NULL)
	    continue;
	FILE* out = fopen(filename, "wb");
	if (out == NULL) {
	    AAsset_close(asset);
	    continue;
	}
	char buf[BUFSIZ];
	int nb_read;
	while ((nb_read = AAsset_read(asset, buf, BUFSIZ)) > 0)
	    fwrite(buf, nb_read, 1, out);
	fclose(out);
	AAsset_close(asset);
    }
    AAssetDir_close(assetDir);

Now, all files can be accessed using plain fopen/cout by the application.

This technique is adapted to our tutorials, but probably not for bigger applications. In this case, you could either:

  • request write privilege on the SD card and extract files there (that's what the SDL Android port does),
  • use a wrapper around your file accesses that uses the AssetManager on Android (beware that it's read-only access)

Orientation[edit]

By setting:

        <activity ...
                android:screenOrientation="portrait"

your application will only work in portrait mode, regardless of the device orientation. This is not recommended, but it may be useful for some games.

To handle orientation properly, you theoretically need to listen for onSurfaceChanged events. However, the onSurfaceChanged_native handler in the android_app_NativeActivity.cpp wrapper doesn't seem to trigger an onNativeWindowResized event on orientation change, so instead we'll just poll the window size regularly:

/* glutMainLoop */
 
    int32_t lastWidth = -1;
    int32_t lastHeight = -1;
 
    // loop waiting for stuff to do.
    while (1) {
 
        ...
 
	int32_t newWidth = ANativeWindow_getWidth(engine.app->window);
	int32_t newHeight = ANativeWindow_getHeight(engine.app->window);
	if (newWidth != lastWidth || newHeight != lastHeight) {
	    lastWidth = newWidth;
	    lastHeight = newHeight;
	    onNativeWindowResized(engine.app->activity, engine.app->window);
	    // Process new resize event :)
	    continue;
	}

Now we can process the event:

static void onNativeWindowResized(ANativeActivity* activity, ANativeWindow* window) {
    struct android_app* android_app = (struct android_app*)activity->instance;
    LOGI("onNativeWindowResized");
    // Send an event to the queue so it gets handled in the app thread
    // after other waiting events, rather than asynchronously in the
    // native_app_glue event thread:
    android_app_write_cmd(android_app, APP_CMD_WINDOW_RESIZED);
}

Note: it is possible to process APP_CMD_CONFIG_CHANGED events, but it happens before the screen is resized, so it's too early to get the new screen size.

Android can only detect the new screen size after a buffer swap, so let's abuse another hook to get a resize event:

/* android_main */
    state_param->activity->callbacks->onContentRectChanged = onContentRectChanged;
 
...
 
static void onContentRectChanged(ANativeActivity* activity, const ARect* rect) {
    LOGI("onContentRectChanged: l=%d,t=%d,r=%d,b=%d", rect->left, rect->top, rect->right, rect->bottom);
    // Make Android realize the screen size changed, needed when the
    // GLUT app refreshes only on event rather than in loop.  Beware
    // that we're not in the GLUT thread here, but in the event one.
    glutPostRedisplay();
}

Input events[edit]

We reuse engine_handle_input from the native-activity sample.

It's important to return 0 when the event is not directly handled, so that the Android system does it. For instance we usually let Android take care of the Back button.

The NativeActivity framework doesn't seem to send appropriate key-repeat events: the key is pressed and released at the exact same time, and the repeat count is always 0. Consequently it doesn't seem possible to process arrow keys from Hacker's Keyboard without rewriting part of the framework.

Motion (touchscreen) and keyboard events are handled through the same channel.

To allow users without a keyboard to use the arrow keys, we implemented a virtual keypad (VPAD), located in the bottom-left corner and activated by the touchscreen. Effort was made to avoid mixing a VPAD event with an existing motion event and vice versa.

References[edit]

  1. That's the technique used by SDL for Windows's WinMain.

Links[edit]

Source code for the NativeActivity built-in:
