Quickstart Guide

This section describes how to integrate OrangeFilter Android SDK 1.4.0 or later and use its rendering effects in beautification products.

Development Environment:

  • Android Studio 2.0 or later
  • Android 4.0 or later (API level 14+)
  1. Obtain the OrangeFilter SDK, the OrangeHelper related files, and an authorization serial number.

    The OrangeFilter Android SDK is delivered as an Android Studio aar library. The library file is located in the SDK package at android/lib/orangefilterpub_android-x.x.x.aar (x.x.x indicates the version).

    (The Android OrangeFilter SDK is bundled with AI model data, and these resources are unpacked automatically the first time the engine is initialized. When you update the OrangeFilter SDK, clear the SDK data previously cached by the application layer to avoid cache errors; the cache path is the resource path passed when the SDK is created.)
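
    As a minimal sketch of such a cache reset (assuming the default resource path activity.getFilesDir().getPath() described later in this guide; the helper methods and the version marker below are hypothetical, not part of the SDK):

    // Hypothetical helper: clear the unpacked OrangeFilter cache when the bundled SDK version changes.
    private void clearOrangeFilterCacheIfNeeded(Context context, String bundledSdkVersion) {
        SharedPreferences prefs = context.getSharedPreferences("of_sdk", Context.MODE_PRIVATE);
        if (!bundledSdkVersion.equals(prefs.getString("of_sdk_version", ""))) {
            // The cache path is the resource path used in SDK creation (default: getFilesDir()).
            deleteRecursively(new File(context.getFilesDir().getPath(), "orangefilter"));
            prefs.edit().putString("of_sdk_version", bundledSdkVersion).apply();
        }
    }

    // Recursively delete a directory and its contents.
    private void deleteRecursively(File file) {
        File[] children = file.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        file.delete();
    }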

  2. Copy orangefilterpub_android-x.x.x.aar to the application module's libs directory and import the OrangeHelper related files into the project.

  3. Add the library dependency to the application module's build.gradle file:

    dependencies {
        ...
        implementation fileTree(dir: 'libs', include: ['*.aar'])
    }

  4. Add the required permissions in AndroidManifest.xml.
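
    For example, a camera-based integration typically needs at least the following (this list is an assumption rather than part of the SDK documentation; confirm the exact permissions against the SDK package):

    <!-- Camera capture used by the demo -->
    <uses-permission android:name="android.permission.CAMERA" />
    <!-- Only needed if effect packages are downloaded dynamically -->
    <uses-permission android:name="android.permission.INTERNET" />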

  5. Exclude the SDK API from code obfuscation (if your application code is obfuscated).

    Add the following ProGuard keep rule: -keep class com.orangefilter.** {*;}

Import SDK Dependency Resources

Effect packages, which serve as the input resources for effect rendering, are provided according to the authorized functions.

Put the authorized effect packages in the project path src/main/assets/effects so that the SDK can access these resources automatically. (For dynamically downloaded effect packages, the storage path is resDir + "/orangefilter/effects/", where resDir is a parameter of createContext whose default value is activity.getFilesDir().getPath().)
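
For dynamically downloaded packages, the target directory can be derived from the createContext parameter as sketched below (the variable names are illustrative; the download logic itself is omitted):

// resDir is the createContext parameter; its default value is activity.getFilesDir().getPath()
String resDir = activity.getFilesDir().getPath();
// Downloaded effect packages must be stored here so that the SDK can find them
File effectsDir = new File(resDir, "orangefilter/effects");
if (!effectsDir.exists()) {
    effectsDir.mkdirs();
}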

Prepare an OpenGL ES Rendering Context

The OrangeFilter Android SDK is compiled as a native dynamic library with the NDK. The Java layer wraps the native API and is provided as an Android Studio aar library.

OrangeFilter mainly provides video effect rendering. Because the current version renders on top of OpenGL ES, the integrating application layer must prepare an OpenGL ES rendering context.

Specify OpenGL ES Functional Requirements

Add the following requirement to AndroidManifest.xml

<uses-feature android:glEsVersion="0x00020000" android:required="true" />

to specify that OpenGL ES 2.0 or later is required.

The OpenGL ES 3.x API is backward compatible with the 2.0 API, so it is sufficient to declare OpenGL ES 2.0 as the required version and then detect and use the 3.x API at run time.
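
Such a run-time check can use the standard Android ActivityManager API, for example (this snippet is not part of the OrangeFilter SDK and assumes a Context reference is available):

// Query the OpenGL ES version supported by the device.
ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
ConfigurationInfo configInfo = am.getDeviceConfigurationInfo();
// reqGlEsVersion packs the major version into the upper 16 bits (0x30000 == OpenGL ES 3.0).
boolean supportsGles3 = configInfo.reqGlEsVersion >= 0x30000;
// Inside a GLSurfaceView subclass, the EGL client version can then be chosen accordingly:
// setEGLContextClientVersion(supportsGles3 ? 3 : 2);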

Create an OpenGL Context

The Android SDK provides the GLSurfaceView view, which makes it easy to create an OpenGL context for rendering.

In the example, CameraView extends GLSurfaceView and takes over the rendering tasks.

public class CameraView extends GLSurfaceView implements GLSurfaceView.Renderer

Initialize GLSurfaceView:

private void init() {
    ...
    setEGLConfigChooser(8, 8, 8, 8, 0, 0);               // RGBA8888 color buffer, no depth or stencil buffer
    setEGLContextClientVersion(3);                        // request an OpenGL ES 3.x context
    setPreserveEGLContextOnPause(true);                   // keep the EGL context when the view is paused
    setRenderer(this);                                    // CameraView itself implements GLSurfaceView.Renderer
    setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);   // render only when a frame is requested
}
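
Because setPreserveEGLContextOnPause(true) keeps the EGL context while the view is paused, the host activity typically forwards its lifecycle to the view. A minimal sketch, using the standard GLSurfaceView calls and assuming the view instance is held as mCameraView (as later in this section):

@Override
protected void onResume() {
    super.onResume();
    mCameraView.onResume();   // resume the GL rendering thread
}

@Override
protected void onPause() {
    mCameraView.onPause();    // pause the GL rendering thread
    super.onPause();
}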

Perform the rendering tasks in the GLSurfaceView.Renderer implementation methods:

@Override 
public void onDrawFrame(GL10 gl10) { 
	GLES20.xxx 
}

In the GLSurfaceView.RENDERMODE_WHEN_DIRTY render mode, the renderer triggers the onDrawFrame callback only when a new frame is requested.

You can also call this.requestRender() to send a rendering request to the renderer.
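
A common pattern, assumed here rather than taken from the demo, is to request a render whenever the camera delivers a new frame, for example from a SurfaceTexture.OnFrameAvailableListener (mSurfaceTexture is a hypothetical member holding the camera's SurfaceTexture):

// Request a render each time the camera writes a new frame into the SurfaceTexture.
mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        requestRender(); // schedules a GLSurfaceView.Renderer.onDrawFrame call
    }
});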

Rendering Effects

The sample project's CameraView (provided in the authorizer's open-source demo) integrates camera capture when it creates the OpenGL ES rendering context.

In the rendering callback, the camera image is imported as an OpenGL texture and used as the original image for effect rendering.

Create a View

mCameraView = new CameraView(this);
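
The view then has to be attached to the UI, for example (an assumption about the host layout, not taken from the demo):

setContentView(mCameraView); // or add the view to an existing ViewGroup in your layout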

Set Rendering Callback

mCameraView.setDrawFrameCallback(720, 1280, new CameraView.DrawFrameCallback() { 
	API implementation... 
});

The first two parameters, 720 and 1280, specify the resolution of the rendered texture.
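
Assuming the demo's CameraView.DrawFrameCallback interface declares exactly the three methods used in steps a to c below, the overall shape of the callback is:

mCameraView.setDrawFrameCallback(720, 1280, new CameraView.DrawFrameCallback() {
	@Override
	public void onInit() {
		// step a: create the OrangeFilter context and pre-load effects
	}

	@Override
	public void onDrawFrame(GLTexture textureIn, GLTexture textureOut) {
		// step b: update the frame parameters and render the effect
	}

	@Override
	public void onRelease() {
		// step c: release the engine resources
	}
});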

Complete the rendering task in the rendering callback implementation:

a. Create an OrangeFilter Context and effect, then prepare the rendering data

@Override
public void onInit() {
	// Authorization code: obtain it from the authorizer
	final String ofSerialNumber = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx";
	// Initialize the ofsdk engine
	boolean ret = OrangeHelper.createContext(MainActivity.this, ofSerialNumber, OrangeHelper.VENUS_ALL, null);
	if (!ret) {
		Log.e(TAG, OrangeHelper.LogMsg); // Report the failure
	}
	...
	// Effect pre-loading. (If necessary, you can also enable or disable an effect in the rendering thread, i.e. the same thread as the SDK engine.)

	// Load and enable the effect. (Only the loading step is required here; the effect can be disabled again afterwards.)
	ret = OrangeHelper.enableEffect(OrangeHelper.EffectType.ET_BasicBeauty, true);
	/*
	// If you need to set initial default values, you can do so as follows.
	if (!bInit) {
		bInit = true;
		OrangeHelper.setEffectParam(OrangeHelper.EffectParamType.EP_BasicBeautyIntensity, 70);
		OrangeHelper.setEffectParam(OrangeHelper.EffectParamType.EP_BasicBeautyOpacity, 70);
	}
	*/
	// Disable the effect. This keeps only the loaded state; if it is not disabled, the effect is displayed immediately.
	OrangeHelper.enableEffect(OrangeHelper.EffectType.ET_BasicBeauty, false);
	// Other effects that you wish to pre-load
	...
}

b. Render the effect in the callback

@Override
public void onDrawFrame(GLTexture textureIn, GLTexture textureOut) {
	...
	// Declare these as member variables to improve performance:
	/*OrangeHelper.OrangeGLTexture mInTexture = new OrangeHelper.OrangeGLTexture();
	OrangeHelper.OrangeGLTexture mOutTexture = new OrangeHelper.OrangeGLTexture();
	OrangeHelper.OrangeImageInfo mImageInfo = new OrangeHelper.OrangeImageInfo();*/

	mInTexture.mTextureId = textureIn.getTextureId();   // ID of the OpenGL input texture: the texture the effect is applied to
	mInTexture.mWidth = textureIn.getWidth();           // Texture width
	mInTexture.mHeight = textureIn.getHeight();         // Texture height
	mOutTexture.mTextureId = textureOut.getTextureId(); // ID of the OpenGL output texture: receives the rendered result
	mOutTexture.mWidth = textureOut.getWidth();         // Texture width
	mOutTexture.mHeight = textureOut.getHeight();       // Texture height
	mImageInfo.data = image.data;                       // Input image for facial recognition
	mImageInfo.format = OrangeFilter.OF_PixelFormat_NV21; // Default Android camera data format; change it to the actual image format when using a third-party SDK
	mImageInfo.dir = isLandScape() ? (image.dir + 1) % 4 : image.dir; // Image direction; isLandScape() checks whether the device is in landscape mode
	mImageInfo.frontCamera = image.frontCamera;         // Whether the front camera is used
	mImageInfo.width = isLandScape() ? image.height : image.width;   // Image width
	mImageInfo.height = isLandScape() ? image.width : image.height;  // Image height
	mImageInfo.orientation = image.orientation;         // Hardware orientation captured by the camera (required on Android only)

	// Refresh the engine's input parameters
	if (!OrangeHelper.updateFrameParams(mInTexture, mOutTexture, mImageInfo)) {
		// Processing failed: you can output the original input image directly, and/or report the error to locate its cause.
		...
	}
}

mInTexture is the image captured by the camera, while mOutTexture is the image with the added effects that is rendered to the screen.

c. Release resources

@Override
public void onRelease() {
	...
	// Release the engine
	OrangeHelper.destroyContext();
}
