Qt on Android: How to create a zero-copy Android SurfaceTexture QML item
Motivation:
Android's SurfaceTexture is used by all Android classes that need to display a stream of frames: it can show output from a media player, the camera, and so on. You can also use it in combination with other players such as OpenMAX or VLC (VLC and other C++ players use MediaCodec) to decode images directly into our SurfaceTexture. As you can see, it's pretty useful.
In this tutorial we're going to see how easy it is to create a media player using this QML item and Android's MediaPlayer class, which also lets us exercise our JNI skills ;-).
Please start by creating a simple Qt Quick (Controls) application.
Step I
Add the QtAndroidExtras module to your .pro file. We're going to use QtAndroidExtras a lot.
QT += androidextras
Step II
First, let's create a new class, QAndroidMediaPlayer, which inherits QObject. Then let's create our MediaPlayer in the QAndroidMediaPlayer constructor.
qandroidmediaplayer.cpp
QAndroidMediaPlayer::QAndroidMediaPlayer(QObject *parent)
    : QObject(parent)
    , m_mediaPlayer("android/media/MediaPlayer")
{
}
m_mediaPlayer is declared in the header as: QAndroidJniObject m_mediaPlayer;
In QAndroidMediaPlayer's destructor we must call the release method. Because we don't know what state m_mediaPlayer is in, we call the stop and reset methods first.
QAndroidMediaPlayer::~QAndroidMediaPlayer()
{
    QAndroidJniEnvironment env;
    m_mediaPlayer.callMethod<void>("stop");
    m_mediaPlayer.callMethod<void>("reset");
    m_mediaPlayer.callMethod<void>("release");
}
The next task is to implement a playFile Q_INVOKABLE method.
void QAndroidMediaPlayer::playFile(const QString &file)
{
    QAndroidJniEnvironment env;

    // m_mediaPlayer must be in the idle state when calling
    // setDataSource, so we call stop and reset first.

    // try to stop the media player
    m_mediaPlayer.callMethod<void>("stop");

    // try to reset the media player
    m_mediaPlayer.callMethod<void>("reset");

    // set the path of the file
    m_mediaPlayer.callMethod<void>("setDataSource", "(Ljava/lang/String;)V",
                                   QAndroidJniObject::fromString(file).object());

    // prepare the media player
    m_mediaPlayer.callMethod<void>("prepare");

    // start playing
    m_mediaPlayer.callMethod<void>("start");
}
At this point we have a working media player that can play any media file supported by Android's codecs. The only problem is that we don't see any images when playing video files 🙂. Let's fix that!
First and foremost, let's add a new property to QAndroidMediaPlayer that sets a new Surface on Android's MediaPlayer object.
void QAndroidMediaPlayer::setVideoOut(QSurfaceTexture *videoOut)
{
    if (m_videoOut == videoOut)
        return;

    m_videoOut = videoOut;

    // Create a new Surface object from our SurfaceTexture
    QAndroidJniObject surface("android/view/Surface",
                              "(Landroid/graphics/SurfaceTexture;)V",
                              videoOut->surfaceTexture().object());

    // Set the new surface on the m_mediaPlayer object
    m_mediaPlayer.callMethod<void>("setSurface",
                                   "(Landroid/view/Surface;)V",
                                   surface.object());

    emit videoOutChanged();
}
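For completeness, the class declaration that goes with these methods might look like the sketch below. Only m_mediaPlayer, playFile and the videoOut property are taken from this article; the exact signal, getter and member names are assumptions.

```cpp
// qandroidmediaplayer.h -- hypothetical sketch, not the article's actual header
#include <QObject>
#include <QAndroidJniObject>

class QSurfaceTexture;

class QAndroidMediaPlayer : public QObject
{
    Q_OBJECT
    // the property used from QML to route video frames to our item
    Q_PROPERTY(QSurfaceTexture *videoOut READ videoOut WRITE setVideoOut NOTIFY videoOutChanged)

public:
    explicit QAndroidMediaPlayer(QObject *parent = nullptr);
    ~QAndroidMediaPlayer();

    Q_INVOKABLE void playFile(const QString &file);

    QSurfaceTexture *videoOut() const { return m_videoOut; }
    void setVideoOut(QSurfaceTexture *videoOut);

signals:
    void videoOutChanged();

private:
    QAndroidJniObject m_mediaPlayer;      // wraps android/media/MediaPlayer
    QSurfaceTexture *m_videoOut = nullptr;
};
```

The Q_PROPERTY declaration is what makes the `_mediaPlayer.videoOut = videoItem` assignment in the QML code below work.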
Step III
Implement a QQuickItem that wraps an Android SurfaceTexture object. This item will display the frames that are pushed into the SurfaceTexture. Basically, we rewrite Google's MyGLSurfaceView.java in Qt.
We start by creating a new class, QSurfaceTexture, which inherits QQuickItem and overrides the updatePaintNode method.
Let’s see the header file:
class QSurfaceTexture : public QQuickItem
{
    Q_OBJECT
public:
    QSurfaceTexture(QQuickItem *parent = nullptr);
    ~QSurfaceTexture();

    // returns the SurfaceTexture Java object
    const QAndroidJniObject &surfaceTexture() const { return m_surfaceTexture; }

    // QQuickItem interface
protected:
    QSGNode *updatePaintNode(QSGNode *n, UpdatePaintNodeData *) override;

private:
    // our texture
    uint32_t m_textureId = 0;

    // Java SurfaceTexture object
    QAndroidJniObject m_surfaceTexture;
};
There's no magic here: we need the surfaceTexture() getter to access m_surfaceTexture (it is used by QAndroidMediaPlayer::setVideoOut), and we declare our m_textureId.
Now, let’s see the implementation file:
QSurfaceTexture::QSurfaceTexture(QQuickItem *parent)
    : QQuickItem(parent)
{
    setFlags(ItemHasContents);
}

QSurfaceTexture::~QSurfaceTexture()
{
    // Delete our texture
    if (m_textureId) {
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
        glDeleteTextures(1, &m_textureId);
    }
}
We set the ItemHasContents flag in the constructor and, in the destructor, delete the texture (if it was created).
Now, let’s focus on updatePaintNode method.
QSGNode *QSurfaceTexture::updatePaintNode(QSGNode *n, QQuickItem::UpdatePaintNodeData *)
{
    SurfaceTextureNode *node = static_cast<SurfaceTextureNode *>(n);
    if (!node) {
        // Create the texture
        glGenTextures(1, &m_textureId);
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_textureId);

        // Can't do mipmapping with a camera source
        glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // Clamp to edge is the only option
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        // Create a SurfaceTexture Java object
        m_surfaceTexture = QAndroidJniObject("android/graphics/SurfaceTexture", "(I)V", m_textureId);

        // We need to call setOnFrameAvailableListener to be notified when a new frame
        // was decoded and is ready to be displayed. Check the
        // android/src/com/kdab/android/SurfaceTextureListener.java file for
        // implementation details.
        m_surfaceTexture.callMethod<void>("setOnFrameAvailableListener",
                                          "(Landroid/graphics/SurfaceTexture$OnFrameAvailableListener;)V",
                                          QAndroidJniObject("com/kdab/android/SurfaceTextureListener",
                                                            "(J)V", jlong(this)).object());

        // Create our SurfaceTextureNode
        node = new SurfaceTextureNode(m_surfaceTexture, m_textureId);
    }

    // flip vertically
    QRectF rect(boundingRect());
    float tmp = rect.top();
    rect.setTop(rect.bottom());
    rect.setBottom(tmp);

    QSGGeometry::updateTexturedRectGeometry(node->geometry(), rect, QRectF(0, 0, 1, 1));
    node->markDirty(QSGNode::DirtyGeometry | QSGNode::DirtyMaterial);
    return node;
}
We'll look at SurfaceTextureListener.java in a moment. SurfaceTextureNode is a custom QSGGeometryNode; we'll see its implementation right after SurfaceTextureListener. The comments should make the rest of the code clear.
Now let’s check SurfaceTextureListener.java.
public class SurfaceTextureListener implements SurfaceTexture.OnFrameAvailableListener
{
    private long m_callback = 0;

    public SurfaceTextureListener(long callback)
    {
        m_callback = callback;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture)
    {
        // call the native method
        frameAvailable(m_callback, surfaceTexture);
    }

    public native void frameAvailable(long nativeHandle, SurfaceTexture surfaceTexture);
}
You'll need to add this file to your project in the android/src/com/kdab/android folder. Also, make sure the ANDROID_PACKAGE_SOURCE_DIR qmake variable points to your android sources folder.
...
ANDROID_PACKAGE_SOURCE_DIR = $$PWD/android
...
This class is needed to call a native function that signals when a new frame has been decoded. Let's look at the C/C++ function implementation.
extern "C" void Java_com_kdab_android_SurfaceTextureListener_frameAvailable(JNIEnv */*env*/, jobject /*thiz*/,
                                                                            jlong ptr, jobject /*surfaceTexture*/)
{
    // a new frame was decoded, let's update our item
    QMetaObject::invokeMethod(reinterpret_cast<QSurfaceTexture *>(ptr), "update", Qt::QueuedConnection);
}
This function is called from the decoder thread, therefore we use QMetaObject::invokeMethod with Qt::QueuedConnection to post the update call to QSurfaceTexture's thread.
Next, let's take a look at the SurfaceTextureNode implementation. SurfaceTextureNode inherits QSGGeometryNode, which is needed to render the texture content using a geometry and a material. As you can see, the texture is not bound as GL_TEXTURE_2D but as GL_TEXTURE_EXTERNAL_OES (required by Android's MediaCodec/MediaPlayer/Camera/etc.), so we need a custom shader to draw it (otherwise QSGSimpleTextureNode would be more suitable for displaying textures).
class SurfaceTextureNode : public QSGGeometryNode
{
public:
    SurfaceTextureNode(const QAndroidJniObject &surfaceTexture, GLuint textureId)
        : QSGGeometryNode()
        , m_surfaceTexture(surfaceTexture)
        , m_geometry(QSGGeometry::defaultAttributes_TexturedPoint2D(), 4)
        , m_textureId(textureId)
    {
        // we're going to use the "preprocess" method to update the texture image
        // and to get the new matrix
        setFlag(UsePreprocess);
        setGeometry(&m_geometry);

        // Create and set our SurfaceTextureShader
        QSGSimpleMaterial<State> *material = SurfaceTextureShader::createMaterial();
        material->setFlag(QSGMaterial::Blending, false);
        setMaterial(material);
        setFlag(OwnsMaterial);

        // We're going to get the transform matrix for every frame,
        // so let's create the array once
        QAndroidJniEnvironment env;
        jfloatArray array = env->NewFloatArray(16);
        m_uSTMatrixArray = jfloatArray(env->NewGlobalRef(array));
        env->DeleteLocalRef(array);
    }

    ~SurfaceTextureNode()
    {
        // delete the global reference, now the GC is free to collect it
        QAndroidJniEnvironment()->DeleteGlobalRef(m_uSTMatrixArray);
    }

    // QSGNode interface
    void preprocess() override;

private:
    QAndroidJniObject m_surfaceTexture;
    QSGGeometry m_geometry;
    jfloatArray m_uSTMatrixArray = nullptr;
    GLuint m_textureId;
};

void SurfaceTextureNode::preprocess()
{
    QSGSimpleMaterial<State> *mat = static_cast<QSGSimpleMaterial<State> *>(material());
    if (!mat)
        return;

    // update the texture content
    m_surfaceTexture.callMethod<void>("updateTexImage");

    // get the new texture transform matrix
    m_surfaceTexture.callMethod<void>("getTransformMatrix", "([F)V", m_uSTMatrixArray);
    QAndroidJniEnvironment env;
    env->GetFloatArrayRegion(m_uSTMatrixArray, 0, 16, mat->state()->uSTMatrix.data());

    // Activate and bind our texture
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_textureId);
}
This class creates and sets our SurfaceTextureShader (the material), then on every frame it updates the texture image and gets its transform matrix.
Next, let's look at the SurfaceTextureShader class and the State structure. SurfaceTextureShader is based on QSGSimpleMaterialShader.
struct State {
    // the texture transform matrix
    QMatrix4x4 uSTMatrix;

    int compare(const State *other) const {
        return uSTMatrix == other->uSTMatrix ? 0 : -1;
    }
};
In this structure, we only need the texture transform matrix. This matrix is updated by SurfaceTextureNode::preprocess and used by SurfaceTextureShader::updateState, shown below.
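To make the role of the transform matrix concrete, here is a small standalone C++ sketch, independent of Qt and Android (the function and constant names are ours, not from any API). It applies a 4x4 column-major texture transform to a coordinate exactly the way the vertex shader's `uSTMatrix * aTextureCoord` does. A matrix commonly reported by SurfaceTexture is an identity combined with a vertical flip, mapping (s, t) to (s, 1 - t):

```cpp
#include <cassert>
#include <cmath>

// Apply a 4x4 column-major texture transform (the layout used by
// SurfaceTexture.getTransformMatrix and by OpenGL) to a 2D texture
// coordinate, treating it as the vec4 (s, t, 0, 1).
inline void applyTexTransform(const float m[16], float &s, float &t)
{
    const float ns = m[0] * s + m[4] * t + m[12];
    const float nt = m[1] * s + m[5] * t + m[13];
    s = ns;
    t = nt;
}

// A vertical-flip matrix in column-major layout: (s, t) -> (s, 1 - t).
static const float kFlipY[16] = {
    1.f,  0.f, 0.f, 0.f,
    0.f, -1.f, 0.f, 0.f,
    0.f,  0.f, 1.f, 0.f,
    0.f,  1.f, 0.f, 1.f,
};
```

With this matrix, the coordinate (0.25, 0.75) ends up at (0.25, 0.25): the frame is flipped vertically. This is why the shader multiplies every texture coordinate by uSTMatrix instead of hard-coding an orientation, and why preprocess() has to refresh the matrix on every frame.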
class SurfaceTextureShader : public QSGSimpleMaterialShader<State>
{
    QSG_DECLARE_SIMPLE_COMPARABLE_SHADER(SurfaceTextureShader, State)
public:
    // vertex & fragment shaders are shamelessly "stolen" from MyGLSurfaceView.java :)
    const char *vertexShader() const {
        return
            "uniform mat4 qt_Matrix;                            \n"
            "uniform mat4 uSTMatrix;                            \n"
            "attribute vec4 aPosition;                          \n"
            "attribute vec4 aTextureCoord;                      \n"
            "varying vec2 vTextureCoord;                        \n"
            "void main() {                                      \n"
            "  gl_Position = qt_Matrix * aPosition;             \n"
            "  vTextureCoord = (uSTMatrix * aTextureCoord).xy;  \n"
            "}";
    }

    const char *fragmentShader() const {
        return
            "#extension GL_OES_EGL_image_external : require     \n"
            "precision mediump float;                           \n"
            "varying vec2 vTextureCoord;                        \n"
            "uniform lowp float qt_Opacity;                     \n"
            "uniform samplerExternalOES sTexture;               \n"
            "void main() {                                      \n"
            "  gl_FragColor = texture2D(sTexture, vTextureCoord) * qt_Opacity; \n"
            "}";
    }

    QList<QByteArray> attributes() const {
        return QList<QByteArray>() << "aPosition" << "aTextureCoord";
    }

    void updateState(const State *state, const State *) {
        program()->setUniformValue(m_uSTMatrixLoc, state->uSTMatrix);
    }

    void resolveUniforms() {
        m_uSTMatrixLoc = program()->uniformLocation("uSTMatrix");
        program()->setUniformValue("sTexture", 0); // we only need to set the texture unit once
    }

private:
    int m_uSTMatrixLoc;
};
In SurfaceTextureShader we have (almost) the same vertex & fragment shaders as MyGLSurfaceView.java; the only difference is the Qt shader variables (qt_Matrix and qt_Opacity).
Step IV
Put everything together and enjoy our MediaPlayer!
The main.cpp changes
int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Register our QML type
    qmlRegisterType<QSurfaceTexture>("com.kdab.android", 1, 0, "SurfaceTexture");

    // Create a player
    QAndroidMediaPlayer player;

    QQmlApplicationEngine engine;
    // Expose the player to QML
    engine.rootContext()->setContextProperty("_mediaPlayer", &player);
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    return app.exec();
}
The QML code
import QtQuick 2.5
import QtQuick.Controls 1.4
import com.kdab.android 1.0

ApplicationWindow {
    visible: true
    width: 640
    height: 480
    title: qsTr("SurfaceTexture example")

    SurfaceTexture {
        id: videoItem
        anchors.fill: parent

        // Set the media player's video out
        Component.onCompleted: _mediaPlayer.videoOut = videoItem;

        MouseArea {
            anchors.fill: parent
            onClicked: _mediaPlayer.playFile("/sdcard/testfile.mp4");
        }
    }
}
As you can see, the file path is hard-coded, so make sure you push a video file to /sdcard/testfile.mp4 (for example with `adb push testfile.mp4 /sdcard/`), or change that value.
The last thing we need to do is set the minimum API version in your AndroidManifest.xml file to 14, because the Surface(SurfaceTexture) constructor we need was added in API level 14.
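A minimal sketch of the relevant manifest entry (only minSdkVersion matters here; everything else in your manifest stays as Qt Creator generated it):

```xml
<!-- AndroidManifest.xml fragment: Surface(SurfaceTexture) requires API level 14 -->
<uses-sdk android:minSdkVersion="14"/>
```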
Now the only thing left to do is tap on the screen to play that video file!
You can find the full source code of this article here: https://github.com/KDAB/android
Thanks! Is there a way to show video on a SurfaceView, without a texture? It's useful for low-end devices and might help resolve https://bugreports.qt.io/browse/QTBUG-36021
Not very easy.
You can put a SurfaceView on top of the other Qt controls and pass it to a media player, but you won't be able to draw anything on top of it.
Thanks for that! Does this implementation use Android hardware acceleration?
Yep, it uses hardware acceleration if available.
Could you please have a look at the following issue:
https://github.com/KDAB/android/issues/1
I have posted the stack trace of the crash when trying to start the media file.
I’ll check it this week
Could you please take a look at this issue:
https://github.com/KDAB/android/issues/3
I am trying to make this work using a Loader.
Hi, BogDan Vatra! Great example.
I have a problem creating two SurfaceTexture (QML element) items with different movies to play.
When I start playing, I see only one movie at a time and only one GL_TEXTURE_EXTERNAL_OES showing on the screen (also, the second texture is mirrored vertically and horizontally).
I think some extra functions or texture manipulations must be needed in "preprocess". I tried to unbind the texture, call updateTexImage and then bind the texture again, but nothing changed.
Is there any possibility to resolve the problem above?
Thanks!
Hi Bogdan,
Your solution is very interesting, great job.
I have a question: can this example be extended so that any Android view can be rendered as a texture and displayed in the QML tree? I am thinking of using your approach, but in a slightly different way. I want to try to reimplement onDraw and render on a Surface created from a SurfaceTexture that is created in C++. When onDraw is called, I will use invokeMethod to notify the Qt QML thread that it needs to redraw. What do you think about this approach?
Hi Yurii,
I have the same idea as you.
Can you share your progress with me?
Yup, I think with some effort it can be done.
Thanks very much
Hi Bogdan.
Thanks for that great job.
That works fine with a local mp4 file and with an mp4 file located on a remote HTTP server, but I get the following error (please have a look at the trace at the bottom of this message) when I try with a local network camera RTSP stream; the screen remains black. The "setDataSource" MediaPlayer documentation specifies that RTSP URLs are supported. I tried using the "prepareAsync" call instead of "prepare", but the result is the same. Note that I did add the INTERNET permission to the manifest. With another application from the Play Store, the stream is displayed correctly.
My environment : Win10,Qt5.12,NDK r10e, Android 8.1.0, gcc armv7a.
Any idea or suggestion (increase loglevel or… something that is missing) ?
Thanks for your help.
Regards,
David.
V MediaPlayer-JNI: start
V MediaPlayerNative: start
V MediaPlayerNative: message received msg=300, ext1=0, ext2=0
V MediaPlayerNative: Received SEC_MM_PLAYER_CONTEXT_AWARE
V MediaPlayerNative: callback application
V MediaPlayerNative: back from callback
V MediaPlayerNative: message received msg=100, ext1=1, ext2=-5001
E MediaPlayerNative: error (1, -5001)
V MediaPlayerNative: callback application
V MediaPlayerNative: back from callback
V MediaPlayerNative: message received msg=200, ext1=10951, ext2=0
W MediaPlayerNative: info/warning (10951, 0)
V MediaPlayerNative: callback application
V MediaPlayerNative: back from callback
E MediaPlayer: Error (1,-5001)
Hi Bogdan,
Sorry, I made a small mistake in my message. The Qt version was 5.11 (and not 5.12) on my laptop. I tried your example on my desktop (difference is Qt5.12/Clang on my desktop vs 5.11/GCC on my laptop). When I build with my desktop for the same Android 8 device, the RTSP flow is correctly displayed. I tried to build with my desktop after reading the bug tracker QTBUG-50539.
Great job & thank you very much.
Regards.
David.
Unfortunately, this no longer works in Qt 6. QSGSimpleMaterial and QSGSimpleMaterialShader are gone, and classes such as QSGMaterialShader are quite different from what they were in Qt 5.
Would it be possible to update this great example to Qt 6?
Sadly, at the moment I have no time to update the example.