Overview of Qt3D 2.0 – Part 2
An Example of Rendering with Qt3D
In the previous article we learned about the requirements and high-level architecture of Qt3D 2.0. To put some of this into context and give you a concrete example of what it looks like to draw something in Qt3D using the QML API, we will now briefly walk through the important parts of one of the simple examples that will ship with Qt3D. We will start off simple and draw just a single entity (a trefoil knot), but to make it slightly more interesting we will use a custom set of shaders to implement a single-pass wireframe rendering method. This is what we will draw:
As mentioned in the previous article, the renderer aspect looks for entities that have some geometry, a material and optionally a transformation. These are all specified as subclasses of QComponent, which have been exported to the QML engine as Mesh, Material and Transform respectively. So let’s use these components to make a custom QML item in TrefoilKnot.qml:
import Qt3D 2.0
import Qt3D.Render 2.0

Entity {
    id: root

    property alias x: translation.dx
    property alias y: translation.dy
    property alias z: translation.dz
    property alias scale: scaleTransform.scale
    property alias theta: thetaRotation.angle
    property alias phi: phiRotation.angle
    property Material material

    components: [ transform, mesh, root.material ]

    Transform {
        id: transform
        Translate { id: translation }
        Scale { id: scaleTransform }
        Rotate { id: thetaRotation; axis: Qt.vector3d( 1.0, 0.0, 0.0 ) }
        Rotate { id: phiRotation; axis: Qt.vector3d( 0.0, 1.0, 0.0 ) }
    }

    Mesh {
        id: mesh
        source: ":/assets/obj/trefoil.obj"
    }
}
Let’s break this down to see what’s going on here. We start off by importing the Qt3D 2.0 module that provides the Entity type and value type helpers like Qt.vector3d(). We also import the Qt3D.Render 2.0 module that provides the components and other types picked up by the renderer aspect. If we were using components from other aspects, then we would also need to import the corresponding QML module here too.
We then use Entity as the root element of the custom QML type, exposing some custom properties just as you would with any other type in QML.
Entities, Entities, Everywhere
In addition to aggregating components, Entity objects can be used to group child objects together. This is analogous to how Item is used in Qt Quick 2.
At the bottom of the TrefoilKnot.qml file we instantiate a Transform component and a Mesh component. The Mesh component is very simple. We use its source property to load in a static set of geometry (vertex positions, normal vectors, texture coordinates etc.) from a file in the Wavefront Obj format. This data was exported from the excellent and free Blender application. The Transform component specifies how the renderer should transform the geometry when it is drawn with the OpenGL pipeline. Exactly how this happens is a topic for a future article. For now, simply be happy that you are able to combine an ordered set of transformations into a single Transform component and that your shaders will have this information available to them automatically via some standard named uniform variables.
Dynamic Per-Vertex Data
In addition to the Mesh element, Qt3D also allows dynamic generation of per-vertex attribute data via some C++ hooks called by the task-based engine.
Simply instantiating components is not enough, however. In order for them to imbue an entity with special behaviour, the entity must aggregate the components by means of its components property. This allows components to be shared between multiple entities very easily. In this example we have components for the transform and mesh contained within our custom type. The final component, of type Material, is in this case provided by means of a property on the TrefoilKnot itself. This allows users of this type to easily customise the appearance of the entity, which we will make use of shortly.
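To make the sharing aspect concrete, here is a minimal sketch (not part of the shipped example) of two entities referencing the same Material instance. The second mesh source is just an illustrative asset name:

Entity {
    // A single material instance...
    Material { id: sharedMaterial }

    // ...aggregated by two different entities via their components property
    Entity {
        components: [
            Mesh { source: ":/assets/obj/trefoil.obj" },
            sharedMaterial
        ]
    }

    Entity {
        components: [
            Mesh { source: ":/assets/obj/torus.obj" }, // illustrative asset path
            sharedMaterial
        ]
    }
}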
Now that we have defined a custom entity, let’s see how to use it to actually get our desired result. The code for our main.qml file looks like this:
import Qt3D 2.0
import Qt3D.Render 2.0
import QtQuick 2.1 as QQ2

Entity {
    id: root

    // Use the renderer configuration specified in ForwardRenderer.qml
    // and render from the mainCamera
    components: [
        FrameGraph {
            activeFrameGraph: ForwardRenderer {
                camera: mainCamera
            }
        }
    ]

    BasicCamera {
        id: mainCamera
        position: Qt.vector3d( 0.0, 0.0, 25.0 )
    }

    Configuration {
        controlledCamera: mainCamera
    }

    WireframeMaterial {
        id: wireframeMaterial
        effect: WireframeEffect {}
        ambient: Qt.rgba( 0.2, 0.0, 0.0, 1.0 )
        diffuse: Qt.rgba( 0.8, 0.0, 0.0, 1.0 )
    }

    TrefoilKnot {
        id: trefoilKnot
        material: wireframeMaterial
    }
}
We start off again with the same import statements as before but this time we also add in a namespaced import for the Qt Quick 2.1 module as we will need this shortly for some animations. Once again we also use Entity as the root element simply to act as a parent for its children. In this sense, Entity is much like the Item element type from Qt Quick.
Here, we will gloss over the FrameGraph component as that is worthy of an entire article on its own. For now, it suffices to say that the contents of the ForwardRenderer type are what completely configure the renderer, without touching any C++ code at all. It’s pretty cool stuff but you’ll have to wait for the details as this is already a long article. Similarly, please ignore the Configuration element. This is a temporary hack that is needed to have mouse control of the camera until we finish implementing that part correctly using aspects and components.
The BasicCamera element is a trivial wrapper around the built-in Camera type that, as you can probably deduce, represents a virtual camera. It has properties for things like the near and far planes, field of view, aspect ratio, projection type, position, orientation etc.
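For illustration, configuring the underlying Camera type directly looks roughly like this; the property values here are placeholders rather than the ones BasicCamera actually uses:

Camera {
    id: mainCamera

    // Perspective projection with placeholder near/far planes and field of view
    projectionType: CameraLens.PerspectiveProjection
    fieldOfView: 45
    aspectRatio: 16 / 9
    nearPlane: 0.1
    farPlane: 1000.0

    // Where the camera sits and what it looks at
    position: Qt.vector3d( 0.0, 0.0, 25.0 )
    upVector: Qt.vector3d( 0.0, 1.0, 0.0 )
    viewCenter: Qt.vector3d( 0.0, 0.0, 0.0 )
}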
Multiple Cameras
It is trivial to use multiple cameras and choose between them using the framegraph for all or part of the scene rendering. We will cover this in a future article.
Next up we have the WireframeMaterial element. This is a custom type that wraps up the built-in Material type. Qt3D has a robust and very flexible material system that allows multiple levels of customisation. This caters for different rendering approaches on different platforms or OpenGL versions; allows multiple rendering passes with different state sets; provides mechanisms for overriding parameters at different levels; and also allows easy switching of shaders, all from C++ or using QML property bindings. Once again, to do this topic justice would require more space than we have here, so we will defer it for another time. For now, the takeaway point is that properties on a Material can easily be mapped through to uniform variables in a GLSL shader program that is itself specified in the referenced effect property.
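As a rough sketch of that mapping, a material can forward its QML properties to shader uniforms via Parameter elements. The uniform names ka and kd below are assumptions for illustration and would have to match uniforms declared in the effect’s shaders:

Material {
    effect: WireframeEffect {}

    property color ambient: Qt.rgba( 0.2, 0.0, 0.0, 1.0 )
    property color diffuse: Qt.rgba( 0.8, 0.0, 0.0, 1.0 )

    // Each Parameter is exposed to the shaders as a uniform of the same name
    parameters: [
        Parameter { name: "ka"; value: Qt.vector3d( ambient.r, ambient.g, ambient.b ) },
        Parameter { name: "kd"; value: Qt.vector3d( diffuse.r, diffuse.g, diffuse.b ) }
    ]
}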
Supported Shader Stages
Qt3D supports all of the OpenGL programmable rendering pipeline stages: Vertex, tessellation control, tessellation evaluation, geometry and fragment shaders. Compute shaders require a little more API work for getting data into and out of them before they are fully supported.
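To give a flavour of how shaders are attached to these stages, a ShaderProgram referenced from a render pass can load one source file per stage. This is a minimal sketch assuming the loadSource() helper of the QML ShaderProgram type and illustrative resource paths:

ShaderProgram {
    vertexShaderCode:   loadSource( "qrc:/shaders/robustwireframe.vert" )
    geometryShaderCode: loadSource( "qrc:/shaders/robustwireframe.geom" )
    fragmentShaderCode: loadSource( "qrc:/shaders/robustwireframe.frag" )
}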
Instantiating the TrefoilKnot and setting our material on it is simplicity itself. Once we have done that, and together with the parts we have glossed over, the Qt3D engine, in conjunction with the renderer aspect, has enough information to finally render our mesh using the material we specified.
Of course we can go further and make things a little more interesting by making use of the animation elements provided by Qt Quick 2. When we animate properties of our custom TrefoilKnot or the WireframeMaterial, the properties of their components get updated by means of the usual QML property binding mechanism. For example:
WireframeMaterial {
    id: wireframeMaterial
    effect: WireframeEffect {}
    ambient: Qt.rgba( 0.2, 0.0, 0.0, 1.0 )
    diffuse: Qt.rgba( 0.8, 0.0, 0.0, 1.0 )

    QQ2.SequentialAnimation {
        loops: QQ2.Animation.Infinite
        running: true

        QQ2.NumberAnimation {
            target: wireframeMaterial
            property: "lineWidth"
            duration: 1000
            from: 1.0
            to: 3.0
        }

        QQ2.NumberAnimation {
            target: wireframeMaterial
            property: "lineWidth"
            duration: 1000
            from: 3.0
            to: 1.0
        }

        QQ2.PauseAnimation { duration: 1500 }
    }
}
The property updates are noticed by the QNode base class and are automatically sent through to the corresponding objects in the renderer aspect. The renderer then takes care of translating the property updates into new values for uniform variables in the GLSL shader programs. You can find the full source code for this example in the Qt 5 git repository (see below), and when you run it you get the following view of a trefoil knot with the width of the wireframe lines pulsing. All the heavy lifting is being done by the GPU of course. All the CPU has to do is the property animations and the little bit of work to translate the scenegraph and framegraph into raw OpenGL calls.
Even More Win?
In the future, even the animations will be able to be performed across multiple cores by providing a specialised animation aspect. It is also already possible to animate on the GPU via a custom shader program and material.
What Is the Status of Qt3D?
As of December 2014, most of the core framework of Qt3D is now in place. There are a few areas that we want to tidy up and extend before release to make it easier for users to build upon. For the renderer aspect, most of the features for an initial release are working. We have just finished implementing the first pass of using data gathered from entities in the scenegraph to populate Uniform Buffer Objects (UBOs) that can be bound to OpenGL shader programs to make large amounts of data readily available. Typical use cases for UBOs are sets of material or lighting parameters, but they can be used for anything you can think of.
The two big things that we have yet to implement for the renderer aspect are:
- Support for instanced rendering. Instancing is a way of getting the GPU to draw many copies (instances) of a base object that varies in some way for each copy, often in position, orientation, colour, material properties, scale etc. Our plan is to provide an API similar to Qt Quick’s Repeater element. In this case the delegate will be the base object and the model will provide the per-instance data. So whereas an entity with a Mesh component attached eventually gets transformed into a call to glDrawElements, an entity with an instanced component will be translated into a call to glDrawElementsInstanced.
- Qt Quick 2 Integration. There are a number of ways in which Qt Quick 2 could be integrated with Qt3D (or vice versa). For example, you may simply wish to embed a Qt3D scene into a custom Qt Quick 2 item to put it into your UI. Alternatively, you may want to overlay a Qt Quick 2 scene as a UI over your Qt3D scene. You may also want to be able to use Qt Quick 2 to render into a texture and then use that texture within your Qt3D scene, perhaps to apply it to some geometry such as a sign post. With a custom Qt Quick 2 item based on QQuickFramebufferObject and by making use of QQuickRenderControl all of these options should be possible.
In addition, we have yet to implement a sane set of default materials that are ready to use out of the box. The materials example that is included in the Qt3D examples shows a good selection of what some defaults may look like. They need to be made a little more generic and tested on other platforms, particularly OpenGL ES 2 and ES 3.
Beyond the renderer, the other aspect to be shipped when we release Qt3D 2.0 will be the keyboard and mouse input aspect. Support for keyboard input is already implemented and is usable. Mouse support will come in the New Year. For now we have the hacked-together solution mentioned in the wireframe example above for controlling the camera.
Qt3D API
Please note that the Qt3D API is not yet frozen. The API will change before release, but hopefully not by much.
What Can You Do To Help Qt3D?
So far Qt3D 2.0 has been almost entirely designed and implemented by KDAB engineers. I would like to highlight the efforts of Paul Lemire, James Turner, Kevin Ottens, Giuseppe D’Angelo and Milian Wolff who have done a huge amount of work to rebuild Qt3D from the ground up. A lot of work has gone into Qt3D, much of it not visible, in the form of prototypes that were discarded and never saw the light of day, API reviews, testing, debugging, and profiling. This has resulted in over 1200 commits since we moved development onto the public Qt git repositories.
Most of the work in rewriting Qt3D has been funded by our employer, KDAB, and also in the spare time of the above people. Recently we were fortunate enough to get some external funding from our friends at eCortex to help implement some of the missing functionality of Qt3D. This was a fantastic boost for us, because it allowed KDAB to have Paul Lemire focus primarily on Qt3D for an extended period without distraction in addition to facilitating an incredibly helpful API review.
Funding or Using Qt3D
If you wish to help contribute to Qt3D (or any other part of Qt) but you don’t have the time or resources to write patches, or if you just wish to invest in Qt3D with some R&D money you have left over, then please do consider funding us to do work on Qt3D on your behalf. Also, the best way to drive new features is to use a technology in the real world, so if you want to use Qt3D (or any other part of Qt) in your next project then please get in touch with us.
If you want to get involved directly with Qt3D or if you just want to try it out, then take a look at how to build Qt 5 from source and drop in to the #qt-3d channel on freenode. You will find a bunch of us in there most of the time. If you need help getting up to speed or if you want something to work on or need some guidance around the architecture please feel free to ping us in there or on the development mailing list. Please use the dev branch to build Qt3D.
We hope to release Qt3D 2.0 along with Qt 5.5.0 in the spring, but the more help or funding we can get with implementation, documentation, testing, and examples, the better shape Qt3D will be in for everybody. So thank you to everybody who has provided help and feedback so far. We are happy with the direction Qt3D is going in and we are really looking forward to an initial release and many more releases in the future.
Awesome work! Way better than Qt3D 1.0. Keep up the hard work. I can’t wait until the next article.
Thank you. Still plenty to do and rough edges to polish but we’re getting there I think.
I downloaded the source and ran the examples. They all seemed to work, although many of them crashed on closing.
On another note, does Qt3D support anti-aliasing out of the box? Does the frame graph support MSAA? I’m assuming that I could easily implement SSAA or FXAA in the frame graph?
Yeah, known issue. I was hoping to resolve it over Xmas but ended up tracking down an issue with qdoc. I have some time assigned this month and that bug is high on my list.
We will offer a property for enabling MSAA. Yes, supporting SSAA and FXAA will be trivial via the framegraph and a custom post-proc shader.
Sweet! This is definitely an exciting technology!
This is really exciting; I also like the component approach a lot.
All the best for Qt3D!
Can’t wait for next article 🙂
Wow, this is very exciting indeed and I am glad Qt3D is coming back stronger than ever! Excellent job KDAB!
Great work guys.
Nice work! I want to contribute my code and technology to KDAB’s Qt 3D work! I currently have Android, iOS and Windows Phone devices available for testing Qt 3D, and in addition I have some code I would like to bring to Qt 3D. So I want to know: is it possible to write a custom model format parser with Qt 3D? Should it be written on top of the Assimp library or against one of Qt 3D’s own interfaces? This format supports skeletal animation, but the animation is stored in a separate animation file. How can I join in with KDAB to contribute my code to Qt 3D?
Thank you! You are most welcome to join us improving Qt3D. We are developing Qt3D in the open as part of the overall Qt Project. You can usually find us hanging out on #qt-3d channel on freenode. Alternatively you can reach us via the development@qt-project.org mailing list.
Any help in making Qt3D work on Android, iOS or Windows Phone would be most welcome indeed. We think the necessary abstractions are in place but I’m sure there will be some build time and runtime issues to resolve.
We would also love to help you get your skeletal animation format supported for Qt3D 2.1. It’s something we have on the roadmap but we have yet to start development of a skeletal animation aspect so this would be a great area for you to dive into. Qt3D has hooks for loading custom mesh formats but we do not yet have anything to support skeletal animation. I don’t think this should be too hard to add in however.
Hope to see you online shortly!
Thank you for your reply!
At the moment, in addition to working on my current project, I want to look deeper into the awesome Qt 3D. I plan to download Qt 5.5 together with Qt 3D and try to learn some of the Qt 3D concepts. I’d be happy to go forward with KDAB and the Qt contributors!
Thanks for the great work. For my current project I need the integration of QtQuick 2 and Qt3D for UI implementation, as you already mentioned above. Do you have an example for that? The clues regarding QQuickFramebufferObject and QQuickRenderControl are not helpful, because I’m totally new to QtQuick and Qt3D. Any help would be great! Thanks in advance!
Wrap your 3D world in a Scene3D{} element and use this like any other Qt Quick 2 item. Take a look at the Scene3D example that ships with Qt3D.
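Roughly along these lines (a minimal sketch; MyScene stands in for the root Entity of your Qt3D scene and the import versions may differ for your Qt build):

import QtQuick 2.1
import QtQuick.Scene3D 2.0

Item {
    Scene3D {
        anchors.fill: parent
        focus: true

        MyScene { } // the root Entity of your Qt3D scene
    }
}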
Thanks, that was really quick!
Sadly I still did not get it working because now, in the combination of Qt3D and QtQuick, I get a message which says: “GLSL 1.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, and 3.00 ES”. Somehow it tries to compile a fragment shader. This happens only if I try to load files exported from Blender (*.obj). The above-mentioned example works well.
Seems you are trying to use a GLSL shader in one of the materials that is assuming OpenGL 3.2 but your system doesn’t have that. Can you track down which shader is giving that error?
The application output is “Failed to compile shader: []
QOpenGLShader::compile(Fragment) []
*** Problematic Fragment shader source code ***
#version 150
#define lowp
#define mediump
#define highp
#line 2
”
It happens when I apply no special material at all, and also when I apply the WireframeMaterial from your example. Thanks a lot for your help!
The wireframe material is expected to require OpenGL 3, but the default material should work anywhere. Please file a bug at bugreports.qt.io for this with a small test case and a description of your setup.
If I apply Phong Material it works… At least that…
Opened a bugreport. Here is the link: https://bugreports.qt.io/browse/QTBUG-45366
One more question: You wrote “Mouse support will come in the New Year”. Is it already possible to detect which object (/mesh) in a Scene3D was clicked?
It seems that I don’t have this header file: Qt3DQuick/quickwindow.h
What should I do?
The other includes are just fine.
It no longer exists as it was redundant. Please take a look at the examples in HEAD of the 5.5 branch to see how they are implemented now.
What great work on Qt3D! Thanks for making it a reality, especially with the Assimp-supported features. Anyway, I have tried to implement Assimp and successfully loaded many different 3D formats. But the problem is that I wasn’t able to load the textures correctly, even though I tried editing the sources of the AssimpParser as well. Do you have any idea what mistake I might have made here?
Best regards,
Bramastyo Harimukti
Paul Lemire is working on a patch that does exactly that. It should appear on gerrit and be merged shortly.
And here’s the patch if you want to try it https://codereview.qt-project.org/#/c/111317/3
Awesome! Thanks for your quick reply. Now it loads the textures very well! I have another question regarding how to run the animations embedded inside the 3D file. When I use assimp_viewer, such a feature is possible there. Is it also possible to do so with Qt 3D 2.0? Or is there any patch for this task? Anyway, I love this version of Qt3D a lot!
Best regards,
Bramastyo Harimukti
Nothing in place as yet for this. One possibility is to have the assimp loader expose the sub meshes as a palette we can select from to then apply animations to. Skeletal animations are not planned until after the initial release, unless somebody wants to step up and implement them of course, in which case we will gladly take any help we can get.
Thanks a lot, Mr. Harmer, for your kind answers and explanations. Is there any example so far which exposes the sub-meshes as a palette to be animated? And I have another question about transparency: is there any way to set up transparent rendering at the moment?
Best regards,
Bramastyo Harimukti
No example yet as this will be a new feature that needs implementing. Patches welcome.
To use transparency you need a custom framegraph that turns on blending and sorts primitives from back to front for the transparent object pass.
Hi Mr. Harmer,
I have another small question. Is it possible to replace the ObjLoader in QMesh with Assimp? And is there any example of dynamically specifying shaders with QML for models which are loaded through Assimp? It works with Mesh but not with SceneLoader. I am trying to implement this but have no idea how it works. Thanks a lot!
Best regards,
Bramastyo Harimukti
Not as yet. We’re investigating ways of selecting sub-objects loaded by assimp via the SceneLoader so that they can be rendered with custom materials etc.
Hi Mr. Harmer,
Thanks a lot for your reply. Yes, I realize that we can’t change the colour or shader when we are loading a model with SceneLoader as we can do with Mesh (ObjLoader). Could you please tell me why this approach is not the same as what Mesh does? I noticed that with the SceneLoader (AssimpParser) we are returning a QMesh pointer, but in Mesh we are simply returning the loaded mesh into QMesh. Is this one of the reasons behind it?
Thank you so much for your time!
Best regards,
Bramastyo Harimukti
We are researching how to get access to the constructed tree of elements. In fact, Laszlo from The Qt Company pushed an example around this today. https://codereview.qt-project.org/#/c/122323/
Hi Mr. Harmer,
thanks for all of your answers here. I have one small question regarding textures, though.
Basically, I need to import the image and convert the format into QImage::Format_RGBA8888_Premultiplied. From looking into some sources and examples, QAbstractTextureProvider requires a QTextureImage for its texture image, so how can I get the QImage into this QTextureImage? There seems to be no way to convert the format into premultiplied alpha. Is there any example that has done this before?
I need to do this because I have to use a PNG image with an alpha channel as the texture.
Thanks a lot for your time!
BR,
Bramastyo Harimukti
Hi Mr. Harmer,
never mind my previous question. I have solved the problem 😀
BR,
Bramastyo Harimukti Santoso
Is Qt3D handling the devicePixelRatio()? On the Mac, the viewport is clipped to the bottom-left quarter of the area. All the samples shipped have the same issue. Would love to know how to fix this problem.
I am not a professional software developer; rather, I am an aerospace engineer. I want to use Qt as the central framework for a new autonomous systems simulation I am building, largely because of the excellent toolset offered by Qt3D. However, after downloading the Qt 5.5 beta, I can get only a handful of the examples to work (nothing relying on exampleresources) and I am now growing very concerned about any further commitment to Qt. Should the examples be available and working right out of the box in the beta, or am I committing some newbie mistake?
Hi Dave,
What platform is this on please? The examples are working here for me on Linux and OS X. I know there have been some relatively late changes made to the examples that use the assets so it’s possible something got broken. Can you file a bug report at bugreports.qt.io please and we can take a look into this. Please be aware that Qt 5.5 ships with a technology preview of Qt3D so there will be gaps and broken pieces of functionality still. We aim for a stable release with Qt 5.6 at the end of this year.
How do I draw custom dynamic per-vertex data from C++ and QML? Is this supported from QML? Thanks
At present, by subclassing QAbstractMesh and providing a functor that provides your vertex data. This is on the short list for improving and extending in the next month or so.
Hi, I’m new to the Qt3D world and I didn’t use Qt3D 1.0. My question is really stupid, but how do I draw a simple line or point in 3D space using Qt3D? Thank you in advance.
At present you will need to create a custom piece of geometry by subclassing QAbstractMesh. We will be adding more such standard pieces of geometry as time goes on and making it efficient to draw very large numbers of them via instancing (where the underlying GPU supports it). Contributions welcome.
Thank you for your response. I am working on an application in civil engineering and I need to draw meshes based on nodes (points) and edges (lines), and have the capability to pick any node or edge for further processing. From your response I understand that this kind of drawing cannot be done efficiently in Qt3D as it is now?
It all depends upon how many buffers you put your points and lines into. It’s perfectly possible to render many tens or hundreds of thousands of vertices in one go from a single buffer. However, if you have many small buffers with a few points each, this is the area where we need some improvements (which are planned and designed). We are looking to implement these in the next 1 to 2 months.
Thank you for your quick response. One more question: is there an example or an article on how to subclass QAbstractMesh to make a custom piece of geometry?
Take a look at the Qt3D sources, QSphere for example.
You wrote “Mouse support will come in the New Year”.
So it is not possible to detect which object in a Scene3D was clicked?
Hi,
Very helpful tutorial.
But…
I have a question. I’m working with Scene3D and Entities inside it. Now, how do I pick one of the loaded models (meshes/entities) by clicking on them?
I create a MouseArea and receive x,y on click for now, but I can’t find any function in Scene3D that can return the entity at a given x,y.
Hi
maybe you need to have a look at the transformation component for that case, and try to add the transformation component to your entity’s components together with your loaded mesh.
BR,
Bramastyo Harimukti
Is there a way to get Qt3D working with QWidget (or QOpenGLWidget)? I have seen suggestions on using QWidget::createWindowContainer but have been unable to get it to work. Any sample code would be really helpful. Ideally, I would want Qt3D to work directly with widgets without having to create a QWindow.
Hi Ram,
I have the same problem. I also want to use OpenGL/Qt3D in a dialog window and run it in a widget.
I don’t have any solution for this problem; the only way I found was via a new window.
Have you found a solution in the meantime?
Create the QWindow and embed it into your widget hierarchy using QWidget::createWindowContainer().
Hello, I’m currently exploring Qt3D, especially the QML part, and have tried most of the Qt3D QML examples.
However, I fail to combine Scene3D and a RenderGraph with multiple viewports.
What I want to achieve is a Qt Quick application with a 3D scene with four viewports showing a little 3D world from different points of view on one side of my application’s screen,
and another screen area containing Qt Quick elements like buttons, text fields etc. to interact with and show information about the 3D scene.
I tried to combine the examples ‘multiviewport’ and ‘scene3d’.
When I add my custom 3D scene inside the multiviewport example, it works fine and I can watch my scene from four different points of view with the viewports arranged in a grid.
However, when I try to embed this 3D content in a Qt Quick application by putting the root Entity in a Scene3D element like it is done in the ‘scene3d’ example, my RenderGraph with four sub-viewports inside one main viewport is not rendered correctly: the four viewports are not displayed side by side in the main viewport, but one on top of another, all four filling the whole 3D screen area.
Can you give me a hint as to what I am doing wrong, or what is the right way to embed multiple views of the same 3D scene into a Qt Quick application?
Is the Scene3D element somehow limited to simple render graphs or only one viewport at a time?
That should work. It sounds like you’ve made a mistake adapting the framegraph to utilise multiple viewports. In fact the multiviewport example uses a Scene3D item so it must work.
Thank you for your quick answer. I’m pleased to hear that it should work to combine Scene3D and a multiviewport RenderGraph; however, I can’t find my mistake.
First, the multiviewport example I got with my Qt 5.5 installation does not use a Scene3D item. (Is there another version of this example perhaps?)
In my multiviewport example project, I have a main.qml file with an Entity as the root element, which contains a CameraLens element and another Entity, sceneRoot.
The sceneRoot Entity contains (amongst others) the multiviewport FrameGraph definition, four Entities for the cameras/viewports and an Entity with a SceneLoader – that’s where I put my custom 3D world (a custom QML file with a root Entity, some objects, some materials..) instead.
That’s the only change I made in this example project.
And this works fine! I can see my animated scene in four viewports arranged in a grid.
Now I try to modify the scene3d example to embed my multiviewport graph in a Qt Quick application.
My scene3d example’s main.qml file contains amongst others a Scene3D element containing an instance of a custom Entity AnimatedEntity.qml.
I added a MultiviewportEntity.qml file instead of AnimatedEntity.qml, where I put the whole content of the multiviewport example main.qml file – without changing anything in the multiviewport FrameGraph definition. (Of course, I also added the qml files defining my custom 3D world to the project)
Inside the Scene3D element I now add an instance of MultiviewportEntity.qml instead of AnimatedEntity.qml in the example code.
The result is that all viewports are displayed one over/inside each other, each filling the whole Scene3D area.
But I didn’t change anything in the FrameGraph definition of the multiviewport example; I only copy-pasted the whole multiviewport example’s main.qml content into MultiviewportEntity.qml.
Sorry, it’s impossible to see where the mistake is without seeing the code. For reference, the multiviewport example using a Scene3D element is here: http://code.qt.io/cgit/qt/qt3d.git/tree/examples/qt3d/multiviewport/main.qml. Please feel free to drop by the #qt-3d channel on IRC if you need help.
Seems that the problem is fixed. Running the same project in Qt Creator 5.6, everything works fine.
Hi,
I have tried to run the Qt3D example called “materials” and compiled it without a problem. But when I run the program, it can’t open a single .obj or .webp file.
(Qt3D.Renderer.Jobs: virtual Qt3D::QMeshDataPtr Qt3D::MeshFunctor::operator()() OBJ load failure for: “:/assets/chest/Chest.obj”
Failed to load image : “:/assets/textures/pattern_09/diffuse.webp”
Texture data is null, texture data failed to load)
I am pretty sure this can be traced back to QMesh.setSource() and QTextureImage.setSource(), but I don’t know where to go from here.
This is probably a very rookie mistake. Thanks in advance for your help.
YL
I actually solved the problem now. Very rookie mistake…
With apologies, YL
Out of interest, what was the issue?
I’m also facing the same issue. How did you fix it?
Hi YL,
I get the same message. How did you solve the problem? Maybe you could post your code. Thanks in advance for your help.
geaggle
Hi,
please help me! How can I use a Blender-created file (.3ds or .obj) with a PNG texture file?
thanks
nonamepalmer
Hi all,
I am using Qt 3D and having trouble importing and displaying my mesh from Blender, whether it is a .obj file or a .3ds file. I am using QML, and from the example code I can’t seem to create my own meshes. In the example’s main.cpp, where does the window.h file come from? Is it important for displaying 3D meshes?
Hi Sean Harmer,
Can you help me answer this question about drawing a massive amount of vertex buffer data from BVH tree nodes using Qt3D 2.0?
https://forum.qt.io/topic/60579/draw-a-massive-amount-of-gl-vertex-buffer-in-one-mesh-using-qt3d
Hello,
I’m using Qt Quick/QML with Qt 5.9, and I want to load a detailed 3D model (from a Collada file) using SceneLoader (Assimp) and then deal with separate objects (subparts of this model),
e.g. loading a “car.dae” model and then handling the left front wheel.
I managed to load the model, but I’m getting a runtime error when I try to access the subparts (entities) inside the model. I have tried “SceneHelper” from Assimp and the methods “entity(string name)” and “entityNames()” from SceneLoader.
The runtime error comes with the debug message “ASSERT: “entities.size() == 1″ in file io\qsceneloader.cpp, line 260”.
Can you please inform me whether this feature (accessing the subparts) is supported in Qt3D (Qt 5.9) or not.
with best regards