qtquick3d-architecture.qdoc
// Copyright (C) 2020 The Qt Company Ltd.
// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only

/*!
\page qtquick3d-architecture.html
\title Qt Quick 3D Architecture
\brief An overview of the architecture of Qt Quick 3D.

Qt Quick 3D extends Qt Quick to support the rendering of 3D content. It adds
extensive functionality, including several new public QML imports, as well as
a new internal scene graph and renderer. This document describes the
architecture of Qt Quick 3D, from the public API down to the details of how
the rendering pipeline works.

\section1 Module Overview

Qt Quick 3D consists of several modules and plugins that expose the
additional 3D APIs, as well as utilities for conditioning and importing
existing 3D assets.

\section2 QML Imports

\list
    \li QtQuick3D - The main import, which contains all the core components of
    Qt Quick 3D.
    \li \l{QtQuick3D.AssetUtils QML Types}{QtQuick3D.AssetUtils} - A library for importing 3D assets at runtime.
    \li \l{Qt Quick 3D Helpers QML Types}{QtQuick3D.Helpers} - A library of additional components that help
    with designing and debugging 3D scenes.
\endlist
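
In QML, these modules are brought in with ordinary import statements; for
example, a scene that uses runtime asset loading and the helper components
might begin with:

\code
import QtQuick
import QtQuick3D
import QtQuick3D.AssetUtils
import QtQuick3D.Helpers
\endcode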

\section2 C++ Libraries

\list
    \li \l{Qt Quick 3D C++ Classes}{QtQuick3D} - The only public C++ module.
    Contains the definitions of all types exposed to the QtQuick3D QML import,
    as well as a few C++ APIs.
    \list
        \li QQuick3DGeometry - Subclass to create procedural mesh data.
        \li QQuick3DTextureData - Subclass to create procedural texture data.
        \li QQuick3D::idealSurfaceFormat - Used to get the ideal surface format.
    \endlist
    \li \c QtQuick3DAssetImport - An internal and private library that aids in
    importing assets and converting them to QML.
    \li \c QtQuick3DRuntimeRender - An internal and private library that
    contains the spatial scene graph nodes and renderer.
    \li \c QtQuick3DUtils - An internal and private library used as a common
    utility library by all of the other C++ modules.
\endlist

\section2 AssetImporters Plugins
The asset import tooling is implemented using a plugin-based architecture. The
plugins shipped with Qt Quick 3D extend the functionality of the asset importer
library and tool, \l{Balsam Asset Import Tool}{Balsam}.
\list
    \li Assimp - This plugin uses the third-party library libAssimp to convert
    3D assets in 3D interchange formats to Qt Quick 3D QML components.
\endlist
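
For example, an asset in an interchange format can be converted offline with
the \l{Balsam Asset Import Tool}{Balsam} tool; the input file name here is
hypothetical:

\code
balsam car.gltf
\endcode

This generates a QML component (along with mesh and texture files) that can
then be used directly in a Qt Quick 3D scene.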

\section1 How Does Qt Quick 3D Fit into the Qt Graphics Stack?

\image quick3d-graphics-stack.drawio.svg

The diagram above illustrates how Qt Quick 3D fits into the larger Qt
graphics stack. Qt Quick 3D works as an extension to the 2D Qt Quick API, and
when using 3D scene items in conjunction with a View3D, the scene is
rendered via the Qt Rendering Hardware Interface (RHI). The RHI translates
API calls into the correct native rendering hardware API calls for
a given platform. The diagram shows the options available for
each platform. If no native backend is explicitly requested, Qt Quick
defaults to a sensible native backend for each platform.

The integration between the Qt Quick 3D components of the stack and the Qt
Quick stack is described in the following sections.
\section1 3D in 2D Integration

Displaying 3D content in 2D is the primary purpose of the Qt Quick 3D API. The
primary interface for integrating 3D content into 2D is the View3D component.

The View3D component works like any other QQuickItem-derived class with
content and implements the virtual function QQuickItem::updatePaintNode. Qt
Quick calls updatePaintNode for all "dirty" items in the Qt Quick scene graph
during the synchronization phase. This includes the 3D items managed by a
View3D, which also undergo their synchronization phase as a result of the
updatePaintNode call.

The updatePaintNode method of View3D performs the following actions:
\list
    \li Sets up a renderer and render target, if one doesn't exist already
    \li Synchronizes items in the 3D scene via the SceneManager
    \li Updates any "dynamic" textures that were rendered by Qt Quick (see the \l {Texture Path}{2D in 3D Texture path} below)
\endlist

The rendering of the 3D scene, however, does not occur in the View3D
updatePaintNode method. Instead, updatePaintNode returns a QSGNode subclass
containing the renderer for Qt Quick 3D, which renders the 3D scene during
the preprocess phase of the Qt Quick render process.

The plumbing for how Qt Quick 3D renders depends on which
View3D::renderMode is used:

\section2 Offscreen

The default mode for View3D is \l {View3D::renderMode}{Offscreen}. When using offscreen mode,
View3D becomes a texture provider by creating an offscreen surface and
rendering to it. This surface can be mapped as a texture in Qt Quick and
rendered with a QSGSimpleTextureNode.

This pattern is very close to how QSGLayerNodes already work in Qt Quick.
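
A minimal sketch of a View3D using the default offscreen mode; the camera
position and model are illustrative:

\code
View3D {
    anchors.fill: parent
    renderMode: View3D.Offscreen // the default

    PerspectiveCamera { z: 300 }
    DirectionalLight { }
    Model {
        source: "#Cube"
        materials: PrincipledMaterial { baseColor: "green" }
    }
}
\endcode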

\section2 Underlay

When using the \l {View3D::renderMode}{Underlay} mode, the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs as a result of the signal
QQuickWindow::beforeRenderPassRecording(), which means that everything else in
Qt Quick is rendered on top of the 3D content.

\section2 Overlay

When using the \l {View3D::renderMode}{Overlay} mode, the 3D scene is rendered directly to the
QQuickWindow containing the View3D. Rendering occurs as a result of the signal
QQuickWindow::afterRenderPassRecording(), which means that the 3D content is
rendered on top of all other Qt Quick content.

\section2 Inline

The \l {View3D::renderMode}{Inline} render mode uses QSGRenderNode, which enables direct
rendering to Qt Quick's render target without using an offscreen surface. It
does this by injecting the render commands inline during the 2D rendering of
the Qt Quick scene.

This mode can be problematic because it uses the same depth buffer as the
Qt Quick renderer, and z values mean completely different things in Qt Quick
than in Qt Quick 3D.

\section1 2D in 3D Integration

When rendering a 3D scene, there are many scenarios where 2D elements need to
be embedded into 3D. There are two different ways to integrate 2D content
inside of 3D scenes, each of which has its own path to the screen.

\section2 Direct Path

The direct path is used to render 2D Qt Quick content as if it existed as a
flat item in the 3D scene. For example, consider the following scene
definition:

\code
Node {
    Text {
        text: "Hello world!"
    }
}
\endcode

What happens here is: when a child component of type QQuickItem is set on a
spatial node, it is first wrapped by a QQuick3DItem2D, which is just a
container that adds 3D coordinates to a 2D item.
This sets the base 3D transformation for how all further 2D children are
rendered, so that they appear correctly in the 3D scene.

When the time comes to render the scene, these 2D items' QSGNodes are passed to
the Qt Quick renderer to generate the appropriate render commands. Because the
commands are recorded inline and take the current 3D transformation into
consideration, the items are rendered exactly the same as in the 2D renderer, but
show up as if they were rendered in 3D.

The drawback of this approach is that no lighting information of the 3D scene
can be used to shade the 2D content, because the Qt Quick 2D renderer has no
concept of lighting.

\section2 Texture Path

The texture path uses a 2D Qt Quick scene to create dynamic texture
content. Consider the following Texture definition:

\code
Texture {
    sourceItem: Item {
        width: 256
        height: 256
        Text {
            anchors.centerIn: parent
            text: "Hello World!"
        }
    }
}
\endcode

This approach works in the same way that Layer items work in Qt Quick:
everything is rendered to an offscreen surface the size of the top-level Item,
and that offscreen surface is then usable as a texture that can be reused
elsewhere.

This Texture can then be used by materials in the scene to render Qt Quick
content on items.
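
For instance, such a texture can be assigned to a material map so that the
live Qt Quick content is sampled when the model is shaded; a sketch:

\code
Model {
    source: "#Rectangle"
    materials: PrincipledMaterial {
        baseColorMap: Texture {
            sourceItem: Item {
                width: 256
                height: 256
                Text {
                    anchors.centerIn: parent
                    text: "Hello World!"
                }
            }
        }
    }
}
\endcode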

\section1 Scene Synchronization

\section2 Scene Manager

The scene manager in Qt Quick 3D is responsible for keeping track of spatial
items in a 3D scene, and for making sure that items update their
corresponding scene graph nodes during the synchronization phase. In Qt Quick,
this role is performed by QQuickWindow for the 2D case. The scene manager is
the primary interface between the frontend nodes and the backend scene graph
objects.

Each View3D item has at least one scene manager, as one is created and
associated with the built-in scene root on construction. When spatial nodes
are added as children of the View3D, they are registered with the View3D's
scene manager. When using an imported scene, a second SceneManager is created
(or referenced, if one exists already) to manage the nodes that are not direct
children of the View3D. This is needed because, unlike the View3D, an
imported scene doesn't exist on a QQuickWindow until it is referenced. The
additional SceneManager makes sure that assets belonging to the imported
scene are created at least once per QQuickWindow they are referenced in.

While the scene manager is an internal API, it is important to know that the
scene manager is responsible for calling updateSpatialNode on all objects that
have been marked dirty by a call to the update() method.

\section2 Frontend/Backend Synchronization

The objective of synchronization is to make sure that the state set on the
frontend (Qt Quick) matches what is set on the backend (the Qt Quick Spatial
Scene Graph renderer). By default, the frontend and backend live in separate
threads: the frontend in the Qt main thread, and the backend in Qt Quick's
render thread. The synchronization phase is where the main thread and render
thread can safely exchange data. During this phase, the scene manager calls
updateSpatialNode for each dirty node in the scene. This either creates a new
backend node or updates an existing one for use by the renderer.

\section2 Qt Quick Spatial Scene Graph

Qt Quick 3D is designed to use the same frontend/backend separation pattern
as Qt Quick: frontend objects are controlled by the Qt Quick engine, while
backend objects contain state data for rendering the scene. Frontend objects
inherit from QObject and are exposed to the Qt Quick engine. Items in QML
source files map directly to frontend objects.

As the properties of these frontend objects are updated, one or more backend nodes
are created and placed into a scene graph. Because rendering 3D scenes
involves a lot more state than rendering 2D, there is a separate set of specialized scene
graph nodes for representing the state of the 3D scene objects.
This scene graph is known as the Qt Quick Spatial Scene Graph.

Both the frontend objects and backend nodes can be categorized into two classes.
The first is spatial, in the sense that these items exist somewhere in 3D space.
In practice, this means that each of these types contains a transform
matrix. For spatial items, the parent-child relationship is significant because
each child item inherits the transform of its parents.

The other class of items are resources. Resource items do not have a position
in 3D space, but rather are just state used by other items. There can
be a parent-child relationship between these items, but it has no meaning
other than ownership.

Unlike the 2D scene graph in Qt Quick, the spatial scene graph exposes resource
nodes directly to the user. For example, in Qt Quick, while QSGTexture is
public API, there is no QQuickItem that exposes this object directly. Instead,
the user must either use an Image item, which describes both where the texture
comes from as well as how to render it, or write C++ code to operate on the
QSGTexture itself. In Qt Quick 3D, these resources are exposed directly in the
QML API. This is necessary because resources are an important part of the scene
state. These resources can be referenced by many objects in the scene: for
example, many Materials could use the same Texture. It is also possible to
set properties of a Texture at runtime that directly change how the texture
is sampled, for example.
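
As a sketch, two models can reference the same Texture resource by id, so a
single backend texture serves both materials (the image file name is
hypothetical):

\code
Texture {
    id: sharedTexture
    source: "pattern.png"
}
Model {
    source: "#Cube"
    materials: PrincipledMaterial { baseColorMap: sharedTexture }
}
Model {
    source: "#Sphere"
    x: 150
    materials: PrincipledMaterial { baseColorMap: sharedTexture }
}
\endcode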

\section3 Spatial Objects

All spatial objects are subclasses of the Node component, which contains the
properties defining the position, rotation, and scale in 3D space.

\list
    \li \l [QtQuick3D QML] {Node}{Node}
    \li \l [QtQuick3D QML] {Light}{Light}
    \list
        \li DirectionalLight
        \li PointLight
        \li SpotLight
    \endlist
    \li \l [QtQuick3D QML] {Camera}{Camera}
    \list
        \li PerspectiveCamera
        \li OrthographicCamera
        \li FrustumCamera
        \li CustomCamera
    \endlist
    \li \l [QtQuick3D QML] {Model}{Model}
    \li Loader3D
    \li Repeater3D
    \li \l [QtQuick3D QML] {Skeleton}{Skeleton}
    \li \l [QtQuick3D QML] {Joint}{Joint}
\endlist
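
The transform inheritance can be illustrated with a small hierarchy: moving or
rotating the parent node moves both child models together (the values are
illustrative):

\code
Node {
    position: Qt.vector3d(0, 100, 0)
    eulerRotation.y: 45

    Model {
        source: "#Cube"
        materials: PrincipledMaterial { }
    }
    Model {
        source: "#Sphere"
        x: 150 // offset relative to the parent Node
        materials: PrincipledMaterial { }
    }
}
\endcode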

\section3 Resource Objects

Resource objects are subclasses of the Object3D component. Object3D is just a
QObject subclass with some special helpers for use with the scene manager.
Resource objects do have parent/child associations, but these are mostly useful
for resource ownership.

\list
    \li \l [QtQuick3D QML] {Texture}{Texture}
    \li \l [QtQuick3D QML] {TextureData}{TextureData}
    \li \l [QtQuick3D QML] {Geometry}{Geometry}
    \li \l [QtQuick3D QML] {Material}{Material}
    \list
        \li DefaultMaterial
        \li PrincipledMaterial
        \li CustomMaterial
    \endlist
    \li \l [QtQuick3D QML] {Effect}{Effect}
    \li SceneEnvironment
\endlist

\section3 View3D and Render Layers

With regard to the frontend/backend separation, View3D is the separation
point from the user's perspective, because a View3D defines what scene
content to render. In the Qt Quick Spatial Scene Graph, the root node for a
scene that will be rendered is a Layer node. Layer nodes are created by the
View3D using a combination of the View3D's properties and the properties
of the SceneEnvironment. When rendering a scene for a View3D, it is this Layer
node that is passed to the renderer.

\section1 Scene Rendering

\image qtquick3d-rendergraph.drawio.svg

\section2 Set up Render Target

The first step in the rendering process is to determine and set up the scene
render target. Depending on which properties are set in the SceneEnvironment,
the actual render target will vary. The first decision is whether content is
being rendered directly to a window surface, or to an offscreen texture.
By default, View3D renders to an offscreen texture. When using post-processing
effects, rendering to an offscreen texture is mandatory.

Once a scene render target is determined, some global states are set:
\list
    \li window size - if rendering to a window
    \li viewport - the size of the scene area being rendered
    \li scissor rect - the subset of a window that the viewport should be
    clipped to
    \li clear color - what color to clear the render target with, if any
\endlist

\section2 Prepare for Render

The next stage of rendering is the prepare stage, where the renderer does the
housekeeping needed to figure out what should be rendered for a given frame,
and to ensure that all necessary resources are available and up to date.

The prepare stage itself has two phases: the high-level preparation of
determining what is to be rendered and what resources are needed; and the
low-level preparation that uses the RHI to actually set up rendering pipelines
and buffers, as well as setting up the rendering dependencies of the main
scene pass.

\section3 High-level render preparation

The purpose of this phase is to extract the state of the spatial scene graph
into something that can be used to create render commands. In essence, the
renderer creates lists of geometry and material combinations to render from
the perspective of a single camera with a set of lighting states.

The first thing that is done is to determine the global common state for all
content. If the SceneEnvironment defines a \l {SceneEnvironment::lightProbe}{lightProbe}, the renderer checks whether the
environment map associated with that light probe texture is loaded, and if it's
not, a new environment map is loaded or generated. The generation of an
environment map is itself a set of passes that convolve the source texture
into a cube map. This cube map contains both specular reflection information
and irradiance, which is used for material shading.

Next, the renderer needs to determine which camera in the
scene to use. If an active camera is not explicitly defined by a View3D, the
first camera available in the scene is used. If there are no cameras
in the scene, then no content is rendered and the renderer bails out.

With a camera determined, it is possible to calculate the projection matrix
for this frame. The calculation is done at this point because each renderable
needs to know how to be projected. This also means that it is now possible to
calculate which renderable items should be rendered. Starting with the list of
all renderable items, we remove all items that are not visible because they
are either disabled or completely transparent. Then, if frustum culling is
enabled on the active camera, each renderable item is checked to see if it is
completely outside of the camera's frustum, and if so it is removed from the
renderable list.
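
Frustum culling is opt-in and is enabled per camera; for example:

\code
PerspectiveCamera {
    z: 600
    frustumCullingEnabled: true // skip items fully outside the view frustum
}
\endcode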

In addition to the camera projection, the camera direction is also calculated,
as it is necessary for the lighting calculations in the shading code.

If there are light nodes in the scene, these are gathered into a list, up to
the maximum number of lights the renderer supports. If more light nodes exist
in the scene than that limit, any additional light nodes are ignored and don't
contribute to the lighting of the scene. It is possible to specify the scope
of light nodes, but note that even when setting a scope, the lighting state of
each light is still sent to every material that has lighting; for lights not
in scope, the brightness is set to 0, so in practice those lights do not
contribute to the lighting of those materials.
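
The scope of a light is set with its \c scope property; in this sketch, only
the node \c car and its children are lit by the spot light:

\code
Node {
    id: car
    // ... models making up the car
}
SpotLight {
    scope: car
}
\endcode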

Now, with a hopefully shorter list of renderables, each of these items needs
to be updated to reflect the current state of the scene. For each renderable,
we check that a suitable material is loaded, and if not, a new one is created.
A material is a combination of shaders and a rendering pipeline, and it is
needed for creating a draw call. In addition, the renderer makes sure that any
resources needed to render a renderable are loaded, for example geometry and
textures set on the Model. Resources that are not loaded already are
loaded here.

The renderables list is then sorted into three lists:
\list
    \li Opaque items: these are sorted from front to back, or in other words
    from items that are closest to the camera to items that are furthest from
    the camera. This is done to take advantage of hardware occlusion culling or
    early z detection in the fragment shader.
    \li 2D items: these are Qt Quick items that are rendered by the Qt Quick
    renderer.
    \li Transparent items: these are sorted from back to front, or in other
    words from items that are farthest from the camera to items that are
    nearest to the camera. This is done because transparent items need to be
    blended with all items that are behind them.
\endlist

\section3 Low-level render preparation

Now that everything that needs to be considered for this frame has been
determined, the plumbing and dependencies for the main render pass can be
addressed. The first thing that is done in this phase is to render any
pre-passes that are required for the main pass.

\list
    \li Render DepthPass - Certain features like screen space ambient occlusion
    and shadowing require a depth pre-pass. This pass consists of all opaque
    items being rendered to a depth texture.

    \li Render SSAOPass - The objective of the screen space ambient occlusion
    pass is to generate an ambient occlusion texture. This texture is used
    later by materials to darken certain areas when shading.

    \li Render ShadowPasses - Each light in the scene that has shadows enabled
    contributes an additional shadow pass. There are two different shadowing
    techniques employed by the renderer, so depending on the light types there
    will be different passes. When rendering shadows from a directional light,
    the scene is rendered to a 2D occlusion texture from a combination of the
    directional light's direction and the size of the camera frustum. When
    rendering shadows from a point or spot light, the light's occlusion texture
    is a cube map representing the occlusion contribution relative to each face
    direction of the light.

    \li Render ScreenTexture - This pass only occurs when using a
    CustomMaterial that requires a screen texture, which can be used for
    rendering techniques such as refraction. This pass works like a depth pass,
    but instead renders all opaque items to a color texture.
\endlist

After the dependency renders are done, the rest of the passes are prepared but
not rendered. This preparation involves taking the state gathered in the
high-level prepare stage and translating it to graphics primitives: creating
or updating uniform buffer values, associating samplers with dependency
textures, setting up shader resource bindings, and everything else involved in
creating the pipeline state necessary for performing a draw call.

\section2 Scene Rendering

Now that the hard work of preparation is done, the easy part is running the
commands that contribute to the main scene's content. That rendering works
in this order:

\list
    \li Clear Pass - This isn't really a pass, but depending on what
    backgroundMode is set on the SceneEnvironment, different things can happen
    here. If the background mode is either transparent or color, then the color
    buffer is cleared with either transparency or the specified color. If,
    however, the background mode is set to SkyBox, then a pass is run that
    renders the skybox from the perspective of the camera, which also fills the
    buffer with initial data.

    \li Opaque Pass - Next, all opaque items are drawn. This just involves
    setting the pipeline state and running the draw command for each item in
    list order, since the items are already sorted at this point.

    \li 2D Pass - If there are any 2D items in the scene, the Qt Quick
    renderer is invoked to generate the render commands necessary to render
    those items.

    \li Transparent Pass - Finally, the transparent items in the scene are
    rendered one by one in the same manner as the opaque items.
\endlist

This concludes the rendering of the scene.
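
The clear behavior is controlled from the SceneEnvironment; for example (the
HDR image file name is hypothetical):

\code
environment: SceneEnvironment {
    backgroundMode: SceneEnvironment.SkyBox
    lightProbe: Texture {
        source: "sky.hdr"
    }
}
\endcode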

\section2 Post-Processing

If any post-processing functionality is enabled, then it can be assumed that
the result of the scene renderer is a texture that serves as the input for the
post-processing phase. All post-processing methods are additional passes that
operate on this scene input texture.

All steps of the post-processing phase are optional, and if no built-in
features and no user-defined effects are enabled, the output of the scene
render is what is used by the final render target. Note, however, that
\l{ExtendedSceneEnvironment::tonemapMode}{tonemapping} is enabled by default.

\image qtquick3d-postprocess-graph.drawio.svg

\section3 Built-in Post-Processing

\l ExtendedSceneEnvironment and its parent type \l SceneEnvironment offer the
most common effects used in 3D scenes, as well as the tonemapping that is used
to map the high dynamic range color values generated by the renderer to the
0-1 LDR range. The effects include depth of field, glow/bloom, lens flare,
vignette, color adjustment and grading, fog, and ambient occlusion.
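
A sketch of enabling some of these built-in effects; the chosen values are
illustrative:

\code
environment: ExtendedSceneEnvironment {
    glowEnabled: true
    vignetteEnabled: true
    tonemapMode: SceneEnvironment.TonemapModeFilmic
}
\endcode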

\section3 Post-Processing Effects

Applications can specify their own custom post-processing effects as an ordered
list in the SceneEnvironment::effects property. When this list is non-empty,
the effects in it are applied \e before the built-in effects provided by \l
ExtendedSceneEnvironment. Each post-processing effect is part of a chain such
that the output of the previous effect is the input for the next. The first
effect in this chain gets its input directly from the output of the scene
renderer step. It is also possible for effects to access the depth texture
output of the scene renderer.
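
A minimal custom effect consists of a fragment shader wrapped in a pass; the
shader file name here is hypothetical:

\code
environment: SceneEnvironment {
    effects: [
        Effect {
            passes: Pass {
                shaders: Shader {
                    stage: Shader.Fragment
                    shader: "myeffect.frag"
                }
            }
        }
    ]
}
\endcode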

Each effect in this process can consist of multiple sub-passes, which means it
is possible to render content into intermediate buffers. The final pass of a
multi-pass effect is expected to output a single texture containing the color
data to be used by the next steps of the post-processing phase.

\section3 Temporal and Progressive Antialiasing

The temporal and progressive antialiasing steps are optionally enabled by
setting properties in the SceneEnvironment. While not strictly a part of the
post-processing phase, the actual results of temporal and progressive
antialiasing are realized during the post-processing phase.

Temporal antialiasing is performed when a scene is being actively updated.
When enabled, the active camera makes very small adjustments to the camera
direction for each frame while drawing the scene. The current frame is then
blended with the previously rendered frame to smooth out what was rendered.

Progressive antialiasing is only performed when a scene is not being updated.
When enabled, an update is forced and the current state of the scene is
rendered with very small adjustments to the active camera's direction. Up to 8
frames are accumulated and blended together with pre-defined weights. This has
the effect of smoothing out a non-animating scene, but comes at a performance
cost because several extra frames are rendered for each update.

\section3 Super Sampling Antialiasing (SSAA)

Super sampling antialiasing is a brute-force way of smoothing out a scene. It
works by rendering to a texture that is a multiple of the requested size of
the scene, and then downsampling it to the target size. For example, if 2X
SSAA is requested, the scene is rendered to a texture that is twice the
intended size and then downsampled as part of this phase. This can have a huge
impact on performance and resource usage, so it should be avoided if possible.
It's also possible for the View3D size to be too large to use this method,
since the texture needed may be larger than what is supported by the rendering
hardware.
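
These antialiasing methods are configured on the SceneEnvironment; an
illustrative combination:

\code
environment: SceneEnvironment {
    antialiasingMode: SceneEnvironment.SSAA
    antialiasingQuality: SceneEnvironment.High
    temporalAAEnabled: true
}
\endcode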

*/