qtquick3d-custom.qdoc
1// Copyright (C) 2020 The Qt Company Ltd.
2// SPDX-License-Identifier: LicenseRef-Qt-Commercial OR GFDL-1.3-no-invariants-only
3
4/*!
5\page qtquick3d-custom.html
6\title Programmable Materials, Effects, Geometry, and Texture data
7\brief Custom materials, effects, geometry and texture data providers in Qt Quick 3D
8
9While the built-in materials of Qt Quick 3D, \l DefaultMaterial and \l PrincipledMaterial,
10allow a wide degree of customization via their properties, they do not provide
11programmability on the vertex and fragment shader level. To allow that, the \l
12CustomMaterial type is provided.
13
14\table
15\header
16\li A model with PrincipledMaterial
17\li With a CustomMaterial transforming the vertices
18\row
19\li \image quick3d-custom-mat1.jpg
20\li \image quick3d-custom-mat2.jpg
21\endtable
22
23Post-processing effects, where one or more passes of processing on the color buffer are
24performed, optionally taking the depth buffer into account, before the View3D's output is
25passed on to Qt Quick, also exist in two varieties:
26\list
27\li built-in post-processing steps that can be configured via \l ExtendedSceneEnvironment, such as
28glow/bloom, depth of field, vignette, lens flare,
\li \c custom effects implemented by the application in the form of fragment shader code and a
30specification of the processing passes in an \l Effect object.
31\endlist
32
33In practice there is a third category of post-processing effects: 2D effects
34implemented via Qt Quick, operating on the output of the \l View3D item without
35any involvement from the 3D renderer. For example, to apply a blur to a \l
36View3D item, the simplest approach is to use Qt Quick's existing facilities,
37such as \l MultiEffect. The 3D post-processing system becomes beneficial for
complex effects that involve 3D scene concepts such as the depth buffer or the
screen texture, that need to deal with HDR tonemapping, or that need multiple
passes with intermediate buffers. Simple 2D effects that do not require any
41insight into the 3D scene and renderer can always be implemented with \l
42ShaderEffect or \l MultiEffect instead.
43
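For instance, blurring the output of a \l View3D with \l MultiEffect could look roughly like
the following sketch (MultiEffect lives in the QtQuick.Effects module; the property values
here are illustrative):

\qml
import QtQuick
import QtQuick.Effects
import QtQuick3D

Item {
    View3D {
        id: view3d
        anchors.fill: parent
        // ... the 3D scene ...
    }
    MultiEffect {
        // draws a blurred copy of the View3D on top of it
        source: view3d
        anchors.fill: view3d
        blurEnabled: true
        blur: 1.0
    }
}
\endqml
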
44\table
45\header
46\li Scene without effect
47\li The same scene with a custom post-processing effect applied
48\row
49\li \image quick3d-custom-effect1.jpg
50\li \image quick3d-custom-effect2.jpg
51\endtable
52
In addition to programmable materials and post-processing, there are two types of data that are
normally provided in the form of files (\c{.mesh} files or images such as \c{.png}):
55
56\list
57
58\li vertex data, including the geometry for the mesh to be rendered, texture coordinates,
59normals, colors, and other data,
60
61\li the content for textures that are then used as texture maps for the rendered
62objects, or used with skybox or image based lighting.
63
64\endlist
65
If they so wish, applications can provide such data from C++ in the form of a QByteArray. Such
data can also be changed over time, making it possible to procedurally generate, and later
alter, the data for a \l Model or \l Texture.
69
70\table
71\header
72\li A grid, rendered by specifying vertex data dynamically from C++
73\li A cube textured with image data generated from C++
74\row
75\li \image quick3d-custom-geom.jpg
76\li \image quick3d-custom-tex.jpg
77\endtable
78
These four approaches to customizing materials, effects, geometry, and textures, and to making
them dynamic, enable programmable shading and procedural generation of the data the
shaders receive as their input. The following sections provide an overview of these
82features. The full reference is available in the documentation pages for the respective
83types:
84
85\table
86\header
87\li Feature
88\li Reference Documentation
89\li Relevant Examples
90\row
91\li Custom materials
92\li \l CustomMaterial
93\li \l {Qt Quick 3D - Custom Shaders Example}, \l {Qt Quick 3D - Custom Materials
94Example}
95\row
96\li Custom post-processing effects
97\li \l Effect
98\li \l {Qt Quick 3D - Custom Effect Example}
99\row
100\li Custom geometry
101\li \l QQuick3DGeometry, \l{Model::geometry}
102\li \l {Qt Quick 3D - Custom Geometry Example}
103\row
104\li Custom texture data
105\li \l QQuick3DTextureData, \l{Texture::textureData}
106\li \l {Qt Quick 3D - Procedural Texture Example}
107\endtable
108
109\section1 Programmability for Materials
110
111Let's have a scene with a cube, and start with a default \l PrincipledMaterial and
112\l CustomMaterial:
113
114\table
115\header
116\li PrincipledMaterial
117\li CustomMaterial
118\row
119\li
120 \qml
121 import QtQuick
122 import QtQuick3D
123 Item {
124 View3D {
125 anchors.fill: parent
126 environment: SceneEnvironment {
127 backgroundMode: SceneEnvironment.Color
128 clearColor: "black"
129 }
130 PerspectiveCamera { z: 600 }
131 DirectionalLight { }
132 Model {
133 source: "#Cube"
134 scale: Qt.vector3d(2, 2, 2)
135 eulerRotation.x: 30
136 materials: PrincipledMaterial { }
137 }
138 }
139 }
140 \endqml
141\li
142 \qml
143 import QtQuick
144 import QtQuick3D
145 Item {
146 View3D {
147 anchors.fill: parent
148 environment: SceneEnvironment {
149 backgroundMode: SceneEnvironment.Color
150 clearColor: "black"
151 }
152 PerspectiveCamera { z: 600 }
153 DirectionalLight { }
154 Model {
155 source: "#Cube"
156 scale: Qt.vector3d(2, 2, 2)
157 eulerRotation.x: 30
158 materials: CustomMaterial { }
159 }
160 }
161 }
162 \endqml
163\endtable
164
These both lead to the exact same result, because a \l CustomMaterial with no vertex or
fragment shader code added to it is effectively a \l PrincipledMaterial.
167
168\image quick3d-custom-cube1.jpg
169
\note Properties such as \l{PrincipledMaterial::baseColor}{baseColor},
\l{PrincipledMaterial::metalness}{metalness},
\l{PrincipledMaterial::baseColorMap}{baseColorMap}, and many others have no equivalent
properties in the \l CustomMaterial QML type. This is by design: customizing the material
is done via shader code, not by merely providing a few fixed values.
175
176\section2 Our first vertex shader
177
178Let's add a custom vertex shader snippet. This is done by referencing a file in the
179\l{CustomMaterial::vertexShader}{vertexShader} property. The approach will be the same for
180fragment shaders. These references work like \l{Image::source}{Image.source} or
181\l{ShaderEffect::vertexShader}{ShaderEffect.vertexShader}: they are local or \c qrc URLs,
182and a relative path is treated relative to the \c{.qml} file's location. The common
183approach is therefore to place the \c{.vert} and \c{.frag} files into the Qt resource
184system (\c qt_add_resources when using CMake) and reference them using a relative path.
185
As of Qt 6.0, inline shader strings are no longer supported in either Qt Quick or Qt
Quick 3D. (Note that these properties are URLs, not strings.) However, due
to their intrinsically dynamic nature, custom materials and post-processing effects in Qt
Quick 3D still provide shader snippets in source form in the referenced files. This differs
from \l ShaderEffect, where the shaders are complete on their own, with no further
amending by the engine, and so are expected to be provided as pre-conditioned \c{.qsb}
shader packs.
193
194\note In Qt Quick 3D URLs can only refer to local resources. Schemes for remote content
195are not supported.
196
197\note The shading language used is Vulkan-compatible GLSL. The \c{.vert} and \c{.frag}
files are not complete shaders on their own, hence they are often referred to as \c snippets. That is
199why there are no uniform blocks, input and output variables, or sampler uniforms provided
200directly by these snippets. Rather, the Qt Quick 3D engine will amend them as appropriate.
201
202\table
203\header
204\li Change in main.qml, material.vert
205\li Result
206\row
207 \li \qml
208 materials: CustomMaterial {
209 vertexShader: "material.vert"
210 }
211 \endqml
212 \badcode
213 void MAIN()
214 {
215 }
216 \endcode
217 \li \image quick3d-custom-cube1-small.jpg
218\endtable
219
220A custom vertex or fragment shader snippet is expected to provide one or more functions
221with pre-defined names, such as \c MAIN, \c DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c
222SPOT_LIGHT, \c AMBIENT_LIGHT, \c SPECULAR_LIGHT. For now let's focus on \c MAIN.
223
224As shown here, the end result with an empty MAIN() is exactly the same as before.
225
226Before making it more interesting, let's look at an overview of the most commonly used
227special keywords in custom vertex shader snippets. This is not the full list. For a full
228reference, check the \l CustomMaterial page.
229
230\table
231\header
232\li Keyword
233\li Type
234\li Description
235\row
236\li MAIN
237\li
\li void MAIN() is the entry point. This function must always be present in a custom
vertex shader snippet; there is no point in providing one otherwise.
240\row
241\li VERTEX
242\li vec3
243\li The vertex position the shader receives as input. A common use case for vertex shaders
244in custom materials is to change (displace) the x, y, or z values of this vector, by simply
245assigning a value to the whole vector, or some of its components.
246\row
247\li NORMAL
248\li vec3
249\li The vertex normal from the input mesh data, or all zeroes if there were no normals provided.
250As with VERTEX, the shader is free to alter the value as it sees fit. The altered value is then
251used by the rest of the pipeline, including the lighting calculations in the fragment stage.
252\row
253\li UV0
254\li vec2
255\li The first set of texture coordinates from the input mesh data, or all zeroes if there
were no UV values provided. As with VERTEX and NORMAL, the value can be altered.
257\row
258\li MODELVIEWPROJECTION_MATRIX
259\li mat4
\li The model-view-projection matrix. To unify the behavior regardless of which graphics API
rendering happens with, all vertex data and transformation matrices follow OpenGL conventions
at this level (Y axis pointing up, OpenGL-compatible projection matrix). Read only.
263\row
264\li MODEL_MATRIX
265\li mat4
266\li The model (world) matrix. Read only.
267\row
268\li NORMAL_MATRIX
269\li mat3
270\li The transposed inverse of the top-left 3x3 slice of the model matrix. Read only.
271\row
272\li CAMERA_POSITION
273\li vec3
274\li The camera position in world space. In the examples on this page this is \c{(0, 0, 600)}. Read only.
275\row
276\li CAMERA_DIRECTION
277\li vec3
278\li The camera direction vector. In the examples on this page this is \c{(0, 0, -1)}. Read only.
279\row
280\li CAMERA_PROPERTIES
281\li vec2
282\li The near and far clip values of the camera. In the examples on this page this is \c{(10, 10000)}. Read only.
283\row
284\li POINT_SIZE
285\li float
286\li Relevant only when rendering with a topology of points, for example because the
287\l{QQuick3DGeometry}{custom geometry} provides such a geometry for the mesh. Writing to
288this value is equivalent to setting \l{PrincipledMaterial::pointSize}{pointSize on a
289PrincipledMaterial}.
290\row
291\li POSITION
292\li vec4
293\li Like \c gl_Position. When not present, a default assignment statement is generated
294automatically using \c MODELVIEWPROJECTION_MATRIX and \c VERTEX. This is why an empty
295MAIN() is functional, and in most cases there will be no need to assign a custom value to
296it.
297\endtable
298
Let's make a custom material that displaces the vertices according to some pattern. To
make it more interesting, let's add some animated QML properties, the values of which end up
being exposed as uniforms in the shader code. (To be precise, most properties are going to
be mapped to members of a uniform block, backed by a uniform buffer at run time, but Qt
Quick 3D conveniently makes such details transparent to the custom material author.)
304
305\table
306\header
307\li Change in main.qml, material.vert
308\li Result
309\row
310 \li \qml
311 materials: CustomMaterial {
312 vertexShader: "material.vert"
313 property real uAmplitude: 0
314 NumberAnimation on uAmplitude {
315 from: 0; to: 100; duration: 5000; loops: -1
316 }
317 property real uTime: 0
318 NumberAnimation on uTime {
319 from: 0; to: 100; duration: 10000; loops: -1
320 }
321 }
322 \endqml
323 \badcode
324 void MAIN()
325 {
326 VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
327 }
328 \endcode
329 \li \image quick3d-custom-cube2-anim.gif
330\endtable
331
332\section2 Uniforms from QML properties
333
334Custom properties in the CustomMaterial object get mapped to uniforms. In the above
example this includes \c uAmplitude and \c uTime. Any time the values change, the updated
values become visible in the shader. This concept may already be familiar from \l
337ShaderEffect.
338
339The name of the QML property and the GLSL variable must match. There is no separate
340declaration in the shader code for the individual uniforms. Rather, the QML property name
341can be used as-is. This is why the example above can just reference \c uTime and \c
342uAmplitude in the vertex shader snippet without any previous declaration for them.
343
344The following table lists how the types are mapped:
345
346\table
347\header
348\li QML Type
349\li Shader Type
350\li Notes
351\row
352\li real, int, bool
353\li float, int, bool
354\li
355\row
356\li color
357\li vec4
358\li sRGB to linear conversion is performed implicitly
359\row
360\li vector2d
361\li vec2
362\li
363\row
364\li vector3d
365\li vec3
366\li
367\row
368\li vector4d
369\li vec4
370\li
371\row
372\li matrix4x4
373\li mat4
374\li
375\row
376\li quaternion
377\li vec4
378\li scalar value is \c w
379\row
380\li rect
381\li vec4
382\li
383\row
384\li point, size
385\li vec2
386\li
387\row
388\li TextureInput
389\li sampler2D
390\li
391\endtable
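
To illustrate the mapping, here is a hypothetical material (the property names, values, and
the file name are made up for this example) exposing a \c real and a \c vector3d property,
both of which are then referenced directly in the vertex shader snippet, without declaring
them there:

\qml
materials: CustomMaterial {
    vertexShader: "material.vert"
    property real uScale: 1.5                        // exposed as a float uniform
    property vector3d uOffset: Qt.vector3d(0, 10, 0) // exposed as a vec3 uniform
}
\endqml
\badcode
void MAIN()
{
    VERTEX = VERTEX * uScale + uOffset;
}
\endcode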
392
393\section2 Improving the example
394
Before moving further, let's make the example somewhat better looking. By adding a rotated
rectangle mesh and making the \l DirectionalLight cast shadows, we can verify that the
alteration to the cube's vertices is correctly reflected in all rendering passes,
including shadow maps. To get a visible shadow, the light is now placed a bit higher on
the Y axis, and a rotation is applied to have it point partly downwards. (This being a
\c directional light, it is the rotation that matters.)
401
402\table
403\header
404\li main.qml, material.vert
405\li Result
406\row \li \qml
407import QtQuick
408import QtQuick3D
409Item {
410 View3D {
411 anchors.fill: parent
412 environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
413 PerspectiveCamera { z: 600 }
414 DirectionalLight {
415 y: 200
416 eulerRotation.x: -45
417 castsShadow: true
418 }
419 Model {
420 source: "#Rectangle"
421 y: -250
422 scale: Qt.vector3d(5, 5, 5)
423 eulerRotation.x: -45
424 materials: PrincipledMaterial { baseColor: "lightBlue" }
425 }
426 Model {
427 source: "#Cube"
428 scale: Qt.vector3d(2, 2, 2)
429 eulerRotation.x: 30
430 materials: CustomMaterial {
431 vertexShader: "material.vert"
432 property real uAmplitude: 0
433 NumberAnimation on uAmplitude {
434 from: 0; to: 100; duration: 5000; loops: -1
435 }
436 property real uTime: 0
437 NumberAnimation on uTime {
438 from: 0; to: 100; duration: 10000; loops: -1
439 }
440 }
441 }
442 }
443}
444\endqml
445\badcode
446void MAIN()
447{
448 VERTEX.x += sin(uTime + VERTEX.y) * uAmplitude;
449}
450\endcode
451\li \image quick3d-custom-cube3-anim.gif
452\endtable
453
454\section2 Adding a fragment shader
455
Many custom materials will want to have a fragment shader as well. In fact, many will want
only a fragment shader. If there is no extra data to be passed from the vertex to the fragment
stage, and the default vertex transformation is sufficient, the \c vertexShader
property can be left out of the \l CustomMaterial altogether.
460
461\table
462\header
463\li Change in main.qml, material.frag
464\li Result
465\row \li \qml
466materials: CustomMaterial {
467 fragmentShader: "material.frag"
468}
469\endqml
470\badcode
471void MAIN()
472{
473}
474\endcode
475\li \image quick3d-custom-cube4.jpg
476\endtable
477
Our first fragment shader contains an empty MAIN() function. This is no different from not
specifying a fragment shader snippet at all: the result looks the same as with a
default PrincipledMaterial.
481
Let's look at some of the commonly used keywords in fragment shaders. This is not the full
list; refer to the \l CustomMaterial documentation for a complete reference. Many of these
484are read-write, meaning they have a default value, but the shader can, and often will want
485to, assign a different value to them.
486
487As the names suggest, many of these map to similarly named \l PrincipledMaterial
488properties, with the same meaning and semantics, following the
489\l{https://github.com/KhronosGroup/glTF/tree/master/specification/2.0#metallic-roughness-material}{metallic-roughness
material model}. It is up to the custom material implementation to decide how these values
491are calculated: for example, a value for BASE_COLOR can be hard coded in the shader, can
492be based on sampling a texture, or can be calculated based on QML properties exposed as
493uniforms or on interpolated data passed along from the vertex shader.
494
495\table
496\header
497\li Keyword
498\li Type
499\li Description
500\row
501\li BASE_COLOR
502\li vec4
503\li The base color and alpha value. Corresponds to \l{PrincipledMaterial::baseColor}. The
504final alpha value of the fragment is the model opacity multiplied by the base color
505alpha. The default value is \c{(1.0, 1.0, 1.0, 1.0)}.
506\row
507\li EMISSIVE_COLOR
508\li vec3
509\li The color of self-illumination. Corresponds to
510\l{PrincipledMaterial::emissiveFactor}. The default value is \c{(0.0, 0.0, 0.0)}.
511\row
512\li METALNESS
513\li float
\li \l{PrincipledMaterial::metalness}{Metalness} value in range 0-1. Defaults to 0, which
515means the material is dielectric (non-metallic).
516\row
517\li ROUGHNESS
518\li float
519\li \l{PrincipledMaterial::roughness}{Roughness} value in range 0-1. The default value is
5200. Larger values soften specular highlights and blur reflections.
521\row
522\li SPECULAR_AMOUNT
523\li float
524\li \l{PrincipledMaterial::specularAmount}{The strength of specularity} in range 0-1. The
525default value is \c 0.5. For metallic objects with \c metalness set to \c 1 this value
526will have no effect. When both \c SPECULAR_AMOUNT and \c METALNESS have values larger than
5270 but smaller than 1, the result is a blend between the two material models.
528\row
529\li NORMAL
530\li vec3
531\li The interpolated normal in world space, adjusted for double-sidedness when face culling is disabled. Read only.
532\row
533\li UV0
534\li vec2
535\li The interpolated texture coordinates. Read only.
536\row
537\li VAR_WORLD_POSITION
538\li vec3
539\li Interpolated vertex position in world space. Read only.
540\endtable
541
542Let's make the cube's base color red:
543
544\table
545\header
546\li Change in main.qml, material.frag
547\li Result
548\row \li \qml
549materials: CustomMaterial {
550 fragmentShader: "material.frag"
551}
552\endqml
553\badcode
554void MAIN()
555{
556 BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
557}
558\endcode
559\li \image quick3d-custom-cube5.jpg
560\endtable
561
562Now strengthen the level of self-illumination a bit:
563
564\table
565\header
566\li Change in main.qml, material.frag
567\li Result
568\row \li \qml
569materials: CustomMaterial {
570 fragmentShader: "material.frag"
571}
572\endqml
573\badcode
574void MAIN()
575{
576 BASE_COLOR = vec4(1.0, 0.0, 0.0, 1.0);
577 EMISSIVE_COLOR = vec3(0.4);
578}
579\endcode
580\li \image quick3d-custom-cube6.jpg
581\endtable
582
583Instead of having values hardcoded in the shader, we could also use QML properties exposed
584as uniforms, even animated ones:
585
586\table
587\header
588\li Change in main.qml, material.frag
589\li Result
590\row \li \qml
591materials: CustomMaterial {
592 fragmentShader: "material.frag"
593 property color baseColor: "black"
594 ColorAnimation on baseColor {
595 from: "black"; to: "purple"; duration: 5000; loops: -1
596 }
597}
598\endqml
599\badcode
600void MAIN()
601{
602 BASE_COLOR = vec4(baseColor.rgb, 1.0);
603 EMISSIVE_COLOR = vec3(0.4);
604}
605\endcode
606\li \image quick3d-custom-cube7-anim.gif
607\endtable
608
Let's do something less trivial, something that is not implementable with a
PrincipledMaterial and its standard, built-in properties. The following material
visualizes the texture UV coordinates of the cube mesh. U runs from 0 to 1, shown here as
black to red, while V also runs from 0 to 1, shown as black to green.
613
614\table
615\header
616\li Change in main.qml, material.frag
617\li Result
618\row \li \qml
619materials: CustomMaterial {
620 fragmentShader: "material.frag"
621}
622\endqml
623\badcode
624void MAIN()
625{
626 BASE_COLOR = vec4(UV0, 0.0, 1.0);
627}
628\endcode
629\li \image quick3d-custom-cube8.jpg
630\endtable
631
While we are at it, why not visualize normals as well, this time on a sphere? Like with
633UVs, if a custom vertex shader snippet were to alter the value of NORMAL, the interpolated
634per-fragment value in the fragment shader, also exposed under the name NORMAL, would
635reflect those adjustments.
636
637\table
638\header
639\li Change in main.qml, material.frag
640\li Result
641\row \li \qml
642Model {
643 source: "#Sphere"
644 scale: Qt.vector3d(2, 2, 2)
645 materials: CustomMaterial {
646 fragmentShader: "material.frag"
647 }
648}
649\endqml
650\badcode
651void MAIN()
652{
653 BASE_COLOR = vec4(NORMAL, 1.0);
654}
655\endcode
656\li \image quick3d-custom-cube9.jpg
657\endtable
658
659\section2 Colors
660
661Let's switch over to a teapot model for a moment, make the material a blend of metallic
662and dielectric, and try to set a green base color for it. The \c green QColor value maps
663to \c{(0, 128, 0)}, based on which our first attempt could be:
664
665\table
666\header
667\li main.qml, material.frag
668\row \li \qml
669import QtQuick
670import QtQuick3D
671Item {
672 View3D {
673 anchors.fill: parent
674 environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
675 PerspectiveCamera { z: 600 }
676 DirectionalLight { }
677 Model {
678 source: "teapot.mesh"
679 scale: Qt.vector3d(60, 60, 60)
680 eulerRotation.x: 30
681 materials: CustomMaterial {
682 fragmentShader: "material.frag"
683 }
684 }
685 }
686}
687\endqml
688\badcode
689void MAIN()
690{
691 BASE_COLOR = vec4(0.0, 0.5, 0.0, 1.0);
692 METALNESS = 0.6;
693 SPECULAR_AMOUNT = 0.4;
694 ROUGHNESS = 0.4;
695}
696\endcode
697\endtable
698
699\image quick3d-custom-color1.jpg
700
701This does not look entirely right. Compare with the second approach:
702
703\table
704\header
705\li Change in main.qml, material.frag
706\li Result
707\row \li \qml
708materials: CustomMaterial {
709 fragmentShader: "material.frag"
710 property color uColor: "green"
711}
712\endqml
713\badcode
714void MAIN()
715{
716 BASE_COLOR = vec4(uColor.rgb, 1.0);
717 METALNESS = 0.6;
718 SPECULAR_AMOUNT = 0.4;
719 ROUGHNESS = 0.4;
720}
721\endcode
722\li \image quick3d-custom-color2.jpg
723\endtable
724
Switching to a PrincipledMaterial, we can confirm that setting the
\l{PrincipledMaterial::baseColor} to "green" and matching the metalness and other
property values gives a result identical to our second approach:
728
729\table
730\header
731\li Change in main.qml
732\li Result
733\row \li \qml
734materials: PrincipledMaterial {
735 baseColor: "green"
736 metalness: 0.6
737 specularAmount: 0.4
738 roughness: 0.4
739}
740\endqml
741\li \image quick3d-custom-color3.jpg
742\endtable
743
If the type of the \c uColor property were changed to \c vector4d, or any type other than
\c color, the result would suddenly change and become identical to our first approach.
746
747Why is this?
748
The answer lies in the sRGB to linear conversion that is performed implicitly for color
properties of DefaultMaterial, PrincipledMaterial, and also for custom properties with a
\c color type in a CustomMaterial. No such conversion is performed for any other value,
so if the shader hardcodes a color value, or bases it on a QML property with a type
different from \c color, it is up to the shader to perform linearization in case the
source value is in the sRGB color space. Converting to linear is important since Qt Quick 3D
performs \l{SceneEnvironment::tonemapMode}{tonemapping} on the results of fragment
shading, and that process assumes values in linear color space as its input.
757
The built-in QColor constants, such as \c{"green"}, are all given in sRGB
space. Therefore, just assigning \c{vec4(0.0, 0.5, 0.0, 1.0)} to BASE_COLOR in the first
attempt is insufficient if we want a result that matches an RGB value of \c{(0, 128, 0)} in
the sRGB space. See the \c BASE_COLOR documentation in \l CustomMaterial for a formula for
linearizing such color values. The same applies to color values retrieved by sampling
textures: if the source image data is in the sRGB color space, a conversion to linear is
needed (unless \l{SceneEnvironment::tonemapMode}{tonemapping} is disabled).
765
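For illustration, one way to linearize an sRGB-encoded value manually in the fragment shader
is the standard sRGB transfer function shown below. (This is a generic sketch, not
necessarily the exact formula given in the \l CustomMaterial documentation.)

\badcode
vec3 srgbToLinear(vec3 c)
{
    // Piecewise sRGB EOTF: linear segment below the threshold, power curve above it
    vec3 low = c / 12.92;
    vec3 high = pow((c + vec3(0.055)) / vec3(1.055), vec3(2.4));
    return mix(low, high, step(vec3(0.04045), c));
}

void MAIN()
{
    BASE_COLOR = vec4(srgbToLinear(vec3(0.0, 0.5, 0.0)), 1.0);
    METALNESS = 0.6;
    SPECULAR_AMOUNT = 0.4;
    ROUGHNESS = 0.4;
}
\endcode
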
766\section2 Blending
767
Just writing a value less than \c 1.0 to \c{BASE_COLOR.a} is not sufficient if the
expectation is to get alpha blending. Such materials will very often change the values of
the \l{CustomMaterial::sourceBlend}{sourceBlend} and
\l{CustomMaterial::destinationBlend}{destinationBlend} properties to get the desired
results.
773
774Also keep in mind that the combined alpha value is the \l{Node::opacity}{Node opacity}
775multiplied by the material alpha.
776
777To visualize, let's use a shader that assigns red with alpha \c 0.5 to \c BASE_COLOR:
778
779\table
780\header
781\li main.qml, material.frag
782\li Result
783\row \li \qml
784import QtQuick
785import QtQuick3D
786Item {
787 View3D {
788 anchors.fill: parent
789 environment: SceneEnvironment {
790 backgroundMode: SceneEnvironment.Color
791 clearColor: "white"
792 }
793 PerspectiveCamera {
794 id: camera
795 z: 600
796 }
797 DirectionalLight { }
798 Model {
799 source: "#Cube"
800 x: -150
801 eulerRotation.x: 60
802 eulerRotation.y: 20
803 materials: CustomMaterial {
804 fragmentShader: "material.frag"
805 }
806 }
807 Model {
808 source: "#Cube"
809 eulerRotation.x: 60
810 eulerRotation.y: 20
811 materials: CustomMaterial {
812 sourceBlend: CustomMaterial.SrcAlpha
813 destinationBlend: CustomMaterial.OneMinusSrcAlpha
814 fragmentShader: "material.frag"
815 }
816 }
817 Model {
818 source: "#Cube"
819 x: 150
820 eulerRotation.x: 60
821 eulerRotation.y: 20
822 materials: CustomMaterial {
823 sourceBlend: CustomMaterial.SrcAlpha
824 destinationBlend: CustomMaterial.OneMinusSrcAlpha
825 fragmentShader: "material.frag"
826 }
827 opacity: 0.5
828 }
829 }
830}
831\endqml
832\badcode
833void MAIN()
834{
835 BASE_COLOR = vec4(1.0, 0.0, 0.0, 0.5);
836}
837\endcode
838\li \image quick3d-custom-blend.jpg
839\endtable
840
The first cube writes 0.5 to the alpha value of the color, but this has no visible
effect since alpha blending is not enabled. The second cube enables simple alpha
blending via the CustomMaterial properties. The third one also assigns an opacity of 0.5
to the Model, which means that the effective opacity is 0.25.
845
846\section2 Passing data between the vertex and fragment shader
847
A common need is to calculate a value per vertex (for example, for the 3 corners
of a triangle), and then pass it on to the fragment stage, where for each fragment
(for example, every fragment covered by the rasterized triangle) an interpolated value is
made accessible. In custom material shader snippets this is made possible by the \c
VARYING keyword. This provides a syntax similar to GLSL 120 and GLSL ES 100, but will work
regardless of the graphics API used at run time. The engine will take care of rewriting
the varying declaration as appropriate.
855
Let's see what the classic texture sampling with UV coordinates looks like. Textures
are going to be covered in an upcoming section; for now let's focus on how we get the UV
coordinates that can be passed to the \c{texture()} function in the shader.
859
860\table
861\header
862\li main.qml, material.vert, material.frag
863\row \li \qml
864import QtQuick
865import QtQuick3D
866Item {
867 View3D {
868 anchors.fill: parent
869 environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
870 PerspectiveCamera { z: 600 }
871 DirectionalLight { }
872 Model {
873 source: "#Sphere"
874 scale: Qt.vector3d(4, 4, 4)
875 eulerRotation.x: 30
876 materials: CustomMaterial {
877 vertexShader: "material.vert"
878 fragmentShader: "material.frag"
879 property TextureInput someTextureMap: TextureInput {
880 texture: Texture {
881 source: "qt_logo_rect.png"
882 }
883 }
884 }
885 }
886 }
887}
888\endqml
889\badcode
890VARYING vec2 uv;
891void MAIN()
892{
893 uv = UV0;
894}
895\endcode
896\badcode
897VARYING vec2 uv;
898void MAIN()
899{
900 BASE_COLOR = texture(someTextureMap, uv);
901}
902\endcode
903\endtable
904
905\table
906\header
907\li qt_logo_rect.png
908\li Result
909\row \li \image quick3d-custom-varying-map.png
910\li \image quick3d-custom-varying1.jpg
911\endtable
912
Note the \c VARYING declarations in both snippets. The name and type must match; \c uv in the
fragment shader will then expose the interpolated UV coordinate for the current fragment.
915
Any other type of data can be passed on to the fragment stage in a similar manner. It is
worth noting that in many cases setting up the material's own varyings is not necessary,
because there are built-ins provided that cover many typical needs. This includes
the (interpolated) normals, UVs, world position (\c VAR_WORLD_POSITION), and the vector
pointing towards the camera (\c VIEW_VECTOR).
921
922The above example can in fact be simplified to the following as \c UV0 is automatically
923available in the fragment stage as well:
924
925\table
926\header
927\li Change in main.qml, material.frag
928\li Result
929\row \li \qml
930materials: CustomMaterial {
931 fragmentShader: "material.frag"
932 property TextureInput someTextureMap: TextureInput {
933 texture: Texture {
934 source: "qt_logo_rect.png"
        }
    }
}
937\endqml
938\badcode
939void MAIN()
940{
941 BASE_COLOR = texture(someTextureMap, UV0);
942}
943\endcode
944\li \image quick3d-custom-varying1.jpg
945\endtable
946
947\section2 Textures
948
A \l CustomMaterial has no built-in texture maps, meaning there is no equivalent of, for
example, \l{PrincipledMaterial::baseColorMap}. This is because implementing the same in the
shader is often trivial, while giving a lot more flexibility than what DefaultMaterial and
PrincipledMaterial have built in. Besides simply sampling a texture, custom fragment shader
953snippets are free to combine and blend data from various sources when calculating the
954values they assign to \c BASE_COLOR, \c EMISSIVE_COLOR, \c ROUGHNESS, etc. They can base
955these calculations on data provided via QML properties, interpolated data sent on from the
956vertex stage, values retrieved from sampling textures, and on hardcoded values.
957
958As the previous example shows, exposing a texture to the vertex, fragment, or both shaders
959is very similar to scalar and vector uniform values: a QML property with the type \l
960TextureInput will automatically get associated with a \c sampler2D in the shader code. As
961always, there is no need to declare this sampler in the shader code.
962
963A \l TextureInput references a \l Texture, with an additional
964\l{TextureInput::enabled}{enabled} property. A \l Texture can source its data in three
ways: \l{Texture::source}{from an image file}, \l{Texture::sourceItem}{from a texture with
live Qt Quick content}, or \l{Texture::textureData}{from data provided in C++} via
QQuick3DTextureData.
968
969\note When it comes to \l Texture properties, the source, tiling, and filtering related
970ones are the only ones that are taken into account implicitly with custom materials, as
the rest (such as UV transformations) is up to the custom shaders to implement as they
972see fit.
973
974Let's see an example where a model, a sphere in this case, is textured using live Qt Quick
975content:
976
977\table
978\header
979\li main.qml, material.frag
980\row \li \qml
981import QtQuick
982import QtQuick3D
983Item {
984 View3D {
985 anchors.fill: parent
986 environment: SceneEnvironment { backgroundMode: SceneEnvironment.Color; clearColor: "black" }
987 PerspectiveCamera { z: 600 }
988 DirectionalLight { }
989 Model {
990 source: "#Sphere"
991 scale: Qt.vector3d(4, 4, 4)
992 eulerRotation.x: 30
993 materials: CustomMaterial {
994 fragmentShader: "material.frag"
995 property TextureInput someTextureMap: TextureInput {
996 texture: Texture {
997 sourceItem: Rectangle {
998 width: 512; height: 512
999 color: "red"
1000 Rectangle {
1001 width: 32; height: 32
1002 anchors.horizontalCenter: parent.horizontalCenter
1003 y: 150
1004 color: "gray";
1005 NumberAnimation on rotation { from: 0; to: 360; duration: 3000; loops: -1 }
1006 }
1007 Text {
1008 anchors.centerIn: parent
1009 text: "Texture Map"
1010 font.pointSize: 16
1011 }
1012 }
1013 }
1014 }
1015 }
1016 }
1017 }
1018}
1019\endqml
1020\badcode
1021void MAIN()
1022{
1023 vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
1024 vec4 c = texture(someTextureMap, uv);
1025 BASE_COLOR = c;
1026}
1027\endcode
1028\endtable
1029
1030\image quick3d-custmat-tex1-anim.gif
1031
Here the 2D subtree (a Rectangle with two children: another Rectangle and the Text) is
rendered into a 512x512 2D texture every time this mini-scene changes. The texture is
then exposed to the custom material under the name \c someTextureMap.
1035
1036Note the flipping of the V coordinate in the shader. As noted above, custom materials,
where there is full programmability at the shader level, do not offer the "fixed" features of
1038\l Texture and \l PrincipledMaterial. This means that any transformations to the UV
1039coordinates will need to be applied by the shader. Here we know that the texture is
1040generated via \l{Texture::sourceItem} and so V needs to be flipped to get something that
1041matches the UV set of the mesh we are using.
1042
What this example shows is also possible with a \l PrincipledMaterial. Let's make it
more interesting by adding a simple emboss effect on top:
1045
1046\table
1047\header
1048\li material.frag
1049\li Result
1050\row \li \badcode
1051void MAIN()
1052{
1053 vec2 uv = vec2(UV0.x, 1.0 - UV0.y);
1054 vec2 size = vec2(textureSize(someTextureMap, 0));
1055 vec2 d = vec2(1.0 / size.x, 1.0 / size.y);
1056 vec4 diff = texture(someTextureMap, uv + d) - texture(someTextureMap, uv - d);
1057 float c = (diff.x + diff.y + diff.z) + 0.5;
1058 BASE_COLOR = vec4(c, c, c, 1.0);
1059}
1060\endcode
1061\li \image quick3d-custmat-tex2-anim.gif
1062\endtable
1063
With the features covered so far, a wide range of possibilities is open for creating
materials that shade the meshes in visually impressive ways. To finish the basic tour,
let's look at an example that applies height and normal maps to a plane mesh. (A dedicated
\c{.mesh} file is used here because the built-in \c{#Rectangle} does not have enough
subdivisions.) For better lighting results, we will use image based lighting with a 360
degree HDR image. The image is also set as the skybox to make it clearer what is
happening.
1071
1072First let's start with an empty CustomMaterial:
1073
1074\table
1075\header
1076\li main.qml
1077\li Result
1078\row \li \qml
1079import QtQuick
1080import QtQuick3D
1081Item {
1082 View3D {
1083 anchors.fill: parent
1084 environment: SceneEnvironment {
1085 backgroundMode: SceneEnvironment.SkyBox
1086 lightProbe: Texture {
1087 source: "00489_OpenfootageNET_snowfield_low.hdr"
1088 }
1089 }
1090 PerspectiveCamera {
1091 z: 600
1092 }
1093 Model {
1094 source: "plane.mesh"
1095 scale: Qt.vector3d(400, 400, 400)
1096 z: 400
1097 y: -50
1098 eulerRotation.x: -90
1099 materials: CustomMaterial { }
1100 }
1101 }
1102}
1103\endqml
1104\li \image quick3d-custom-tex3.jpg
1105\endtable
1106
1107Now let's make some shaders that apply a height and normal map to the mesh:
1108
1109\table
1110\header
1111\li Height map
\li Normal map
1113\row
1114\li \image quick3d-custom-heightmap.png
1115\li \image quick3d-custom-normalmap.jpg
1116\endtable
1117
1118\table
1119\header
1120\li material.vert, material.frag
1121\row
1122\li \badcode
1123float getHeight(vec2 pos)
1124{
1125 return texture(heightMap, pos).r;
1126}
1127
1128void MAIN()
1129{
1130 const float offset = 0.004;
1131 VERTEX.y += getHeight(UV0);
1132 TANGENT = normalize(vec3(0.0, getHeight(UV0 + vec2(0.0, offset)) - getHeight(UV0 + vec2(0.0, -offset)), offset * 2.0));
1133 BINORMAL = normalize(vec3(offset * 2.0, getHeight(UV0 + vec2(offset, 0.0)) - getHeight(UV0 + vec2(-offset, 0.0)), 0.0));
1134 NORMAL = cross(TANGENT, BINORMAL);
1135}
1136\endcode
1137\badcode
1138void MAIN()
1139{
1140 vec3 normalValue = texture(normalMap, UV0).rgb;
1141 normalValue.xy = normalValue.xy * 2.0 - 1.0;
1142 normalValue.z = sqrt(max(0.0, 1.0 - dot(normalValue.xy, normalValue.xy)));
1143 NORMAL = normalize(mix(NORMAL, TANGENT * normalValue.x + BINORMAL * normalValue.y + NORMAL * normalValue.z, 1.0));
1144}
1145\endcode
1146\endtable
1147
1148\table
1149\header
1150\li Change in main.qml
1151\li Result
1152\row
1153\li \qml
1154materials: CustomMaterial {
1155 vertexShader: "material.vert"
1156 fragmentShader: "material.frag"
1157 property TextureInput normalMap: TextureInput {
1158 texture: Texture { source: "normalmap.jpg" }
1159 }
1160 property TextureInput heightMap: TextureInput {
1161 texture: Texture { source: "heightmap.png" }
1162 }
1163}
1164\endqml
1165\li \image quick3d-custom-tex4.jpg
1166\endtable
1167
1168\note The \l WasdController object can be immensely helpful during development and
1169troubleshooting as it allows navigating and looking around in the scene with the keyboard
1170and mouse in a familiar manner. Having a camera controlled by the WasdController is as
1171simple as:
1172
1173\qml
1174import QtQuick3D.Helpers
1175View3D {
1176 PerspectiveCamera {
1177 id: camera
1178 }
1179 // ...
1180}
1181WasdController {
1182 controlledObject: camera
1183}
1184\endqml
1185
1186\section2 Depth and screen textures
1187
1188When a custom shader snippet uses the \c DEPTH_TEXTURE or \c SCREEN_TEXTURE keywords, it
1189opts in to generating the corresponding textures in a separate render pass, which is not
1190necessarily a cheap operation, but allows implementing a variety of techniques, such as
1191refraction for glass-like materials.
1192
1193\c DEPTH_TEXTURE is a \c sampler2D that allows sampling a texture with the contents of the
1194depth buffer with all the \c opaque objects in the scene rendered. Similarly, \c
1195SCREEN_TEXTURE is a \c sampler2D that allows sampling a texture containing the contents of
1196the scene excluding any transparent materials or any materials also using the
1197SCREEN_TEXTURE. The texture can be used for materials that require the contents of the
1198framebuffer they are being rendered to. The SCREEN_TEXTURE texture uses the same clear mode
1199as the View3D. The size of these textures matches the size of the View3D in pixels.
1200
Let's have a simple demonstration by visualizing the depth buffer contents via \c
DEPTH_TEXTURE. The camera's \l{PerspectiveCamera::clipFar}{far clip value} is reduced here from the
default 10000 to 2000 in order to have a smaller range, so that the differences in the
visualized depth values are more obvious. The result is a rectangle that happens to visualize the
depth buffer for the scene over its surface.
1206
1207\table
1208\header
1209\li main.qml, material.frag
1210\li Result
1211\row
1212\li \qml
1213import QtQuick
1214import QtQuick3D
1215import QtQuick3D.Helpers
1216Rectangle {
1217 width: 400
1218 height: 400
1219 color: "black"
1220 View3D {
1221 anchors.fill: parent
1222 PerspectiveCamera {
1223 id: camera
1224 z: 600
1225 clipNear: 1
1226 clipFar: 2000
1227 }
1228 DirectionalLight { }
1229 Model {
1230 source: "#Cube"
1231 scale: Qt.vector3d(2, 2, 2)
1232 position: Qt.vector3d(150, 200, -1000)
1233 eulerRotation.x: 60
1234 eulerRotation.y: 20
1235 materials: PrincipledMaterial { }
1236 }
1237 Model {
1238 source: "#Cylinder"
1239 scale: Qt.vector3d(2, 2, 2)
1240 position: Qt.vector3d(400, 200, -1000)
1241 materials: PrincipledMaterial { }
1242 opacity: 0.3
1243 }
1244 Model {
1245 source: "#Sphere"
1246 scale: Qt.vector3d(2, 2, 2)
1247 position: Qt.vector3d(-150, 200, -600)
1248 materials: PrincipledMaterial { }
1249 }
1250 Model {
1251 source: "#Cone"
1252 scale: Qt.vector3d(2, 2, 2)
1253 position: Qt.vector3d(0, 400, -1200)
1254 materials: PrincipledMaterial { }
1255 }
1256 Model {
1257 source: "#Rectangle"
1258 scale: Qt.vector3d(3, 3, 3)
1259 y: -150
1260 materials: CustomMaterial {
1261 fragmentShader: "material.frag"
1262 }
1263 }
1264 }
1265 WasdController {
1266 controlledObject: camera
1267 }
1268}
1269\endqml
1270\badcode
1271void MAIN()
1272{
1273 float zNear = CAMERA_PROPERTIES.x;
1274 float zFar = CAMERA_PROPERTIES.y;
1275 float zRange = zFar - zNear;
1276 vec4 depthSample = texture(DEPTH_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
1277 float zn = 2.0 * depthSample.r - 1.0;
1278 float d = 2.0 * zNear * zFar / (zFar + zNear - zn * zRange);
1279 d /= zFar;
1280 BASE_COLOR = vec4(d, d, d, 1.0);
1281}
1282\endcode
1283\li \image quick3d-custom-depth-anim.gif
1284\endtable
1285
Note how the cylinder is not present in \c DEPTH_TEXTURE due to its reliance on
semi-transparency, which puts it into a different category than the other, fully opaque
objects. Semi-transparent objects do not write into the depth buffer, although they do test
against the depth values written by opaque objects, and they rely on being rendered in
back-to-front order. Hence such objects do not show up in \c DEPTH_TEXTURE.
1291
1292What happens if we switch the shader to sample \c SCREEN_TEXTURE instead?
1293
1294\table
1295\header
1296\li material.frag
1297\li Result
1298\row \li \badcode
1299void MAIN()
1300{
1301 vec4 c = texture(SCREEN_TEXTURE, vec2(UV0.x, 1.0 - UV0.y));
1302 if (c.a == 0.0)
1303 c.rgb = vec3(0.2, 0.1, 0.3);
1304 BASE_COLOR = c;
1305}
1306\endcode
1307\li \image quick3d-custom-screen.jpg
1308\endtable
1309
1310Here the rectangle is textured with \c SCREEN_TEXTURE, while replacing transparent pixels
1311with purple.
1312
1313\section2 Light processor functions
1314
An advanced feature of \l CustomMaterial is the ability to define functions in the
fragment shader that reimplement the lighting equations that are used to calculate the
fragment color. A light processor function, when present, is called once for each light in
the scene, for each fragment. There is a dedicated function for each light type, as
well as for the ambient and specular contributions. When no corresponding light processor
function is present, the standard calculations are used, just like a PrincipledMaterial
would do. When a light processor is present, but the function body is empty, there
will be no contribution from lights of that type in the scene.
1323
1324Refer to the \l CustomMaterial documentation for details on functions such as \c
1325DIRECTIONAL_LIGHT, \c POINT_LIGHT, \c SPOT_LIGHT, \c AMBIENT_LIGHT, and \c SPECULAR_LIGHT.
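
As a rough sketch of the concept, a fragment shader could provide a \c DIRECTIONAL_LIGHT
function that applies nothing but a simple Lambertian diffuse term, and an empty \c
AMBIENT_LIGHT that removes the ambient contribution entirely. (The keywords used inside the
function are assumed from the \l CustomMaterial reference; check that page for the exact set
available in each light processor function.)

\badcode
void DIRECTIONAL_LIGHT()
{
    // Accumulate a plain diffuse contribution for this light
    DIFFUSE += LIGHT_COLOR * SHADOW_CONTRIB * vec3(max(0.0, dot(normalize(NORMAL), TO_LIGHT_DIR)));
}

void AMBIENT_LIGHT()
{
    // Empty body: no ambient contribution at all
}
\endcode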
1326
1327\section2 Unshaded custom materials
1328
There is another type of \l CustomMaterial: \c unshaded custom materials. All the examples
1330so far used \c shaded custom materials, with the
1331\l{CustomMaterial::shadingMode}{shadingMode} property left at its default
1332CustomMaterial.Shaded value.
1333
1334What happens if we switch this property to CustomMaterial.Unshaded?
1335
1336First of all, keywords like \c BASE_COLOR, \c EMISSIVE_COLOR, \c METALNESS, etc. no longer
1337have the desired effect. This is because an unshaded material, as the name suggests, does
1338not automatically get amended with much of the standard shading code, thus ignoring
1339lights, image based lighting, shadows, and ambient occlusion in the scene. Rather, an
1340unshaded material gives full control to the shader via the \c FRAGCOLOR keyword. This is
1341similar to gl_FragColor: the color assigned to \c FRAGCOLOR is the result and the final
1342color of the fragment, without any further adjustments by Qt Quick 3D.
1343
1344\table
1345\header
1346\li main.qml, material.frag, material2.frag
1347\li Result
1348\row \li \qml
1349import QtQuick
1350import QtQuick3D
1351Item {
1352 View3D {
1353 anchors.fill: parent
1354 environment: SceneEnvironment {
1355 backgroundMode: SceneEnvironment.Color
1356 clearColor: "black"
1357 }
1358 PerspectiveCamera { z: 600 }
1359 DirectionalLight { }
1360 Model {
1361 source: "#Cylinder"
1362 x: -100
1363 eulerRotation.x: 30
1364 materials: CustomMaterial {
1365 fragmentShader: "material.frag"
1366 }
1367 }
1368 Model {
1369 source: "#Cylinder"
1370 x: 100
1371 eulerRotation.x: 30
1372 materials: CustomMaterial {
1373 shadingMode: CustomMaterial.Unshaded
1374 fragmentShader: "material2.frag"
1375 }
1376 }
1377 }
1378}
1379\endqml
1380\badcode
1381void MAIN()
1382{
1383 BASE_COLOR = vec4(1.0);
1384}
1385\endcode
1386\badcode
1387void MAIN()
1388{
1389 FRAGCOLOR = vec4(1.0);
1390}
1391\endcode
1392\li \image quick3d-custom-unshaded1.jpg
1393\endtable
1394
Notice how the right cylinder ignores the DirectionalLight in the scene. Its shading knows
nothing about scene lighting; the final fragment color is all white.
1397
The vertex shader in an unshaded material still has the typical inputs available: \c
VERTEX, \c NORMAL, \c MODELVIEWPROJECTION_MATRIX, etc., and can write to \c POSITION. The
fragment shader no longer has similar conveniences available, however: \c NORMAL, \c
UV0, and \c VAR_WORLD_POSITION are not available in an unshaded material's fragment
shader. Rather, it is now up to the shader code to calculate, and pass on using \c VARYING,
everything it needs to determine the final fragment color.
1404
1405Let's look at an example that has both a vertex and fragment shader. The altered vertex
1406position is passed on to the fragment shader, with an interpolated value made available to
1407every fragment.
1408
1409\table
1410\header
1411\li main.qml, material.vert, material.frag
1412\row \li \qml
1413import QtQuick
1414import QtQuick3D
1415Item {
1416 View3D {
1417 anchors.fill: parent
1418 environment: SceneEnvironment {
1419 backgroundMode: SceneEnvironment.Color
1420 clearColor: "black"
1421 }
1422 PerspectiveCamera { z: 600 }
1423 Model {
1424 source: "#Sphere"
1425 scale: Qt.vector3d(3, 3, 3)
1426 materials: CustomMaterial {
1427 property real time: 0.0
1428 NumberAnimation on time { from: 0; to: 100; duration: 20000; loops: -1 }
1429 property real amplitude: 10.0
1430 shadingMode: CustomMaterial.Unshaded
1431 vertexShader: "material.vert"
1432 fragmentShader: "material.frag"
1433 }
1434 }
1435 }
1436}
1437\endqml
1438\badcode
1439VARYING vec3 pos;
1440void MAIN()
1441{
1442 pos = VERTEX;
1443 pos.x += sin(time * 4.0 + pos.y) * amplitude;
1444 POSITION = MODELVIEWPROJECTION_MATRIX * vec4(pos, 1.0);
1445}
1446\endcode
1447\badcode
1448VARYING vec3 pos;
1449void MAIN()
1450{
1451 FRAGCOLOR = vec4(vec3(pos.x * 0.02, pos.y * 0.02, pos.z * 0.02), 1.0);
1452}
1453\endcode
1454\endtable
1455
1456\image quick3d-custom-unshaded-anim.gif
1457
1458Unshaded materials are useful when interacting with scene lighting is not necessary or
desired, and the material needs full control over the final fragment color. Notice how the
1460example above has neither a DirectionalLight nor any other lights, but the sphere with the
1461custom material shows up as expected.
1462
1463\note An unshaded material that only has a vertex shader snippet, but does not specify the
1464fragmentShader property, will still be functional but the results are as if the
1465shadingMode was set to Shaded. Therefore it makes little sense to switch shadingMode for
1466materials that only have a vertex shader.
1467
1468\section1 Programmability for Effects
1469
1470Post-processing effects apply one or more fragment shaders to the result of a \l
1471View3D. The output from these fragment shaders is then displayed instead of the original
1472rendering results. This is conceptually very similar to Qt Quick's \l ShaderEffect and \l
1473ShaderEffectSource.
1474
1475\note Post-processing effects are only available when the
1476\l{View3D::renderMode}{renderMode} for the View3D is set to View3D.Offscreen.
1477
Custom vertex shader snippets can also be specified for an effect, but they have limited
usefulness and are therefore expected to be used relatively rarely. The vertex input for a
post-processing effect is a quad (either two triangles or a triangle strip); transforming
or displacing its vertices is often not helpful. It can, however, make sense to have
1482a vertex shader in order to calculate and pass on data to the fragment shader using the \c
1483VARYING keyword. As usual, the fragment shader will then receive an interpolated value
1484based on the current fragment coordinate.
1485
The syntax of the shader snippets associated with an \l Effect is identical to the shaders
1487for an unshaded \l CustomMaterial. When it comes to the built-in special keywords, \c
1488VARYING, \c MAIN, \c FRAGCOLOR (fragment shader only), \c POSITION (vertex shader only), \c
1489VERTEX (vertex shader only), and \c MODELVIEWPROJECTION_MATRIX work identically to \l
1490CustomMaterial.
1491
1492The most important special keywords for \l Effect fragment shaders are the following:
1493
1494\table
1495\header
1496\li Name
1497\li Type
1498\li Description
1499\row
1500\li INPUT
1501\li sampler2D
1502\li The sampler for the input texture. An effect will typically sample this using \c INPUT_UV.
1503\row
1504\li INPUT_UV
1505\li vec2
1506\li UV coordinates for sampling \c INPUT.
1507\row
1508\li INPUT_SIZE
1509\li vec2
1510\li The size of the \c INPUT texture, in pixels. This is a convenient alternative to calling textureSize().
1511\row
1512\li OUTPUT_SIZE
1513\li vec2
1514\li The size of the output texture, in pixels. Equal to \c INPUT_SIZE in many cases, but a multi-pass effect
1515may have passes that output to intermediate textures with different sizes.
1516\row
1517\li DEPTH_TEXTURE
1518\li sampler2D
1519\li Depth texture with the depth buffer contents with the opaque objects in the scene. Like with CustomMaterial,
1520the presence of this keyword in the shader triggers generating the depth texture automatically.
1521\endtable
1522
1523\section2 A post-processing effect
1524
1525Let's start with a simple scene, this time using a few more objects, including a textured
1526rectangle that uses a checkerboard texture as its base color map.
1527
1528\table
1529\header
1530\li main.qml
1531\li Result
1532\row \li \qml
1533import QtQuick
1534import QtQuick3D
1535Item {
1536 View3D {
1537 anchors.fill: parent
1538 environment: SceneEnvironment {
1539 backgroundMode: SceneEnvironment.Color
1540 clearColor: "black"
1541 }
1542
1543 PerspectiveCamera { z: 400 }
1544
1545 DirectionalLight { }
1546
1547 Texture {
1548 id: checkerboard
1549 source: "checkerboard.png"
1550 scaleU: 20
1551 scaleV: 20
1552 tilingModeHorizontal: Texture.Repeat
1553 tilingModeVertical: Texture.Repeat
1554 }
1555
1556 Model {
1557 source: "#Rectangle"
1558 scale: Qt.vector3d(10, 10, 1)
1559 eulerRotation.x: -45
1560 materials: PrincipledMaterial {
1561 baseColorMap: checkerboard
1562 }
1563 }
1564
1565 Model {
1566 source: "#Cone"
1567 position: Qt.vector3d(100, -50, 100)
1568 materials: PrincipledMaterial { }
1569 }
1570
1571 Model {
1572 source: "#Cube"
1573 position.y: 100
1574 eulerRotation.y: 20
1575 materials: PrincipledMaterial { }
1576 }
1577
1578 Model {
1579 source: "#Sphere"
1580 position: Qt.vector3d(-150, 200, -100)
1581 materials: PrincipledMaterial { }
1582 }
1583 }
1584}
1585\endqml
1586\li \image quick3d-custom-effect-section-scene.jpg
1587\endtable
1588
Now let's apply an effect to the entire scene, or, more precisely, to the View3D. When there
are multiple View3D items in the scene, each has its own SceneEnvironment and therefore
its own post-processing effect chain. In this example there is one single View3D
covering the entire window.
1593
1594\table
1595\header
1596\li Change in main.qml
1597\li effect.frag
1598\row \li \qml
1599 environment: SceneEnvironment {
1600 backgroundMode: SceneEnvironment.Color
1601 clearColor: "black"
1602 effects: redEffect
1603 }
1604
1605 Effect {
1606 id: redEffect
1607 property real uRed: 1.0
1608 NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
1609 passes: Pass {
1610 shaders: Shader {
1611 stage: Shader.Fragment
1612 shader: "effect.frag"
1613 }
1614 }
1615 }
1616\endqml
1617\li \badcode
1618void MAIN()
1619{
1620 vec4 c = texture(INPUT, INPUT_UV);
1621 c.r = uRed;
1622 FRAGCOLOR = c;
1623}
1624\endcode
1625\endtable
1626
1627This simple effect alters the red color channel value. Exposing QML properties as uniforms
1628works the same way with effects as with custom materials. The shader starts with a line
that is going to be very common when writing fragment shaders for effects: sampling \c
1630INPUT at the UV coordinates \c INPUT_UV. It then performs its desired calculations, and
1631assigns the final fragment color to \c FRAGCOLOR.
1632
1633\image quick3d-custom-first-effect-anim.gif
1634
Many of the properties set in the example are plural (effects, passes, shaders). While the
list \c{[ ]} syntax can be omitted when there is only a single element, all these properties
1637are lists, and can hold more than one element. Why is this?
1638
1639\list
1640
1641\li \l{SceneEnvironment::effects}{effects} is a list, because View3D allows chaining
1642multiple effects together. The effects are applied in the order in which they are added to
1643the list. This allows easily applying two or more effects together to the View3D, and is
1644similar to what one can achieve in Qt Quick by nesting \l ShaderEffect items. The \c INPUT
1645texture of the next effect is always a texture that contains the previous effect's
output. The output of the last effect is what gets used as the final output of the View3D.
1647
1648\li \l{Effect::passes}{passes} is a list, because unlike ShaderEffect, Effect has built-in
1649support for multiple passes. A multi-pass effect is more powerful than chaining together
1650multiple, independent effects in \l{SceneEnvironment::effects}{effects}: a pass can output
1651to a temporary, intermediate texture, which can then be used as input to subsequent
1652passes, in addition to the original input texture of the effect. This allows creating
1653complex effects that calculate, render, and blend together multiple textures in order to
1654get to the final fragment color. This advanced use case is not going to be covered
1655here. Refer to the \l Effect documentation page for details.
1656
1657\li \l{Pass::shaders}{shaders} is a list, because an effect may have both a vertex and a
1658fragment shader associated.
1659
1660\endlist
1661
1662\section2 Chaining multiple effects
1663
Let's look at an example where the effect from the previous section is complemented by
another effect, similar to the built-in \l DistortionSpiral effect.
1666
1667\table
1668\header
1669\li Change in main.qml
1670\li effect2.frag
1671\row \li \qml
1672 environment: SceneEnvironment {
1673 backgroundMode: SceneEnvironment.Color
1674 clearColor: "black"
1675 effects: [redEffect, distortEffect]
1676 }
1677
1678 Effect {
1679 id: redEffect
1680 property real uRed: 1.0
1681 NumberAnimation on uRed { from: 1; to: 0; duration: 5000; loops: -1 }
1682 passes: Pass {
1683 shaders: Shader {
1684 stage: Shader.Fragment
1685 shader: "effect.frag"
1686 }
1687 }
1688 }
1689
1690 Effect {
1691 id: distortEffect
1692 property real uRadius: 0.1
1693 NumberAnimation on uRadius { from: 0.1; to: 1.0; duration: 5000; loops: -1 }
1694 passes: Pass {
1695 shaders: Shader {
1696 stage: Shader.Fragment
1697 shader: "effect2.frag"
1698 }
1699 }
1700 }
1701\endqml
1702\li \badcode
1703void MAIN()
1704{
1705 vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
1706 center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
1707 float dist_to_center = length(center_vec) / uRadius;
1708 vec2 texcoord = INPUT_UV;
1709 if (dist_to_center <= 1.0) {
1710 float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
1711 float r = radians(360.0) * rotation_amount / 4.0;
1712 float cos_r = cos(r);
1713 float sin_r = sin(r);
1714 mat2 rotation = mat2(cos_r, sin_r, -sin_r, cos_r);
1715 texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
1716 }
1717 vec4 c = texture(INPUT, texcoord);
1718 FRAGCOLOR = c;
1719}
1720\endcode
1721\endtable
1722
1723\image quick3d-custom-chained-effect-anim.gif
1724
1725Now the perhaps surprising question: why is this a bad example?
1726
More precisely, it is not bad, but it shows a pattern that is often best avoided.

Chaining effects this way can be useful, but it is important to keep in mind the
performance implications: doing two render passes (one to generate a texture with the
adjusted red color channel, and then another one to calculate the distortion) is quite
wasteful when one would be enough. If the fragment shader snippets were combined, the same
result could be achieved with one single effect.
1735
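For reference, a combined, single-pass version of the two fragment shaders shown above could
look roughly like this (\c uRed and \c uRadius would then both be properties of the one
remaining Effect):

\badcode
void MAIN()
{
    vec2 center_vec = INPUT_UV - vec2(0.5, 0.5);
    center_vec.y *= INPUT_SIZE.y / INPUT_SIZE.x;
    float dist_to_center = length(center_vec) / uRadius;
    vec2 texcoord = INPUT_UV;
    if (dist_to_center <= 1.0) {
        float rotation_amount = (1.0 - dist_to_center) * (1.0 - dist_to_center);
        float r = radians(360.0) * rotation_amount / 4.0;
        float cos_r = cos(r);
        float sin_r = sin(r);
        mat2 rotation = mat2(cos_r, sin_r, -sin_r, cos_r);
        texcoord = vec2(0.5, 0.5) + rotation * (INPUT_UV - vec2(0.5, 0.5));
    }
    vec4 c = texture(INPUT, texcoord);
    c.r = uRed;
    FRAGCOLOR = c;
}
\endcode
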
1736\section1 Defining Mesh and Texture Data from C++
1737
Procedurally generating mesh data and texture image data follows similar steps:
1739
1740\list
1741\li Subclass \l QQuick3DGeometry or \l QQuick3DTextureData
1742\li Set the desired vertex or image data upon construction by calling the protected member functions
1743from the base class
1744\li If dynamic changes are needed afterwards at some point, set the new data and call update()
1745\li Once the implementation is done, the class needs to be registered to make it visible in QML
1746\li \l Model and \l Texture objects in QML can now use the custom vertex or image data provider by
1747setting the \l{Model::geometry} or \l{Texture::textureData} property
1748\endlist
1749
1750\section2 Custom vertex data
1751
1752Vertex data refers to the sequence of (typically \c float) values that make up a
1753mesh. Instead of loading \c{.mesh} files, a custom geometry provider is responsible for
providing the same data. The vertex data consists of \c attributes, such as position,
texture (UV) coordinates, or normals. The specification of attributes describes which
attributes are present, the component type (for example, a 3 component float vector for
vertex position consisting of x, y, z values), the offset at which they start in the provided
data, and the stride (the increment that needs to be added to the offset to point to
the next element for the same attribute).
1760
1761This may seem familiar if one has worked with graphics APIs, such as OpenGL or Vulkan
1762directly, because the way vertex input is specified with those APIs maps loosely to what a
1763\c{.mesh} file or a \l QQuick3DGeometry instance defines.
1764
1765In addition, the mesh topology (primitive type) must be specified too. For indexed
1766drawing, the data for an index buffer must be provided as well.
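
For indexed drawing, the pattern is similar; the following hypothetical constructor snippet
sets up a single quad made of two indexed triangles, using only a position attribute (sizes
and coordinates are illustrative):

\badcode
    // Four corner positions (x, y, z) of a 100x100 quad in the XY plane
    const float positions[] = {
        -50.0f, -50.0f, 0.0f,
         50.0f, -50.0f, 0.0f,
         50.0f,  50.0f, 0.0f,
        -50.0f,  50.0f, 0.0f
    };
    // Two triangles referencing the four vertices
    const quint32 indexes[] = { 0, 1, 2, 0, 2, 3 };

    setVertexData(QByteArray(reinterpret_cast<const char *>(positions), sizeof(positions)));
    setIndexData(QByteArray(reinterpret_cast<const char *>(indexes), sizeof(indexes)));
    setStride(3 * sizeof(float));
    setPrimitiveType(QQuick3DGeometry::PrimitiveType::Triangles);
    addAttribute(QQuick3DGeometry::Attribute::PositionSemantic, 0, QQuick3DGeometry::Attribute::F32Type);
    addAttribute(QQuick3DGeometry::Attribute::IndexSemantic, 0, QQuick3DGeometry::Attribute::U32Type);
\endcode

A real implementation would typically also call setBounds() to provide the bounding volume of
the geometry.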
1767
1768There is one built-in custom geometry implementation: the QtQuick3D.Helpers module
1769includes a \l GridGeometry type. This allows rendering a grid in the scene with line
1770primitives, without having to implement a custom \l QQuick3DGeometry subclass.
1771
Another common use case is rendering points. This is fairly simple to do since the
1773attribute specification is going to be minimal: we provide three floats (x, y, z) for each
1774vertex, nothing else. A QQuick3DGeometry subclass could implement a geometry consisting of
17752000 points similarly to the following:
1776
1777\badcode
1778 clear();
1779 const int N = 2000;
1780 const int stride = 3 * sizeof(float);
1781 QByteArray v;
1782 v.resize(N * stride);
1783 float *p = reinterpret_cast<float *>(v.data());
1784 QRandomGenerator *rg = QRandomGenerator::global();
1785 for (int i = 0; i < N; ++i) {
1786 const float x = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
1787 const float y = float(rg->bounded(200.0f) - 100.0f) / 20.0f;
1788 *p++ = x;
1789 *p++ = y;
1790 *p++ = 0.0f;
1791 }
1792 setVertexData(v);
1793 setStride(stride);
1794 setPrimitiveType(QQuick3DGeometry::PrimitiveType::Points);
1795 addAttribute(QQuick3DGeometry::Attribute::PositionSemantic, 0, QQuick3DGeometry::Attribute::F32Type);
1796\endcode
1797
1798Combined with a material of
1799
1800\qml
1801DefaultMaterial {
1802 lighting: DefaultMaterial.NoLighting
1803 cullMode: DefaultMaterial.NoCulling
1804 diffuseColor: "yellow"
1805 pointSize: 4
1806}
1807\endqml
1808
1809the end result is similar to this (here viewed from an altered camera angle, with the help
1810of \l WasdController):
1811
1812\image quick3d-custom-points.jpg
1813
1814\note Be aware that point sizes and line widths other than 1 may not be supported at run
1815time, depending on the underlying graphics API. This is not something Qt has control
1816over. Therefore, it can become necessary to implement alternative techniques instead of
1817relying on point and line drawing.
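
To put the points snippet above into context, such code would typically live in the
constructor of a \l QQuick3DGeometry subclass that is exposed to QML. A minimal, hypothetical
skeleton (the class name and structure are illustrative; a real project registers the type
through its QML module) might look like this:

\badcode
#include <QQuick3DGeometry>
#include <QRandomGenerator>
#include <QtQml/qqmlregistration.h>

class PointCloudGeometry : public QQuick3DGeometry
{
    Q_OBJECT
    QML_NAMED_ELEMENT(PointCloudGeometry)

public:
    PointCloudGeometry()
    {
        rebuild();
    }

private:
    void rebuild()
    {
        // The vertex data setup shown in the snippet above goes here,
        // followed by update() to notify the engine about the new data.
        update();
    }
};
\endcode

A \l Model can then reference an instance of this type through its \l{Model::geometry}
property.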
1818
1819\section2 Custom texture data
1820
With textures, the data that needs to be provided is structurally a lot simpler: it is the
raw pixel data, with a varying number of bytes per pixel, depending on the texture
format. For example, an \c RGBA texture expects four bytes per pixel, whereas \c RGBA16F
expects four half-floats per pixel. This is similar to what a \l QImage stores
internally. However, Qt Quick 3D textures can have formats whose data cannot be
represented by a QImage, for example floating point HDR textures or compressed
textures. Therefore the data for \l QQuick3DTextureData is always provided as a raw
sequence of bytes. This may seem familiar if one has worked with graphics APIs, such as
OpenGL or Vulkan, directly.
1830
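A minimal sketch of a texture data provider, generating a solid-color RGBA8 texture (the
class name and the generated content are purely illustrative), could look like this:

\badcode
#include <QQuick3DTextureData>
#include <QtQml/qqmlregistration.h>

class SolidColorTextureData : public QQuick3DTextureData
{
    Q_OBJECT
    QML_NAMED_ELEMENT(SolidColorTextureData)

public:
    SolidColorTextureData()
    {
        const int width = 256;
        const int height = 256;
        QByteArray pixels;
        pixels.resize(width * height * 4); // 4 bytes per pixel for RGBA8
        uchar *p = reinterpret_cast<uchar *>(pixels.data());
        for (int i = 0; i < width * height; ++i) {
            *p++ = 0;    // red
            *p++ = 128;  // green
            *p++ = 255;  // blue
            *p++ = 255;  // alpha
        }
        setSize(QSize(width, height));
        setFormat(QQuick3DTextureData::RGBA8);
        setTextureData(pixels);
    }
};
\endcode

Such a type can then be referenced from the \l{Texture::textureData} property of a \l Texture.
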
1831For details, refer to the \l QQuick3DGeometry and \l QQuick3DTextureData documentation pages.
1832
1833\sa CustomMaterial, Effect, QQuick3DGeometry, QQuick3DTextureData, {Qt Quick 3D - Custom
1834Effect Example}, {Qt Quick 3D - Custom Shaders Example}, {Qt Quick 3D - Custom Materials
1835Example}, {Qt Quick 3D - Custom Geometry Example}, {Qt Quick 3D - Procedural Texture
1836Example}
1837
1838*/