Tags:
- Graphics
- TechnicalArt
- ComputerGraphics
- GraphicsAPI
- GameEngine
This note is mainly about using the Metal Shading Language (MSL) in SwiftUI to write shader-based materials for views.
Shaders in SwiftUI
Since iOS 17, SwiftUI supports applying three kinds of visual effects to a View, known as colorEffect, layerEffect, and distortionEffect. Each of these effects applies a 2D, plane-based Metal shader to the current view.
However, even in the official documentation [1], the explanations for the Shaders that can be used in SwiftUI are pitifully sparse, and some limitations are not clearly stated. This note aims to lay out some key details to help more people develop Metal Shaders in SwiftUI.
In addition, all the Shaders and the code repository mentioned in this article have been open-sourced. Interested readers can check the GitHub repository themselves: https://github.com/Lockbrains/SwiftUI-2D-Shader-Assets.
Usage
Calling from the SwiftUI Side
In SwiftUI, you can apply one (or several at once) of the .colorEffect, .layerEffect, or .distortionEffect modifiers directly to a view, for example:
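A minimal sketch of what this can look like; the shader names below are placeholders for `[[ stitchable ]]` MSL functions you would define in your own .metal file:

```swift
import SwiftUI

struct ShaderDemoView: View {
    var body: some View {
        Rectangle()
            .fill(.blue)
            .frame(width: 300, height: 300)
            // `myColorShader` is an illustrative name, not a built-in.
            .colorEffect(ShaderLibrary.myColorShader())
            // Distortion (like layer) effects also require a maxSampleOffset.
            .distortionEffect(ShaderLibrary.myDistortion(.float(4)),
                              maxSampleOffset: CGSize(width: 4, height: 4))
    }
}
```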
The three Shader Effects here correspond to the following situations respectively:
- .colorEffect: If the shader only needs the color of the current pixel, use Color Effect. You can think of .colorEffect as a fragment shader.
- .layerEffect: If the shader needs more than the current pixel's color, layerEffect hands us the entire layer of the modified view, so we can implement context-dependent effects such as Gaussian blur.
- .distortionEffect: If the shader displaces positions (warping the content), use Distortion Effect. You can loosely think of .distortionEffect as a vertex shader.
In the code above, the content inside colorEffect() is an example Shader I defined in another file (for readability), which has the following form:
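The real definition lives in the linked repository; as a sketch of the outer layer, it might look like this (the file name, parameter names, and extension style are my assumptions, not the repository's actual code):

```swift
// Shaders.swift (hypothetical file)
import SwiftUI

extension Shader {
    /// Outer layer of encapsulation: a friendly, documented entry point
    /// that third parties can call without knowing the MSL argument order.
    static func dissolve(progress: Float, size: CGSize) -> Shader {
        dissolveEffect(progress: progress, size: size)
    }
}
```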
Among them, the dissolveEffect function has the following form:
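A sketch of that inner function, assuming it takes a progress value and the view's logical size (the actual parameter list in the repository may differ):

```swift
import SwiftUI

/// Inner layer of encapsulation: maps Swift values onto the MSL
/// function's argument list, in the exact order MSL declares them.
func dissolveEffect(progress: Float, size: CGSize) -> Shader {
    ShaderLibrary.dissolveEffect(
        .float2(size.width, size.height), // view size in logical points
        .float(progress)                  // dissolve progress, 0...1
    )
}
```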
Two layers of encapsulation are provided here, again for code readability and to make it more intuitive for third parties when using the Shader. I will detail the reasoning behind this in later sections. But if you need to write a temporary Shader individually, you can just use ShaderLibrary.shaderName() directly inside .colorEffect, and then pass the parameters to the Shader using the methods mentioned in the Data Passing section below.
MSL Syntax
Each Effect needs to use a compliant MSL function to provide the Shader content. Functions for different Effects must adhere to different function signatures.
Color Effect
For Color Effect, the required signature is as follows:
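Per Apple's documentation, a color effect shader must be a stitchable MSL function of this shape:

```metal
// `name` is whatever you call from ShaderLibrary.name(...);
// `args...` stands for any additional parameters passed from SwiftUI.
[[ stitchable ]] half4 name(float2 position, half4 color, args...);
```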
Here, regarding the 0th parameter float2 position, the official statement is that position is the pixel coordinate in user space. Before developing SwiftUI shaders, it is essential to understand the meaning of user space coordinates. If you are unclear, you can refer to the User Space section below.
In addition, a half4 color must also be provided at the 1st parameter position, which is the current color of our view at that logical position.
However, the word "provided" is not entirely accurate. position and color only need to appear in the MSL signature; we do not pass them from SwiftUI's ShaderLibrary.shaderName, as they are filled in automatically. The remaining parameters, though, correspond one-to-one and in order to the arguments passed from SwiftUI. For example, in the dissolveEffect I wrote, the MSL signature is as follows:
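The extra parameters below (size and progress) are a sketch; the repository's actual signature may differ:

```metal
[[ stitchable ]] half4 dissolveEffect(float2 position, // supplied automatically
                                      half4 color,     // supplied automatically
                                      float2 size,     // 1st argument from SwiftUI
                                      float progress)  // 2nd argument from SwiftUI
```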
Correspondingly, the order used when calling this Shader in SwiftUI is:
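Assuming the sketched signature above (size, then progress), the call site only supplies the parameters after position and color, in the same order:

```swift
// position and color are NOT passed here; SwiftUI injects them.
ShaderLibrary.dissolveEffect(
    .float2(size.width, size.height), // 1st extra MSL parameter: size
    .float(progress)                  // 2nd extra MSL parameter: progress
)
```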
Note their one-to-one correspondence in order. In the SwiftUI API, we cannot see the name of each parameter, which indeed increases the difficulty of debugging when developing multi-input shaders, so I strongly recommend remembering to add sufficient comments when calling.
Layer Effect
For Layer Effect, the required signature is as follows:
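Per Apple's documentation, a layer effect shader receives the whole layer instead of a single color; note that the SwiftUI::Layer type requires including the SwiftUI Metal header:

```metal
#include <SwiftUI/SwiftUI_Metal.h>

[[ stitchable ]] half4 name(float2 position, SwiftUI::Layer layer, args...);
```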
In MSL, we get the color of the current view at position with layer.sample(position). And because we have the global information of the entire layer, we can just as easily sample other positions related to the current one with layer.sample(f(position)), which makes operations like blur possible.
For example, in this simple Gaussian blur, we can perform Gaussian blur by sampling a total of 9 logical points around the position.
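A minimal sketch of such a blur, assuming a radius parameter is passed from SwiftUI (the repository's version may differ):

```metal
#include <metal_stdlib>
#include <SwiftUI/SwiftUI_Metal.h>
using namespace metal;

[[ stitchable ]] half4 gaussianBlur(float2 position,
                                    SwiftUI::Layer layer,
                                    float radius) {
    // 3x3 Gaussian kernel; weights sum to 16.
    const float weights[9] = { 1, 2, 1,
                               2, 4, 2,
                               1, 2, 1 };
    half4 sum = half4(0.0h);
    int i = 0;
    for (int dy = -1; dy <= 1; dy++) {
        for (int dx = -1; dx <= 1; dx++) {
            // Sample the 9 logical points around `position`.
            // `radius` must stay within the maxSampleOffset declared in SwiftUI.
            float2 offset = float2(dx, dy) * radius;
            sum += layer.sample(position + offset) * (weights[i++] / 16.0);
        }
    }
    return sum;
}
```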
In SwiftUI, the way to pass data is the same as Color Effect, so I won't repeat it. What needs to be noted is that we need to provide an additional maxSampleOffset:
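For instance (gaussianBlur and radius are illustrative names):

```swift
Rectangle()
    .layerEffect(
        ShaderLibrary.gaussianBlur(.float(radius)),
        maxSampleOffset: CGSize(width: 3, height: 3)
    )
```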
According to the official documentation, maxSampleOffset is the maximum distance, on each axis, between the pixel the shader reads and the pixel it is computing. For example, in the Gaussian blur above, we only sample the 9 logical points around position, so a width and height of 3 is comfortably sufficient. maxSampleOffset amounts to a promise to the system: although the shader may read pixels other than the current one, it will never read beyond this range. This is also how Apple optimizes Layer Effects.
User Space
User Space Coordinates (USC) are the coordinate system used by application logic, rather than the physical pixel coordinates of the device or screen. This coordinate space is device-independent and can be scaled, translated, or rotated to suit the application, which makes laying out and designing content more convenient.
In SwiftUI, we usually do not need to use physical pixel coordinates directly, but rather base things on logical coordinates. User coordinate space allows us to lay out graphical content in a device-independent way.
Example
Suppose we use a Shader on a 300 × 300 SwiftUI Rectangle component. The position parameter is then given in user-space coordinates; the position at the center, for example, would be (150, 150). If we scale the Rectangle up or down (with the .scaleEffect modifier), position values are still based on the 300 × 300 range and do not reflect physical pixel counts.
Note that 300 × 300 and (150, 150) here are in logical points, not pixels. Points in SwiftUI are logical units independent of screen resolution; on a high-resolution Retina screen, one logical point may correspond to multiple physical pixels.
Growth Direction
In SwiftUI, we usually consider the top-left corner to be (0,0) and the bottom-right corner to be the maximum value (for example, (300, 300) in the previous example). We can verify this with a simple Shader.
Use a simple Shader to return the simplest Gradient, and then apply .colorEffect to a view.
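A sketch of such a shader, here called trivialGradient, mapping the normalized position straight onto the red and green channels (the exact body in the repository may differ):

```metal
[[ stitchable ]] half4 trivialGradient(float2 position, half4 color, float2 size) {
    float2 uv = position / size;  // normalize to [0, 1]
    return half4(half(uv.x), half(uv.y), 0.0h, 1.0h);
}
```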
The size here should provide the logical size of the current view. This gradient indeed reflects the characteristic that the top-left corner is (0,0) and the bottom-right corner is the maximum value.
UV
With these basics understood, if we want to write shaders against normalized coordinates (the familiar uv coordinate system), I suggest adding a float2 size input to the MSL function. This size is the logical size of the current view, from which the normalized uv is easy to compute, just as was done in trivialGradient above:
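The normalization itself is a single line:

```metal
// position and size are both in logical points, so uv lies in [0, 1].
float2 uv = position / size;
```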
Then in SwiftUI, pass the logical size into MSL using .float2(x, y), and you can create effects based on normalized coordinates.
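One way to obtain that logical size is GeometryReader; a sketch, reusing the trivialGradient shader name from above:

```swift
GeometryReader { proxy in
    Rectangle()
        .colorEffect(
            ShaderLibrary.trivialGradient(
                .float2(proxy.size.width, proxy.size.height) // logical size
            )
        )
}
```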
Data Passing
For this part, Apple does provide documentation, but it genuinely takes a long time to find. In short, Metal and SwiftUI work with different data types. To give a simple example, float2 and half2 are everyday types in Metal, but Swift has no direct counterparts, only scalar types such as Float. Passing data therefore requires knowing which types exist on each side and which APIs bridge them.
MSL Side
For data supported on the MSL side, it is recommended to consult the Metal Shading Language Specification. However, most of the time we will only use simple float, floatn, half, halfn, and texture2d<half> types. In particular, when describing and recording colors, we should always use half4.
It should be noted that implicit conversion between halfn and floatn is not possible in MSL. Although you can use halfn * float, or floatn * half, you cannot multiply halfn with floatn.
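For example:

```metal
half4  h = half4(1.0h);
float4 f = float4(2.0f);

// float4 bad = h * f;        // error: no implicit halfn <-> floatn conversion
float4 ok1 = float4(h) * f;   // explicit construction converts the vector
half4  ok2 = h * 2.0f;        // vector * scalar is allowed
```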
SwiftUI Side
In SwiftUI, we need to pass data to Metal through the following interfaces:
- .color(Color): Input a SwiftUI Color type inside the parentheses, and this data will be translated into a half4 type. Note that in Metal, the half4 type is used by default to describe colors; you can just treat half4 as color4.
- .float(T): Input a SwiftUI type conforming to the BinaryFloatingPoint protocol inside the parentheses. This data will be translated into a float type.
- .float2(T, T): Input two SwiftUI types conforming to the BinaryFloatingPoint protocol inside the parentheses. This data will be translated into a float2 type. Similarly, there are .float3 and .float4, which will not be elaborated on.
- .image(Image): Input a SwiftUI Image type inside the parentheses, and this data will be translated into a texture2d<half> type. It is particularly important to note that currently, shaders in SwiftUI can have at most one Texture type input. If you write a Shader that needs to sample two textures, unfortunately, it will have no effect.
In addition to the above, there are also some ways to pass Arrays:
- .colorArray([Color]): Input a SwiftUI Color array inside the parentheses. This array will be translated into a pair, namely (device const half4 *, count).
- .floatArray([T]): Input a SwiftUI type array conforming to the BinaryFloatingPoint protocol inside the parentheses. This array will be translated into a pair, namely (device const float *, count).
- .data(Data): Input SwiftUI Data inside the parentheses. This data will be translated into a pair, namely (device const void *, size_in_bytes).
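Putting the argument types together, a call exercising several of them might look like this (the shader name and values are illustrative):

```swift
ShaderLibrary.someShader(
    .color(.red),                  // arrives in MSL as half4
    .float(0.5),                   // float
    .float2(300, 300),             // float2
    .image(Image("noise")),        // texture2d<half> (at most one per shader)
    .floatArray([0.1, 0.2, 0.3])   // device const float* plus a count
)
```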
How to Translate Shaders from Other Platforms
Shaders from ShaderToy
Shaders from Unity
References:
- Apple Developer Documentation (Shader), https://developer.apple.com/documentation/swiftui/shader
- Apple Developer Documentation (Shader.Argument), https://developer.apple.com/documentation/swiftui/shader/argument
- How to add Metal shaders to SwiftUI views using layer effects, https://www.hackingwithswift.com/quick-start/swiftui/how-to-add-metal-shaders-to-swiftui-views-using-layer-effects
- Using Metal Shader in SwiftUI, https://www.cnblogs.com/jerrywossion/p/18090457
