What is a Metal shader?
A Metal shader is a function written in the special Metal shading language that the Metal framework executes directly on the GPU hardware. It can either do rendering work or perform more general-purpose computations.
Rendering an arbitrary MTLTexture on screen
- Initialise an MTKView that we will use for rendering a texture.
- Implement shaders in the Metal shading language.
- Do the drawing!
In order to set up a graphics rendering process, you are going to need three important things: a pipeline state, a command encoder and a command buffer, with three protocols in the Metal framework defining the behaviour of each: MTLRenderPipelineState, MTLRenderCommandEncoder and MTLCommandBuffer.
A command buffer is responsible for accumulating and storing a sequence of encoded commands that are eventually committed for execution by the GPU. Once you have created a command buffer, you instantiate an encoder object, which is the interface you use to specify those commands and fill the buffer with them, while the pipeline state defines the shaders you are going to use in those commands.
So the rendering process (in a very simplified way) is as follows:
- Create a render pipeline state listing the graphics shader functions you intend to use.
- Create a command buffer that will eventually dispatch your commands for execution by the GPU.
- Create a command encoder that lets you specify the context for your shader functions, e.g. provide input and output for each.
- Commit the commands to schedule the buffer for execution.
public class MTKViewController: UIViewController {

    // MARK: - Public interface

    /// Metal texture to be drawn whenever the view controller is asked to render its view.
    public var texture: MTLTexture?

    // MARK: - Public overrides

    override public func loadView() {
        super.loadView()
        assert(device != nil, "Failed creating a default system Metal device. Please, make sure Metal is available on your hardware.")
    }

    // MARK: - Private Metal-related properties and methods

    /// `UIViewController`'s view
    private var metalView: MTKView!

    /// Metal device
    private var device = MTLCreateSystemDefaultDevice()

    /// Render pipeline state we initialise later and use when encoding render commands
    private var renderPipelineState: MTLRenderPipelineState?
}
Initialise MTKView
/**
 Initializes and configures the `MTKView` we use as `UIViewController`'s view.
 */
private func initializeMetalView() {
    metalView = MTKView(frame: view.bounds, device: device)
    metalView.delegate = self
    metalView.framebufferOnly = true
    metalView.colorPixelFormat = .BGRA8Unorm
    metalView.contentScaleFactor = UIScreen.mainScreen().scale
    metalView.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
    view.insertSubview(metalView, atIndex: 0)
}
Now add an initializeMetalView() call at the end of MTKViewController’s loadView().
Implement shaders
Now, there are three types of shaders: kernel, vertex and fragment functions.
Kernel shaders are arguably the most interesting ones: a kernel is the type of shader you would use for parallel computations, for instance multiplying large vectors and matrices, which is the heart and soul of neural networks and other machine learning algorithms.
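For illustration, here is a minimal sketch of a compute kernel that adds two vectors element-wise. The function and buffer names are hypothetical, and dispatching it would require a compute pipeline state and a compute command encoder rather than the render pipeline this article covers:

```metal
#include <metal_stdlib>
using namespace metal;

/// Adds two float vectors element-wise, one GPU thread per element.
kernel void addVectors(const device float *inA [[ buffer(0) ]],
                       const device float *inB [[ buffer(1) ]],
                       device float *outSum    [[ buffer(2) ]],
                       uint id                 [[ thread_position_in_grid ]]) {
    outSum[id] = inA[id] + inB[id];
}
```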
The shaders are then compiled into a MTLLibrary object, which is an interface for referencing all of your shaders. The default library simply compiles the .metal files you have in your main bundle (so beware of possible issues in case you decide to put shaders into a framework: you may need to use a customised library instead).
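If your shaders do live in a framework, a sketch along these lines should work (Swift 2 era API, assuming the compiled `default.metallib` sits in the framework's bundle and `device` is a non-nil MTLDevice):

```swift
/// Loads a Metal library from the framework containing `MTKViewController`
/// instead of the main bundle. `default.metallib` is where Xcode puts
/// the compiled `.metal` sources of a target.
let bundle = NSBundle(forClass: MTKViewController.self)
guard let
    path = bundle.pathForResource("default", ofType: "metallib"),
    library = try? device.newLibraryWithFile(path)
else { fatalError("Failed loading Metal library from the framework bundle.") }
```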
#include <metal_stdlib>
using namespace metal;
Defining a struct mapping a texture coordinate to a rendered coordinate:
typedef struct {
    float4 renderedCoordinate [[position]];
    float2 textureCoordinate;
} TextureMappingVertex;
Mind the [[position]] part: this is a so-called attribute qualifier, which labels a struct field with one of the predefined attributes. We use position here because we are going to use this TextureMappingVertex as the output of a vertex shader. Since a vertex shader should output a vertex position, and we are going to use a custom struct as the return value, you need to explicitly mark one of the struct fields with the position attribute, so that the command encoder knows which part of this struct defines the actual vertex position.
Define a vertex function mapTexture() and a fragment function displayTexture(). The vertex function will map the texture vertices to rendering coordinates, and the fragment function will simply return the color of each pixel.
vertex TextureMappingVertex mapTexture(unsigned int vertex_id [[ vertex_id ]]) {
    float4x4 renderedCoordinates = float4x4(float4( -1.0, -1.0, 0.0, 1.0 ), /// (x, y, depth, W)
                                            float4(  1.0, -1.0, 0.0, 1.0 ),
                                            float4( -1.0,  1.0, 0.0, 1.0 ),
                                            float4(  1.0,  1.0, 0.0, 1.0 ));

    float4x2 textureCoordinates = float4x2(float2( 0.0, 1.0 ), /// (x, y)
                                           float2( 1.0, 1.0 ),
                                           float2( 0.0, 0.0 ),
                                           float2( 1.0, 0.0 ));
    TextureMappingVertex outVertex;
    outVertex.renderedCoordinate = renderedCoordinates[vertex_id];
    outVertex.textureCoordinate = textureCoordinates[vertex_id];
    return outVertex;
}
The mapping is specified by renderedCoordinates (position in the coordinate space of the view) and textureCoordinates (position in the coordinate space of the texture). Each of the renderedCoordinates is a vertex defined by four values: x, y, depth and the W coordinate.
So renderedCoordinates has the coordinates of the screen edges in the following order:
left-bottom (-1.0, -1.0), right-bottom (1.0, -1.0), left-top (-1.0, 1.0), right-top (1.0, 1.0).
Now, textureCoordinates contains exactly the same points, but in the coordinate space of the texture, which is different from that of the view. The origin of the pixel coordinate system of a texture is its top left corner, which has the coordinates (0, 0). Basically it’s the same as the coordinate system in UIKit, with the x axis going left to right and y going top to bottom.
So textureCoordinates lists exactly the same edges of the texture in the same order as renderedCoordinates:
left-bottom (0.0, 1.0), right-bottom (1.0, 1.0), left-top (0.0, 0.0), right-top (1.0, 0.0).
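The relationship between the two coordinate spaces boils down to a simple formula: u = (x + 1) / 2 and v = 1 − (y + 1) / 2. A minimal sketch in plain Swift (the function name is just for illustration):

```swift
/// Converts a Metal clip-space coordinate (x, y in [-1, 1], y pointing up)
/// into a texture coordinate (u, v in [0, 1], v pointing down).
func textureCoordinate(forClipSpaceX x: Double, y: Double) -> (u: Double, v: Double) {
    return (u: (x + 1) / 2, v: 1 - (y + 1) / 2)
}

// Left-bottom of the view maps to (0, 1) in the texture, right-top to (1, 0):
let leftBottom = textureCoordinate(forClipSpaceX: -1.0, y: -1.0) // (u: 0.0, v: 1.0)
let rightTop = textureCoordinate(forClipSpaceX: 1.0, y: 1.0)     // (u: 1.0, v: 0.0)
```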
fragment half4 displayTexture(TextureMappingVertex mappingVertex [[ stage_in ]],
                              texture2d<float, access::sample> texture [[ texture(0) ]]) {
    constexpr sampler s(address::clamp_to_edge, filter::linear);
    return half4(texture.sample(s, mappingVertex.textureCoordinate));
}
This fragment function simply returns the color value for each pixel of the texture.
Do the drawing
extension MTKViewController: MTKViewDelegate {

    public func drawInMTKView(view: MTKView) {
        guard let
            texture = texture,
            device = device
        else { return }
        /// The rendering goes here.
    }
}
Create a render pipeline state
/**
 Initializes render pipeline state with a default vertex function mapping texture to the view's frame and a simple fragment function returning texture pixel's value.
 */
private func initializeRenderPipelineState() {
    guard let
        device = device,
        library = device.newDefaultLibrary()
    else { return }

    let pipelineDescriptor = MTLRenderPipelineDescriptor()
    pipelineDescriptor.sampleCount = 1
    pipelineDescriptor.colorAttachments[0].pixelFormat = .BGRA8Unorm
    pipelineDescriptor.depthAttachmentPixelFormat = .Invalid

    /// Vertex function to map the texture to the view controller's view
    pipelineDescriptor.vertexFunction = library.newFunctionWithName("mapTexture")
    /// Fragment function to display texture's pixels in the area bounded by vertices of `mapTexture` shader
    pipelineDescriptor.fragmentFunction = library.newFunctionWithName("displayTexture")

    do {
        try renderPipelineState = device.newRenderPipelineStateWithDescriptor(pipelineDescriptor)
    }
    catch {
        assertionFailure("Failed creating a render state pipeline. Can't render the texture without one.")
        return
    }
}
We have defined the vertex function and the fragment function, so the render pipeline state now has enough information to render the texture on screen: a vertex shader specifying the position, and a fragment shader specifying which color to draw each pixel in. Now, add a call to initializeRenderPipelineState() at the end of your loadView() method.
Create command buffer
We are going to do it in the MTKView drawing callback, so add this line at the end of the drawInMTKView(_: MTKView) function:
let commandBuffer = device.newCommandQueue().commandBuffer()
Note that we create the command queue inline here for brevity; in a real application you would create it once, store it in a property and reuse it across frames, since creating a command queue is expensive.
Create command encoder
MTKView is supposed to be a convenience wrapper for a bunch of Metal drawing routines, so we will use this fact to get a couple of other Metal-specific objects required to create a command encoder: a render pass descriptor and the current drawable object. Add this at the end of your drawInMTKView(_: MTKView) callback:
guard let
    currentRenderPassDescriptor = metalView.currentRenderPassDescriptor,
    currentDrawable = metalView.currentDrawable,
    renderPipelineState = renderPipelineState
else { return }
OK, we can now create the command encoder and specify the commands. We are going to wrap them in a debug group named “RenderFrame” (since that’s exactly what we’re doing here):
let encoder = commandBuffer.renderCommandEncoderWithDescriptor(currentRenderPassDescriptor)
encoder.pushDebugGroup("RenderFrame")
encoder.setRenderPipelineState(renderPipelineState)
encoder.setFragmentTexture(texture, atIndex: 0)
encoder.drawPrimitives(.TriangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
encoder.popDebugGroup()
encoder.endEncoding()
Here we have specified the input texture and the commands to draw a primitive with 4 vertices: a triangle strip of two triangles, which together cover the rectangular screen we are drawing the rectangular texture on.
Commit the commands
commandBuffer.presentDrawable(currentDrawable)
commandBuffer.commit()