
Chapter 14: Multipass & Deferred Rendering
Written by Marius Horga

So far, you've only run projects and playgrounds with a single render pass. In other words, you used one command encoder to submit all of your draw calls to the GPU.

For more complex apps, you may need multiple render passes so you can use the result of one pass in the next before presenting the final texture to the screen. You may even need to render off-screen entirely, keeping the result for later use.

With multiple render passes, you can render a scene with several lights and shadows, like this one:

Take a good look, because in this chapter you'll build that scene. Along the way, you'll learn about several key concepts, such as:

  • Shadow maps.
  • Multipass rendering.
  • Deferred rendering with a G-buffer.
  • The blit command encoder.

You'll start with shadows.

Shadow Maps

A shadow represents the absence of light on a surface. An object casts a shadow when it blocks light from reaching another surface or object. Having shadows in a project makes your scene look more realistic and gives it a sense of depth.

A shadow map is nothing more than a texture containing the scene's shadow information. When a light shines on an object, anything behind that object falls into its shadow.

Normally, you render a scene from the camera's position. To build a shadow map, however, you need to render the scene from the position of the light source: in this case, the sun.

The image on the left shows a render from the camera's position, with a directional light pointing at the scene. The image on the right shows a render from the position of that directional light.

The eye shows where the camera is located in the first image.

You'll do two render passes:

  • First pass: Using a separate view matrix that holds the sun's position, you render from the light's point of view. Because you're not interested in color at this stage, only in the depth of the objects the sun can see, you render only a depth texture in this pass. This is a grayscale texture whose gray values indicate depth: black is close to the light, and white is far away.

  • Second pass: You render using the camera as usual, but you compare each camera fragment with the corresponding fragment in the depth map. If a fragment's depth is lighter (farther away) than the depth map's value at that position, the fragment is in shadow. The light can "see" the blue X in the image above, so it isn't in shadow.

Shadows and deferred rendering are complex subjects, so this chapter has a starter project. Open it in Xcode and look around.

The code is similar to what was available at the end of Chapter 5, "Lighting Fundamentals."

For simplicity, you'll work with diffuse color only; the project doesn't provide specular or ambient lighting.

Build and run the project, and you'll see a train and a tree model, both sitting on top of a plane:

In Renderer.swift, add these properties at the top of Renderer:

var shadowTexture: MTLTexture!
let shadowRenderPassDescriptor = MTLRenderPassDescriptor()

Later, when you create the render command encoder for drawing the shadow, you'll use this render pass descriptor. Each render pass descriptor can have up to eight color textures attached to it, plus a depth texture and a stencil texture. You'll attach shadowTexture to shadowRenderPassDescriptor as its depth attachment.
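
To illustrate that structure, here's a quick sketch; the texture names are placeholders, and this isn't code you need to add to the project:

let descriptor = MTLRenderPassDescriptor()
descriptor.colorAttachments[0].texture = someColorTexture   // slots 0 through 7
descriptor.depthAttachment.texture = someDepthTexture
descriptor.stencilAttachment.texture = someStencilTexture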

You'll need several textures throughout this chapter, so create a helper method for building them.

Add this new method to Renderer:

func buildTexture(pixelFormat: MTLPixelFormat, 
                  size: CGSize, 
                  label: String) -> MTLTexture {
  let descriptor = MTLTextureDescriptor.texture2DDescriptor(
                              pixelFormat: pixelFormat,
                              width: Int(size.width),
                              height: Int(size.height),
                              mipmapped: false)
  descriptor.usage = [.shaderRead, .renderTarget]
  descriptor.storageMode = .private
  guard let texture = 
    Renderer.device.makeTexture(descriptor: descriptor) else {
    fatalError() 
  }
  texture.label = "\(label) texture"
  return texture
}

In this method, you configure a texture descriptor and create a texture with it. Textures used by a render pass descriptor must be configured as render targets. A render target is a memory buffer or texture that allows off-screen rendering, for cases where the rendered pixels don't need to end up in the framebuffer. The storage mode is private, which means the texture is stored in memory that only the GPU can access.

Next, add the following to the bottom of the file:

private extension MTLRenderPassDescriptor {
  func setUpDepthAttachment(texture: MTLTexture) {
    depthAttachment.texture = texture
    depthAttachment.loadAction = .clear
    depthAttachment.storeAction = .store
    depthAttachment.clearDepth = 1
  }
}

This creates a new extension on MTLRenderPassDescriptor with a new method that sets up the depth attachment of a render pass descriptor and configures it to store the provided texture. This is where you’ll attach shadowTexture to shadowRenderPassDescriptor.

You're creating a separate method because you'll have other render pass descriptors later in the chapter. The load and store actions describe what the attachment should do at the start and end of the render pass. In this case, you clear the texture at the start of the pass and store it at the end so later passes can read it.
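
For comparison, if nothing later in the frame needed the texture's contents, you could discard them instead of storing them. This is just a hypothetical variation, not something to add here:

depthAttachment.loadAction = .dontCare    // previous contents don't matter
depthAttachment.storeAction = .dontCare   // contents may be discarded after the pass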

Now, add the following to the Renderer class:

func buildShadowTexture(size: CGSize) {
  shadowTexture = buildTexture(pixelFormat: .depth32Float,
                               size: size, label: "Shadow")
  shadowRenderPassDescriptor.setUpDepthAttachment(
                               texture: shadowTexture)
}

This builds the depth texture by calling the two helper methods you just created. Next, call this method at the end of init(metalView:):

buildShadowTexture(size: metalView.drawableSize)

Also, call it at the end of mtkView(_:drawableSizeWillChange:) so that when the user resizes the window, you can rebuild the textures with the correct size:

buildShadowTexture(size: size)

Build and run the project to make sure everything works. You won't see any visual changes yet; you're simply verifying that everything runs without errors before moving on to the next task.

Multipass Rendering

A render pass consists of sending commands to a command encoder; the pass ends when you end encoding on that command encoder. Multipass rendering uses multiple command encoders and lets you render content in one pass and use that pass's output as the input of the next render pass.
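
Here's a minimal sketch of that idea with two encoders sharing one command buffer. The names commandQueue, offscreenDescriptor, offscreenTexture and finalDescriptor are placeholders, not names from the project; the off-screen descriptor is assumed to store its color attachment into offscreenTexture:

guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }

// Pass 1: render into the off-screen texture.
guard let firstEncoder = commandBuffer.makeRenderCommandEncoder(
  descriptor: offscreenDescriptor) else { return }
// ... set pipeline state and issue draw calls ...
firstEncoder.endEncoding()   // the first pass ends here

// Pass 2: use the first pass's output as an input texture.
guard let secondEncoder = commandBuffer.makeRenderCommandEncoder(
  descriptor: finalDescriptor) else { return }
secondEncoder.setFragmentTexture(offscreenTexture, index: 0)
// ... issue draw calls that sample offscreenTexture ...
secondEncoder.endEncoding()

commandBuffer.commit()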

The Shadow Pass

During the shadow pass, you'll render the scene from the sun's point of view, so you need a new view matrix. You'll also need a matrix that the main pass can use to find each fragment's position in the light's clip space. In Common.h, add it to the Uniforms struct:

matrix_float4x4 shadowMatrix;

Back in Renderer.swift, add a property for the shadow pipeline state, along with a method that encodes the shadow pass:

var shadowPipelineState: MTLRenderPipelineState!
func renderShadowPass(renderEncoder: MTLRenderCommandEncoder) {
  renderEncoder.pushDebugGroup("Shadow pass")
  renderEncoder.label = "Shadow encoder"
  renderEncoder.setCullMode(.none)
  renderEncoder.setDepthStencilState(depthStencilState)
  // 1
  renderEncoder.setDepthBias(0.01, slopeScale: 1.0, clamp: 0.01)
  // 2
  uniforms.projectionMatrix = float4x4(orthoLeft: -8, right: 8, 
                                       bottom: -8, top: 8, 
                                       near: 0.1, far: 16)
  let position: float3 = [sunlight.position.x, 
                          sunlight.position.y,
                          sunlight.position.z]
  let center: float3 = [0, 0, 0]
  let lookAt = float4x4(eye: position, center: center, 
                        up: [0,1,0])
  uniforms.viewMatrix = lookAt
  uniforms.shadowMatrix = 
       uniforms.projectionMatrix * uniforms.viewMatrix
  
  renderEncoder.setRenderPipelineState(shadowPipelineState)
  for model in models {
    draw(renderEncoder: renderEncoder, model: model)
  }
  renderEncoder.endEncoding()
  renderEncoder.popDebugGroup()
}

Going through the numbered comments: (1) applies a depth bias, which helps avoid self-shadowing artifacts (shadow acne) when the depths are compared later; (2) replaces the camera's matrices with an orthographic projection and a view matrix that looks from the sun's position toward the center of the scene, and stores their product in shadowMatrix so the main pass can reuse it.

When rendering a frame, before encoding the main pass, create a render command encoder from the shadow render pass descriptor and encode the shadow pass:

guard let shadowEncoder = commandBuffer.makeRenderCommandEncoder(
                descriptor: shadowRenderPassDescriptor) else {
  return
}
renderShadowPass(renderEncoder: shadowEncoder)

The shadow pass only writes depth, so its pipeline state needs no fragment function and no color attachment. Add this method to Renderer:

func buildShadowPipelineState() {
  let pipelineDescriptor = MTLRenderPipelineDescriptor()
  pipelineDescriptor.vertexFunction = 
       Renderer.library.makeFunction(name: "vertex_depth")
  pipelineDescriptor.fragmentFunction = nil
  pipelineDescriptor.colorAttachments[0].pixelFormat = .invalid
  pipelineDescriptor.vertexDescriptor =
      MTKMetalVertexDescriptorFromModelIO(
                Model.defaultVertexDescriptor)
  pipelineDescriptor.depthAttachmentPixelFormat = .depth32Float
  do {
    shadowPipelineState = 
       try Renderer.device.makeRenderPipelineState(
                     descriptor: pipelineDescriptor)
  } catch let error {
    fatalError(error.localizedDescription)
  }
}

Call it in init(metalView:):

buildShadowPipelineState()

Now add the vertex function that the shadow pipeline uses, in a new Metal shader file:

#import "../Utility/Common.h"

struct VertexIn {
  float4 position [[ attribute(0) ]];
};

vertex float4 
       vertex_depth(const VertexIn vertexIn [[ stage_in ]],
                    constant Uniforms &uniforms [[buffer(1)]]) {
  matrix_float4x4 mvp = 
        uniforms.projectionMatrix * uniforms.viewMatrix
        * uniforms.modelMatrix;
  float4 position = mvp * vertexIn.position;
  return position;
}

The shadow pass replaced the projection matrix with the orthographic one, so restore the camera's projection matrix before encoding the main pass:

uniforms.projectionMatrix = camera.projectionMatrix

The Main Pass

Now that you've saved the shadow map to a texture, all you need to do is send it to the next pass, the main pass, so you can use the texture in the fragment function as part of the lighting calculation.

In the main render pass, bind the shadow texture so the fragment function can sample it:

renderEncoder.setFragmentTexture(shadowTexture, index: 0)

In the main shader file, add a new member to the VertexOut struct:

float4 shadowPosition;

Then, in vertex_main, compute each vertex's position as the light sees it:

out.shadowPosition = 
     uniforms.shadowMatrix * uniforms.modelMatrix 
     * vertexIn.position;

Add the shadow map as a new parameter to fragment_main:

depth2d<float> shadowTexture [[texture(0)]]

Then, in the body of fragment_main, compare the fragment's depth with the value stored in the shadow map:

// 1
float2 xy = in.shadowPosition.xy;
xy = xy * 0.5 + 0.5;
xy.y = 1 - xy.y;
// 2
constexpr sampler s(coord::normalized, filter::linear,
                    address::clamp_to_edge, 
                    compare_func:: less);
float shadow_sample = shadowTexture.sample(s, xy);
float current_sample = 
     in.shadowPosition.z / in.shadowPosition.w;
// 3
if (current_sample > shadow_sample ) {
  diffuseColor *= 0.5;
}

Section 1 converts the shadow position from clip space to texture coordinates and flips the y axis. Section 2 samples the shadow map at that location and computes the current fragment's depth from the light's point of view. Section 3 darkens the fragment's diffuse color whenever its depth is greater (farther from the light) than the stored depth, which means something else is blocking the light.

Deferred Rendering

So far in this chapter, you've only used forward rendering. But suppose you had a hundred models (or instances) and a hundred lights in the scene. Picture a metropolitan downtown, where the number of buildings and street lights easily reaches those counts. With forward rendering, the fragment shader has to evaluate every light for every fragment it shades, including fragments that are later overdrawn, which quickly becomes prohibitively expensive. Deferred rendering tackles this by first writing the scene's geometry information, such as albedo, normals and positions, into a set of textures called the G-buffer, and then computing lighting in a separate pass using only what's actually visible on screen.
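
To get a rough feel for the numbers, here's a back-of-the-envelope comparison. The resolution, overdraw factor and light count below are made-up illustrative values, not measurements from the project:

// Rough, illustrative cost comparison (not project code).
let screenPixels = 1_920 * 1_080   // 2,073,600 visible fragments
let overdraw = 3                   // assume each pixel gets shaded ~3 times in forward rendering
let lightCount = 100

let forwardWork = screenPixels * overdraw * lightCount   // ~622 million light evaluations
let deferredWork = screenPixels * lightCount             // ~207 million: only visible fragments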

The G-buffer Pass

All right, time to build that G-buffer up! First, create four new textures. Add this code at the top of Renderer:

var albedoTexture: MTLTexture!
var normalTexture: MTLTexture!
var positionTexture: MTLTexture!  
var depthTexture: MTLTexture!
var gBufferPipelineState: MTLRenderPipelineState!
var gBufferRenderPassDescriptor: MTLRenderPassDescriptor!
func buildGbufferTextures(size: CGSize) {
  albedoTexture = buildTexture(pixelFormat: .bgra8Unorm, 
                          size: size, label: "Albedo texture")
  normalTexture = buildTexture(pixelFormat: .rgba16Float, 
                          size: size, label: "Normal texture")
  positionTexture = buildTexture(pixelFormat: .rgba16Float, 
                          size: size, label: "Position texture")
  depthTexture = buildTexture(pixelFormat: .depth32Float, 
                          size: size, label: "Depth texture")
}

Next, add a method for color attachments to the MTLRenderPassDescriptor extension at the bottom of the file:

func setUpColorAttachment(position: Int, texture: MTLTexture) {
  let attachment: MTLRenderPassColorAttachmentDescriptor = 
    colorAttachments[position]
  attachment.texture = texture
  attachment.loadAction = .clear
  attachment.storeAction = .store
  attachment.clearColor = MTLClearColorMake(0.73, 0.92, 1, 1)
}

Back in Renderer, add a method that builds the G-buffer render pass descriptor from those textures:

func buildGBufferRenderPassDescriptor(size: CGSize) {
  gBufferRenderPassDescriptor = MTLRenderPassDescriptor()
  buildGbufferTextures(size: size)
  let textures: [MTLTexture] = [albedoTexture, 
                                normalTexture, 
                                positionTexture]
  for (position, texture) in textures.enumerated() {
    gBufferRenderPassDescriptor.setUpColorAttachment(
          position: position, texture: texture)
  }
  gBufferRenderPassDescriptor.setUpDepthAttachment(
          texture: depthTexture)
}

Call it wherever you call buildShadowTexture(size:), such as in mtkView(_:drawableSizeWillChange:), so the G-buffer is rebuilt whenever the drawable size changes:

buildGBufferRenderPassDescriptor(size: size)

Like the shadow pass, the G-buffer pass needs its own pipeline state, with one color attachment per G-buffer texture. Add this method:

func buildGbufferPipelineState() {
  let descriptor = MTLRenderPipelineDescriptor()
  descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
  descriptor.colorAttachments[1].pixelFormat = .rgba16Float
  descriptor.colorAttachments[2].pixelFormat = .rgba16Float
  descriptor.depthAttachmentPixelFormat = .depth32Float
  descriptor.label = "GBuffer state"

  descriptor.vertexFunction = 
    Renderer.library.makeFunction(name: "vertex_main")
  descriptor.fragmentFunction = 
    Renderer.library.makeFunction(name: "gBufferFragment")
  descriptor.vertexDescriptor = 
    MTKMetalVertexDescriptorFromModelIO(
               Model.defaultVertexDescriptor)
  do {
    gBufferPipelineState = try
      Renderer.device.makeRenderPipelineState(
        descriptor: descriptor)
  } catch let error {
    fatalError(error.localizedDescription)
  }
}

Call it in init(metalView:):

buildGbufferPipelineState()

Now add the method that encodes the G-buffer pass. It binds the shadow texture so the fragment function can bake the shadow information into the G-buffer:

func renderGbufferPass(renderEncoder: MTLRenderCommandEncoder) {
  renderEncoder.pushDebugGroup("Gbuffer pass")
  renderEncoder.label = "Gbuffer encoder"

  renderEncoder.setRenderPipelineState(gBufferPipelineState)
  renderEncoder.setDepthStencilState(depthStencilState)

  uniforms.viewMatrix = camera.viewMatrix
  uniforms.projectionMatrix = camera.projectionMatrix
  fragmentUniforms.cameraPosition = camera.position
  renderEncoder.setFragmentTexture(shadowTexture, index: 0)
  renderEncoder.setFragmentBytes(&fragmentUniforms, 
                  length: MemoryLayout<FragmentUniforms>.stride, 
                  index: 3)
  for model in models {
    draw(renderEncoder: renderEncoder, model: model)
  }
  renderEncoder.endEncoding()
  renderEncoder.popDebugGroup()
}

When rendering a frame, after the shadow pass, create the G-buffer encoder and encode the pass:

guard let gBufferEncoder = commandBuffer.makeRenderCommandEncoder(
                        descriptor: gBufferRenderPassDescriptor) else {
  return
}
renderGbufferPass(renderEncoder: gBufferEncoder)

Finally, add the shader code for the G-buffer pass in a new Metal file. GbufferOut has one member for each color attachment:

#import "../Utility/Common.h"

struct VertexOut {
  float4 position [[position]];
  float3 worldPosition;
  float3 worldNormal;
  float4 shadowPosition;
};

struct GbufferOut {
  float4 albedo [[color(0)]];
  float4 normal [[color(1)]];
  float4 position [[color(2)]];
};
fragment GbufferOut gBufferFragment(VertexOut in [[stage_in]],
             depth2d<float> shadow_texture [[texture(0)]],
             constant Material &material [[buffer(1)]]) {
  GbufferOut out;
  // 1
  out.albedo = float4(material.baseColor, 1.0);
  out.albedo.a = 0;
  out.normal = float4(normalize(in.worldNormal), 1.0);
  out.position = float4(in.worldPosition, 1.0);
  // 2
  // copy from fragment_main
  float2 xy = in.shadowPosition.xy;
  xy = xy * 0.5 + 0.5;
  xy.y = 1 - xy.y;
  constexpr sampler s(coord::normalized, filter::linear, 
                      address::clamp_to_edge, 
                      compare_func:: less);
  float shadow_sample = shadow_texture.sample(s, xy);
  float current_sample = 
         in.shadowPosition.z / in.shadowPosition.w;

  // 3
  if (current_sample > shadow_sample ) {
    out.albedo.a = 1;
  }
  return out;
}

The fragment function writes the material's base color, the normal and the world position into the G-buffer (1). Instead of darkening the color right away, it repeats the shadow test from fragment_main (2) and stores the result in the albedo texture's alpha channel (3), so the lighting pass can apply the shadow later.

The Blit Command Encoder

Blitting means copying from one part of memory to another. You use a blit command encoder on resources such as textures and buffers. It's commonly used for image processing, but you can (and will) also use it to copy image data that was rendered off-screen.

To check that the G-buffer pass works, you can copy the albedo texture straight to the drawable for now. When rendering a frame, after the G-buffer pass, add:

guard let blitEncoder = commandBuffer.makeBlitCommandEncoder() else {
  return
}
blitEncoder.pushDebugGroup("Blit")
blitEncoder.label = "Blit encoder"
let origin = MTLOriginMake(0, 0, 0)
let size = MTLSizeMake(Int(view.drawableSize.width), Int(view.drawableSize.height), 1)
blitEncoder.copy(from: albedoTexture, sourceSlice: 0, 
                 sourceLevel: 0, 
                 sourceOrigin: origin, sourceSize: size, 
                 to: drawable.texture, destinationSlice: 0, 
                 destinationLevel: 0, destinationOrigin: origin)
blitEncoder.endEncoding()
blitEncoder.popDebugGroup()

Copying into the drawable's texture is only allowed when the view doesn't treat it as framebuffer-only, so also set this when configuring the view:

metalView.framebufferOnly = false

The Lighting Pass

So far, you've rendered the scene's color attachments into multiple render targets, saving them for later use in a fragment shader. This ensures only visible fragments get processed, which cuts down the amount of computation that would otherwise be spent on all the geometry of every model in the scene. In this final pass, called the composition pass in the code, you'll draw a full-screen quad and compute the lighting once per pixel by sampling the G-buffer textures.

Add these properties and the quad data to Renderer:

var compositionPipelineState: MTLRenderPipelineState!

var quadVerticesBuffer: MTLBuffer!
var quadTexCoordsBuffer: MTLBuffer!

let quadVertices: [Float] = [
  -1.0,  1.0,
   1.0, -1.0,
  -1.0, -1.0,
  -1.0,  1.0,
   1.0,  1.0,
   1.0, -1.0
]

let quadTexCoords: [Float] = [
  0.0, 0.0,
  1.0, 1.0,
  0.0, 1.0,
  0.0, 0.0,
  1.0, 0.0,
  1.0, 1.0
]

Then create the two vertex buffers, for example in init(metalView:):

quadVerticesBuffer = 
    Renderer.device.makeBuffer(bytes: quadVertices, 
      length: MemoryLayout<Float>.size * quadVertices.count, 
      options: [])
quadVerticesBuffer.label = "Quad vertices"
quadTexCoordsBuffer = 
    Renderer.device.makeBuffer(bytes: quadTexCoords, 
      length: MemoryLayout<Float>.size * quadTexCoords.count, 
      options: [])
quadTexCoordsBuffer.label = "Quad texCoords"

Now add the method that encodes the composition pass:

func renderCompositionPass(
             renderEncoder: MTLRenderCommandEncoder) {
  renderEncoder.pushDebugGroup("Composition pass")
  renderEncoder.label = "Composition encoder"
  renderEncoder.setRenderPipelineState(compositionPipelineState)
  renderEncoder.setDepthStencilState(depthStencilState)
  // 1
  renderEncoder.setVertexBuffer(quadVerticesBuffer, 
                                offset: 0, index: 0)
  renderEncoder.setVertexBuffer(quadTexCoordsBuffer, 
                                offset: 0, index: 1)
  // 2
  renderEncoder.setFragmentTexture(albedoTexture, index: 0)
  renderEncoder.setFragmentTexture(normalTexture, index: 1)
  renderEncoder.setFragmentTexture(positionTexture, index: 2)
  renderEncoder.setFragmentBytes(&lights, 
    length: MemoryLayout<Light>.stride * lights.count, 
    index: 2)
  renderEncoder.setFragmentBytes(&fragmentUniforms, 
    length: MemoryLayout<FragmentUniforms>.stride, 
    index: 3)
  // 3
  renderEncoder.drawPrimitives(type: .triangle, 
                               vertexStart: 0, 
                               vertexCount: quadVertices.count / 2) // 2 floats per vertex
  renderEncoder.endEncoding()
  renderEncoder.popDebugGroup()
}

When rendering a frame, encode the composition pass last, using the view's current render pass descriptor (descriptor here) so the result ends up on screen:

guard let compositionEncoder = 
    commandBuffer.makeRenderCommandEncoder(
                        descriptor: descriptor) else {
  return
}
renderCompositionPass(renderEncoder: compositionEncoder)

The composition pass needs its own pipeline state as well. Add this method:

func buildCompositionPipelineState() {
  let descriptor = MTLRenderPipelineDescriptor()
  descriptor.colorAttachments[0].pixelFormat = 
      Renderer.colorPixelFormat
  descriptor.depthAttachmentPixelFormat = .depth32Float
  descriptor.label = "Composition state"
  descriptor.vertexFunction = Renderer.library.makeFunction(
    name: "compositionVert")
  descriptor.fragmentFunction = Renderer.library.makeFunction(
    name: "compositionFrag")
  do {
    compositionPipelineState = 
      try Renderer.device.makeRenderPipelineState(
          descriptor: descriptor)
  } catch let error {
    fatalError(error.localizedDescription)
  }
}

Call it in init(metalView:):

buildCompositionPipelineState()

Finally, add the composition shaders in a new Metal file. The vertex function passes the full-screen quad through; the fragment function samples the G-buffer and computes the lighting:

#import "../Utility/Common.h"

struct VertexOut {
  float4 position [[position]];
  float2 texCoords;
};
vertex VertexOut compositionVert(
  constant float2 *quadVertices [[buffer(0)]],
  constant float2 *quadTexCoords [[buffer(1)]],
  uint id [[vertex_id]]) {
  VertexOut out;
  out.position = float4(quadVertices[id], 0.0, 1.0);
  out.texCoords = quadTexCoords[id];
  return out;
}
fragment float4 compositionFrag(VertexOut in [[stage_in]],
     constant FragmentUniforms &fragmentUniforms [[buffer(3)]],
     constant Light *lightsBuffer [[buffer(2)]],
     texture2d<float> albedoTexture [[texture(0)]],
     texture2d<float> normalTexture [[texture(1)]],
     texture2d<float> positionTexture [[texture(2)]],
     depth2d<float> shadowTexture [[texture(4)]]) {
  // 1
  constexpr sampler s(min_filter::linear, mag_filter::linear);
  float4 albedo = albedoTexture.sample(s, in.texCoords);
  float3 normal = normalTexture.sample(s, in.texCoords).xyz;
  float3 position = positionTexture.sample(s, in.texCoords).xyz;
  float3 baseColor = albedo.rgb;
  // 2
  float3 diffuseColor = compositeLighting(normal, position, 
                          fragmentUniforms, 
                          lightsBuffer, baseColor);
  // 3
  float shadow = albedo.a;
  if (shadow > 0) {
    diffuseColor *= 0.5;
  }
  return float4(diffuseColor, 1);
}

Here, the fragment function samples the albedo, normal and position textures (1), computes the diffuse lighting from all the lights with compositeLighting (2), and then reads the shadow flag from the albedo's alpha channel and darkens fragments that are in shadow (3).

Time to give the deferred renderer something to do. Where the scene is set up, add the sunlight and a batch of point lights to the lights array, and rotate the first model a little on each frame:

lights.append(sunlight)
createPointLights(count: 30, min: [-3, 0.3, -3], max: [1, 2, 2])
models[0].rotation.y += 0.01

setFragmentBytes(_:length:index:) is intended for small amounts of data (less than 4KB), which won't fit hundreds of lights, so store the lights in an MTLBuffer instead. Add a property to Renderer and create the buffer once the lights are set up:

var lightsBuffer: MTLBuffer!
lightsBuffer = Renderer.device.makeBuffer(bytes: lights, 
  length: MemoryLayout<Light>.stride * lights.count, 
  options: [])

Then, in renderCompositionPass(renderEncoder:), replace:

renderEncoder.setFragmentBytes(&lights,
  length: MemoryLayout<Light>.stride * lights.count,
  index: 2)

With:

renderEncoder.setFragmentBuffer(lightsBuffer, 
                                offset: 0, index: 2)

Now you can afford far more lights. Bump up the point light count:

createPointLights(count: 300, min: [-10, 0.3, -10], 
                  max: [10, 2, 20])

Where to Go From Here?

If you're trying to improve your app's performance, there are a few approaches you can try. One is to render the lights as light volumes and use the stencil test to select only the lights that affect each fragment, so you render just those lights instead of all of them.
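
As a starting point, here's a minimal sketch of the first step: estimating how far a point light's influence reaches so fragments outside that volume can be skipped. The attenuation layout (constant, linear, quadratic) and the brightness cutoff are assumptions for illustration, not values from the project:

// Estimate a point light's bounding-sphere radius from its attenuation.
// Contributions below `threshold` are treated as visually negligible.
func lightVolumeRadius(attenuation: SIMD3<Float>,
                       maxIntensity: Float,
                       threshold: Float = 1.0 / 256.0) -> Float {
  let (c, l, q) = (attenuation.x, attenuation.y, attenuation.z)
  // Solve q*d^2 + l*d + (c - maxIntensity / threshold) = 0 for the distance d.
  let k = c - maxIntensity / threshold
  if q == 0 {
    return l > 0 ? -k / l : Float.greatestFiniteMagnitude
  }
  let discriminant = l * l - 4 * q * k
  guard discriminant > 0 else { return 0 }
  return (-l + discriminant.squareRoot()) / (2 * q)
}

You could then draw a sphere of that radius for each light, and use the stencil test so the expensive lighting shader only runs on the fragments the sphere covers.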
