Initial pain point
SpriteKit rather brilliantly integrates with CoreGraphics. This means sprites can be created procedurally from CGMutablePath sequences. The mobs in Escape Into Infinity (called "drones") are defined as follows.
let path = CGMutablePath()
path.move(to: .zero)
path.addLine(to: CGPoint(x: 15, y: 20))
path.addLine(to: CGPoint(x: 20, y: -20))
path.addLine(to: CGPoint(x: 5, y: 0))
path.addLine(to: CGPoint(x: 0, y: -20))
path.addLine(to: CGPoint(x: -5, y: 0))
path.addLine(to: CGPoint(x: -20, y: -20))
path.addLine(to: CGPoint(x: -15, y: 20))
path.addLine(to: .zero)

let shape = SKShapeNode(path: path, centered: true)
shape.lineWidth = 1
shape.strokeColor = .magenta
shape.lineJoin = .round
shape.glowWidth = Drone.glowWidth
A path of straight lines is specified, then passed as the path argument to SKShapeNode's initializer. A few additional properties style the shape node and add a glow effect, and bam, we've got our mob sprite. So, what's the problem here?
It turns out, SpriteKit creates a unique texture for every SKShapeNode instance declared this way. If you're at all familiar with lower-level GPU frameworks like OpenGL or Metal, you'll know that swapping out texture objects, along with other render state changes, tends to be costly. On 64-bit phones and tablets that supported Metal, this didn't drop things below the point of playability, but it was still noticeably sluggish. On 32-bit iOS devices, which only support OpenGL ES, the result was disastrous: generating a map with hundreds of mobs, each with its own texture, ate up enough CPU time on texture switches and state validation to make the game unplayable. Add other enemies, props, parallax star fields, and various particle effects, and what should have been a fairly simple game ended up a slideshow.
The solution
It turns out, SpriteKit knows how to batch sprite draw calls that explicitly use the same texture object. Even better, the SKView class has a method called texture(from:) which takes any old SKNode and renders it to a texture bitmap.
The initializer for the Drone enemy type (as well as other SKShapeNode-based sprites that needed to be repeated in large numbers) was modified to take the render target SKView as an argument and lazily generate a static texture property, so every instance's SKSpriteNode is created from that common texture. So, zooming out a bit from the above:
private static var texture: SKTexture! = nil

private static func getTexture(view: SKView) -> SKTexture {
    if texture != nil {
        return texture
    }
    // ... path and SKShapeNode code from above ...
    texture = view.texture(from: shape)
    return texture
}
Newly minted Drone instances call this method on setup and create an SKSpriteNode from the returned texture. The sentries (stationary turrets) and dormant ships use the same hack. With sentries, the pivot point actually sits toward the bottom center of the resulting sprite texture rather than at the center of the generated image, so the anchor point had to be set manually: inspect the generated texture's dimensions in the debugger, then convert to normalized coordinates on setup.
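Pulling those pieces together, the sentry case might look something like the following sketch. The class shape, the init(view:) signature, and the anchor value are assumptions for illustration; only the shared-texture idea and the bottom-center pivot come from the game itself.

```swift
import SpriteKit

// Sketch only: a repeated enemy type that renders its shape once and
// shares the resulting texture across every instance.
class Sentry: SKSpriteNode {
    private static var sharedTexture: SKTexture!

    private static func getTexture(view: SKView) -> SKTexture {
        if sharedTexture != nil {
            return sharedTexture
        }
        // Placeholder shape; the real game builds a CGMutablePath outline.
        let shape = SKShapeNode(circleOfRadius: 10)
        shape.strokeColor = .magenta
        sharedTexture = view.texture(from: shape)
        return sharedTexture
    }

    convenience init(view: SKView) {
        // Every Sentry uses the same SKTexture, so SpriteKit can batch
        // their draw calls instead of switching textures per sprite.
        self.init(texture: Sentry.getTexture(view: view))

        // anchorPoint is in normalized (0...1) texture coordinates;
        // (0.5, 0.0) would be the exact bottom center. The value here is
        // assumed; the real one was tuned against the texture's dimensions.
        anchorPoint = CGPoint(x: 0.5, y: 0.1)
    }
}
```

Because the texture is built from a live SKView, the first Sentry created pays the render-to-texture cost once; every later instance just reuses the cached bitmap.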
One-time objects, such as the HUD buttons, the virtual joystick, and the Starbase, didn't require this technique because they only appear in the game once, rather than potentially hundreds of times.
Parallax star fields required a technique fairly similar to this one, while differing in some key ways. I'll talk about that in a future blog entry.