Last month I released a simple, free educational game, Math-o-Mat, to the App Store. It's basically like Pong, except you have to do arithmetic to "hit" the ball.
This week, two kids got in touch with me through their parents (friends of mine) to tell me that sometimes button taps weren't working for them.
I play Math-o-Mat a lot, so I was a little surprised. I asked a couple of questions (what device, iOS version, etc.), and I asked their parents if they generally have problems tapping buttons in other apps (they didn't).
One thing they had in common was that the devices were pre-iPhone X, so they had Home buttons. I play Math-o-Mat on devices like that, but mostly in the simulator.
I tried it on a test device, and rarely (but sometimes), I did seem to have a button tapping issue. It had something to do with this device, but also something to do with the exact way I was tapping.
I logged all button taps, like this:
public override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    print("touchesBegan")
    self.onTap?(self)  // onTap is the button node's tap-handler closure
}
And I noticed something weird: for buttons close to the bottom of the screen, the log line appeared only after a short delay unless I released the tap quickly. For other buttons (or on an iPhone X), `touchesBegan` was called immediately when the touch started.
Armed with some knowledge, I could now google, and I found the StackOverflow question "UISystemGateGestureRecognizer and delayed taps near bottom of screen".
So, on these phones iOS installs some gesture recognizers at the top level of the window that try to prevent accidental gestures at the bottom edge. The suggested answer is to muck with the window's `gestureRecognizers` directly.
Generally, I try never to do things like this because they are unsupported and will probably break in a future version of iOS. In fact, if you do it, you'll get a console message that says:
[Warning] Trying to set delaysTouchesBegan to NO on a system gate gesture recognizer - this is unsupported and will have undesired side effects
Luckily, there's another way to do it. Handling touches with `touchesBegan` directly has this issue, but gesture recognizers do not. SpriteKit doesn't use them by default, but it is completely possible to.
If you try to use `UIGestureRecognizer` with SpriteKit, here are a couple of things to think about:

- `SKNode` is not a view, so you won't be able to attach recognizers to your game nodes. My game is simple, so I attached one to the main top-level view and then used the `SKNode` tree to find the right node (using `SKNode.nodes(at: CGPoint)` on the root node).
- The y-axis is flipped between the recognizer's touch point and the coordinate system SpriteKit uses. You need to pass `view.height - tapPoint.y` as the y-coordinate in the above call.
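The flip in the second point is easy to get backwards, so here is a minimal sketch of both points together. It makes some assumptions so it can run anywhere: a plain `Point` struct and a hypothetical `Node` stand in for `CGPoint` and `SKNode`, and `tappedNodes` only mimics what `SKNode.nodes(at:)` does after the y-flip.

```swift
// Stand-ins for CGPoint and SKNode (hypothetical, for illustration only).
struct Point { var x: Double; var y: Double }

struct Node {
    var name: String
    // Axis-aligned frame in scene coordinates (origin bottom-left).
    var minX, minY, width, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= minX && p.x <= minX + width &&
        p.y >= minY && p.y <= minY + height
    }
}

// Convert a view-space tap (origin top-left, y down) to scene space
// (origin bottom-left, y up), then hit-test like SKNode.nodes(at:).
func tappedNodes(viewTap: Point, viewHeight: Double, nodes: [Node]) -> [Node] {
    let scenePoint = Point(x: viewTap.x, y: viewHeight - viewTap.y) // y-flip
    return nodes.filter { $0.contains(scenePoint) }
}

// A tap 620 points down a 667-point-tall view is 47 points up from the
// bottom in scene space, which lands inside this bottom-edge button.
let button = Node(name: "equals", minX: 0, minY: 0, width: 100, height: 100)
print(tappedNodes(viewTap: Point(x: 50, y: 620), viewHeight: 667, nodes: [button]).map { $0.name })
// prints ["equals"]
```

In the real app, `viewTap` comes from `UITapGestureRecognizer.location(in:)` and the hit test is `SKNode.nodes(at:)` on the root node, as described above.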
This also explains why I had never seen this behavior (the delayed taps at the bottom of the screen) before. I generally use `UIView` or `UIGestureRecognizer` descendants to get taps, and it's been a very long time since I needed to implement something on top of `touchesBegan` directly. SpriteKit uses neither of these classes, so the default way to collect taps there is the low-level touch callbacks.
Top comments (2)
Very good info!
Lou, I have to ask since I read this last week on everyone's favorite tutorial site, raywenderlich.com: raywenderlich.com/6305-2d-apple-ga...
Do you think SpriteKit is going to hang around for a few more years? Should we be moving to Unity?
I am no SpriteKit or gaming expert, but my hot take is that if you are planning to make games professionally and need to make money, you should choose Unity or something similar.
SpriteKit is great for people like me -- I already know Swift -- I have no plans to make my living with it -- strictly a hobby.
I am not worried that SpriteKit is going away -- there are many games that depend on it -- but I do think it's possible a SwiftUI-like thing will replace it at some point.
Now SceneKit is a different story -- Apple is clearly doing something big in AR -- and right now, that's very aligned with SceneKit. Maybe the real AR solution won't be SceneKit based, but that would be surprising since the whole point of doing this now (before hardware) was to build up a library of AR content.