Apple introduced the GCVirtualController in iOS 15, which is a
software emulation of a real controller. The virtual controller
can be configured with different inputs. See more info at:
https://developer.apple.com/documentation/gamecontroller/gcvirtualcontroller
A simple gamepad configuration with a dPad and A and B buttons
is added. The user can enable/disable the virtual game controller
by swiping two fingers from right to left, or through the
port-specific options dialog.
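A minimal sketch of what such a configuration could look like (the exact
element set and where the controller is created in the port are assumptions
based on the description above):

    #import <GameController/GameController.h>

    // Create a virtual controller with a dPad and A/B buttons (iOS 15+).
    GCVirtualControllerConfiguration *config = [[GCVirtualControllerConfiguration alloc] init];
    config.elements = [NSSet setWithObjects:GCInputDirectionPad,
                                            GCInputButtonA,
                                            GCInputButtonB, nil];
    GCVirtualController *virtualController =
        [[GCVirtualController alloc] initWithConfiguration:config];

    // Show the on-screen controller; -disconnect hides it again.
    [virtualController connectWithReplyHandler:^(NSError * _Nullable error) {
        if (error)
            NSLog(@"Could not connect virtual controller: %@", error);
    }];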
The main change is to use the interface orientation and not the device
orientation; the two may differ, for example when the orientation is locked.
This also changes the way orientation changes are detected, using the
documented method. However, this means dropping support for iOS 7, as this
method is only available since iOS 8 and the alternative methods available in
iOS 7 have been deprecated in iOS 13.
Another change is to properly detect the interface orientation instead of
inferring it from the view bounds, which was incorrect on some devices.
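The documented method referred to above is presumably
viewWillTransitionToSize:withTransitionCoordinator:, the standard UIKit API
introduced in iOS 8. A sketch of how orientation changes could be detected
with it (the engine notification is only a placeholder):

    // In the port's view controller: react to orientation/size changes with
    // the iOS 8 API instead of the deprecated iOS 7 notifications.
    - (void)viewWillTransitionToSize:(CGSize)size
           withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
        [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
        [coordinator animateAlongsideTransition:nil
                                     completion:^(id<UIViewControllerTransitionCoordinatorContext> ctx) {
            // Read the interface orientation (not the device orientation)
            // once the transition has finished.
            UIInterfaceOrientation orientation;
            if (@available(iOS 13.0, *))
                orientation = self.view.window.windowScene.interfaceOrientation;
            else
                orientation = [UIApplication sharedApplication].statusBarOrientation;
            (void)orientation; // placeholder: adapt the view / notify the engine here
        }];
    }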
Add an isInGame property to track whether the launcher is shown or a game is
running. Handle presses on the menu key differently depending on whether the
launcher is shown. If the launcher is shown, suspend the application to
return to the Apple TV Home Screen, since that is the parent view of the
launcher. If in game, pause the game and show the menu. This follows the
Apple guidelines, which can be read here:
https://developer.apple.com/design/human-interface-guidelines/inputs/remotes
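A sketch of how the menu press could be routed based on the isInGame
property (the handler and the helper method shown here are illustrative, not
the actual port code):

    // tvOS: route the menu press depending on whether the launcher or a
    // game is in the foreground.
    - (void)pressesBegan:(NSSet<UIPress *> *)presses withEvent:(UIPressesEvent *)event {
        for (UIPress *press in presses) {
            if (press.type == UIPressTypeMenu && _isInGame) {
                // In game: pause and show the in-game menu ourselves.
                [self pauseGameAndShowMenu]; // hypothetical helper
                return;
            }
        }
        // In the launcher (or for other presses): let the system handle the
        // press so the application is suspended and the Home Screen appears.
        [super pressesBegan:presses withEvent:event];
    }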
The keyboard can be presented and dismissed without going through the
showKeyboard/hideKeyboard functions, e.g. by pressing the menu button
on the Apple TV remote while the keyboard is shown.
Since the keyboard visibility is not controlled entirely by the showKeyboard/
hideKeyboard functions, the _keyboardVisible state variable can get out of
sync.
Instead, check whether the keyboard is shown based on whether the inputView
is the first responder. The check has to be made on the main thread.
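A sketch of what that check might look like, with inputView being the port's
text input view:

    // Ask UIKit whether the keyboard is actually up instead of trusting the
    // cached _keyboardVisible flag.
    - (BOOL)isKeyboardShown {
        __block BOOL visible = NO;
        if ([NSThread isMainThread]) {
            visible = [inputView isFirstResponder];
        } else {
            // isFirstResponder must be queried on the main thread.
            dispatch_sync(dispatch_get_main_queue(), ^{
                visible = [inputView isFirstResponder];
            });
        }
        return visible;
    }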
iOS and tvOS share a lot of code. However, some parts are specific to iOS,
for instance the handling of UI device orientation and certain types of
gestures.
Currently there are also some limitations on the Apple TV that need to be
flagged to the engine: there is no support for a virtual keyboard, no
clipboard support and no possibility to open URLs.
Put code specific to iOS within the ObjC platform macro TARGET_OS_IOS, and
code specific to tvOS within the macro TARGET_OS_TV.
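For example (the feature query is illustrative, only meant to show how the
macro split works):

    #include <TargetConditionals.h>

    // Illustrative feature query showing the platform split.
    static bool hasClipboardSupport() {
    #if TARGET_OS_IOS
        return true;   // iOS: clipboard is available through UIPasteboard.
    #elif TARGET_OS_TV
        return false;  // tvOS: no clipboard support; flag this to the engine.
    #endif
    }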
This enables some game engines that only support 32 bit color formats,
e.g. 11th hour and Broken Sword 2.5.
Add support for the pixel formats RGBA8888 and ABGR8888. The pixel
formats are defined in big endian while iOS uses little endian, so some
formats have to be converted to get a correct color representation.
Use the Apple Accelerate framework for the formats requiring conversion,
to minimize the CPU load since this is done every frame.
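A sketch of such a conversion using vImage from the Accelerate framework,
here reversing the byte order of a 32 bit surface (the function name and
buffer layout are assumptions, not the actual port code):

    #import <Accelerate/Accelerate.h>

    // Reorder the channels of a 32 bit pixel buffer, e.g. ABGR8888 -> RGBA8888.
    static void convertPixelFormat(void *src, void *dst,
                                   size_t width, size_t height, size_t bytesPerRow) {
        vImage_Buffer srcBuf = { .data = src, .height = height, .width = width, .rowBytes = bytesPerRow };
        vImage_Buffer dstBuf = { .data = dst, .height = height, .width = width, .rowBytes = bytesPerRow };

        // permuteMap[i] selects which source channel ends up in destination
        // channel i; { 3, 2, 1, 0 } reverses the byte order of every pixel.
        const uint8_t permuteMap[4] = { 3, 2, 1, 0 };
        vImagePermuteChannels_ARGB8888(&srcBuf, &dstBuf, permuteMap, kvImageNoFlags);
    }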
Move touch input handling to a TouchController class to move some logic out
of the iPhoneView class. Only do this for touches on the screen, since
connected trackpads can generate touches as well. The latter are of type
UITouchTypeIndirectPointer while touches on the screen are of type
UITouchTypeDirect. They can be told apart thanks to the preference key
UIApplicationSupportsIndirectInputEvents set to YES in Info.plist.
Without that preference, there is no way to distinguish touches on the
screen from touches coming from a trackpad.
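A sketch of how the touches could be filtered in the view before being
forwarded (the forwarding call is a hypothetical TouchController method):

    // Forward only direct (on-screen) touches to the TouchController;
    // trackpad touches arrive as indirect pointer touches.
    - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            if (touch.type == UITouchTypeDirect) {
                [_touchController handleTouchBegan:touch]; // hypothetical forwarding call
            }
            // UITouchTypeIndirectPointer touches (trackpad/mouse) are handled
            // through a separate code path.
        }
    }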
This is better than using a hardcoded delay for two main reasons.
The first is that the application can terminate as soon as it has finished
saving the state, and the second is that it still works if saving the state
takes longer than the hardcoded delay would have allowed.
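One way to get this behavior is to wait on an explicit completion signal from
the engine instead of sleeping for a fixed time. A hedged sketch using a
dispatch semaphore (the helper names are assumptions, not the actual port
code):

    // Block termination until the engine has actually finished saving,
    // instead of sleeping for a hardcoded delay.
    static dispatch_semaphore_t s_saveDone;

    - (void)applicationWillTerminate:(UIApplication *)application {
        s_saveDone = dispatch_semaphore_create(0);
        requestEngineStateSave(); // hypothetical: ask the engine to save its state

        // Returns as soon as the save is signalled as complete, however
        // long or short that takes.
        dispatch_semaphore_wait(s_saveDone, DISPATCH_TIME_FOREVER);
    }

    // Called from the engine thread once the state has been written.
    static void stateSaveFinished(void) {
        dispatch_semaphore_signal(s_saveDone);
    }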