New interfaces
Siri Shortcuts
It has been more than a year since Apple bought the app Workflow for
an undisclosed sum. The app is a kind of construction kit with which various
actions can be bundled into a single "workflow". For example, an "I'm running late"
message can be generated and sent to all participants of the current calendar entry
while, at the same time, navigation to the destination is started in the maps app.
However, since Apple acquired the app in March 2017, things had been quiet around
Workflow. This now changes with iOS 12: Workflow becomes Siri Shortcuts. While the basic
functionality remains the same, the new possibilities with Siri are particularly
interesting. For example, Siri commands can now be assigned to the created workflows. In
addition, developers can define shortcuts for their own app using so-called
"Custom Intents", which are represented as INIntent objects in SiriKit.
An app for buying bus or train tickets could, for example, define a shortcut
"Buy a ticket from my location to home" that handles the complete purchase of a
ticket without the app having to be opened. The prerequisite, however, would be
that the app knows the user's home address. Developers can also integrate the new
INUIAddVoiceShortcutViewController
into their app. It provides a user interface through which a custom voice command
can be assigned to a specific shortcut. The user could, for example, rename the
shortcut to "Prepare my way home". In this scenario, that command to Siri alone
would be sufficient to complete the ticket purchase.
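A minimal sketch of how offering such a voice command might look; `BuyTicketIntent` is a hypothetical custom intent generated from an Intent Definition file, not an API type:

```swift
import UIKit
import Intents
import IntentsUI

// Sketch: presenting the system UI for recording a custom Siri phrase.
// BuyTicketIntent is a hypothetical custom intent; a real app would use
// its own intent type defined in an Intent Definition file.
final class TicketViewController: UIViewController, INUIAddVoiceShortcutViewControllerDelegate {

    func offerSiriShortcut() {
        let intent = BuyTicketIntent()
        intent.suggestedInvocationPhrase = "Buy a ticket from my location to home"

        guard let shortcut = INShortcut(intent: intent) else { return }
        let controller = INUIAddVoiceShortcutViewController(shortcut: shortcut)
        controller.delegate = self
        present(controller, animated: true)
    }

    // MARK: INUIAddVoiceShortcutViewControllerDelegate

    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}
```

In the presented sheet, the user records their own phrase, such as "Prepare my way home", which is then tied to this shortcut.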
However, this example also illustrates where Siri Shortcuts are still limited: fully
dynamic commands are not possible at the moment. For "Buy a ticket from my location
to Schiller Street", another shortcut would have to be created, and likewise for every
additional street. Nevertheless, Siri Shortcuts offer a lot of possibilities, and it
will be interesting to see how developers put the new feature to use.
Augmented Reality
A big focus of iOS 12 is augmented reality. ARKit 2.0 brings new multiplayer features
that will make it possible to view and interact with the same AR session from multiple
devices. This makes entirely new types of games conceivable.
The new USDZ file format also allows 3D models to be exported, imported, and displayed
in AR on iOS 12 devices. In addition, ARKit has been extended with interfaces for
scanning and identifying real-world objects.
Particularly interesting is the new eye-tracking option, through which the movements
of the eyes can be tracked in real time.
Machine Learning
CoreML, Apple's machine learning framework, also arrives in its second version with iOS
12. In addition to several new features, Apple promises significantly improved
performance.
The new Natural Language framework allows developers to analyze text within the app
using machine learning. An NLLanguageRecognizer
can be used to determine the dominant language of a string. Language recognition
could be used, for example, to automatically display a translation of a text if it
is not in the user's device language.
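Detecting the dominant language is only a few lines; a minimal sketch, with an arbitrary German sample sentence:

```swift
import NaturalLanguage

// Sketch: determining the dominant language of a string.
let recognizer = NLLanguageRecognizer()
recognizer.processString("Der schnelle braune Fuchs springt über den faulen Hund.")

if let language = recognizer.dominantLanguage {
    // language is an NLLanguage value, e.g. .german ("de") for the text above
    print("Dominant language: \(language.rawValue)")
}

// languageHypotheses(withMaximum:) additionally returns candidate
// languages together with their probabilities.
let hypotheses = recognizer.languageHypotheses(withMaximum: 2)
```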
The Natural Language framework can also be used to compute simple statistics about a
text, such as the number of words or sentences. The NLTagger class additionally offers
the possibility of identifying "tags" of different types in a string; for example,
names, locations, or organizations can be extracted.
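Extracting such named entities with NLTagger might look like this (the sample sentence is just an illustration):

```swift
import NaturalLanguage

// Sketch: identifying people, places, and organizations in a string.
let text = "Tim Cook introduced iOS 12 in San Jose."
let tagger = NLTagger(tagSchemes: [.nameType])
tagger.string = text

let options: NLTagger.Options = [.omitPunctuation, .omitWhitespace, .joinNames]
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .nameType,
                     options: options) { tag, range in
    if let tag = tag,
       [NLTag.personalName, .placeName, .organizationName].contains(tag) {
        print("\(text[range]): \(tag.rawValue)")
    }
    return true // continue enumerating
}
```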
While CoreML still cannot be used to create machine learning models on iOS devices, the
new Create ML framework for macOS Mojave offers exactly these capabilities.
In a separate WWDC
session, the possibilities of Create ML were shown in detail. With the
framework, machine learning models can be trained to classify images, for example.
If an MLImageClassifierBuilder is created in an Xcode Playground, images for
classification can be added to the Playground either via drag and drop or
programmatically. Apple's session shows how to train a machine learning model with
images of fruit. After it has been fed enough images, the model is integrated into
an iOS app to recognize and name fruit in pictures.
Figure 2: Excerpt from the WWDC session "Introducing Create ML"
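The interactive workflow from the session boils down to very little code; a sketch for an Xcode Playground on macOS Mojave:

```swift
import CreateMLUI // available in macOS Playgrounds only

// Sketch: the interactive image-classifier training UI shown in
// Apple's "Introducing Create ML" session. Labeled image folders are
// dragged into the live view to train; the resulting .mlmodel file
// can then be exported and embedded in an iOS app.
let builder = MLImageClassifierBuilder()
builder.showInLiveView()
```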
In addition to image recognition, Create ML can also be used to train the meaning of text
and the relationship between numerical values.
Other innovations
With the new Network framework, network connections can be established with direct
access to protocols such as TLS, TCP, and UDP. For HTTP calls, however, URLSession
should still be used.
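Opening a TLS-secured TCP connection might look like this minimal sketch; "example.com" is a placeholder host:

```swift
import Network

// Sketch: a TLS connection over TCP with the new Network framework.
let connection = NWConnection(host: "example.com", port: 443, using: .tls)

// The state handler reports when the connection becomes usable or fails.
connection.stateUpdateHandler = { state in
    switch state {
    case .ready:
        print("Connection established")
    case .failed(let error):
        print("Connection failed: \(error)")
    default:
        break
    }
}

connection.start(queue: .main)
```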
There are only a few innovations in UIKit this year. With the new
UITextInputPasswordRules class, app-specific password rules can now be defined, which
iOS takes into account when suggesting a password within an app. Generating passwords
that respect the rules specified by the backend significantly strengthens protection
against misuse. For this, only a UITextInputPasswordRules
instance must be set on a UITextField. To receive password suggestions on a
UITextField instance, .newPassword must also be set as its textContentType.
UITextField also offers a completely new textContentType: oneTimeCode.
If this is set, incoming SMS messages are scanned for two-factor authentication codes,
which are then offered for insertion directly into the text field.
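Both features are configured with a few property assignments; a minimal sketch (the rule string follows Apple's password-rules descriptor syntax and is just an example):

```swift
import UIKit

// Sketch: password rules and one-time-code autofill on UITextField.

// A field that receives system password suggestions respecting custom rules.
let passwordField = UITextField()
passwordField.textContentType = .newPassword
passwordField.passwordRules = UITextInputPasswordRules(
    descriptor: "minlength: 12; required: lower; required: upper; required: digit;")

// A separate field that offers codes from incoming 2FA SMS messages.
let codeField = UITextField()
codeField.textContentType = .oneTimeCode
```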
The revised notification system in iOS 12 also gives developers new possibilities
to group notifications.
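Grouping is driven by the threadIdentifier of a notification's content: notifications sharing the same identifier are stacked together. A minimal sketch with a hypothetical conversation ID:

```swift
import UserNotifications

// Sketch: grouping local notifications via threadIdentifier.
let content = UNMutableNotificationContent()
content.title = "New message"
content.body = "Hello!"
content.threadIdentifier = "chat-42" // hypothetical conversation ID;
// notifications with the same value are grouped into one stack on iOS 12

let request = UNNotificationRequest(
    identifier: UUID().uuidString,
    content: content,
    trigger: UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false))

UNUserNotificationCenter.current().add(request)
```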
Deprecations
Since the introduction of WebKit with iOS 8, Apple has advised developers to use
WKWebView to display web pages within an app. Nevertheless, the old UIWebView is still
used by many developers today. With iOS 12, UIWebView is now deprecated. As early as
iOS 13, Apple could remove it from UIKit completely, which is why developers should
switch to WKWebView in their apps now at the latest.
While the effort there is probably limited, the same cannot be said of OpenGL. Apple
wants games and other graphics rendering to be developed with its in-house framework
Metal from now on. OpenGL will continue to work under iOS 12, but it too has been
marked as deprecated, which is why developers should migrate from OpenGL to Metal as
soon as possible.
Other changes
iOS 12 also brings revised App Store guidelines. In the future, apps that are not
financed via a subscription model will be able to offer their paid functionality as a
free trial for a limited period of time. This is implemented via a free in-app
purchase.
Another new rule concerns the description of app updates. Starting with iOS 12, the
update text must clearly explain which new features have been integrated into the app.
Generic update descriptions will only be accepted if an update consists solely of bug
fixes, security updates, or performance improvements. With this step, Apple counters
the trend of recent months, in which many apps shipped every update with the same
description text.
Apple also simplifies app beta testing with iOS 12. With the new "TestFlight Public
Link" feature, test users no longer have to be invited manually. Instead, users can be
invited to TestFlight via a URL, so even large groups can be organized quickly and
easily. With TestFlight Public Link, up to 10,000 testers can be invited to an app.
Developers can also create groups of testers and make specific builds available to
individual groups.
Developers of apps with in-app purchases will be particularly pleased with the
following new feature: with iOS 12, it is possible to log into a sandbox App Store
account that is used exclusively for requests against the App Store test environment.
In the future, it will therefore no longer be necessary to manually log in and out to
test in-app purchases.