What Apple WWDC 2017 Has in Store for iOS App Developers

The Apple Worldwide Developers Conference 2017 kicked off on June 5 with a keynote in which Apple executives took the stage to announce upcoming updates to MacBooks, iOS, Apple TV and a lot more.

This year, the updates that are particularly interesting for iOS app developers include the new iOS 11 platform, Apple's push into augmented reality (AR), machine learning and a redesigned App Store. Let's take a quick look at what these updates have in store for developers.

Apple Pay

According to Apple SVP Craig Federighi, Apple Pay is the number one contactless payment method on mobile devices. Developers have found the digital wallet an easy way to offer secure payment options in their iOS apps (both in-app and otherwise). And now, Apple Pay is expanding to include person-to-person payments.

Before this update, there was no built-in way to let people pay each other inside your application. The new person-to-person payments feature lets users pay anyone directly from Messages via an iMessage app. Touch ID verifies the transaction, and the money is deposited onto the recipient's Apple Pay Cash card.

iOS app developers now have the chance to create applications that make the best use of this feature. For example, qualified non-profit organizations can now securely and easily accept donations from donors within their apps with Apple Pay.
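
For a sense of what this looks like in code, here is a minimal sketch of presenting an Apple Pay sheet with PassKit for an in-app donation; the merchant identifier, label and amount are placeholder values, not anything announced at the keynote.

```swift
import UIKit
import PassKit

// Minimal sketch of presenting an Apple Pay sheet for an in-app donation.
// The merchant identifier, label and amount are placeholders for illustration.
func presentApplePaySheet(from host: UIViewController & PKPaymentAuthorizationViewControllerDelegate) {
    let request = PKPaymentRequest()
    request.merchantIdentifier = "merchant.com.example.donations" // placeholder
    request.supportedNetworks = [.visa, .masterCard, .amex]
    request.merchantCapabilities = .capability3DS
    request.countryCode = "US"
    request.currencyCode = "USD"
    request.paymentSummaryItems = [
        PKPaymentSummaryItem(label: "Donation", amount: NSDecimalNumber(string: "10.00"))
    ]

    // The payment sheet handles Touch ID verification and hands a payment token to the delegate.
    if let paymentSheet = PKPaymentAuthorizationViewController(paymentRequest: request) {
        paymentSheet.delegate = host
        host.present(paymentSheet, animated: true)
    }
}
```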

Siri

Siri is used on more than 375 million Apple devices. SiriKit was already a favorite among developers as an extension framework that lets apps expose their features through a conversational UI.

In addition to new, more natural female and male voices, Apple's intelligent assistant now has a new visual interface, a translation feature and a new predictive setup.

New translation feature: Apple's intelligent assistant is available in more countries and languages (21 languages across 36 countries) than any other assistant. This means iOS developers can use the assistant to reach a wider and more diverse audience with their apps. Consider third-party apps like Lyft and Uber, which can be summoned through Siri in different languages.

SiriKit offers new APIs for third-party integration: SiriKit is the framework that allows third-party apps from the App Store to integrate their features with Siri. It lets watchOS and iOS apps work with Siri so that users can do more using just their voice.

With the new update, app developers will have access to new kinds of APIs for third-party app integration. In addition to extending support for phone calls and messaging, the framework adds support for new domains such as taking notes and managing tasks and lists.
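
As a rough illustration, here is a minimal sketch of an Intents extension handler for the new note-taking domain; the handler class name is a made-up example, and actually saving the note is left as a comment since that part depends on the app.

```swift
import Intents

// Minimal sketch of handling Siri's new note-taking domain in an Intents extension.
// The class name is a placeholder; the SiriKit types are the iOS 11 Intents APIs.
class CreateNoteIntentHandler: NSObject, INCreateNoteIntentHandling {

    func handle(intent: INCreateNoteIntent,
                completion: @escaping (INCreateNoteIntentResponse) -> Void) {
        // Pull the dictated title and body out of the intent.
        let title = intent.title?.spokenPhrase ?? "Untitled"
        let body = (intent.content as? INTextNoteContent)?.text ?? ""

        // Persist the note using the app's own storage here (omitted in this sketch).
        print("Creating note '\(title)': \(body)")

        // Tell Siri the note was created.
        completion(INCreateNoteIntentResponse(code: .success, userActivity: nil))
    }
}
```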

Camera

Apple plans to release a new Depth API for iOS app developers. In practice, this means they will be able to use the iPhone 7 Plus' dual-camera system to capture depth information alongside photos.
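
Here is a hedged sketch of how an app might opt into depth data with the iOS 11 capture APIs; session setup and the delegate wiring are omitted.

```swift
import AVFoundation

// Minimal sketch of requesting depth data with the iOS 11 capture APIs,
// assuming a dual-camera device such as the iPhone 7 Plus.
func enableDepthDelivery(on photoOutput: AVCapturePhotoOutput) {
    // Depth delivery must be enabled on the output before it can be requested per photo.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
}

func depthPhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    // Request a depth map alongside the photo when the output supports it.
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    return settings
}

// In the AVCapturePhotoCaptureDelegate callback, the depth map arrives with the photo:
//
// func photoOutput(_ output: AVCapturePhotoOutput,
//                  didFinishProcessingPhoto photo: AVCapturePhoto,
//                  error: Error?) {
//     if let depthData = photo.depthData {
//         let depthMap = depthData.depthDataMap // CVPixelBuffer of per-pixel disparity/depth
//     }
// }
```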

Apple Music

During the keynote, Apple announced that Apple Music has around 27 million paid subscribers. To help app developers tap into this audience, the company introduced a new MusicKit API that lets apps play music from a user's Apple Music subscription.

By bringing Apple Music's catalog into their own apps, developers will finally be able to take advantage of the service's popularity. For example, it was revealed that users will be able to instantly add songs they identify with Shazam to their Apple Music library, and that the podcasting app Anchor will let them DJ with any song in Apple Music.

The new framework also gives developers the chance to compete with streaming services like Spotify, which already offers its own APIs and SDKs that let developers bring Spotify songs into third-party apps.
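
As an example of what playing a subscriber's Apple Music content can look like with StoreKit and MediaPlayer, here is a minimal sketch; it assumes the user has an active Apple Music subscription and that the app already knows the catalog store ID of the track.

```swift
import StoreKit
import MediaPlayer

// Minimal sketch of playing an Apple Music catalog track from a third-party app,
// assuming the user has an Apple Music subscription. The store ID is supplied by the caller.
func playAppleMusicTrack(storeID: String) {
    // Ask for permission to access the user's Apple Music account.
    SKCloudServiceController.requestAuthorization { status in
        guard status == .authorized else { return }

        // Queue the track by its Apple Music store identifier and start playback.
        let player = MPMusicPlayerController.systemMusicPlayer
        player.setQueue(with: [storeID])
        player.play()
    }
}
```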

New App Store

A design makeover of the App Store was previewed during the event, and in the months to come Apple will be reviewing applications submitted by developers much faster than before. Additionally, alongside the whole new 'Today' tab for showcasing applications, features that are particularly notable for developers include:

Phased releases: According to the keynote, phased releases were a highly requested feature among app developers. With this update, you can choose to roll an update out to users gradually, see how customers respond to each phase and iterate before releasing it to everyone.

Dedicated tab for game developers: iOS game developers should rejoice at the new tab that is exclusively for games, with in-app purchases available to view right there on the App Store. For developers, this puts a big part of their business right up front.


Core ML (Machine Learning)

Apple unveiled a new machine learning (ML) framework for developers, called Core ML. It makes it easy for app developers to run trained machine learning models on Apple devices and speeds up the execution of AI tasks.

Here is how the new reveal will help developers:

New Vision API: Developers will now be able to build computer vision features such as barcode reading, face detection, text detection and object tracking into their apps (see the sketch at the end of this section).

NLP (Natural Language Processing): The NLP APIs use machine learning to understand text more deeply, giving chatbot developers the ability to create richer experiences for users.

On-Device Processing: With Core ML, the data that app developers use to improve the user experience never leaves the customer's iPhone or iPad.

This has several benefits for developers, such as:

  • Their apps won’t have to be online to take advantage of machine learning models. These tools can be run locally on the user’s device.
  • User data will be more secure since the data doesn’t have to leave the device to benefit from intelligent ML results.
  • Apps won’t have to wait for information to be processed over a network which means that data processing will be faster.
ARKit (Augmented Reality Framework)

Apple also announced ARKit, an augmented reality framework that allows developers to create AR experiences for iOS devices. This is the company's first big push into augmented reality, and it finally lets iOS app developers extend their applications beyond the screen. Here is how the new platform can help developers.

Improving the scale of virtual objects: When it comes to AR, scale is essential in bridging the gap between the real and the virtual world.

ARKit offers excellent object scaling. It uses the camera, motion sensors and ambient lighting estimation to place and scale virtual objects realistically, as demonstrated in the keynote when Federighi placed a virtual cup on the horizontal surface of the podium and it appeared to be the size of a real cup.

A lot of brands are already finding this beneficial. IKEA, for example, uses this kind of realistic scaling in its own AR app to let customers see how furniture would look in their homes.

Offering more immersive experiences: ARKit lets iOS app developers take advantage of the detailed sensor and camera data in iOS 11 and move beyond simple 2D overlays to offer more immersive experiences to their users.
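
For a sense of how little code it takes to get started, here is a minimal sketch of running a world-tracking ARKit session with horizontal plane detection and light estimation, using the API names ARKit shipped with in iOS 11.

```swift
import UIKit
import ARKit

// Minimal sketch of an ARKit-backed view controller with horizontal plane detection,
// using the ARKit API as it shipped in iOS 11.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking provides six-degrees-of-freedom tracking with real-world scale.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal

        // Light estimation lets virtual objects match the scene's ambient lighting.
        configuration.isLightEstimationEnabled = true

        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```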


Wrapping Up

The WWDC didn’t disappoint us this year either. Apple has given us developers a lot to work on in the coming months as iOS 11 beta is released for registered developers.

In case you missed it, check out what the Google I/O event this year had in store for Android app developers.



