Last Updated: February 25, 2016
by rismay

Core Location Evolution

http://wrkstrm.postach.io/post/core-location-evolution

TL;DR: Apple has been refining its location APIs opportunistically to improve battery life.

There seems to be a lot of interest in the new iOS 7 location APIs in CoreLocation. Here is a great Hacker News discussion about it. I want to provide a little more "context" to this discussion.

My Background:
As an aside, check out this quick 1.5-minute demo of wrkstrm that I made for the AngelHack 2012 summer finals.

iOS 2:
Apple introduces CoreLocation with a minimal set of features. Programmers can ask to be updated about GPS coordinates only while their apps are in the foreground.
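For reference, here is a minimal sketch of that foreground-only pattern in modern Swift (the explicit authorization call arrived later, in iOS 8; earlier systems prompted the user implicitly):

```swift
import CoreLocation

final class ForegroundTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        // requestWhenInUseAuthorization() is an iOS 8 addition;
        // before that, the system prompted on first use.
        manager.requestWhenInUseAuthorization()
        // Updates stop when the app leaves the foreground.
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("fix: \(latest.coordinate.latitude), \(latest.coordinate.longitude)")
    }
}
```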

iOS 5:
Apple adds two huge features: significant-change location monitoring and region monitoring (geofencing). This information was already being logged by the iPhone; there was a huge scare about that in 2011. All Apple did was give developers access to this data without having to hack for it (check out William Edward's demo and Jer Thorp's OpenPaths).
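Here is a sketch of the two additions, written against the modern Swift names (iOS 5 itself exposed regions through the older CLRegion class; the circular-region class shown here came later, and the coordinates are illustrative):

```swift
import CoreLocation

let manager = CLLocationManager()

// 1. Significant-change monitoring: coarse, cell-tower-driven updates
//    that keep arriving even when the app is suspended or relaunched.
manager.startMonitoringSignificantLocationChanges()

// 2. Region monitoring (geofencing): fire events on entry/exit of a circle.
let home = CLCircularRegion(
    center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
    radius: 100, // meters; real-world resolution is far coarser (see below)
    identifier: "home"
)
home.notifyOnEntry = true
home.notifyOnExit = true
manager.startMonitoring(for: home)
```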

Another aside: these two APIs are what prompted me to learn how to code. I was seriously disappointed when I found out that Apple was "cheating" in the implementation of these APIs. Their resolution is horrible: something like football fields of disparity in inner cities. This is what started my journey toward finding a more accurate way to geofence and track locations.

iOS 6:
For this release Apple worked closely with the MapKit team to improve GPS accuracy. The best example is how the CoreLocation team used the new MapKit vector graphics to snap GPS coordinates to streets. Again, Apple didn't invent a new technology to improve GPS accuracy; it leveraged a new user interface innovation. To support this, Apple released an activity-type API (driving, walking, etc.) that tells CoreLocation when it is appropriate to snap coordinates.
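The activity-type hint is a one-line property on the location manager. A quick sketch, assuming the user is driving:

```swift
import CoreLocation

let manager = CLLocationManager()

// Hint to CoreLocation that the user is in a vehicle, so snapping
// fixes to the road network is appropriate.
manager.activityType = .automotiveNavigation

// Also added in iOS 6: let the system pause updates when the user
// is unlikely to be moving, saving battery.
manager.pausesLocationUpdatesAutomatically = true

manager.startUpdatingLocation()
```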

iOS 7:
These screenshots were generated with a different approach. The information in them is NEW, and gathering it was made possible by two sensor-driven user interface innovations that seem totally irrelevant. First, in iOS 6, Apple started running the accelerometer all day with its "raise to speak to Siri" feature (simply put the phone to your ear to activate Siri). Now, with iOS 7, Apple has introduced a "constant on" gyroscope with the introduction of the Parallax effect. With some clever signal processing, Apple can measure "stay" events when the iPhone is not moving without resorting to expensive and inaccurate geofencing.

Why am I so sure that this is what Apple is doing? Apple was originally planning to go even further by providing step count data (similar to the Galaxy S4's S Health data) to developers. This is only possible by running the aforementioned sensors and applying signal processing. Step counting was shown among the new iOS 7 technologies during the keynote, and even appeared in the iOS 7 beta 1 documentation, but all mentions were abruptly removed in beta 2 and beyond. Look for it in iOS 7.1 or iOS 8.
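The motion-activity API that eventually shipped in iOS 7 (backed by the M7 coprocessor) makes this concrete. A sketch of detecting "stay" events from sensor fusion instead of geofencing; the handler logic here is illustrative:

```swift
import CoreMotion

let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    // Activity classifications come from accelerometer/gyro signal
    // processing on the motion coprocessor, not from GPS or geofences.
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        if activity.stationary {
            print("stay event: device is not moving")
        } else if activity.walking || activity.running {
            print("user is on foot")
        } else if activity.automotive {
            print("user is driving")
        }
    }
}
```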

Finally, with iOS 7, Apple went one step further. Through iOS 6, Apple provided GPS applications with real-time updates. Now there is a new API that allows deferred GPS updates, so that Apple can suspend your app and deliver notifications "opportunistically" (i.e. when a user turns on the lock screen). Apple claims a 40% improvement in battery life using this new method. If so, this technology is a game changer for always-on location trackers.
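A sketch of the deferred-updates pattern, assuming an app with the background location capability that is already receiving high-accuracy fixes; the distance and timeout values are illustrative:

```swift
import CoreLocation

final class DeferredTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Deferral requires best accuracy and no distance filter.
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = kCLDistanceFilterNone
        manager.startUpdatingLocation()
        // Defer delivery until the user has traveled 500 m or 60 s have passed;
        // the system batches GPS fixes and wakes the app opportunistically.
        manager.allowDeferredLocationUpdates(untilTraveled: 500, timeout: 60)
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // When a deferral period ends, the batched fixes arrive together.
        print("received \(locations.count) batched location(s)")
    }
}
```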