Krause is clearly making a point to Apple here. He states outright that the techniques in Detect.location should never be used in a production app; they exist only to highlight how easily such details can be obtained.
Of course, much of this information is derived from one dataset: the metadata exposed by the iOS photo library. From it, Krause has managed to extract the exact location of the device, the date and time of recording, and even the speed at which the device was moving when the photo was taken. It also means, as Krause says, that you could probably work out where the user lives and works by inferring from images taken at particular times and places, and, with facial detection, who they are and who they spend time with.
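To see how little code this takes, here is a minimal Swift sketch of the kind of query an app like Detect.location relies on, using Apple's standard Photos framework. This is not Krause's actual code, just an illustration: once the user grants photo access, every asset's location, timestamp, and even movement speed are directly readable.

```swift
import Photos
import CoreLocation

// Hypothetical sketch: enumerate the photo library and print the
// metadata iOS hands over once photo access is granted.
func dumpPhotoMetadata() {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        let assets = PHAsset.fetchAssets(with: .image, options: nil)
        assets.enumerateObjects { asset, _, _ in
            // Exact date/time the photo was recorded.
            if let date = asset.creationDate {
                print("taken:", date)
            }
            // Exact coordinates, plus the speed (in m/s) the
            // device was moving at when the shot was captured.
            if let loc = asset.location {
                print("lat/lon:", loc.coordinate.latitude,
                      loc.coordinate.longitude,
                      "speed:", loc.speed)
            }
        }
    }
}
```

A single photo-access prompt unlocks all of this for every image in the library; there is no per-field or per-photo consent.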
Shockingly or impressively (depending on how you look at it), Krause built the app in around an hour, and Apple accepted it into the App Store.
The whole thing has also been released on GitHub, here, so devs can play with the code. For the rest of you, grab it from the App Store, and prepare to be more than a little concerned about how much you could be sharing, even when you think you're sharing nothing.