Your phone knows where it is thanks to a suite of sensors that try to measure just about everything they can about their environment. Where does the GPS think I am? What orientation is the device in? What WiFi networks can I see? What are the nearby Bluetooth devices? Have I been moving around a lot lately, accelerometer? What cell phone networks am I connected to?
Unless you’re standing in a field in Kansas with a clear view of the sky for ten minutes (so your GPS has lots of time to settle), your location will be questionable.
The original iPhone used WiFi network data to figure out where it was, because it didn’t include a GPS. Skyhook (I think it was…) drove cars around major cities sniffing for networks while recording their geolocation. An iPhone could then look up its location by comparing the networks it could see against the database of network locations. It could also contribute back: any networks it saw at a known location that weren’t already in the database could be added to it.
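The lookup side of this is simple to sketch. Here’s a toy version in Python, assuming a hypothetical in-memory database mapping access-point BSSIDs to previously recorded coordinates (the real systems are far larger and weight by signal strength, but the shape of the idea is the same):

```python
# Toy sketch of WiFi-based positioning. KNOWN_APS is a hypothetical
# database of access-point BSSIDs -> (lat, lon) recorded by survey cars.

KNOWN_APS = {
    "aa:bb:cc:00:11:22": (37.7749, -122.4194),
    "aa:bb:cc:00:11:33": (37.7751, -122.4190),
    "dd:ee:ff:44:55:66": (40.7128, -74.0060),
}

def estimate_position(visible_bssids):
    """Average the known coordinates of every recognised access point."""
    hits = [KNOWN_APS[b] for b in visible_bssids if b in KNOWN_APS]
    if not hits:
        return None  # nothing recognised; fall back to another sensor
    lat = sum(p[0] for p in hits) / len(hits)
    lon = sum(p[1] for p in hits) / len(hits)
    return (lat, lon)

def learn_new_aps(visible_bssids, position):
    """Add unrecognised networks seen at an already-known position."""
    for b in visible_bssids:
        KNOWN_APS.setdefault(b, position)
```

The second function is the “contribute back” step: once a device knows where it is, anything else it can see from that spot becomes a new database entry.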
As phones added all kinds of sensors, these databases grew and became free-floating associations of place information. We can now correlate almost anything with where you are, so that if the GPS doesn’t work (because you’re inside a building, say), devices fall back on whatever other clues they have to figure out where you are.
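That fail-over behaviour is essentially a prioritised chain: try the most accurate source first, and fall through to coarser ones. A minimal sketch, where the source names and provider callables are purely illustrative (not a real platform API):

```python
# Sketch of location fail-over: try each source in order of expected
# accuracy and return the first one that produces a fix.

def locate(providers):
    """providers: list of (name, callable) pairs, best first."""
    for name, get_fix in providers:
        fix = get_fix()
        if fix is not None:
            return name, fix
    return None, None

# Example: GPS fails indoors, the WiFi database lookup succeeds.
providers = [
    ("gps",  lambda: None),                # no satellite fix indoors
    ("wifi", lambda: (51.5074, -0.1278)),  # database lookup worked
    ("cell", lambda: (51.51, -0.13)),      # coarse tower estimate
]
name, fix = locate(providers)
```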
Integrating all this information is still a challenge, especially if you’re driving around a major city. The reliability of all the location signals is questionable, as Pete Tenereillo outlined in a recent LinkedIn post. Driving around San Francisco, you’re still subjected to the map jumping all over the place, even with high-end phones and the latest software.
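Part of why the map jumps is that each source reports with different noise, and naively drawing every raw fix makes the dot leap around. A minimal smoothing sketch below uses an exponential moving average; real devices use proper fusion filters (Kalman-style), so treat this only as an illustration of the idea:

```python
# Minimal smoothing sketch: blend each new (lat, lon) fix with the
# running estimate. alpha controls how much we trust the newest fix.

def smooth(fixes, alpha=0.3):
    est = None
    out = []
    for lat, lon in fixes:
        if est is None:
            est = (lat, lon)  # first fix: nothing to blend with
        else:
            est = (alpha * lat + (1 - alpha) * est[0],
                   alpha * lon + (1 - alpha) * est[1])
        out.append(est)
    return out

noisy = [(37.0, -122.0), (37.5, -122.5), (36.8, -121.9)]
smoothed = smooth(noisy)
```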
Users experience this at the other end too, when your Uber or delivery driver jumps around the map on their way to you:
As well as finding your location, many apps want to store it too. There are 1,001 ways to do that: different amounts of data, different formats, different places to send it. What ends up happening, quite reasonably, is that various location-based app developers both capture and store location data in many different ways, and there are paid-for APIs and SDKs to help with pieces of the puzzle.
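To make the “many different ways” concrete, here’s the same single fix stored two common ways: as a GeoJSON Point (a real standard, which notably puts longitude first) versus a flat CSV-style row. The field names in the properties are just one choice among many:

```python
# One fix, two storage formats. Note GeoJSON's longitude-first order.
import json

lat, lon, ts = 37.7749, -122.4194, "2017-10-10T12:00:00Z"

geojson = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [lon, lat]},
    "properties": {"timestamp": ts},
}

csv_row = f"{ts},{lat},{lon}"

print(json.dumps(geojson))
print(csv_row)
```

Multiply that by timestamp conventions, accuracy fields, batching, and transport, and you can see how every app ends up with its own scheme.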
What’s changed over time is the value of this data. Aggregating vast amounts of anonymized location data helps with use cases such as building base maps. If you take the GPS traces of everyone, every day, you can figure out where all the roads are, their speed limits, and so on. The data is equally valuable for other uses, advertising and predicting stock prices being two examples: if you know how many people went to Walmart this week, you have an indication of how its stock will perform. Things like this appear to have driven the new $164M round for Mapbox – “Mapbox collects more than 200 million miles of anonymized sensor data per day”.
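The base-map idea can be sketched in a few lines: snap each GPS point to a coarse grid cell and count visits, and the heavily visited cells trace out the roads. Real pipelines are vastly more sophisticated (map matching, filtering, clustering), so this is only the intuition:

```python
# Toy sketch: aggregated GPS traces hint at road geometry. Cells that
# many independent traces pass through are likely to be roads.
from collections import Counter

def cell(lat, lon, size=0.001):  # roughly 100 m grid cells
    return (round(lat / size), round(lon / size))

traces = [
    [(37.7749, -122.4194), (37.7750, -122.4184), (37.7751, -122.4174)],
    [(37.7749, -122.4193), (37.7750, -122.4183), (37.7751, -122.4173)],
]

counts = Counter(cell(lat, lon) for trace in traces for (lat, lon) in trace)
likely_road = [c for c, n in counts.items() if n >= 2]
```

With millions of traces instead of two, the same counting exercise starts to reveal not just geometry but typical speeds per cell, which is where inferred speed limits come from.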
Mapbox is supported by a long list of backers, and it should remove a bunch of work when developing anything location-based, much as Auth0 removes having to set up custom authentication. For more, see the announcement blog post.