At the Worldwide Developers Conference (WWDC), Apple introduced its first augmented reality framework - ARKit. It makes it possible to create impressive augmented reality experiences for iPhone and iPad with a minimum of code.
By blending digital objects and information with the environment around you, ARKit takes apps beyond the screen, freeing them to interact with the real world in entirely new ways.
The main advantage of this framework is that Apple's engineers optimize it at a level of the system that third-party developers cannot reach on their own. What does that mean for third-party developers? It means our team of iOS developers can spend more resources and execute more operations while still keeping the desired 60 frames per second.
Key Reasons to Choose CoreLocation
Our team of iOS developers chose a navigation feature as an example, for the following reasons:
- you can see where to go and where to turn, so you don't miss a turn;
- as you walk, you can look around and see information about the places you pass, which helps you get to know the area better;
- navigation is not limited to a predefined area; it can be used anywhere on Earth.
Major Challenges
While working with the ARKit framework, our team had to figure out how to map CoreLocation data into ARKit's coordinate system. We faced the following challenges:
- True north and magnetic north lie at different points, and navigation orients itself by magnetic north. This difference has to be taken into account, or AR points will end up in the wrong places (a sketch of how it can be handled follows this list).
- Our planet is an ellipsoid, while the world inside AR is an infinite plane. Over short distances this is not a problem, and the curvature of a route can even be neglected, but it does become one when transferring geographic coordinates onto the plane.
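Here is a minimal sketch of two ways the heading difference can be accounted for: either ask ARKit to align the AR world to true north, or read the declination from CoreLocation and correct computed bearings by it. The names `makeTrueNorthConfiguration()`, `HeadingObserver` and `declinationCorrection` are illustrative names of ours, not framework API.

```swift
import ARKit
import CoreLocation

/// Option 1: ask ARKit to align the AR world to true north itself.
/// With .gravityAndHeading the session's -Z axis points to true north,
/// so bearings computed from geographic coordinates can be used directly.
func makeTrueNorthConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.worldAlignment = .gravityAndHeading
    return configuration
}

/// Option 2: measure the declination with CoreLocation and rotate our
/// own computed bearings by that amount.
final class HeadingObserver: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private(set) var declinationCorrection: CLLocationDirection = 0

    override init() {
        super.init()
        locationManager.delegate = self
        // trueHeading is only reported when location services are available.
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        locationManager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // trueHeading is negative while it cannot be determined yet.
        guard newHeading.trueHeading >= 0 else { return }
        declinationCorrection = newHeading.trueHeading - newHeading.magneticHeading
    }
}
```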
Our Solutions
Fortunately, both problems can be solved with math: the necessary formulas have already been derived. We only needed to adapt them to the data obtained from navigation and pass the results to ARKit.
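As a minimal sketch of that adaptation, here is how a geographic coordinate can be projected onto the local plane around the user, assuming the session runs with the .gravityAndHeading alignment shown earlier (+X pointing east, -Z pointing north). The helper name `arOffset(from:to:)` and the spherical approximation are our own simplifications, not ARKit or CoreLocation API.

```swift
import Foundation
import CoreLocation
import simd

/// Approximate Earth radius in meters (a spherical model is good enough
/// for the short distances an AR route covers).
private let earthRadius: Double = 6_371_000

/// Projects the target coordinate onto the local tangent plane around the
/// user and returns the offset in meters as an ARKit translation.
func arOffset(from user: CLLocationCoordinate2D,
              to target: CLLocationCoordinate2D) -> SIMD3<Float> {
    let latUser = user.latitude * .pi / 180
    let latTarget = target.latitude * .pi / 180
    let deltaLat = latTarget - latUser
    let deltaLon = (target.longitude - user.longitude) * .pi / 180

    // Equirectangular approximation: over a few kilometers the error
    // caused by ignoring the ellipsoid is negligible.
    let north = deltaLat * earthRadius
    let east = deltaLon * earthRadius * cos(latUser)

    // ARKit convention with .gravityAndHeading: +X is east, +Y is up, -Z is north.
    return SIMD3<Float>(Float(east), 0, Float(-north))
}
```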
Having received the new coordinates, we placed them in the AR world to see the first result. We noticed an interesting detail: AR uses a real-world unit of measurement - meters - which is familiar to everyone.
Thus our initial objective was achieved: we placed points in space based on geodata.
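A minimal sketch of that placement step, building on the `arOffset(from:to:)` helper above: we assume the AR world origin is where the user started the session, `sceneView` is an `ARSCNView`, and `placeMarker(at:userCoordinate:in:)` is an illustrative name of ours.

```swift
import ARKit
import SceneKit
import CoreLocation

/// Places a marker node for one route point. Since ARKit already measures
/// everything in meters, the projected offset can be used directly as the
/// node position relative to the world origin.
func placeMarker(at coordinate: CLLocationCoordinate2D,
                 userCoordinate: CLLocationCoordinate2D,
                 in sceneView: ARSCNView) {
    let offset = arOffset(from: userCoordinate, to: coordinate)

    let marker = SCNNode(geometry: SCNSphere(radius: 0.25))
    marker.position = SCNVector3(x: offset.x, y: offset.y, z: offset.z)
    sceneView.scene.rootNode.addChildNode(marker)
}
```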
Improvements
Having reviewed the result in detail, we discovered two inaccuracies:
1. At times, all the points were placed away from the route.
2. Sometimes the route was drawn with a deviation of several degrees.
We set out to solve these issues:
- At startup, the points were sent for placement before the ARKit session had been fully adjusted and started. As a result, by the time the session was running, the correct coordinates had already been lost and all the points were placed incorrectly, at the coordinate origin. We therefore hold the points back until the session is fully ready (the second sketch after this list shows this check).
- The last stage was to draw a line between the points to display the route to follow. We had to replace the points with line segments, give each segment the required length, and turn each segment toward the beginning of the following one. Here is the solution:
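What follows is a minimal sketch of one way to build such a segment with SceneKit, assuming the two endpoints have already been projected into AR coordinates; the helper name `routeSegment(from:to:)` is ours, not a SceneKit API.

```swift
import SceneKit
import CoreGraphics

/// A thin box stretched to the distance between two projected points,
/// positioned at their midpoint and turned toward the end point.
func routeSegment(from start: SCNVector3, to end: SCNVector3) -> SCNNode {
    let dx = end.x - start.x
    let dy = end.y - start.y
    let dz = end.z - start.z
    let distance = CGFloat((dx * dx + dy * dy + dz * dz).squareRoot())

    // The box length along its local Z axis matches the gap between the points.
    let geometry = SCNBox(width: 0.2, height: 0.05, length: distance, chamferRadius: 0)
    let node = SCNNode(geometry: geometry)

    // Place the segment halfway between the two points...
    node.position = SCNVector3(x: (start.x + end.x) / 2,
                               y: (start.y + end.y) / 2,
                               z: (start.z + end.z) / 2)
    // ...and turn its local Z axis toward the end point.
    node.look(at: end, up: SCNVector3(0, 1, 0), localFront: SCNVector3(0, 0, 1))
    return node
}
```

As for the first issue, here is a minimal sketch of the check mentioned above: the route coordinates are held back until ARKit reports normal tracking, and only then converted and placed. `RoutePlacer`, `pendingCoordinates` and `placeRoute(_:)` are illustrative names of ours, not ARKit API.

```swift
import ARKit
import CoreLocation

// Assign an instance as sceneView.session.delegate before running the session.
final class RoutePlacer: NSObject, ARSessionDelegate {
    var pendingCoordinates: [CLLocationCoordinate2D] = []
    private var sessionIsReady = false

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        // Before tracking becomes .normal the session is still adjusting its
        // origin, so anything placed now ends up near the coordinate origin.
        guard case .normal = camera.trackingState, !sessionIsReady else { return }
        sessionIsReady = true
        placeRoute(pendingCoordinates)
    }

    private func placeRoute(_ coordinates: [CLLocationCoordinate2D]) {
        // Project each coordinate with arOffset(from:to:) and add marker and
        // segment nodes, as shown in the earlier sketches.
    }
}
```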
What comes next?
Of course, this result can still be improved. For example, elements can be limited to a certain distance from the user (a sketch of such a filter follows this list). This helps:
1. to remove graphic artifacts on objects that are far away;
2. to avoid the frame-rate drop and battery drain that occur when every segment of a long and complex route is displayed at once.
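A minimal sketch of such a distance limit, assuming the route points are available as `CLLocation` values; `maxVisibleDistance` and `visiblePoints(of:around:)` are illustrative names of ours.

```swift
import CoreLocation

/// Only route points within this radius of the user are rendered; tune as needed.
let maxVisibleDistance: CLLocationDistance = 500 // meters

/// Filters out points that are too far away to be worth drawing.
func visiblePoints(of route: [CLLocation], around user: CLLocation) -> [CLLocation] {
    route.filter { $0.distance(from: user) <= maxVisibleDistance }
}
```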
The DDI Development team continues to work on graphic improvements, such as hiding objects that have already been passed. We are also planning to refine the algorithm that assesses the accuracy of the geolocation data.