In early March, I had the opportunity to attend the Esri Developer Summit in Palm Springs, CA. The conference was jam-packed with 4 days of technical sessions and social events, including a dodgeball tournament! At the conference, I gave a presentation titled “Smart Palettes – Editing in the ArcGIS Runtime for .NET”. My talk focused on the architecture our team used to build reusable edit palettes for ground and aerial data collection. I shared the logging and telemetry we set up to measure the successes and shortcomings of our user interface. I wrapped up with a ‘look ahead’ at how we are trying to use the collected data to make our palettes smarter and more intuitive, and to provide a more streamlined approach to data capture out in the field. Outside of my presentation, I focused my schedule on attending Esri staff and user presentations about .NET, Mobile App Development, Desktop App Development, and User Experience (UX). Here are some of the highlights from my time at the dev summit:
At the 2017 Esri Developer Summit (last year’s conference), the runtime team shared their approach for a completely redesigned runtime SDK focused on code sharing and a more consistent API across each supported language platform. This year, a lot of sessions built on that announcement by highlighting the functional gaps that have been closed between runtime version 100 and both the previous version (10.2.x) and ArcGIS Engine, in hopes of encouraging the migration of existing projects and new development to version 100. New features released this year, along with those planned for the next summer release, were also highlighted, including an increased ability to interact with 3D data, WFS read support (write support coming soon), and potentially new AR/VR capabilities (see the section below).
Earlier this year, a big announcement in the .NET Core/tooling world was a new project type that lets you target multiple platforms, so-called ‘Multitargeting’ (introduced in Visual Studio 2017). This project type was a big highlight of tech sessions at the conference, with Esri developers showing sample projects that shared ArcGIS Runtime mapping code between WPF, UWP, and Xamarin iOS and Android projects. Multitargeted projects are an evolution of Shared Projects that promotes code sharing across all platforms, and I’m excited to try it out on an existing UWP project to extend support to the Xamarin and WPF platforms. Check out the Visual Studio Multitargeting Project templates on GitHub to get started.
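To make the code-sharing idea concrete, here is a minimal sketch of the kind of mapping code that could live in a multitargeted project and be consumed unchanged by WPF, UWP, and Xamarin apps. The class name and viewpoint values are purely illustrative, and it assumes the 100.x Esri.ArcGISRuntime NuGet package is referenced for each target framework:

```csharp
// Shared, platform-agnostic mapping code: each platform-specific app only needs
// to hand the returned Map to its own MapView control.
using Esri.ArcGISRuntime.Geometry;
using Esri.ArcGISRuntime.Mapping;

public static class SharedMapFactory
{
    public static Map CreateDefaultMap()
    {
        return new Map(Basemap.CreateTopographic())
        {
            // Start centered on Palm Springs, CA (WGS84 longitude/latitude).
            InitialViewpoint = new Viewpoint(
                new MapPoint(-116.5453, 33.8303, SpatialReferences.Wgs84), 50000)
        };
    }
}
```

Platform-specific pieces (device sensors, file pickers, and the like) can stay behind #if blocks or per-platform partial classes, while everything else compiles from the same source for each target framework.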
This year, a new Augmented and Virtual Reality beta SDK was announced for the .NET, Android, and iOS ArcGIS Runtime SDKs. The SDK extends existing 3D capabilities to use device sensors to interact with a SceneView for either a virtual or augmented reality experience. On the iOS side, it leverages ARKit, while the Android side has plans to use Google’s ARCore. These demos were crowd pleasers and made their way not only into tech sessions, but also into the plenary. Additionally, the team mentioned that the AR/VR SDK could make its way into an official runtime release in the future if the beta goes well. I’m interested in experimenting with the SDK to do 3D data capture in augmented reality. My idea would be to take the concept of a laser range finder, but instead of using specific hardware, use the phone’s camera and augmented reality. Adding a cross-hair to the phone’s camera view would make it easier to capture and view data in AR. Of course, good 3D-ready data is going to be an obstacle here, but it seems like a promising beta SDK, and I’ll keep my fingers crossed for a commercial release. To get access to the early adopter community/beta, send an email to
Over the course of the past year or so, I have worked on integrating a Trimble R2 GNSS Receiver with a Universal Windows Platform (UWP) application that uses the ArcGIS Runtime for mapping capabilities. So, at the conference, I sought out every opportunity to swap war stories about high accuracy GPS and see how developers and Esri staff were approaching this challenge. I attended a session on using GNSS receivers with Collector and talked to a few developers on the apps team about strategies for using GNSS receivers with the runtime. I learned about a new application called ‘Aurora’ (simply stated, it’s a new version of Collector targeting the 100.x version of the ArcGIS Runtime) that is going to have a lot of high accuracy GPS features baked in. I thought it was interesting that the team decided to write a new application for this upgrade, and I was curious about their approach to designing a different user experience for it. At the “ArcGIS Runtime Road Ahead” session, I learned that the team is working on building a high accuracy GPS API into the runtime. For details on what that means exactly, see the issue I logged during the “Ask Us Anything” session with the runtime team: https://github.com/Esri/runtime-questions/issues/146.
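For anyone working the same problem, one way to wire a receiver like the R2 into the runtime’s location display is a custom LocationDataSource. The sketch below is illustrative rather than the exact code from my app: whatever code actually talks to the receiver (Bluetooth/serial NMEA parsing, a vendor SDK, etc.) just calls PushFix for every position it produces.

```csharp
// A rough sketch of driving the ArcGIS Runtime location display from an external
// GNSS receiver via a custom LocationDataSource (ArcGIS Runtime .NET 100.x assumed).
using System.Threading.Tasks;
using Esri.ArcGISRuntime.Geometry;
using Esri.ArcGISRuntime.Location;

public class ExternalGnssLocationDataSource : LocationDataSource
{
    private bool _started;

    protected override Task OnStartAsync()
    {
        // Open the connection to the receiver here (omitted); once started,
        // incoming fixes are forwarded through PushFix below.
        _started = true;
        return Task.CompletedTask;
    }

    protected override Task OnStopAsync()
    {
        // Close the receiver connection here (omitted).
        _started = false;
        return Task.CompletedTask;
    }

    // Called by the receiver-handling code for every position it reports.
    public void PushFix(double latitude, double longitude, double horizontalAccuracy,
                        double velocity = 0, double course = 0)
    {
        if (!_started) return;

        var position = new MapPoint(longitude, latitude, SpatialReferences.Wgs84);

        // UpdateLocation is what moves the location symbol (and its accuracy halo) on the map.
        UpdateLocation(new Esri.ArcGISRuntime.Location.Location(
            position, horizontalAccuracy, velocity, course, false));
    }
}

// Usage (e.g. in a UWP page with a MapView named MyMapView):
//   MyMapView.LocationDisplay.DataSource = new ExternalGnssLocationDataSource();
//   MyMapView.LocationDisplay.IsEnabled = true;
```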
One of the standout sessions was given by Nick Black and Dawit Elias (UX engineers from Esri) on their redesign of the Aurora application based on user feedback and usage data from Collector. They shared their team’s approach for creating new interfaces based on Google’s “Design Sprint” and gave a lot of tips and best practices for discussing UI without losing your mind in a team setting. One simple example they gave for the Aurora redesign was, ‘Don’t force a user to switch to GPS if they normally use it.’ The team received a lot of feedback about the annoyance of having to toggle on GPS mode to start a data capture session. The data capture session was such a standard state for the app that it made far more sense to just keep the GPS on all the time. Making this change to keep the GPS on by default saved users a lot of time. Additionally, they added a long press gesture to the map during an edit session to bring up cross-hairs, which could be moved to accurately place a point without worrying about fat-fingering its location. Even though I’m not spending a lot of my day-to-day doing UX, this presentation was very motivating, and I’m going to try to incorporate concepts from the “Design Sprint” into my own team’s approach to new user interfaces.
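As a rough illustration of that cross-hair interaction (my own sketch, not the Aurora team’s code), a long-press handler in a .NET runtime app could drop a cross-hair graphic at the held location. The GeoViewHolding event, the overlay wiring, and the marker styling here are assumptions, and the drag-to-fine-tune behavior is left out:

```csharp
// Places a cross-hair graphic wherever the user long-presses during an edit session.
// Moving the cross-hair afterward (drag handling) is omitted for brevity.
using Esri.ArcGISRuntime.Symbology;
using Esri.ArcGISRuntime.UI;
using Esri.ArcGISRuntime.UI.Controls;

public static class CrosshairEditing
{
    public static void Attach(MapView mapView)
    {
        var overlay = new GraphicsOverlay();
        mapView.GraphicsOverlays.Add(overlay);

        mapView.GeoViewHolding += (s, e) =>
        {
            // e.Location is the map point under the user's finger when the hold fires.
            // The color type can vary by runtime version; System.Drawing.Color is assumed here.
            var crosshair = new SimpleMarkerSymbol(
                SimpleMarkerSymbolStyle.Cross, System.Drawing.Color.Red, 20);

            overlay.Graphics.Clear();
            overlay.Graphics.Add(new Graphic(e.Location, crosshair));
        };
    }
}
```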
A standout user presentation was given by the authors of the mobile app “Fish Washington” from the Washington Department of Fish & Wildlife (WDFW). The free mobile app uses the ArcGIS Runtime to convey up-to-the-minute fishing regulations on every body of water in the state. Jake Shapley and Melody Alefteras (engineers at WDFW) gave an awesome presentation on how they combined one ArcGIS Server instance with static content hosted on AWS S3 to deliver data updates to users of the application in an unconventional way. It was neat to see a consumer-facing app that uses the ArcGIS Runtime with a unique approach to saving on ArcGIS Server licensing costs and keeping data up to date in the field.
Some of the best moments of the Dev Summit came from wandering the expo and striking up conversations with other developers and Esri staff. Outside of the conference, I visited the Palm Springs Aerial Tramway for some much-needed hiking after a flight from the East Coast. The views of the desert valley were incredible, and it was a nice kick-off to the week. Not only was this my first time attending the conference, but it was also my first trip out to California – I’ll definitely be back next year!