This article is part of a series going through the design and implementation of a sample app I made available on GitHub.

Last time we looked at how SignalR is used for real-time communication with the front-end. So this time let's look at the Vue front-end:

Architecture diagram with front-end, SignalR Service, Function App and Azure Maps

Azure Maps

As you can see from the above architecture diagram, the sample app uses Azure Maps for map tiles. There are a few reasons for this:

  1. The map tiles are good quality, in my experience
  2. You can authenticate with Entra ID / Azure AD
  3. Billing goes through the Azure subscription

You can also do things like reverse geocoding through the Azure Maps API, though the sample does not use it.
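As an aside, a reverse geocoding call is just an HTTP GET against the Azure Maps Search API. Here is a hedged sketch of building such a request URL; `buildReverseGeocodeUrl` is a name I made up, and you would still need to attach authentication (e.g. the same Entra ID bearer token used for map tiles) to the actual request:

```typescript
// Sketch: build a reverse geocoding request URL for the Azure Maps Search API.
// Note the query parameter takes "latitude,longitude" order, unlike GeoJSON,
// which uses [longitude, latitude].
function buildReverseGeocodeUrl(latitude: number, longitude: number): string {
  const params = new URLSearchParams({
    "api-version": "1.0",
    query: `${latitude},${longitude}`,
  });
  return `https://atlas.microsoft.com/search/address/reverse/json?${params}`;
}
```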

The front-end also uses the Azure Maps SDK for the map view. Any other library that is compatible (or can be made compatible) with Azure Maps map tiles could be used here instead. It would also be possible to use an entirely different service for map tiles, e.g. Google Maps.

The sample app initializes the map view in Map.vue:

// Slightly simplified code
const map = new atlas.Map("myMap", {
  center: [24.940806, 60.170218],
  zoom: 14,
  view: "Auto",
  authOptions: {
    authType: "anonymous",
    // Client ID comes from .env file
    // note this is not an Entra ID client ID;
    // you get this from the Maps resource
    clientId: import.meta.env.VITE_MAPS_CLIENT_ID,
    getToken: getMapsAccessToken,
  },
});

Note that we use Entra ID access tokens to authenticate to Azure Maps, generated with a user-assigned Managed Identity that has only the minimal Azure Maps Search and Render Data Reader role. You could also access Azure Maps on behalf of the signed-in user, which requires assigning the user a similar role on the Maps resource. Since the sample app does not require users to authenticate, that is not an option here.
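A token fetcher behind the getMapsAccessToken callback might look something like the following sketch. The backend endpoint path ("/maps-token") and the response shape are hypothetical, not from the sample app; the fetch function is injected to keep the sketch testable:

```typescript
// Sketch of a cached access token fetcher. The "/maps-token" endpoint
// and the { token, expiresInSeconds } response shape are hypothetical;
// adjust to whatever your token endpoint actually returns.
type TokenResponse = { token: string; expiresInSeconds: number };
type FetchLike = (url: string) => Promise<{ json(): Promise<TokenResponse> }>;

let cachedToken: string | null = null;
let cachedTokenExpiresAt = 0;

async function fetchMapsAccessToken(fetchFn: FetchLike): Promise<string> {
  const now = Date.now();
  // Refresh a minute before expiry so we never hand the SDK a dying token
  if (cachedToken !== null && now < cachedTokenExpiresAt - 60_000) {
    return cachedToken;
  }
  const body = await (await fetchFn("/maps-token")).json();
  cachedToken = body.token;
  cachedTokenExpiresAt = now + body.expiresInSeconds * 1000;
  return cachedToken;
}
```

In the browser, the getToken callback passed to authOptions would call something like this with the global fetch and hand the returned token to the SDK.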

We add layers to the map to show e.g. geofences and vehicles. Here is a slightly simplified version of the vehicle tracking layer configuration:

const locationTrackerDataSource = new atlas.source.DataSource();

map.events.add("ready", function () {
  // Once the map is ready, we can add data sources and layers
  map.sources.add(locationTrackerDataSource);
  // Create vehicle sprite
  map.imageSprite
    .createFromTemplate("car", "car", "teal", "#fff", 1)
    .then(function () {
      // Once the sprite is ready, we can create the layer that uses it
      // (Azure Maps gets upset if you try to do it earlier)
      const locationTrackerLayer = new atlas.layer.SymbolLayer(
        locationTrackerDataSource,
        undefined,
        {
          iconOptions: {
            image: "car",
            anchor: "center",
            ignorePlacement: true,
            allowOverlap: true,
            // Get heading property from the shape for the rotation
            // = shape.getProperties().heading
            rotation: ["get", "heading"],
            rotationAlignment: "map",
          },
        }
      );
      map.layers.add(locationTrackerLayer);
    });
});

This utilizes a built-in car sprite in the Azure Maps SDK. A bit of an oddity in the API design is how we define the rotation angle for the sprite. The line rotation: ["get", "heading"] tells the Azure Maps SDK to read the value of the heading property on the shape. This part of the API is capable of a lot more than just getting properties, but we are not going to go deeper down that rabbit hole.

Note that when we add a layer, we specify the data source the layer should use. Later, when we add things to the data source, they are reflected on this layer.

Updating vehicle locations on the map

The sample application uses the mitt library for client-side events. An event is sent when the front-end receives a vehicle location update through SignalR:

onMounted(() => {
  connection.on("locationUpdated", onLocationUpdated);
});

function onLocationUpdated(trackerId: string, latitude: number, longitude: number, timestamp: number) {
  emitter.emit('locationUpdated', { trackerId, latitude, longitude, timestamp });
}

Note that here, too, we have to be careful to put the latitude and longitude the right way around: the SignalR handler receives latitude first, while GeoJSON positions are [longitude, latitude].
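mitt is a tiny event emitter; conceptually, a typed version of what it provides boils down to something like this sketch (the real library is more general, this is just to illustrate the pattern, with an Events map type similar to the one the sample app uses):

```typescript
// Minimal typed event emitter in the spirit of mitt. The Events type maps
// event names to their payload types, so on/off/emit are type-checked.
type Events = {
  locationUpdated: {
    trackerId: string;
    latitude: number;
    longitude: number;
    timestamp: number;
  };
};

type Handler<T> = (payload: T) => void;

function createEmitter<E extends Record<string, unknown>>() {
  const handlers = new Map<keyof E, Handler<never>[]>();
  return {
    on<K extends keyof E>(type: K, handler: Handler<E[K]>): void {
      const list = handlers.get(type) ?? [];
      list.push(handler as Handler<never>);
      handlers.set(type, list);
    },
    off<K extends keyof E>(type: K, handler: Handler<E[K]>): void {
      const list = handlers.get(type) ?? [];
      handlers.set(type, list.filter((h) => h !== (handler as Handler<never>)));
    },
    emit<K extends keyof E>(type: K, payload: E[K]): void {
      for (const handler of handlers.get(type) ?? []) {
        (handler as Handler<E[K]>)(payload);
      }
    },
  };
}

const emitter = createEmitter<Events>();
```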

This event is received by a data store component (using the Pinia library):

export const useLocationTrackingStore = defineStore("locationTracking", () => {
  const trackers = ref<Record<string, LocationTrackerState>>({});

  function onLocationUpdated({
    trackerId,
    latitude,
    longitude,
    timestamp,
  }: Events["locationUpdated"]) {
    let tracker = trackers.value[trackerId];
    if (tracker === undefined) {
      // First event we have received for this vehicle,
      // can't calculate speed and heading yet
      tracker = {
        trackerId,
        previousLocation: null,
        previousEventSentTimestamp: null,
        previousEventReceivedTimestamp: null,
        latestLocation: new atlas.data.Point([longitude, latitude]),
        latestEventSentTimestamp: timestamp,
        latestEventReceivedTimestamp: Date.now(),
        heading: null,
        speed: null,
      };
      trackers.value[trackerId] = tracker;
    } else {
      tracker.previousLocation = tracker.latestLocation;
      tracker.previousEventSentTimestamp = tracker.latestEventSentTimestamp;
      tracker.previousEventReceivedTimestamp = tracker.latestEventReceivedTimestamp;
      tracker.latestLocation = new atlas.data.Point([longitude, latitude]);
      tracker.latestEventSentTimestamp = timestamp;
      tracker.latestEventReceivedTimestamp = Date.now();

      const heading = atlas.math.getPixelHeading(
        tracker.previousLocation,
        tracker.latestLocation
      );
      // Time delta calculated based on the event timestamps
      // so error is not introduced by delays in processing
      const deltaSeconds = atlas.math.getTimespan(
        tracker.previousEventSentTimestamp,
        timestamp,
        atlas.math.TimeUnits.seconds
      );
      const speed = atlas.math.getSpeed(
        tracker.previousLocation,
        tracker.latestLocation,
        deltaSeconds,
        "seconds",
        "kilometersPerHour",
        0
      );

      tracker.heading = heading;
      tracker.speed = speed;
    }

    emitter.emit("trackerUpdated", tracker);
  }

  function subscribeLocationEvents() {
    emitter.on("locationUpdated", onLocationUpdated);
  }

  function unsubscribeLocationEvents() {
    emitter.off("locationUpdated", onLocationUpdated);
  }

  return {
    trackers,
    subscribeLocationEvents,
    unsubscribeLocationEvents,
  };
});
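For intuition, the atlas.math helpers used above can be approximated with plain trigonometry. This is not the SDK's implementation, just the rough idea: great-circle distance via the haversine formula, and a heading approximated with atan2 on the coordinate deltas (fine for the short hops between consecutive location events):

```typescript
const EARTH_RADIUS_KM = 6371;

function toRadians(degrees: number): number {
  return (degrees * Math.PI) / 180;
}

// Great-circle distance between two [longitude, latitude] positions, in km
function haversineKm([lon1, lat1]: number[], [lon2, lat2]: number[]): number {
  const dLat = toRadians(lat2 - lat1);
  const dLon = toRadians(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRadians(lat1)) * Math.cos(toRadians(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
}

// Approximate heading in degrees, 0 = north, clockwise; ignores the
// cos(latitude) scaling, so only reasonable over short distances
function approximateHeading([lon1, lat1]: number[], [lon2, lat2]: number[]): number {
  const degrees = (Math.atan2(lon2 - lon1, lat2 - lat1) * 180) / Math.PI;
  return (degrees + 360) % 360;
}

function speedKmh(from: number[], to: number[], deltaSeconds: number): number {
  return (haversineKm(from, to) / deltaSeconds) * 3600;
}
```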

This data store uses some of the Azure Maps SDK's math utilities to figure out the heading and speed of the vehicle based on the previous two events. A "trackerUpdated" event is then sent, which is received in the map component:

onMounted(() => {
  emitter.on('trackerUpdated', onLocationTrackerUpdated);
});

function onLocationTrackerUpdated(tracker: Events['trackerUpdated']) {
  let shape = locationTrackerDataSource.getShapeById(tracker.trackerId);
  if (shape === null || shape === undefined) {
    // Vehicle is not yet on map, add shape to data source
    shape = new atlas.Shape(tracker.latestLocation, tracker.trackerId, {
      name: 'Tracker: ' + tracker.trackerId,
      // This is what the ["get", "heading"] from earlier reads
      heading: tracker.heading ?? 0,
      speed: tracker.speed,
      eventReceivedTimestamp: tracker.latestEventReceivedTimestamp,
    });
    locationTrackerDataSource.add(shape);
  } else {
    // Update properties on shape, updating the heading is reflected in the vehicle's direction
    shape.addProperty('heading', tracker.heading);
    shape.addProperty('speed', tracker.speed);
    shape.addProperty('eventReceivedTimestamp', tracker.latestEventReceivedTimestamp);
    // Update position on map
    shape.setCoordinates(tracker.latestLocation.coordinates);
  }
};

Adding a vehicle to the map is relatively simple: we add a new "shape" to the data source, which gets reflected on the map. Existing vehicles' properties get updated, and those updates are visible on the map as well:

Screenshot of map view showing three vehicles

Coordinate projection

Note that when we tell Azure Maps SDK to display a shape, we define the coordinates in the EPSG:4326 coordinate system. However, the map view cannot use this coordinate system as the map is projected to 2D (i.e. stretched at parts). So the map component converts the coordinates to EPSG:3857 for display.
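For reference, the EPSG:4326 to EPSG:3857 (Web Mercator) conversion comes down to a couple of formulas. This is a sketch of the math, not the SDK's actual code:

```typescript
// Convert a [longitude, latitude] position (EPSG:4326, degrees) to Web
// Mercator (EPSG:3857, meters). Latitudes must stay within roughly
// ±85.05°; the projection goes to infinity at the poles.
const EARTH_RADIUS_M = 6378137;

function toWebMercator([longitude, latitude]: number[]): [number, number] {
  const x = (longitude * Math.PI * EARTH_RADIUS_M) / 180;
  const y =
    EARTH_RADIUS_M * Math.log(Math.tan(Math.PI / 4 + (latitude * Math.PI) / 360));
  return [x, y];
}
```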

This is all handled by the component; no action is needed from the developer. It is good to be aware of, though, as you may run into the other system's coordinates at times. Map views also have an "interesting" quirk with these coordinates: if you pan the map west or east until you reach the same position again, the coordinates are not the same. For example, if we are at longitude 0 and pan the map east until we come back to that same point, the longitude will be 360! A full circle is added or subtracted depending on which direction you went.

For this reason the Function App has methods to normalize coordinate values:

private static double NormalizeLongitude(double longitude)
{
  while (longitude < -180)
  {
    longitude += 360;
  }

  while (longitude > 180)
  {
    longitude -= 360;
  }

  return longitude;
}
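The same normalization can be done client-side too. For example, here is a TypeScript variation using modulo arithmetic instead of loops (my own sketch, not from the sample app):

```typescript
// Wrap any longitude into [-180, 180) in one step, regardless of how many
// full circles were added or subtracted. Note that 180 itself maps to -180,
// which is the same meridian.
function normalizeLongitude(longitude: number): number {
  return ((longitude % 360) + 540) % 360 - 180;
}
```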

The Function App also has a method to normalize the latitude, but since you can't pan the map past the poles like that, it should not really be needed.

Front-end hosting in Azure Functions

We could host the front-end in e.g. Azure Static Web Apps, but for simplicity's sake in the sample it is hosted in the same Function App as the Event Hub handlers. In local development, the front-end can also be run independently.

In order to host the index.html at the root of the Function App, we need a small adjustment to host.json:

{
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}

This removes the standard "/api" prefix from HTTP routes. Then we can define the index HttpTrigger:

[Function(nameof(GetIndex))]
public async Task<HttpResponseData> GetIndex(
  [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "/")] HttpRequestData req)
{
  var lastIndexWrite = File.GetLastWriteTime(Path.Combine(_staticFilesDirectory, "index.html"));
  if (CachedIndexHtml == null || lastIndexWrite > CachedIndexHtmlLastWrite)
  {
    var indexHtmlContent = await File.ReadAllTextAsync(Path.Combine(_staticFilesDirectory, "index.html"));
    CachedIndexHtml = indexHtmlContent;
    CachedIndexHtmlLastWrite = lastIndexWrite;
  }

  var res = req.CreateResponse(HttpStatusCode.OK);
  res.Headers.Add("Content-Type", "text/html; charset=utf-8");
  await res.WriteStringAsync(CachedIndexHtml, Encoding.UTF8);
  return res;
}

Other static files can be handled by another HttpTrigger:

[Function(nameof(GetStaticFile))]
public async Task<HttpResponseData> GetStaticFile(
  [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "static/{**filePath}")] HttpRequestData req,
  string filePath,
  FunctionContext functionContext)
{
    var logger = functionContext.GetLogger(nameof(GetStaticFile));
    var mappedFilePath = GetStaticFilePath(filePath);

    try
    {
      var res = req.CreateResponse(HttpStatusCode.OK);
      res.Headers.Add("Content-Type", GetContentType(mappedFilePath));
      await res.WriteBytesAsync(await File.ReadAllBytesAsync(mappedFilePath));
      return res;
    }
    catch (FileNotFoundException)
    {
      logger.LogWarning("Unable to find static file at path {filePath}", filePath);
      return req.CreateResponse(HttpStatusCode.NotFound);
    }
}
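The GetContentType helper referenced above maps the file extension to a MIME type. Roughly the idea, sketched here in TypeScript for consistency with the rest of the article's snippets (the actual helper in the sample is C#, and its mapping may differ):

```typescript
// Map a file path to a Content-Type header value based on its extension.
// Only the handful of types a typical Vite build produces are covered;
// anything unknown falls back to application/octet-stream.
const contentTypes: Record<string, string> = {
  ".html": "text/html; charset=utf-8",
  ".js": "text/javascript",
  ".css": "text/css",
  ".json": "application/json",
  ".svg": "image/svg+xml",
  ".png": "image/png",
  ".ico": "image/x-icon",
};

function getContentType(filePath: string): string {
  const dotIndex = filePath.lastIndexOf(".");
  const extension = dotIndex >= 0 ? filePath.slice(dotIndex).toLowerCase() : "";
  return contentTypes[extension] ?? "application/octet-stream";
}
```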

You can check the full code for the HTTP triggers on GitHub. The front-end build is configured to produce its content into the Function App's static folder:

export default defineConfig({
  // Other configurations removed for brevity
  build: {
    outDir: "../AzureLocationTracking.Functions/static",
    emptyOutDir: true,
  },
});

After running npm run build, we can publish the Function App to Azure together with the front-end files.

Summary

This was a quick look into the sample app's front-end and how it uses Azure Maps. I don't currently do any professional work with Vue (we use React at Zure), but it's pretty fun to work with. Next time, we will look at some performance optimizations that I've done in the sample. I'm currently working on an alternate version that uses Cosmos DB instead of Azure SQL. So far the results are quite promising; I'll write about those in that next part.