Monitoring the NuGet feed using Azure Functions
In my previous post, I showed you a serverless way to accomplish this using Microsoft Flow. Today, let's explore another one. Namely, Azure Functions.
To start, you'll need an account on Azure. A simple pay-as-you-go subscription works great here.
Using Visual Studio
For the first part of this walkthrough, we'll develop our Azure Function using Visual Studio and the Azure Function Tools. As of this writing, this requires the latest Preview of Visual Studio 2017 (15.3) in order for the Tools to install.
Once you've got those two installed, it's time to get coding. As we wrapped up the last post using Flow, one could easily reach the conclusion that doing it that way was quite a bit of work, trying to "shove a square peg into a round hole," if you will. So, let's try to do it all with code, shall we?
In your Visual Studio Preview installation (which now has the Azure Functions Tools installed), go to File | New Project | Azure Functions.
Once your new project is created, we have to create the actual code for the function. To do this, right-click the project and choose Add | New Item... | Azure Function.
In the next dialog, choose TimerTrigger, name it appropriately, and set the schedule to 0 0 */6 * * * (every 6 hours).
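A quick note on that schedule string: Azure Functions timer triggers use a six-field NCRONTAB expression that includes a seconds field, not the classic five-field cron format. Broken down, it looks like this:

```text
{second} {minute} {hour} {day} {month} {day-of-week}
    0        0     */6    *      *          *
```

So 0 0 */6 * * * fires at second 0, minute 0 of every sixth hour.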
You should now have a new Function class with this content:
public static class Function1
{
    [FunctionName("NugetPoller")]
    public static void Run([TimerTrigger("0 0 */6 * * *")]TimerInfo myTimer, TraceWriter log)
    {
        log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    }
}
And, of course, we'll now fill out the code accordingly. But first, let's think about the endpoints we're going to need.
- Blob storage: We'll still need a place to store state for our function (the last version we saw from the feed), because Azure Functions, too, are stateless.
- Alerting mechanism: How do we plan on sending an alert from our function? Sure, we can do all the logic to determine if one is needed, but what does sending one look like? To accomplish this, we're going to tie a webhook-enabled Flow to our Function - easy peasy!
Set up an Alert endpoint in Microsoft Flow
Create a new Flow like we did last time, but for the first step, search for 'Request' and choose the 'Request/Response - Request' trigger:
Notice the cool part of this Trigger when it gets created in your Flow: "URL will be created after save" - that's right, Microsoft Flow will give you a unique URL to hit this Flow with and kick it off. For ours, we'll want to send a message to the Flow to use in the alert, so let's define our schema from this simple JSON object:
{
    "message": "our message"
}
by clicking the "Use sample payload to generate schema" link at the bottom of the Request action.
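Flow turns that sample payload into a JSON schema for the trigger; the generated schema should look something like this (yours may differ slightly):

```json
{
    "type": "object",
    "properties": {
        "message": {
            "type": "string"
        }
    }
}
```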
Next, set up the alert step. Click 'Add an action' and search for 'notification':
You can choose mobile, e-mail, or heck, both. I'll choose mobile for now. Set it up so the message for the mobile notification is whatever we send through to the request for step 1:
Name & create the Flow and you're done!
After you click 'Create flow', expand the 'Request' trigger (Step 1) and notice you now have a URL filled in.
Click the 'copy' button and take that over to your local.settings.json in your new Azure Function.
Continue implementing the Function
You can quickly & easily test your Flow trigger with Postman:
Important: Make sure you send the content-type header with the value application/json, or your Flow will error out.
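For reference, the raw test request looks something like this (the URL here is just the elided endpoint from above; substitute your own Flow's full URL):

```http
POST https://prod-21.westcentralus.logic.azure.com:443/workflows/... HTTP/1.1
Content-Type: application/json

{
    "message": "Test alert from Postman"
}
```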
So, let's put a variable in our local.settings.json for the Flow endpoint:
"FlowAlertEndpoint": "https://prod-21.westcentralus.logic.azure.com:443/workflows/...",
Webjob storage
Azure Functions also require storage of their own just to execute. You could either reuse this storage to store your variables, or set up something separate. For me, I did it separately. Note that in our previous configuration for Microsoft Flow, you may have created a Blob Storage account. Azure Functions requires a General Purpose storage account (not a Blob-only one) as its Webjob storage, so make sure you choose the right options when setting this up.
The value for the Webjob storage you set up must also go in the settings JSON file, in the AzureWebJobsStorage variable:
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=..."
It's also not advisable to use the Azure Storage Emulator with Azure Functions.
Next, add a local setting pointing to the same Azure Storage account you set up for the Flow we created in the last blog post (or set up a new Azure Storage account if you're doing this for the first time). Put its connection string into local settings like so:
"StorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=..."
Now that our local.settings.json is fully configured, it should look something like this:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=...",
        "AzureWebJobsDashboard": "",
        "FlowAlertEndpoint": "https://prod-21.westcentralus.logic.azure.com:443/workflows/...",
        "PackageId": "Microsoft.Bot.Builder",
        "StorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=..."
    }
}
I also put the package id in my settings. This will enable me to reuse this Azure Function as many times as I want to monitor different packages (you could also delimit a list of them if you wanted).
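If you did want the delimited-list variant, a small helper (hypothetical, not part of the sample) could split the setting into individual ids before the polling logic runs:

```csharp
using System;
using System.Linq;

public static class PackageIdSetting
{
    // Splits a comma-delimited PackageId setting (e.g. "Microsoft.Bot.Builder,Newtonsoft.Json")
    // into individual package ids, trimming whitespace and ignoring empty entries.
    public static string[] Parse(string setting) =>
        (setting ?? string.Empty)
            .Split(',')
            .Select(id => id.Trim())
            .Where(id => id.Length > 0)
            .ToArray();
}
```

You'd then run the check/alert logic once per id, using each id as its own blob name so every package tracks its own last-seen version.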
Now it's time to move on to the code.
First, in order to use the Azure Storage endpoint to pull our blob containing the last version number, we'll need to add a reference to the WindowsAzure.Storage package from NuGet.
In the code, I start by pulling my environment variables out into constants (or, more precisely, static readonly fields):
private static readonly string NUGET_PACKAGE_ID = Environment.GetEnvironmentVariable(@"PackageId");
private static readonly Uri NUGET_QUERY_URI = new Uri($@"https://api-v2v3search-0.nuget.org/query?q={NUGET_PACKAGE_ID}&prerelease=true");
private static readonly string STORAGE_CONNECTION_STRING = Environment.GetEnvironmentVariable(@"StorageConnectionString");
private static readonly Uri FLOW_ALERT_ENDPOINT = new Uri(Environment.GetEnvironmentVariable(@"FlowAlertEndpoint"));
And now, the execution:
- Step 1: Check the NuGet endpoint:
dynamic nugetResults;
using (var client = new HttpClient { BaseAddress = NUGET_QUERY_URI })
{
    var resultsJson = await client.GetStringAsync(string.Empty);
    nugetResults = JObject.Parse(resultsJson);
}
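For context, the search service returns JSON along these lines (heavily abbreviated to the fields we actually read; the values shown are illustrative):

```json
{
    "totalHits": 1,
    "data": [
        {
            "id": "Microsoft.Bot.Builder",
            "version": "3.8.1"
        }
    ]
}
```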
- Step 2: Next, whittle this down to only the entries whose ids exactly match what we're looking for:
// we only care about packages that *exactly* match the id, not others that might be returned from a search
dynamic targetPackage = ((IEnumerable<dynamic>)nugetResults.data)
    .SingleOrDefault(i => ((string)i.id).Equals(NUGET_PACKAGE_ID, StringComparison.OrdinalIgnoreCase));
if (targetPackage != null)
{
    string version = targetPackage.version;
    log.Info($@"Package found. Latest version: {version}");
    // step 3 goes here
}
else
{
    log.Info($@"No package found with id {NUGET_PACKAGE_ID}");
}
- Step 3: Read our blob from storage to get the last version we saw:
var storageClient = CloudStorageAccount.Parse(STORAGE_CONNECTION_STRING).CreateCloudBlobClient();
var container = storageClient.GetContainerReference(@"functionvars");
container.CreateIfNotExists();
// get the blob that contains the last version we saw for this package. We're naming it the same as our package id
var blob = container.GetBlockBlobReference(NUGET_PACKAGE_ID);
// step 4 goes here
- Step 4: What happens if it's the first run (no blob exists yet)? In my case, I chose to create & store the first version, but not send an alert. You can choose to send an alert (using the upcoming alert code) if you want.
if (!blob.Exists())
{   // if we haven't processed this package before, just set our baseline version
    log.Info(@"First time we've seen this package. Storing version.");
    blob.UploadText(version);
}
// step 5 goes here
- Step 5: Otherwise, check the NuGet feed version against the blob version:
else
{
    var lastSeenVersion = new StreamReader(blob.OpenRead()).ReadToEnd();
    log.Info($@"Last version we saw was {lastSeenVersion}");
    if (!lastSeenVersion.Equals(version, StringComparison.OrdinalIgnoreCase))
    // step 6 goes here
}
- Step 6: Send an alert via our Flow endpoint:
{   // if the latest version of the pkg on nuget doesn't match the last one we've seen, it's new!
    log.Info($@"Notifying!");
    using (var notificationClient = new HttpClient { BaseAddress = FLOW_ALERT_ENDPOINT })
    {
        await notificationClient.PostAsJsonAsync(string.Empty, new { message = $@"New version of {NUGET_PACKAGE_ID} has been published to NuGet. Version {version}" });
    }
    // step 7 goes here
}
- Step 7: Update the blob in storage:
// update the last seen version in the associated blob entry of our Storage account
blob.UploadText(version);
And that's it! The finished product should look something like this:
[FunctionName("Nuget6HourAlert")]
public static async Task Run([TimerTrigger("0 0 */6 * * *")]TimerInfo myTimer, Microsoft.Azure.WebJobs.Host.TraceWriter log)
{
    dynamic nugetResults;
    using (var client = new HttpClient { BaseAddress = NUGET_QUERY_URI })
    {
        var resultsJson = await client.GetStringAsync(string.Empty);
        nugetResults = JObject.Parse(resultsJson);
    }
    // we only care about packages that *exactly* match the id, not others that might be returned from a search
    dynamic targetPackage = ((IEnumerable<dynamic>)nugetResults.data)
        .SingleOrDefault(i => ((string)i.id).Equals(NUGET_PACKAGE_ID, StringComparison.OrdinalIgnoreCase));
    if (targetPackage != null)
    {
        string version = targetPackage.version;
        log.Info($@"Package found. Latest version: {version}");
        var storageClient = CloudStorageAccount.Parse(STORAGE_CONNECTION_STRING).CreateCloudBlobClient();
        var container = storageClient.GetContainerReference(@"functionvars");
        container.CreateIfNotExists();
        // get the blob that contains the last version we saw for this package. We're naming it the same as our package id
        var blob = container.GetBlockBlobReference(NUGET_PACKAGE_ID);
        if (!blob.Exists())
        {   // if we haven't processed this package before, just set our baseline version
            log.Info(@"First time we've seen this package. Storing version.");
            blob.UploadText(version);
        }
        else
        {
            var lastSeenVersion = new StreamReader(blob.OpenRead()).ReadToEnd();
            log.Info($@"Last version we saw was {lastSeenVersion}");
            if (!lastSeenVersion.Equals(version, StringComparison.OrdinalIgnoreCase))
            {   // if the latest version of the pkg on nuget doesn't match the last one we've seen, it's new!
                log.Info($@"Notifying!");
                using (var notificationClient = new HttpClient { BaseAddress = FLOW_ALERT_ENDPOINT })
                {
                    await notificationClient.PostAsJsonAsync(string.Empty, new { message = $@"New version of {NUGET_PACKAGE_ID} has been published to NuGet. Version {version}" });
                }
                // update the last seen version in the associated blob entry of our Storage account
                blob.UploadText(version);
            }
        }
    }
    else
    {
        log.Info($@"No package found with id {NUGET_PACKAGE_ID}");
    }
}
Publish your project to a new Azure Functions endpoint, and you're off and running!
The IDE-less approach
That was well & good, but what about an even simpler way to create a serverless execution? Let's see what we can do from right within the Azure Portal.
Open the portal, click '+', and search for 'functions'. Create a new Azure Functions instance.
Next, open your new Functions instance and add a new Timer function. We'll choose CSharp as the language:
Since we're using the Azure Storage SDK in our function, we have to tell Azure Functions about that NuGet package. To do this, we simply upload a project.json that looks like this:
{
    "frameworks": {
        "net46": {
            "dependencies": {
                "WindowsAzure.Storage": "8.1.4"
            }
        }
    }
}
Upon doing this you'll see the Functions console start a NuGet package restore.
When that's done, simply copy & paste the code from your Visual Studio function into the web IDE.
It won't work yet, though, because we haven't set the settings we're using in our GetEnvironmentVariable calls. The bonus here is that we can keep these separate from our Function code by setting them in the function's Application Settings area, a la every other Azure App Service offering.
To do this, click on the top-level Azure Function node in the left tree, then click 'Application Settings'
Next, put in the settings for PackageId, StorageConnectionString, and FlowAlertEndpoint, and click 'Save'.
Next, tune the timer for our function! Click the 'Integrate' option below your function in the left-hand tree and change the timer to match what was in the attribute of our function in Visual Studio:
Pro Tip
If you want to test out your function, set this timer to something low, like every 5 or 10 seconds (*/5 * * * * *), and watch the log output in the editor area (bottom portion of the screen).
The changes you need to make to the out of the box template are minimal:
- Replace the template's usings block with this:
using Microsoft.WindowsAzure.Storage;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
- Add the static readonly variables above the method declaration.
- Take the body of the function from Visual Studio and paste it into the body of this function.
The end result should look something like this:
using Microsoft.WindowsAzure.Storage;
using Newtonsoft.Json.Linq;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
private static readonly string NUGET_PACKAGE_ID = Environment.GetEnvironmentVariable(@"PackageId");
private static readonly Uri NUGET_QUERY_URI = new Uri($@"https://api-v2v3search-0.nuget.org/query?q={NUGET_PACKAGE_ID}&prerelease=true");
private static readonly string STORAGE_CONNECTION_STRING = Environment.GetEnvironmentVariable(@"StorageConnectionString");
private static readonly Uri FLOW_ALERT_ENDPOINT = new Uri(Environment.GetEnvironmentVariable(@"FlowAlertEndpoint"));
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    dynamic nugetResults;
    using (var client = new HttpClient { BaseAddress = NUGET_QUERY_URI })
    {
        var resultsJson = await client.GetStringAsync(string.Empty);
        nugetResults = JObject.Parse(resultsJson);
    }
    // we only care about packages that *exactly* match the id, not others that might be returned from a search
    dynamic targetPackage = ((IEnumerable<dynamic>)nugetResults.data)
        .SingleOrDefault(i => ((string)i.id).Equals(NUGET_PACKAGE_ID, StringComparison.OrdinalIgnoreCase));
    if (targetPackage != null)
    {
        string version = targetPackage.version;
        log.Info($@"Package found. Latest version: {version}");
        var storageClient = CloudStorageAccount.Parse(STORAGE_CONNECTION_STRING).CreateCloudBlobClient();
        var container = storageClient.GetContainerReference(@"functionvars");
        container.CreateIfNotExists();
        // get the blob that contains the last version we saw for this package. We're naming it the same as our package id
        var blob = container.GetBlockBlobReference(NUGET_PACKAGE_ID);
        if (!blob.Exists())
        {   // if we haven't processed this package before, just set our baseline version
            log.Info(@"First time we've seen this package. Storing version.");
            blob.UploadText(version);
        }
        else
        {
            var lastSeenVersion = new StreamReader(blob.OpenRead()).ReadToEnd();
            log.Info($@"Last version we saw was {lastSeenVersion}");
            if (!lastSeenVersion.Equals(version, StringComparison.OrdinalIgnoreCase))
            {   // if the latest version of the pkg on nuget doesn't match the last one we've seen, it's new!
                log.Info($@"Notifying!");
                using (var notificationClient = new HttpClient { BaseAddress = FLOW_ALERT_ENDPOINT })
                {
                    await notificationClient.PostAsJsonAsync(string.Empty, new { message = $@"New version of {NUGET_PACKAGE_ID} has been published to NuGet. Version {version}" });
                }
                // update the last seen version in the associated blob entry of our Storage account
                blob.UploadText(version);
            }
        }
    }
    else
    {
        log.Info($@"No package found with id {NUGET_PACKAGE_ID}");
    }
}
Click 'Save' or 'Save and run', and your function starts executing immediately!
I hope you've found these approaches to serverless computing useful and can see how marrying two such offerings provides a quick & easy way to go from problem to solution and avoid all the infrastructure & setup headaches.
If you, like me, find the idea of getting alerts when specific packages are updated on NuGet useful, you can find the source for this Azure Function on GitHub!