Storage Queue Trigger Samples
Triggering via Azure Storage Queue
Storage Queues can be used as triggers for your functions.
Using the QueueTrigger attribute, you supply the queue name along with the name of the setting that holds the connection information for the storage account that should be used.
[FunctionName("FunctionsQueueTrigger")]
public static void Run([QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]string myQueueItem,
TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
}
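To see the trigger fire, you need a message on 101functionsqueue. Below is a minimal sketch of enqueuing a test message, assuming the WindowsAzure.Storage client library and that the AzureWebJobsStorage setting holds a valid storage connection string; the helper name is illustrative and not part of the sample.

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class EnqueueTestMessage
{
    // Illustrative helper (not part of the sample): pushes one message onto the queue
    // so the FunctionsQueueTrigger sample above has something to process.
    public static async Task SendAsync()
    {
        var account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
        var queue = account.CreateCloudQueueClient()
                           .GetQueueReference("101functionsqueue");
        await queue.CreateIfNotExistsAsync();   // create the queue if it does not exist yet
        await queue.AddMessageAsync(new CloudQueueMessage("Hello from the 101 samples"));
    }
}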
Takeaways
- Use the QueueTrigger attribute to trigger your function via an Azure Storage Queue. QueueTrigger requires the name of the queue and the name of the setting that holds the connection information for your storage account.
Read more
- Create a function trigger by Azure Storage queue
- Azure Functions triggers and bindings concepts
- Introduction to Queues
Azure Storage Queue Trigger using a POCO
If your queue message is made up of a JSON payload, the runtime serializes it into a plain old CLR object (POCO) for you.
[FunctionName("SimpleQueue")]
public static void Run([QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]Customer queuedCustomer, TraceWriter log)
{
log.Info("101 Azure Functions Demo - Queue Trigger w/ POCO");
log.Info($"Customer Name: {queuedCustomer.FirstName}");
}
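The Customer type is not defined in the sample; here is a minimal sketch of what such a POCO might look like (only FirstName is read by the function, the other property is illustrative).

// Hypothetical POCO matching a JSON payload such as {"FirstName":"John","LastName":"Doe"}.
public class Customer
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}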
Takeaways
- JSON payloads can be deserialized into POCO objects.
- JSON .NET types such as JObject and JArray can also be used as the trigger parameter (see the sketch below).
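For reference, a hedged sketch of binding the same queue message to a JObject instead of a POCO; the function name and the FirstName property are illustrative.

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json.Linq;

public static class QueueTriggerJObject
{
    [FunctionName("QueueTriggerJObject")]
    public static void Run(
        [QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]JObject queuedCustomer,
        TraceWriter log)
    {
        // Read a property without defining a strongly typed class.
        var firstName = (string)queuedCustomer["FirstName"];
        log.Info($"Customer Name: {firstName}");
    }
}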
Read more
- Create a function trigger by Azure Storage queue
- Azure Functions triggers and bindings concepts
- Introduction to Queues
- JSON .NET Documentation
Retrieving queue metadata from an Azure Storage Queue Trigger
With Storage Queue triggers, the CloudQueueMessage class can be used to retrieve metadata about the queue message.
Some common properties include:
- DequeueCount - The number of times this message has been dequeued.
- ExpirationTime - The time that the message expires.
- Id - The ID of the queue message.
- InsertionTime - The time that the message was added to the queue.
[FunctionName("QueueTriggerMetadata")]
public static void Run([QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]CloudQueueMessage myQueueItem, TraceWriter log)
{
log.Info("101 Azure Function Demo - Retrieving Queue metadata");
log.Info($"Queue ID: {myQueueItem.Id}");
log.Info($"Queue Insertion Time: {myQueueItem.InsertionTime}");
log.Info($"Queue Expiration Time: {myQueueItem.ExpirationTime}");
log.Info($"Queue Payload: {myQueueItem.AsString}");
}
Takeaways
- Using CloudQueueMessage as a queue trigger parameter enables metadata retrieval.
- The AsBytes and AsString properties can be used to return the message payload in the respective format (see the sketch below).
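A minimal sketch showing both payload accessors side by side; the function name is illustrative.

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage.Queue;

public static class QueueTriggerPayload
{
    [FunctionName("QueueTriggerPayload")]
    public static void Run(
        [QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]CloudQueueMessage myQueueItem,
        TraceWriter log)
    {
        byte[] bytes = myQueueItem.AsBytes;   // raw payload bytes
        string text = myQueueItem.AsString;   // payload decoded as a string
        log.Info($"Payload is {bytes.Length} bytes: {text}");
    }
}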
Read more
- Azure Functions triggers and bindings concepts
- Azure Functions Queue Storage bindings
- Introduction to Queues
Poison queue messages with Azure Storage Queue Triggers
If a function triggered by a QueueTrigger fails, the Azure Functions runtime automatically retries that function five times for that specific queue message. If the message continues to fail, the bad message is moved to a "poison" queue whose name is the original queue name plus "-poison". For example, a storage queue named myqueue results in a poison queue named myqueue-poison.
[FunctionName("PoisionQueues")]
public static void Run([QueueTrigger("101functionsqueue", Connection = "AzureWebJobsStorage")]CloudQueueMessage myQueueItem, TraceWriter log)
{
log.Info("101 Azure Function Demo - Poision Queue Messages");
log.Info($"Queue ID: {myQueueItem.Id}");
log.Info($"Queue Dequeue Count: {myQueueItem.DequeueCount}");
log.Info($"Queue Payload: {myQueueItem.AsString}");
throw new Exception("Intentional failure");
}
After letting the code above run, there should be a queue named 101functionsqueue-poison with the message that failed to process.
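Messages that land in the poison queue can themselves be processed by a second function, for example to log or alert on failures. A hedged sketch, assuming the poison queue produced by the sample above; the function name is illustrative.

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class PoisonQueueMonitor
{
    // Illustrative function (not part of the original sample): triggers on the poison queue.
    [FunctionName("PoisonQueueMonitor")]
    public static void Run(
        [QueueTrigger("101functionsqueue-poison", Connection = "AzureWebJobsStorage")]string poisonMessage,
        TraceWriter log)
    {
        // Log the payload that could not be processed so it can be inspected or replayed.
        log.Info($"Poison queue message received: {poisonMessage}");
    }
}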
Takeaways
- Using CloudQueueMessage, metadata such as the message dequeue count can be retrieved.
- Five attempts are made to process a queue message before moving it to a poison queue.
Read more
- Azure Functions triggers and bindings concepts
- Azure Functions Queue Storage bindings
- Introduction to Queues
Azure Storage Queue output binding
Azure Storage Queues can be used as output bindings for your Azure Functions app. Using the Queue attribute, you can mark a parameter as an output that pushes messages onto the queue. Some supported parameter types include:
- out T, where T is a .NET POCO - the object is serialized into JSON before being added to the message payload (see the sketch after the sample below)
- out string
- out byte[]
[FunctionName("BasicQueueOutput")]
public static void Run([TimerTrigger("*/30 * * * * *")]TimerInfo myTimer,
TraceWriter log,
[Queue("101functionsqueue",Connection = "AzureWebJobsStorage")] out string queueMessage)
{
log.Info("101 Azure Function Demo - Storage Queue output");
queueMessage = DateTime.UtcNow.ToString();
}
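For the POCO case, here is a hedged variant of the same timer-driven sample that writes a serialized object instead of a string (Customer is the illustrative POCO sketched earlier; the function name is illustrative).

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class PocoQueueOutput
{
    [FunctionName("PocoQueueOutput")]
    public static void Run(
        [TimerTrigger("*/30 * * * * *")]TimerInfo myTimer,
        TraceWriter log,
        [Queue("101functionsqueue", Connection = "AzureWebJobsStorage")] out Customer queueMessage)
    {
        log.Info("101 Azure Function Demo - Storage Queue POCO output");

        // The runtime serializes the POCO to JSON before adding it to the queue.
        queueMessage = new Customer { FirstName = "John" };
    }
}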
Takeaways
- Use the Queue attribute.
- The Queue attribute requires the name of the queue and the name of the setting that holds the connection information for your storage account.
- If POCOs, strings, or byte arrays are parameters of your function, they must be marked with the out keyword.
Read more
- Create a function trigger by Azure Storage queue
- Azure Functions triggers and bindings concepts
- Introduction to Queues
Using ICollector with Azure Storage Queue bindings
ICollector and IAsyncCollector can be used as parameter types for Storage Queue output bindings. Using these interfaces allows you to add multiple messages to the target Storage Queue from a single function execution (an IAsyncCollector variant is sketched after the sample below).
[FunctionName("CollectorQueueOutput")]
public static void Run([TimerTrigger("*/30 * * * * *")]TimerInfo myTimer,
TraceWriter log,
[Queue("101functionsqueue", Connection = "AzureWebJobsStorage")] ICollector<Customer> queueCollector)
{
log.Info("101 Azure Function Demo - Storage Queue output");
queueCollector.Add(new Customer { FirstName = "John" });
}
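IAsyncCollector works the same way but exposes AddAsync for use in async functions. A minimal sketch reusing the illustrative Customer POCO; the function name is illustrative.

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class AsyncCollectorQueueOutput
{
    [FunctionName("AsyncCollectorQueueOutput")]
    public static async Task Run(
        [TimerTrigger("*/30 * * * * *")]TimerInfo myTimer,
        TraceWriter log,
        [Queue("101functionsqueue", Connection = "AzureWebJobsStorage")] IAsyncCollector<Customer> queueCollector)
    {
        log.Info("101 Azure Function Demo - Storage Queue output via IAsyncCollector");

        // Each AddAsync call adds another message; all of them are written to the queue.
        await queueCollector.AddAsync(new Customer { FirstName = "John" });
        await queueCollector.AddAsync(new Customer { FirstName = "Jane" });
    }
}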
Takeaways
- Use ICollector or IAsyncCollector as the output binding parameter type to add multiple messages to an Azure Storage Queue from a single function execution.
- The queue name and connection values are still required on the Queue attribute.