Vusi Dhlamini

Getting Started with Azure Functions Part 1

Azure Functions is an event-driven, serverless compute platform that helps solve complex orchestration problems. It can be used to develop efficient event-driven applications, and its serverless architecture makes it a cost-effective way to deploy small pieces of code without having to worry about hardware management or maintenance.


Figure 1: Azure function integration view


Figure 1 is an overview of the Azure Functions integration architecture. The key components are the trigger, the inputs, the function itself, and the outputs. Azure Functions can be triggered in multiple ways, depending on the type of event that starts them; a minimal sketch of one such function follows the list below. The list of potential triggers includes:

  • HTTP Triggered Functions

  • Timer Triggered Functions

  • Service Bus Triggered Functions

  • Durable Functions

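To make the components in Figure 1 concrete, here is a minimal sketch of an HTTP-triggered function using the Python programming model for Azure Functions. The route and function name are placeholders chosen for illustration: the HTTP request acts as the trigger and input, the decorated function is the compute, and the HTTP response is the output.

```python
# Minimal sketch of an HTTP-triggered function (Python v2 programming model).
# The route "hello" and the function name are illustrative placeholders.
import azure.functions as func

app = func.FunctionApp()

@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)  # trigger + input
def hello(req: func.HttpRequest) -> func.HttpResponse:          # the function
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")                 # output
```
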

This is an introductory post on Azure Functions for technical professionals. The next section covers detailed steps for creating an Azure function app and then linking it to a storage account.


Azure Function Creation Steps:


Step 1:

Navigate to the Azure Portal search box, type in "functions", and click on "Function App"; you will land on a screen like the one above. Click on "Create Function App" to begin the process.


Step 2:

After clicking "Create Function App" you will land on a screen like the one above. Here you must populate the following fields: subscription, resource group, function app name, deployment method (code or container image), and the runtime stack of your choice.


Step 3:


You also have to choose your preferred operating system and hosting option; in my case, I chose the serverless option.


Step 4:

The function app I am creating will use a storage container as its trigger source. It is standard practice to link your function app to a storage account. I will also use the same storage account to host the container that will be monitored for uploads. The aim is to create a function app that gets triggered whenever a document is uploaded to a storage container of our choice.


Step 5:


On this tab you can choose whether to allow public access to your function app or keep it strictly private. You also have the option of enabling network injection. Injecting a function app into a virtual network unlocks advanced App Service networking and security features, giving you greater control over your network security configuration.


Step 6:

After deploying the app, navigate to the Functions tab in the left pane.



Step 7:




Use the "Create" button to begin the function creation process. You will need to choose your development environment: the portal, VS Code, or another code editor. You then move on to choosing the trigger; for the purpose of this exercise I chose the "Azure Blob Storage trigger", as the plan is to integrate storage with this function app. Continue by choosing the storage account that will host your data; the sketch below shows roughly what the resulting function looks like in code.
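As a rough idea of what this choice produces, here is a minimal sketch of a blob-triggered function in the Python programming model. The container name "uploads" is an assumption for this walkthrough, and "AzureWebJobsStorage" refers to the default connection setting that points at the storage account linked in Step 4.

```python
# Minimal sketch of a Blob Storage-triggered function (Python v2 programming model).
# The container name "uploads" is an assumption; adjust the path to your container.
import logging
import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="myblob",
                  path="uploads/{name}",             # container/blob pattern to watch
                  connection="AzureWebJobsStorage")  # app setting holding the storage connection
def on_upload(myblob: func.InputStream):
    # Runs each time a document lands in the monitored container.
    logging.info("Processing blob: %s (%s bytes)", myblob.name, myblob.length)
```
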


Step 8:

Go ahead and create a container inside your storage account, then upload any document into it. This step will help us test our newly built application.
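If you prefer to script this step instead of using the portal, a minimal sketch with the azure-storage-blob package looks like the following. The connection string, container name, and file name are placeholders to replace with your own values.

```python
# Sketch: create the container and upload a test document with azure-storage-blob.
from azure.storage.blob import BlobServiceClient

conn_str = "<your-storage-account-connection-string>"
service = BlobServiceClient.from_connection_string(conn_str)

container = service.create_container("uploads")           # create the container once
with open("sample.pdf", "rb") as data:
    container.upload_blob(name="sample.pdf", data=data)   # this upload fires the trigger
```
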


Step 9:

Now, test your function app by mapping your input to the container you created earlier and specifying your document of interest.



Step 10:

You can now use the function trigger to test whether the function app receives requests. After executing the test, I received the response 202 Accepted. A 202 Accepted response means the function app has received the request and queued it for processing, but processing has not yet completed.
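For reference, roughly the same test can be run outside the portal against the admin endpoint that Azure Functions exposes for manually running non-HTTP-triggered functions. Treat the sketch below as an assumption-heavy illustration: the app name, function name, master key, and blob path are placeholders, and the endpoint details should be verified against the current documentation.

```python
# Sketch: manually invoke the blob-triggered function and check for 202 Accepted.
# URL, key, and blob path are placeholders; verify the admin endpoint in the docs.
import requests

url = "https://<your-app-name>.azurewebsites.net/admin/functions/<your-function-name>"
headers = {"x-functions-key": "<your-master-key>", "Content-Type": "application/json"}
body = {"input": "uploads/sample.pdf"}  # the blob the trigger should process

resp = requests.post(url, json=body, headers=headers)
print(resp.status_code)  # 202 means accepted and queued; processing continues asynchronously
```
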


We will end part 1 here. In part 2 we will continue with the blob storage use case for Azure Functions and show a complete end-to-end process.



