How to set the RetryPolicy on the API App client

Introduction

When you want to consume your API App from your application, our Visual Studio tooling does all the magic (a.k.a. code generation) for you. Because of Swagger, we can generate the code of the API App client with two clicks. All you have to do is point the API App Client SDK to your API App (or the Swagger file) and we’ll generate the code for you. You can find more details about this feature and how to use it in “Consume an API app in Azure App Service from a .NET client”.

Once you have your code, though, you might want to customize a couple of things, such as the HttpClient timeout, or set a RetryPolicy.

Setting the HttpClient properties

The HttpClient is publicly accessible from the code and you can customize it. Increasing the timeout or setting custom headers to be sent to your API App could be as simple as:
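For example, assuming a generated client class named FoodTrucksClient (a hypothetical name; yours will match your own API App), a sketch could look like this:

```csharp
using System;

// "FoodTrucksClient" is a hypothetical generated client name.
var client = new FoodTrucksClient(new Uri("https://myapiapp.azurewebsites.net"));

// Increase the HttpClient timeout to two minutes.
client.HttpClient.Timeout = TimeSpan.FromMinutes(2);

// Send a custom header with every request to the API App.
client.HttpClient.DefaultRequestHeaders.Add("x-my-custom-header", "some-value");
```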

Setting a retry policy to handle transient errors

When consuming an API App, or any web application in general, there are cases where transient errors will happen and a simple retry will make the call go through. Transient errors manifest for various reasons, but the key here is handling them and retrying before failing a call.

The API App client has built-in support for retry logic; all we have to do is provide the “RetryPolicy” we want to use. In our code this will look like:
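A sketch using the RetryPolicy and HttpStatusCodeErrorDetectionStrategy types from the Microsoft.Rest.TransientFaultHandling namespace (the client name is again hypothetical):

```csharp
using System;
using Microsoft.Rest.TransientFaultHandling;

var client = new FoodTrucksClient(new Uri("https://myapiapp.azurewebsites.net"));

// Retry up to 3 times whenever the default strategy detects a
// transient HTTP status code (e.g. 500, 503).
client.SetRetryPolicy(new RetryPolicy<HttpStatusCodeErrorDetectionStrategy>(3));
```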

In this case we’re using the default retry policy, which detects “50x” errors and retries in that case, but there might be other cases where we want to retry.

Creating a custom retry policy

We can create a custom retry policy and pass this to the API App client. First we have to implement the “ITransientErrorDetectionStrategy” interface.
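A sketch of such an implementation (the exception types chosen here are purely illustrative):

```csharp
using System;
using System.Net.Http;
using Microsoft.Rest.TransientFaultHandling;

public class MyTransientErrorDetectionStrategy : ITransientErrorDetectionStrategy
{
    // Return true when we consider the exception transient,
    // so the RetryPolicy knows it should retry the call.
    public bool IsTransient(Exception ex)
    {
        return ex is HttpRequestException || ex is TimeoutException;
    }
}
```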

PS: Keep in mind that this is just an example; the above exceptions might not be representative or thorough.

All we do in this case is check if the exception is of a type we think is a transient error and return true. This will let the RetryPolicy know that it has to retry based on the “RetryStrategy” we have set for this API App client. Using the custom retry policy will look like this:
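Roughly like this, pairing the custom strategy with a retry strategy (the FixedIntervalRetryStrategy values below are arbitrary examples, and the client name is hypothetical):

```csharp
using System;
using Microsoft.Rest.TransientFaultHandling;

var client = new FoodTrucksClient(new Uri("https://myapiapp.azurewebsites.net"));

// Retry up to 3 times, waiting 2 seconds between attempts,
// whenever our custom strategy reports a transient error.
client.SetRetryPolicy(new RetryPolicy(
    new MyTransientErrorDetectionStrategy(),
    new FixedIntervalRetryStrategy(3, TimeSpan.FromSeconds(2))));
```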

In the “IsTransient” method of the “ITransientErrorDetectionStrategy” interface you can do whatever kind of check you want, but keep in mind that it has to be fairly fast, as you’re blocking the continuation until you return a result.

Conclusion

A retry strategy is generally a wise thing to have, but keep in mind that it has to be within reasonable limits and intervals. Overdoing it will hurt the performance of the application without necessarily improving its reliability.

Taking advantage of API Management for API Apps

Introduction

Once you publish your API App using any of the tutorials we provide (.NET, Node.js, Java), you might want to take advantage of the advanced capabilities offered by Azure API Management, capabilities like throttling, an interactive console and documentation, usage analytics, and many more.

Creating an API Management account

If you don’t have an API Management account created, you will have to create one as a first step. To do so, navigate to the old portal at https://manage.windowsazure.com/ and from the left navigation menu find “API Management”:

There, click “New” in the lower left corner and then follow the instructions to create a new account. It will take about 15 to 30 minutes for the account provisioning to complete. Once this is done, you should have something like this:

api-overview-account-created

Managing and adding the API definition

Select the account and click “Manage”.

click-manage

This should open the administration portal of your API Management account.

There, the first thing you see is the Dashboard, which by default will only have the Echo sample API. We’re interested in “Import API”, which we will use to import the API definition of our API App.

import-api

To achieve that we’re going to take advantage of the Swagger file we generate for our API Apps. The easiest and fastest way for a Public API (Public Anonymous access level) is to point to the Swagger endpoint URL and let API Management do the import from there. In my case I’m using the Node.js Food trucks sample.

If your API is Public Authenticated, you can use the “Download Swagger” button from the “API Definition” blade on the Preview portal and provide a file instead of a URL to the API Management.

swagger-endpoint

If you need more information on how to find that or what Swagger is, make sure to check the URLs to the articles I provided at the beginning of this post.

Once this is complete, the API definition should be imported and available in API Management. To confirm, click on the “Operations” tab on the page you were redirected to after importing the API and check that the operations are there.

operations-overview

Calling the API through API Management

Now that the API Definition import is complete, we can use Postman or Fiddler and try accessing the API and check the results. I’m going to use Postman in my case.

To call the API you need to get the URL of your API Management account plus the suffix you chose during the import of the API. In my case this URL is https://ninja-test.azure-api.net/foodtruckapi.

Before you’re able to call the API you need to complete two steps. The first one is to add your API to one of the “Products” that are created by default by API Management. Products are effectively SKUs provided by you to developers trying to access your API. To add your API to the “Unlimited” product, click on the “Products” tab and then “Add API to Products”.

add-product-api

Select the “Unlimited” product and click “Save”.

The second step is to obtain your API key, which is required by API Management. This is how API Management recognizes, among other things, the different levels of service and the users trying to access the API endpoints exposed by API Management. To obtain your key, go to “Users” in the left navigation.

users-left-nav

Find and click your user and then click “Show” on the primary key which is located on the “Unlimited” subscription section. Copy the value revealed as you’ll need it to call the API.

show-api-key

Now that we’ve completed all these steps, let’s call the API. As the API needs a subscription key to be accessed, you can either pass it as an HTTP header called “Ocp-Apim-Subscription-Key” or as a query string parameter on the URL, “?subscription-key={key}”, replacing “{key}” with your subscription key.
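For example, with curl (the /foodtrucks operation path is only an illustration; use one of the operations you confirmed earlier, and replace <your-key> with the key you copied):

```shell
# Option 1: pass the key as an HTTP header
curl -H "Ocp-Apim-Subscription-Key: <your-key>" \
  "https://ninja-test.azure-api.net/foodtruckapi/foodtrucks"

# Option 2: pass the key as a query string parameter
curl "https://ninja-test.azure-api.net/foodtruckapi/foodtrucks?subscription-key=<your-key>"
```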

As you can see, you successfully called the API.

To find the URLs that the methods are listening on, you can visit the Developer portal of API Management and check them out there. You can also get code samples and an interactive console to try calls to your API’s endpoints right from within the browser. The Developer portal is at the URL of your API Management account; in my case that was https://ninja-test.portal.azure-api.net/

In case your API App is set to “Public Authenticated”, the request will fail, as it doesn’t have the necessary token, called ZUMO, to be successfully called. Keep in mind that it’s not necessary to have authentication on the API App, as you can enable authentication on API Management and let it handle all the details.

In case you still need the API App calls to be authenticated, all you have to do is pass the token in the ZUMO header (x-zumo-token) as part of the request to API Management and it will forward the token to the API App, successfully authenticating the call. To learn more about how to obtain the token and enable authentication, you can read “Protect an API app: Add Azure Active Directory or social provider authentication”.

I hope you found the post useful and as always, I’m open to suggestions and questions.

Why am I getting “A route named ‘swagger_docs’ is already in the route collection” after I publish my API App?

Question:

After publishing my API App I’m getting the yellow error screen of ASP.NET. The error message says “A route named ‘swagger_docs’ is already in the route collection”.

How can I fix this?

Answer:

This is not related to API Apps per se, but rather to Web API. What triggers the error is pretty simple:

  1. You publish the API App, which is based on Web API.
  2. You discard your project and start working on a new API App based on Web API.
  3. You want to publish the new API App instead of the old API App you created at step 1.
  4. You select the API App during “Publish…” and you get the publishing profile of the existing API App you deployed at step 1.
  5. Using Web Deploy and that publishing profile, you deploy the new API App on top of the old one.

That will trigger the issue I explained before. It happens because two routes are registered by Swashbuckle when the app starts: one from the old deployment and one from the new one. That’s because the old files are still present at the destination.

To solve this, during Web Deploy, click on the Settings tab and then expand the “File Publish Options”. There is a checkbox there called “Remove additional files from destination”. This will fix the issue, as it will leave only the files you deploy at the destination and not the old ones as well.

web-deploy

Hope it helps,

Panos

Want to know what API Apps are? Check here.

Part 2: Running non-.NET API Apps on Azure App Service

UPDATE: You can find a similar Node.js + Express sample and article on the Azure App Service documentation center here. There is also a Java sample and article available here.

In my last post I talked in a bit more detail about what API Apps are, what you can build with them, and how they work, and provided some answers to frequently asked questions. Another very common question is: “How do I run an API App that’s not .NET?”. If you’ve read my last post, I explained how Azure App Service discovers and understands the API definition of your API and lights up functionality in the portal by reading your Swagger 2.0 endpoint. The structure we look for on your app is simple:

  • An “apiapp.json” file with some metadata about your API
  • A “Metadata” folder containing a “deploymentTemplates” folder
  • In the deploymentTemplates folder, a file called apiappconfig.azureresource.json. The “Metadata” folder is temporarily not used by the platform, but it will be in the future. When the time comes, I will explain in more detail what it does.

If those files and folders exist in the root, then we can properly parse your API definition and understand the capabilities of your API. The minimum we need from the apiapp.json file is:

apiappjson
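A minimal file along those lines might look like the following (the id, namespace, gateway, and author values here are placeholders; the essential piece is the endpoints.apiDefinition property):

```json
{
    "$schema": "http://json-schema.org/schemas/2014-11-01/apiapp.json#",
    "id": "FoodTrucks",
    "namespace": "microsoft.com",
    "gateway": "2015-01-14",
    "version": "1.0.0",
    "title": "FoodTrucks",
    "summary": "A sample food trucks API App",
    "author": "Panos",
    "endpoints": {
        "apiDefinition": "/api-docs",
        "status": null
    }
}
```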

 

Take note of the endpoints object and the apiDefinition property. This property points to a Swagger 2.0 endpoint from which we extract the definition of your API, and it is relative to your API App’s path. If you don’t have a valid Swagger 2.0 endpoint, we can’t parse your API definition and your API App will not function properly in the portal and within Logic Apps.

How do I make other non-.NET API Apps run?

As I explained before, the underlying infrastructure is basically a Web App. That means you can run whatever technology is supported as a Web App, like Node.js, PHP, Python, Java, and more. In my case I’m going to use Node.js. To make App Service understand your app is an API App, you need the file(s) I explained above (at this time, only apiapp.json).

Building a Node.js API App

To run the Node.js app I will follow the same path as if I were deploying it as a Web App, and I will add the files required by App Service. Your structure should look something like the one below:

nodeapiapproot

 

I haven’t created a “Metadata” folder because, as I mentioned, it is currently not used by App Service. This is a simple Node.js app using the Express framework. I chose Express for this example for a couple of reasons:

  • It’s easy to use and build something that demonstrates the technology
  • Every other module I found supported Swagger 1.2 rather than 2.0. App Service requires 2.0, otherwise it’s not going to work
  • The module I used is swaggerize-express by PayPal

I have an apiapp.json file, which is my App Service API App metadata file; the api.json, which is my Swagger 2.0 spec file; a server.js, which is what App Service will run to start my Node app; and lastly my package.json, which contains information about the Node app I’m deploying.

server.js

What the server.js code does is serve the api.json file, which contains the Swagger 2.0 spec. In there, the REST API URI is of the /v1/petstore format. We also dictate that /api-docs is where the Swagger 2.0 metadata can be found, and we provide the handlers directory for Express routes in case someone tries to access the API. Lastly, there is a simple Hello World! message in case you try to access the root URI instead of the REST API.
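As a sketch, such a server.js could look like the following (based on the swaggerize-express README; treat it as illustrative rather than a drop-in file):

```javascript
var express = require('express');
var swaggerize = require('swaggerize-express');
var path = require('path');

var app = express();

app.use(swaggerize({
    api: path.resolve('./api.json'),      // the Swagger 2.0 spec file
    docspath: '/api-docs',                // where the Swagger 2.0 metadata is served
    handlers: path.resolve('./handlers')  // directory with the Express route handlers
}));

// Simple message when someone hits the root URI instead of the REST API.
app.get('/', function (req, res) {
    res.send('Hello World!');
});

// App Service provides the port through an environment variable.
app.listen(process.env.PORT || 8000);
```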

apiapp.json

In the apiapp.json file, along with the metadata, we provide the important apiDefinition property, which points to where our Swagger 2.0 metadata is.

Deploying the API App

The tutorial link I provided before explains how you can deploy using Git. In our case you have to create an empty API App, not a Web App. Choose an appropriate name and create the API App. To find where to enable Continuous Deployment, make sure you go to the API App host. You can find that in the Settings of your API App. From there, follow the steps as they are explained in the tutorial to deploy the Node API App.

Api-app-host

Accessing the API App

Once the Git deployment finishes, if you already had another API App running in the same deployment, you have to restart your gateway so it will pick up the correct metadata files from your API App; if you didn’t, you don’t have to restart the gateway. Find the gateway in the API App Settings and click on it, then click “Restart” on the top toolbar. You can verify that everything works OK by clicking on “API Definition”:

api-definition

You should see something like this on the blade that will open:

api-definition-swagger

To access your API App, you can use the URL from your API App settings, which is in the form of microsoft-apiappGUID.azurewebsites.net. If your tier allows, you can enable custom domains (among other capabilities) and make this look prettier.

We’re done. Where is the code?

That’s it! Your Node API App is now running on App Service.

You can find the Node API App on a Github repo and you can grab it from there.

What’s next?

This is a series of blog posts explaining API Apps and Logic Apps. My next blog post will explain how you can integrate an ASP.NET Web API API App (that’s a mouthful, isn’t it?) and a Node.js API App using a Logic App.

In the meantime download the bits, start playing with the platform for free and stay tuned!

You can find Part 1 here.

Feel free to reach out to me if you have any other questions.

Panos

Part 1: Azure App Service API Apps in more detail and some FAQ

What is Azure App Service?

In case you missed it, on the 24th we announced Azure App Service. In simple words, it’s a common platform where you can develop Web, Mobile, Logic, and API Apps. It’s based on proven technologies, more specifically Websites, which is now called Web Apps. The fact that there is a common infrastructure underneath gives us a lot of benefits, like being able to scale apps together or independently, share resources, seamlessly integrate the apps with each other, and many more.

 

Building API Apps

One of the new types of Apps you can build is API Apps. You might wonder: couldn’t I just have a Web App and put an API there? The answer is yes, but an API App has more to it than just hosting. When you choose to build an API App, some of the advantages are:

  1. We can auto-generate the code needed for that API to be consumed by your apps
  2. You can use your API Apps in Logic Apps
  3. The API App can be listed in the galleries (public or private, including the marketplace) (not currently available in Preview)
  4. API Apps can be auto-updated (not currently available in Preview)

The API App can be of any technology that a Web App can be, meaning .NET, Node.js, PHP, Python, Java, etc. When you build and deploy an API App, what happens behind the scenes is that we create a Web App for you and we put some special metadata around it to know that it is an API App. The API App also needs a special file in the root, called apiapp.json, which contains the API App related metadata like author, summary, id, and many more that are not currently used in the preview but will light up in the future. You can find more information on this page. Towards the middle of the page there is a detailed description of all the properties you can add to the apiapp.json file.

apiappjson

To be able to start building the apps, make sure you get the latest Azure SDK. Currently the Azure App Service SDK only works with Visual Studio 2013. Once you download and install the SDK, you have two options: migrate an existing Web API or create a new one.

How does it work?

The power of API Apps comes from how easy they are to consume and discover. Our long-term vision is to enable anything from regular REST APIs to even OData APIs to run on the platform and benefit from it. To do that, we need a way to describe what those APIs are capable of and an extensible model that expresses their capabilities. That happens by using Swagger 2.0. For the API App to light up properly, you need to have a Swagger 2.0 metadata endpoint which our platform will reach out to in order to discover the capabilities of your API, effectively finding the API definition. The Swagger 2.0 metadata is only required for that reason. You, your customers, or whoever wants to consume the API doesn’t need any special tooling, so there is no coupling here. In case you use our Azure API App Client, you can point our tooling to your API App and we will generate the code for you. The API App client generation works with any valid Swagger 2.0 metadata file, so even if the API App is not running, as long as you have the Swagger 2.0 metadata file, we can generate the code for you. The code is then yours, and you can extend it as much as you like, since the generated classes are partial.

Azure App Service API App architecture overview

Once you create an API App you might want to deploy it to your subscription. When you are about to deploy the app, you’ll be asked a couple of things:

  1. App Service Plan: This describes what kind of capabilities the underlying infrastructure will have (e.g. pricing tier, how many VMs are going to host your app, etc.)
  2. Resource Group: This is a separation unit for your resources. It helps when you want to separate the resources available to different API Apps or even isolate the discovery of API Apps altogether. Logic Apps within a resource group can only discover other API Apps within that resource group. It’s also very easy to manage resources like that, as you can see all the dependencies and such.
  3. Access Level: Who is able to access your API? Options can be Public Anonymous, Internal and Public Authenticated.
  4. Region: The region you want your API App to be deployed.

APIAppPublish

After the deployment finishes, you will notice that within your resource group there is another App called “Gateway”. Conceptually, all the API Apps sit behind the gateway. The Gateway handles things like API Management, Authentication, and more in the future. When you try to access an API App there is communication with the gateway which, in the case of an authenticated call, also flows the token to the API App being accessed. The Gateway also contains metadata of the API Apps like their definition, name, version of the package, and more.

Implementing a microservice architecture

As you can imagine, you can create multiple, independent API Apps that communicate with each other, have their own persistence layer, share authentication, and much more. You can actually implement a microservice architecture like that. There is no precise definition right now, but the idea is having lightweight services that are structured around business capabilities, have automated deployments and intelligence in the endpoints, and are decentralized in terms of languages and data (per Martin Fowler). The protocol of choice can be HTTP, but that’s not absolutely necessary. In our case, though, that’s how API Apps work.

FAQ

  • Am I tightly coupled to the service if I generate code for an API App? Is this another “Add Service Reference…” thing?
    • No, you’re not, and no, it’s not. The Swagger 2.0 metadata is used to describe the capabilities of the API. When you generate the code using the Azure API App Client, we’re not adding any reference or anything; we’re just generating the code you would write anyway to access the API. You don’t have to generate the code if you don’t want to. You can consume the API as any other API out there, by writing manual HttpClient code and JSON serialization.
  • How can I update my API App?
    • Before the galleries and the packaging are released, all you have to do is a Publish (Deploy), as you would do for a Web App. That’s it. The experience will stay as easy once the galleries are out as well.
  • If my API is accessed rarely but I still need quick responses, how do I do that?
    • If you remember, the underlying container for the API App is a Web App. The benefit of using a common infrastructure is that you get a common set of capabilities. That means you can go to the Settings of your API App’s host and enable the Always On feature. Open the blade of the API App on the preview portal and look for the API App Host setting. Click it and that should get you to the API App Host. There, click on Settings, then Application Settings, find the Always On option, and switch it to On on the blade that opens. Click Save and you’re done.

What’s next?

This is a series of blog posts explaining API Apps and Logic Apps. My next blog post will be explaining how you can run a Node.js App as an API App.

The second part is available here.

In the meantime download the bits, start playing with the platform for free and stay tuned!

Feel free to reach out to me if you have any other questions.

Panos

My blog is not dead, I promise.

It’s not dead.

I promise.

I haven’t written a single thing for a loooooooooooong time. It was for a good reason. So what happened? Well:

  1. I joined Microsoft. Which means I had to drop my MVP status.
  2. I’m a Technical Evangelist, writing code, spreading knowledge.
  3. I moved to Vancouver, BC to be closer to my team.
  4. My beautiful wife followed me.

In between those steps, there were a lot of talks at TechEds, TechDays, etc. I’ve been all around Europe and a bit of the US.

I told you, my blog is not dead.

More is coming soon.

Cheers,
Panos

VMDepot is now integrated into the Windows Azure Portal

A nice change I noticed today is that VMDepot images are now visible in the Windows Azure Portal.

If you go to Virtual Machines

Vm-option

You’ll see an option that says “Browse VMDepot”

Browse-VM-Depot

If you click it, you get the list of images already in VM Depot:

VMDepot-List

 

You can select one and create a virtual machine based on that image, just like that! :)

The coolest part of all is that you can create your own images and publish them to VM Depot, and if they get accepted, they become visible in the portal as well.

Small addition, but a lot of value comes out of it!

Cheers,

PK

Windows Azure and VM Depot from MS Open Tech

VM Depot allows users to build, deploy and share their favorite Linux configuration, create custom open source stacks, work with others and build new architectures for the cloud that leverage the openness and flexibility of the Azure platform.

How it works

All images shared via this catalog are open for interaction, where other users of VM Depot can leave comments, ratings, and even remix images to developers’ liking and share the results with other members of the community. Currently, on catalogs similar to VM Depot (such as Amazon Web Services), users must pay to publish all versions of their images. With VM Depot, publishing an image is free to encourage users to take full advantage of shared insights and experience, as well as encourage collaboration towards the best possible version of an image. The shared community environment also means users can access this catalog to use basic image frameworks created by others so they can quickly continue building without starting from scratch.

VM Depot was made possible with the support of a number of partners who have contributed images and packages for this preview launch including Alt Linux, Basho, Bitnami and Hupstream.

FAQ

Q: Why has Microsoft Open Technologies Developed VM Depot?

A: VM Depot has been developed to help developers and IT pros take advantage of the open capabilities of the Windows Azure platform. VM Depot is a community-driven catalog of open source virtual machine images for Windows Azure. Using VM Depot, the community can build, deploy and share their favorite Linux configuration, create custom open source stacks, and work with others to build new architectures for the cloud that leverage the openness and flexibility of the Windows Azure platform.

Some links:

Port 25 blog

Interoperability@Microsoft blog

More info:

VM Depot delivers more options for users bringing their custom Linux images to Windows Azure Virtual Machines

http://blogs.msdn.com/b/windowsazure/archive/2013/01/09/vmdepot-delivers-more-options-for-users-bringing-their-custom-linux-images-to-windows-azure-virtual-machines.aspx

VM Depot from MS Open Tech

http://blogs.technet.com/b/port25/archive/2013/01/09/for-your-oss-image-building-and-sharing-pleasure-meet-vm-depot-from-ms-open-tech.aspx


Speaking at Windows AzureConf

What is Windows AzureConf

On November 14, 2012, Microsoft will be hosting Windows AzureConf, a free event for the Windows Azure community. This event will feature a keynote presentation by Scott Guthrie, along with numerous sessions executed by Windows Azure community members. Streamed live for an online audience on Channel 9, the event will allow you to see how developers just like you are using Windows Azure to develop applications on the best cloud platform in the industry. Community members from all over the world will join Scott in the Channel 9 studios to present their own inventions and experiences. Whether you’re just learning Windows Azure or you’ve already achieved success on the platform, you won’t want to miss this special event. (Source: http://www.windowsazureconf.net)

Why register?

First of all, it’s free! But that’s not the real value of it. The real value is the opportunity to watch and learn from real-world examples of how Windows Azure was used to build real applications. The presentations/sessions are going to be delivered by community members, the same people who worked on those applications.

My session

I will be at the Channel9 studios in Redmond, as my session (Windows Azure + Twilio == A Happy Tale to Tell) was selected to be one of them, and I would love to see you online and answer your questions.

Register now at http://www.windowsazureconf.net