Why am I getting “A route named ‘swagger_docs’ is already in the route collection” after I publish my API App?

Question:

After publishing my API App I'm getting the ASP.NET yellow error screen. The error message says "A route named 'swagger_docs' is already in the route collection".

How can I fix this?

Answer:

This is not related to API Apps per se, but rather to Web API. What triggers the error is pretty simple:

  1. You publish an API App which is based on Web API.
  2. You discard that project and start working on a new API App, also based on Web API.
  3. You want to publish the new API App instead of the old one you created at step 1.
  4. During "Publish…" you select the existing API App, so you get the publishing profile of the API App you deployed at step 1.
  5. Using Web Deploy and that publishing profile, you deploy the new API App on top of the old one.

That is what triggers the error: when the app starts, Swashbuckle registers the "swagger_docs" route twice, once from the old app and once from the new one, because the old files are still present at the destination.

To solve this, during Web Deploy, click on the Settings tab and then expand the "File Publish Options". There is a checkbox there called "Remove additional files at destination". Checking it fixes the issue, as it leaves only the files you deploy at the destination and removes the old ones.

[Screenshot: the Web Deploy publish settings with "Remove additional files at destination" checked]

Hope it helps,

Panos

Want to know what API Apps are? Check here.

Part 2: Running non-.NET API Apps on Azure App Service

In my last post I talked in a bit more detail about what API Apps are, what you can build with them and how they work, and provided some answers to frequently asked questions. Another very common question is: "How do I run an API App that's not .NET?". If you've read my last post, I explained how Azure App Service discovers and understands the API definition of your API and lights up functionality in the portal by reading your Swagger 2.0 endpoint. The structure we're looking for on your app is simple:

  • An “apiapp.json” file with some metadata about your API
  • A “Metadata” folder containing a “deploymentTemplates” folder
  • In the deploymentTemplates folder, a file called apiappconfig.azureresource.json. The "Metadata" folder is temporarily not used by the platform, but it will be in the future. When the time comes, I will explain in more detail what it does.

If those files and folders exist in the root, then we can properly parse your API definition and understand the capabilities of your API. The minimum we need from the apiapp.json file is:

[Screenshot: a minimal apiapp.json]
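As a rough sketch, a minimal apiapp.json could look like the one below. The id, namespace and gateway values are placeholders and the apiDefinition path assumes a Swashbuckle-style Swagger endpoint, so treat the exact values as illustrative:

    {
        "id": "ContactsList",
        "namespace": "microsoft.com",
        "gateway": "2015-01-14",
        "version": "1.0.0",
        "endpoints": {
            "apiDefinition": "/swagger/docs/v1"
        }
    }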

 

Make note of the endpoints object and the apiDefinition property. This property points to a Swagger 2.0 endpoint from where we extract the definition of your API. The property is relative to your API app’s path. If you don’t have a valid Swagger 2.0 endpoint, we can’t parse your API definition and your API App will not function properly in the portal and within Logic Apps.

How do I make other non-.NET API Apps run?

As I explained before, the underlying infrastructure is basically a Web App. That means you can run any technology that's supported as a Web App, like Node.js, PHP, Python, Java and more. In my case I'm going to use Node.js. To make App Service understand that your app is an API App, you need the file(s) I explained above (at this time, only apiapp.json).

Building a Node.js API App

To run the Node.js app I will follow the same path as if I were about to deploy it as a Web App, and I will add the files required by App Service. Your structure should look something like the one below:

[Screenshot: the root folder of the Node API App]

 

I haven't created a "Metadata" folder because, as I mentioned, it is currently not used by App Service. This is a simple Node.js app using the Express framework. I chose Express for this example for a few reasons:

  • It’s easy to use and build something that demonstrates the technology
  • Every other module I found didn't support Swagger 2.0, only 1.2; App Service requires 2.0, otherwise it's not going to work
  • The module I used is swaggerize-express by PayPal

I have an apiapp.json file, which is my App Service API App metadata file; api.json, which is my Swagger 2.0 spec file; server.js, which is what App Service will run to start my Node app; and lastly package.json, which contains information about the Node app I'm deploying.
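For completeness, the package.json for an app like this could look roughly like the sketch below; the package name, description and version ranges are just placeholders:

    {
        "name": "node-api-app",
        "version": "1.0.0",
        "description": "Sample Node.js API App for Azure App Service",
        "main": "server.js",
        "scripts": {
            "start": "node server.js"
        },
        "dependencies": {
            "express": "^4.0.0",
            "swaggerize-express": "^4.0.0"
        }
    }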

server.js

What the server.js code does is serve the api.json file, which contains the Swagger 2.0 spec. In there, the REST API URI is of the /v1/petstore format. We also dictate that /api-docs is where the Swagger 2.0 metadata can be found, and we provide the handlers directory with the Express route handlers for anyone accessing the API. Lastly, there is a simple "Hello World!" message in case you try to access the root URI instead of the REST API.
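A minimal server.js along those lines could look like the sketch below. It assumes the swaggerize-express options of the time (api, docspath, handlers) and the usual App Service/iisnode port handling, so treat it as a sketch rather than the exact code:

    var http = require('http');
    var express = require('express');
    var swaggerize = require('swaggerize-express');

    var app = express();

    // Wire up the Swagger 2.0 spec (api.json), expose the metadata at /api-docs
    // and map the REST routes to the files in the handlers directory.
    app.use(swaggerize({
        api: require('./api.json'),
        docspath: '/api-docs',
        handlers: './handlers'
    }));

    // Simple message for anyone hitting the root URI instead of the REST API.
    app.get('/', function (req, res) {
        res.send('Hello World!');
    });

    // App Service (iisnode) hands the port (a named pipe) to the app via process.env.PORT.
    var port = process.env.PORT || 8000;
    http.createServer(app).listen(port);

If I remember the swaggerize-express convention correctly, each handler file mirrors a path from the spec and exports an object with one function per HTTP verb (get, post and so on).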

apiapp.json

In the apiapp.json file, along with the metadata, we provide the important apiDefinition property, which points to where our Swagger 2.0 metadata is.
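For this Node app, the file could be as simple as the sketch below, with apiDefinition pointing to the /api-docs path that server.js exposes (the other values are placeholders):

    {
        "id": "NodePetstoreApi",
        "namespace": "example.com",
        "gateway": "2015-01-14",
        "version": "1.0.0",
        "endpoints": {
            "apiDefinition": "/api-docs"
        }
    }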

Deploying the API App

The tutorial link I provided before explains how you can deploy using Git. In our case you have to create an empty API App, not a Web App. Choose an appropriate name and create the API App. To find where to enable continuous deployment, make sure you go to the API App host; you can find that in the Settings of your API App. From there, follow the steps as they are explained in the tutorial to deploy the Node API App.

[Screenshot: the API App host setting]

Accessing the API App

Once the Git deployment finishes, if you already had another API App running in the same deployment, you have to restart your gateway so it picks up the correct metadata files from your API App; if you didn't, there is no need to restart the gateway. Find the gateway in the API App Settings, click on it, and then click "Restart" on the top toolbar. You can verify that everything works OK by clicking on "API Definition":

[Screenshot: the API Definition option]

You should see something like this on the blade that will open:

[Screenshot: the API Definition blade listing the API's operations]

To access your API App, you can use the URL from your API App settings, which is in the form of microsoft-apiappGUID.azurewebsites.net. If your tier allows it, you can enable custom domains (among other capabilities) and make this look prettier.

We’re done. Where is the code?

That’s it! Your Node API App is now running on App Service.

You can find the Node API App in a GitHub repo and grab it from there.

What’s next?

This is a series of blog posts explaining API Apps and Logic Apps. My next blog post will explain how you can integrate an ASP.NET MVC Web API API App (that's a mouthful, isn't it?) and a Node.js API App using a Logic App.

In the meantime download the bits, start playing with the platform for free and stay tuned!

You can find Part 1 here.

Feel free to reach out to me if you have any other questions.

Panos

Part 1: Azure App Service API Apps in more detail and some FAQ

What is Azure App Service?

In case you missed it, on the 24th we announced Azure App Service. In simple words, it's a common platform where you can develop Web, Mobile, Logic and API Apps. It's based on proven technologies, more specifically Azure Web Sites, which is now called Web Apps. The fact that there is a common infrastructure underneath gives us a lot of benefits, like being able to scale apps together or independently, share resources, seamlessly integrate the apps with each other, and many more.

 

Building API Apps

One of the new types of apps you can build is API Apps. You might wonder: couldn't I just have a Web App and put an API there? The answer is yes, but an API App has more to it than just hosting. When you choose to build an API App, some of the advantages are:

  1. We can auto-generate the code needed for that API to be consumed by your apps
  2. You can use your API Apps in Logic Apps
  3. The API App can be listed in the galleries (public or private, including the marketplace) (not currently available in Preview)
  4. API Apps can be auto-updated (not currently available in Preview)

An API App can be built with any technology a Web App can, meaning .NET, Node.js, PHP, Python, Java etc. When you build and deploy an API App, what happens behind the scenes is that we create a Web App for you and put some special metadata around it so we know that it is an API App. The API App also needs a special file in the root, called apiapp.json, which contains the API App related metadata like author, summary, id and many more that are not currently used in the preview but will light up in the future. You can find more information at this page; towards the middle of the page there is a detailed description of all the properties you can add to the apiapp.json file.

[Screenshot: apiapp.json]
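To give you an idea, such a file could look roughly like the sketch below. The property names follow the preview schema as I remember it and the values are made up, so double-check them against the page linked above:

    {
        "id": "ContactsList",
        "namespace": "microsoft.com",
        "gateway": "2015-01-14",
        "version": "1.0.0",
        "title": "Contacts List",
        "summary": "A sample API App that manages a list of contacts",
        "author": "Contoso",
        "endpoints": {
            "apiDefinition": "/swagger/docs/v1"
        }
    }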

To be able to start building these apps, make sure you get the latest Azure SDK. Currently the Azure App Service SDK only works with Visual Studio 2013. Once you download and install the SDK you have two options: migrate an existing Web API project or create a new one.

How does it work?

The power of API Apps comes from how easy they are to consume and discover. Our long-term vision is to enable anything from regular REST APIs to even OData APIs to run on the platform and benefit from it. To be able to do that, we need a way to describe what those APIs are capable of and an extensible model that expresses their capabilities. That happens by using Swagger 2.0. For the API App to light up properly you need a Swagger 2.0 metadata endpoint which our platform will reach out to in order to discover the capabilities of your API, effectively finding the API definition. The Swagger 2.0 metadata is only required for that reason: your customers, you, or whoever wants to consume the API don't need any special tooling, so there is no coupling here. In case you use our Azure API App Client, you can point our tooling to your API App and we will generate the code for you. The client generation works with any valid Swagger 2.0 metadata file, so even if the API App is not running, as long as you have the Swagger 2.0 metadata file, we can generate the code for you. The code is then yours and you can extend it as much as you like, as it's generated as partial classes.
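For reference, what the platform reads from that endpoint is a standard Swagger 2.0 document. A bare-bones example (the path and operation names here are made up) looks like this:

    {
        "swagger": "2.0",
        "info": {
            "title": "Contacts List",
            "version": "1.0.0"
        },
        "paths": {
            "/api/contacts": {
                "get": {
                    "operationId": "Contacts_Get",
                    "responses": {
                        "200": { "description": "OK" }
                    }
                }
            }
        }
    }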

Azure App Service API App architecture overview

Once you create an API App you might want to deploy it to your subscription. When you are about to deploy the app, you'll be asked for a few things:

  1. App Service Plan: This describes what kind of capabilities the underlying infrastructure will have (e.g. pricing tier, how many VMs are going to host your app, etc.).
  2. Resource Group: This is a separation unit for your resources. It helps when you want to separate the resources available to different API Apps or even isolate the discovery of API Apps altogether. Logic Apps within a resource group can only discover other API Apps within that resource group. It's also very easy to manage resources like that, as you can see all the dependencies and such.
  3. Access Level: Who is able to access your API? Options are Public Anonymous, Internal and Public Authenticated.
  4. Region: The region you want your API App to be deployed in.

[Screenshot: the API App publish dialog]

After the deployment finishes, you will notice that within your resource group there is another app called "Gateway". Conceptually, all the API Apps sit behind the gateway. The gateway handles things like API management, authentication and more in the future. When you try to access an API App, there is communication with the gateway which, in the case of an authenticated call, also flows the token to the API App being accessed. The gateway also contains metadata about the API Apps, like their definition, name, version of the package and more.

Implementing a microservice architecture

As you can imagine, you can create multiple, independent API Apps that communicate with each other, have their own persistence layer, share authentication and much more. You can actually implement a microservice architecture that way. There is no precise definition right now, but the idea (as described by Martin Fowler) is having lightweight services that are structured around business capabilities, have automated deployments, keep the intelligence in the endpoints and are decentralized in terms of languages and data. The protocol of choice can be HTTP, but that's not absolutely necessary; in our case, though, that's how API Apps work.

FAQ

  • Am I tightly coupled to the service if I generate code for an API App? Is this another “Add Service Reference…” thing?
    • No, you're not, and no, it's not. The Swagger 2.0 metadata is used to describe the capabilities of the API. When you generate the code using the Azure API App Client, we're not adding any reference or anything; we're just generating the code you would write anyway to access the API. You don't have to generate the code if you don't want to. You can consume the API like any other API out there, by writing manual HttpClient code and JSON serialization.
  • How can I update my API App?
    • Before the galleries and the packaging are released, all you have to do is a Publish (Deploy), just as you would for a Web App. That's it. The experience will stay just as easy once the galleries are out as well.
  • If my API is accessed rarely but I still need quick responses, how do I do that?
    • If you remember, the underlying container for an API App is a Web App. The benefit of using a common infrastructure is that you get a common set of capabilities. That means you can go to the Settings of your API App's host and enable the Always On feature. Open the blade of the API App in the preview portal and look for the API App Host setting. Click it and that should get you to the API App host. There, click on Settings -> Application Settings, find the Always On option on the blade that opens and switch it to On. Click Save and you're done.

What’s next?

This is a series of blog posts explaining API Apps and Logic Apps. My next blog post will be explaining how you can run a Node.js App as an API App.

The second part is available here.

In the meantime download the bits, start playing with the platform for free and stay tuned!

Feel free to reach out to me if you have any other questions.

Panos

My blog is not dead, I promise.

It’s not dead.

I promise.

I haven't written a single thing for a looooooooooong time. It was for a good reason. So what happened? Well:

  1. I joined Microsoft. Which means I had to drop my MVP status.
  2. I’m a Technical Evangelist, writing code, spreading knowledge.
  3. I moved to Vancouver, BC to be closer to my team.
  4. My beautiful wife followed me.

In between those steps, there were a lot of talks at TechEds, TechDays etc. I've been all around Europe and a bit of the US.

I told you, my blog is not dead.

More is coming soon.

Cheers,
Panos

VMDepot is now integrated into the Windows Azure Portal

A nice change I noticed today is that VMDepot images are now visible in the Windows Azure Portal.

If you go to Virtual Machines

[Screenshot: the Virtual Machines option]

You'll see an option that says "Browse VMDepot":

[Screenshot: the Browse VMDepot option]

If you click it, you get the list of the images already in the VM Depot:

[Screenshot: the list of VM Depot images]

 

You can select one and create a virtual machine based on that image, just like that! :)

The coolest part of all is that you can create your own images, publish them to VM Depot and, if they get accepted, they become visible in the portal as well.

A small addition, but a lot of value comes out of it!

Cheers,

PK

Windows Azure and VM Depot from MS Open Tech

VM Depot allows users to build, deploy and share their favorite Linux configurations, create custom open source stacks, work with others and build new architectures for the cloud that leverage the openness and flexibility of the Azure platform.

How it works

All images shared via this catalog are open for interaction: other users of VM Depot can leave comments and ratings, and even remix images to their liking and share the results with other members of the community. Currently, on catalogs similar to VM Depot (such as Amazon Web Services'), users must pay to publish all versions of their images. With VM Depot, publishing an image is free, to encourage users to take full advantage of shared insights and experience, as well as to encourage collaboration towards the best possible version of an image. The shared community environment also means users can access this catalog and use basic image frameworks created by others, so they can quickly continue building without starting from scratch.

VM Depot was made possible with the support of a number of partners who have contributed images and packages for this preview launch including Alt Linux, Basho, Bitnami and Hupstream.

FAQ

Q: Why has Microsoft Open Technologies Developed VM Depot?

A: VM Depot has been developed to help developers and IT pros take advantage of the open capabilities of the Windows Azure platform. VM Depot is a community-driven catalog of open source virtual machine images for Windows Azure. Using VM Depot, the community can build, deploy and share their favorite Linux configuration, create custom open source stacks, and work with others to build new architectures for the cloud that leverage the openness and flexibility of the Windows Azure platform.

Some links:

Port 25 blog

Interoperability@Microsoft blog

More info:

VM Depot delivers more options for users bringing their custom Linux images to Windows Azure Virtual Machines

http://blogs.msdn.com/b/windowsazure/archive/2013/01/09/vmdepot-delivers-more-options-for-users-bringing-their-custom-linux-images-to-windows-azure-virtual-machines.aspx

VM Depot from MS Open Tech

http://blogs.technet.com/b/port25/archive/2013/01/09/for-your-oss-image-building-and-sharing-pleasure-meet-vm-depot-from-ms-open-tech.aspx


Speaking at Windows AzureConf

What is Windows AzureConf

On November 14, 2012, Microsoft will be hosting Windows AzureConf, a free event for the Windows Azure community. This event will feature a keynote presentation by Scott Guthrie, along with numerous sessions executed by Windows Azure community members. Streamed live for an online audience on Channel 9, the event will allow you to see how developers just like you are using Windows Azure to develop applications on the best cloud platform in the industry. Community members from all over the world will join Scott in the Channel 9 studios to present their own inventions and experiences. Whether you’re just learning Windows Azure or you’ve already achieved success on the platform, you won’t want to miss this special event. (Source: http://www.windowsazureconf.net)

Why register?

First of all, it's free! But that's not the real value of it. The real value is the opportunity to watch and learn from real-world examples of how Windows Azure was used to build real-world applications. The presentations and sessions are going to be delivered by community members, the same people who worked on those applications.

My session

I will be at the Channel 9 studios in Redmond, as my session (Windows Azure + Twilio == A Happy Tale to Tell) was selected to be one of them, and I would love to see you online and answer your questions.

Register now at http://www.windowsazureconf.net

Windows Azure New Portal Improvements

I accessed the portal today and a nice surprise was waiting for me. Among other improvements that I either haven't spotted yet or that aren't visible to us (backend changes), there were two very welcome additions:

1) Service Bus can now be managed from the new portal!

Under the “App Services” category, you can now find the new options to create a new “Queue”, “Topic” or “Relay” service point.

In case you select "Custom create", you can directly create the namespace from the wizard, instead of having to first create the namespace and then add entities (services) to it.

You can also track information regarding your namespace's Queues, Topics or Relays directly from the portal, e.g. size, transactions etc.

Last but not least, you can retrieve the connection string from the portal. No more mistakes or looking for it when needed.

2) You can now manage the users (co-admins) of your subscriptions directly from the "Settings" menu of the new portal. It used to be that you had to switch to the old Silverlight portal to do this.

 

Very few services are still missing from the new portal, and I love how the team is making it better every single day. And they do listen to feedback, so make sure you send yours!

PK


How to run SugarCRM on Windows Azure Web Sites using WebMatrix 2

Windows Azure Web Sites (WAWS) is a powerful hosting platform provided by Microsoft on Windows Azure. One of the coolest features is that you can run your web sites/web apps pretty easily, and it's not limited to ASP.NET: PHP is supported as well. On the database level, it's not only SQL databases (MSSQL) that are supported, but also MySQL.

Creating the Web Site

In my previous post, on how to migrate your WordPress blog to WAWS, I demonstrated some of the options you have when you create a new web site.

In our case we just want one that has a database which we’re going to use to migrate our SugarCRM installation. So select “Create with Database”.

In the second step we choose the name and agree to the terms. In the preview there is a limitation of one MySQL database per account, so if you already have a DB like I did, you'll get a notification and you won't be able to go forward. In that case, select an existing MySQL database in the previous step instead of "Create a new MySQL Database".

Once this is finished, WAWS will provision your web site and in the next 5-10 seconds you’ll have your web site ready to publish your files to it.

Using WebMatrix 2 to create the SugarCRM installation

If you're familiar with WebMatrix 2, you'll know that creating, installing and running a number of web apps is a matter of following a wizard and choosing the correct options. WebMatrix takes care of the rest, like setting up the hosting locally, downloading dependencies, installing and configuring them, and even publishing/migrating your data and files when you upload them to your hoster (with Web Deploy). In my case, I created a new SugarCRM installation from the Application Gallery:

Finish the wizard and you'll end up with a brand new, working installation of SugarCRM. If you don't have MySQL installed, it will install and configure it for you. Also, make sure you download MySQL Workbench, because we'll need it later to connect to and inspect the database, and also to migrate the local MySQL database to the one created in the cloud.

Uploading the website using WebDeploy

Once you finish the local installation of SugarCRM and you get the login screen, you're ready to publish your web site to WAWS. There are two ways to do that; one is to use an FTP client like FileZilla with the credentials from the portal:

Since we use WebMatrix, there is an even easier way to publish, using Web Deploy. You can download your publish settings and import them into your WebMatrix installation. To download your settings, go to the portal, click on your web site and then choose "Download Publish Profile":

Once you do this, you can import your profile to WebMatrix and use it to Publish your website:

Now you can go ahead and publish your web site to WAWS. If everything goes OK, WebMatrix should update your config.php with the correct connection settings imported from your publish profile. In case this doesn't happen, you have to update SugarCRM's config.php file with the correct MySQL connection settings yourself.

Uploading/migrating your database

WebMatrix will prompt you to upload your database along with the files, but this is not going to work, as the local database name doesn't match the name of the database created by WAWS and you don't have permission to create a new one anyway. I was being lazy and, as I didn't want to install anything more than MySQL Workbench on my machine, I exported the data from my local MySQL database, used Notepad++ to replace the database name in all of the exported files, and then used MySQL Workbench again to import them into my WAWS MySQL database. Here it is step by step:

Export:

Then I renamed the schema using Notepad++: a simple "Find/Replace" on the dump file to swap the database name. The last step was to import that into the WAWS MySQL database:

It still says "sugarcrm143", which is my local database name, but the file uses the WAWS MySQL database name, so you can safely ignore that.

Again, as I said, I was being lazy. What you could do instead is export the database using the MySQL command-line tools and then import it directly into the database you want.
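If you go down that route, a rough sketch of the commands would be something like the following; the host, user and database names are placeholders you'd take from the connection string in the portal:

    # Export the local database (mine was called sugarcrm143).
    mysqldump -u root -p sugarcrm143 > sugarcrm-dump.sql

    # Import the dump into the MySQL database created for the WAWS web site.
    mysql -h <waws-mysql-host> -u <waws-user> -p <waws-database-name> < sugarcrm-dump.sql

Since a plain mysqldump of a single database doesn't embed the database name in the dump, there is nothing to rename this way.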

You’re done!

Well, that was pretty much it. You uploaded your SugarCRM installation, you uploaded your database data (I used demo data in mine) and your web site is now hosted on WAWS with all the benefits that come with it, like scaling up or down in the shared environment or even running it on dedicated instances. All with just a slider in the portal! You can access my installation at http://sugarcrm.azurewebsites.net. The first time you try to access it, it might take a bit longer (8-10 seconds) if the web site is sleeping, but after that the responses are sub-second.

PS: I've decided to take the web site down, as it had been running for a month and it was only there for the purpose of this post. Here you can find a screenshot of the web site running.

Comments, suggestions or corrections are more than welcome!

Thank you,

PK