A web developer’s guide to help eliminate non-coding tasks and get code done faster

Microsoft has just published this great eBook, which helps web developers cut right to the code and eliminate non-coding tasks such as scripting environments and opening firewall ports. It provides an overview of key services in the cloud, advice on which services to use based on your needs, step-by-step guidance, sample code, sample applications, and a free account to get started.


Here is a list of the areas it covers:

  • When to Use It
  • Moving an Existing ASP.NET Website to App Service
  • Identity Management
  • Scaling your Web App
  • Caching for Performance
  • Better Customer Experience with a CDN
  • Detect Failures Faster
  • Building a New Website on Azure App Service

 

Deployments Best Practices


Introduction

This guide is aimed at helping you better understand how to deal with deployments in your development workflow, and it provides some best practices. Sometimes a bad production deployment can ruin all the effort you have invested in the development process. Having a solid deployment workflow can become one of the greatest advantages of your team.

Before you start, I recommend reading our Developing and Deploying with Branches guide first to get a general idea of how branches should be set up in your repository, so you can fully utilize the tips from this guide. It’s a great read!

Note on Development Branch

In this guide you will see a lot of references to a branch called development. In your repository you can use master (Git), trunk (Subversion) or default (Mercurial) for the same purpose, there’s no need to create a branch specifically called “development”. I chose this name because it’s universal for all version control systems.

The Workflow

Deployments should be treated as part of a development workflow, not as an afterthought. If you are developing a web site or an application, your workflow will usually include at least three environments: Development, Staging and Production. In that case the workflow might look like this:

  • Developers work on bugs and features in separate branches. Really minor updates can be committed directly to the stable development branch.
  • Once features are implemented, they are merged into the staging branch and deployed to the Staging environment for quality assurance and testing.
  • Once testing is complete, feature branches are merged into the development branch.
  • On the release date, the development branch is merged into production and then deployed to the Production environment.
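
The four steps above can be sketched with Git in a throwaway repository. This is a minimal illustration, not a prescribed script; the feature name signup-form is made up:

```shell
# A minimal sketch of the four-step branch workflow, run in a temporary repo.
set -e
cd "$(mktemp -d)"
git init -q
git config user.email dev@example.com
git config user.name "Dev"
git commit -q --allow-empty -m "Initial commit"
git branch -m development                     # the stable development branch
git branch staging                            # staging starts from it
git branch production                         # production starts from it

git checkout -q -b signup-form development    # 1. feature work in its own branch
git commit -q --allow-empty -m "Add signup form"

git checkout -q staging                       # 2. merge into staging for QA
git merge -q --no-ff -m "Merge signup-form into staging" signup-form

git checkout -q development                   # 3. testing passed: into development
git merge -q --no-ff -m "Merge signup-form into development" signup-form

git checkout -q production                    # 4. release day: development into production
git merge -q --no-ff -m "Merge development into production" development
```

Using --no-ff means every merge leaves a merge commit, so the history shows exactly when each feature reached each environment.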

Let’s take a closer look at each environment to see the most efficient way to deploy each of them.

Development Environment

If you make web applications, you don’t need a remote development environment; every developer should have their own local setup.

Many customers have Development environments set up with automatic deployments on every commit or push. While this gives developers the small advantage of not having to install the site or the application on their own computers for local testing, it also wastes a lot of time. Every tiny change must be committed, pushed and deployed before it can be verified. If a change was made by mistake, a developer has to revert it, push it, then redeploy.

Testing on a local computer removes the need to commit, push and deploy entirely. Every change can be verified locally first; then, once it’s more or less stable, it can be pushed to a Staging environment for proper quality assurance testing.

What automatic deployment to a Development environment does provide, however, is a way to verify that the deployment process itself works in a clean, independent installation, far removed from the developers’ machines.

We do not recommend using deployments for rapidly changing development environments. Running your software locally is the best choice for that sort of testing.

We recommend deploying to the development environment automatically on every commit or push. This will ensure that the build process is fully working.

Staging Environment

Once the features are implemented and considered fairly stable, they get merged into the staging branch and then automatically deployed to the Staging environment. This is when quality assurance kicks in: testers go to staging servers and verify that the code works as intended.

It is very handy to have a separate branch called staging to represent your staging environment. It will allow developers to deploy multiple branches to the same server simultaneously, simply by merging everything that needs to be deployed to the staging branch. It will also help testers understand what exactly is on staging servers at the moment, just by looking inside the staging branch.

We recommend always deploying major releases to staging at a scheduled time that the whole team is aware of.

Production Environment

Once the feature is implemented and tested, it can be deployed to production. If the feature was implemented in a separate branch, it should be merged into a stable development branch first. The branches should be deleted after they are merged to avoid confusion between team members.

The next step is to diff the production and development branches to take a quick look at the code that will be deployed to production. This gives you one last chance to spot something that’s not ready or not intended for production: things like debugger breakpoints, verbose logging or incomplete features.

Once the diff review is finished, you can merge the development branch into production and then initiate a deployment of the production branch to your production environment by hand. Specify a meaningful message for your deployment so that your team knows exactly what you deployed.

Make sure to only merge development branch into production when you actually plan to deploy. Don’t merge anything into production in advance. Merging on time will make files in your production branch match files on your actual production servers and will help everyone better understand the state of your production environment.
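
The review-then-merge steps above might look like this in Git (the repository layout is the one assumed throughout this guide; the release contents are invented):

```shell
# Sketch: review pending changes, then merge development into production.
set -e
cd "$(mktemp -d)"
git init -q
git config user.email rel@example.com
git config user.name "Rel"
git commit -q --allow-empty -m "Initial commit"
git branch -m development
git branch production
git commit -q --allow-empty -m "Add search fix"   # work already merged into development

# One last look at what is about to hit production:
git diff production..development
git log --oneline production..development

# Review done: merge on release day with a meaningful message,
# then trigger the deployment of the production branch by hand.
git checkout -q production
git merge -q --no-ff -m "Release: search fix for checkout page" development
```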

We recommend always deploying major releases to production at a scheduled time that the whole team is aware of. The deployment should be triggered manually, not automatically; it can be as simple as clicking a link or moving some files, but a human must start the process. Find the time when your application is least active and use that window to roll out updates. This may sound obvious, but make sure it’s not too late in the day, because someone needs to be around for at least a few hours after the deployment to monitor the application and make sure everything went fine. Urgent production fixes can be deployed at any time.

After the deployment finishes, make sure to verify it. It is best to check all the features or fixes that you deployed to make sure they work properly in production. It is a big win if your deployment tool can send an email to all team members with a summary of changes after every deployment. This helps team members understand what exactly went live and how to communicate it to customers. Beanstalk does this for you automatically.

Your deployment to production is now complete, time to pop champagne and celebrate with your team!

Rolling Back

Sometimes deployments don’t go as planned and things break. In that case you have the option to roll back. However, you should be as careful with rollbacks as with production deployments themselves; sometimes a rollback brings more havoc than the issue it was trying to fix. So first of all, stay calm and don’t make any sudden moves. Before performing a rollback, answer the following questions:

Did it break because of the code that I deployed, or did something else break?

You can only roll back files that you deployed, so if the source of the issue is something else, a rollback won’t be much help.

Is it possible to rollback this release?

Not all releases can be rolled back. Sometimes a release introduces a new database structure that is incompatible with the previous release. In that case, if you roll back, your application will break.

If the answer to both questions is “yes”, you can roll back safely. After the rollback is done, make sure to fix the bug you discovered and commit it to either the development branch (if it was minor) or a separate bug-fix branch. Then proceed with the regular integration workflow: bug-fix → staging, then bug-fix → development → production.
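
If your production environment is deployed from a Git branch as described above, one hedged way to express a rollback is to revert the release merge commit (the file and messages below are invented for illustration):

```shell
# Sketch: roll back a bad release by reverting the merge on production.
set -e
cd "$(mktemp -d)"
git init -q
git config user.email ops@example.com
git config user.name "Ops"
git commit -q --allow-empty -m "Initial commit"
git branch -m development
git branch production
echo "broken" > feature.txt                   # a release that turns out bad
git add feature.txt
git commit -q -m "Add feature"
git checkout -q production
git merge -q --no-ff -m "Release: feature" development

# Roll back: revert the merge, keeping production's side (-m 1 = first parent),
# then redeploy the production branch.
git revert -m 1 --no-edit HEAD
```

Note that after reverting a merge, Git considers those changes already applied, so the eventual fix should come through a fresh bug-fix branch rather than by re-merging the reverted release.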

Deploying Urgent Fixes

Sometimes you need to deploy a bug-fix to production quickly, when your development branch is not ready for release yet. The workflow in that case stays the same as described above, but instead of merging the development branch into production you actually merge your bug-fix branch first into the development branch, then separately into production, without merging development into production. Then deploy the production branch as usual. This will ensure that only your bug-fix will be deployed to the production environment without all the other stuff from the development branch that’s not ready yet.

It is important to merge the bug-fix branch to both the development and production branches in this case, because your production branch should never include anything that doesn’t exist in your stable development branch. The development branch is where developers work all day, so if your fix is only in the production branch they will never see it and it can cause confusion.
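
A sketch of this urgent-fix flow in Git, branching the fix from production so it carries nothing unreleased (all branch and commit names are invented):

```shell
# Sketch: deploy a hotfix without shipping unfinished development work.
set -e
cd "$(mktemp -d)"
git init -q
git config user.email fix@example.com
git config user.name "Fix"
git commit -q --allow-empty -m "Initial commit"
git branch -m development
git branch production
git commit -q --allow-empty -m "Half-finished feature"   # in development, not releasable

git checkout -q -b login-fix production                  # urgent fix, based on production
git commit -q --allow-empty -m "Fix crash on login"

git checkout -q development                              # merge the fix into development...
git merge -q --no-ff -m "Merge login-fix into development" login-fix
git checkout -q production                               # ...and separately into production
git merge -q --no-ff -m "Merge login-fix into production" login-fix
```

Production now contains the fix but not the half-finished feature, and development also has the fix so it cannot be lost later.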

Automatic Deployments to Production?

I can’t stress enough how important it is for all production deployments to be performed and verified by a responsible human being. Using automatic deployments for the Production environment is dangerous and can lead to unexpected results. If every commit is deployed to your production site automatically, imagine what happens when someone commits something by mistake, or pushes an incomplete feature in the middle of the night while the rest of the team is asleep. Automatic deployments make your Production environment very vulnerable. Please don’t do that; always deploy to production manually.

Permissions

Every developer should be able to deploy to the Staging environment. They just need to make sure they don’t overwrite each other’s changes when they do. That’s exactly why the staging branch is a great help: all changes from all developers get merged into it, so it contains all of them.

Your Production environment should ideally be accessible only to a limited number of experienced developers. These developers should always be prepared to fix the servers immediately if a deployment goes rogue.

Conclusion

We’ve been using this workflow with many customers for many years to deploy their applications. Some of these things were learned the hard way, through broken production servers. Follow these guidelines and your production deployments will become incredibly smooth and won’t cause any stress at all.

Original Article

Visual Studio Team Services Security

Using Azure can open up a can of worms around security, and many customers have concerns.

Microsoft does a lot of things to keep your Team Services project safe and secure; refer to this link for details: Visual Studio Team Services Data Protection Overview.

You can deploy your own build agent, over which you have full control, and it is easy to configure your machines to accept deployments only from that build agent.

Here is another link I found, from Microsoft Virtual Learning, which might be useful:

Getting Started with Azure Security for the IT Professional

Do IT security concerns keep you up at night? You’re not alone! Many IT Pros want to extend their organization’s infrastructure but need reassurance about security. Whether you are researching a hybrid or a public cloud model with Microsoft Azure, the question remains the same: Does the solution meet your own personal and your organization’s bar for security, including industry standards, attestations, and ISO certifications? In this demo-filled course, explore these and other hot topics, as a team of security experts and Azure engineers takes you beyond the basic certifications and explores what’s possible inside Azure. See how to design and use various technologies to ensure that you have the security and architecture you need to successfully launch your projects in the cloud. Dive into datacenter operations, virtual machine (VM) configuration, network architecture, and storage infrastructure. Get the information and the confidence you need, from the pros who know, as they demystify security in the cloud.

This article is very useful if you need to deploy from a remote server:

http://myalmblog.com/2014/04/configuring-on-premises-build-server-for-visual-studio-online/

Azure Security Center

It is important that when using Azure you take security very seriously. I’ve been looking around to gather as much information as I can to help businesses protect the environments that they rely on so much to run their businesses.

I found these short introductions to Azure Security Center which are worth looking over:

Introduction to Azure Security Center

A brief overview of how Azure Security Center helps you protect, detect and respond to cybersecurity threats.

Azure Security Center Overview

With Azure Security Center, you get a central view of the security state of all of your Azure resources. At a glance, verify that the appropriate security controls are in place and configured correctly. Scott talks to Sarah Fender who explains it all!

Azure Security Center – Threat Detection

With Azure Security Center, you get a central view of the security state of all of your Azure resources. At a glance, verify that the appropriate security controls are in place and configured correctly. Scott talks to Sarah Fender who explains how Security Center integrates Threat Detection.

Azure Security Center – Focus on Prevention

Staying ahead of current and emerging threats requires an integrated, analytics-driven approach. By combining Microsoft global threat intelligence and expertise with insights into security-related events across your Azure deployments, Security Center helps you detect actual threats early, and it reduces false positives. Scott talks to Sarah Fender who breaks down the details.

Backup SQL Azure

There are a number of options for backing up SQL Azure, which can be found here:

Different ways to Backup your Windows Azure SQL Database

I like the Azure way, which is just exporting, importing, and setting a schedule.

Before You Begin

The SQL Database Import/Export Service requires you to have a Windows Azure storage account because BACPAC files are stored here. For more information about creating a storage account, see How to Create a Storage Account. You must also create a container inside Blob storage for your BACPAC files by using a tool such as the Windows Azure Management Tool (MMC) or Azure Storage Explorer.

If you want to import an on-premises SQL Server database to Windows Azure SQL Database, first export your on-premises database to a BACPAC file, and then upload the BACPAC file to your Blob storage container.

If you want to export a database from Windows Azure SQL Database to an on-premises SQL Server, first export the database to a BACPAC file, transfer the BACPAC file to your local server (computer), and then import the BACPAC file to your on-premises SQL Server.

Export a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob has a container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the required storage account, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, your SQL Database server, and then your database that you want to export.

  6. On the ribbon, click Export. This opens the Export Database to Storage Account window.

  7. Verify that the Server Name and Database match the database that you want to export.

  8. In the Login and Password boxes, type the database credentials to be used for the export. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.

  9. In the New Blob URL box, specify the location where the exported BACPAC file is saved. Specify the location in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/exportedfile.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the export operation.

  10. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  11. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  12. Click Finish to start the export. You should see a message saying Your request was successfully submitted.

  13. After the export is complete, you should attempt to import your BACPAC file into a Windows Azure SQL Database server to verify that your exported package can be imported successfully.

Database export is an asynchronous operation. After starting the export, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Note
An export operation performs an individual bulk copy of the data from each table in the database so does not guarantee the transactional consistency of the data. You can use the Windows Azure SQL Database copy database feature to make a consistent copy of a database, and perform the export from the copy. For more information, see Copying Databases in Windows Azure SQL Database.
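
The New Blob URL format from step 9 can be sketched as a few shell variables (the storage account, container, and file names below are hypothetical):

```shell
# Assemble the BACPAC blob URL in the format required by the Export window.
storage_account="myblobstorage"     # hypothetical storage account name
container="dac"                     # Blob container created beforehand
file_name="exportedfile.bacpac"     # .bacpac suffix is added if omitted

blob_url="https://${storage_account}.blob.core.windows.net"
bacpac_url="${blob_url}/${container}/${file_name}"
echo "$bacpac_url"
# → https://myblobstorage.blob.core.windows.net/dac/exportedfile.bacpac
```

Remember that the URL must be all lowercase and free of special characters, as the step above notes.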

Configure Automated Exports

Use the Windows Azure SQL Database Automated Export feature to schedule export operations for a SQL database, and to specify the storage account, frequency of export operations, and to set the retention period to store export files.

To configure automated export operations for a SQL database, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click the SQL database name you want to configure, and then click the Configuration tab.

  3. On the Automated Export work space, click Automatic, and then specify settings for the following parameters:

    • Storage Account
    • Frequency
      • Specify the export interval in days.
      • Specify the start date and time. The time value on the configuration work space is UTC time, so note the offset between UTC time and the time zone where your database is located.
    • Credentials for the server that hosts your SQL database. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.
  4. When you have finished setting the export settings, click Save.

  5. You can see the time stamp for the last export under Automated Export in the Quick Glance section of the SQL Database Dashboard.

To change the settings for an automated export, select the SQL database, click the Configuration tab, make your changes, and then click Save.

Create a New SQL Database from an Existing Export File

Use the Windows Azure SQL Database Create from Export feature to create a new SQL database from an existing export file.

To create a new SQL database from an existing export file, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click a SQL database name and then click the Configuration tab.

  3. On the Create from Export work space, click New Database, and then specify settings for the following parameters:

    • Bacpac file name – This is the source file for your new SQL database.
    • A name for the new SQL database.
    • Server – This is the host server for your new SQL database.

  4. To start the operation, click the checkmark at the bottom of the page.

Import and Export a Database Using API

You can also programmatically import and export databases by using an API. For more information, see the Import Export example on Codeplex.

Import a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob has a container, and the BACPAC file to be imported is available in the container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the storage account that contains the BACPAC file to be imported, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, and then your SQL Database server where you want to import the database.

  6. On the ribbon, click Import. This opens the Import Database from Storage Account window.

  7. Verify that the Target Server field lists the SQL Database server where the database is to be created.

  8. In the Login and Password boxes, type the database credentials to be used for the import.

  9. In the New Database Name box, type the name for the new database created by the import. This name must be unique on the SQL Database server and must comply with the SQL Server rules for identifiers. For more information, see Identifiers.

  10. From the Edition list, select whether the database is a Web or Business edition database.

  11. From the Maximum Size list, select the required size of the database. The list only specifies the values supported by the Edition you have selected.

  12. In the BACPAC URL box, type the full path of the BACPAC file that you want to import. Specify the path in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/file.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the import operation.

  13. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  14. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  15. Click Finish to start the import.

Database import is an asynchronous operation. After starting the import, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Original Article