Azure Security Center

Security deserves to be taken very seriously when you are using Azure, so I've been gathering as much information as I can to help businesses protect the environments they rely on so heavily to run their operations.

I found these short introductions to Azure Security Center, which are worth looking over:

Introduction to Azure Security Center

A brief overview of how Azure Security Center helps you protect against, detect, and respond to cybersecurity threats.

Azure Security Center Overview

With Azure Security Center, you get a central view of the security state of all of your Azure resources. At a glance, verify that the appropriate security controls are in place and configured correctly. Scott talks to Sarah Fender, who explains it all!

Azure Security Center – Threat Detection

With Azure Security Center, you get a central view of the security state of all of your Azure resources. At a glance, verify that the appropriate security controls are in place and configured correctly. Scott talks to Sarah Fender who explains how Security Center integrates Threat Detection.

Azure Security Center – Focus on Prevention

Staying ahead of current and emerging threats requires an integrated, analytics-driven approach. By combining Microsoft global threat intelligence and expertise with insights into security-related events across your Azure deployments, Security Center helps you detect actual threats early, and it reduces false positives. Scott talks to Sarah Fender, who breaks down the details.

Azure Resource Manager Compute Templates

Things have moved forward in Microsoft's Azure cloud platform since I first started using Azure. One thing is certain: the number of services available has grown, and with this growth comes the responsibility of ensuring that you can successfully deploy what you have built with the minimum amount of trouble in the quickest possible time. Along came Azure Resource Manager (ARM), the service used to provision resources in your Azure subscription. It was first announced at Build 2014 alongside the new Azure portal (portal.azure.com) and provides a new set of JSON-based APIs that are used to provision resources. Before ARM, developers and IT professionals used the Azure Service Management API and the old portal (manage.windowsazure.com) to provision resources. Both portals and both sets of APIs are still supported. However, going forward you should be using ARM, the new APIs, and the new Azure portal to provision and manage your Azure resources.

Azure Friday has a nice, easy overview of the Azure Resource Manager. Scott talks to Mahesh Thiagarajan about the new JSON-based Azure Resource Manager APIs, which now include Compute, Network, and Storage. Orchestrating large system deployments is easier and more declarative than ever.

 

The Benefits of using the Azure Resource Manager

There are many benefits of ARM that the original Service Management APIs (and the old portal) could not deliver. The advantages ARM provides can be broken down into a few areas:

Declaratively Provision Resources

Azure Resource Manager provides us with a way to describe the resources in a resource group using JSON documents. A resource group can contain many resource types, for example a Web App, a Storage Account, and a SQL Database. In the JSON document, you describe the properties for each resource, such as the type of storage account, the size of the SQL Database, and the settings for the Web App; there are many more options available. The advantage here is that we can describe the environment we want and send that description to Azure Resource Manager to make it happen.
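To give a feel for the format, here is a minimal sketch of such a JSON document that provisions a single storage account; the parameter name and API version are illustrative, not taken from a real deployment:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2015-06-15",
      "location": "[resourceGroup().location]",
      "properties": { "accountType": "Standard_LRS" }
    }
  ]
}
```

You hand this document (plus a matching parameters file) to ARM, and it works out how to create what is described.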

Smarter and Faster Provisioning of Resources

Before ARM, you had to provision resources independently to support your application, and it was up to you to figure out the dependencies between resources and accommodate them in your deployment scripts.

ARM can also determine when resources can be provisioned simultaneously, so all of the resources described in the resource group are provisioned faster.
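Where one resource genuinely does depend on another, you state that in the template with a dependsOn element, and ARM provisions everything that is not constrained in parallel. A rough fragment (the web app and hosting plan parameter names are placeholders) might look like this:

```json
{
  "type": "Microsoft.Web/sites",
  "name": "[parameters('webAppName')]",
  "apiVersion": "2015-08-01",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
  ],
  "properties": {
    "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
  }
}
```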

Resource Groups as a Unit of Management

Before ARM, the relationship between resources (e.g. a web app and a database) was something you had to manage yourself, including trying to compute costs across multiple resources to determine the all-up cost of an application. In the ARM era, since the resource group containing the resources for your application is a single unit of deployment, the resource group also becomes a unit of management. Grouping resources enables you to do things like determine costs for the entire resource group (and all the resources within it), making accounting and chargebacks easier to manage.

Before ARM, if you wanted to enable a user or group of users to manage the resources for a particular application, you had to make them a co-administrator on your Azure subscription. Those users then had full capability to add, delete, and modify any resource in the subscription, not just the resources for that application. With ARM, you can configure Role-Based Access Control for resources and resource groups, enabling you to assign management permissions for a resource group to only the users who need to manage it. When those users sign in to Azure, they will see the resource group you gave them access to but not the rest of the resources in your subscription. You can even assign Role-Based Access Control permissions to individual resources if you need to.
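As a quick sketch of what that looks like with the AzureRM PowerShell cmdlets (the sign-in name, role, and resource group below are placeholders made up for illustration):

```powershell
# Sign in to Azure Resource Manager
Login-AzureRmAccount

# Grant one user the Contributor role on a single resource group only
New-AzureRmRoleAssignment -SignInName "user@contoso.com" `
                          -RoleDefinitionName "Contributor" `
                          -ResourceGroupName "MyAppResourceGroup"
```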

Idempotent Provisioning of Resources

Before ARM, automating the provisioning of resources meant that you had to account for situations where some, but not all, of your resources would be successfully provisioned. With ARM, when you send it the JSON document describing your environment, ARM knows which resources already exist and which do not, and it will provision only the missing resources needed to complete the resource group.

Tools

The Azure portal is an excellent way to get started using Azure Resource Manager with nothing more than your browser. When you create resources using the Azure portal (portal.azure.com), you are using ARM. Eventually, you will need to write a deployment template that describes all the resources you want ARM to provision, and to make this scalable you will need to automate the deployment. You could start with something from the Azure Quickstart Templates, but at some point you will likely need (or want) to build your deployment template from scratch. For this, Visual Studio and PowerShell are the two tools I strongly recommend.

Visual Studio

When it is time to write your ARM deployment templates, you will find that Visual Studio and the Azure Tools deliver an incredible experience that includes JSON IntelliSense in the editor and a JSON outline tool to visualise the resources. Writing ARM deployment templates is not a trivial task, but by using Visual Studio and the Azure Tools you will be able to create deployment templates from scratch in a matter of minutes.

A version of the Azure Tools SDK is available for VS 2015. The Azure Tools provide an Azure Resource Group project template to get you started, as shown below. In subsequent posts, I'll be demonstrating how to use this project template.

PowerShell

The project template mentioned in the previous section generates a PowerShell deployment script that can be used to send your deployment to ARM in an automated manner. The script uses the latest Azure "RM" cmdlets, which you can download and install from here. Most of the code in the deployment script handles uploading artefacts that ARM may need to deploy your environment. For example, if your deployment template describes a virtual machine that requires a DSC extension to add additional configuration to the VM after Azure has created it, then the DSC package (zip file) needs to be generated and uploaded to a storage account for ARM to use while it is provisioning the VM. But if you scroll down past all that to the bottom of the script file, you will see two commands (shown below).

The first command, New-AzureRmResourceGroup, only creates the resource group using a resource group name that you provide.

The second command, New-AzureRmResourceGroupDeployment, pushes your deployment template and parameters for the deployment template to ARM. After ARM receives the files, it starts provisioning the resources described in your deployment template.
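Stripped of the artefact-upload plumbing, the end of the generated script boils down to something like the following (the resource group name, location, and file names are placeholders):

```powershell
# Create (or re-use) the resource group that will hold the deployment
New-AzureRmResourceGroup -Name "MyAppResourceGroup" -Location "West Europe"

# Push the template and its parameter file to ARM and start provisioning
New-AzureRmResourceGroupDeployment -ResourceGroupName "MyAppResourceGroup" `
                                   -TemplateFile ".\azuredeploy.json" `
                                   -TemplateParameterFile ".\azuredeploy.parameters.json"
```

Because ARM deployments are idempotent, re-running these two commands with the same template simply brings the resource group back in line with what the template describes.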

A workflow diagram of how these tools are used to deploy an environment using Azure Resource Manager is shown here.

Summary
In this post, We’ve introduced Azure Resource Manager and highlighted some significant advantages it brings to the platform. Concluded by adding a couple of tools you should consider using to author your deployment templates.

Original post by Rick Rainey

Backup SQL Azure

There are a number of options for backing up SQL Azure, which can be found here:

Different ways to Backup your Windows Azure SQL Database

I like the Azure way, which is simply exporting, importing, and setting up a schedule.

Before You Begin

The SQL Database Import/Export Service requires you to have a Windows Azure storage account because BACPAC files are stored there. For more information about creating a storage account, see How to Create a Storage Account. You must also create a container inside Blob storage for your BACPAC files by using a tool such as the Windows Azure Management Tool (MMC) or Azure Storage Explorer.

If you want to import an on-premises SQL Server database to Windows Azure SQL Database, first export your on-premises database to a BACPAC file, and then upload the BACPAC file to your Blob storage container.

If you want to export a database from Windows Azure SQL Database to an on-premises SQL Server, first export the database to a BACPAC file, transfer the BACPAC file to your local server (computer), and then import the BACPAC file into your on-premises SQL Server.
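As a rough outline of that first workflow, you can export the on-premises database with SqlPackage.exe and push the resulting BACPAC into your container with the Azure storage cmdlets; the paths, server, database, and account names below are placeholders:

```powershell
# Export an on-premises database to a BACPAC (SqlPackage.exe ships with SQL Server / SSDT;
# adjust the path to wherever it is installed on your machine)
& "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" `
    /Action:Export `
    /SourceServerName:"localhost" `
    /SourceDatabaseName:"MyDatabase" `
    /TargetFile:"C:\backups\MyDatabase.bacpac"

# Upload the BACPAC to the Blob storage container used by the Import/Export Service
$ctx = New-AzureStorageContext -StorageAccountName "myblobstorage" -StorageAccountKey "<primary access key>"
Set-AzureStorageBlobContent -File "C:\backups\MyDatabase.bacpac" -Container "dac" -Blob "MyDatabase.bacpac" -Context $ctx
```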

Export a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob storage has a container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the required storage account, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, your SQL Database server, and then the database that you want to export.

  6. On the ribbon, click Export. This opens the Export Database to Storage Account window.

  7. Verify that the Server Name and Database match the database that you want to export.

  8. In the Login and Password boxes, type the database credentials to be used for the export. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.

  9. In the New Blob URL box, specify the location where the exported BACPAC file is saved. Specify the location in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/exportedfile.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the export operation.

  10. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  11. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  12. Click Finish to start the export. You should see a message saying Your request was successfully submitted.

  13. After the export is complete, you should attempt to import your BACPAC file into a Windows Azure SQL Database server to verify that your exported package can be imported successfully.

Database export is an asynchronous operation. After starting the export, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Note
An export operation performs an individual bulk copy of the data from each table in the database, so it does not guarantee the transactional consistency of the data. You can use the Windows Azure SQL Database copy database feature to make a consistent copy of a database, and then perform the export from the copy. For more information, see Copying Databases in Windows Azure SQL Database.

Configure Automated Exports

Use the Windows Azure SQL Database Automated Export feature to schedule export operations for a SQL database, specify the storage account and the frequency of export operations, and set the retention period for storing export files.

To configure automated export operations for a SQL database, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click the SQL database name you want to configure, and then click the Configuration tab.

  3. On the Automated Export work space, click Automatic, and then specify settings for the following parameters:

    • Storage Account
    • Frequency
      • Specify the export interval in days.
      • Specify the start date and time. The time value on the configuration work space is UTC time, so note the offset between UTC time and the time zone where your database is located.
    • Credentials for the server that hosts your SQL database. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.
  4. When you have finished setting the export settings, click Save.

  5. You can see the time stamp for the last export under Automated Export in the Quick Glance section of the SQL Database Dashboard.

To change the settings for an automated export, select the SQL database, click the Configuration tab, make your changes, and then click Save.

Create a New SQL Database from an Existing Export File

Use the Windows Azure SQL Database Create from Export feature to create a new SQL database from an existing export file.

To create a new SQL database from an existing export file, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click a SQL database name and then click the Configuration tab.

  3. On the Create from Export work space, click New Database, and then specify settings for the following parameters:

    • Bacpac file name – This is the source file for your new SQL database.
    • A name for the new SQL database.
    • Server – This is the host server for your new SQL database.
  4. To start the operation, click the checkmark at the bottom of the page.

Import and Export a Database Using API

You can also programmatically import and export databases by using an API. For more information, see the Import Export example on Codeplex.
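If PowerShell suits you better than the Codeplex sample, the service-management cmdlets drive the same Import/Export Service; the sketch below is only an outline, and the server, storage account, container, and database names are placeholders:

```powershell
# Build contexts for the SQL Database server and the storage container (placeholder names)
$cred      = Get-Credential    # a server-level login for your SQL Database server
$sqlCtx    = New-AzureSqlDatabaseServerContext -ServerName "myserver" -Credential $cred
$stgCtx    = New-AzureStorageContext -StorageAccountName "myblobstorage" -StorageAccountKey "<primary access key>"
$container = Get-AzureStorageContainer -Name "dac" -Context $stgCtx

# Kick off an export to a BACPAC blob, then poll the asynchronous request
$request = Start-AzureSqlDatabaseExport -SqlConnectionContext $sqlCtx -StorageContainer $container `
                                        -DatabaseName "MyDatabase" -BlobName "MyDatabase.bacpac"
Get-AzureSqlDatabaseImportExportStatus -Request $request

# Start-AzureSqlDatabaseImport works the same way in the other direction
```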

Import a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob storage has a container, and that the BACPAC file to be imported is available in the container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the storage account that contains the BACPAC file to be imported, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, and then your SQL Database server where you want to import the database.

  6. On the ribbon, click Import. This opens the Import Database from Storage Account window.

  7. Verify that the Target Server field lists the SQL Database server where the database is to be created.

  8. In the Login and Password boxes, type the database credentials to be used for the import.

  9. In the New Database Name box, type the name for the new database created by the import. This name must be unique on the SQL Database server and must comply with the SQL Server rules for identifiers. For more information, see Identifiers.

  10. From the Edition list, select whether the database is a Web or Business edition database.

  11. From the Maximum Size list, select the required size of the database. The list only specifies the values supported by the Edition you have selected.

  12. In the BACPAC URL box, type the full path of the BACPAC file that you want to import. Specify the path in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/file.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the import operation.

  13. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  14. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  15. Click Finish to start the import.

Database import is an asynchronous operation. After starting the import, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Original Article