The way to handle unauthorised requests to Ajax actions in ASP.NET MVC

Problem

I have created a view that posts to an action via Ajax with the expectation that the action will return the requested data or an empty string.  Even better, I would like it to be configurable to return whatever value I see fit.

The problem arises when I decorate the called action with the [Authorize] attribute.  If the request is not authorized and I have a loginUrl configured in my web.config, my Ajax request will return the HTML output of the login page configured at loginUrl.  That is undesirable for a call that expects data.

Solution

I can extend the existing Authorize attribute by inheriting from the AuthorizeAttribute class.  Here is the code that extends the Authorize attribute:

public class AjaxAuthorizeOverrideAttribute : AuthorizeAttribute
{
    // The view to return to the Ajax caller when authorization fails
    public string View { get; set; }

    protected override void HandleUnauthorizedRequest(AuthorizationContext filterContext)
    {
        // Non-Ajax requests keep the default behaviour (redirect to loginUrl)
        if (!filterContext.HttpContext.Request.IsAjaxRequest())
        {
            base.HandleUnauthorizedRequest(filterContext);
            return;
        }

        // Ajax requests get the configured view instead; setting Result is
        // sufficient - the MVC pipeline executes it for us
        filterContext.Result = new ViewResult { ViewName = View };
    }
}
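The attribute above returns a view, but it is easy to adapt if your client script would rather receive a status code and a JSON payload. A sketch of that variant (the class name and payload shape are my own, not from the original article):

```csharp
public class AjaxAuthorizeJsonAttribute : AuthorizeAttribute
{
    protected override void HandleUnauthorizedRequest(AuthorizationContext filterContext)
    {
        // Non-Ajax requests keep the default redirect behaviour
        if (!filterContext.HttpContext.Request.IsAjaxRequest())
        {
            base.HandleUnauthorizedRequest(filterContext);
            return;
        }

        // Give the Ajax caller a 401 plus a small JSON body it can inspect
        filterContext.HttpContext.Response.StatusCode = 401;
        filterContext.Result = new JsonResult
        {
            Data = new { error = "unauthorized" },
            JsonRequestBehavior = JsonRequestBehavior.AllowGet
        };
    }
}
```

Your Ajax error handler can then branch on the 401 status rather than parsing returned HTML.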

Here is the attribute applied to the Ajax action in the controller class:

[AjaxAuthorizeOverride(View = "AjaxAuthorizeError")]
public ActionResult AjaxRequest()
{
     return View();
}

Note: there is no default view page being rendered.

 

Original article can be found here

Creating and Publishing to NuGet

Now that we have a NuGet server installed and running, we will want to publish our packages.

First you'll need to install the nuget.exe command-line bootstrapper.

Installing NuGet.exe

  • Download NuGet.exe
  • Place nuget.exe in a well-known location such as c:\utils on your machine
  • Make sure that location is in your PATH

Then two simple steps will build and publish your package. First, pack the project:

nuget pack MyProject.csproj -IncludeReferencedProjects -Prop Configuration=Release

Then push the package to the NuGet server, using the API key you set up in the web.config of the NuGet.Server application:

nuget push {package file} {apikey} -Source http://yourNugetServer/

That's it: your package is now available on your own private NuGet server.
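If the metadata in the project file isn't enough, you can pack from a .nuspec file instead (run nuget spec to generate a starting point). All values below are placeholders:

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyProject</id>
    <version>1.0.0</version>
    <authors>Your Name</authors>
    <description>A short description of the package.</description>
  </metadata>
</package>
```

You then pack it with nuget pack MyProject.nuspec and push the resulting .nupkg as above.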

For more details on Creating and Publishing a Package

Setting up a NuGet Server

Whether your company restricts which third-party libraries its developers may use, or you simply need a private repository to store all your protected work, a NuGet server is what you are after.

Before I start I need to mention NuGet Gallery: I spent a good two days trying to get it working correctly on Azure and failed, so in the end I built a very basic NuGet server to host all the libraries instead.

This can all be done in just four simple steps:

  1. Create a new ASP.NET Web Forms application
  2. Install the NuGet.Server package (http://www.nuget.org/packages/nuget.server)
  3. Edit your web.config to set your API key:  <add key="apiKey" value=""/>
  4. Deploy the application to your website
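For reference, step 3's setting sits in the appSettings section of web.config, roughly like this (the key names come from the NuGet.Server package; the value is your own secret):

```xml
<appSettings>
  <!-- Clients must supply this key when pushing or deleting packages -->
  <add key="apiKey" value="your-secret-key" />
  <!-- Set requireApiKey to false only if the server is not publicly reachable -->
  <add key="requireApiKey" value="true" />
</appSettings>
```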

That is it. If you fire up the site you should see the NuGet.Server welcome page.

You now have a hosted NuGet server. Next, we'll look at publishing your packages to it.

Full details can be found here: Host Your Own NuGet Server or How To Create Local NuGet Server

User Stories

Here are some helpful pointers for writing User Stories

 
Story template
  1. "As a <user or role>,
  2. I want <business functionality>,
  3. So that <business justification>."
 
Example:
  1. As an Account Holder,
  2. I want to be able to withdraw funds from my checking account,
  3. So that I can buy some bling.
 
 
 
Stories are not
  • "mini" Use Cases
  • a complete specification
  • a contract
  • intended to be interpreted without a Product Owner
 
User Stories guidelines
  • Testable - tangible acceptance tests can be written against any delivered software
  • Manageable - the scope of the User Story is small enough for the team to provide an estimate
  • Independent - does not rely on other Stories
  • Sized appropriately - the level of effort is something the team can comfortably achieve in the duration of a single iteration

GitHub for Web Designers

Unless you’re a one person web shop with no team to collaborate with, you've experienced the frustration that goes along with file sharing. No matter how hard you try, when multiple people are working on a single project without a version control system in place things get chaotic.

If you work with developers on the build-out and implementation of websites, the merge between front-end templates and back-end functionality can be a scary black hole.

Issues like overwrites, lost files, and the all-too-common “working off a previous version” phenomenon crop up constantly. And once back-end functionality has been put into your templates, you become terrified to touch them for fear of breaking something a developer spent a great deal of time getting to work.

In addition, even if you have a common repository that everyone is pulling from odds are at least one member of your team forgot to grab the latest files and is about to blow things up with their latest additions.

Fear not: Git, and GitHub, the hosting service built around it, are here to save the day. I’ll give you a quick review of both, an excellent version control setup.

Version Control – A Quick and Dirty Explanation

Version Control (also known as Revision Control or Source Control Management) is a great way to solve the file sharing problem.

The basic concept is this: there is one main repository for all of the project files. Team members check files out, make changes, and then check them back in (or commit them). The Version Control System (VCS) automatically notes who changed the files, when they were changed, and what about them was new or different.

It also asks you to write a little note about the change so everyone on the project knows at a glance what you did and why. Each file will then have a revision history so you can easily go back to a previous version of any file if something goes horribly wrong.
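The check-in cycle just described maps onto a handful of Git commands. Here's a minimal local walk-through (the file name and commit message are only examples):

```shell
# Start a repository and make a first commit
git init demo
cd demo
echo "<h1>Hello</h1>" > index.html
git add index.html
git -c user.name="Designer" -c user.email="designer@example.com" \
    commit -m "Add homepage"

# Each commit records who changed what, when, and your note about why
git log --oneline
```

In day-to-day team use you'd also run git pull to fetch everyone else's changes and git push to share your own.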

A good Version Control System also allows you to merge changes to the same file. If you and another person work locally on the same file at the same time, when you push these files back into the main repository the system will merge both sets of changes to create a new and fully up-to-date file. If any conflicts arise during the merge it will highlight them for you.

You’re probably using a very crude version control system right now to keep your files straight. If you’re a designer, it looks something like a folder full of files named file_v1.psd, file_v2.psd, file_final.psd, file_final_REALLY.psd.

This works well enough for PSDs and other large binary files, which don’t really lend themselves to version control. But there’s a much better way to do it when you are managing the source code for a website.

Benefits to using a Version Control System include:

  • Files cannot be overwritten
  • There is a common repository that holds all the latest files
  • People can work on the same files simultaneously without conflict
  • Allows you to revert back to an older version of the file/project if needed
  • Making your developers very happy

Even if you don’t work with a team, version control can be a lifesaver. Backing up files is one of the easiest things you can do to save yourself from losing work or having to start over.

The idea of a Version Control System seems daunting at first, especially since most of the documentation is written by and for developers. But once you make the move to incorporate it into your workflow, you’ll find it’s not nearly as hard as it looks.

Meet GitHub

OK, so now you can see why a Version Control System is a must-have for your web team. If you do a little Googling you’ll see that there are quite a few options out there including SVN, Mercurial, CVS, Bazaar and Git. Any one of them could be a good solution for your needs, and I encourage you to do some research before selecting a Version Control System. In this article I’m going to focus on Git, the one I use daily. It’s a “rising star” that has gained popularity thanks to a strong Linux fanbase, GitHub and the Rails community.

Git is a free, open-source Version Control System originally created by Linus Torvalds for Linux kernel development. Linus is a very smart guy; when he sets out to solve a problem, he doesn't mess around. One of Git’s big differentiators is that, unlike SVN and CVS, it is a distributed version control system. This means that every user has a complete copy of the repository data stored locally on their machine. What’s so great about that? A few things:

  • Everything is local, so you can work offline
  • There is no single point of failure. It doesn't rely on one central server that could crash and burn, taking the only repository for your project with it.
  • Because it doesn't have to communicate with a central server constantly, processes run much faster

Git has a slightly tougher learning curve than SVN, but the trade-off is worth it. Just think how impressed your developer friends will be when you tell them you’re using the new hotness that is Git! In all seriousness, I don’t think the learning curve is all that steep. 

Installing Git isn't all fun and games, but there are plenty of resources online to get you through it. It will run on a PC, Mac or Linux box, although installation on Linux and OS X is considerably easier than on Windows.

You can download the latest version of Git here. Once you have the files, try this quick guide to get you started with the installation process. For Windows users, this step-by-step visual guide should be helpful. Mac users, try this guide found on GitHub

Original Article

Google authentication get email from ASP.NET Identity

So how do you obtain the external user's email (the one authenticated by Google), first name and last name in an ASP.NET website using ASP.NET Identity?

var email = externalIdentity.FindFirstValue(ClaimTypes.Email);

and here is the extension method that reads the claim value:

public static string FindFirstValue(this ClaimsIdentity identity, string claimType)
{
    Claim claim = identity.FindFirst(claimType);
    if (claim != null)
    {
        return claim.Value;
    }
    return null;
}
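For context, the externalIdentity instance used above comes from the OWIN authentication manager. In a controller it can be obtained roughly like this (a sketch following the ASP.NET Identity 1.x samples; the action name is my own):

```csharp
public async Task<ActionResult> ExternalLoginCallback()
{
    // Read the identity issued by the external provider (e.g. Google)
    var externalIdentity = await HttpContext.GetOwinContext().Authentication
        .GetExternalIdentityAsync(DefaultAuthenticationTypes.ExternalCookie);

    var email = externalIdentity.FindFirstValue(ClaimTypes.Email);
    var firstName = externalIdentity.FindFirstValue(ClaimTypes.GivenName);
    var lastName = externalIdentity.FindFirstValue(ClaimTypes.Surname);

    // ... continue with sign-in or registration using these values
    return View();
}
```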

For more information take a look at Decoupling OWIN external authentication from ASP.NET Identity

WCF Attributes and what they all mean

The online MSDN documentation is your first source for things like this: the MSDN pages for ServiceContract, OperationContract, DataContract and DataMember each contain detailed explanations of all the settings on those contract attributes.

Active links on Bootstrap Navbar with ASP.NET MVC

If you have used Bootstrap with your ASP.NET MVC application, you might have faced some issues with implementing active links in its Navbar component. We have to dynamically add a CSS class called active to the particular menu item in order to make it selected in the Navbar. Here is an HtmlHelper extension which is capable of rendering menu items as well as drop-down menu items. I’ve used ASP.NET MVC 5 with Razor and Bootstrap 3.

The first step is to create an HtmlHelper extension class.

using System.Web.Mvc;

public static class MenuLinkExtension
{
    public static MvcHtmlString MenuLink(this HtmlHelper htmlHelper, string itemText, string actionName, string controllerName, MvcHtmlString[] childElements = null)
    {
        var currentAction = htmlHelper.ViewContext.RouteData.GetRequiredString("action");
        var currentController = htmlHelper.ViewContext.RouteData.GetRequiredString("controller");
        string finalHtml;
        var linkBuilder = new TagBuilder("a");
        var liBuilder = new TagBuilder("li");

        if (childElements != null && childElements.Length > 0)
        {
            linkBuilder.MergeAttribute("href", "#");
            linkBuilder.AddCssClass("dropdown-toggle");
            linkBuilder.InnerHtml = itemText + " <b class=\"caret\"></b>";
            linkBuilder.MergeAttribute("data-toggle", "dropdown");
            var ulBuilder = new TagBuilder("ul");
            ulBuilder.AddCssClass("dropdown-menu");
            ulBuilder.MergeAttribute("role", "menu");
            foreach (var item in childElements)
            {
                ulBuilder.InnerHtml += item + "\n";
            }

            liBuilder.InnerHtml = linkBuilder + "\n" + ulBuilder;
            liBuilder.AddCssClass("dropdown");
            if (controllerName == currentController)
            {
                liBuilder.AddCssClass("active");
            }

            // liBuilder already contains the drop-down markup, so emit it alone
            finalHtml = liBuilder.ToString();
        }
        else
        {
            var urlHelper = new UrlHelper(htmlHelper.ViewContext.RequestContext, htmlHelper.RouteCollection);
            linkBuilder.MergeAttribute("href", urlHelper.Action(actionName, controllerName));
            linkBuilder.SetInnerText(itemText);
            liBuilder.InnerHtml = linkBuilder.ToString();
            if (controllerName == currentController && actionName == currentAction)
            {
                liBuilder.AddCssClass("active");
            }

            finalHtml = liBuilder.ToString();
        }

        return new MvcHtmlString(finalHtml);
    }
}

Once you have saved it, you can use it by calling it like this:

<header class="navbar navbar-inverse navbar-fixed-top bs-docs-nav" role="banner">
    <div class="container">
        <div class="navbar-header">
            <a href="#" class="navbar-brand">Mvc Shop</a>
        </div>
        <nav role="navigation">
 
            <ul class="nav navbar-nav">
                @Html.MenuLink("Home", "Index", "Home")
                @Html.MenuLink("Dropdown", "Index", "Home2", new MvcHtmlString[]{
                                      @Html.MenuLink("Link1", "Action1", "Controller1"),
                                      @Html.MenuLink("Link2", "Action2", "Controller1"),
                                      @Html.MenuLink("Link3", "Action3", "Controller1"),
                                    })
                @Html.MenuLink("JavaScript", "Index", "Home1", new MvcHtmlString[]{
                                      @Html.MenuLink("Link1", "Index1", "Home1"),
                                      @Html.MenuLink("Link2", "Index2", "Home1"),
                                      @Html.MenuLink("Link3", "Index3", "Home1")
                                    })
 
            </ul>
        </nav>
    </div>
</header>

How to Rank your search results with multiple search terms using LINQ and EntityFramework

I have always used ranked search criteria; it's the only true way to get good results back from a data set.  Until now, though, I had been doing my ranking in C# code, which pulled all the data from the data source before ranking the results.  That is very poor for performance, as we should only return the results for the page size we require.

So the task at hand is to produce a LINQ statement that retrieves only the data you require.

Here is the solution:

var entity = new myEntities();

var searchTerm = "a b Ba";

var searchArray = searchTerm.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);

var usersAll = entity.User.AsExpandable().Where(TC_User.ContainsInLastName(searchArray));

Console.WriteLine("Total Records: {0}", usersAll.Count());

var users = usersAll
    .Select(x => new { 
        x.LastName, 
        Rank = searchArray.Sum(s => ((x.LastName.Length - x.LastName.Replace(s, "").Length) / s.Length)) });

var results = users.OrderByDescending(o => o.Rank)
    .Skip(0)
    .Take(20);

foreach (var user in results)
{
    Console.WriteLine("{0}, {1}", user.LastName, user.Rank);
}

Console.ReadLine();

You'll also need to add a new method to your User class to check that the search term is contained in the LastName:

public static Expression<Func<TC_User, bool>> ContainsInLastName(
                                                params string[] keywords)
{
    var predicate = PredicateBuilder.False<TC_User>();
    foreach (string keyword in keywords)
    {
        string temp = keyword;
        predicate = predicate.Or(p => p.LastName.Contains(temp));
    }
    return predicate;
}

One other requirement is LinqKit, available via NuGet, which supplies PredicateBuilder and AsExpandable.

Watch out: the result coming back from Sum is a bigint, so if you create a model for the results make sure the Rank property is a long.
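To see what the Rank expression computes, here it is in isolation: it counts the non-overlapping occurrences of a term by comparing the string's length before and after the term is removed.

```csharp
var value = "banana";
var term = "an";

// "banana".Replace("an", "") == "ba", so (6 - 2) / 2 == 2 occurrences
var count = (value.Length - value.Replace(term, "").Length) / term.Length;
```

Summing that count over every search term gives each row its rank, and because it is built from Length and Replace it translates to SQL and runs on the server.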

Backup SQL Azure

There are a number of options for backing up SQL Azure, which can be found here:

Different ways to Backup your Windows Azure SQL Database

I like the Azure way, which is just exporting, importing and setting up a scheduled export.

Before You Begin

The SQL Database Import/Export Service requires you to have a Windows Azure storage account because BACPAC files are stored here. For more information about creating a storage account, see How to Create a Storage Account. You must also create a container inside Blob storage for your BACPAC files by using a tool such as the Windows Azure Management Tool (MMC) or Azure Storage Explorer.

If you want to import an on-premise SQL Server database to Windows Azure SQL Database, first export your on-premise database to a BACPAC file, and then upload the BACPAC file to your Blob storage container.

If you want to export a database from Windows Azure SQL Database to an on-premise SQL Server, first export the database to a BACPAC file, transfer the BACPAC file to your local server (computer), and then import the BACPAC file to your on-premise SQL Server.

Export a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob has a container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the required storage account, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, your SQL Database server, and then your database that you want to export.

  6. On the ribbon, click Export. This opens the Export Database to Storage Account window.

  7. Verify that the Server Name and Database match the database that you want to export.

  8. In the Login and Password boxes, type the database credentials to be used for the export. Note that the account must be a server-level principal login - created by the provisioning process - or a member of the dbmanager database role.

  9. In New Blob URL box, specify the location where the exported BACPAC file is saved. Specify the location in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/exportedfile.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the export operation.

  10. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  11. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  12. Click Finish to start the export. You should see a message saying Your request was successfully submitted.

  13. After the export is complete, you should attempt to import your BACPAC file into a Windows Azure SQL Database server to verify that your exported package can be imported successfully.

Database export is an asynchronous operation. After starting the export, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Note: an export operation performs an individual bulk copy of the data from each table in the database, so it does not guarantee the transactional consistency of the data. You can use the Windows Azure SQL Database copy database feature to make a consistent copy of a database, and perform the export from the copy. For more information, see Copying Databases in Windows Azure SQL Database.
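For reference, the copy itself is a single T-SQL statement run against the master database of the server (the database names below are placeholders):

```sql
-- Run against the master database; the copy runs asynchronously
CREATE DATABASE MyDatabase_Copy AS COPY OF MyDatabase;

-- Check progress; state_desc becomes ONLINE when the copy completes
SELECT name, state_desc FROM sys.databases WHERE name = 'MyDatabase_Copy';
```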

Configure Automated Exports

Use the Windows Azure SQL Database Automated Export feature to schedule export operations for a SQL database, and to specify the storage account, frequency of export operations, and to set the retention period to store export files.

To configure automated export operations for a SQL database, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click the SQL database name you want to configure, and then click the Configuration tab.

  3. On the Automated Export work space, click Automatic, and then specify settings for the following parameters:

    • Storage Account

    • Frequency

      • Specify the export interval in days.

      • Specify the start date and time. The time value on the configuration work space is UTC time, so note the offset between UTC time and the time zone where your database is located.

    • Credentials for the server that hosts your SQL database. Note that the account must be a server-level principal login - created by the provisioning process - or a member of the dbmanager database role.

  4. When you have finished setting the export settings, click Save.

  5. You can see the time stamp for the last export under Automated Export in the Quick Glance section of the SQL Database Dashboard.

To change the settings for an automated export, select the SQL database, click the Configuration tab, make your changes, and then click Save.

Create a New SQL Database from an Existing Export File

Use the Windows Azure SQL Database Create from Export feature to create a new SQL database from an existing export file.

To create a new SQL database from an existing export file, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click a SQL database name and then click the Configuration tab.

  3. On the Create from Export work space, click New Database, and then specify settings for the following parameters:

    • Bacpac file name - this is the source file for your new SQL database

    • A name for the new SQL database

    • Server - this is the host server for your new SQL database

  4. To start the operation, click the checkmark at the bottom of the page.

Import and Export a Database Using API

You can also programmatically import and export databases by using an API. For more information, see the Import Export example on Codeplex.

Import a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob has a container, and the BACPAC file to be imported is available in the container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the storage account that contains the BACPAC file to be imported, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, and then your SQL Database server where you want to import the database.

  6. On the ribbon, click Import. This opens the Import Database from Storage Account window.

  7. Verify that the Target Server field lists the SQL Database server where the database is to be created.

  8. In the Login and Password boxes, type the database credentials to be used for the import.

  9. In the New Database Name box, type the name for the new database created by the import. This name must be unique on the SQL Database server and must comply with the SQL Server rules for identifiers. For more information, see Identifiers.

  10. From the Edition list, select whether the database is a Web or Business edition database.

  11. From the Maximum Size list, select the required size of the database. The list only specifies the values supported by the Edition you have selected.

  12. In the BACPAC URL box, type the full path of the BACPAC file that you want to import. Specify the path in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/file.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the import operation.

  13. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  14. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  15. Click Finish to start the import.

Database import is an asynchronous operation. After starting the import, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Original Article

About the author

You have probably figured out by now that my name is Bryan Avery (if not, please refer to your browser's address field).  Technology is more than a career to me - it is both a hobby and a passion.  I'm an ASP.NET/C# Developer at heart...
