Category Archives: SQL

SQL Record Count

The fastest way to get a record count for a SQL Server table is to read it from the catalog views (sys.partitions) rather than scanning the table:

DECLARE @TableName sysname
SET @TableName = 'Log'
 
SELECT TBL.object_id, TBL.name, SUM(PART.rows) AS rows
FROM sys.tables TBL
INNER JOIN sys.partitions PART ON TBL.object_id = PART.object_id
INNER JOIN sys.indexes IDX ON PART.object_id = IDX.object_id
AND PART.index_id = IDX.index_id
WHERE TBL.name = @TableName
AND IDX.index_id < 2
GROUP BY TBL.object_id, TBL.name;
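
For comparison, a plain COUNT(*) returns the same figure but has to scan the table or one of its indexes, which is why the catalog query above is faster on large tables. A minimal check, assuming the Log table from the example lives in the dbo schema:

SELECT COUNT(*) AS [rows]
FROM dbo.[Log];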

How to export all tables to CSV in one export job

If you just want to export all the tables in a SQL database into separate CSV files, then this is a quick and easy way of doing it.

  1. Execute the query below, which generates one BCP command per table:

    SELECT 'bcp ' + st.NAME + ' out c:\Target\' + st.NAME + '.csv -c -r -d ' + DB_NAME() + ' -U user@??????.database.windows.net -S tcp:?????.database.windows.net -P ??????' FROM sys.tables st
  2. Paste the result set into a text file, save it as a batch file, and schedule it (it can also be run manually from a command prompt); an example of the generated commands is shown below.
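
Each row in the result set is a complete bcp command line. With the server, user and password placeholders left exactly as in the query above, a generated line would look roughly like this (a sketch; the Customer table name and MyDatabase are illustrative stand-ins for your own table names and for whatever DB_NAME() returns):

bcp Customer out c:\Target\Customer.csv -c -r -d MyDatabase -U user@??????.database.windows.net -S tcp:?????.database.windows.net -P ??????

Saving one such line per table into a .bat file gives you the single export job to schedule.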

 

Search every column in a SQL Server Database

Following on from the Search every table and field in a SQL Server Database article I posted last week, I thought it would also be useful to be able to search every column name for some text and return the tables in which it was found.

Here is the script I have produced, which does the job and responds quickly.

CREATE PROC [SearchAllColumns]
(
 @SearchStr nvarchar(100)
)
AS
 
BEGIN
 
 SELECT COLUMN_NAME AS 'ColumnName',
 TABLE_NAME AS 'TableName'
 FROM INFORMATION_SCHEMA.COLUMNS
 WHERE COLUMN_NAME LIKE '%' + @SearchStr + '%'
 ORDER BY TableName,
 ColumnName;

END
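
Once the procedure has been created, call it with the text to look for in column names; a minimal example (the search string 'Email' is illustrative):

EXEC [SearchAllColumns] 'Email';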

 

Search every table and field in a SQL Server Database

One thing I do like about MySQL, when using phpMyAdmin, is the ability to search the whole database for any string; this should only be available to developers who are searching for data in an extensive database. It does take some time to run, but remember it has a lot to do.

Here is the script that Narayana Vyas Kondreddi produced, which does the trick and works very well.

CREATE PROC SearchAllTables
(
 @SearchStr nvarchar(100)
)
AS
 
BEGIN
 
 -- Copyright © 2002 Narayana Vyas Kondreddi. All rights reserved.
 -- Purpose: To search all columns of all tables for a given search string
 -- Written by: Narayana Vyas Kondreddi
 -- Site: http://vyaskn.tripod.com
 -- Updated and tested by Tim Gaunt
 -- http://www.thesitedoctor.co.uk
 -- http://blogs.thesitedoctor.co.uk/tim/2010/02/19/Search+Every+Table+And+Field+In+A+SQL+Server+Database+Updated.aspx
 -- Tested on: SQL Server 7.0, SQL Server 2000, SQL Server 2005 and SQL Server 2010
 -- Date modified: 03rd March 2011 19:00 GMT
 CREATE TABLE #Results (ColumnName nvarchar(370), ColumnValue nvarchar(3630))
 
 SET NOCOUNT ON
 
 DECLARE @TableName nvarchar(256), @ColumnName nvarchar(128), @SearchStr2 nvarchar(110)
 SET @TableName = ''
 SET @SearchStr2 = QUOTENAME('%' + @SearchStr + '%','''')
 
 WHILE @TableName IS NOT NULL
 BEGIN
     SET @ColumnName = ''
     SET @TableName =
     (
         SELECT MIN(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME))
         FROM INFORMATION_SCHEMA.TABLES
         WHERE TABLE_TYPE = 'BASE TABLE'
         AND QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) > @TableName
         AND OBJECTPROPERTY(
                 OBJECT_ID(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)),
                 'IsMSShipped'
             ) = 0
     )
 
     WHILE (@TableName IS NOT NULL) AND (@ColumnName IS NOT NULL)
     BEGIN
         SET @ColumnName =
         (
             SELECT MIN(QUOTENAME(COLUMN_NAME))
             FROM INFORMATION_SCHEMA.COLUMNS
             WHERE TABLE_SCHEMA = PARSENAME(@TableName, 2)
             AND TABLE_NAME = PARSENAME(@TableName, 1)
             AND DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar', 'int', 'decimal')
             AND QUOTENAME(COLUMN_NAME) > @ColumnName
         )
 
         IF @ColumnName IS NOT NULL
         BEGIN
             INSERT INTO #Results
             EXEC
             (
                 'SELECT ''' + @TableName + '.' + @ColumnName + ''', LEFT(' + @ColumnName + ', 3630) FROM ' + @TableName + ' (NOLOCK) ' +
                 ' WHERE ' + @ColumnName + ' LIKE ' + @SearchStr2
             )
         END
     END
 END
 
 SELECT ColumnName, ColumnValue FROM #Results
 DROP TABLE #Results
END
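
Usage is the same as for the column search above; a minimal example (the search string 'smith' is illustrative):

EXEC SearchAllTables 'smith'

The result set lists each table.column in which the string was found, together with the matching value (truncated to 3630 characters).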

Original post

Database Layer

The DatabaseLayer is an abstraction for database access, which means the calling application or library does not need to be tightly coupled to the database itself. A set of factory methods is available for creating commands and parameters, all based on the .NET abstract base classes.

Currently supported engines are SQL Server and MySQL, represented by “sql” and “mysql” in your web.config or app.config.

The DatabaseLayer does not use a provider pattern; instead, an argument is passed to the factory to tell the library which database engine to use. An example is shown below. It is preferable to define a single constant for the database type and share it across the places where it is used.

The example below lazily loads the Database instance and uses AppSettings and ConnectionStrings from the web.config or app.config file; a sketch of the matching configuration follows the code.

private static Database database;

// Requires a reference to System.Configuration for ConfigurationManager
internal static Database GetDb()
{
    // Lazily create the shared Database instance on first use
    if (database == null)
    {
        database = Database.GetDatabase(
            ConfigurationManager.AppSettings["databasetype"],
            ConfigurationManager.ConnectionStrings["myconnection"].ConnectionString);
    }

    return database;
}
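
A minimal sketch of the matching configuration, assuming the "databasetype" key and "myconnection" connection-string name used above, the "sql" engine, and an illustrative connection string:

<appSettings>
  <add key="databasetype" value="sql" />
</appSettings>
<connectionStrings>
  <add name="myconnection" connectionString="Server=.;Database=MyDatabase;Integrated Security=True" />
</connectionStrings>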

DatabaseLayer solution

Connect Visual Studio Server Explorer to LocalDB

I can't believe I couldn't find a working solution to this after an hour of searching. I'm following this article on Entity Framework 6.0, which gives a simple walk-through on Code First. I created the project and installed the latest EF NuGet package so that the project compiles. I also verified that I have Microsoft SQL Server 2012 Express LocalDB installed, which came with Visual Studio 2013. I don't have any other instances of SQL Server installed on my local computer. The program runs, and entries are added to the database and output in the console. But when the article says "check your LocalDB" it doesn't say how! I don't see any .mdf or .ldf files created under the project folder. I tried every way to connect Visual Studio's Server Explorer to LocalDB. The wizard cannot locate (localdb), and no provider in Server Explorer accepts a connection string like (localdb)\v11.0;Integrated Security=true;. I've seen this asked in several places on Stack Overflow, but no answer works or is marked as the answer. Please help, this doesn't have to be this frustrating!

What are the steps to connect Visual Studio Server Explorer to LocalDB?

Steps to connect LocalDB to Visual Studio Server Explorer

  1. Open command prompt
  2. Run SqlLocalDB.exe start v11.0
  3. Run SqlLocalDB.exe info v11.0
  4. Copy the Instance pipe name that starts with np:\…
  5. In Visual Studio select TOOLS > Connect to Database…
  6. For Server Name, enter (localdb)\v11.0. If that doesn't work, use the Instance pipe name you copied earlier. You can also use this to connect with SQL Server Management Studio, or from a config file as sketched after these steps.
  7. Select the database on next dropdown list
  8. Click OK
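
If you then want to reference the same instance from application code, a connection string along these lines should work (a sketch, assuming the default v11.0 instance and an illustrative database name):

<connectionStrings>
  <add name="MyDatabase"
       connectionString="Data Source=(localdb)\v11.0;Initial Catalog=MyDatabase;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>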

How to Rank your search results with multiple search terms using LINQ and EntityFramework

I have always used ranked search criteria; it's the only true way to get good results back from a data set. Until now, though, I had been doing the ranking in C# code, which pulled all the data from the data source and then ranked the results. That performs very poorly, as we should only be returning the results for the page size we require.

So the task at hand is to produce a LINQ statement that retrieves only the data you require.

Here is the solution:

var entity = new myEntities();

var searchTerm = "a b Ba";

var searchArray = searchTerm.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);

var usersAll = entity.User.AsExpandable().Where(TC_User.ContainsInLastName(searchArray));

Console.WriteLine("Total Records: {0}", usersAll.Count());

var users = usersAll
    .Select(x => new { 
        x.LastName, 
        Rank = searchArray.Sum(s => ((x.LastName.Length - x.LastName.Replace(s, "").Length) / s.Length)) });

var results = users.OrderByDescending(o => o.Rank)
    .Skip(0)
    .Take(20);

foreach (var user in results)
{
    Console.WriteLine("{0}, {1}", user.LastName, user.Rank);
}

Console.ReadLine();

You'll also need to add a new method to your User class to check whether the search term is contained in the LastName:

public static Expression<Func<TC_User, bool>> ContainsInLastName(
                                                params string[] keywords)
{
    var predicate = PredicateBuilder.False<TC_User>();
    foreach (string keyword in keywords)
    {
        string temp = keyword;
        predicate = predicate.Or(p => p.LastName.Contains(temp));
    }
    return predicate;
}

One thing that is required is LinqKit, available via NuGet, which provides PredicateBuilder and AsExpandable.

Watch out: the result coming back from Sum is a bigint, so if you create a model to return the rank, make sure the property is a long.

Backup SQL Azure

There are a number of options for backing up SQL Azure, which can be found here:

Different ways to Backup your Windows Azure SQL Database

I like the Azure way, which is simply exporting, importing, and setting up a scheduled export.

Before You Begin

The SQL Database Import/Export Service requires you to have a Windows Azure storage account, because BACPAC files are stored there. For more information about creating a storage account, see How to Create a Storage Account. You must also create a container inside Blob storage for your BACPAC files by using a tool such as the Windows Azure Management Tool (MMC) or Azure Storage Explorer.

If you want to import an on-premise SQL Server database to Windows Azure SQL Database, first export your on-premise database to a BACPAC file, and then upload the BACPAC file to your Blob storage container.

If you want to export a database from Windows Azure SQL Database to an on-premise SQL Server, first export the database to a BACPAC file, transfer the BACPAC file to your local server (computer), and then import the BACPAC file to your on-premise SQL Server.

Export a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob storage account has a container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the required storage account, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, your SQL Database server, and then your database that you want to export.

  6. On the ribbon, click Export. This opens the Export Database to Storage Account window.

  7. Verify that the Server Name and Database match the database that you want to export.

  8. In the Login and Password boxes, type the database credentials to be used for the export. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.

  9. In New Blob URL box, specify the location where the exported BACPAC file is saved. Specify the location in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/exportedfile.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the export operation.

  10. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  11. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  12. Click Finish to start the export. You should see a message saying Your request was successfully submitted.

  13. After the export is complete, you should attempt to import your BACPAC file into a Windows Azure SQL Database server to verify that your exported package can be imported successfully.

Database export is an asynchronous operation. After starting the export, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Note
An export operation performs an individual bulk copy of the data from each table in the database, so it does not guarantee the transactional consistency of the data. You can use the Windows Azure SQL Database copy database feature to make a transactionally consistent copy of the database, and perform the export from the copy; such a copy can be created with the T-SQL sketched below. For more information, see Copying Databases in Windows Azure SQL Database.
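
A minimal sketch of creating such a copy on the same server, run against the master database (the MyDatabase names are illustrative):

CREATE DATABASE MyDatabase_copy AS COPY OF MyDatabase;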

Configure Automated Exports

Use the Windows Azure SQL Database Automated Export feature to schedule export operations for a SQL database, and to specify the storage account, frequency of export operations, and to set the retention period to store export files.

To configure automated export operations for a SQL database, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click the SQL database name you want to configure, and then click the Configuration tab.

  3. On the Automated Export work space, click Automatic, and then specify settings for the following parameters:

    • Storage Account
    • Frequency
      • Specify the export interval in days.
      • Specify the start date and time. The time value on the configuration work space is UTC time, so note the offset between UTC time and the time zone where your database is located.
    • Credentials for the server that hosts your SQL database. Note that the account must be a server-level principal login – created by the provisioning process – or a member of the dbmanager database role.
  4. When you have finished setting the export settings, click Save.

  5. You can see the time stamp for the last export under Automated Export in the Quick Glance section of the SQL Database Dashboard.

To change the settings for an automated export, select the SQL database, click the Configuration tab, make your changes, and then click Save.

Create a New SQL Database from an Existing Export File

Use the Windows Azure SQL Database Create from Export feature to create a new SQL database from an existing export file.

To create a new SQL database from an existing export file, use the following steps:

  1. Log on to the Windows Azure Platform Management Portal.

  2. Click a SQL database name and then click the Configuration tab.

  3. On the Create from Export work space, click New Database, and then specify settings for the following parameters:

    • Bacpac file name – This is the source file for your new SQL database.
    • A name for the new SQL database.
    • Server – This is the host server for your new SQL database.

  4. To start the operation, click the checkmark at the bottom of the page.

Import and Export a Database Using API

You can also programmatically import and export databases by using an API. For more information, see the Import Export example on Codeplex.

Import a Database

  1. Using one of the tools listed in the Before You Begin section, ensure that your Blob storage account has a container and that the BACPAC file to be imported is available in the container.

  2. Log on to the Windows Azure Platform Management Portal.

  3. In the navigation pane, click Hosted Services, Storage Accounts & CDN, and then click Storage Accounts. Your storage accounts display in the center pane.

  4. Select the storage account that contains the BACPAC file to be imported, and make a note of the following values from the right pane: Primary access key and BLOB URL. You will have to specify these values later in this procedure.

  5. In the navigation pane, click Database. Next, select the subscription, and then your SQL Database server where you want to import the database.

  6. On the ribbon, click Import. This opens the Import Database from Storage Account window.

  7. Verify that the Target Server field lists the SQL Database server where the database is to be created.

  8. In the Login and Password boxes, type the database credentials to be used for the import.

  9. In the New Database Name box, type the name for the new database created by the import. This name must be unique on the SQL Database server and must comply with the SQL Server rules for identifiers. For more information, see Identifiers.

  10. From the Edition list, select whether the database is a Web or Business edition database.

  11. From the Maximum Size list, select the required size of the database. The list only specifies the values supported by the Edition you have selected.

  12. In the BACPAC URL box, type the full path of the BACPAC file that you want to import. Specify the path in the following format: “https://” + Blob URL (as noted in step 4) + “/<container_name>/<file_name>”. For example: https://myblobstorage.blob.core.windows.net/dac/file.bacpac. The Blob URL must be in lowercase without any special characters. If you do not supply the .bacpac suffix, it is applied by the import operation.

  13. In the Access Key box, type the storage access key or shared access key that you made a note of in step 4.

  14. From the Key Type list, select the type that matches the key entered in the Access Key box: either a Storage Access Key or a Shared Access Key.

  15. Click Finish to start the import.

Database import is an asynchronous operation. After starting the import, you can use the Import Export Request Status window to track the progress. For information, see How to: View Import and Export Status of Database (Windows Azure SQL Database).

Original Article