Archive

Archive for October, 2015

Passing multiple/dynamic values to Stored Procedures & Functions | Part 5 – by passing JSON string

October 31, 2015 3 comments

This post is part of the [Passing multiple/dynamic values to Stored Procedures & Functions] series, and also covers the new Native JSON support feature in SQL Server 2016.

In this fifth part of the series we will use a JSON string that contains the set of values, and pass it as a JSON parameter variable to the SP. Inside the SP we will parse this JSON and use those values in our SQL queries, just like we did in previous posts with CSV/XML strings:

USE [AdventureWorks2014]
GO

-- Create an SP with NVARCHAR type parameter for JSON string:
CREATE PROCEDURE uspGetPersonDetailsJSON (
	@persons NVARCHAR(MAX)
)
AS
BEGIN
	--DECLARE @persons NVARCHAR(MAX)
	--SET @persons = '{"root":[{"Name":"Charles"},{"Name":"Jade"},{"Name":"Jim"},{"Name":"Luke"},{"Name":"Ken"}]}'
	
	SELECT Name
	INTO #tblPersons
	FROM OPENJSON (@persons, '$.root')
	WITH ( 
		Name NVARCHAR(100)
	)

	SELECT 
		BusinessEntityID, 
		Title, 
		FirstName, 
		MiddleName, 
		LastName, 
		ModifiedDate
	FROM [Person].[Person] PER
	WHERE EXISTS (
		SELECT * 
		FROM #tblPersons tmp 
		WHERE tmp.Name  = PER.FirstName
	)
	ORDER BY FirstName, LastName

	DROP TABLE #tblPersons
END
GO

-- Create JSON string:
DECLARE @json NVARCHAR(1000)
SET @json = N'{
  "root": [
    { "Name": "Charles" },
    { "Name": "Jade" },
    { "Name": "Jim" },
    { "Name": "Luke" },
    { "Name": "Ken" }
  ]
}'

-- Use the JSON string as the parameter while calling the SP:
EXEC uspGetPersonDetailsJSON @json
GO

-- Check the output, objective achieved

-- Final Cleanup
DROP PROCEDURE uspGetPersonDetailsJSON
GO

 

Thus you can use a JSON string, similar to the way you used an XML string, to pass multiple and a dynamic number of values to your Stored Procedures.

As native JSON support is new in SQL Server 2016, this method will only work with SQL Server 2016 and later versions.
 



Import/Read a JSON string and convert it in tabular (row/column) form | SQL Server 2016 – Part 2

October 30, 2015 6 comments

In my [previous post] I discussed how to format/convert any Query/Table data to a string in JSON format. Till the CTP 2 version, JSON support in SQL Server was limited to this export feature only.

But now, with the CTP 3 release, you can also do the reverse: read back JSON data and convert it to a tabular (row & column) format.
 

Let’s check this by taking the same sample data from our previous JSON-Export post.

DECLARE @json NVARCHAR(1000)
SELECT @json = N'{"StudList":
[  
   {  
      "ID":1,
      "FirstName":"Manoj",
      "LastName":"Pandey",
      "Class":10,
      "Marks":80.5
   },
   {  
      "ID":2,
      "FirstName":"Saurabh",
      "LastName":"Sharma",
      "Class":11,
      "Marks":82.7
   },
   {  
      "ID":3,
      "FirstName":"Kanchan",
      "LastName":"Pandey",
      "Class":10,
      "Marks":90.5
   },
   {  
      "ID":4,
      "FirstName":"Rajesh",
      "LastName":"Shah",
      "Class":11,
      "Marks":70.3
   },
   {  
      "ID":5,
      "FirstName":"Kunal",
      "LastName":"Joshi",
      "Class":12,
      "Marks":64.7
   }
]}'

SELECT ID, FirstName, LastName, Class, Marks
FROM OPENJSON (@json, '$.StudList')
WITH (
	ID INT, 
	FirstName VARCHAR(255), 
	LastName VARCHAR(255), 
	Class INT,
	Marks DECIMAL(3,1)
) AS StudList

-- Output:
[Image: SQL Server 2016 - JSON export 01 - query output in row/column form]

Well, that was simple, wasn’t it?
 


 

In my [next post] I’ll talk about Nested Properties in a JSON string with single and multiple elements.
 


SQL Server 2016 CTP 3 update is here | download now

October 29, 2015 Leave a comment

Microsoft yesterday (29-Oct-2015) announced the CTP 3 update of SQL Server 2016, following the initially released Community Technology Preview (CTP) 2.x versions.
 

[Register and Download the CTP 3 Evaluation version (180 days) here]
 

–> Direct download link (~2.3 GB):

– Download the single ISO: SQLServer2016CTP3.0-x64-ENU.iso

– Or download both EXE & BOX files:
– – Box file SQLServer2016-x64-ENU.box
– – EXE file SQLServer2016-x64-ENU.exe

 

–> Check version and SQL build:

select @@version

Microsoft SQL Server 2016 (CTP3) – 13.0.700.139 (X64) Oct 7 2015 04:48:43 Copyright (c) Microsoft Corporation Enterprise Evaluation Edition (64-bit) on Windows 10 Enterprise 6.3 (Build 10240: )
 

–> The Major feature in this preview is the “RRE Integration”:

This is the advanced Revolution Analytics R integration, available as SQL Server R Services within SQL Server 2016. “R” is the most popular programming language for Advanced Analytics. You can use it to analyze data, uncover patterns and trends, and build predictive models. It offers an incredibly rich set of packages and a vibrant and fast-growing developer community.

SQL Server R Services with in-database analytics will help with:

– Data Exploration and Predictive Modeling with R over SQL Server data

– Operationalizing your R code using T-SQL

While installing SQL Server 2016 you will get an option to integrate R; the setup mentions “Includes Advanced Analytics Extensions that enables integration with R language using standard T-SQL statements.”, as shown below:

[Image: SQL Server 2016 setup - R - Advanced Analytics option]

Post installation, you need to go through these steps to configure “R” with SQL Server: [link].
 

–> Another major feature is “Transactional Replication from SQL Server to Azure SQL DB”

Now you can set up an Azure SQL DB as a subscriber of transactional replication, allowing you to migrate data from a SQL Server instance on-premises or in IaaS to an Azure SQL database without downtime.
 

–> With this release the SQL Server team has added and improved a lot of features. Some of the important enhancements and issues fixed in this release are:
 

1. In-Memory improvements: [read more]

– Support for FOREIGN KEY, CHECK and UNIQUE constraints.

– Inline table-valued functions.

– Existing NonClustered ColumnStore indexes are updateable without requiring index rebuild.

– Parallel Index build for NonClustered ColumnStore indexes.

– Performance improvements (INSERT, String pushdown, bypassing delete buffer when processing deleted rows).
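
For example, a memory-optimized table can now carry FOREIGN KEY and CHECK constraints. A minimal sketch (assuming the database already has a MEMORY_OPTIMIZED_DATA filegroup and dbo.Orders is itself a memory-optimized table; all names here are illustrative):

CREATE TABLE dbo.OrderItem (
	OrderItemID INT NOT NULL PRIMARY KEY NONCLUSTERED,
	OrderID     INT NOT NULL
	            FOREIGN KEY REFERENCES dbo.Orders (OrderID), -- FK now supported in-memory
	Qty         INT NOT NULL CHECK (Qty > 0)                 -- CHECK now supported in-memory
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)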
 

2. Stretch Database improvements: [read more]

– Create/Drop index support

– Improved credential management for remote Stretch database stores.

– Improved performance for joins between stretched tables.

– SSMS GUI updates:
– – Database and Table level fly out menu options.
– – Stretch monitor functionality added to allow users to monitor current migration status, including the ability to pause the migration at the table level.
– – Added ability to Stretch individual tables, and added option to Stretch to new SQL Azure or existing SQL Azure server.
 

3. JSON improvements: [read more]

– OPENJSON() table-valued function, parses JSON text and returns a rowset.

– JSON_VALUE() scalar function, returns a scalar value from JSON at the specified path.

– ISJSON() function, validates that the JSON text is properly formatted.

– JSON_QUERY() function, returns an object or array fragment from the JSON text.
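
Here is a quick sketch of these functions in action (the JSON string and names below are just for illustration):

DECLARE @json NVARCHAR(200) = N'{"Student":{"Name":"Manoj","Marks":[80.5,90.2]}}'

SELECT ISJSON(@json) AS IsValidJSON                          -- returns 1
SELECT JSON_VALUE(@json, '$.Student.Name') AS StudentName    -- returns Manoj
SELECT JSON_QUERY(@json, '$.Student.Marks') AS MarksArray    -- returns [80.5,90.2]
SELECT [key], value FROM OPENJSON(@json, '$.Student.Marks')  -- one row per array element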
 

4. Temporal improvements: [read more]

– Support for using Temporal with In-Memory OLTP, combining disk-based tables for cost-effective storage of history data with memory-optimized tables for storing the latest (current) data.

– Direct ALTER for system-versioned temporal tables.

– New clause FOR SYSTEM_TIME ALL for querying entire data history easily without specifying period boundaries.
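
For example, a minimal sketch (assuming a system-versioned temporal table dbo.Employee with period columns ValidFrom/ValidTo; the table and column names are illustrative):

-- Entire history of a row (current + all historical versions),
-- with no period boundaries to specify:
SELECT EmployeeID, Name, ValidFrom, ValidTo
FROM dbo.Employee
	FOR SYSTEM_TIME ALL
WHERE EmployeeID = 101
ORDER BY ValidFrom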

 

5. Row-Level Security (RLS) with Block Predicates: [read more]

– Preventing users from Inserting, Updating, and/or Deleting rows that violate the predicate.

– Security policies can now be created with SCHEMABINDING = OFF.
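
A minimal sketch of a security policy with a block predicate (the function and table names here are illustrative):

-- Predicate function: a row qualifies only for its own SalesRep login
CREATE FUNCTION dbo.fn_SalesPredicate (@SalesRep AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result WHERE @SalesRep = USER_NAME()
GO

-- Filter reads, and block INSERTs that violate the predicate:
CREATE SECURITY POLICY dbo.SalesPolicy
ADD FILTER PREDICATE dbo.fn_SalesPredicate(SalesRep) ON dbo.Sales,
ADD BLOCK PREDICATE dbo.fn_SalesPredicate(SalesRep) ON dbo.Sales AFTER INSERT
GO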
 

6. Query Store improvements [read more]

– Performance monitoring supported for In-Memory OLTP workloads.
 

–> Integration Services (SSIS) improvements:

– SSIS control flow template enables customers to save a commonly used control flow task or container to a standalone template file and reuse it multiple times in a package or multiple packages in a project.

– Added Azure blob source support for the Import/Export wizard.

– Relaxed Max Buffer Size of Data Flow Task.
 

–> Analysis Services (SSAS) improvements:

– DBCC support

– SSMS and SSDT updates for Tabular

– Check more at SSAS team blog.
 

–> Reporting Services (SSRS) improvements:

– Pin Reporting Services report items.

– Check more at SSRS team blog.
 

I’ve covered only a few important points here; you can check all the updates in detail on the [SQL Server blog for CTP 3].
 

For all these new features released in SQL Server 2016, check my blog posts here.
 

So, download the Preview today and start playing with the new features and plan your DB migration/upgrade.
 


 

Check & Like my FB Page.
 


Code Review checklist for SQL Server “Stored Procedures” & T-SQL Scripts

October 25, 2015 Leave a comment

 
While working on an official Database or DW/BI project, we want to make sure that the code written by team members is correct, as per standards, commented, and beautifully indented. But everyone has a different coding style; some are good, and some are not so good, I won’t say bad :). So, to make sure everyone’s code is correct, we/you introduce a process of Code Reviews.
 

Here in this post I’ll share some best coding practices and good-to-have things that will help others better understand your code. The below checklist features some of them:
 

  1. Before sending any item for Code Review, please make sure to do the following:
          – Compile the Stored Procedures, Views, and T-SQL Scripts.
          – For an SSRS report, check the Stored Procedure by running it from the Report, and verify that you are getting the desired output and that it meets the requirement.
          – If it is an Integration SP, run it for Full Load and Delta Load, and see if you are getting the expected functionality.

 

  2. Before doing the check-ins, ensure you do a Get Latest and resolve all the conflicts.

 

  3. Use TRY-CATCH to handle exceptions, and the THROW statement to re-raise the error inside the CATCH block (see the sketch after the next point).

 

  4. Use the “;THROW” statement instead of the RAISERROR() syntax. This aligns with future SQL Server versions (> 2012), as RAISERROR is on the deprecation path, and it reduces multiple lines of code to just one.
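
A minimal sketch of points 3 & 4 together (the table and column names here are just illustrative):

BEGIN TRY
	BEGIN TRAN
	UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountID = 1
	UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE AccountID = 2
	COMMIT TRAN
END TRY
BEGIN CATCH
	IF @@TRANCOUNT > 0
		ROLLBACK TRAN;
	THROW	-- re-raises the original error with its original number, severity & message
END CATCH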

 

  5. Do not use TRANSACTIONs while creating Reporting or read-only Stored Procedures. Use them only in Integration SPs, when more than one DML statement must succeed or fail together.

 

  6. Do not include #Table processing within TRANSACTIONs, and try to keep transactions as small as possible.

 

  7. Do not JOIN/APPLY UDFs directly with other tables; first extract and store their records in #Tables, then JOIN the #Table with the other tables. Otherwise it may kill performance.
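
For example (the UDF and table names below are hypothetical):

-- Instead of: SELECT ... FROM dbo.Orders o CROSS APPLY dbo.fn_GetOrderTax(o.OrderID) t

-- First materialize the UDF output into a #Table:
SELECT OrderID, TaxAmount
INTO #OrderTax
FROM dbo.fn_GetAllOrderTax()	-- hypothetical table-valued function

-- Then JOIN the #Table with the other tables:
SELECT o.OrderID, o.OrderDate, t.TaxAmount
FROM dbo.Orders o
JOIN #OrderTax t ON t.OrderID = o.OrderID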

 

  8. Use “WHERE EXISTS (SELECT * FROM <table>)” while evaluating sub-queries, and use “IN()” only while dealing with constant values.
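
For example (using AdventureWorks’ Person table; #tblPersons is a temp table of names, as in the first post above):

-- Evaluating a sub-query: prefer EXISTS
SELECT FirstName, LastName
FROM Person.Person PER
WHERE EXISTS (SELECT * FROM #tblPersons tmp WHERE tmp.Name = PER.FirstName)

-- Constant values: IN() is fine
SELECT FirstName, LastName
FROM Person.Person
WHERE FirstName IN ('Charles', 'Jade', 'Ken')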

 

  9. Use COALESCE() instead of ISNULL(); it’s faster and fail-proof.

 

  10. Never use ISNULL() on the BIT data type, as it only accepts 1/0/NULL as possible values.

 

  11. While comparing data against a NULLABLE column, ensure COALESCE() is always used to handle records which in fact have NULLs.

 

  12. Don’t use ISNUMERIC(); use TRY_PARSE() instead.
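
For example, ISNUMERIC() gives a false positive for strings that are not clean numbers, while TRY_PARSE() simply returns NULL when the conversion fails:

SELECT ISNUMERIC('$1,000') AS IsNum			-- returns 1, though it is not a plain number
SELECT TRY_PARSE('$1,000' AS INT) AS TryInt		-- returns NULL instead of raising an error
SELECT TRY_PARSE('1000' AS INT) AS TryInt2		-- returns 1000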

 

  13. Do not mix DDLs & DMLs in a Stored Procedure; put the DDLs first in the top section of the SP, then the DMLs in the section below. Otherwise it leads to recompilation of the SP on every execution, and the optimal cached plan is not used.

 

  14. For temporary storage, use @Table variables for small sets of records and #Tables for larger sets of data. For example, to store parameter values, use @Table variables.

 

  15. Always use the SET NOCOUNT ON statement at the beginning of an SP.

 

  16. Use CTEs instead of sub-queries for better code manageability and readability.

 

  17. Handle divide-by-zero errors for columns/variables occurring in a denominator.
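
For example, wrap the denominator with NULLIF() so a zero becomes NULL, then substitute a default with COALESCE():

DECLARE @Numerator DECIMAL(10,2) = 50, @Denominator DECIMAL(10,2) = 0

SELECT COALESCE(@Numerator / NULLIF(@Denominator, 0), 0) AS SafeRatio	-- returns 0 instead of a divide-by-zero error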

 

  18. In a WHERE clause, put OR conditions within brackets; otherwise they will conflict with other AND conditions and may behave differently.

 

  19. In a WHERE clause, use columns from a LEFT JOINed table very carefully, as a condition on them may convert the LEFT JOIN into an INNER JOIN or behave differently.
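
For example (Customers/Orders here are illustrative), a filter on the LEFT JOINed table in the WHERE clause silently drops the unmatched (NULL) rows; moving the condition into the ON clause preserves the LEFT JOIN semantics:

-- Behaves like an INNER JOIN: customers with no orders are filtered out
SELECT c.CustomerID, o.OrderID
FROM dbo.Customers c
LEFT JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
WHERE o.Status = 'Open'

-- Keeps all customers; the condition only restricts which orders match
SELECT c.CustomerID, o.OrderID
FROM dbo.Customers c
LEFT JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
	AND o.Status = 'Open'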

 

  20. Do not include DROP & CREATE statements in the CREATE TABLE script; they should be added only in the deployment scripts.

 

  21. While designing new tables, ensure most of the columns are created as NOT NULL.

 

  22. Please do not use “SELECT *” while creating Views or in SPs; instead, select only the required columns. Even if all columns are required, list them all out instead of using “SELECT *”.

 

  23. Do not use the USE <db_name> and GO keywords in DB object (SP, View, Table, etc.) scripts; they should be used only in custom Pre/Post SQL scripts.

 

  24. While CREATING or ALTERING any DB object, check for its existence first by using an IF NOT EXISTS condition, like:

  IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'TABLE_NAME' AND TABLE_SCHEMA = 'SCHEMA_NAME')
  BEGIN
    CREATE TABLE dbo.TABLE_NAME (…)
  END
 

  25. While sending Code Reviews for SSRS RDLs, also mention the URL where the reports are deployed and the changes done, with steps to populate the required params and the expected O/P.

 

I would also like to hear from you: what other things do you ensure as part of your Code Review process?
 


Categories: SQL Tips

Store file data to SQL without streaming on server – MSDN TSQL forum

October 14, 2015 1 comment

–> Question:

I need to store file(s) to SQL without streaming / reading at Server. I have created a Web API with AngularJS and SQL.

e.g.

var fileType = httpRequest.Files[file].ContentType;
var fileStrm = httpRequest.Files[file].InputStream;
var fileSize = httpRequest.Files[file].ContentLength;
byte[] fileRcrd = new byte[fileSize];
var file_Name = Path.GetFileName(filePath);
fileStrm.Read(fileRcrd, 0, fileSize);

Is it possible to send file data to SQL (in bytes) without streaming / reading at server?

I don’t want to put a load on the server for large files; just read the data and send it to SQL, where SQL will do the streaming and store the data as varbinary.
 

–> Answer:

Store the file in the File System via the FileTable feature, which uses FILESTREAM out of the box.

Check this blog on how to set up and use FileTables, available starting with SQL Server 2012.

You can also set up Full-Text Search over these files, here.
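
A minimal sketch of creating a FileTable (assumes FILESTREAM is already enabled on the instance and the database has a FILESTREAM filegroup; the database, directory, and table names below are illustrative):

-- Allow non-transacted (file-system) access and set the root directory:
ALTER DATABASE MyDocsDB
SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'MyDocs')
GO

-- Create the FileTable; files dropped into its Windows share appear here as rows:
USE MyDocsDB
GO
CREATE TABLE dbo.DocumentStore AS FILETABLE
WITH (FILETABLE_DIRECTORY = 'DocumentStore')
GO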
 

Ref link.