
Store JSON data in a table, OPENJSON and JSON_Value functions | SQL Server 2016 – Part 4

November 2, 2015

In my previous posts I talked about how to [export] Table or Query data to a JSON string, how to [read it back] from a JSON string into relational-table format, and how to handle [nested elements].
 

Here, in this post, I'll show how you can store JSON data in a normal table column, just like you store XML data.

XML data is stored in a column of the XML datatype, which also checks the validity of the XML being stored. For JSON, however, no new datatype has been introduced: JSON is stored in an NVARCHAR column just like plain text, and to validate it you can add a CHECK constraint on the column.

IsJSON() function: can be used in a CHECK constraint on columns that contain JSON, to validate whether the JSON string is properly formatted or not.
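For example, here is a quick, minimal sketch of how ISJSON() behaves on its own (illustrative values):

SELECT ISJSON(N'{"Name":"Manoj"}')	-- returns 1: properly formatted JSON
SELECT ISJSON(N'Some plain text')	-- returns 0: not valid JSON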
 

As we will need the AdventureWorks2014 sample database in our example below, we need to upgrade its compatibility level from SQL Server 2014 to SQL Server 2016, i.e. from level 120 to 130, like:

USE [master]
GO

ALTER DATABASE [AdventureWorks2014] SET COMPATIBILITY_LEVEL = 130
GO

You can download the AdventureWorks2014 sample database from Microsoft's [CodePlex site].
 

–> Ok, now let's create a new Table with an OrderDetailsJSON column for storing JSON strings, with a CHECK constraint on it:

USE [AdventureWorks2014]
GO

CREATE TABLE CustomerOrder (
	BusinessEntityID INT, 
	FirstName NVARCHAR(50), 
	MiddleName NVARCHAR(50), 
	LastName NVARCHAR(50), 
	EmailPromotion INT,

	OrderDetailsJSON NVARCHAR(MAX) -- normal column with NVARCHAR datatype
		CHECK ( IsJSON ( OrderDetailsJSON ) = 1 ) -- CHECK Constraint to validate JSON string
)
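
To see the constraint in action, here is a quick illustrative check (hypothetical values; the cleanup DELETE keeps the row counts below unaffected):

-- valid JSON string, passes the CHECK constraint:
INSERT INTO CustomerOrder (BusinessEntityID, FirstName, OrderDetailsJSON)
VALUES (1, N'Test', N'[{"SalesOrderID":1,"TotalDue":100.50}]')

-- malformed JSON string, fails with a CHECK constraint violation:
INSERT INTO CustomerOrder (BusinessEntityID, FirstName, OrderDetailsJSON)
VALUES (2, N'Test', N'{"SalesOrderID":1')

DELETE FROM CustomerOrder	-- cleanup before loading the real data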

 

–> Let's now create a sample record-set with JSON data in the OrderDetailsJSON column. We will use the FOR JSON AUTO option to convert relational data to JSON strings for our example, as shown below:

;WITH CTE_PersonContact AS (
	SELECT 
		BusinessEntityID, FirstName, MiddleName, LastName, EmailPromotion,
		OrderDetailsJSON = 
		(	SELECT SalesOrderID, OrderDate, SubTotal, TaxAmt, TotalDue
			FROM [AdventureWorks2014].[Sales].[SalesOrderHeader] S
			WHERE S.CustomerID = P.BusinessEntityID
			FOR JSON AUTO -- here
		) -- our JSON column
	FROM [Person].[Person] P
)
INSERT INTO CustomerOrder
SELECT 
	BusinessEntityID, FirstName, MiddleName, LastName, EmailPromotion, 
	OrderDetailsJSON
FROM CTE_PersonContact
WHERE OrderDetailsJSON IS NOT NULL

-- (9778 row(s) affected)

–> Check the above inserted records with the OrderDetailsJSON column containing data in JSON format:

SELECT * FROM CustomerOrder

-- (9778 row(s) affected)

[Image: SQL Server 2016 - JSON table]
 

–> Let's query back the JSON data from the OrderDetailsJSON column, along with the other relational columns, by using the OPENJSON() function. As each Customer can have multiple orders, we will get multiple rows per Customer, with columns as defined in the JSON string:

SELECT 
	C.BusinessEntityID, C.FirstName, C.MiddleName, C.LastName, C.EmailPromotion, 
	J.SalesOrderID, J.OrderDate, J.SubTotal, J.TaxAmt, J.TotalDue
FROM CustomerOrder C
CROSS APPLY OPENJSON (OrderDetailsJSON)
WITH (
	SalesOrderID INT, 
	OrderDate DATETIME, 
	SubTotal MONEY, 
	TaxAmt MONEY, 
	TotalDue MONEY
) AS J

-- (17463 row(s) affected)

[Image: SQL Server 2016 - JSON table 2]
 

–> And if you want just one Order per Customer, you can use the following query with the JSON_Value() function, specifying the array position in the path to get the first value from the array:

SELECT 
	C.BusinessEntityID, C.FirstName, C.MiddleName, C.LastName, C.EmailPromotion, 
	SalesOrderID		= JSON_Value (OrderDetailsJSON, '$[0].SalesOrderID'),
	OrderDate			= JSON_Value (OrderDetailsJSON, '$[0].OrderDate'),
	SubTotal			= JSON_Value (OrderDetailsJSON, '$[0].SubTotal'),
	TaxAmt				= JSON_Value (OrderDetailsJSON, '$[0].TaxAmt'),
	TotalDue			= JSON_Value (OrderDetailsJSON, '$[0].TotalDue')
FROM CustomerOrder C

-- (9778 row(s) affected)
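Note that JSON_Value() returns scalar values only; if you instead want the whole first Order as a JSON fragment, you can use the companion JSON_QUERY() function. A minimal sketch:

SELECT 
	C.BusinessEntityID, C.FirstName, C.LastName, 
	FirstOrderJSON = JSON_QUERY (OrderDetailsJSON, '$[0]') -- the entire first array element, as JSON text
FROM CustomerOrder C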

 

We can also use a JSON string for passing multiple values through a single parameter of a Stored Procedure, as a workaround to implement dynamic-parameter functionality; check the [blog post] below for the same.


Reading JSON string with Nested array of elements | SQL Server 2016 – Part 3

November 1, 2015

 

In my [previous post] I discussed how to import or read a JSON string and convert it to relational/tabular format, in row/column form.

 

Today in this post I'll talk about how to read/parse a JSON string with a nested array of elements, just like XML.
Native JSON support in SQL Server 2016 provides a few functions to read and parse your JSON string into relational format, and these are:

– OPENJSON() table-valued function: parses JSON text and returns a rowset view of the JSON.

– JSON_Value() scalar function: returns a scalar value from JSON on the specified path.

 

We will see usage of both the functions in our example below:

Here we have just one nested element, and the OPENJSON() function will get you the child element's values:

–> Method #1.a. Using OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": 
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        }
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
     
    [OrderDetail.ProductID] AS ProductID, 
    [OrderDetail.UnitPrice] AS UnitPrice 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail.ProductID] INT, 
    [OrderDetail.UnitPrice] INT
) AS Orders
– Output:
OrderID   CustomerID   ProductID   UnitPrice
100       2000         2000        350

 

But if you have more than one nested element, the same query will give you just 1 row, with NULL values in the child columns, like this:

–> Method #1.b. In case of multiple child elements:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
     
    [OrderDetail.ProductID] AS ProductID, 
    [OrderDetail.UnitPrice] AS UnitPrice 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail.ProductID] INT, 
    [OrderDetail.UnitPrice] INT
) AS Orders 
– Output:
OrderID   CustomerID   ProductID   UnitPrice
100       2000         NULL        NULL

You might be expecting 2 rows with the same OrderID & CustomerID but different ProductID & UnitPrice, right?

Instead you get the ProductID & UnitPrice column values as NULL. This is because here you have an array of child elements under the OrderDetail node (notice the square bracket after the "OrderDetail": key), so the query cannot find the key on the specified path.
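
Incidentally, this NULL-on-missing-key behavior comes from the default "lax" JSON path mode; if you prefix the path with "strict", the same lookup raises an error instead of silently returning NULL. A minimal sketch against the same @json:

SELECT OrderID, CustomerID, ProductID
FROM OPENJSON (@json, '$.OrderHeader')
WITH (
    OrderID    INT,
    CustomerID INT,
    ProductID  INT 'strict $.OrderDetail.ProductID' -- errors here: OrderDetail is an array, not an object
) AS Orders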

 

To actually get the child values in this case, you have to use the array positions with square brackets ("[" and "]") in your query, calling out a separate column for each child element, like below:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
 
    [OrderDetail[0]].ProductID] AS ProductID1, 
    [OrderDetail[0]].UnitPrice] AS UnitPrice1, 
 
    [OrderDetail[1]].ProductID] AS ProductID2, 
    [OrderDetail[1]].UnitPrice] AS UnitPrice2 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail[0]].ProductID] INT, 
    [OrderDetail[0]].UnitPrice] INT, 
 
    [OrderDetail[1]].ProductID] INT, 
    [OrderDetail[1]].UnitPrice] INT
) AS Orders
– Output:
OrderID   CustomerID   ProductID1   UnitPrice1   ProductID2   UnitPrice2
100       2000         2000         350          3000         450

 

You can also specify the child elements with their full path by using the dollar sign "$" inside the WITH() clause (instead of at column level, as above), like below:

–> Method #2. Using OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT
    OrderID, 
    CustomerID, 
 
    ProductID1, 
    UnitPrice1, 
 
    ProductID2, 
    UnitPrice2
 
FROM OPENJSON (@json, '$.OrderHeader')
WITH (
    OrderID    INT '$.OrderID',
    CustomerID INT '$.CustomerID',
 
    ProductID1 INT '$.OrderDetail[0].ProductID',
    UnitPrice1 INT '$.OrderDetail[0].UnitPrice', 
 
    ProductID2 INT '$.OrderDetail[1].ProductID',
    UnitPrice2 INT '$.OrderDetail[1].UnitPrice'
) AS Orders
– Output:
OrderID   CustomerID   ProductID1   UnitPrice1   ProductID2   UnitPrice2
100       2000         2000         350          3000         450

Ok, so by using the key path and the array positions we can get the child element values into our query result-set, using the above 2 methods.
But instead of having them in separate columns, how about pulling them out as separate rows? That would also make your query dynamic, as you would not know the number of child elements beforehand, right?

 

This can be done by CROSS APPLYing the JSON child node with the parent node and using the JSON_Value() function, as shown below:

–> Method #3. Using JSON_Value() with OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        },
        {             
          "ProductID": 4000,
          "UnitPrice": 550
        }
      ]
    }
  ]
}'
 
SELECT
    JSON_Value (c.value, '$.OrderID') as OrderID, 
    JSON_Value (c.value, '$.CustomerID') as CustomerID, 
    JSON_Value (p.value, '$.ProductID') as ProductID, 
    JSON_Value (p.value, '$.UnitPrice') as UnitPrice
 
FROM OPENJSON (@json, '$.OrderHeader') as c
CROSS APPLY OPENJSON (c.value, '$.OrderDetail') as p
– Output:
OrderID   CustomerID   ProductID   UnitPrice
100       2000         2000        350
100       2000         3000        450
100       2000         4000        550
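
As a variation on Method #3, OPENJSON()'s WITH clause also lets you mark a column AS JSON, capturing the nested array as a fragment that you then CROSS APPLY over; a minimal sketch using the same @json:

SELECT c.OrderID, c.CustomerID, p.ProductID, p.UnitPrice
FROM OPENJSON (@json, '$.OrderHeader')
WITH (
    OrderID     INT,
    CustomerID  INT,
    OrderDetail NVARCHAR(MAX) AS JSON -- keep the nested array as a JSON fragment
) AS c
CROSS APPLY OPENJSON (c.OrderDetail)
WITH (
    ProductID INT,
    UnitPrice INT
) AS p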

 

Ok, that’s it for today.

In my [next post] I’ll talk about storing JSON string in a table and doing some hands-on with it.



Passing multiple/dynamic values to Stored Procedures & Functions | Part 5 – by passing JSON string

October 31, 2015

This post is part of the [Passing multiple/dynamic values to Stored Procedures & Functions] series, and also covers the new Native JSON support feature in SQL Server 2016.

Adding the fifth part to this series, here we will use a JSON string that contains the set of values, passed as a single JSON parameter variable to the SP. Then inside the SP we will parse this JSON and use those values in our SQL queries, just like we did in previous posts with CSV/XML strings:

USE [AdventureWorks2014]
GO

-- Create an SP with NVARCHAR type parameter for JSON string:
CREATE PROCEDURE uspGetPersonDetailsJSON (
	@persons NVARCHAR(MAX)
)
AS
BEGIN
	--DECLARE @persons NVARCHAR(MAX)
	--SET @persons = '{"root":[{"Name":"Charles"},{"Name":"Jade"},{"Name":"Jim"},{"Name":"Luke"},{"Name":"Ken"}]}'
	
	SELECT Name
	INTO #tblPersons
	FROM OPENJSON (@persons, '$.root')
	WITH ( 
		Name NVARCHAR(100)
	)

	SELECT 
		BusinessEntityID, 
		Title, 
		FirstName, 
		MiddleName, 
		LastName, 
		ModifiedDate
	FROM [Person].[Person] PER
	WHERE EXISTS (
		SELECT * 
		FROM #tblPersons tmp 
		WHERE tmp.Name  = PER.FirstName
	)
	ORDER BY FirstName, LastName

	DROP TABLE #tblPersons
END
GO

-- Create JSON string:
DECLARE @json NVARCHAR(1000)
SET @json = N'{
  "root": [
    { "Name": "Charles" },
    { "Name": "Jade" },
    { "Name": "Jim" },
    { "Name": "Luke" },
    { "Name": "Ken" }
  ]
}'

-- Use the JSON string as the parameter while calling the SP:
EXEC uspGetPersonDetailsJSON @json
GO

-- Check the output, objective achieved

-- Final Cleanup
DROP PROCEDURE uspGetPersonDetailsJSON
GO

 

Thus, you can use a JSON string similar to the way you used an XML string, to pass multiple values and a dynamic number of parameters to your Stored Procedures.

As the JSON feature is new to SQL Server 2016, this method will only work with SQL Server 2016 and above.
 


Import/Read a JSON string and convert it in tabular (row/column) form | SQL Server 2016 – Part 2

October 30, 2015

In my [previous post] I discussed how to format/convert any Query/Table data to a string in JSON format. Till the CTP 2 release, JSON support in SQL Server was limited to this feature only.

But now with the CTP 3 release you can also do the reverse: you can read back JSON data and convert it to tabular or row/column format.
 

Let's check this by taking the same sample data from our previous JSON-export post.

DECLARE @json NVARCHAR(1000)
SELECT @json = N'{"StudList":
[  
   {  
      "ID":1,
      "FirstName":"Manoj",
      "LastName":"Pandey",
      "Class":10,
      "Marks":80.5
   },
   {  
      "ID":2,
      "FirstName":"Saurabh",
      "LastName":"Sharma",
      "Class":11,
      "Marks":82.7
   },
   {  
      "ID":3,
      "FirstName":"Kanchan",
      "LastName":"Pandey",
      "Class":10,
      "Marks":90.5
   },
   {  
      "ID":4,
      "FirstName":"Rajesh",
      "LastName":"Shah",
      "Class":11,
      "Marks":70.3
   },
   {  
      "ID":5,
      "FirstName":"Kunal",
      "LastName":"Joshi",
      "Class":12,
      "Marks":64.7
   }
]
}'

SELECT ID, FirstName, LastName, Class, Marks
FROM OPENJSON (@json, '$.StudList')
WITH (
	ID INT, 
	FirstName VARCHAR(255), 
	LastName VARCHAR(255), 
	Class INT,
	Marks DECIMAL(3,1)
) AS StudList

– Output:
[Image: SQL Server 2016 - JSON export 01]

Well, that was simple, wasn't it!!!
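
By the way, if you call OPENJSON() without a WITH clause, it returns its default schema: one row per array element with key, value and type columns, which is handy for exploring a JSON string before you define any columns. A quick sketch on the same @json:

SELECT [key], [value], [type]
FROM OPENJSON (@json, '$.StudList')
-- key = array index, value = each student object as JSON text, type = 5 (object)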
 


 

In my [next post] I’ll talk about Nested Properties in a JSON string with single and multiple elements.
 


SQL Server 2016 CTP 3 update is here | download now

October 29, 2015

Microsoft yesterday (29-Oct-2015) announced the CTP 3 update of SQL Server 2016, following the initially released Community Technology Preview (CTP) 2.x versions.
 

[Register and Download the CTP 3 Evaluation version (180 days) here]
 

–> Direct download link (~2.3 GB):

– Download the single ISO: SQLServer2016CTP3.0-x64-ENU.iso

– Or download both EXE & BOX files:
– – Box file SQLServer2016-x64-ENU.box
– – EXE file SQLServer2016-x64-ENU.exe

 

–> Check version and SQL build:

select @@version

Microsoft SQL Server 2016 (CTP3) – 13.0.700.139 (X64) Oct 7 2015 04:48:43 Copyright (c) Microsoft Corporation Enterprise Evaluation Edition (64-bit) on Windows 10 Enterprise 6.3 (Build 10240: )
 

–> The Major feature in this preview is the “RRE Integration”:

That is the advanced Revolution Analytics R integration, shipped as SQL Server R Services within SQL Server 2016. "R" is the most popular programming language for Advanced Analytics. You can use it to analyze data, uncover patterns and trends, and build predictive models. It offers an incredibly rich set of packages and a vibrant, fast-growing developer community.

SQL Server R Services with in-database analytics will help with:

– Data Exploration and Predictive Modeling with R over SQL Server data

– Operationalizing your R code using T-SQL
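
For instance, once R Services is installed and the 'external scripts enabled' configuration option is turned on, R code can be invoked from T-SQL via the sp_execute_external_script procedure. A minimal sketch (a trivial script that just echoes its input query back):

-- enable external scripts once per instance (early builds may also need a Launchpad service restart)
EXEC sp_configure 'external scripts enabled', 1
RECONFIGURE
GO

EXEC sp_execute_external_script
	@language = N'R',
	@script = N'OutputDataSet <- InputDataSet',
	@input_data_1 = N'SELECT 1 AS Num'
WITH RESULT SETS ((Num INT))
GO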

While installing SQL Server 2016 you will get an option to integrate R; it mentions "Includes Advanced Analytics Extensions that enables integration with R language using standard T-SQL statements.", as shown below:

[Image: SQL Server 2016 - R - Advanced Analytics]

Post installation, you need to go through these steps to configure "R" with SQL Server: [link].
 

–> Another major feature is “Transactional Replication from SQL Server to Azure SQL DB”

Now you can set up Azure SQL DB as a subscriber of transactional replication, allowing you to migrate data from a SQL Server instance on-premises or in IaaS to an Azure SQL database without downtime.
 

–> With this release the SQL Server team has added and improved a lot of features. Some of the important enhancements and issues fixed in this release are:
 

1. In-Memory improvements: [read more]

– Support for FOREIGN KEY, CHECK and UNIQUE constraints (see the sketch after this list).

– Inline table-valued functions.

– Existing NonClustered ColumnStore indexes are updateable without requiring index rebuild.

– Parallel Index build for NonClustered ColumnStore indexes.

– Performance improvements (INSERT, String pushdown, bypassing delete buffer when processing deleted rows).
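
A minimal sketch of the new constraint support on a memory-optimized table (hypothetical table; assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup):

CREATE TABLE dbo.OrderItem (
	OrderItemID INT NOT NULL PRIMARY KEY NONCLUSTERED,
	OrderID     INT NOT NULL,
	Qty         INT NOT NULL CHECK (Qty > 0),               -- CHECK constraint, now supported
	CONSTRAINT UQ_Order_Item UNIQUE (OrderID, OrderItemID)  -- UNIQUE constraint, now supported
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)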
 

2. Stretch Database improvements: [read more]

– Create/Drop index support

– Improved credential management for remote Stretch database stores.

– Improved performance for joins between stretched tables.

– SSMS GUI updates:
– – Database and Table level fly out menu options.
– – Stretch monitor functionality added to allow users to monitor current migration status, including the ability to pause the migration at the table level.
– – Added ability to Stretch individual tables, and added option to Stretch to new SQL Azure or existing SQL Azure server.
 

3. JSON improvements: [read more]

– OPENJSON() table-valued function, parses JSON text and returns a rowset.

– JSON_VALUE() scalar function, returns a value from JSON on the specified path.

– ISJSON() function, validates that JSON is properly formatted.

– JSON_QUERY() function, returns a fragment from the JSON text.
 

4. Temporal improvements: [read more]

– Support for using Temporal with In-Memory OLTP: combine disk-based tables for cost-effective storage of history data with memory-optimized tables for storing the latest (actual) data.

– Direct ALTER for system-versioned temporal tables.

– New clause FOR SYSTEM_TIME ALL for querying entire data history easily without specifying period boundaries.
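
For example, assuming a system-versioned temporal table dbo.Employee (hypothetical), the new clause reads current and history rows together:

SELECT EmployeeID, Salary, ValidFrom, ValidTo
FROM dbo.Employee
	FOR SYSTEM_TIME ALL -- current rows plus all historical versions
ORDER BY EmployeeID, ValidFrom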

 

5. Row-Level Security (RLS) with Block Predicates: [read more]

– Preventing users from Inserting, Updating, and/or Deleting rows that violate the predicate (see the sketch after this list).

– Security policies can now be created with SCHEMABINDING = OFF.
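
A minimal sketch of a block predicate (hypothetical dbo.Orders table with a SalesRep column; the policy blocks INSERTs of rows belonging to another user):

CREATE FUNCTION dbo.fn_BlockOtherReps (@SalesRep SYSNAME)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS Result WHERE @SalesRep = USER_NAME()
GO

CREATE SECURITY POLICY dbo.OrdersPolicy
	ADD BLOCK PREDICATE dbo.fn_BlockOtherReps(SalesRep) ON dbo.Orders AFTER INSERT
GO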
 

6. Query Store improvements: [read more]

– Performance monitoring supported for In-Memory OLTP workloads.
 

–> Integration Services (SSIS) improvements:

– SSIS control flow template enables customers to save a commonly used control flow task or container to a standalone template file and reuse it multiple times in a package or multiple packages in a project.

– Added Azure blob source support for the Import/Export wizard.

– Relaxed Max Buffer Size of Data Flow Task.
 

–> Analysis Services (SSAS) improvements:

– DBCC support

– SSMS and SSDT updates for Tabular

– Check more at SSAS team blog.
 

–> Reporting Services (SSRS) improvements:

– Pin Reporting Services report items.

– Check more at SSRS team blog.
 

I've covered only a few important points here; you can check all the updates, with details, in the [SQL Server blog for CTP 3].
 

For all these new features released in SQL Server 2016, check my blog posts here.
 

So, download the Preview today and start playing with the new features and plan your DB migration/upgrade.
 


 
