
Archive for November, 2015

SQL DBA – Database restored successfully, but stuck in (Restoring…) state

November 27, 2015

 
Today, after restoring a database on my DEV machine from a PROD backup, I observed that the database was still showing as Restoring… in Object Explorer and could not be used. I checked the SSMS results pane again, and it showed a successful restore with the message below:

100 percent processed.
Processed 21713736 pages for database ‘dbName’, file ‘dbName_Data’ on file 1.
Processed 398 pages for database ‘dbName’, file ‘dbName_log’ on file 1.
RESTORE DATABASE successfully processed 21714134 pages in 506.682 seconds (334.808 MB/sec).

 
I checked online and realized that I had used the WITH NORECOVERY option with my RESTORE DATABASE statement, like:

RESTORE DATABASE dbName WITH NORECOVERY

… and by using this option you leave the database in the unusable Restoring… state, allowing further roll-forward actions to continue with the next RESTORE statements in the sequence.
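For illustration, here's a minimal sketch of such a restore sequence (the backup file paths are hypothetical). Each NORECOVERY restore keeps the database in the Restoring… state, and only the final WITH RECOVERY step brings it online:

RESTORE DATABASE dbName
FROM DISK = N'D:\Backup\dbName_Full.bak' -- hypothetical path
WITH NORECOVERY -- stay in Restoring... state, more restores can follow
GO

RESTORE LOG dbName
FROM DISK = N'D:\Backup\dbName_Log1.trn' -- hypothetical path
WITH NORECOVERY -- roll forward, database still not usable
GO

RESTORE LOG dbName
FROM DISK = N'D:\Backup\dbName_Log2.trn' -- hypothetical path
WITH RECOVERY -- final step, database comes online
GO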

 
As this was not my intention, I could simply mark my database to stop accepting more Transaction Logs and force it out of the Restoring state by issuing the following statement:

RESTORE DATABASE dbName WITH RECOVERY

Converting database ‘dbName’ from version 706 to the current version 852.
Database ‘dbName’ running the upgrade step from version 706 to version 770.

Database ‘dbName’ running the upgrade step from version 851 to version 852.
RESTORE DATABASE successfully processed 0 pages in 1.716 seconds (0.000 MB/sec).

… and my database was out of the Restoring… state and usable again! (The version-upgrade messages appear because the PROD backup was taken on an older version of SQL Server than my DEV instance.)



New syntax option “IF EXISTS” with DROP and ALTER statements – in SQL Server 2016

November 3, 2015

I was going through some sample scripts provided by the Microsoft SQL Server team on their site, and was checking the "JSON Sample Queries procedures views and indexes.sql" script file.
 

And the following ALTER statement grabbed my attention:

ALTER TABLE Sales.SalesOrder_json
DROP
	COLUMN IF EXISTS vCustomerName,
	CONSTRAINT IF EXISTS [SalesOrder reasons must be formatted as JSON array],
	COLUMN IF EXISTS SalesReasons,
	CONSTRAINT IF EXISTS [SalesOrder items must be formatted as JSON array],
	COLUMN IF EXISTS OrderItems,
	CONSTRAINT IF EXISTS [SalesOrder additional information must be formatted as JSON],
	COLUMN IF EXISTS Info
GO

The above DDL query removes Columns and Constraints from the table, and if you notice, there is a new option between the COLUMN/CONSTRAINT keyword and the object name, i.e. IF EXISTS.
 

And the same goes for the DROP statements:

DROP INDEX IF EXISTS idx_SalesOrder_json_CustomerName ON Sales.SalesOrder_json
go
DROP PROCEDURE IF EXISTS Person.PersonList_json
go
DROP VIEW IF EXISTS Sales.vwSalesOrderInfoRel_json
go
DROP FUNCTION IF EXISTS dbo.ufnToRawJsonArray
go

Here also, the Database objects are dropped conditionally by using IF EXISTS between the object type and its name.
 

To make sure, I checked the MSDN BOL and found that this is a new feature added in SQL Server 2016. And as per the MSDN article, this enhancement was added with the CTP 3 release.

For the IF EXISTS option/syntax, the MSDN BOL mentions: "Conditionally drops the [object] only if it already exists."
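A quick way to see this behavior (with a hypothetical procedure name): the IF EXISTS variant completes silently when the object is missing, while the plain DROP raises an error:

DROP PROCEDURE IF EXISTS dbo.NonExistentProc -- hypothetical name; completes without error
GO
DROP PROCEDURE dbo.NonExistentProc -- fails with Msg 3701: cannot drop, because it does not exist
GO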
 

This is a very helpful enhancement added to these DDL statements and should reduce a lot of effort and lines of code.

–> Previously, with the ALTER statement, to DROP any item you had to check its existence with a separate IF EXISTS() check, and then DROP it within the IF condition, like:

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS 
	WHERE TABLE_SCHEMA = 'Sales' AND TABLE_NAME = 'SalesOrder_json' AND COLUMN_NAME = 'vCustomerName')
BEGIN
	ALTER TABLE Sales.SalesOrder_json
	DROP COLUMN vCustomerName;
END
GO

This is for just one column; to DROP the other 6 columns/constraints you would have to repeat this 6 more times.

–> Similarly for the DROP statement, you would need to do:

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.ROUTINES 
	WHERE ROUTINE_SCHEMA = 'Person' AND ROUTINE_NAME = 'PersonList_json' AND ROUTINE_TYPE = 'PROCEDURE')
BEGIN
	DROP PROCEDURE Person.PersonList_json
END
GO

But if you check the ALTER statement with the IF EXISTS option at the top, it performs 7 DROP operations in one statement and within a single transaction. And similarly, the DROP statements with the IF EXISTS option do it in one go.
 

Thus, the new IF EXISTS syntax can optionally be used to check & drop items in a single statement with these DDLs (see the sketch after this list):

1. ALTER: statement with DROP COLUMN & DROP CONSTRAINT option.

2. DROP: statement with these Database objects: AGGREGATE, ASSEMBLY, DATABASE, DEFAULT, FUNCTION, INDEX, PROCEDURE, ROLE, RULE, SCHEMA, SECURITY POLICY, SEQUENCE, SYNONYM, TABLE, TRIGGER, TYPE, USER, and VIEW. (COLUMN and CONSTRAINT are dropped via the ALTER TABLE syntax in point 1.)
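For instance, a minimal sketch with hypothetical object names:

DROP TABLE IF EXISTS dbo.DemoTable -- hypothetical table
GO
DROP VIEW IF EXISTS dbo.vwDemo -- hypothetical view
GO
DROP SCHEMA IF EXISTS DemoSchema -- hypothetical schema
GO

Each statement completes without error whether or not the object exists.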
 

You can download the SQL Server 2016 sample Database and files from this link.
 

Check more new features of SQL Server 2016 here: https://sqlwithmanoj.com/sql-server-2016-articles/


Store JSON data in a table, OPENJSON and JSON_Value functions | SQL Server 2016 – Part 4

November 2, 2015

In my previous posts I talked about how to [export] Table or Query data into JSON string format, [read it back] from a JSON string into relational-table format, and handle [nested elements].
 

Here, in this post I’ll show how we can store JSON data in a normal table column, just like you store XML data.

XML data is stored in a column of the XML datatype, which also checks the validity of the XML being stored. But for JSON data no new datatype has been introduced: JSON is stored in an NVARCHAR column just like plain text, and to validate it you can add a CHECK constraint on the column.

IsJSON() function: can be used in a CHECK constraint on columns that contain JSON strings, to validate whether the JSON string is in proper format or not.
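For a quick sanity check, the function can also be called directly; it returns 1 for a valid JSON string and 0 otherwise:

SELECT ISJSON(N'{"OrderID": 100}') -- returns 1, valid JSON
SELECT ISJSON(N'Hello World') -- returns 0, not valid JSON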
 

As we will need the AdventureWorks2014 sample Database in our example below, we need to upgrade its Compatibility Level from SQL 2014 to SQL 2016, i.e. from level 120 to 130, like:

USE [master]
GO

ALTER DATABASE [AdventureWorks2014] SET COMPATIBILITY_LEVEL = 130
GO

You can download the AdventureWorks2014 sample Database from the Microsoft [CodePlex site].
 

–> OK, now let’s create a new Table with an OrderDetailsJSON column for storing JSON strings, with a CHECK constraint on it:

USE [AdventureWorks2014]
GO

CREATE TABLE CustomerOrder (
	BusinessEntityID INT, 
	FirstName NVARCHAR(50), 
	MiddleName NVARCHAR(50), 
	LastName NVARCHAR(50), 
	EmailPromotion INT,

	OrderDetailsJSON NVARCHAR(MAX) -- normal column with NVARCHAR datatype
		CHECK ( IsJSON ( OrderDetailsJSON ) = 1 ) -- CHECK Constraint to validate JSON string
)

 

–> Let’s create a sample record-set with JSON data in the OrderDetailsJSON column. We will use the FOR JSON AUTO option to convert relational data into JSON strings for our example, as shown below:

;WITH CTE_PersonContact AS (
	SELECT 
		BusinessEntityID, FirstName, MiddleName, LastName, EmailPromotion,
		OrderDetailsJSON = 
		(	SELECT SalesOrderID, OrderDate, SubTotal, TaxAmt, TotalDue
			FROM [AdventureWorks2014].[Sales].[SalesOrderHeader] S
			WHERE S.CustomerID = P.BusinessEntityID
			FOR JSON AUTO -- here
		) -- our JSON column
	FROM [Person].[Person] P
)
INSERT INTO CustomerOrder
SELECT 
	BusinessEntityID, FirstName, MiddleName, LastName, EmailPromotion, 
	OrderDetailsJSON
FROM CTE_PersonContact
WHERE OrderDetailsJSON IS NOT NULL

-- (9778 row(s) affected)

–> Check the above inserted records with the OrderDetailsJSON column containing data in JSON format:

SELECT * FROM CustomerOrder

-- (9778 row(s) affected)

[Screenshot: SQL Server 2016 - JSON table]
 

–> Let’s query back the JSON data from the OrderDetailsJSON column, along with the other columns in relational form, by using the OPENJSON() function. Since each Customer can have multiple orders, we will get multiple rows per Customer, with columns as defined in the JSON string:

SELECT 
	C.BusinessEntityID, C.FirstName, C.MiddleName, C.LastName, C.EmailPromotion, 
	J.SalesOrderID, J.OrderDate, J.SubTotal, J.TaxAmt, J.TotalDue
FROM CustomerOrder C
CROSS APPLY OPENJSON (OrderDetailsJSON)
WITH (
	SalesOrderID INT, 
	OrderDate DATETIME, 
	SubTotal MONEY, 
	TaxAmt MONEY, 
	TotalDue MONEY
) AS J

-- (17463 row(s) affected)

[Screenshot: SQL Server 2016 - JSON table 2]
 

–> And if you want to get just one Order per Customer, you can use the following Query, by using the JSON_Value() function and specifying the array position to get the first value from the array:

SELECT 
	C.BusinessEntityID, C.FirstName, C.MiddleName, C.LastName, C.EmailPromotion, 
	SalesOrderID		= JSON_Value (OrderDetailsJSON, '$[0].SalesOrderID'),
	OrderDate			= JSON_Value (OrderDetailsJSON, '$[0].OrderDate'),
	SubTotal			= JSON_Value (OrderDetailsJSON, '$[0].SubTotal'),
	TaxAmt				= JSON_Value (OrderDetailsJSON, '$[0].TaxAmt'),
	TotalDue			= JSON_Value (OrderDetailsJSON, '$[0].TotalDue')
FROM CustomerOrder C

-- (9778 row(s) affected)

 

We can also use a JSON string to pass multiple values through a single parameter of a Stored Procedure, as a workaround to implement dynamic-parameters functionality; check the [blog post] for the same.
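As a minimal sketch of that idea (the procedure name here is hypothetical), a single NVARCHAR parameter carries a JSON array of IDs, which OPENJSON() splits into rows to join against:

CREATE PROCEDURE dbo.GetPersonsByIDs -- hypothetical procedure, for illustration only
	@idList NVARCHAR(MAX) -- e.g. N'[1, 2, 3]'
AS
BEGIN
	SELECT P.BusinessEntityID, P.FirstName, P.LastName
	FROM [Person].[Person] P
	JOIN OPENJSON (@idList) WITH (ID INT '$') ids -- '$' maps each array element to an INT column
		ON P.BusinessEntityID = ids.ID
END
GO

EXEC dbo.GetPersonsByIDs @idList = N'[1, 2, 3]'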


Reading JSON string with Nested array of elements | SQL Server 2016 – Part 3

November 1, 2015

 

In my [previous post] I discussed how to import or read a JSON string and convert it into relational/tabular format, in row/column form.

 

Today in this post I’ll talk about how to read/parse a JSON string with a nested array of elements, just like XML.
Native JSON support in SQL Server 2016 provides a few functions to read and parse your JSON string into relational format, and these are:

– OPENJSON() table-valued function: parses JSON text and returns a rowset view of the JSON.

– JSON_Value() scalar function: returns a scalar value from the JSON at the specified path.

 

We will see the usage of both functions in our examples below.

Here, we have just one nested element, and the OPENJSON() function will get you the child element’s values:

–> Method #1.a. Using OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": 
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        }
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
     
    [OrderDetail.ProductID] AS ProductID, 
    [OrderDetail.UnitPrice] AS UnitPrice 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail.ProductID] INT, 
    [OrderDetail.UnitPrice] INT
) AS Orders
OrderID	CustomerID	ProductID	UnitPrice
100	2000		2000		350

 

But if you have more than one nested element, the same query will return just 1 row, with NULL values in the child columns, like this.

–> Method #1.b. In case of multiple child elements:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
     
    [OrderDetail.ProductID] AS ProductID, 
    [OrderDetail.UnitPrice] AS UnitPrice 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail.ProductID] INT, 
    [OrderDetail.UnitPrice] INT
) AS Orders 
OrderID	CustomerID	ProductID	UnitPrice
100	2000		NULL		NULL

You might be expecting 2 rows with the same OrderID & CustomerID, but with different ProductID & UnitPrice, right?

Instead, you get the ProductID & UnitPrice column values as NULL. This is because the OrderDetail node now holds an array of child elements (notice the square bracket after the “OrderDetail”: node), so the query cannot find the keys on the specified path.

 

In this case, what you have to do is use array positions with square brackets (“[” and “]”) in your query, calling out separate columns for each child element, like below:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT 
    OrderID, 
    CustomerID, 
 
    [OrderDetail[0]].ProductID] AS ProductID1, 
    [OrderDetail[0]].UnitPrice] AS UnitPrice1, 
 
    [OrderDetail[1]].ProductID] AS ProductID2, 
    [OrderDetail[1]].UnitPrice] AS UnitPrice2 
 
FROM OPENJSON (@json, '$.OrderHeader') 
WITH (
    OrderID INT, 
    CustomerID INT, 
 
    [OrderDetail[0]].ProductID] INT, 
    [OrderDetail[0]].UnitPrice] INT, 
 
    [OrderDetail[1]].ProductID] INT, 
    [OrderDetail[1]].UnitPrice] INT
) AS Orders
OrderID	CustomerID	ProductID1	UnitPrice1	ProductID2	UnitPrice2
100	2000		2000		350		3000		450

 

You can also specify the child elements with their full path by using the dollar sign “$” inside the WITH() clause (instead of at the column level, as above), like below:

–> Method #2. Using OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        }
      ]
    }
  ]
}'
 
SELECT
    OrderID, 
    CustomerID, 
 
    ProductID1, 
    UnitPrice1, 
 
    ProductID2, 
    UnitPrice2
 
FROM OPENJSON (@json, '$.OrderHeader')
WITH (
    OrderID    INT '$.OrderID',
    CustomerID INT '$.CustomerID',
 
    ProductID1 INT '$.OrderDetail[0].ProductID',
    UnitPrice1 INT '$.OrderDetail[0].UnitPrice', 
 
    ProductID2 INT '$.OrderDetail[1].ProductID',
    UnitPrice2 INT '$.OrderDetail[1].UnitPrice'
) AS Orders
OrderID	CustomerID	ProductID1	UnitPrice1	ProductID2	UnitPrice2
100	2000		2000		350		3000		450

OK, so by using the key path and the array position, we can get the child elements’ values in our query result-set with the above 2 methods.
But instead of having them in separate columns, how about pulling them out as separate rows? This would also make your query dynamic, as you would not know the number of child elements beforehand, right?

 

This can be done by CROSS APPLYing the JSON child node with the parent node and using the JSON_Value() function, as shown below:

–> Method #3. Using JSON_Value() with OPENJSON() function:

DECLARE @json NVARCHAR(1000)
SELECT @json = 
N'{
  "OrderHeader": [
    {
      "OrderID": 100,
      "CustomerID": 2000,
 
      "OrderDetail": [
        {
          "ProductID": 2000, 
          "UnitPrice": 350      
        },      
        {             
          "ProductID": 3000,
          "UnitPrice": 450
        },
        {             
          "ProductID": 4000,
          "UnitPrice": 550
        }
      ]
    }
  ]
}'
 
SELECT
    JSON_Value (c.value, '$.OrderID') as OrderID, 
    JSON_Value (c.value, '$.CustomerID') as CustomerID, 
    JSON_Value (p.value, '$.ProductID') as ProductID, 
    JSON_Value (p.value, '$.UnitPrice') as UnitPrice
 
FROM OPENJSON (@json, '$.OrderHeader') as c
CROSS APPLY OPENJSON (c.value, '$.OrderDetail') as p
OrderID	CustomerID	ProductID	UnitPrice
100	2000		2000		350
100	2000		3000		450
100	2000		4000		550
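Method #3 works because OPENJSON() without a WITH clause returns a default schema of key, value and type columns; c.value and p.value above hold each JSON object as text, which JSON_Value() then reads. A minimal probe of that default output:

SELECT [key], [value], [type]
FROM OPENJSON (N'[ {"ProductID": 2000}, {"ProductID": 3000} ]')

-- key	value	type
-- 0	{"ProductID": 2000}	5
-- 1	{"ProductID": 3000}	5
-- (type = 5 denotes a JSON object)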

 

Ok, that’s it for today.

In my [next post] I’ll talk about storing JSON string in a table and doing some hands-on with it.