
Archive for the ‘SQL Server Internals’ Category

SQL Job creation failing (with a VBScript step) after SQL Server 2016/2017 upgrade

May 21, 2018 Leave a comment

 
Yesterday a friend pinged me and told me that he was facing some issues while executing a SQL Job DDL script. They had upgraded their SQL Server version from 2008 to 2016, and while creating SQL Jobs they were getting the below error:

Msg 14234, Level 16, State 1, Procedure sp_verify_subsystem, Line 28 [Batch Start Line 2]
The specified ‘@subsystem’ is invalid (valid values are returned by sp_enum_sqlagent_subsystems).
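 
As the error text itself hints, you can list the subsystems that the upgraded instance still accepts; on SQL Server 2016/2017 the ActiveScripting (VBScript) subsystem no longer shows up in this list:

-- List the SQL Agent subsystems valid on this instance:
EXEC msdb.dbo.sp_enum_sqlagent_subsystems
GO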

 
I checked the code, and on looking into it for a moment I saw that one of the SQL Job steps was configured to run a VBScript, as shown below:

...
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep 
@job_id=@jobId, 
@step_name=N'xyz VBScript', 
@step_id=3, 
@cmdexec_success_code=0, 
@on_success_action=4, 
@on_success_step_id=0, 
@on_fail_action=5, 
@on_fail_step_id=0, 
@retry_attempts=0, 
@retry_interval=0, 
@os_run_priority=0, 
@subsystem=N'ActiveScripting', 
...

A quick check on the web told me that this feature has been discontinued and should not be used.

** Important ** This feature will be removed in a future version of Microsoft SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use this feature.
MS BoL link

 

This MS BoL link also mentions the discontinued feature:

ActiveX subsystem is discontinued. Use command line or PowerShell scripts instead.
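 
So the fix is to move the affected job step to a supported subsystem such as CmdExec or PowerShell. Below is a minimal sketch of how the same sp_add_jobstep call could look once the VBScript logic has been ported to PowerShell, reusing the @jobId and @ReturnCode variables from the generated script; the step name and the script body here are only hypothetical placeholders, not the actual job:

EXEC @ReturnCode = msdb.dbo.sp_add_jobstep 
@job_id=@jobId, 
@step_name=N'xyz PowerShell step', 
@step_id=3, 
@cmdexec_success_code=0, 
@on_success_action=4, 
@on_success_step_id=0, 
@on_fail_action=5, 
@on_fail_step_id=0, 
@retry_attempts=0, 
@retry_interval=0, 
@os_run_priority=0, 
@subsystem=N'PowerShell', 
@command=N'# hypothetical placeholder for the ported VBScript logic
Write-Output "replace me with the real script"'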



SQL Server 2017 – ColumnStore Index enhancements and improvements over previous versions

December 21, 2017 1 comment

 
ColumnStore Indexes were first introduced in SQL Server 2012, providing a new and efficient way to store and retrieve index or table data.
 

What is a ColumnStore Index?
 

What new features & enhancements were added to ColumnStore Indexes from SQL Server 2012 to 2014 and 2016?
 

–> What’s new in SQL Server 2017?

1. Online Non-Clustered ColumnStore index build and rebuild support added (see the example after this list)

2. Clustered Columnstore Indexes now support LOB columns (nvarchar(max), varchar(max), varbinary(max))

3. A Columnstore index can now have non-persisted computed columns

4. The -fc option in Database Tuning Advisor (DTA) now allows recommendations of ColumnStore indexes
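 
As a quick illustration of point 1, the ONLINE = ON option can now be used while creating or rebuilding a Non-Clustered ColumnStore index on SQL Server 2017; the table, column and index names below are hypothetical:

-- Build a Non-Clustered ColumnStore index ONLINE (SQL Server 2017 and above):
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_SalesOrder
ON dbo.SalesOrder (OrderDate, ProductID, Quantity, UnitPrice)
WITH (ONLINE = ON)
GO

-- Rebuild it ONLINE as well:
ALTER INDEX NCCI_SalesOrder ON dbo.SalesOrder
REBUILD WITH (ONLINE = ON)
GO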
 

–> Video on ColumnStore Index:


DB Basics – How to control Data Redundancy in a database system (RDBMS) by Normalization

April 12, 2016 Leave a comment

Redundancy in database systems leads to various insert, update, and delete anomalies.

To avoid these anomalies, the first step is to make sure your database tables or relations are in good normal forms, i.e. normalized up to a certain level.

Normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and is free from the anomalies discussed above that could lead to a loss of data integrity.

Normal forms in a database, or the concept of Normalization, make a Relation or Table free from insert/update/delete anomalies and save space by removing duplicate data.
 

–> E. F. Codd stated the objectives of normalization as follows:

1. To free the collection of relations from undesirable insertion, update and deletion dependencies.

2. To reduce the need for restructuring the collection of relations as new types of data are introduced, and thus increase the life span of application programs.

3. To make the relational model more informative to users.

4. To make the collection of relations neutral to the query statistics, where these statistics are liable to change as time goes by.
 

As of now there are a total of 8 normal forms, but to keep data consistent and non-redundant, the first 3 Normal Forms are usually sufficient.
 

–> Anomalies, for example: let’s say you have a single table that stores both Employee and Department details; then:

1. Insert Anomaly: When you insert an Employee’s details, the department’s details also have to be entered with every employee record, so department details are repeated across multiple records, i.e. duplicate data is stored for Departments.

2. Update Anomaly: While updating a department’s details you have to update the same department across many employee rows, which may leave the data in an inconsistent state if any record is missed or an error occurs midway.

3. Delete Anomaly: If a department is closed, deleting the department’s rows also deletes the Employee records in them, and that employee data is lost.
 

The process of normalization decomposes or splits this EmployeeDepartment table into 2 or more tables linked by Foreign Keys, thus eliminating duplicate records and data redundancy, and keeping data/records consistent across all relations/tables, as sketched below.
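 
A minimal sketch of such a decomposition (the table and column names are made up for this illustration, not taken from any particular schema):

-- Before: one table mixing Employee and Department details (redundant):
-- EmployeeDepartment (EmpID, EmpName, DeptID, DeptName, DeptLocation)

-- After normalization: Department details are stored once,
-- and Employee rows reference them through a Foreign Key:
CREATE TABLE dbo.Department (
	DeptID INT PRIMARY KEY,
	DeptName VARCHAR(100),
	DeptLocation VARCHAR(100)
)

CREATE TABLE dbo.Employee (
	EmpID INT PRIMARY KEY,
	EmpName VARCHAR(100),
	DeptID INT FOREIGN KEY REFERENCES dbo.Department(DeptID)
)
GO

Now a department’s details live in a single row, updates touch only that row, and closing a department no longer forces employee records to be deleted.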

– 1st NF talks about atomic values and non-repeating groups.

– 2nd NF enforces that a non-Key attribute should depend on the entire Key attribute.

– 3rd NF makes sure that there should be no transitive dependency between a non-Key and a Key attribute.

For details on these 3 NFs check my blog post on [Database Normalization].
 


Parse or Query XML column with XMLNAMESPACES (xmlns namespace) – MSDN TSQL forum

January 15, 2016 Leave a comment

–> Question:

We have SQL audit information.
We would like to query the XML column in a user-friendly way.

CREATE TABLE [dbo].[audit](
	[server_instance_name] [NVARCHAR](128) NULL,
	[statement] [NVARCHAR](4000) NULL,
	[additional_information] XML NULL
)


INSERT INTO [dbo].[audit]([server_instance_name],[statement],[additional_information]) 
VALUES('srv1','sp_addlinkedsrvlogin','')

INSERT INTO [dbo].[audit]([server_instance_name],[statement],[additional_information]) 
VALUES('srv2','','<action_info xmlns="http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data"><session><![CDATA[Audit$A]]></session><action>event enabled</action><startup_type>manual</startup_type><object><![CDATA[audit_event]]></object></action_info>')

SELECT * FROM [dbo].[audit]

Output of the XML column: [screenshot: XML parse]

Required Output: [screenshot: XML parse2]

 

–> Answer:

;WITH XMLNAMESPACES('http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data' as ns)
SELECT 
	 [server_instance_name]
	,[statement]
	,[additional_information]
	,t.c.value ('ns:session[1]', 'varchar(50)') AS session
	,t.c.value ('ns:action[1]', 'varchar(50)') AS action
	,t.c.value ('ns:startup_type[1]', 'varchar(50)') AS startup_type
	,t.c.value ('ns:object[1]', 'varchar(50)') AS object
FROM [audit] as a
OUTER APPLY a.additional_information.nodes('//ns:action_info') as t(c)
GO

-- OR -- 

;WITH XMLNAMESPACES(DEFAULT 'http://schemas.microsoft.com/sqlserver/2008/sqlaudit_data')
SELECT 
	 [server_instance_name]
	,[statement]
	,t.c.value ('session[1]', 'varchar(50)') AS session
	,t.c.value ('action[1]', 'varchar(50)') AS action
	,t.c.value ('startup_type[1]', 'varchar(50)') AS startup_type
	,t.c.value ('object[1]', 'varchar(50)') AS object
FROM [audit] as a
OUTER APPLY a.additional_information.nodes('//action_info') as t(c)
GO

 

Finally, drop the table:

DROP TABLE [audit]
GO

 

Ref link.


Passing multiple/dynamic values to Stored Procedures & Functions | Part 5 – by passing JSON string

October 31, 2015 3 comments

This post is part of the [Passing multiple/dynamic values to Stored Procedures & Functions] series, and as well as the new feature Native JSON support in SQL Server 2016.

Adding the fifth part to this series, we will use a JSON string that contains the set of values and pass it as a JSON parameter variable to the SP. Then inside the SP we will parse this JSON and use those values in our SQL queries, just like we did in the previous posts with CSV/XML strings:

USE [AdventureWorks2014]
GO

-- Create an SP with NVARCHAR type parameter for JSON string:
CREATE PROCEDURE uspGetPersonDetailsJSON (
	@persons NVARCHAR(MAX)
)
AS
BEGIN
	--DECLARE @persons NVARCHAR(MAX)
	--SET @persons = '{"root":[{"Name":"Charles"},{"Name":"Jade"},{"Name":"Jim"},{"Name":"Luke"},{"Name":"Ken"}]}'
	
	SELECT Name
	INTO #tblPersons
	FROM OPENJSON (@persons, '$.root')
	WITH ( 
		Name NVARCHAR(100)
	)

	SELECT 
		BusinessEntityID, 
		Title, 
		FirstName, 
		MiddleName, 
		LastName, 
		ModifiedDate
	FROM [Person].[Person] PER
	WHERE EXISTS (
		SELECT * 
		FROM #tblPersons tmp 
		WHERE tmp.Name  = PER.FirstName
	)
	ORDER BY FirstName, LastName

	DROP TABLE #tblPersons
END
GO

-- Create JSON string:
DECLARE @json NVARCHAR(1000)
SET @json = N'{
  "root": [
    { "Name": "Charles" },
    { "Name": "Jade" },
    { "Name": "Jim" },
    { "Name": "Luke" },
    { "Name": "Ken" }
  ]
}'

-- Use the JSON string as the parameter while calling the SP:
EXEC uspGetPersonDetailsJSON @json
GO

-- Check the output, objective achieved

-- Final Cleanup
DROP PROCEDURE uspGetPersonDetailsJSON
GO

 

Thus you can also use a JSON string, similar to the way you used an XML string, to pass multiple and a dynamic number of parameters to your Stored Procedures.
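 
If you want to skip the temp table inside the SP, the same filtering can also be written by consuming OPENJSON inline; this is just an alternative sketch (not part of the original SP), assuming @persons is the same JSON parameter as above:

-- Alternative: consume the JSON inline, without a temp table:
SELECT 
	BusinessEntityID, 
	Title, 
	FirstName, 
	MiddleName, 
	LastName, 
	ModifiedDate
FROM [Person].[Person] PER
WHERE EXISTS (
	SELECT * 
	FROM OPENJSON (@persons, '$.root')
	WITH ( Name NVARCHAR(100) ) AS j
	WHERE j.Name = PER.FirstName
)
ORDER BY FirstName, LastName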

As the JSON feature is new to SQL Server 2016, this method will only work with SQL Server 2016 and above versions.