Code Review checklist for SQL Server “Stored Procedures” & T-SQL Scripts
While working on an official Database or DW/BI project we want to make sure that the code written by team members is correct, follows standards, is commented, and is neatly indented. But everyone has a different coding style; some are good, some not so good, I won’t say bad :). So, to make sure everyone’s code is correct, we introduce a process of Code Reviews.
In this post I’ll share some best coding practices and good-to-have items that will help others better understand your code. The checklist below covers some of them:
- Before sending any item for Code Review, please make sure to do the following:
– Compile the Stored Procedures, Views, and T-SQL Scripts.
– For an SSRS report, check the Stored Procedure by running it from the report, and verify that you are getting the desired output and that it meets the requirement.
– If it’s an Integration SP, run it for a Full Load and a Delta Load, and see if you are getting the expected functionality.
- Before doing check-ins, ensure you do a Get Latest and resolve all the conflicts.
- Use TRY-CATCH to handle exceptions, and the THROW statement to re-raise the error in the CATCH block.
- Use “;THROW” instead of the RAISERROR() syntax (the leading semicolon guards against THROW being parsed as part of the previous statement). This aligns with SQL Server 2012 and later, where Microsoft recommends THROW over RAISERROR for new development, and it reduces multiple lines of code to just one.
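As a minimal sketch of the two items above (the dbo.Orders table and @OrderID parameter are hypothetical, for illustration only):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.Orders              -- hypothetical table
    SET    Status = 'Shipped'
    WHERE  OrderID = @OrderID;     -- hypothetical parameter

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;   -- re-raises the original error with its number, severity, and state
END CATCH;
```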
- Do not use TRANSACTIONs while creating Reporting or read-only Stored Procedures. Use them only in Integration SPs, and only when more than one DML statement must succeed or fail together.
- Do not include #Table processing within TRANSACTIONs, and try to keep transactions as small as possible.
- Do not JOIN/APPLY UDFs directly with other tables; first extract and store the UDF’s records in a #Table, then JOIN the #Table with the other tables. Otherwise it may kill performance.
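A sketch of this pattern, assuming a hypothetical table-valued function dbo.fn_GetActiveCustomers() and a hypothetical dbo.Orders table:

```sql
-- Materialize the UDF output first:
SELECT CustomerID, CustomerName
INTO   #ActiveCustomers
FROM   dbo.fn_GetActiveCustomers();   -- hypothetical TVF

-- Then join the #Table, whose row counts the optimizer can estimate far better
-- than an inline UDF call:
SELECT o.OrderID, c.CustomerName
FROM   dbo.Orders AS o                -- hypothetical table
JOIN   #ActiveCustomers AS c
       ON c.CustomerID = o.CustomerID;
```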
- Use “WHERE EXISTS (SELECT * FROM <table>)” when evaluating sub-queries, and use “IN()” only when dealing with constant values.
- Prefer COALESCE() over ISNULL(); it is ANSI standard and accepts more than two arguments.
- Never use ISNULL() on the BIT data type, as it only accepts 1/0/NULL as the possible values.
- While comparing data against a NULLABLE column, ensure COALESCE is always used so that rows which actually contain NULLs are handled.
- Don’t use ISNUMERIC(); use TRY_PARSE() instead.
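A quick illustration of why, runnable on SQL Server 2012+:

```sql
-- ISNUMERIC() returns 1 for values that cannot actually be converted to a number:
SELECT ISNUMERIC('$'), ISNUMERIC('1e4'), ISNUMERIC(',');   -- all return 1

-- TRY_PARSE() returns NULL instead of lying or failing when the value
-- is not parseable as the target type:
SELECT TRY_PARSE('123' AS INT);      -- 123
SELECT TRY_PARSE('$'   AS INT);      -- NULL
```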
- Do not mix DDLs & DMLs in a Stored Procedure; put the DDLs at the top section of the SP, then the DMLs in the section below. Otherwise this leads to recompilation of the SP on every execution, and the optimal cached plan is not used.
- For temporary storage use @Table Variables for small sets of records, and #Tables for larger sets of data. For example, to store parameter values use @Table variables.
- Always use SET NOCOUNT ON statement in the beginning of SP.
- Use CTEs instead of Sub-Queries for better code manageability and readability.
- Handle divide-by-zero errors for columns/variables appearing in the denominator.
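A common way to guard the denominator is NULLIF(), sketched here with hypothetical variables:

```sql
-- NULLIF() turns a zero denominator into NULL, so the division returns NULL
-- instead of raising error 8134 (divide by zero); COALESCE() supplies a default:
SELECT COALESCE(@Numerator / NULLIF(@Denominator, 0), 0) AS SafeRatio;
```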
- In a WHERE clause, put OR conditions within brackets; otherwise they will interact with the other AND conditions and may behave differently than intended.
- In a WHERE clause, use columns from a LEFT JOINed table very carefully, as filtering on them can silently convert the LEFT JOIN into an INNER JOIN or otherwise change the results.
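A sketch of this pitfall, using hypothetical Customers/Orders tables:

```sql
-- The WHERE filter below discards customers with no orders (o.Status is NULL
-- for them), effectively turning the LEFT JOIN into an INNER JOIN:
SELECT c.CustomerID, o.OrderID
FROM   dbo.Customers AS c
LEFT JOIN dbo.Orders AS o ON o.CustomerID = c.CustomerID
WHERE  o.Status = 'Open';

-- Moving the condition into the ON clause preserves the outer join,
-- so customers without matching orders still appear:
SELECT c.CustomerID, o.OrderID
FROM   dbo.Customers AS c
LEFT JOIN dbo.Orders AS o ON o.CustomerID = c.CustomerID
                         AND o.Status = 'Open';
```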
- Do not include DROP & CREATE statements in the CREATE TABLE script; they should be added only in the deployment scripts.
- While designing new tables, ensure most of the columns are created as NOT NULLs.
- Please do not use “SELECT *” in Views or SPs; instead select only the required columns. Even if all columns are required, list them out explicitly instead of using “SELECT *”.
- Do not use the USE <db_name> and GO keywords in DB Object (SP, View, Table, etc.) scripts; they should be used only in custom Pre/Post SQL scripts.
- While CREATING or ALTERING any DB object, check its existence first by using an IF-ELSE condition, like:
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'TABLE_NAME' AND TABLE_SCHEMA = 'SCHEMA_NAME')
BEGIN
    CREATE TABLE dbo.TABLE_NAME (…)
END
- While sending Code Reviews for SSRS RDLs, also mention the URL where the reports are deployed and the changes made, with steps to populate the required parameters and the expected output.
I would also like to hear from you: what else do you check in your Code Review process?
Store file data to SQL without streaming on server – MSDN TSQL forum
–> Question:
I need to store file(s) to SQL without streaming / reading them on the server. I have created a Web API with AngularJS and SQL.
e.g.
var fileType = httpRequest.Files[file].ContentType;
var fileStrm = httpRequest.Files[file].InputStream;
var fileSize = httpRequest.Files[file].ContentLength;
byte[] fileRcrd = new byte[fileSize];
var file_Name = Path.GetFileName(filePath);
fileStrm.Read(fileRcrd, 0, fileSize);
Is it possible to send file data to SQL (in bytes) without streaming / reading it on the server?
I don’t want to put load on the server for large files; just read the data and send it to SQL, where SQL will do the streaming and store the data as varbinary.
–> Answer:
Store the file in the File System via the FileTable feature, which uses FILESTREAM out of the box.
Check this blog on how to set up and use FileTables, available starting with SQL Server 2012.
You can also set up Full-Text Search over these files here.
Ref link.
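A minimal sketch of creating a FileTable, assuming FILESTREAM is already enabled at the instance level; the database and directory names here are illustrative, not from the original post:

```sql
-- Allow non-transacted (file system) access to FILESTREAM data in this DB:
ALTER DATABASE MyDb   -- hypothetical database name
SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'MyDbFiles');
GO

-- A FileTable has a fixed schema; files dropped into its Windows share
-- appear as rows, with the content stored as varbinary(max) FILESTREAM:
CREATE TABLE dbo.DocumentStore AS FILETABLE
WITH (FILETABLE_DIRECTORY = 'DocumentStore',
      FILETABLE_COLLATE_FILENAME = database_default);
GO
```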
SQL Tips – Check count of records in Temporary Tables from other sessions
Recently a Stored Procedure was giving issues on our Production system. The SP had some temporary (#) tables that were getting populated in sequence, but the final result was not as expected. As temp tables cannot be accessed from other sessions, you cannot see what’s going on. Also, as it is a PROD system, I could not dissect or change the SP to see what was happening inside.
Debugging an SP on a DEV machine is simple: I can add some extra checks after the temp-table population code to get the record count by using “SELECT @@ROWCOUNT”, and was able to see the issue.
But this cannot be done on PROD, as you cannot alter the SPs there. So, to see which temp table is being populated, you can use the query below. It will also show the record count if any temp table is in the middle of being populated.
SELECT T.name AS TABLE_NAME, S.row_count
FROM TEMPDB.sys.dm_db_partition_stats AS S
INNER JOIN TEMPDB.sys.tables AS T
    ON S.object_id = T.object_id
WHERE S.index_id < 2
AND T.name LIKE '%#TempTable%'; -- Give #Table name here
SQL Error – Cannot create index on view ‘vw_SomeView’ because the view is not schema bound.
This error occurs when you try to create an Index on top of a View that was not created using the “WITH SCHEMABINDING” option.
When you create Index on top of a View they are called “Indexed View” or “Materialized View”.
–> Let’s check below:
USE [AdventureWorks2014]
GO

DROP VIEW IF EXISTS vw_Person
GO

CREATE VIEW vw_Person AS
SELECT [BusinessEntityID], [FirstName], [MiddleName], [LastName],
       [Suffix], [EmailPromotion], [ModifiedDate]
FROM [Person].[Person]
GO

CREATE UNIQUE CLUSTERED INDEX CIX_vw_Person ON [vw_Person]([BusinessEntityID])
GO
Msg 1939, Level 16, State 1, Line 18
Cannot create index on view 'vw_Person' because the view is not schema bound.
–> So, to fix this issue you can ALTER the view definition by adding the “WITH SCHEMABINDING” option to the ALTER/CREATE VIEW statement, as shown below.
Creating an Index on a schema-bound view will then not throw this error.
ALTER VIEW vw_Person
WITH SCHEMABINDING --<< here <<
AS
SELECT [BusinessEntityID], [FirstName], [MiddleName], [LastName],
       [Suffix], [EmailPromotion], [ModifiedDate]
FROM [Person].[Person]
GO

CREATE UNIQUE CLUSTERED INDEX CIX_vw_Person ON [vw_Person]([BusinessEntityID])
GO
–> By adding this rule or restriction SQL Server just wants to make sure that once an Index is created on a View, nobody can directly change the definition of the Table(s) underneath the View. If such a change is needed, the View must first be dropped and re-created after altering the tables (error shown below).
ALTER TABLE [Person].[Person] DROP COLUMN [Suffix]
Msg 5074, Level 16, State 1, Line 32
The object 'vw_Person' is dependent on column 'Suffix'.
Msg 4922, Level 16, State 9, Line 32
ALTER TABLE DROP COLUMN Suffix failed because one or more objects access this column.