Thursday 29 December 2011

SQL Server receives a user query

When SQL Server receives a user query, it follows well-defined steps to produce a result for the user.

Step-1

First, it creates a compiled plan, which is a set of logical instructions such as how to join rows.

Step-2

Next, it creates an execution plan based on the compiled plan. This execution plan contains instructions with all logical references translated to actual objects, and tracking mechanisms for query execution.

Step-3

Finally, the server starts execution from the top of the instruction tree.

Creating a compiled plan is expensive because the server needs to find the optimal plan out of hundreds of potential candidates. Distinguishing between compile and execution helps overall server performance because each compiled plan can be cached and shared among multiple execution plans. The memory grant estimate follows the same overall sequence: its parameters are saved in the compiled plan, and a mechanism calculates the actual grant size at execution time.
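You can observe this caching and reuse directly. The following is a minimal sketch using the plan cache DMVs available since SQL Server 2005; it lists cached compiled plans together with how many times each has been reused:

SELECT cp.usecounts,   -- number of times this compiled plan was reused
       cp.objtype,     -- Adhoc, Prepared, Proc, ...
       st.text         -- the statement that produced the plan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
ORDER BY cp.usecounts DESC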

A successful query execution involves 3 major memory consumers: compile, cache, and memory grant.

  • Compile: Building and searching for the most optimal plan out of hundreds of candidates typically requires a significant amount of memory. The lifetime of this usage is typically short because the optimizer releases the memory as soon as the optimal plan is found. Lack of available memory would cause delays in compilation, and potentially inefficient (slow) plans.

  • Cache: Finding the optimal plan is costly in terms of CPU and memory usage. SQL Server tries to store compiled plans in cache for later reuse. The lifetime of this memory usage is long-term. Lack of cache memory would cause more unnecessary recompiles.

  • Memory grant: This memory is used to store temporary rows for sorts and hash joins. The lifetime of a memory grant is the same as the lifetime of the query. Lack of available memory grant causes a query to spill to disk, which affects query performance.

SQL Server maintains the balance between these 3 consumers with an internal facility called the "memory broker". Based on the usage and the physical memory available, the memory broker sets the usage limit and tells each component to trim its memory usage if it anticipates a shortage. Generally, a well-behaved server would have about the same contributions from these consumers.
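To watch the memory grant consumer described in the list above at work, you can query the memory grant DMV, available since SQL Server 2005; a small sketch that shows queries currently holding or waiting for a grant:

SELECT session_id,
       requested_memory_kb,   -- grant size calculated at execution time
       granted_memory_kb,     -- what the server actually granted
       used_memory_kb         -- what the query is using right now
FROM sys.dm_exec_query_memory_grants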

Posted by: MR. JOYDEEP DAS

 

 

Type of SP

There are many types of stored procedures, but broadly we can classify them into three categories.

 

1.    User-defined stored procedure

2.    External stored procedure

3.    System stored procedure

 

User-defined stored procedure:

 

This type is essentially a routine: a bundle of individual statements grouped together to perform a set of tasks.

 

It is again divided into two types:

a.    Transact-SQL

b.    CLR

 

A Transact-SQL stored procedure contains Transact-SQL statements that work together to perform a specific task, such as a stored procedure that performs some DML or DDL activity.
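For example, a minimal Transact-SQL stored procedure wrapping a DML task might look like this (the table and procedure names here are illustrative only):

CREATE PROCEDURE usp_AddStudent
    @ids   varchar(1),
    @sname varchar(50)
AS
BEGIN
    SET NOCOUNT ON
    -- a simple DML activity bundled into a reusable routine
    INSERT INTO students (ids, sname) VALUES (@ids, @sname)
END
GO

EXEC usp_AddStudent '1', 'ABC'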

 

A CLR stored procedure is a reference to a Microsoft .NET Framework common language runtime (CLR) method that can take and return user-supplied parameters. They are implemented as public, static methods on a class in a .NET Framework assembly.
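The registration of a CLR stored procedure is itself done in Transact-SQL. A sketch with hypothetical assembly, class, and method names (CLR integration must also be enabled on the server):

-- The assembly is first created from a compiled .NET DLL (path is illustrative)
CREATE ASSEMBLY MyClrLibrary FROM 'C:\libs\MyClrLibrary.dll'
GO
-- Bind a procedure name to the public static method inside the assembly
CREATE PROCEDURE usp_ClrHello
AS EXTERNAL NAME MyClrLibrary.[MyNamespace.StoredProcs].HelloWorld
GO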

 

External Stored procedure:

 

It is used to create external routines in any programming language, such as C.

External stored procedures are DLLs that SQL Server loads dynamically and runs.

External stored procedures run in the same memory space as the program that runs the external stored procedure API.

Their names generally take the xp_ prefix.

 

System Stored procedures:

 

Many administrative tasks are performed through system stored procedures.

For example, sys.sp_helptrigger is a system procedure. System stored procedures are always prefixed with sp_. They logically appear in the sys schema of every system and user-defined database.
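For example, assuming a table named t1 exists and has triggers defined on it:

-- List the triggers defined on table t1 and the events they cover
EXEC sp_helptrigger 't1'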

 

For a better understanding, please refer to MSDN:

http://msdn.microsoft.com/en-us/library/ms187644(v=sql.90).aspx

 

 

Posted by : MR.JOYDEEP DAS

 

Problem with Forwarding Pointers?


Forwarding pointers in a heap are a bad thing for database performance.
Here is how they arise: when a row in a heap is updated and the new data no longer fits on the same page, SQL Server moves the row to another page and leaves a forwarding pointer behind at the original location. Non-clustered indexes still point to that original location, so every lookup must follow the pointer to reach the actual row.
If you have many forwarding pointers, performance decreases because each access through a non-clustered index costs an extra page read.
So what do we do to solve this problem?
SQL Server 2008 has the solution.
Use
ALTER TABLE <Tablename> REBUILD
It removes the forwarding pointers and rebuilds all the indexes related to the table.
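To check how many forwarding pointers a heap actually has before rebuilding it, you can use sys.dm_db_index_physical_stats; a sketch (the table name is illustrative):

-- forwarded_record_count is reported for heaps in DETAILED (or SAMPLED) mode only
SELECT index_type_desc, forwarded_record_count
FROM sys.dm_db_index_physical_stats
     (DB_ID(), OBJECT_ID('dbo.MyHeapTable'), NULL, NULL, 'DETAILED')

-- If the count is high, remove the forwarding pointers
ALTER TABLE dbo.MyHeapTable REBUILD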
Posted by: MR. JOYDEEP DAS

Wednesday 28 December 2011

INSTEAD OF INSERT trigger

First we create a table as described below:

CREATE TABLE t1

            (ids   varchar(1),

             sname varchar(50))

 

Now create a trigger on it (a normal AFTER INSERT trigger):

 CREATE TRIGGER t1_trig ON t1

 FOR INSERT

 AS

 INSERT INTO t1

 SELECT ids, sname FROM inserted

 

What happens when we run an INSERT statement on table t1 like this?

INSERT INTO t1

VALUES('1', 'ABC')

It affects 2 records, because the trigger's insert runs in addition to the original statement:

ids       sname

1          ABC

1          ABC

 

Now replace the trigger with this

DROP TRIGGER t1_trig

 

CREATE TRIGGER t1_trig  on t1

INSTEAD OF INSERT

AS

BEGIN

  INSERT INTO t1

       SELECT ids, sname FROM inserted

END

 

Now insert the value

INSERT INTO t1

VALUES('1', 'DEF')

It affects 1 record, because the INSTEAD OF trigger runs in place of the original statement:

ids       sname

1          DEF

 

Now do you see the difference? The normal (AFTER) trigger runs in addition to the original insert, so the row appears twice, while the INSTEAD OF trigger runs in place of the original insert, so the row appears once. INSTEAD OF triggers have been available since SQL Server 2000.
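A common use of INSTEAD OF INSERT triggers is on views, where they intercept inserts and redirect them to the base table. A minimal sketch reusing table t1:

CREATE VIEW v1
AS SELECT ids, sname FROM t1
GO

CREATE TRIGGER v1_trig ON v1
INSTEAD OF INSERT
AS
BEGIN
    -- redirect rows inserted through the view into the base table
    INSERT INTO t1 (ids, sname)
    SELECT ids, sname FROM inserted
END
GO

INSERT INTO v1 VALUES ('2', 'GHI')   -- lands in t1 via the trigger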

References : http://msdn.microsoft.com/en-us/library/ms175089.aspx

Posted by : MR. JOYDEEP DAS

 

 

Trigger – DDL?


When we think about triggers, we always think about DML triggers, but DDL triggers also play a very important role in a database.
SQL Server 2005 introduced a new kind of trigger called the DDL trigger. DDL means "Data Definition Language", that is, statements such as CREATE TABLE, ALTER TABLE, CREATE PROC, etc. What is great about these triggers is that once they are set up on the database or server, you can capture the DDL statement and automatically log it to a change log. You have no idea how much easier that is than doing it manually. Also, the laziness in all of us gets a little boost.
The DDL trigger has two types of scope:
1.    Server-scoped statements
2.    Database-scoped statements
In "server-scoped statement", the triggered worked on entire SQL server instance, no matter how many database exists within this instance.
But in the "database-scoped statements", the triggers work on a particular database.
Server-scoped statements
ALTER_AUTHORIZATION_SERVER
CREATE_DATABASE
ALTER_DATABASE
DROP_DATABASE
CREATE_ENDPOINT
DROP_ENDPOINT
CREATE_LOGIN
ALTER_LOGIN
DROP_LOGIN
GRANT_SERVER
DENY_SERVER
REVOKE_SERVER
Database-scoped statements
CREATE_APPLICATION_ROLE
ALTER_APPLICATION_ROLE
DROP_APPLICATION_ROLE
CREATE_ASSEMBLY
ALTER_ASSEMBLY
DROP_ASSEMBLY
ALTER_AUTHORIZATION_DATABASE
CREATE_CERTIFICATE
ALTER_CERTIFICATE
DROP_CERTIFICATE
CREATE_CONTRACT
DROP_CONTRACT
GRANT_DATABASE
DENY_DATABASE
REVOKE_DATABASE
CREATE_EVENT_NOTIFICATION
DROP_EVENT_NOTIFICATION
CREATE_FUNCTION
ALTER_FUNCTION
DROP_FUNCTION
CREATE_INDEX
ALTER_INDEX
DROP_INDEX
CREATE_MESSAGE_TYPE
ALTER_MESSAGE_TYPE
DROP_MESSAGE_TYPE
CREATE_PARTITION_FUNCTION
ALTER_PARTITION_FUNCTION
DROP_PARTITION_FUNCTION
CREATE_PARTITION_SCHEME
ALTER_PARTITION_SCHEME
DROP_PARTITION_SCHEME
CREATE_PROCEDURE
ALTER_PROCEDURE
DROP_PROCEDURE
CREATE_QUEUE
ALTER_QUEUE
DROP_QUEUE
CREATE_REMOTE_SERVICE_BINDING
ALTER_REMOTE_SERVICE_BINDING
DROP_REMOTE_SERVICE_BINDING
CREATE_ROLE
ALTER_ROLE
DROP_ROLE
CREATE_ROUTE
ALTER_ROUTE
DROP_ROUTE
CREATE_SCHEMA
ALTER_SCHEMA
DROP_SCHEMA
CREATE_SERVICE
ALTER_SERVICE
DROP_SERVICE
CREATE_STATISTICS
DROP_STATISTICS
UPDATE_STATISTICS
CREATE_SYNONYM
DROP_SYNONYM
CREATE_TABLE
ALTER_TABLE
DROP_TABLE
CREATE_TRIGGER
ALTER_TRIGGER
DROP_TRIGGER
CREATE_TYPE
DROP_TYPE
CREATE_USER
ALTER_USER
DROP_USER
CREATE_VIEW
ALTER_VIEW
DROP_VIEW
CREATE_XML_SCHEMA_COLLECTION
ALTER_XML_SCHEMA_COLLECTION
DROP_XML_SCHEMA_COLLECTION

Syntax to Create DDL triggers
CREATE TRIGGER [name of trigger]
ON [scope (database|server)]
FOR [event]
    As
    --Trigger definition here…
An example of a DDL trigger
CREATE TRIGGER backup_procs
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS

SET NOCOUNT ON

DECLARE @data XML
SET @data = EVENTDATA()

INSERT INTO dbo.eventslog
           (eventtype, objectname, objecttype, sqlcommand, username)
VALUES     (@data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)'),
            @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)'),
            @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)'),
            @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)'),
            @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)')
)
GO
The script above creates a DDL trigger called backup_procs. It inserts a row into the events log table whenever a procedure is created, altered, or dropped.
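Note that the trigger assumes a dbo.eventslog table already exists. A minimal definition matching the INSERT column list above might be:

CREATE TABLE dbo.eventslog
      (eventtype  varchar(50),
       objectname varchar(256),
       objecttype varchar(25),
       sqlcommand varchar(max),
       username   varchar(256),
       logged_at  datetime DEFAULT GETDATE())  -- optional audit column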
For a better understanding of DDL triggers, please refer to MSDN.

Posted by: MR. JOYDEEP DAS

Tuesday 27 December 2011

SSIS Control Flow and Data Flow

In SSIS we must have some solid concepts about Control Flow and Data Flow before going into any complicated scenario. I have tried to collect some facts related to them; they may help you, but I always prefer MSDN for better guidance.

Control flow deals with orderly processing of tasks, which are individual, isolated units of work that perform a specific action ending with a finite outcome (one that can be evaluated as either Success, Failure, or Completion). While their sequence can be customized by linking them into arbitrary arrangements with precedence constraints and grouping them together or repeating their execution in a loop with the help of containers, a subsequent task does not initiate unless its predecessor has completed.

Data flow, on the other hand, handles its processing responsibilities by employing the pipeline paradigm, carrying data record by record (or rather, to be more accurate, memory buffer by memory buffer) from its source to a destination and modifying it in transit by applying transformations. (There are exceptions to this rule, since some of them, such as Sort or Aggregate, require the ability to view the entire data set before handing it over to their downstream counterparts.) Note that this does not imply that tasks cannot be executed in parallel, but rather that if they do, their actions are not coordinated (unlike processing of data flow components that are part of the same data stream). Another distinction between them is the absence of a mechanism that would allow direct transfer of data between individual control flow tasks. On the other hand, data flow lacks the nesting capabilities provided by containers.

These two SQL Server Integration Services features are implemented in the form of two tabs (bearing their respective names) of the Designer interface in the Business Intelligence Development Studio. The control flow portion of a package is constructed by populating the area exposed by the first of these tabs, typically by dragging tasks and containers delivering desired functionality from the Toolbox. The same methodology is applied when adding data sources, destinations, and transformations to the area exposed by the Data Flow tab (with the Toolbox adjusting its content depending on the context).

Summary at a glance

Control Flow:

 

  • Process is the key: precedence constraints control the package flow based on task completion, success, or failure
  • Task 1 needs to complete before task 2 begins
  • Smallest unit of the control flow is a task
  • Control flow does not move data from task to task
  • Tasks run in series if connected with precedence constraints, or in parallel if not connected
  • Package control flow is made up of containers and tasks connected with precedence constraints to control package flow

Data Flow:

  • Streaming
  • Unlike control flow, multiple components can process data at the same time
  • Smallest unit of the data flow is a component
  • Data flows move data, but they are also tasks in the control flow; as such, their success or failure affects how your control flow operates
  • Data is moved and manipulated through transformations
  • Data is passed between each component in the data flow
  • Data flow is made up of source(s), transformations, and destinations.

To be continued…

Posted By: MR. JOYDEEP DAS

 

Friday 23 December 2011

CONCAT()

CONCAT() is new in SQL Server 2012; it concatenates two or more string values of varying lengths into a single string.

It is available only in SQL Server 2012 and later.

Syntax

CONCAT ( string_value1, string_value2 [, string_valueN ] )
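A quick example. Note that CONCAT() treats NULL arguments as empty strings, unlike the + operator (under default settings):

SELECT CONCAT('Joydeep', ' ', 'Das')    -- returns 'Joydeep Das'
SELECT CONCAT('Joydeep', NULL, 'Das')   -- returns 'JoydeepDas'
SELECT 'Joydeep' + NULL + 'Das'         -- returns NULL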

 

For detailed information, please look at:

http://msdn.microsoft.com/en-us/library/hh231515%28SQL.110%29.aspx

 

Posted By: MR. JOYDEEP DAS

 

Thursday 22 December 2011

Trick- Date Range

To display a continuous series of dates between two dates, use this trick:

SET DATEFORMAT DMY
GO

-- Build an auxiliary Numbers table (1..10000) once, if it does not already exist
IF NOT EXISTS(SELECT * FROM sysobjects WHERE xtype='U' and name='Numbers')
      BEGIN
            SELECT TOP 10000 IDENTITY(int,1,1) AS Number
                     INTO Numbers
              FROM sys.columns s1
                   CROSS JOIN sys.columns s2

            ALTER TABLE Numbers ADD CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED (Number)
      END
GO

DECLARE @Start DATETIME
DECLARE @End DATETIME

-- dates are read as day-month-year because of SET DATEFORMAT DMY above
SET @Start='01-01-2011'
SET @End='28-02-2011'

-- one row per day: Number offsets @Start by 0..DATEDIFF(day,@Start,@End)
SELECT
    @Start+Number-1
    FROM Numbers
    WHERE Number<=DATEDIFF(day,@Start,@End)+1
GO

 

Posted by: MR. JOYDEEP DAS

 

SSIS Connection manager

After my first posting about the Microsoft BI tool SSIS, I am now going to post my second scenario: the connection manager. We should have a clear understanding of connection managers before going into the deeper points.

SSIS uses connection managers to integrate different data sources into packages. SSIS includes a wide variety of connection managers that allow you to move data from place to place.

Connection Manager and what it handles:

ADO Connection Manager: Connecting to ADO objects such as a Recordset.
ADO.NET Connection Manager: Connecting to data sources through an ADO.NET provider.
Analysis Services Connection Manager: Connecting to an Analysis Services database or cube.
Excel Connection Manager: Connecting to an Excel worksheet.
File Connection Manager: Connecting to a file or folder.
Flat File Connection Manager: Connecting to delimited or fixed-width flat files.
FTP Connection Manager: Connecting to an FTP data source.
HTTP Connection Manager: Connecting to an HTTP data source.
MSMQ Connection Manager: Connecting to a Microsoft Message Queue.
Multiple Files Connection Manager: Connecting to a set of files, such as all text files on a particular hard drive.
Multiple Flat Files Connection Manager: Connecting to a set of flat files.
ODBC Connection Manager: Connecting to an ODBC data source.
OLE DB Connection Manager: Connecting to an OLE DB data source.
SMO Connection Manager: Connecting to a server via SMO.
SMTP Connection Manager: Connecting to a Simple Mail Transfer Protocol server.
SQL Server Mobile Connection Manager: Connecting to a SQL Server Mobile database.
WMI Connection Manager: Connecting to Windows Management Instrumentation data.

 

To be continued …

Posted By: MR. JOYDEEP DAS