SQL Optimization Tips

Article Posted On Date: Thursday, April 01, 2010



   1. Use views and stored procedures instead of heavy-duty queries. This can reduce network traffic, because the client sends the server only the stored procedure or view name (perhaps with some parameters) instead of the full text of a large query (a sketch follows this list). It also helps with permission management, because you can restrict user access to table columns they should not see.
   2. Use constraints instead of triggers whenever possible. Constraints are much more efficient than triggers and can boost performance (see the example after this list).
   3. Use table variables instead of temporary tables. Table variables require fewer locking and logging resources than temporary tables, so use them whenever possible (a sketch follows this list). Table variables are available in SQL Server 2000 and later.
   4. Use the UNION ALL statement instead of UNION whenever possible. UNION ALL is much faster than UNION, because UNION ALL does not look for duplicate rows, while UNION checks for duplicates whether or not any exist.
   5. Avoid the DISTINCT clause whenever possible. Removing duplicates causes some performance degradation, so use DISTINCT only when it is really necessary.
   6. Avoid SQL Server cursors whenever possible. Cursors can degrade performance compared with set-based SELECT statements. If you need to perform row-by-row operations, try a correlated sub-query or a derived table instead.
   7. Avoid the HAVING clause whenever possible. The HAVING clause restricts the result set produced by the GROUP BY clause: GROUP BY divides the rows into groups and aggregates their values, and HAVING then eliminates the undesired aggregated groups. In many cases you can write the SELECT statement with only WHERE and GROUP BY clauses, without HAVING, which can improve the performance of the query (see the sketch after this list).
   8. If you need to return a table's total row count, there is an alternative to the SELECT COUNT(*) statement. Because SELECT COUNT(*) performs a full table scan, it can take a long time on a large table. Instead, you can read the rows column of the sysindexes system table, which stores the total row count for each table in your database:

      SELECT rows
      FROM sysindexes WHERE id = OBJECT_ID('table_name') AND indid < 2

      This can make such queries several times faster, though the value in sysindexes is maintained separately and may not always be exactly current.
   9. Include a SET NOCOUNT ON statement in your stored procedures to stop the message indicating the number of rows affected by each T-SQL statement. This reduces network traffic, because the client no longer receives those row-count messages (a sketch follows this list).
  10. Restrict the query's result set with a WHERE clause. This can yield a good performance benefit, because SQL Server returns only the matching rows to the client rather than all rows from the table(s), which reduces network traffic and boosts the overall performance of the query.
  11. Use SELECT statements with the TOP keyword or the SET ROWCOUNT statement if you only need the first n rows. Returning a smaller result set improves query performance and reduces traffic between the server and the clients.
  12. Restrict the query's result set by returning only the particular columns you need, not all of the table's columns. This also reduces network traffic and boosts the overall performance of the query.
      Other areas worth reviewing for query performance:
      1. Indexes
      2. Avoiding a large number of triggers on a table
      3. Avoiding unnecessarily complicated joins
      4. Correct use of the GROUP BY clause with the select list
      5. In the worst cases, denormalization
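As a minimal sketch of tip 1, the heavy query text can live on the server inside a view and a stored procedure, so the client only sends a short EXEC call. All object names here (Orders, vw_ActiveOrders, usp_GetCustomerOrders) are hypothetical, not from the article:

      -- Hypothetical schema: keep the heavy query text on the server side.
      CREATE VIEW vw_ActiveOrders AS
      SELECT OrderID, CustomerID, OrderDate
      FROM Orders
      WHERE Status = 'ACTIVE'
      GO

      CREATE PROCEDURE usp_GetCustomerOrders @CustomerID INT AS
      SELECT OrderID, OrderDate
      FROM vw_ActiveOrders
      WHERE CustomerID = @CustomerID
      GO

      -- The client now sends only a short call instead of the full query text:
      -- EXEC usp_GetCustomerOrders @CustomerID = 42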
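For tip 2, a declarative constraint can often replace a validation trigger. The table and constraint names below are illustrative assumptions:

      -- Hypothetical example: enforce rules declaratively instead of in triggers.
      ALTER TABLE Orders
      ADD CONSTRAINT CK_Orders_Quantity CHECK (Quantity > 0)

      ALTER TABLE Orders
      ADD CONSTRAINT FK_Orders_Customers
          FOREIGN KEY (CustomerID) REFERENCES Customers (CustomerID)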
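For tip 3, a table variable can stand in for a #temp table when the intermediate result is small. Again, the objects are hypothetical:

      -- Hypothetical example: a table variable instead of CREATE TABLE #RecentOrders.
      DECLARE @RecentOrders TABLE
      (
          OrderID   INT PRIMARY KEY,
          OrderDate DATETIME
      )

      INSERT INTO @RecentOrders (OrderID, OrderDate)
      SELECT OrderID, OrderDate
      FROM Orders
      WHERE OrderDate >= DATEADD(day, -7, GETDATE())

      SELECT OrderID, OrderDate FROM @RecentOrders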
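For tip 7, a filter that does not involve an aggregate can move from HAVING to WHERE, so rows are discarded before grouping rather than after. A rough sketch on a hypothetical Orders table:

      -- Slower: all rows are grouped first, then whole groups are thrown away.
      SELECT CustomerID, SUM(Amount) AS Total
      FROM Orders
      GROUP BY CustomerID
      HAVING CustomerID > 1000

      -- Faster: rows are filtered before grouping; keep HAVING only for
      -- conditions on aggregates, e.g. HAVING SUM(Amount) > 500.
      SELECT CustomerID, SUM(Amount) AS Total
      FROM Orders
      WHERE CustomerID > 1000
      GROUP BY CustomerID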
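For tip 9, SET NOCOUNT ON is usually the first statement of the procedure body. A minimal sketch with hypothetical names:

      CREATE PROCEDURE usp_UpdateOrderStatus @OrderID INT, @Status VARCHAR(20) AS
      SET NOCOUNT ON   -- suppress the "(n row(s) affected)" messages

      UPDATE Orders
      SET Status = @Status
      WHERE OrderID = @OrderID
      GO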

Index Optimization Tips

   1. Every index increases the time it takes to perform INSERTs, UPDATEs, and DELETEs, so the number of indexes should be kept small. Try to use at most 4-5 indexes per table. If a table is read-only, the number of indexes can be higher.
   2. Keep your indexes as narrow as possible. This reduces the size of the index and reduces the number of reads required to read the index.
   3. Try to create indexes on columns that have integer values rather than character values.
   4. If you create a composite (multi-column) index, the order of the columns in the key is very important. Order the columns to maximize selectivity, with the most selective columns leftmost in the key (see the sketch after this list).
   5. If you need to join several tables, consider creating surrogate integer keys for that purpose and creating indexes on those columns.
   6. Create a surrogate integer primary key (an identity column, for example) if your table will not have many insert operations.
   7. Clustered indexes are preferable to nonclustered indexes when you need to select a range of values or sort the result set with GROUP BY or ORDER BY.
   8. If your application performs the same query over and over on the same table, consider creating a covering index on the table (a sketch follows this list).
   9. You can use the SQL Server Profiler Create Trace Wizard with the "Identify Scans of Large Tables" trace to determine which tables in your database may need indexes. The trace shows which tables are being scanned by queries instead of being accessed through an index.
  10. You can use the undocumented sp_MSforeachtable stored procedure to rebuild all indexes in your database. Try to schedule it during CPU idle time and slow production periods:

      EXEC sp_MSforeachtable @command1 = "print '?' DBCC DBREINDEX ('?')"
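As a sketch of index tip 4, assuming a hypothetical Orders table where CustomerID is far more selective than Status, the more selective column goes first in the composite key:

      -- Most selective column (CustomerID) leftmost in the key.
      CREATE INDEX IX_Orders_CustomerID_Status
      ON Orders (CustomerID, Status)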
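And for index tip 8, a covering index contains every column a frequent query touches, so the query can be answered from the index alone without reading the base table. The query and index below are illustrative assumptions:

      -- Frequent query to cover:
      --   SELECT OrderID, OrderDate FROM Orders WHERE CustomerID = @CustomerID
      CREATE INDEX IX_Orders_Customer_Covering
      ON Orders (CustomerID, OrderID, OrderDate)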




