database - Will denormalization improve performance in SQL? Advice appreciated


I want to speed up our SQL queries. I have started reading a book on data warehousing, where you keep a separate reporting database with the data laid out in different tables. The problem is that I do not want to create a separate reporting database for each of our customers, for a few reasons:

  1. We have more than 200 customers, so maintaining that many separate databases would be a significant burden.
  2. Reporting data needs to be available immediately.

I was also thinking: if I report directly against the tables we have now, the queries will be expensive, because the tables are so large (one table has about 20,000,000 rows). So if I copied data from several tables into one denormalized table, would that speed things up a little? I know that copying data everywhere has its problems, but a copy could be good enough as a snapshot of a point in history...
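To make the idea concrete, something like the following is roughly what I have in mind (a hypothetical sketch in PostgreSQL; the `orders`/`customers` tables and their columns are made up):

```sql
-- Hypothetical sketch of the "copied, denormalized table" idea
-- (PostgreSQL syntax; table and column names are invented).
CREATE MATERIALIZED VIEW report_orders AS
SELECT o.order_id,
       o.order_date,
       o.amount,
       c.customer_name,
       c.region
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;

-- Rebuild the snapshot on whatever schedule reporting can tolerate;
-- until then it serves reads as a point-in-time copy.
REFRESH MATERIALIZED VIEW report_orders;
```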

Does anyone with experience of this have advice to point me in the right direction?

Denormalization is not guaranteed to improve performance.

Have you considered tuning the queries in your application instead? Take a look at what the reports are actually running, and identify places where you could add indexes and partitioning. Most of the reports probably only look at the last month's data, so if you partition the data by month, only a small part of the table needs to be read when that is all a query asks for. Compare that with one big denormalized table, which can require a huge full table scan instead of a few index scans...
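As a rough sketch of the partitioning idea (PostgreSQL 11+ declarative partitioning assumed; all names here are invented for illustration):

```sql
-- Range-partition the fact table by month so a "last month" report
-- only touches one small partition (PostgreSQL 11+; names invented).
CREATE TABLE report_facts (
    fact_id    bigint        NOT NULL,
    created_at timestamptz   NOT NULL,
    amount     numeric(12,2) NOT NULL
) PARTITION BY RANGE (created_at);

CREATE TABLE report_facts_2024_01 PARTITION OF report_facts
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
CREATE TABLE report_facts_2024_02 PARTITION OF report_facts
    FOR VALUES FROM ('2024-02-01') TO ('2024-03-01');

-- An index on the partition key is created on each partition, so
-- date-bounded queries become a few index scans, not a full scan.
CREATE INDEX ON report_facts (created_at);
```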

Your question is a very common one. Talk to your DBA about the reports, and look at the queries (and their execution plans) to see what you can do.
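For example, in PostgreSQL you can ask for the plan directly (SQL Server and Oracle have equivalent plan viewers; the query below is hypothetical and reuses the invented names from above):

```sql
-- Run the query and show what the planner actually did.
EXPLAIN ANALYZE
SELECT date_trunc('month', created_at) AS month,
       sum(amount) AS total
FROM   report_facts
WHERE  created_at >= date '2024-02-01'
GROUP  BY 1;
```

If the output shows a sequential scan over the whole table where you expected an index scan or partition pruning, that is exactly where indexes or partitioning will pay off.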
