Re: [OT] SQL Server DB designed for reporting

2019-01-31 Thread David Burstin
>> It's only if you were building a green-fields application that you'd
>> look into CQRS - it's not something practical for your existing line of
>> business apps.

Hi Nathan,

My response is that "it depends". I would certainly not advocate a fully
separate database except in exceptional circumstances. However, CQRS is
primarily an approach to having separate *models* for CUD
(create/update/delete) vs R (read). I would argue that a SQL view is a
form of CQRS, as it supplies a specific data model that is used for reads
only. So, in the sense of having different views of the data for
reading/reporting, and leveraging the experience and knowledge of CQRS
practitioners, there is a good opportunity to find a solution for Tom's
problem and answer any questions that come up
(eg around synchronization if required).
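
For example, one step up from a plain view, an indexed view gives you a
materialised read model that SQL Server keeps in sync on every write. A
minimal sketch (all table and column names here are hypothetical, not
Tom's actual schema):

    -- Hypothetical denormalised read model over normalised OLTP tables.
    -- Indexed views require SCHEMABINDING, inner joins only, COUNT_BIG,
    -- and non-nullable columns under SUM.
    CREATE VIEW dbo.OrderSummary
    WITH SCHEMABINDING
    AS
    SELECT o.CustomerId,
           c.CustomerName,
           COUNT_BIG(*) AS OrderCount,
           SUM(ol.Quantity * ol.UnitPrice) AS TotalValue
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.CustomerId = o.CustomerId
    JOIN dbo.OrderLines ol ON ol.OrderId = o.OrderId
    GROUP BY o.CustomerId, c.CustomerName;
    GO

    -- The unique clustered index is what materialises the view.
    CREATE UNIQUE CLUSTERED INDEX IX_OrderSummary
        ON dbo.OrderSummary (CustomerId, CustomerName);

Reports then query dbo.OrderSummary directly, and the write side of the
application never needs to know it exists.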

I believe this approach is entirely applicable to current LoB applications
- the implementation is just a question of degree, and the minimum that
meets the requirements is best. I again suggest looking at the work of
Vladimir Khorikov, including the Pluralsight course, as it walks through
the levels of CQRS from simple views right through to separate databases.
Somewhere in that range is the sweet spot between responsive reporting and
implementation work/risk.

YMMV

Cheers
David

On Fri, 1 Feb 2019 at 15:45, Nathan Schultz  wrote:

> +1 to Tony's answer. Use SSIS (Integration Services) to export to a
> reporting database or data warehouse.
> From there you can use Power BI (for dashboards) or something like SSRS
> (SQL Server Reporting Services) for traditional print invoices and the like.
>
> It's only if you were building a green-fields application that you'd look
> into CQRS - it's not something practical for your existing line of business
> apps.
>
>
> On Fri, 1 Feb 2019 at 07:55, Tony McGee  wrote:
>
>> One way to do this is to have a regular job that uses SQL Server
>> Integration Services package(s) to extract data from the source system,
>> transform and cleanse it, then load it into a destination database (ETL).
>>
>> Often, but not always, the destination database has a star schema with
>> fact and dimension tables that a data warehouse can be built upon. Tools
>> like SQL Server Analysis Services or Excel/Power BI can understand and
>> slice the data for the user interface.
>>
>> For large volumes of data there's usually an initial load, then
>> subsequent runs capture any changed data from the source system and load
>> it incrementally.
>>
>>
>> On Fri, 1 Feb 2019, 08:27 Tom P wrote:
>>> Hi Folks
>>>
>>> I have a normalised database for an application that is working well,
>>> but it is not performant for reporting. I’ve tried and tried to optimise
>>> my reporting queries, but they are just too slow because the data volume
>>> in the database is large. What techniques are there to tackle this?
>>>
>>> I was thinking of creating denormalised tables for the reports, which
>>> would work, but how do I automate this so that whenever the main
>>> normalised tables are updated the changes “flow into” the denormalised
>>> reporting tables and stay in sync?
>>>
>>> I’m sure this is not a new problem, so some of the people here have
>>> surely been in this situation. Any advice would be appreciated.
>>>
>>> Cheers
>>> Tom
>>> --
>>> Thanks
>>> Tom
>>>
>>


Re: [OT] SQL Server DB designed for reporting

2019-01-31 Thread Nathan Schultz
+1 to Tony's answer. Use SSIS (Integration Services) to export to a
reporting database or data warehouse.
From there you can use Power BI (for dashboards) or something like SSRS
(SQL Server Reporting Services) for traditional print invoices and the like.

It's only if you were building a green-fields application that you'd look
into CQRS - it's not something practical for your existing line of business
apps.


On Fri, 1 Feb 2019 at 07:55, Tony McGee  wrote:

> One way to do this is to have a regular job that uses SQL Server
> Integration Services package(s) to extract data from the source system,
> transform and cleanse it, then load it into a destination database (ETL).
>
> Often, but not always, the destination database has a star schema with
> fact and dimension tables that a data warehouse can be built upon. Tools
> like SQL Server Analysis Services or Excel/Power BI can understand and
> slice the data for the user interface.
>
> For large volumes of data there's usually an initial load, then subsequent
> runs capture any changed data from the source system and load it
> incrementally.
>
>
> On Fri, 1 Feb 2019, 08:27 Tom P wrote:
>> Hi Folks
>>
>> I have a normalised database for an application that is working well,
>> but it is not performant for reporting. I’ve tried and tried to optimise
>> my reporting queries, but they are just too slow because the data volume
>> in the database is large. What techniques are there to tackle this?
>>
>> I was thinking of creating denormalised tables for the reports, which
>> would work, but how do I automate this so that whenever the main
>> normalised tables are updated the changes “flow into” the denormalised
>> reporting tables and stay in sync?
>>
>> I’m sure this is not a new problem, so some of the people here have
>> surely been in this situation. Any advice would be appreciated.
>>
>> Cheers
>> Tom
>> --
>> Thanks
>> Tom
>>
>


Re: [OT] SQL Server DB designed for reporting

2019-01-31 Thread Tony McGee
One way to do this is to have a regular job that uses SQL Server
Integration Services package(s) to extract data from the source system,
transform and cleanse it, then load it into a destination database (ETL).

Often, but not always, the destination database has a star schema with fact
and dimension tables that a data warehouse can be built upon. Tools like
SQL Server Analysis Services or Excel/Power BI can understand and slice the
data for the user interface.
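
As a rough sketch of that shape (hypothetical names, not from any real
source system):

    -- Hypothetical star schema: one fact table ringed by dimensions.
    CREATE TABLE dbo.DimDate (
        DateKey  int      NOT NULL PRIMARY KEY,  -- e.g. 20190201
        [Date]   date     NOT NULL,
        [Month]  tinyint  NOT NULL,
        [Year]   smallint NOT NULL
    );

    CREATE TABLE dbo.DimCustomer (
        CustomerKey  int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key
        CustomerId   int NOT NULL,                           -- business key
        CustomerName nvarchar(200) NOT NULL
    );

    -- Measures live in the fact table; each row points at its dimensions.
    CREATE TABLE dbo.FactSales (
        DateKey     int NOT NULL REFERENCES dbo.DimDate (DateKey),
        CustomerKey int NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
        Quantity    int NOT NULL,
        Amount      decimal(18, 2) NOT NULL
    );

Analysis tools then aggregate the fact measures grouped by whichever
dimension attributes the user drags in.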

For large volumes of data there's usually an initial load, then subsequent
runs capture any changed data from the source system and load it
incrementally.
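
For the incremental part, one common pattern (again just a sketch with
made-up names) is a high-water mark on a last-modified column:

    -- Hypothetical incremental extract: pull only rows changed since the
    -- last successful run, using a watermark stored in the warehouse.
    DECLARE @LastLoad datetime2 = (SELECT WatermarkValue
                                   FROM dw.EtlWatermark
                                   WHERE TableName = 'Orders');

    SELECT o.OrderId, o.CustomerId, o.OrderDate, o.ModifiedDate
    FROM src.Orders AS o
    WHERE o.ModifiedDate > @LastLoad;

    -- After a successful load, advance the watermark to the newest row
    -- actually loaded, so nothing is skipped between runs.
    UPDATE dw.EtlWatermark
    SET WatermarkValue = (SELECT MAX(ModifiedDate) FROM src.Orders)
    WHERE TableName = 'Orders';

SQL Server's built-in Change Tracking or CDC can stand in for the
timestamp column if the source tables don't have one.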


On Fri, 1 Feb 2019, 08:27 Tom P wrote:

> Hi Folks
>
> I have a normalised database for an application that is working well, but
> it is not performant for reporting. I’ve tried and tried to optimise my
> reporting queries, but they are just too slow because the data volume in
> the database is large. What techniques are there to tackle this?
>
> I was thinking of creating denormalised tables for the reports, which
> would work, but how do I automate this so that whenever the main
> normalised tables are updated the changes “flow into” the denormalised
> reporting tables and stay in sync?
>
> I’m sure this is not a new problem, so some of the people here have
> surely been in this situation. Any advice would be appreciated.
>
> Cheers
> Tom
> --
> Thanks
> Tom
>


Re: [OT] SQL Server DB designed for reporting

2019-01-31 Thread David Burstin
Might be worth looking into CQRS. Lots of info around about separating
databases for reporting and synchronization strategies. Pluralsight has
quite a few good courses on CQRS - eg
https://www.pluralsight.com/courses/cqrs-in-practice - especially the
chapter called "Synchronizing the command and query databases".

Most of the stuff by Vladimir Khorikov is pretty good too.

HTH.


On Fri, 1 Feb 2019 at 09:27, Tom P  wrote:

> Hi Folks
>
> I have a normalised database for an application that is working well, but
> it is not performant for reporting. I’ve tried and tried to optimise my
> reporting queries, but they are just too slow because the data volume in
> the database is large. What techniques are there to tackle this?
>
> I was thinking of creating denormalised tables for the reports, which
> would work, but how do I automate this so that whenever the main
> normalised tables are updated the changes “flow into” the denormalised
> reporting tables and stay in sync?
>
> I’m sure this is not a new problem, so some of the people here have
> surely been in this situation. Any advice would be appreciated.
>
> Cheers
> Tom
> --
> Thanks
> Tom
>


[OT] SQL Server DB designed for reporting

2019-01-31 Thread Tom P
Hi Folks

I have a normalised database for an application that is working well, but
it is not performant for reporting. I’ve tried and tried to optimise my
reporting queries, but they are just too slow because the data volume in
the database is large. What techniques are there to tackle this?

I was thinking of creating denormalised tables for the reports, which
would work, but how do I automate this so that whenever the main
normalised tables are updated the changes “flow into” the denormalised
reporting tables and stay in sync?

I’m sure this is not a new problem, so some of the people here have surely
been in this situation. Any advice would be appreciated.

Cheers
Tom
-- 
Thanks
Tom