When this topic came up a few months ago, my final statement was that after
years of indecision I finally arrived at a pattern I like: I prefer to use
procs at the barrier between the managed code of the service layer and the
database. My procs only do "database stuff" (I hope you know what I mean):
select, sort, filter, join, aggregate, etc, the things that databases are
specifically designed to do brilliantly. I use EF6 as the ORM, but it's
mostly relegated to the job of a POCO mapper.
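
A minimal sketch of what I mean (made-up schema, but the EF6 SqlQuery API
is real): the proc does the set-based work, and EF6 just maps the rows
onto a plain POCO:

    using System.Collections.Generic;
    using System.Data.Entity;       // EF6
    using System.Data.SqlClient;
    using System.Linq;

    public class OrderSummary       // plain POCO, nothing EF-specific on it
    {
        public int OrderId { get; set; }
        public decimal Total { get; set; }
    }

    public class ShopContext : DbContext
    {
        // The proc does the select/join/aggregate; EF6 only maps rows to POCOs.
        public List<OrderSummary> GetOrderSummaries(int customerId)
        {
            return Database.SqlQuery<OrderSummary>(
                    "EXEC dbo.GetOrderSummaries @CustomerId",
                    new SqlParameter("@CustomerId", customerId))
                .ToList();
        }
    }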

I have some other serious protest points against putting business logic in
the DB:

* Your program is scattered over different environments with different
  IDEs and tooling.
* It's harder to track the source-of-truth code (people have suggested
  that VS database projects solve this).
* Version control difficulties.
* Release coordination difficulties.
* Inability to search all the code or use code refactoring tools.
* T-SQL is a scripting language and is often abused by people who make
  sloppy changes and bust things horribly (unless you strictly treat it
  with the same respect as compiled code).

*GK*

On 26 April 2017 at 17:29, Tony Wright <tonyw...@gmail.com> wrote:

> I think you're the first person who has brought up the desire to change
> databases in the last 10 years! Virtually no one ever changes relational
> database management systems these days once the decision has been made.
>
> Having the business logic in the database also tends to get around issues
> with archaic environmental processes that require a massive effort to
> redeploy an application, but only a day's notice to update a stored proc
> via a script. I'm not saying that's right, but each environment has its
> own challenges.
>
> Those sorts of deadlocking issues don't go away just because you move to
> the middle tier; in fact, they can get worse. Either people don't bother
> with transactions when they are needed, or, when a transaction is used,
> even fewer people care about update order.
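>
> As a rough illustration (hypothetical tables, not anyone's real code),
> the discipline that avoids most of those deadlocks is simply touching
> tables in the same order in every transaction:
>
>     using System.Data.SqlClient;
>     using System.Transactions;
>
>     public static class Transfers
>     {
>         // Deadlock discipline: every transaction updates tables in the same
>         // order (Accounts first, then AuditLog), so writers never cross.
>         public static void Debit(string connStr, int accountId, decimal amount)
>         {
>             using (var scope = new TransactionScope())
>             using (var conn = new SqlConnection(connStr))
>             {
>                 conn.Open();
>
>                 var debit = new SqlCommand(
>                     "UPDATE dbo.Accounts SET Balance = Balance - @amt WHERE Id = @id",
>                     conn);
>                 debit.Parameters.AddWithValue("@amt", amount);
>                 debit.Parameters.AddWithValue("@id", accountId);
>                 debit.ExecuteNonQuery();
>
>                 var log = new SqlCommand(
>                     "INSERT dbo.AuditLog (AccountId, Delta) VALUES (@id, @amt)",
>                     conn);
>                 log.Parameters.AddWithValue("@id", accountId);
>                 log.Parameters.AddWithValue("@amt", -amount);
>                 log.ExecuteNonQuery();
>
>                 scope.Complete(); // commit; an exception before this rolls back
>             }
>         }
>     }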
>
> We are using Angular via Web API. Every call is a method on a controller,
> which calls a service, which calls a repo (fake or real), which is where
> the data source is supposed to be switched. Mapping happens in the repo
> when retrieving data from the database. But whether you are mapping via
> AutoMapper or doing your own mapping, there is still a significant
> overhead in performing the mapping. I've also used Dapper, which is
> pretty good and on par with writing your own ADO layer, and I've seen
> PetaPoco in there too.
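>
> For what it's worth, here's roughly what the Dapper flavour of a repo
> looks like (names made up; Query<T> is the real Dapper API). The mapping
> happens by column-to-property name, with no AutoMapper pass:
>
>     using System.Collections.Generic;
>     using System.Data.SqlClient;
>     using System.Linq;
>     using Dapper;
>
>     public class CustomerDto
>     {
>         public int Id { get; set; }
>         public string Name { get; set; }
>         public string Email { get; set; }
>     }
>
>     public class CustomerRepo
>     {
>         private readonly string _connStr;
>         public CustomerRepo(string connStr) { _connStr = connStr; }
>
>         public IList<CustomerDto> GetActive()
>         {
>             using (var conn = new SqlConnection(_connStr))
>             {
>                 // Dapper materialises rows straight into CustomerDto by name.
>                 return conn.Query<CustomerDto>(
>                     "SELECT Id, Name, Email FROM dbo.Customer WHERE IsActive = 1")
>                     .ToList();
>             }
>         }
>     }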
>
> I've had exceptionally fast calls that go through the controller, which
> executes the service, then the repo, then the stored proc, and returns
> the stored proc's results all the way back to the controller to be handed
> to Angular without mapping. The controller interface operates as a
> "natural" boundary (just like the interface between the C# engine and the
> database is another "natural" boundary). Doing the same thing using a few
> LINQ statements, performing updates, then returning mapped results takes
> seconds longer, hence my concern.
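>
> The pass-through version is barely any code at all. Something like this
> (a sketch, with a hypothetical proc name), where the proc's result set
> goes straight back to Angular as JSON with no DTO and no mapping step:
>
>     using System.Collections.Generic;
>     using System.Configuration;
>     using System.Data;
>     using System.Data.SqlClient;
>     using System.Linq;
>     using System.Web.Http;
>     using Dapper;
>
>     public class ReportController : ApiController
>     {
>         [HttpGet]
>         public IEnumerable<dynamic> Summary(int customerId)
>         {
>             var connStr = ConfigurationManager
>                 .ConnectionStrings["ShopDb"].ConnectionString;
>             using (var conn = new SqlConnection(connStr))
>             {
>                 // Rows come back as dynamic and serialise straight to JSON.
>                 return conn.Query("dbo.GetCustomerSummary",
>                     new { CustomerId = customerId },
>                     commandType: CommandType.StoredProcedure).ToList();
>             }
>         }
>     }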
>
>
>
> On Wed, Apr 26, 2017 at 3:59 PM, Nathan Schultz <milish...@gmail.com>
> wrote:
>
>> If performance is essential, then I agree that stored procedures are
>> going to win, but in my experience it is not without major trade-offs.
>>
>> I once worked on a large, business-critical project where all the
>> business logic was in stored procedures, for the reasons you have
>> outlined above.
>>
>> Some of the problems we faced were:
>> - It was a while ago, and the database was Ingres. Ingres was expensive,
>> but the difficulty (and therefore cost) of migrating to SQL Server meant
>> it was not economical for the business to move. This trapped the business
>> into paying a fortune in licensing.
>> - While the application was indeed performant, it quickly reached the
>> limits of scalability under heavy use (as it became more and more costly
>> to scale the data tier). The company ended up paying a huge sum at the
>> time for a beast of a server to host the database so it could cope with
>> demand during the heaviest periods. Meanwhile, scaling the middle tier
>> would not have been a problem, and it would have taken a fair bit of
>> burden off the data tier.
>> - Business logic in stored procedures meant large atomic transactions,
>> and under heavy load this meant that deadlocks were common (although I'd
>> admit that this is more of a problem with how the stored procedures were
>> implemented, rather than their actual use).
>> - Back in those days, web services were just becoming popular, and with
>> our business logic in the data tier we found it a real headache to ensure
>> data integrity when integrating web services and other data sources.
>>
>> So I still see real wisdom in the industry's practice of putting business
>> logic in the middle tier.
>>
>> Especially in today's world, where web services and alternative sources
>> of data are the norm, and where the quantity of data being processed has
>> forced consideration away from relational databases toward more scalable
>> cloud alternatives such as NoSQL databases and Event Sourcing.
>>
>> Still, relational databases are the best fit in most cases, but I tend to
>> use lightweight, performant ORMs such as PetaPoco, so database calls are
>> still written in SQL (and can be properly optimised by a competent DBA)
>> but I don't have to worry about the mapping, and performance stays quite
>> good. And you can always fall back to hand-written ADO.NET and a
>> hand-written mapper for the absolute best performance (although I've not
>> found I've needed to).
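>>
>> A PetaPoco sketch of what I mean (made-up schema; Fetch<T> is the real
>> API), with the SQL hand-written and the mapping done for me:
>>
>>     using System;
>>     using System.Collections.Generic;
>>     using PetaPoco;
>>
>>     public class Invoice
>>     {
>>         public int Id { get; set; }
>>         public int CustomerId { get; set; }
>>         public decimal Total { get; set; }
>>     }
>>
>>     public static class InvoiceQueries
>>     {
>>         public static List<Invoice> Overdue()
>>         {
>>             // "ShopDb" is a connection string name from app.config.
>>             var db = new Database("ShopDb");
>>             // Hand-written SQL a DBA can tune; PetaPoco only maps the rows.
>>             return db.Fetch<Invoice>(
>>                 "SELECT Id, CustomerId, Total FROM Invoice WHERE DueDate < @0",
>>                 DateTime.Today);
>>         }
>>     }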
>>
>> For unit testing, I either 'inject' the data layer (so it can be faked,
>> or an in-memory database used), or I use a file-based database (so it can
>> easily be reset).
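>>
>> In sketch form (hypothetical names; NUnit is real), the injection
>> approach looks like this: the service depends on an interface, and the
>> test hands it a fake instead of the database-backed implementation:
>>
>>     using System.Collections.Generic;
>>     using NUnit.Framework;
>>
>>     public interface ICustomerRepo
>>     {
>>         IList<string> ActiveNames();
>>     }
>>
>>     public class WelcomeService
>>     {
>>         private readonly ICustomerRepo _repo;
>>         public WelcomeService(ICustomerRepo repo) { _repo = repo; }
>>         public string Greeting()
>>         {
>>             return "Hello " + _repo.ActiveNames().Count + " customers";
>>         }
>>     }
>>
>>     // Stands in for the real (database-backed) repo during tests.
>>     internal class FakeRepo : ICustomerRepo
>>     {
>>         public IList<string> ActiveNames()
>>         {
>>             return new List<string> { "Ann", "Bob" };
>>         }
>>     }
>>
>>     [TestFixture]
>>     public class WelcomeServiceTests
>>     {
>>         [Test]
>>         public void Greeting_counts_active_customers()
>>         {
>>             var svc = new WelcomeService(new FakeRepo());
>>             Assert.AreEqual("Hello 2 customers", svc.Greeting());
>>         }
>>     }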
>>
>> On 26 April 2017 at 13:42, Greg Keogh <gfke...@gmail.com> wrote:
>>
>>> I'm one of those ummm ... purists who reckons you shouldn't put anything
>>> (like business logic) in procs that needs unit testing -- *GK*
>>>
>>> On 26 April 2017 at 15:36, Greg Low (罗格雷格博士) <g...@greglow.com> wrote:
>>>
>>>> Yep, that’s how we do it. Some people use transactions to do a similar
>>>> thing but you can’t test transactional code by doing that.
>>>>
>>>>
>>>>
>>>> Regards,
>>>>
>>>>
>>>>
>>>> Greg
>>>>
>>>>
>>>>
>>>> Dr Greg Low
>>>>
>>>>
>>>>
>>>> 1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676
>>>> 4913 fax
>>>>
>>>> SQL Down Under | Web: www.sqldownunder.com | http://greglow.me
>>>>
>>>>
>>>>
>>>> *From:* ozdotnet-boun...@ozdotnet.com
>>>> [mailto:ozdotnet-bounces@ozdotnet.com] *On Behalf Of* Tony Wright
>>>> *Sent:* Wednesday, 26 April 2017 3:08 PM
>>>> *To:* ozDotNet <ozdotnet@ozdotnet.com>
>>>> *Subject:* Re: Unit testing question and stored procedures
>>>>
>>>>
>>>>
>>>> So let me understand this. I believe what you are doing is having a
>>>> database snapshot (or testing database) that you can continuously revert
>>>> to its initial state; you run the stored proc via NUnit, then in the init
>>>> for the next test you revert to the initial state, run that test, and so
>>>> on. I would have thought it would take a lot of extra processing time to
>>>> run tests that way, especially if a restore is needed?
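>>>>
>>>> Something like this is what I'm picturing (a sketch with made-up names;
>>>> the RESTORE ... FROM DATABASE_SNAPSHOT syntax is real SQL Server), with
>>>> the snapshot created once up front:
>>>>
>>>>     using System.Data.SqlClient;
>>>>     using NUnit.Framework;
>>>>
>>>>     [TestFixture]
>>>>     public class StoredProcTests
>>>>     {
>>>>         private const string Master =
>>>>             "Server=.;Database=master;Integrated Security=true";
>>>>
>>>>         [SetUp]
>>>>         public void RevertToSnapshot()
>>>>         {
>>>>             // Put the test database back to its pristine snapshot before
>>>>             // every test ('ShopDb_Clean' was created once from ShopDb).
>>>>             using (var conn = new SqlConnection(Master))
>>>>             {
>>>>                 conn.Open();
>>>>                 new SqlCommand(
>>>>                     "ALTER DATABASE ShopDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE; " +
>>>>                     "RESTORE DATABASE ShopDb FROM DATABASE_SNAPSHOT = 'ShopDb_Clean'; " +
>>>>                     "ALTER DATABASE ShopDb SET MULTI_USER;",
>>>>                     conn).ExecuteNonQuery();
>>>>             }
>>>>         }
>>>>
>>>>         [Test]
>>>>         public void UpdatePrices_applies_the_rise()
>>>>         {
>>>>             // EXEC the proc under test here, then assert on the changed rows.
>>>>         }
>>>>     }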
>>>>
>>>>
>>>>
>>>> I've used in-memory databases (via the database-first philosophy of EF
>>>> entity creation), but they don't handle stored procs.
>>>>
>>>>
>>>>
>>>> TSQLUnit looks...interesting. Must investigate.
>>>>
>>>>
>>>>
>>>> On Wed, Apr 26, 2017 at 12:48 PM, Greg Low (罗格雷格博士) <g...@greglow.com>
>>>> wrote:
>>>>
>>>> I should have added that the DAC framework stuff had testing support,
>>>> but it has now been removed.
>>>>
>>>>
>>>>
>>>> Some use TSQLUnit, but I've not found it any more useful, and NUnit
>>>> fits well with other testing.
>>>>
>>>>
>>>>
>>>> Regards,
>>>>
>>>>
>>>>
>>>> Greg
>>>>
>>>>
>>>>
>>>> Dr Greg Low
>>>>
>>>>
>>>>
>>>> 1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676
>>>> 4913 fax
>>>>
>>>> SQL Down Under | Web: www.sqldownunder.com | http://greglow.me
>>>>
>>>>
>>>>
>>>> *From:* ozdotnet-boun...@ozdotnet.com
>>>> [mailto:ozdotnet-bounces@ozdotnet.com] *On Behalf Of* Tony Wright
>>>> *Sent:* Wednesday, 26 April 2017 11:53 AM
>>>> *To:* ozDotNet <ozdotnet@ozdotnet.com>
>>>> *Subject:* Unit testing question and stored procedures
>>>>
>>>>
>>>>
>>>> Hi all,
>>>>
>>>>
>>>>
>>>> A while ago, we were discussing avoiding LINQ for querying SQL Server.
>>>> The preferred methods discussed were either direct SQL calls or stored
>>>> procs to perform data manipulation.
>>>>
>>>>
>>>>
>>>> This was because the overhead of starting up Entity Framework is
>>>> significant and the underlying queries produced by LINQ can be quite
>>>> convoluted and inefficient. Lazy loading is also something to be
>>>> avoided; at the very least you should be using eager loading, which
>>>> forces you to be explicit about what related data is being
>>>> included/loaded. (As an aside, I've also seen a massive performance drop
>>>> when using mappers to convert database objects in EF to POCO objects
>>>> with tools such as AutoMapper.)
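>>>>
>>>> For the record, eager loading in EF6 just means naming the related data
>>>> up front (made-up entities here, but Include() is the real API), so it
>>>> comes back in one query instead of a lazy round-trip per row:
>>>>
>>>>     using System;
>>>>     using System.Collections.Generic;
>>>>     using System.Data.Entity;   // EF6; brings in the lambda Include() extension
>>>>     using System.Linq;
>>>>
>>>>     public class Customer
>>>>     {
>>>>         public int Id { get; set; }
>>>>         public string Name { get; set; }
>>>>     }
>>>>
>>>>     public class Order
>>>>     {
>>>>         public int Id { get; set; }
>>>>         public DateTime PlacedOn { get; set; }
>>>>         public virtual Customer Customer { get; set; }  // virtual => lazy by default
>>>>     }
>>>>
>>>>     public class StoreContext : DbContext
>>>>     {
>>>>         public DbSet<Order> Orders { get; set; }
>>>>     }
>>>>
>>>>     public static class OrderQueries
>>>>     {
>>>>         public static List<Order> Recent(StoreContext ctx, DateTime cutoff)
>>>>         {
>>>>             // Include() states the join explicitly: customers arrive with
>>>>             // their orders in one query, not one query per order touched.
>>>>             return ctx.Orders
>>>>                       .Include(o => o.Customer)
>>>>                       .Where(o => o.PlacedOn >= cutoff)
>>>>                       .ToList();
>>>>         }
>>>>     }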
>>>>
>>>>
>>>>
>>>> Add to this that putting some business logic in stored procs is about
>>>> the most efficient way to perform data manipulation in a SQL Server
>>>> database. It is unbelievably fast and efficient compared to passing all
>>>> the data over the wire to your middle tier to perform any updates and
>>>> then passing it back to commit the data to the database.
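>>>>
>>>> To make that concrete (a sketch, with a hypothetical proc): the whole
>>>> price rise is one set-based statement executed where the data lives,
>>>> rather than every row travelling to C# and back:
>>>>
>>>>     using System.Data;
>>>>     using System.Data.SqlClient;
>>>>
>>>>     public static class PriceRise
>>>>     {
>>>>         // The proc runs something like
>>>>         //   UPDATE dbo.Product SET Price = Price * (1 + @Percent / 100);
>>>>         // No rows cross the wire; one parameter goes in, a count comes back.
>>>>         public static int Apply(string connStr, decimal percent)
>>>>         {
>>>>             using (var conn = new SqlConnection(connStr))
>>>>             {
>>>>                 conn.Open();
>>>>                 var cmd = new SqlCommand("dbo.ApplyPriceRise", conn)
>>>>                 {
>>>>                     CommandType = CommandType.StoredProcedure
>>>>                 };
>>>>                 cmd.Parameters.AddWithValue("@Percent", percent);
>>>>                 return cmd.ExecuteNonQuery();  // rows affected
>>>>             }
>>>>         }
>>>>     }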
>>>>
>>>>
>>>>
>>>> In fact, I would argue that the very fact that current "best practice"
>>>> is to inefficiently pass all your data to the middle tier to be
>>>> modified, only to return it to the database for the update, is a failure
>>>> of modern development; but of course, there is not really an alternative
>>>> if your intent is to perform proper unit testing. It is a very sad thing
>>>> that modern enterprise development has not worked out how to utilise the
>>>> full power of SQL Server other than to say "only use stored procs in
>>>> special cases."
>>>>
>>>>
>>>>
>>>> So the question I have is, if it was decided to put business logic in
>>>> stored procedures (and some of you have, I know, even though a few of you
>>>> with the purist hat would deny it!), how do people currently unit test
>>>> their stored procs?
>>>>
>>>>
>>>>
>>>> Kind regards,
>>>>
>>>> Tony
>>>>
>>>>
>>>>
>>>
>>>
>>
>
