Now I like the reference, but there is one bit in it...


2009/1/16 David Chappell <[email protected]>:
>
>
> ________________________________
> From: Rob Eamon [mailto:[email protected]]
> Sent: Wednesday, January 14, 2009 3:25 PM
> To: [email protected]
> Subject: [service-orientated-architecture] Re: IBM's Carter on Selling SOA
> to the CEO
>
>  ...
>
>> If we were to look at the average complexity of the before and
>> after, we see that it goes down substantially. There are many
>> metrics that can be used to prove that (cyclomatic complexity, for
>> one), but the KPI we see improving is the time it takes to make a
>> change or add functionality to the ecosystem.
>
> Does anyone have anything other than anecdotal evidence of complexity
> decreasing "substantially?"
>
>
>
> [DC] - In my blog posting from Earth Day of last year,
> http://blogs.oracle.com/davidchappell/2008/04/roi_by_the_ton_going_green_wit.html
>
> I highlighted a case study about Verizon Wireless going green by rewriting
> their fraud detection application using SOA, EDA, and Web 2.0 and, as a
> result, eliminating 6 tons of hardware from their datacenter.
>
> The old application, which was based on J2EE, replicated the entire data
> warehouse of call detail records for use by the fraud detection application.
> It also had a lot of procedural custom code that was handwritten by 5 FTEs
> over 2 years, some code that was ported from Forte to J2EE, and hundreds of
> JSPs feeding data into (circa 1995) HTML 3.0 pages.
>
> The new implementation is 0.5% of its original size. They replaced hundreds
> of JSPs and EJBs with 1 JSP and 1 SWF file for the UI, plus BPEL processes
> that access call detail records from the backend systems directly
> (via service-level interfaces). They also went from 5 FTEs over the course
> of 2 years down to 1 consultant who wrote several BPEL processes over the
> course of 1 year.
>
> The old architecture had to replicate the call detail records and operate
> against the replicated data in the event of suspicious activity. The new
> architecture uses BPEL processes (which are themselves exposed as services)
> to call the backend data sources directly via service-level interfaces to
> get access to the call detail records.
>
> The best line of code is one that you never have to write. The complexity
> decreased substantially in terms of the amount of handwritten code to be
> maintained, which was replaced by BPEL processes that can be declaratively
> modified using visual modeling tools. Aside from the approach to language
> and tooling, the sheer volume of code, data, and hardware was also
> dramatically reduced, which, no matter how you look at it, dramatically
> reduces the complexity.

This bit. BPEL is code; it's visual COBOL. Now, BPEL is good for
certain things: async handling, branching, and coordination generally.
But it's also rubbish at other things: rules, algorithms, GUIs.
Having a visual tool is great; anyone who wants to edit BPEL directly
should (IMO) be locked straight up. The key in this reference (and I
have a few similar ones myself) is that it hits the BPEL sweet spot
and thus has the impact.
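To make that sweet spot concrete, here is a minimal sketch of what a
coordination-style BPEL process looks like. Element names follow WS-BPEL 2.0,
but the partner links, operations, and variable names are hypothetical, and
variable/partnerLink declarations are omitted for brevity:

```xml
<!-- Hypothetical sketch: receive a request, fan out two backend
     service calls in parallel, then reply with the result. -->
<process name="FraudCheck"
         targetNamespace="http://example.com/fraud"
         xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">
  <sequence>
    <receive partnerLink="client" operation="check"
             variable="request" createInstance="yes"/>
    <!-- <flow> runs its children concurrently: the async/branching
         coordination BPEL is actually good at -->
    <flow>
      <invoke partnerLink="cdrService" operation="getCallRecords"
              inputVariable="request" outputVariable="records"/>
      <invoke partnerLink="scoringService" operation="scoreActivity"
              inputVariable="request" outputVariable="score"/>
    </flow>
    <reply partnerLink="client" operation="check" variable="score"/>
  </sequence>
</process>
```

The orchestration (ordering, parallelism, service calls) is declarative; the
scoring rules themselves would live behind the invoked service, not in the
process, which is the point being made above.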

BPEL processes are still handwritten; it's a hand that uses the tool,
and from experience there are some nutters out there who can abuse
BPEL in a way that makes 1980s COBOL apps look like a joy to maintain
(hint of the day: if you need a plotter to print out your process
diagram, it's probably too complex).


Right tool, right job, right outcome.

Steve




>
> Dave
>
