I was not clear enough. ServiceA, ServiceB, ServiceC, etc. provide, as part of their service interfaces, 'give me all data I want' and 'keep me updated on all changes' capabilities via POX/HTTP. Reporting ServiceR gets the data and stores it (an RDBMS is no problem here, because the store is internal ServiceR logic). Within ServiceR you use whatever tools/technologies you want for data analysis. The results are provided to users via ServiceR's interface. If you have only one ServiceA, then this approach is overkill and you do the data analysis right over ServiceA's database. Does this make sense? -Radovan
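Just to make the pull model concrete, here is a rough sketch of the two capabilities and how ServiceR uses them. All names are made up, and ServiceA is an in-memory stub standing in for what would really be a POX/HTTP endpoint:

```python
# Hypothetical sketch of the pull-based sync described above: a source
# service exposes "give me all data I want" (full snapshot) and "keep me
# updated on all changes" (deltas since a version). ServiceR seeds its
# local store with the snapshot, then catches up via the change feed.

class ServiceA:
    """Stub standing in for a source service's POX/HTTP interface."""
    def __init__(self):
        self._rows = {1: "alpha", 2: "beta"}
        self._version = 0
        self._changes = []  # (version, key, value) tuples

    def get_all(self):
        # 'give me all data I want' -> full snapshot plus current version
        return dict(self._rows), self._version

    def get_changes_since(self, version):
        # 'keep me updated on all changes' -> deltas after `version`
        return [(v, k, val) for (v, k, val) in self._changes if v > version]

    def update(self, key, value):
        self._version += 1
        self._rows[key] = value
        self._changes.append((self._version, key, value))


class ServiceR:
    """Reporting service: pulls data into its own store (an RDBMS in practice)."""
    def __init__(self, source):
        self.source = source
        self.store, self.version = source.get_all()  # initial full load

    def sync(self):
        # apply only the changes made since the last sync
        for version, key, value in self.source.get_changes_since(self.version):
            self.store[key] = value
            self.version = version


a = ServiceA()
r = ServiceR(a)
a.update(2, "gamma")   # a change happens at the source after the initial load
r.sync()               # ServiceR catches up via the change feed
print(r.store[2])      # -> gamma
```

The point of the sketch is only the shape of the interface: whatever ServiceR does with the data afterwards (analysis, reporting) is internal to it, exactly as described above.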

On 4/20/06, stilkov <[EMAIL PROTECTED]> wrote:
--- In [email protected], "Radovan Janecek" <[EMAIL PROTECTED]> wrote:
>
> I'm coping with the same question actually. At this point, we have reporting
> server (some analogy to your BI) getting data via service interfaces (http
> get) to its own data store. Although I have some concerns about scalability
> of this solution in case of really large data sets I believe these are
> rather theoretical issues.
>
> Radovan
>

Hey Radovan,

But then your use case is a different one: You have a registry/repository product that
offers reporting capabilities, and I can see why for the amount of data one can reasonably
expect there, the REST-based interface that will return XML representations is good
enough (in fact, a REST/Atom combination is probably the coolest thing one can imagine
there).
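To illustrate why a REST/Atom combination fits the "keep me updated" case so well: each change becomes a feed entry, and a client only has to process entries newer than the last timestamp it saw. The feed below is a hand-written example (the ids and titles are invented), not output from any real registry product:

```python
# Sketch: consuming an Atom change feed. ISO-8601 timestamps in UTC
# compare correctly as plain strings, so "newer than last seen" is a
# simple string comparison.
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

feed_xml = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>ServiceA changes</title>
  <updated>2006-04-20T12:00:00Z</updated>
  <entry>
    <id>urn:servicea:change:41</id>
    <updated>2006-04-20T11:00:00Z</updated>
    <title>record 7 updated</title>
  </entry>
  <entry>
    <id>urn:servicea:change:42</id>
    <updated>2006-04-20T12:00:00Z</updated>
    <title>record 9 created</title>
  </entry>
</feed>"""

def entries_since(xml_text, last_seen):
    """Return (id, updated) for entries strictly newer than last_seen."""
    root = ET.fromstring(xml_text)
    out = []
    for entry in root.findall(ATOM + "entry"):
        updated = entry.findtext(ATOM + "updated")
        if updated > last_seen:  # lexicographic == chronological for this format
            out.append((entry.findtext(ATOM + "id"), updated))
    return out

print(entries_since(feed_xml, "2006-04-20T11:00:00Z"))
# -> [('urn:servicea:change:42', '2006-04-20T12:00:00Z')]
```

A reporting client would GET the feed periodically, remember the newest `<updated>` value it processed, and fetch the changed representations by entry id.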

But my use case is refactoring an existing application landscape into a SOA, where
currently HUGE amounts of data are being ETL'd, cleansed, transformed and
analyzed between applications and the enterprise DWH; there's a very knowledgeable team
handling this. Telling them to throw away their tooling and replace it with an XQuery- or
XPath-supported, HTTP-based mechanism is not an option.


--
Stefan Tilkov, http://www.innoq.com/blog/st/








--
Radovan Janecek
http://radovanjanecek.net/blog
