Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-02 Thread Damitha Kumarage
Srinath,
Srinath Perera wrote:
> Hi Eranda;
>
> If you need more than the average, the right way to go is to draw confidence
> intervals (http://en.wikipedia.org/wiki/Confidence_interval) / error
> bars, which will let you plot both servers on the same graph.
We have already planned to implement this in the next step. I will add a wiki 
page with the TODO list.
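
As an illustration of what those error bars involve (this is only a sketch, not
framework code, and the sample values below are placeholders), the mean and a
95% interval half-width per demand-rate step could be computed like this:

import math

def mean_and_halfwidth(samples, z=1.96):
    # z = 1.96 gives an approximate 95% confidence interval (normal approximation)
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, z * math.sqrt(var / n)   # (mean, error-bar half-width)

# placeholder reply rates (replies/s) from repeated runs at one demand rate
runs = [0.0, 0.0, 0.0, 0.0, 0.0]
print(mean_and_halfwidth(runs))
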
> But in general, only the average is good enough.
>
> Also, what are the X and Y axes?
>   
If you look at the detailed graphs, the X-axis represents the demand rate, 
which is the number of requests sent per second, for example starting from 
100 and increasing in steps of 50 up to 500. The Y-axis represents the replies 
per second. There are graphs for the minimum reply rate, maximum reply rate, 
average reply rate and standard deviation.
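
To make that concrete: the framework's own driver is not posted here, but a
demand-rate sweep of the kind described, built around an httperf-style load
generator, would look roughly like the sketch below (the server, port and URI
are hypothetical):

import re
import subprocess

SERVER, PORT, URI = "localhost", 9763, "/services/SampleDataService/getRows"  # hypothetical

for rate in range(100, 501, 50):                  # demand rate: requests per second
    out = subprocess.run(
        ["httperf", "--server", SERVER, "--port", str(PORT), "--uri", URI,
         "--rate", str(rate), "--num-conns", str(rate * 30), "--timeout", "5"],
        capture_output=True, text=True).stdout
    # httperf reports a line like:
    #   Reply rate [replies/s]: min 97.8 avg 99.0 max 99.9 stddev 0.6 (...)
    m = re.search(r"Reply rate \[replies/s\]: min ([\d.]+) avg ([\d.]+) "
                  r"max ([\d.]+) stddev ([\d.]+)", out)
    if m:
        print(rate, *m.groups())                  # demand rate vs. min/avg/max/stddev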

There are quite a lot of graphs; Eranda has sent only the ones for the reply 
rate. Can someone suggest a good way to present all the graphs so that one 
can grasp the results easily?
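
One option, sketched below with matplotlib: overlay both servers on the same
axes and use one small panel per reply-rate statistic, so a single figure
covers what is now spread over many separate graphs (the data-loading part is
a placeholder to be filled from the framework's result files):

import matplotlib.pyplot as plt

demand = list(range(100, 501, 50))                 # X axis: demand rate (req/s)

def load_stats(server_name):
    # placeholder: read the reply-rate statistics for this server from the
    # framework's output; zeros are used here just so the sketch runs
    return {m: [0.0] * len(demand) for m in ("min", "avg", "max", "stddev")}

stats = {name: load_stats(name) for name in ("DS 2.2.0", "DS 2.5.0")}

fig, axes = plt.subplots(2, 2, sharex=True, figsize=(10, 7))
for ax, metric in zip(axes.flat, ("min", "avg", "max", "stddev")):
    for name, series in stats.items():
        ax.plot(demand, series[metric], marker="o", label=name)
    ax.set_title(f"Reply rate: {metric}")
    ax.set_xlabel("Demand rate (requests/s)")
    ax.set_ylabel("Replies/s")
    ax.legend()
fig.tight_layout()
plt.show()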

Thanks,
Damitha

> Thanks
> Srinath
>
>
>
> On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara  wrote:
>   
>> This test was done for the Data Services team to compare the performance of
>> the Data Services servers 2.2.0 and 2.5.0 using the performance testing
>> framework.
>> What I did here was add an RDBMS data source, create a data query service, and
>> query different numbers of rows from the Data Services server.
>> In the performance testing framework we can increase the number of concurrent
>> requests per second (the demand rate) step by step, so that we can find the
>> exact point where the server begins to saturate. I tested 100, 200, ..., 1000
>> query rows at demand rates of 100, 120, 140, ..., 500 and collected the
>> results.
>> I am attaching the results here. results.png gives a quick overview of all the
>> results, and the zip file contains the separate detailed graphs.
>>
>> Thoughts:
>> Data Services 2.5.0 shows slightly higher performance than (almost the same
>> as) Data Services 2.2.0.
>> We can identify the demand rate at which the server starts to saturate, which
>> is where it shows the highest response rate. As the number of query rows
>> increases, the server saturates at lower demand rates.
>>
>> Ideally this test should be done on separate client and server machines, but I
>> ran both on the same machine. Shall I use the private cloud to do the testing
>> with separate client and server machines?
>> --
>> Thanks
>> Eranda
>>
>> ___
>> Carbon-dev mailing list
>> Carbon-dev@wso2.org
>> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>>
>>
>> 
>
>
>
>   


-- 
__

Damitha Kumarage
Technical Lead; WSO2 Inc.
"Oxygenating the Web Service Platform; " http://www.wso2.com/

blog: " http://damithakumarage.wordpress.com/
__

___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev


Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-02 Thread Damitha Kumarage
Sumedha,
Is it OK to do the comparison in the private cloud, as Eranda mentioned? 
There is no real hardware available at the moment. It would be good if we 
could have dedicated hardware for all future performance-related testing 
of our products.

Thanks,
Damitha
Sumedha Rubasinghe wrote:
> Eranda,
> You need to compare 2.5.0 with 2.0.0. AFAIR those are the two 
> versions we discussed comparing.
>
> /sumedha
>
> On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara wrote:
>
> This test was done for the Data Services team to compare the
> performance of the Data Services servers 2.2.0 and 2.5.0 using the
> performance testing framework.
> What I did here was add an RDBMS data source, create a data query
> service, and query different numbers of rows from the Data Services
> server.
> In the performance testing framework we can increase the number of
> concurrent requests per second (the demand rate) step by step, so
> that we can find the exact point where the server begins to saturate.
> I tested 100, 200, ..., 1000 query rows at demand rates of
> 100, 120, 140, ..., 500 and collected the results.
> I am attaching the results here. results.png gives a quick overview
> of all the results, and the zip file contains the separate graphs
> in detail.
>
> Thoughts:
> Data Services 2.5.0 shows slightly higher performance than (almost
> the same as) Data Services 2.2.0.
> We can identify the demand rate at which the server starts to
> saturate, which is where it shows the highest response rate. As the
> number of query rows increases, the server saturates at lower demand
> rates.
>
> Ideally this test should be done on separate client and server
> machines, but I ran both on the same machine. Shall I use the private
> cloud to do the testing with separate client and server machines?
> -- 
> Thanks
> Eranda
>
> ___
> Carbon-dev mailing list
> Carbon-dev@wso2.org 
> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>
>
> 
>
> ___
> Carbon-dev mailing list
> Carbon-dev@wso2.org
> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>   


-- 
__

Damitha Kumarage
Technical Lead; WSO2 Inc.
"Oxygenating the Web Service Platform; " http://www.wso2.com/

blog: " http://damithakumarage.wordpress.com/
__

___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev


Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-01 Thread Sumedha Rubasinghe
Eranda,
You need to compare 2.5.0 with 2.0.0. AFAIR those are the two versions we
discussed comparing.

/sumedha

On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara wrote:

> This test was done for the Data Services team to compare the performance of
> the Data Services servers 2.2.0 and 2.5.0 using the performance testing
> framework.
> What I did here was add an RDBMS data source, create a data query service,
> and query different numbers of rows from the Data Services server.
> In the performance testing framework we can increase the number of
> concurrent requests per second (the demand rate) step by step, so that we
> can find the exact point where the server begins to saturate. I tested
> 100, 200, ..., 1000 query rows at demand rates of 100, 120, 140, ..., 500
> and collected the results.
> I am attaching the results here. results.png gives a quick overview of all
> the results, and the zip file contains the separate detailed graphs.
>
> Thoughts:
> Data Services 2.5.0 shows slightly higher performance than (almost the same
> as) Data Services 2.2.0.
> We can identify the demand rate at which the server starts to saturate,
> which is where it shows the highest response rate. As the number of query
> rows increases, the server saturates at lower demand rates.
>
> Ideally this test should be done on separate client and server machines,
> but I ran both on the same machine. Shall I use the private cloud to do the
> testing with separate client and server machines?
> --
> Thanks
> Eranda
>
> ___
> Carbon-dev mailing list
> Carbon-dev@wso2.org
> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>
>
___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev


Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-01 Thread Srinath Perera
Hi Eranda;

If you need more than the average, the right way to go is to draw confidence
intervals (http://en.wikipedia.org/wiki/Confidence_interval) / error bars,
which will let you plot both servers on the same graph. But in general,
only the average is good enough.

Also, what are the X and Y axes?

Thanks
Srinath



On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara  wrote:
> This test was done for the Data Services team to compare the performance of
> the Data Services servers 2.2.0 and 2.5.0 using the performance testing
> framework.
> What I did here was add an RDBMS data source, create a data query service,
> and query different numbers of rows from the Data Services server.
> In the performance testing framework we can increase the number of
> concurrent requests per second (the demand rate) step by step, so that we
> can find the exact point where the server begins to saturate. I tested
> 100, 200, ..., 1000 query rows at demand rates of 100, 120, 140, ..., 500
> and collected the results.
> I am attaching the results here. results.png gives a quick overview of all
> the results, and the zip file contains the separate detailed graphs.
>
> Thoughts:
> Data Services 2.5.0 shows slightly higher performance than (almost the same
> as) Data Services 2.2.0.
> We can identify the demand rate at which the server starts to saturate,
> which is where it shows the highest response rate. As the number of query
> rows increases, the server saturates at lower demand rates.
>
> Ideally this test should be done on separate client and server machines,
> but I ran both on the same machine. Shall I use the private cloud to do the
> testing with separate client and server machines?
> --
> Thanks
> Eranda
>
> ___
> Carbon-dev mailing list
> Carbon-dev@wso2.org
> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>
>



-- 

Srinath Perera, Ph.D.
   WSO2 Inc. http://wso2.com
   Blog: http://srinathsview.blogspot.com/

___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev


Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-01 Thread Damitha Kumarage
Anjana Fernando wrote:
> Hi,
>
> Yeah, you will not see any performance difference between the 2.2.0 and
> 2.5.0 versions because the core is basically the same; only additional
> features were added in 2.5.0. The major difference came with 2.2.0. So the
> two versions you should be testing for performance are 2.0.0 and the
> 2.2.0/2.5.0 releases. Also, because of the streaming functionality, the
> main advantage is memory consumption. I'm not sure whether that can be
> measured with the performance testing framework, and to test this
> effectively, service calls that return significantly large responses must
> be made.
>   
We have planned to provide server-side memory, CPU usage, etc. in the next 
step.
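
Roughly, the sampling we have in mind could look like the sketch below (psutil
is just one convenient way to read the numbers; the Carbon server's JVM pid has
to be supplied, and the eventual implementation may differ):

import time
import psutil

def sample_usage(pid, duration_s=60, interval_s=1.0, out_path="server_usage.csv"):
    proc = psutil.Process(pid)                         # the Carbon server JVM
    with open(out_path, "w") as out:
        out.write("elapsed_s,cpu_percent,rss_mb\n")
        start = time.time()
        while time.time() - start < duration_s:
            cpu = proc.cpu_percent(interval=interval_s)    # % CPU over the interval
            rss = proc.memory_info().rss / (1024 * 1024)   # resident memory in MB
            out.write(f"{time.time() - start:.1f},{cpu:.1f},{rss:.1f}\n")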

Thanks,
Damitha
> Cheers,
> Anjana.
>
> On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara  wrote:
>   
>> This test was done for the Data Services team to compare the performance of
>> the Data Services servers 2.2.0 and 2.5.0 using the performance testing
>> framework.
>> What I did here was add an RDBMS data source, create a data query service, and
>> query different numbers of rows from the Data Services server.
>> In the performance testing framework we can increase the number of concurrent
>> requests per second (the demand rate) step by step, so that we can find the
>> exact point where the server begins to saturate. I tested 100, 200, ..., 1000
>> query rows at demand rates of 100, 120, 140, ..., 500 and collected the
>> results.
>> I am attaching the results here. results.png gives a quick overview of all the
>> results, and the zip file contains the separate detailed graphs.
>>
>> Thoughts:
>> Data Services 2.5.0 shows slightly higher performance than (almost the same
>> as) Data Services 2.2.0.
>> We can identify the demand rate at which the server starts to saturate, which
>> is where it shows the highest response rate. As the number of query rows
>> increases, the server saturates at lower demand rates.
>>
>> Ideally this test should be done on separate client and server machines, but I
>> ran both on the same machine. Shall I use the private cloud to do the testing
>> with separate client and server machines?
>> --
>> Thanks
>> Eranda
>>
>> ___
>> Carbon-dev mailing list
>> Carbon-dev@wso2.org
>> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>>
>>
>> 
>
>
>
>   


-- 
__

Damitha Kumarage
Technical Lead; WSO2 Inc.
"Oxygenating the Web Service Platform; " http://www.wso2.com/

blog: " http://damithakumarage.wordpress.com/
__

___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev


Re: [Carbon-dev] Data Services performance test results using performance testing framework

2010-06-01 Thread Anjana Fernando
Hi,

Yeah, you will not see any performance difference between the 2.2.0 and
2.5.0 versions because the core is basically the same; only additional
features were added in 2.5.0. The major difference came with 2.2.0. So the
two versions you should be testing for performance are 2.0.0 and the
2.2.0/2.5.0 releases. Also, because of the streaming functionality, the
main advantage is memory consumption. I'm not sure whether that can be
measured with the performance testing framework, and to test this
effectively, service calls that return significantly large responses must
be made.
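
The client side of such a test could be as simple as the sketch below, with the
server-side heap watched separately (e.g. with jstat); the endpoint URL and the
row-count parameter name are hypothetical:

import time
import requests

ENDPOINT = "http://localhost:9763/services/SampleDataService/getRows"  # hypothetical

for rows in (1000, 10000, 100000, 1000000):
    start = time.time()
    resp = requests.get(ENDPOINT, params={"rowCount": rows}, stream=True)
    size = sum(len(chunk) for chunk in resp.iter_content(chunk_size=64 * 1024))
    print(f"{rows} rows -> {size / 1e6:.1f} MB in {time.time() - start:.1f} s")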

Cheers,
Anjana.

On Tue, Jun 1, 2010 at 5:16 PM, Eranda Sooriyabandara  wrote:
> This test was done for the Data Services team to compare the performance of
> the Data Services servers 2.2.0 and 2.5.0 using the performance testing
> framework.
> What I did here was add an RDBMS data source, create a data query service,
> and query different numbers of rows from the Data Services server.
> In the performance testing framework we can increase the number of
> concurrent requests per second (the demand rate) step by step, so that we
> can find the exact point where the server begins to saturate. I tested
> 100, 200, ..., 1000 query rows at demand rates of 100, 120, 140, ..., 500
> and collected the results.
> I am attaching the results here. results.png gives a quick overview of all
> the results, and the zip file contains the separate detailed graphs.
>
> Thoughts:
> Data Services 2.5.0 shows slightly higher performance than (almost the same
> as) Data Services 2.2.0.
> We can identify the demand rate at which the server starts to saturate,
> which is where it shows the highest response rate. As the number of query
> rows increases, the server saturates at lower demand rates.
>
> Ideally this test should be done on separate client and server machines,
> but I ran both on the same machine. Shall I use the private cloud to do the
> testing with separate client and server machines?
> --
> Thanks
> Eranda
>
> ___
> Carbon-dev mailing list
> Carbon-dev@wso2.org
> https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev
>
>



-- 
Anjana Fernando
Software Engineer
WSO2, Inc.; http://wso2.com
lean.enterprise.middleware

___
Carbon-dev mailing list
Carbon-dev@wso2.org
https://mail.wso2.org/cgi-bin/mailman/listinfo/carbon-dev