Hi Zacharias,

Have you tried increasing the size of your crawled index and segments? For
example, the "Clearing the Clouds" paper reports using a 2GB index and 23GB
of segments.

Best

Hailong


On Fri, May 31, 2013 at 10:24 PM, zhadji01 <[email protected]> wrote:

> Hi,
>
> I have a web-search benchmark setup with four machines: 1 client, 1
> front-end, 1 search server, and 1 segment server for fetching the summaries.
>
> All machines are two-socket Xeon E5620 @ 2.4 GHz with 32GB RAM, and they
> are connected with 1Gb Ethernet. My crawled data is a 400MB index and 4GB
> of segments.
>
> My problem is that the servers' CPU utilization is very low. The maximum
> throughput I managed to get using the Faban client or ApacheBench was
> ~400-450 queries/sec, with user CPU utilizations of: front-end ~5%, search
> server ~10%, segment server ~35-39%.
>
> I'm sure that the network is not the bottleneck, because I'm not even close
> to filling the bandwidth.
>
> Can you give any suggestions on how to better utilize the servers, or any
> thoughts on what the problem might be?
>
> Thanks,
> Zacharias Hadjilambrou
>

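As a side note on the measurement itself: the ~400-450 queries/sec figure can be cross-checked by parsing ApacheBench's report directly. A minimal sketch below; the here-doc mimics two lines of ab's output format, since in practice the input would come from something like `ab -n 10000 -c 64 http://frontend:8080/` (host, port, and request counts here are placeholders, not values from the thread):

```shell
# Sample of the report lines that 'ab' prints after a run.
cat > ab_sample.txt <<'EOF'
Requests per second:    447.31 [#/sec] (mean)
Time per request:       143.08 [ms] (mean)
EOF

# Split each line on "colon plus spaces" and coerce the second field
# ("447.31 [#/sec] (mean)") to a number, which drops the trailing text.
qps=$(awk -F': *' '/Requests per second/ {print $2+0}' ab_sample.txt)
echo "mean throughput: ${qps} qps"
```

Comparing this mean-throughput number between Faban and ab runs is a quick way to confirm the two load generators agree before digging into server-side utilization.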