Is it better to analyze logs with Lucene, or do other solutions perform
better?

On Thu, May 20, 2010 at 9:51 AM, ilkay polat <polattechnol...@gmail.com>wrote:

> Hello;
>
>   I have a design question while developing my project. If you have time,
>   please read my problem, and if you have a solution, please let me know.
>
>   Project: Our system produces a txt file every hour (e.g. at 13:00, at
> 14:00). These files contain logs from the network (e.g. TCP logs). I use
> FreeBSD and cron. Five minutes after each hour (e.g. at 13:05, at 14:05),
> a process indexes this txt file with Lucene. Then I have a web app which
> runs textual searches against the resulting Lucene index.
>
>   Problem: Our customers want to know which client uses the internet the
> most, which site is visited the most, and similar things, which are done
> with SQL like this, as you know:
>
>   select site, count(site) from log_table
>   group by site
>
>   My solution is: a second process inserts the logs into a temporary
> table, then runs some queries on that temporary table and writes the
> results to the main statistics tables, such as a most-visited-sites
> table. The temporary table thus updates the related statistics tables.
>
>
>   I need a recommendation about this problem. Is there a solution in
> Lucene (does a "most ranked" kind of query exist, and if so, does it
> perform well)? If there is no solution in Lucene, what would you use in
> this situation? Thanks for your help.
>
>
> ilkay POLAT
>
> Research & Development Software Engineer, TURKEY
>
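For the group-by statistics described above, a relational batch pass seems
simpler than Lucene: load each hourly file into a temporary table, aggregate,
and fold the counts into a persistent statistics table. A minimal sketch
using SQLite (all table and column names here are illustrative, not from the
original system):

```python
import sqlite3

# In-memory database stands in for the real temporary/statistics tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Temporary table: one row per parsed log line from the hourly txt file.
cur.execute("CREATE TABLE log_temp (client TEXT, site TEXT)")
cur.executemany(
    "INSERT INTO log_temp VALUES (?, ?)",
    [("10.0.0.1", "example.com"),
     ("10.0.0.2", "example.com"),
     ("10.0.0.1", "example.org")],
)

# Persistent statistics table, updated from each hourly batch.
cur.execute("CREATE TABLE site_stats (site TEXT PRIMARY KEY, visits INTEGER)")

# Fold this batch's group-by counts into the running totals
# (UPSERT needs SQLite >= 3.24).
cur.execute(
    """
    INSERT INTO site_stats (site, visits)
    SELECT site, COUNT(site) FROM log_temp GROUP BY site
    ON CONFLICT(site) DO UPDATE SET visits = visits + excluded.visits
    """
)

# Most-visited sites, as the customers asked for.
top = cur.execute(
    "SELECT site, visits FROM site_stats ORDER BY visits DESC"
).fetchall()
print(top)  # [('example.com', 2), ('example.org', 1)]
```

After each run the temporary table can be truncated, while site_stats keeps
the cumulative answer, so the expensive GROUP BY only ever touches one hour
of data.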



-- 
---
ilkay POLAT

Research & Development Software Engineer, TURKEY
