I haven't tried this exact scenario, but based on your requirements you can
define all the fields you need (date, location, etc.) in your schema. You
can also configure faceting in solrconfig.xml if you want the same facets
applied to every request.
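As a rough sketch (the field names and types here are illustrative
assumptions, not taken from your actual index), the schema.xml fields and a
solrconfig.xml request handler with default facets might look like:

```xml
<!-- schema.xml: illustrative field definitions (names/types are assumptions) -->
<field name="tweet_id"   type="string"       indexed="true" stored="true" required="true"/>
<field name="text"       type="text_general" indexed="true" stored="true"/>
<field name="created_at" type="tdate"        indexed="true" stored="true"/>
<field name="location"   type="string"       indexed="true" stored="true"/>
<field name="sentiment"  type="string"       indexed="true" stored="true"/>

<!-- solrconfig.xml: apply the same facets to every /select request -->
<requestHandler name="/select" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="facet">true</str>
    <str name="facet.field">location</str>
    <str name="facet.field">sentiment</str>
    <str name="facet.field">created_at</str>
  </lst>
</requestHandler>
```

With defaults like these, every query against /select returns the same
facet counts without the client having to pass facet parameters each time.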

You should start by allocating 2-4 GB of heap space, then increase the size
based on testing under heavy load.
All the hardware-related parameters are tunable, so you will have to
experiment yourself. If problems arise, look at the Solr logs; if there is
an issue related to memory, you can allocate more memory after inspecting
the GC graphs.
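For example (a sketch only: this assumes Solr 4.x started via Jetty's
start.jar and an Oracle/OpenJDK 7 JVM; adjust paths and flags for your
setup), you could start with a 2-4 GB heap and turn on GC logging so you
have data to inspect when tuning:

```shell
# Illustrative launch command: 2 GB initial / 4 GB max heap,
# CMS collector, and GC logging written to gc.log for later analysis.
java -Xms2g -Xmx4g \
     -XX:+UseConcMarkSweepGC \
     -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps \
     -Xloggc:gc.log \
     -jar start.jar
```

The resulting gc.log can be fed into a GC visualization tool to see pause
times and heap usage, which tells you whether the heap actually needs to
grow.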

I am not an expert, just a newbie with Solr, so maybe some points are not
well explained, but you should try experimenting with it. I guess you have
sufficient time before July ;) .

With Regards
Aman Tandon


On Fri, May 9, 2014 at 11:09 PM, Cool Techi <cooltec...@outlook.com> wrote:

> Hi,
> We have a requirement from one of our customers to provide search and
> analytics on the upcoming Soccer World Cup. Given the sheer volume of
> tweets that would be generated at such an event, I cannot imagine what
> would be required to store this in Solr.
> It would be great if there could be some pointers on the scale or hardware
> required, the number of shards that should be created, etc. Some requirements:
> all the tweets should be searchable (approximately 100 million tweets/day
> * 60 days of event). All fields on tweets should be searchable/facetable,
> with facets on numeric and date fields. Facets would be run on Twitter IDs
> (unique users), tweet created-on date, location, and sentiment (some fields
> which we generate).
>
> If anyone has attempted anything like this, it would be helpful.
> Regards,
> Rohit
>
