I enabled the lucene index cache for the most frequently used properties and it
made a huge difference!
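Enabling the cache looks roughly like this (a sketch assuming the 2010-era LuceneIndexService API from the wiki page below; the store path, property key and cache size are illustrative, not taken from my actual setup):

```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.index.lucene.LuceneIndexService;
import org.neo4j.kernel.EmbeddedGraphDatabase;

public class CachedLookup {
    public static void main(String[] args) {
        GraphDatabaseService graphDb = new EmbeddedGraphDatabase("var/graphdb");
        LuceneIndexService index = new LuceneIndexService(graphDb);

        // Cache lookups on the "word" property; the entry count is
        // illustrative and should be tuned to the data set and heap.
        index.enableCache("word", 100000);

        // Repeated lookups on the cached key should now hit the cache.
        Node existing = index.getSingleNode("word", "example");

        index.shutdown();
        graphDb.shutdown();
    }
}
```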

Regarding the VM heap size, I started with 128MB and have already tried
increasing it to 512MB, but that didn't help. Now I'm trying to estimate the
actual memory requirement. I guess that using the lucene index cache further
increases the memory requirement.
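Since uncommitted write operations are held on the heap until the transaction commits, the transaction size directly affects memory use. A minimal sketch of committing in batches, assuming the 2010-era transaction API (the batch size and property names are illustrative):

```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;

public class BatchedWrites {
    // Illustrative batch size; tune it against the available heap.
    private static final int BATCH_SIZE = 5000;

    public static void insertWords(GraphDatabaseService graphDb, String[] words) {
        Transaction tx = graphDb.beginTx();
        try {
            int count = 0;
            for (String word : words) {
                Node node = graphDb.createNode();
                node.setProperty("word", word);
                // Commit every BATCH_SIZE operations so the uncommitted
                // state doesn't grow without bound on the heap.
                if (++count % BATCH_SIZE == 0) {
                    tx.success();
                    tx.finish();
                    tx = graphDb.beginTx();
                }
            }
            tx.success();
        } finally {
            tx.finish();
        }
    }
}
```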

Miklós

On 2010-05-17 at 11:15, Mattias Persson wrote:
> 2010/5/17 Kiss Miklós<kissmik...@freemail.hu>
>
>> Hello,
>>
>> Thanks for the answer.
>> No, I haven't looked at the lucene index cache yet but will soon; thanks for
>> the tip.
>> The strings I store are mostly about 10 characters long, but I also have
>> some around 30 chars. I wouldn't consider these 'big strings'.
>>
> That's a very common (and rather optimal for Neo4j) string size.
>
>> Is it a good choice to put all write operations into one transaction?
>> This means in my case a few thousand nodes (2-5000) and about 4-5 times
>> more relations. Or would it give better performance if I sliced the
>> operations into smaller transactions? What would be the optimal
>> transaction size?
>>
> That should yield good write performance, yes. How much heap have you
> given the JVM? Write operations in a transaction are kept in memory until
> committed, so if you don't have a big heap it can cause out-of-memory
> problems like the ones you encountered...
>
>> Thanks,
>> Miklos
>>
>> On 2010-05-17 at 9:35, Mattias Persson wrote:
>>> Hi, sorry for a late response.
>>>
>>> Yep, this seems like an excellent fit for Neo4j. Regarding lucene index
>>> lookup performance: have you looked at enabling caching
>>> http://wiki.neo4j.org/content/Indexing_with_IndexService#Caching ? It
>>> can speed up lookups considerably.
>>>
>>> Do you store very big strings or just a few words? Neo4j currently isn't
>>> optimal for storing big strings. For that, an integration with another
>>> database could be a solution.
>>>
>>> 2010/5/10 Kiss Miklós<kissmik...@freemail.hu>
>>>
>>>> Hello,
>>>>
>>>> I'd like to ask if using Neo4j would be a good solution for the
>>>> following scenario.
>>>> I have an application which performs some natural language text
>>>> analysis. I want to put the results of this analysis into a database. I
>>>> have words, stems, collocations, themes and many relations between them.
>>>> This is why neo4j seems to be a good solution.
>>>> However, I ran into performance problems: I need to use the lucene index
>>>> service heavily (I have to check whether a node I'm about to store already
>>>> exists), which I think is a bit slow. The other problem is Java heap
>>>> space: some documents cause my app to halt with an out-of-memory exception
>>>> (for which I couldn't yet find the reason).
>>>>
>>>> My questions are:
>>>> 1, Is my data storage scenario a good one (nodes = words, relations =
>>>> relations) or could there be a better one?
>>>> 2, How should I perform the lookup of nodes in the database?
>>>> 3, Or should I use some other database?
>>>>
>>>> Thanks in advance,
>>>> Miklós
>>>> _______________________________________________
>>>> Neo mailing list
>>>> User@lists.neo4j.org
>>>> https://lists.neo4j.org/mailman/listinfo/user
>>>>
>>>>
>
