> indexing time would be fine (it was previously reduced from 7 to 2 days),
> but I think that traversing more than 100000000 nodes is a pretty tough

Yes, that would take time (i.e. indexing 100M+ nodes). This is an area
where work is in progress (OAK-6246) to achieve much shorter indexing
times.

> related to indexing optimization or any advice on how to design the repo
> (e.g. use different paths to isolate different groups of assets, use
> different nodetypes to differentiate content type, create different
> repositories [is that possible?] for different groups of uses...) is

Hard to give generic advice here. It all depends on the type of query,
the index definitions and the content structure, so I would need such
details to provide any concrete suggestion.
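
Just as an illustration of why those details matter, here is a minimal
sketch (plain JCR API) of a path-restricted query. The node type
dam:Asset and the path /content/dam/projectA are placeholders, not
taken from this thread. If assets are grouped under separate subtrees,
a query scoped like this can be served by an index definition that is
scoped the same way (e.g. via includedPaths on a Lucene index under
/oak:index), which keeps that index smaller and cheaper to (re)index.

    import javax.jcr.RepositoryException;
    import javax.jcr.Session;
    import javax.jcr.query.Query;
    import javax.jcr.query.QueryManager;
    import javax.jcr.query.QueryResult;

    public class PathScopedAssetQuery {
        // Restricting the query to one subtree lets Oak pick an index
        // whose definition covers only that group of assets.
        public static QueryResult findAssets(Session session)
                throws RepositoryException {
            QueryManager qm = session.getWorkspace().getQueryManager();
            Query q = qm.createQuery(
                "SELECT * FROM [dam:Asset] AS a "
                + "WHERE ISDESCENDANTNODE(a, '/content/dam/projectA')",
                Query.JCR_SQL2);
            return q.execute();
        }
    }

Whether such a layout actually helps depends on the points above, so
the actual queries and index definitions would still be needed.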

Chetan Mehrotra
