[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16146931#comment-16146931
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya],
JSON-based configurable querying is working now. Yes, I can provide a written 
summary of these updates. Regards.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> Screenshot from 2017-07-30 20-30-05.png, SOLR-10317.patch, SOLR-10317.patch, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> Some of the frameworks above support building, starting, indexing/querying, 
> and stopping Solr. However, the benchmarks they run are very limited. Any of 
> these can be a starting point, or a new framework could be used instead. The 
> motivation is to cover every piece of Solr functionality with a corresponding 
> benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.






[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-29 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16146458#comment-16146458
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya],
The configurable approach to indexing is now working. Querying should work the 
same way within a couple of hours. I would like to give you a quick demo soon. 
Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-24 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16141250#comment-16141250
 ] 

Vivek Narang commented on SOLR-10317:
-

Refactoring is being done section by section; so far, two parts - QueryClient 
and Metric - have been modified. I will add the required documentation soon.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-22 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16137488#comment-16137488
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya], please switch to the "SolrNightlyBenchmarks-R2" branch 
for the latest updates. Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-21 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16135892#comment-16135892
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
I was working on a large section of backend refactoring last week. I will send 
an update after committing the code tonight. Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-14 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16125810#comment-16125810
 ] 

Vivek Narang commented on SOLR-10317:
-

10-4 Captain!




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-14 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16125739#comment-16125739
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Let's have a Google Hangouts meeting today (8/14) at 11 pm EST. Thanks.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-12 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16124776#comment-16124776
 ] 

Vivek Narang commented on SOLR-10317:
-

Hey [~ichattopadhyaya], an idea for a dynamic UI mechanism just struck me. I'll 
summarize it soon. Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-12 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16124510#comment-16124510
 ] 

Vivek Narang commented on SOLR-10317:
-

The front-end plan is not clear yet. I will update as soon as I have more 
clarity on the approach. 




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-11 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16124490#comment-16124490
 ] 

Vivek Narang commented on SOLR-10317:
-

These CSV files are needed on the front-end side; see the sketch after this list.

1. The primary consumer of the CSV files is the dygraphs library.
2. The format follows the required graph layout: column 1 is the X-axis, and 
columns 2, 3, etc. are the Y-axis series. The first line of each CSV file holds 
the labels that appear on the graph.
3. dygraphs reads these CSV data files directly.
4. The CSV file paths are passed to the JavaScript functions that call the 
dygraphs API, which then handles the files internally.
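A minimal sketch of writing a data point in that layout (the class name, file 
name, and column labels below are illustrative only, not the suite's actual code):

{code:java}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

// Hypothetical helper (not part of the actual suite): appends one benchmark
// data point to a CSV file in the layout described above.
public class CsvAppender {

    // Writes the label row once, then appends one "x,y1,y2,..." row per call.
    public static void append(Path csv, String[] labels, String xValue, double... yValues)
            throws IOException {
        StringBuilder sb = new StringBuilder();
        if (Files.notExists(csv)) {
            sb.append(String.join(",", labels)).append('\n'); // first line = visible labels
        }
        sb.append(xValue);
        for (double y : yValues) {
            sb.append(',').append(y);
        }
        sb.append('\n');
        Files.write(csv, sb.toString().getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    public static void main(String[] args) throws IOException {
        // Example file and column names; the real suite defines its own.
        append(Paths.get("indexing_throughput.csv"),
               new String[] {"Date", "Standalone docs/sec", "Cloud docs/sec"},
               "2017-08-11", 14250.0, 13100.0);
    }
}
{code}

On the front end, a dygraphs chart can then use such a CSV file as its data 
source, plotting column 1 on the X-axis and the remaining columns as series.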






[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-11 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16124417#comment-16124417
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya],
Initial steps for the backend refactoring (up to the point of creating the CSV 
files): the top layer is a JSON metadata file, which is parsed into a list of 
BenchmarkConfiguration objects. Each BenchmarkConfiguration object holds the 
information for one specific benchmark and is passed down through the inner 
layers, providing configuration at each level. For each benchmark there is a 
map object that holds the results, which is then passed to the code that 
creates the CSV files. This approach will remove a large portion of partially 
redundant code in Tests.java and also remove the enums that were limiting 
configuration at various levels.
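A minimal sketch of that flow, assuming the JSON metadata has already been 
parsed; apart from BenchmarkConfiguration and Tests.java, every name below is 
illustrative rather than the suite's actual code:

{code:java}
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of the described flow; only BenchmarkConfiguration is a
// name from the suite, everything else is a stand-in.
public class BenchmarkFlowSketch {

    // One entry parsed from the top-level JSON metadata file.
    static class BenchmarkConfiguration {
        String name;                                   // e.g. "indexing-standalone"
        String type;                                   // e.g. "indexing" or "query"
        Map<String, String> params = new LinkedHashMap<>();
    }

    // In the real suite this list would come from parsing the JSON metadata file.
    static List<BenchmarkConfiguration> loadConfigurations() {
        List<BenchmarkConfiguration> configs = new ArrayList<>();
        BenchmarkConfiguration c = new BenchmarkConfiguration();
        c.name = "indexing-standalone";
        c.type = "indexing";
        c.params.put("threads", "2");
        configs.add(c);
        return configs;
    }

    // Placeholder for running one benchmark; returns a metric -> value map.
    static Map<String, Double> run(BenchmarkConfiguration config) {
        Map<String, Double> results = new LinkedHashMap<>();
        results.put("docs_per_second", 0.0);           // filled in by the real benchmark code
        return results;
    }

    public static void main(String[] args) {
        for (BenchmarkConfiguration config : loadConfigurations()) {
            Map<String, Double> results = run(config);
            // The results map would be handed to the CSV-writing layer here.
            System.out.println(config.name + " -> " + results);
        }
    }
}
{code}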

Please check sample code on a new branch that tries to showcase a part of this 
approach.
[https://github.com/viveknarang/lucene-solr/commit/0869a7c2aa131de6afdca38e04fa4627f87d6a46]

Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-10 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16121865#comment-16121865
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
I am feeling better now, thanks! I took this time to think about your 
suggestions, and here is my opinion.

Yes, it is a good idea to make this configurable, but there are some challenges. 
Even if the backend (up to the point where the CSVs are generated) is made 
dynamic - which is not as straightforward as it seems - the front end would also 
have to be made dynamic, and that goes against our initial agreement to keep the 
front end static. Limitations on the front end are one of the reasons the 
implementation is the way it is; making it dynamic is necessary for this idea to 
be realistic. The suite is extensible even in its current state, but adding or 
modifying benchmarks requires changing the classes and the front end. That said, 
a JSON-based configurable suite is a good improvement. I want this benchmark 
project to be useful for the community, and I would not recommend waiting 
another year for the suggested changes. However, since I expect these changes to 
be time-consuming and the deadline is just over a week away, I recommend 
creating a separate branch of the current project and implementing your 
suggestions there over the coming weeks.

Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-08 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16118865#comment-16118865
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya],
I have been unwell for the last three days; I will try to resume work as soon 
as possible.





[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-04 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16115261#comment-16115261
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Status:
- Testing of the new QPS calculation logic is complete and bugs have been fixed; 
the code will be pushed within the hour.
Regards.




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-03 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16113867#comment-16113867
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Status for today:
- Testing the new QPS calculation logic. 
Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-02 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16112115#comment-16112115
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Status for today:
Testing and refactoring the QPS calculation logic: the run is now controlled by 
the number of queries issued rather than by a fixed time period (a rough sketch 
follows).
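A minimal sketch of that idea; the QueryClient interface and all names below are 
illustrative stand-ins, not the suite's actual query client:

{code:java}
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.TimeUnit;

// Illustrative sketch: run a fixed number of queries and derive QPS from the
// elapsed time, instead of querying for a fixed time period.
public class QpsSketch {

    interface QueryClient {                            // stand-in for the suite's query client
        void execute(String query) throws Exception;
    }

    static double measureQps(QueryClient client, List<String> queries, int numQueries)
            throws Exception {
        long start = System.nanoTime();
        for (int i = 0; i < numQueries; i++) {
            client.execute(queries.get(i % queries.size()));  // cycle through the query set
        }
        double elapsedSeconds = (System.nanoTime() - start) / (double) TimeUnit.SECONDS.toNanos(1);
        return numQueries / elapsedSeconds;            // queries per second
    }

    public static void main(String[] args) throws Exception {
        QueryClient noop = q -> { /* a real client would send q to Solr here */ };
        double qps = measureQps(noop, Arrays.asList("q=*:*", "q=title:solr"), 10_000);
        System.out.println("Measured QPS: " + qps);
    }
}
{code}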

Regards





[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-08-01 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16110222#comment-16110222
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Status for today:
Conducted tests and added more logging and null checks.

After doing a root-cause analysis of the recently observed issue, I have 
concluded that a recent Solr commit is the cause.
Please see this: 
[https://github.com/apache/lucene-solr/commit/67b3d4e108b82b56ca45b9e78bcac52bd2280c8b].

This is the commit after which the issue started appearing (the errors you noted 
a few comments back). If you look at lines 128 and 149 of 
_default/conf/managed-schema, you can see that *_pi has been removed and *_i 
must now be used instead.

Work in progress:
- Adding a dev-test option where a small data set and a small query set allow 
the benchmark framework to run much faster, speeding up development and 
debugging.
- Adding more checks to the code.

Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-31 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16108330#comment-16108330
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Status for today:
Testing, adding more logging, and doing root-cause analysis of the observed 
errors. The test runs fine against an older commit, e.g. 
5f1c5ce81286ab2e0c0f98c47d332511ea57e8c1, while some errors appear when running 
against a recent commit, e.g. 3a405971b9e06e2004e0d66ae1b82f530de969f2.

Regards







[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106760#comment-16106760
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
Work status as of now is as follows:

- Remove internal queuing capability: Done
- Rename root and webapp directories: Done
- Create a script for processing commits from the last n days: In progress
- Missing exception chaining/handling: Done, pending verification
- Log4j implementation: Done (solrnightlybenchmarks.log file created in the 
/logs folder)
- Null pointer exception handling across all class files: In progress
- Provide a shell script for downloads: Done (download.sh script in the /scripts 
folder)
- Embedded Jetty server support, removing the external HTTP server dependency: Done

I am currently testing the changes made today.

Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106602#comment-16106602
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya],
I have added the issues to the list. Thanks for pointing out these important 
gaps. I am working on them and will provide updates soon.

Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106376#comment-16106376
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]
bq. Perhaps "Benchmarking data files" or "Data files".

Will rename.

bq. That link is useless. How will the user download all the files from that 
link?

Link removed.

bq. Downloading files requires a specialized software, and the benchmarking 
suite should not be concerned with that. Resuming partial downloads, 
multi-threaded downloading, etc. are special features that the downloader 
should take care of. I think the business of downloading files should be left 
for the user to deal with.

Shell script coming up shortly. 

bq.   ...which property in that properties file needs to be adjusted? How to 
enable/disable this ability etc.?

More detailed steps coming up shortly.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:45 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

bq. Couldn't you spare 5 minutes from those 8-16 hours (in last 2 days) to get 
to this?

I wanted to finish a group of important in-progress tasks before updating the 
TODO items.

bq.  Going forward, can you kindly post the daily status here in JIRA and 
include sufficient details on work you have done each day?

Sure! 

bq. Secondly, "please use the link provided below" is equivalent to telling the 
user to "go f*** yourself"

The link has been provided as an option and is not a required step. When the 
user has configured the system to automatically download required files, there 
is no need for the user to manually download files.

bq. This is just unnecessary complexity to the system. Add a shell script to 
download the data files (similar to the one I pasted above) [0].

I don't think there is any complexity here. Checking the existence of these 
files is already an essential step; the files are only downloaded when they are 
missing, and essentially a wget is performed to download them. Please see the 
code below. If you still insist, I will create a shell script for this. 

[https://github.com/viveknarang/lucene-solr/blob/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks/src/main/java/org/apache/solr/tests/nightlybenchmarks/Util.java#L737]
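
For illustration, here is a minimal sketch of the check-then-download behaviour described above. This is not the actual Util.java code; the class, method, example URL, and file names below are placeholders, and in the real suite the source URL comes from the properties file.

{code:java}
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Hypothetical helper: check whether a benchmark data file is present locally
// and, only if it is missing, fetch it from a configured base URL
// (the Java equivalent of the wget step mentioned above).
public class DataFileFetcher {

    public static Path ensureDataFile(String baseUrl, String fileName, String dataDir) throws Exception {
        Path target = Paths.get(dataDir, fileName);
        if (Files.exists(target)) {
            return target; // already present locally, nothing to download
        }
        Files.createDirectories(target.getParent());
        try (InputStream in = new URL(baseUrl + "/" + fileName).openStream()) {
            // Stream the remote file straight into the local data directory.
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
        return target;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder values; the real base URL and file names are read from the properties file.
        Path dataFile = ensureDataFile("http://example.com/solr-benchmark-data", "enwiki-docs.csv", "data");
        System.out.println("Data file available at: " + dataFile.toAbsolutePath());
    }
}
{code}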


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

bq. Couldn't you spare 5 minutes from those 8-16 hours (in last 2 days) to get 
to this?

I wanted to finish a group of important in-progress tasks before updating the 
TODO items.

bq.  Going forward, can you kindly post the daily status here in JIRA and 
include sufficient details on work you have done each day?

Sure! 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # h

[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:35 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

bq. Couldn't you spare 5 minutes from those 8-16 hours (in last 2 days) to get 
to this?

I wanted to finish a group of important in-progress tasks before updating the 
TODO items.

bq.  Going forward, can you kindly post the daily status here in JIRA and 
include sufficient details on work you have done each day?

Sure! 


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

bq. Couldn't you spare 5 minutes from those 8-16 hours (in last 2 days) to get 
to this?

I wanted to finish a group of important in-progress tasks before updating the 
TODO items.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:32 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

bq. Couldn't you spare 5 minutes from those 8-16 hours (in last 2 days) to get 
to this?

I wanted to finish a group of important in-progress tasks before updating the 
TODO items.


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:25 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

bq. Also, as a user, I don't fucking care about what ability the system has.

The part you referred to has been removed from the README file.


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:25 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?

bq. Why are you calling it "test data files"?

Please suggest what you would like these files to be referred to as. 


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?



> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 7:23 AM:
--

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

bq. You need to be specific in terms of steps the user needs to take.

https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks#steps-to-launch

The steps have always been in this file! Do you find any steps missing?




was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106352#comment-16106352
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]

bq. Your README file [0] is unreadable due to horizontal scrollers in each 
section.

Fixed now; please check.

[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks]

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-29 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106295#comment-16106295
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/30/17 3:10 AM:
--

Hi [~ichattopadhyaya],
I sent a detailed status email covering this week and last week a few minutes 
ago. Since the major current and planned goals are already complete (as also 
noted in the email), the remaining TODO items are: code refactoring, working on 
some of the points you noted above, and thorough testing of the platform. I 
will provide additional updates shortly. 


was (Author: vivek.nar...@uga.edu):
[~ichattopadhyaya]
I sent a detailed status email covering this week and last week a few minutes 
ago. Since the major current and planned goals are already complete (as also 
noted in the email), the remaining TODO items are: code refactoring, working on 
some of the points you noted above, and thorough testing of the platform. I 
will provide additional updates shortly. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-29 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106295#comment-16106295
 ] 

Vivek Narang commented on SOLR-10317:
-

[~ichattopadhyaya]
I sent a detailed status email covering this week and last week a few minutes 
ago. Since the major current and planned goals are already complete (as also 
noted in the email), the remaining TODO items are: code refactoring, working on 
some of the points you noted above, and thorough testing of the platform. I 
will provide additional updates shortly. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-29 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16106194#comment-16106194
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]
I have fixed the label bug on the graphs. Please see 
[http://212.47.242.214/MergedViewStandalone.html]

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16104154#comment-16104154
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/28/17 12:07 AM:
---

Hi [~ichattopadhyaya]

I have added the capability to archive performance data (archive parameter) and 
to clear performance data (clear-data parameter). Please check the latest 
README file.
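
For illustration only, here is a minimal sketch of how such parameters could be dispatched at the suite's entry point; the parameter spellings and actions below are placeholders rather than the suite's actual implementation (see the README for the real usage):

{code:java}
// Illustrative only: dispatching archive / clear-data style parameters.
public class BenchmarkArgsSketch {

    public static void main(String[] args) {
        for (String arg : args) {
            switch (arg) {
                case "--archive":
                    System.out.println("Would move the collected performance data into an archive folder.");
                    break;
                case "--clear-data":
                    System.out.println("Would delete the previously collected performance data.");
                    break;
                default:
                    System.out.println("Unknown parameter: " + arg);
            }
        }
    }
}
{code}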

Regards
Vivek


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

I have added the capability to archive performance data (--archive) and to 
clear performance data (--clear-data). Please check the latest README file.

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16104154#comment-16104154
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]

I have added the capability to archive performance data (--archive) and to 
clear performance data (--clear-data). Please check the latest README file.

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16103731#comment-16103731
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/27/17 7:15 PM:
--

Hi [~ichattopadhyaya]

Steps to run the suite for N days have been added to the README file. Yes, 
currently this has to be done manually, but I will try to add a feature so it 
can be done using parameters very soon. 
The menus for standalone and SolrCloud have been rearranged as you described 
above. A "More Details" link on each panel of the unified view pages now takes 
the user to the linked page. The back button is redundant, since there are 
links in the top menu to browse around as required. If you still prefer it, 
please let me know and I will add it to the pages. Please check [http://212.47.242.214]

I will compile and list the TODO items shortly.

Regards
Vivek


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

Steps to run the suite for N days have been added to the README file. Yes, 
currently this has to be done manually, but I will try to add a feature so it 
can be done using parameters very soon. 
The menus for standalone and SolrCloud have been rearranged as you described 
above. A "More Details" link on each panel of the unified view pages now takes 
the user to the linked page. The back button is redundant, since there are 
links in the top menu to browse around as required. If you still prefer it, 
please let me know and I will add it to the pages. Please check [http://212.47.242.214]

I will compile and list the TODO items shortly.

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16103731#comment-16103731
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]

Steps to run the suite for (n) days have been added to the README file. Yes, 
currently this has to be done manually, but I will try to add a feature so it 
can be done using parameters very soon. 
The menus for standalone and SolrCloud have been rearranged as you described 
above. A "More Details" link on each panel of the unified view pages now takes 
the user to the linked page. The back button is redundant, since there are 
links in the top menu to browse around as required. If you still prefer it, 
please let me know and I will add it to the pages. Please check [http://212.47.242.214]

I will compile and list the TODO items shortly.

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-07-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16103731#comment-16103731
 ] 

Vivek Narang edited comment on SOLR-10317 at 7/27/17 7:14 PM:
--

Hi [~ichattopadhyaya]

Steps to run the suite for N days have been added to the README file. Yes, 
currently this has to be done manually, but I will try to add a feature so it 
can be done using parameters very soon. 
The menus for standalone and SolrCloud have been rearranged as you described 
above. A "More Details" link on each panel of the unified view pages now takes 
the user to the linked page. The back button is redundant, since there are 
links in the top menu to browse around as required. If you still prefer it, 
please let me know and I will add it to the pages. Please check [http://212.47.242.214]

I will compile and list the TODO items shortly.

Regards
Vivek


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya]

Steps to run the suite for (n) days have been added to the README file. Yes, 
currently this has to be done manually, but I will try to add a feature so it 
can be done using parameters very soon. 
The menus for standalone and SolrCloud have been rearranged as you described 
above. A "More Details" link on each panel of the unified view pages now takes 
the user to the linked page. The back button is redundant, since there are 
links in the top menu to browse around as required. If you still prefer it, 
please let me know and I will add it to the pages. Please check [http://212.47.242.214]

I will compile and list the TODO items shortly.

Regards
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-22 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16097386#comment-16097386
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]

Multiple threads have already been implemented. Please see 
[http://212.47.242.214/IndexingThroughputBenchmarkStandaloneConcurrent.html]
I think there was some confusion between two different tests: one uses 
HttpSolrClient, while the multi-threaded one uses ConcurrentUpdateSolrClient. 

The ability to locate and download test data files is now in place: the system 
checks whether the required data file(s) are present and downloads any that are 
missing by itself, as required. The source URL is configurable through the 
properties file. 
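
As a reference, here is a minimal sketch of the check-and-download step, 
assuming a properties file with a key named downloadBaseUrl and a local data 
directory; the key, file name and paths are illustrative placeholders, not the 
suite's actual configuration:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.Properties;

public class TestDataFetcher {

    public static void main(String[] args) throws Exception {
        // Read the configurable source URL from the properties file (key name is a placeholder).
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(Paths.get("config.properties"))) {
            props.load(in);
        }
        String baseUrl = props.getProperty("downloadBaseUrl");

        Path dataDir = Paths.get("data");
        Files.createDirectories(dataDir);

        // Download each required file only if it is not already present locally.
        for (String fileName : new String[] {"enwiki-docs.csv"}) {   // placeholder file name
            Path target = dataDir.resolve(fileName);
            if (Files.exists(target)) {
                continue; // file already present, nothing to do
            }
            try (InputStream remote = new URL(baseUrl + "/" + fileName).openStream()) {
                Files.copy(remote, target, StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }
}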

Regards
Vivek 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-12 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16084044#comment-16084044
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya]

Wikipedia data - done.
Unified page view - pending.
Merging menu items - done.
Code formatting - in progress, partially done.
Args parameter renaming - done.
Log4j - done.
Label adjustment - pending. 

Regards
Vivek


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-07-07 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16078068#comment-16078068
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~michael.sun], thanks. I will look into it soon. Regards.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-27 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16066003#comment-16066003
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya] & [~mkhludnev] 
Below are the action points (in order of importance) from the Round 1 
evaluation meeting.

- Switching from the currently used test data to Wikipedia data for index 
benchmarking. 
- Merging menu items (e.g. merging the standalone numeric query tests under the 
Standalone Metrics category).
- Code formatting (removing stray spaces, fixing indentation, adding missing 
Javadocs, etc.).
- Adding an option to put all the graphs on one page (category-wise).
- Renaming the args parameters. 
- Using Log4j where applicable. 
- Adjusting labels on graphs/charts.

I am happy and thankful to have awesome mentors like you!

Regards
Vivek Narang





> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-21 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16057279#comment-16057279
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~michael.sun], it changed after identifying and handling resource 
contention. 

Unfortunately, for indexing on SolrCloud using ConcurrentUpdateSolrClient, 
there are still fluctuations. I am guessing this is because 
ConcurrentUpdateSolrClient is built on HttpSolrClient rather than 
CloudSolrClient, so updates go to a single base URL instead of being routed 
directly to the shard leaders; see: 
https://github.com/apache/lucene-solr/blob/master/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java#L105-L107
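
For reference, a minimal SolrJ sketch of how the two client types are 
constructed (the builder API shown is the SolrJ 6.x/7.x one; the URL, ZooKeeper 
address, collection name, queue size and thread count are placeholders, not the 
benchmark's actual settings):

import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;

public class ClientSetup {

    public static void main(String[] args) throws Exception {
        // Batches updates and sends them to one fixed base URL; from there
        // SolrCloud forwards documents internally, which adds routing hops.
        ConcurrentUpdateSolrClient concurrentClient =
                new ConcurrentUpdateSolrClient.Builder("http://localhost:8983/solr/collection1")
                        .withQueueSize(10000)
                        .withThreadCount(4)
                        .build();

        // Watches cluster state in ZooKeeper and routes each document
        // directly to its shard leader, avoiding the extra forwarding.
        CloudSolrClient cloudClient =
                new CloudSolrClient.Builder()
                        .withZkHost("localhost:9983")
                        .build();
        cloudClient.setDefaultCollection("collection1");

        concurrentClient.close();
        cloudClient.close();
    }
}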

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16052624#comment-16052624
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~michael.sun]. Like you, I was also deeply concerned about the fluctuations 
in the resulting metrics. After digging into the problem for the last two days 
and applying critical fixes, I am happy to report that the fluctuations have 
now been contained. Please access [http://212.47.227.9/prod/]. 
Some important changes still remain, though. Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> SOLR-10317.patch, solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-09 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16045363#comment-16045363
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], an update. I have implemented logic to calculate the 
latency (for numeric queries) and the corresponding latency percentiles (75th, 
95th, 99th and 99.9th). Additionally, I am in the process of adding mechanisms 
for AND- and OR-type queries on numeric fields. These new features are not yet 
visible on the front end, but their logic is already in place and they will 
appear soon; a rough sketch of the percentile computation is included below. 
Regards.  
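
A rough sketch of the percentile computation, assuming each query's latency has 
been collected in milliseconds; the nearest-rank method shown here is just one 
reasonable choice, and the sample values are illustrative:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class LatencyPercentiles {

    // Nearest-rank percentile over a list of observed latencies.
    static double percentile(List<Double> latencies, double p) {
        List<Double> sorted = new ArrayList<>(latencies);
        Collections.sort(sorted);
        int rank = (int) Math.ceil((p / 100.0) * sorted.size());
        return sorted.get(Math.max(rank - 1, 0));
    }

    public static void main(String[] args) {
        // Example latencies in milliseconds (illustrative values only).
        List<Double> latencies = new ArrayList<>();
        Collections.addAll(latencies, 12.0, 15.5, 9.8, 40.2, 11.3, 13.7, 95.0, 14.1);

        for (double p : new double[] {75, 95, 99, 99.9}) {
            System.out.println(p + "th percentile: " + percentile(latencies, p) + " ms");
        }
    }
}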

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-07 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16042085#comment-16042085
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya]
An update: I have added a mechanism by which the set of commits collected over 
a single day is processed for benchmarks at midnight. Please view a sample run 
of this batch mode (processed today) here: 
[http://212.47.227.9/prod/]. The benchmark (batch-mode) cycle is now triggered 
by Jenkins every midnight [EST]. Regards


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-05 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16038003#comment-16038003
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], I think there is some confusion, so let me explain.

- I am not building a new framework; I am extending the benchmarks framework 
that you created [https://github.com/chatman/solr-upgrade-tests], as mentioned 
in the proposal. 
- The reason for extending your framework is that it already has many flexible, 
ready-to-use resources and is written in a single language. I am more 
comfortable working in one language than in two.   
- For the remaining pieces, I am already in the process of reusing the required 
resources from Shalin's work and adding them to the framework that you created.  
- As far as tagging/adding significant events goes: the current logic in 
Shalin's code base for listing significant events is hard-coded 
[https://github.com/shalinmangar/solr-perf-tools/blob/master/src/python/bench.py#L32-L87],
 and the closest I have come to making it dynamic and self-contained is showing 
the relevant commit message with each metric point 
[http://212.47.227.9/prod/NumericQueryBenchmarkStandalone.html] (please hover 
over any point to see the commit message).  
- I will try to add a feature that lets you view all the graphs together. 

Please access the latest codebase at 
[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks].
As agreed in the proposal, the code for the benchmarks suite lives under 
dev-tools, in the SolrNightlyBenchmarks branch. 

Regards


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-03 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16036110#comment-16036110
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], I have restored Jenkins on the new server; please access 
[http://212.47.227.9/prod]. At the moment you may not see any metric data, but 
the numbers should become visible soon. Regards. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-01 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16033906#comment-16033906
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya], I have added a mechanism by which failed test 
sessions are identified and handled. Any running processes (Solr/ZooKeeper) 
left over from the last failed session are located and destroyed, any 
files/folders created during the failed session are removed, and the metric 
data files are only updated with new statistics when a test session completes 
successfully. This mechanism makes the suite more self-sufficient; a rough 
sketch of the cleanup step is included below. 
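
A minimal sketch of the cleanup step, assuming a Linux host, a standard Solr 
distribution layout and a scratch directory used by the failed session; the 
paths and the use of pkill are assumptions for illustration only:

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;

public class FailedSessionCleanup {

    public static void main(String[] args) throws Exception {
        // Stop any Solr nodes left running by the failed session (path is a placeholder).
        new ProcessBuilder("solr-6.6.0/bin/solr", "stop", "-all")
                .inheritIO().start().waitFor();

        // Kill a leftover ZooKeeper process, if any (Linux-specific assumption).
        new ProcessBuilder("pkill", "-f", "zookeeper")
                .inheritIO().start().waitFor();

        // Remove files/folders created during the failed session.
        Path scratch = Paths.get("benchmark-work");   // placeholder scratch directory
        if (Files.exists(scratch)) {
            Files.walk(scratch)
                    .sorted(Comparator.reverseOrder())  // delete children before parents
                    .map(Path::toFile)
                    .forEach(File::delete);
        }
        // Metric data files are intentionally left untouched here; they are
        // only updated after a fully successful test session.
    }
}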

Please access the latest code at 
[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks].
 Regards.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-06-01 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16033224#comment-16033224
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~shalinmangar], thanks for your suggestions above. Your indexing code is 
excellent and I will be using it in the project. I was, however, trying to 
understand the IndexThreads class that you have written. Could you please 
explain how it works when you have some free time? Thanks in advance.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-30 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16029990#comment-16029990
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello,
I have been down with an illness for the last three days; I will resume my 
activity shortly. Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-26 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026549#comment-16026549
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya], inspired by the source code of SolrMeter, I have come 
up with logic for a kind of stress test in which I estimate the QPS for a set 
of numeric queries (as an example for now). 
Please access [http://212.47.227.9/dev/NumericQueryBenchmarkCloud.html]. I am 
observing a strange pattern: the QPS measured for a query matching one specific 
number is the lowest, while the QPS for a query matching all numbers greater 
than a given number is the highest. Is there an explanation for this?

QPS(Field:Number) < QPS(Field:[Number1 TO Number2]) < QPS(Field:[* TO Number]) < QPS(Field:[Number TO *])

This has been observed over a set of commits, not just one. A sketch of how the 
QPS is being estimated is included below. 
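
A minimal sketch of the QPS estimation, assuming an HttpSolrClient pointed at a 
placeholder core; the field name, queries and the single-threaded one-minute 
measurement window are illustrative simplifications, not the actual test setup:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class QpsEstimator {

    public static void main(String[] args) throws Exception {
        try (HttpSolrClient client =
                     new HttpSolrClient.Builder("http://localhost:8983/solr/collection1").build()) {

            String[] queries = {
                    "int1_i:1234",            // specific number
                    "int1_i:[1000 TO 2000]",  // bounded range
                    "int1_i:[* TO 2000]",     // open lower bound
                    "int1_i:[1000 TO *]"      // open upper bound
            };

            long windowMillis = 60_000;  // one-minute measurement window per query type
            for (String q : queries) {
                long end = System.currentTimeMillis() + windowMillis;
                long executed = 0;
                while (System.currentTimeMillis() < end) {
                    client.query(new SolrQuery(q));  // fire queries back-to-back, single-threaded
                    executed++;
                }
                double qps = executed / (windowMillis / 1000.0);
                System.out.println(q + " -> " + qps + " QPS");
            }
        }
    }
}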

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-24 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023417#comment-16023417
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/24/17 6:44 PM:
--

Hi [~shalinmangar], thank you for sharing the link to your prior work. I had 
briefly gone through it before, via the link mentioned in the description 
section above. I think the purpose of this initiative is to build a 
comprehensive benchmarking suite based on the prior work done in this area, 
plus many other features/tests that help the Solr community. I would love to 
hear your thoughts on how we can come up with a better and much more 
comprehensive suite based on that prior work. Regards. 



> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-24 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023430#comment-16023430
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi, I am trying to find a way to measure QPS for query benchmarking. As I 
understand it, QPS measures the query-handling capacity of a Solr node or 
cluster. Can anyone recommend a correct mechanism for estimating QPS? Thanks 
for the help. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-24 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023417#comment-16023417
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~shalinmangar], thank you for sharing the link to your prior work. I had 
briefly gone through it before, via the link mentioned in the description 
section above. I think the purpose of this initiative is to build a 
comprehensive benchmarking suite based on the prior work done in this area, 
plus many other features/tests that help the Solr community. I would love to 
hear your thoughts on how we can come up with a better and much more 
comprehensive suite based on that prior work. Regards. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-23 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16022024#comment-16022024
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], an update: the following graph 
[http://212.47.227.9/dev/IndexingThroughputBenchmarkCloudConcurrent.html] now 
shows the indexing throughput with different thread configurations (2, 4, 6, 8 
and 10 threads) on a Solr cluster of 5 nodes with 2 shards and 2 replicas (as 
an example). A sketch of the thread-count sweep is included below. Regards. 
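
A rough sketch of the thread-count sweep, assuming documents are fed through 
ConcurrentUpdateSolrClient against a placeholder collection; the URL, document 
count and field names are illustrative, and the builder API shown is the SolrJ 
6.x/7.x one:

import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class ThreadSweep {

    public static void main(String[] args) throws Exception {
        int docsPerRun = 100_000;   // placeholder document count

        for (int threads : new int[] {2, 4, 6, 8, 10}) {
            try (ConcurrentUpdateSolrClient client =
                         new ConcurrentUpdateSolrClient.Builder("http://localhost:8983/solr/collection1")
                                 .withQueueSize(20_000)
                                 .withThreadCount(threads)
                                 .build()) {

                long start = System.currentTimeMillis();
                for (int i = 0; i < docsPerRun; i++) {
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", "run-" + threads + "-" + i);
                    doc.addField("int1_i", i);   // placeholder field
                    client.add(doc);
                }
                client.blockUntilFinished();   // wait for the update queue to drain
                client.commit();
                long elapsed = System.currentTimeMillis() - start;

                System.out.println(threads + " threads: "
                        + (docsPerRun * 1000.0 / elapsed) + " docs/sec");
            }
        }
    }
}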

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-23 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16020801#comment-16020801
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], a quick additional update. I thought it would make sense 
to add a "moving average" line for metric estimation. Please access 
[http://212.47.227.9/dev/IndexingThroughputBenchmarkCloud.html]. The "moving 
average" line has a dampening effect on the metric, absorbing small 
fluctuations, so a big shift in the moving average line is a sure red flag. A 
sketch of the calculation is included below. Please let me know what you think. 
Regards. 
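
A minimal sketch of the simple moving average used for the dampened line, 
assuming one throughput value per nightly run; the window size of 5 and the 
sample values are arbitrary choices for illustration:

public class MovingAverage {

    // Simple moving average over the trailing 'window' points; for the first
    // few points the average is taken over whatever history is available.
    static double[] movingAverage(double[] points, int window) {
        double[] result = new double[points.length];
        for (int i = 0; i < points.length; i++) {
            int from = Math.max(0, i - window + 1);
            double sum = 0;
            for (int j = from; j <= i; j++) {
                sum += points[j];
            }
            result[i] = sum / (i - from + 1);
        }
        return result;
    }

    public static void main(String[] args) {
        // Illustrative nightly throughput values (docs/sec).
        double[] nightly = {4200, 4150, 4300, 2900, 4250, 4180, 4220};
        double[] smoothed = movingAverage(nightly, 5);
        for (int i = 0; i < nightly.length; i++) {
            System.out.println("night " + i + ": raw=" + nightly[i] + " avg=" + smoothed[i]);
        }
    }
}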

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-22 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16020600#comment-16020600
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/23/17 3:09 AM:
--

Hi [~ichattopadhyaya] I hope that you are well now. 

I think I was right when I assumed that there was a memory issue when I was 
testing on my previous server (which had an unsuitable hardware configuration 
for this purpose). After migrating to the new server that you provided for this 
project (8 cores, 32 GB RAM), the indexing throughput is almost constant now, 
even across recent consecutive commits. Please see 
[http://212.47.227.9/dev/IndexingBenchmarkStandalone.html]. As you can observe, 
in comparison with the setup on the previous server, the performance on this 
server is consistent (the difference of one second is possibly due to rounding 
off and may be statistically insignificant, depending on how precisely we want 
to measure indexing performance). One exception remains: indexing on a cluster 
of 5 Solr nodes with two shards and two replicas using *CloudSolrClient* still 
shows some variation; please see 
[http://212.47.227.9/dev/IndexingBenchmarkCloud.html]. This might be due to the 
internal indexing mechanism, which could be explained by someone in our 
community who has worked deeply on building/maintaining this client. 

Hi [~michael.sun] Thank you for your appreciation. I am highly motivated and 
looking forward to making this project a success and useful for the community. 
I will explore the option that you have recommended and try to include it as 
well for a more meaningful estimation.

Thanks again, [~ichattopadhyaya], for providing me with this awesome server to 
work on!!! Please let me know if I can use this space for the rest of the 
allotted time (till ~mid August) so that I can stop using my older server.

Regards.  
Vivek


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya] I hope that you are well now. 

I think I was right when I assumed that there was a memory issue when I was 
testing on my previous server (which had an unsuitable hardware configuration 
for this purpose). After migrating to the new server that you provided for this 
project (8 cores, 32 GB RAM), the indexing throughput is almost constant now, 
even across recent consecutive commits. Please see 
[http://212.47.227.9/dev/IndexingBenchmarkStandalone.html]. As you can observe, 
in comparison with the setup on the previous server, the performance on this 
server is consistent (the difference of one second is possibly due to rounding 
off and may be statistically insignificant, depending on how precisely we want 
to measure indexing performance). One exception remains: indexing on a cluster 
of 5 Solr nodes with two shards and two replicas using *CloudSolrClient* still 
shows some variation; please see 
[http://212.47.227.9/dev/IndexingBenchmarkCloud.html]. This might be due to the 
internal indexing mechanism, which could be explained by someone in our 
community who has worked deeply on building/maintaining this client. 

Hi [~michael.sun] Thank you for your appreciation. I am highly motivated and 
looking forward to making this project a success and useful for the community. 
I will explore the option that you have recommended and try to include it as 
well for a more meaningful estimation.

Thanks again, [~ichattopadhyaya], for providing me with this awesome server to 
work on!!!

Regards.  
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://gi

[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-22 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16020600#comment-16020600
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya] I hope that you are well now. 

I think I was right when I assumed that there was a memory issue when I was 
testing on my previous server (which had an unsuitable hardware configuration 
for this purpose). After migrating to the new server that you provided for this 
project (8 cores, 32 GB RAM), the indexing throughput is almost constant now, 
even across recent consecutive commits. Please see 
[http://212.47.227.9/dev/IndexingBenchmarkStandalone.html]. As you can observe, 
in comparison with the setup on the previous server, the performance on this 
server is consistent (the difference of one second is possibly due to rounding 
off and may be statistically insignificant, depending on how precisely we want 
to measure indexing performance). One exception remains: indexing on a cluster 
of 5 Solr nodes with two shards and two replicas using *CloudSolrClient* still 
shows some variation; please see 
[http://212.47.227.9/dev/IndexingBenchmarkCloud.html]. This might be due to the 
internal indexing mechanism, which could be explained by someone in our 
community who has worked deeply on building/maintaining this client. 

Hi [~michael.sun] Thank you for your appreciation. I am highly motivated and 
looking forward to making this project a success and useful for the community. 
I will explore the option that you have recommended and try to include it as 
well for a more meaningful estimation.

Thanks again, [~ichattopadhyaya], for providing me with this awesome server to 
work on!!!

Regards.  
Vivek

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-22 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16020044#comment-16020044
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~ichattopadhyaya], [~mkhludnev] & [~janhoy]. Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-21 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16019044#comment-16019044
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/22/17 5:35 AM:
--

Hi [~ichattopadhyaya], I hope you get well soon. Regarding my doubt (as 
mentioned in the fourth-last comment), I may have found the cause of the 
variation in performance. The details are below. 

Concern: I have been noticing a significant difference in the indexing 
(1 documents) throughput (in documents/sec). Even for the same commit, tests 
executed at separate times showed a significant difference in throughput. 
Please see the following screenshot 
[http://162.243.101.83/Capture_Throughput_drop_reason.PNG]; I have marked the 
drops in throughput on the graph with red circles. You can observe the 
difference in throughput, for example a drop from 322 to 263 docs/sec. This 
difference was reported by the system for the same commit id, which does not 
make sense and should not happen. After running a number of tests, I may have 
been able to pinpoint the cause of this abnormality in performance reporting. 
As guessed before, this might be a *memory issue*. The server that I am using 
has limited resources (2 GB RAM), and I added more than 5 GB of swap space. 
These specific drops (as marked on the graph with red circles) might be 
happening when the test run requires building the Solr server from fresh code 
before running the benchmark test. This additional activity (building Solr from 
source) may use up the RAM completely, forcing the OS to fall back on the swap 
space provided, and this additional swap activity might be causing the 
degradation in performance. 


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya], I hope you get well soon. Regarding my doubt (as 
mentioned in the fourth-last comment), I have found the cause of the variation 
in performance. The details are below. 

Concern: I have been noticing a significant difference in the indexing 
(1 documents) throughput (in documents/sec). Even for the same commit, tests 
executed at separate times showed a significant difference in throughput. 
Please see the following screenshot 
[http://162.243.101.83/Capture_Throughput_drop_reason.PNG]; I have marked the 
drops in throughput on the graph with red circles. You can observe the 
difference in throughput, for example a drop from 322 to 263 docs/sec. This 
difference was reported by the system for the same commit id, which did not 
make sense and should not have happened. After running a number of tests, I was 
able to pinpoint the cause of this abnormality in performance reporting. As 
guessed before, this was a *memory issue*. The server that I am using has 
limited resources (2 GB RAM), and I added more than 5 GB of swap space. These 
specific drops (as marked on the graph with red circles) were happening when 
the test run required building the Solr server from fresh code before running 
the benchmark test. This additional activity (building Solr from source) was 
using up the RAM completely, forcing the OS to fall back on the swap space 
provided, and this additional swap activity was causing the degradation in 
performance. Conclusion: I have figured out the cause of the abnormality in 
metric estimation and assume that with a suitable server (having the required 
hardware resources), this abnormality in reporting will go away. Regards. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (S

[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-21 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16019044#comment-16019044
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], I hope you get well soon. Regarding my doubt (as 
mentioned in the fourth-last comment), I have found the cause of the variation 
in performance. The details are below. 

Concern: I have been noticing a significant difference in the indexing 
(1 documents) throughput (in documents/sec). Even for the same commit, tests 
executed at separate times showed a significant difference in throughput. 
Please see the following screenshot 
[http://162.243.101.83/Capture_Throughput_drop_reason.PNG]; I have marked the 
drops in throughput on the graph with red circles. You can observe the 
difference in throughput, for example a drop from 322 to 263 docs/sec. This 
difference was reported by the system for the same commit id, which did not 
make sense and should not have happened. After running a number of tests, I was 
able to pinpoint the cause of this abnormality in performance reporting. As 
guessed before, this was a *memory issue*. The server that I am using has 
limited resources (2 GB RAM), and I added more than 5 GB of swap space. These 
specific drops (as marked on the graph with red circles) were happening when 
the test run required building the Solr server from fresh code before running 
the benchmark test. This additional activity (building Solr from source) was 
using up the RAM completely, forcing the OS to fall back on the swap space 
provided, and this additional swap activity was causing the degradation in 
performance. Conclusion: I have figured out the cause of the abnormality in 
metric estimation and assume that with a suitable server (having the required 
hardware resources), this abnormality in reporting will go away. Regards. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-21 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16018891#comment-16018891
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya] Please access the current code here: 
[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightlyBenchmarks].
 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-20 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16018655#comment-16018655
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/21/17 4:15 AM:
--

Hi [~ichattopadhyaya] I have one doubt. Please see the following graph 
[http://162.243.101.83/dev/IndexingBenchmarkStandalone.html]. What could be the 
cause of the variation in performance? The environment is the same (same 
configuration) and the commits on this graph are consecutive and recent. Please 
let me know soon! Regards


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya] I have one doubt. Please see the following graph 
[http://162.243.101.83/dev/IndexingBenchmarkStandalone.html]. What could be the 
cause of variation of performance? The environment is the same (same 
configuration) and the commits on this graph are consecutive and recent. Please 
let me know soon! Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-20 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16018655#comment-16018655
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya] I have one doubt. Please see the following graph 
[http://162.243.101.83/dev/IndexingBenchmarkStandalone.html]. What could be the 
cause of the variation in performance? The environment is the same (same 
configuration) and the commits on this graph are consecutive and recent. Please 
let me know soon! Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-19 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16018271#comment-16018271
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], After some refactoring, the links in the previous 
comments may no longer work. Please access 
[http://162.243.101.83/dev/CreateCollectionBenchmarkStandalone.html] or 
[http://162.243.101.83/production/CreateCollectionBenchmarkStandalone.html]. 

One of these links is for my dev environment; the other is for the simulated 
production environment where Jenkins on my server triggers the benchmark suite.

Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-19 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16017963#comment-16017963
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], A quick update: I have now also added indexing 
performance for SolrCloud using both the regular and the concurrent client. You 
can see the current running data by clicking the links provided. The current 
sample configuration for SolrCloud is 5 Solr nodes, 2 shards, and 2 replicas. 
For the concurrent client, the configuration is a queue size of 10 and 10 
threads. Please access: 
[http://162.243.101.83/IndexingBenchmarkCloudConcurrent.html]. 
Also, I thought it would be a great idea to install Jenkins 
[http://162.243.101.83:8080/] on my server and have it invoke the nightly 
benchmark tests. This benchmark package is now invoked every night on my 
server. Please let me know what you think. Regards. 
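
For reference, a minimal SolrJ sketch (6.x-style builders) of how the regular 
cloud client and the concurrent client with a queue size of 10 and 10 threads 
could be constructed. The ZooKeeper address, base URL, and collection name are 
assumptions for the sketch.

{code:java}
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.common.SolrInputDocument;

/** Sketch: the two SolrJ clients used for the SolrCloud indexing benchmarks. */
public class BenchmarkClients {

    public static void main(String[] args) throws Exception {
        // Regular cloud client, routed through ZooKeeper (address and collection assumed).
        CloudSolrClient cloudClient = new CloudSolrClient.Builder()
                .withZkHost("localhost:9983")
                .build();
        cloudClient.setDefaultCollection("benchmark-collection");

        // Concurrent client with the configuration mentioned above: queue size 10, 10 threads.
        SolrClient concurrentClient = new ConcurrentUpdateSolrClient
                .Builder("http://localhost:8983/solr/benchmark-collection")
                .withQueueSize(10)
                .withThreadCount(10)
                .build();

        // The benchmark would push the same documents through each client and time them.
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "doc-1");
        concurrentClient.add(doc);
        concurrentClient.commit();

        concurrentClient.close();
        cloudClient.close();
    }
}
{code}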

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-17 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014937#comment-16014937
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/17/17 11:38 PM:
---

Hi [~ichattopadhyaya], A quick update. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. Now, when you click a 
point on the graph, a popup box appears with three tabs (Heap Space Usage, 
Process CPU load, and Test Environment Details). The estimations are now made 
using Solr's Metrics API. The first two tabs provide graphs of heap space usage 
and CPU usage, and the third tab shows other environment information. Also, 
there is a link at the bottom (on the popup itself) which opens a page with 
details of the file changes associated with each commit. Please let me know 
what you think. Regards.
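
For reference, a minimal sketch of how the heap and CPU data behind these tabs 
could be sampled from Solr's Metrics API (/solr/admin/metrics, available since 
Solr 6.4). The host/port are assumptions, and the metric names mentioned in the 
comments (memory.heap.used, os.processCpuLoad) are examples from the JVM group 
rather than the exact keys the page uses.

{code:java}
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

/** Sketch: one sample of JVM metrics (heap, CPU) from Solr's Metrics API. */
public class MetricsSampler {

    public static void main(String[] args) throws Exception {
        // group=jvm restricts the response to JVM metrics; host/port are assumed.
        URL url = new URL("http://localhost:8983/solr/admin/metrics?wt=json&group=jvm");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        StringBuilder json = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                json.append(line);
            }
        } finally {
            conn.disconnect();
        }

        // A benchmark run would parse values such as memory.heap.used and
        // os.processCpuLoad from this JSON and store one sample per polling interval.
        System.out.println(json);
    }
}
{code}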


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya], A quick update. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. Now, when you click a 
point on the graph, a popup box appears with three tabs (Heap Space Usage, 
Process CPU load, and Test Environment Details). The first two tabs provide 
graphs of heap space usage and CPU usage, and the third tab shows other 
environment information. Also, there is a link at the bottom (on the popup 
itself) which opens a page with details of the file changes associated with 
each commit. Please let me know what you think. Regards.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-17 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014937#comment-16014937
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], A quick update. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. Now, when you click a 
point on the graph, a popup box appears with three tabs (Heap Space Usage, 
Process CPU load, and Test Environment Details). The first two tabs provide 
graphs of heap space usage and CPU usage, and the third tab shows other 
environment information. Also, there is a link at the bottom (on the popup 
itself) which opens a page with details of the file changes associated with 
each commit. Please let me know what you think. Regards.

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-17 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014663#comment-16014663
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ab], Thank you for writing this awesome Metrics API. My life just got 
better with this :)

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-17 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014542#comment-16014542
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~ichattopadhyaya] This API is Awesome 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-17 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014418#comment-16014418
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya] I am working on making these changes. For memory 
monitoring, a thread runs in parallel while the test is in progress and parses 
data from */proc/meminfo* at specific intervals. Please let me know if you 
think there is a better approach. Thanks.
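
For reference, a minimal sketch of such a monitoring thread. The one-second 
polling interval and the fields kept (MemFree, SwapFree) are assumptions for 
the sketch.

{code:java}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

/** Sketch: background thread sampling /proc/meminfo while a benchmark test runs. */
public class MemInfoMonitor implements Runnable {

    private final AtomicBoolean running = new AtomicBoolean(true);
    private final long intervalMillis;   // polling interval (assumed: 1000 ms)

    public MemInfoMonitor(long intervalMillis) {
        this.intervalMillis = intervalMillis;
    }

    @Override
    public void run() {
        while (running.get()) {
            try {
                List<String> lines = Files.readAllLines(Paths.get("/proc/meminfo"));
                for (String line : lines) {
                    // Keep only the fields the memory graphs need.
                    if (line.startsWith("MemFree:") || line.startsWith("SwapFree:")) {
                        System.out.println(System.currentTimeMillis() + " " + line);
                    }
                }
                Thread.sleep(intervalMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            } catch (IOException e) {
                return;   // /proc/meminfo unavailable (non-Linux); stop sampling
            }
        }
    }

    public void stop() {
        running.set(false);
    }

    public static void main(String[] args) throws InterruptedException {
        MemInfoMonitor monitor = new MemInfoMonitor(1000);
        Thread sampler = new Thread(monitor, "meminfo-monitor");
        sampler.start();
        Thread.sleep(5000);   // stand-in for the benchmark test body
        monitor.stop();
        sampler.join();
    }
}
{code}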

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16013277#comment-16013277
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], [~michael.sun]. I have added an option to see the state 
of memory during a specific test. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. Now, when you click 
on a specific point on the graph, another graph appears in a popup window 
showing how the available main memory and swap memory changed during the test 
execution. Please let me know what you think. Thanks. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16012994#comment-16012994
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/16/17 8:02 PM:
--

Hi [~ichattopadhyaya], [~michael.sun]. I have added a "Environment Details" 
option on the top right-hand corner. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. While I am working 
towards adding a memory usage profile during a specific metric estimation, 
please let me know if the information that is currently shown is helpful. 
Thanks!


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya], [~michael.sun]. I have added a "Test Environment 
Details" option on the top right-hand corner. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. While I am working 
towards adding a memory usage profile during a specific metric estimation, 
please let me know if the information that is currently shown is helpful. 
Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16012994#comment-16012994
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], [~michael.sun]. I have added a "Test Environment 
Details" option on the top right-hand corner. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html]. While I am working 
towards adding a memory usage profile during a specific metric estimation, 
please let me know if the information that is currently shown is helpful. 
Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16012606#comment-16012606
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/16/17 4:15 PM:
--

Hi [~ichattopadhyaya], a quick update. I was thinking that it would be nice to 
show the author information (name, email, and date) for each commit, in case the 
viewer needs to reach out to the author with performance-related questions. 
Please access [http://162.243.101.83/IndexingBenchmarkStandalone.html]; now, 
when you hover over a point on the graph, you will also see the author details 
and the commit message. Please let me know what you think. Thanks!


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya] A quick update. I was thinking that I would be nice to 
show the author information (Name, email, and date) for the commit (In case the 
viewer needs to reach out to him/her for some performance related questions). 
Please access:[http://162.243.101.83/IndexingBenchmarkStandalone.html] and now 
when you hover on a point on the graph you will also see the author details and 
his commit message. Please let me know what you think. Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-16 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16012606#comment-16012606
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], a quick update. I was thinking that it would be nice to 
show the author information (name, email, and date) for each commit, in case the 
viewer needs to reach out to the author with performance-related questions. 
Please access [http://162.243.101.83/IndexingBenchmarkStandalone.html]; now, 
when you hover over a point on the graph, you will also see the author details 
and the commit message. Please let me know what you think. Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011811#comment-16011811
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], I have been working on an improved UI. Please access 
[http://162.243.101.83/IndexingBenchmarkStandalone.html] and let me know what 
you think. This is not yet final; I will soon be adding further 
features/options/controls/data (such as an option to view test-environment 
information). Thanks. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011691#comment-16011691
 ] 

Vivek Narang commented on SOLR-10317:
-

Yes [~arafalov] :)

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011667#comment-16011667
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/16/17 3:08 AM:
--

Hello [~ichattopadhyaya], I have also found another dataset that can be used for 
some scenarios/features. Please see 
[https://www.kaggle.com/snap/amazon-fine-food-reviews]. This dataset is large, 
with over half a million records and ten fields, and it offers a good mix of text 
and numeric fields. The indexing time currently observed is 1222 seconds on a 
standalone node. Please access the file (~250 MB) here: 
[http://162.243.101.83/Reviews.csv]. I think this is awesome! Please let me 
know what you think. Thanks. 

--- Data Structure Details ---
Id
ProductId - unique identifier for the product
UserId - unique identifier for the user
ProfileName
HelpfulnessNumerator - number of users who found the review helpful
HelpfulnessDenominator - number of users who indicated whether they found the 
review helpful
Score - rating between 1 and 5
Time - timestamp for the review
Summary - brief summary of the review
Text - text of the review
---
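
As a rough illustration of the indexing measurement mentioned above, the sketch 
below streams Reviews.csv into a standalone Solr core and reports the wall-clock 
indexing time. The host, port, core name ("reviews"), and batch size are 
assumptions, and a schemaless core (or a schema matching the CSV header fields) 
is assumed; this is not the benchmark framework's actual code.

--- Sketch (Python) ---
# Stream the CSV into Solr's JSON update handler in batches, commit once at
# the end, and report elapsed wall-clock time.
import csv
import json
import time
import urllib.request

SOLR_UPDATE_URL = "http://localhost:8983/solr/reviews/update"  # assumed core

def post_json(payload):
    req = urllib.request.Request(
        SOLR_UPDATE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req).read()

def index_reviews(csv_path, batch_size=5000):
    start = time.perf_counter()
    batch = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            batch.append(row)          # field names come from the CSV header
            if len(batch) >= batch_size:
                post_json(batch)       # JSON array of documents
                batch = []
    if batch:
        post_json(batch)
    post_json({"commit": {}})          # single commit at the end
    return time.perf_counter() - start

if __name__ == "__main__":
    print("Indexed Reviews.csv in %.1f seconds" % index_reviews("Reviews.csv"))
---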


was (Author: vivek.nar...@uga.edu):
Hello [~ichattopadhyaya] I also found out another dataset which can be used for 
some scenarios/features. Please see 
[https://www.kaggle.com/snap/amazon-fine-food-reviews]. This data set is huge 
with over half a million records and ten fields. There is a good mix of text 
and numeric fields. The current indexing time as observed is 1222 seconds on a 
standalone node. Please access the file (~250MB) here: 
[http://162.243.101.83/Reviews.csv]. I think this is awesome! Please let me 
know what you think. Thanks. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011667#comment-16011667
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~ichattopadhyaya], I have also found another dataset that can be used for 
some scenarios/features. Please see 
[https://www.kaggle.com/snap/amazon-fine-food-reviews]. This dataset is large, 
with over half a million records and ten fields, and it offers a good mix of text 
and numeric fields. The indexing time currently observed is 1222 seconds on a 
standalone node. Please access the file (~250 MB) here: 
[http://162.243.101.83/Reviews.csv]. I think this is awesome! Please let me 
know what you think. Thanks. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011599#comment-16011599
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~arafalov] :)


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011172#comment-16011172
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/15/17 7:27 PM:
--

Hi [~ichattopadhyaya], a small additional update. Using this set of documents 
(~25K) and this data structure, I am getting the following metrics against recent 
commits: [http://162.243.101.83/IndexingBenchmark.html]. The Y-axis is in seconds 
and the X-axis is time. On hover, you will be able to see the commit ID.  


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya], A small additional update. Using this set of documents 
(~25K) and the data structure I am getting the following metrics against recent 
commits [http://162.243.101.83/IndexingBenchmark.html]. Y-Axis is in seconds 
this time and X-Axis is time. On hover, you will be able to see the commit ID.  

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011172#comment-16011172
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], a small additional update. Using this set of documents 
(~25K) and this data structure, I am getting the following metrics against recent 
commits: [http://162.243.101.83/IndexingBenchmark.html]. The Y-axis is in seconds 
this time and the X-axis is time. On hover, you will be able to see the commit ID.  

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011141#comment-16011141
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/15/17 7:12 PM:
--

Hi [~ichattopadhyaya], I also wanted to share that I was trying to figure out a 
way to get the Solr commit history for indexing purposes. After some searching I 
came up with a script 
[https://gist.github.com/viveknarang/141ab289789b0fe55b09409f99d84c75] and used 
it to create a JSON file [http://162.243.101.83/solrcommit.log] containing around 
25K documents. Please let me know what you think about the data structure. 
Thanks!

--- Sample Record ---

{
  "commit": "9c6279d439a231d9ec8c9564b0ab76f616d10076",
  "author": "Joel Bernstein",
  "date": "Sun May 14 15:54:32 2017 -0400",
  "message": "SOLR-10663-Add-distance-Stream-Evaluator",
 "author email": "jbern...@apache.org",
 "timestamp": "1494791672",
 "committer name": "Joel Bernstein",
 "committer email": "jbern...@apache.org",
 "commit date": "Mon May 15 11:26:05 2017 -0400"
}
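
The gist itself is not reproduced here; as a rough sketch of the idea, records in 
the shape of the sample above could be produced from a local checkout with 
something like the following. The repository path, output file name, and the 
exact field mapping are assumptions.

--- Sketch (Python) ---
# Dump a checkout's commit history as JSON records shaped like the sample
# record above, using git log's pretty-format placeholders.
import json
import subprocess

SEP = "\x1f"  # unit separator; very unlikely to appear in commit metadata
FMT = SEP.join(["%H", "%an", "%ad", "%s", "%ae", "%at", "%cn", "%ce", "%cd"])

def dump_commits(repo_path, out_path):
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:" + FMT],
        capture_output=True, text=True, check=True,
    ).stdout
    records = []
    for line in log.splitlines():
        h, an, ad, s, ae, at, cn, ce, cd = line.split(SEP)
        records.append({
            "commit": h, "author": an, "date": ad, "message": s,
            "author email": ae, "timestamp": at,
            "committer name": cn, "committer email": ce, "commit date": cd,
        })
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)

if __name__ == "__main__":
    dump_commits("lucene-solr", "solrcommit.log")  # assumed paths
---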


was (Author: vivek.nar...@uga.edu):
Hi [~ichattopadhyaya] I also wanted to share that I was trying to figure a way 
out to get the Solr commit history for indexing purposes. After searching I 
came up with a script 
[https://gist.github.com/viveknarang/141ab289789b0fe55b09409f99d84c75] and with 
this, created a JSON file [http://162.243.101.83/solrcommit.log] having around 
25K documents. Please let me know what you think about the data structure. 
Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011141#comment-16011141
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~ichattopadhyaya], I also wanted to share that I was trying to figure out a 
way to get the Solr commit history for indexing purposes. After some searching I 
came up with a script 
[https://gist.github.com/viveknarang/141ab289789b0fe55b09409f99d84c75] and used 
it to create a JSON file [http://162.243.101.83/solrcommit.log] containing around 
25K documents. Please let me know what you think about the data structure. 
Thanks!

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=1603#comment-1603
 ] 

Vivek Narang commented on SOLR-10317:
-

[~ichattopadhyaya] Yes, I agree that the nanosecond scale is too precise. I will 
convert it to milliseconds. Thanks.
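
For reference, the conversion under discussion is simply a division by 10^6. A 
tiny sketch of timing a request with a nanosecond clock and reporting 
milliseconds is below; the URL is a placeholder, not one of the benchmark's 
actual endpoints.

--- Sketch (Python) ---
# Time an HTTP request with a nanosecond-resolution clock and report the
# result in milliseconds (1 ms = 1,000,000 ns).
import time
import urllib.request

def timed_request_ms(url):
    start = time.perf_counter_ns()
    urllib.request.urlopen(url).read()
    return (time.perf_counter_ns() - start) / 1_000_000

if __name__ == "__main__":
    print("%.2f ms" % timed_request_ms("http://localhost:8983/solr/admin/info/system?wt=json"))
---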

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-15 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16011080#comment-16011080
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~ichattopadhyaya]. The X-axis represents the day and time when the metric 
was recorded. The Y-axis in this example is the time in nanoseconds registered 
for this activity against each commit. I will provide the link to the code soon. 
As far as community-bonding activities are concerned, I have been reading the 
Solr documentation, especially the coding standards I must adhere to. I have also 
used the #solr-dev and #solr channels to ask a few questions. I plan to be more 
active in community-bonding activities this week. 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-14 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16009889#comment-16009889
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello [~mkhludnev], [~ichattopadhyaya]. Please access 
[http://162.243.101.83/CreateCollectionBenchmark.html]. I have created a sample 
that measures the time taken to create a collection, tested against a set of 
recent commits. Please hover over the graph for more details. This uses the same 
graph API as [https://home.apache.org/~mikemccand/lucenebench/indexing.html], the 
Lucene nightly benchmarks. Please let me know what you think. I look forward to 
hearing from you soon. Thanks. 
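
For illustration, a minimal sketch of timing such a collection-creation call 
through the Collections API is below; the host/port, collection name, and config 
set name are assumptions, not the benchmark's actual setup.

--- Sketch (Python) ---
# Time a Collections API CREATE call against a running SolrCloud node.
import time
import urllib.parse
import urllib.request

def time_create_collection(name, config="bench_config",
                           base="http://localhost:8983/solr"):
    params = urllib.parse.urlencode({
        "action": "CREATE",
        "name": name,
        "numShards": 1,
        "replicationFactor": 1,
        "collection.configName": config,  # config set assumed to be uploaded
        "wt": "json",
    })
    start = time.perf_counter()
    urllib.request.urlopen("%s/admin/collections?%s" % (base, params)).read()
    return time.perf_counter() - start

if __name__ == "__main__":
    print("CREATE took %.3f s" % time_create_collection("bench_collection_1"))
---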


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-05-14 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16009889#comment-16009889
 ] 

Vivek Narang edited comment on SOLR-10317 at 5/15/17 12:00 AM:
---

Hello [~mkhludnev], [~ichattopadhyaya]. Please access 
[http://162.243.101.83/CreateCollectionBenchmark.html]. I created a sample 
where I am calculating the time taken to create a collection, tested on a 
set of recent commits. Please hover over the graph for more details. This uses 
the same graph API as used at 
[https://home.apache.org/~mikemccand/lucenebench/indexing.html] for the Lucene 
nightly benchmarks. Please let me know what you think. I look forward to 
hearing from you soon. Thanks. 



was (Author: vivek.nar...@uga.edu):
Hello, [~mkhludnev], [~ichattopadhyaya]. Please access 
[http://162.243.101.83/CreateCollectionBenchmark.html]. I tried creating a 
sample where I am trying to calculate the time taken for creating a collection 
tested on a set of recent commits. Please hover on the graph to get more 
details. This uses the same graph API as used at 
[https://home.apache.org/~mikemccand/lucenebench/indexing.html] for Lucene 
nightly benchmarks. Please let me know what you think. I look forward to 
hearing from you soon. Thanks. 


> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-08 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16001326#comment-16001326
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~michael.sun]! 

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-05-02 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15994176#comment-15994176
 ] 

Vivek Narang commented on SOLR-10317:
-

Hi [~michael.sun], thank you for your comment. Yes, I would like to know more 
about your framework. Please let me know your availability for next week and 
your preferred medium of communication (Skype, phone, etc.). I hope that my 
proposal will be officially accepted soon. I look forward to hearing from you. 
Regards

> Solr Nightly Benchmarks
> ---
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
>  Issue Type: Task
>Reporter: Ishan Chattopadhyaya
>  Labels: gsoc2017, mentor
> Attachments: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be 
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying and stopping Solr 
> in some of these frameworks above. However, the benchmarks run are very 
> limited. Any of these can be a starting point, or a new framework can as well 
> be used. The motivation is to be able to cover every functionality of Solr 
> with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org



[jira] [Comment Edited] (SOLR-10317) Solr Nightly Benchmarks

2017-04-03 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15953589#comment-15953589
 ] 

Vivek Narang edited comment on SOLR-10317 at 4/3/17 3:29 PM:
-

Hello, 
I have uploaded my final proposal for this project. I would like to take this 
opportunity to thank all the members of the community, especially 
[~ichattopadhyaya], [~mkhludnev] & [~arafalov] for their prompt feedback and 
review of my proposal. I am attaching a copy of the uploaded final proposal 
here. 

I look forward to working on this project. 

Regards
Vivek Narang


was (Author: vivek.nar...@uga.edu):
Hello, 
I have uploaded my final proposal for this project. I would like to take this 
opportunity to thank all the members of the community especially 
[~ichattopadhyaya], [~mkhludnev] & [~arafalov] for their prompt feedback and 
review of my proposal. I am attaching a copy of the uploaded final proposal 
here. 

Regards




[jira] [Updated] (SOLR-10317) Solr Nightly Benchmarks

2017-04-03 Thread Vivek Narang (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vivek Narang updated SOLR-10317:

Attachment: 
Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf

Final Proposal Attached




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-04-03 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15953589#comment-15953589
 ] 

Vivek Narang commented on SOLR-10317:
-

Hello, 
I have uploaded my final proposal for this project. I would like to take this 
opportunity to thank all the members of the community, especially 
[~ichattopadhyaya], [~mkhludnev] & [~arafalov] for their prompt feedback and 
review of my proposal. I am attaching a copy of the uploaded final proposal 
here. 

Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15951479#comment-15951479
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks, [~ichattopadhyaya], for your prompt feedback on my draft proposal! 
Regards




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15951416#comment-15951416
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~ichattopadhyaya], the draft has been uploaded to the GSoC website. Regards




[jira] [Updated] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vivek Narang updated SOLR-10317:

Attachment: (was: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx)




[jira] [Updated] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vivek Narang updated SOLR-10317:

Attachment: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx

Hello, 
I am attaching the first draft of my proposal. I would like your thoughts and 
suggestions on it, and I look forward to feedback from the community. 

Regards 
Vivek




[jira] [Issue Comment Deleted] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vivek Narang updated SOLR-10317:

Comment: was deleted

(was: Proposal Draft)




[jira] [Updated] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

 [ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vivek Narang updated SOLR-10317:

Attachment: Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx

Proposal Draft




[jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks

2017-03-31 Thread Vivek Narang (JIRA)

[ 
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15951147#comment-15951147
 ] 

Vivek Narang commented on SOLR-10317:
-

Thanks [~ichattopadhyaya], [~steve_rowe], [~thetaphi] & [~arafalov] for all 
your input. 




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

-
To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
For additional commands, e-mail: dev-h...@lucene.apache.org


