Github user chrismattmann commented on the issue:
https://github.com/apache/incubator-joshua/pull/81
+1 from me
---
Github user chrismattmann commented on the issue:
https://github.com/apache/incubator-joshua/pull/81
I will commit this if I don't see any comments in the next 24 hours
---
Github user thammegowda commented on the issue:
https://github.com/apache/incubator-joshua/pull/81
I am trying to run an experiment with a bunch of big language models, but
the tuner is taking forever!
In the code base, I found a few more (possible) bottlenecks:
1.
GitHub user thammegowda opened a pull request:
https://github.com/apache/incubator-joshua/pull/81
Fix: memory overflow in tuning with larger language models
The tuner needs more memory when larger language models are used,
even though we allocate more memory in the training pipeline.
Done.
On 3/5/18, 2:34 PM, "lewis john mcgibbney" wrote:
Hi Folks,
I've been working with a few folks on PODLINGNAMESEARCH-97 [0]. This is
essentially to determine whether the Joshua name can be trademarked and
used moving forward if we wished, as a PPMC/PMC, to do this.
Can folks please share your opinions on that thread? Then we can move
on.
Thanks