"There is something hard coded in there and I will find it eventually
and find why it was put there and by whom."

This attitude might discourage people from helping you with your efforts.

B.


On 18 December 2013 22:33, david martin <david.mar...@lymegreen.co.uk> wrote:
> On 18/12/13 18:05, Robert Newson wrote:
>>
>> I've confirmed that the native view server honors that timeout. Can
>> you tell me what
>>
>> curl localhost:5984/_config/couchdb/os_process_timeout
>
>
> restart CouchDB on 1.2 (latest in Ubuntu), then
>
> curl david:************@localhost:5984/_config/couchdb/os_process_timeout
> "50000000000000"
> rerunning gives
> Error: timeout
>
> {gen_server,call,
>             [<0.200.0>,
>              {prompt,[<<"map_doc">>,
>                       {[{<<"_id">>,<<"61c3f496b9e4c8dc29b95270d9000370">>},
>                         {<<"_rev">>,<<"9-e48194151642345e0e3a4a5edfee56e4">>},
>                         {<<"test">>,
>                          {[{<<"hey">>,
>                             {[{<<"_id">>,<<"61c3f496b9e4c8dc29b95270d9000370">>},........}
>
> Test JSON here (~16K lines):
> https://friendpaste.com/6LkCbdENAe1gOZlD9DWCod
>
> The code is as posted to the couchdb/erlang list in "Using the Erlang view
> server to Educate in CouchDB".
>
> I have looked into this for some time, hoping the next release would fix it.
> There is something hard coded in there and I will find it eventually and
> find why it was put there and by whom.
>
>
>
>>
>> returns? You might need to bounce CouchDB in any case, as it applies
>> this timeout setting when it creates the process, and we keep a pool
>> of them around, so changes to the timeout after that won't be picked up
>> until they're rebuilt. Restarting CouchDB is the quickest way to
>> ensure that.
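
A minimal sketch of the steps described above (credentials, the new value, and
the service name are illustrative; writes to _config need admin rights):

  # check the current timeout, returned as a JSON string in milliseconds
  curl http://admin:password@localhost:5984/_config/couchdb/os_process_timeout

  # raise it, e.g. to ten minutes; _config PUTs take a JSON string body
  curl -X PUT http://admin:password@localhost:5984/_config/couchdb/os_process_timeout -d '"600000"'

  # bounce CouchDB so the pooled view-server processes are recreated with the
  # new timeout
  sudo service couchdb restart
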
>>
>> B.
>>
>>
>> On 18 December 2013 16:20, david martin <david.mar...@lymegreen.co.uk>
>> wrote:
>>>
>>> "Futon on Apache CouchDB 1.2", according to Futon.
>>> {"couchdb":"Welcome","version":"1.2.0"}, according to GET /.
>>> CouchDB 1.4.0, according to the Ubuntu package name.
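
If it helps pin down which server is actually answering, a quick check (the
package name is a guess for this Ubuntu install):

  # the running server reports its own version on the root URL
  curl http://localhost:5984/

  # the installed package version, which may differ if an older instance is
  # still running alongside it
  dpkg -l couchdb
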
>>>
>>> I set os_process_timeout to 50000000000000 (effectively infinite).
>>>
>>> I ALWAYS get the VERY unhelpful message below, which merely prints the
>>> document contents.
>>>
>>> Error: timeout        % yes I know this but cannot do anything about it
>>>
>>> {gen_server,call,      % it's in a gen_server, yes I know this!
>>>     [<0.14190.8>,      % this is its PID, yes I know this!
>>>      {prompt,[<<"map_doc">>,      % it is a MAP function, yes I know this!
>>>        {[{<<"_id">>,<<"61c3f496b9e4c8dc29b95270d9000370">>},      % it is the document I am processing, yes I know this!
>>>          {<<"_rev">>,<<"9-e48194151642345e0e3a4a5edfee56e4">>},
>>>          .....
>>>
>>> Yes, it is a large and complex document (~16K lines to make this happen on a
>>> fast machine; far fewer are needed on a Raspberry Pi).
>>> Yes, it uses an Erlang view function.
>>> Yes, I DO want it to hog resources until it is finished.
>>> Yes, I am the administrator.
>>> No, I AM NOT INTERFERING WITH ANYTHING ELSE.
>>> No, I cannot dictate how big or small the document is.
>>> Yes, this is important to me.
>>> I have not pursued this before now, as I was using rcouch and could not
>>> find the source of the timeout message.
>>> I did not want to have to rebuild to fix this.
>>> I did not want to bother the CouchDB team as I was using a fork of CouchDB.
>>> Similar issues have been raised, with no answers forthcoming: mentions of
>>> "hidden tweaks", "this is not good for you", "have you got big documents", etc.
>>>
>>> How do I get this NOT to time out?
>>>
>>> On rcouch I would change a value and rebuild a release to fix this (if I
>>> could identify the source).
>>> If anybody can give me a clue, I will test their hypothesis and report back
>>> to the list.
>>>
>>> --
>>> David Martin
>>>
>>
>
>
> --
> David Martin
>
