That's strange... I'm using erlang-dev from the Debian 6 repository. I see
that there's a Debian repository run by erlang-solutions.com, but it's in a
different format than the main Debian repository. If I try to install
"erlang" from the main repository or "esl-erlang" from the erlang-solutions
repository, apt wants to install a whole host of packages I really don't
want installed, such as "x11-common" and "java-common". "erlang-dev" seems
to have installed just the bare necessities to compile and install the
couchdb project.
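(For what it's worth, pulls of packages like x11-common and java-common often come in via Recommends rather than hard Depends. If that's the case here, apt can be told to skip them; this is a sketch and assumes the unwanted packages really are only Recommends:)

```shell
# Install erlang (or esl-erlang) without its recommended packages,
# which is usually where x11-common and java-common come from.
apt-get install --no-install-recommends erlang
```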
On Sun, Aug 19, 2012 at 8:13 AM, Robert Newson <rnew...@apache.org> wrote:
>
> A full view response should always be valid JSON, so that does point
> towards a bug in CouchDB, assuming the response output you posted is
> verbatim what CouchDB returned. Can you reproduce this reliably?
>
> Oh, and unrelated to the bug itself, but R14A is a beta release; you
> should upgrade.
>
> B.
>
> On 19 Aug 2012, at 11:05, CGS wrote:
>
>> I don't suppose the problem is coming from CouchDB, but from your
>> external environment, which has a limited number of characters per
>> line and truncates the message beyond that. It happened to me on a
>> few occasions (a Linux terminal truncated my message because it was a
>> single line - the line end was encoded as "\n" inside the message, so
>> no real line break was actually registered).
>>
>> CGS
>>
>>
>> On Sun, Aug 19, 2012 at 1:46 AM, Tim Tisdall <tisd...@gmail.com> wrote:
>>
>>> I have a script where I query a view for multiple entries of data.
>>> I'm doing it in batches of 1000. It works fine multiple times and
>>> then suddenly it returns a result that doesn't properly parse as
>>> JSON because it's missing some content at the end (not sure how
>>> much, but it's at least missing the final bracket to make it
>>> complete).
>>>
>>> My logs don't point out any problem...
>>>
>>> [Sat, 18 Aug 2012 22:14:08 GMT] [debug] [<0.26768.3>] 'POST'
>>> /app_stats/_design/processing/_view/blanks {1,0} from "127.0.0.1"
>>> Headers: [{'Content-Length',"14010"},
>>>           {'Content-Type',"application/json"},
>>>           {'Host',"localhost"}]
>>> [Sat, 18 Aug 2012 22:14:08 GMT] [debug] [<0.26768.3>] OAuth Params: []
>>> [Sat, 18 Aug 2012 22:14:08 GMT] [debug] [<0.26768.3>] request_group
>>> {Pid, Seq} {<0.20450.3>,95240673}
>>> [Sat, 18 Aug 2012 22:14:08 GMT] [info] [<0.26768.3>] 127.0.0.1 - -
>>> POST /app_stats/_design/processing/_view/blanks 200
>>>
>>> Here's what I received from couchdb:
>>>
>>> HTTP/1.0 200 OK
>>> Server: CouchDB/1.2.0 (Erlang OTP/R14A)
>>> ETag: "B25M1ITCCF4RKMFE87QMQ1N3M"
>>> Date: Sat, 18 Aug 2012 22:22:10 GMT
>>> Content-Type: text/plain; charset=utf-8
>>> Cache-Control: must-revalidate
>>>
>>> {"total_rows":14829491,"offset":12357523,"rows":[
>>> {"id":"34049664743","key":"34049664743","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34049674790","key":"34049674790","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34049683784","key":"34049683784","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34049710675","key":"34049710675","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> [ ** SNIP ** ]
>>> {"id":"34082476762","key":"34082476762","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082494494","key":"34082494494","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082507402","key":"34082507402","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082533553","key":"34082533553","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082612840","key":"34082612840","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082621527","key":"34082621527","value":[{"start_date":"2012-08-05","end_date":null}]},
>>> {"id":"34082680993","key":"34082680993","value":[{"start_date":"2012-08-05","end_date":null}]}
>>>
>>> It seems to consistently truncate at a point where the next character
>>> should either be another comma or a closing square bracket (to close
>>> the "rows" array).
>>>
>>> I tried changing the script to do batches of 100 and it seems to be
>>> running without problems. Shouldn't there be some sort of error,
>>> though?
>>>
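As a stopgap until the cause is found, the batching script can at least fail loudly instead of silently consuming a cut-off response. A strict JSON parse already rejects the truncated body shown above; the sketch below (hypothetical helper names, and a shortened sample body standing in for the real one) shows the check:

```python
import json

# Shortened stand-in for the truncated view response from the thread:
# the "rows" array and outer object are never closed.
truncated_body = (
    '{"total_rows":14829491,"offset":12357523,"rows":['
    '{"id":"34082680993","key":"34082680993",'
    '"value":[{"start_date":"2012-08-05","end_date":null}]}'
)

def parse_view_response(body):
    """Parse a CouchDB view response, failing loudly if it is incomplete.

    json.loads() raises ValueError on any malformed document, which
    includes a body truncated before the closing brackets.
    """
    try:
        return json.loads(body)
    except ValueError as err:
        raise RuntimeError("truncated or malformed view response: %s" % err)

# A complete (empty) response parses fine; the truncated one raises.
complete = parse_view_response('{"total_rows":0,"offset":0,"rows":[]}')
```

This doesn't fix the truncation, but it turns a silent bad batch into an immediate error, which also makes the bug much easier to reproduce and report.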