Hi,
I commented on this in a recent thread, and a couple of minutes later I
had a working example (compliments to the modular design of CouchDB
and the easy setup with an ini file).
Following the pmap example in Joe Armstrong's book, I have
update_db(JsonObject, Db) ->
    NewDoc = couch_doc:from_json_obj(JsonObject),
    DocId = couch_util:new_uuid(),
    {ok, _NewRev} = couch_db:update_doc(Db, NewDoc#doc{id=DocId}, []).
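For reference, this sits in a module along these lines (the module name
is just something I made up; the #doc record comes from CouchDB's
couch_db.hrl, and the exact include path will depend on how you build
against CouchDB):

-module(my_parallel_update).            %% name is made up
-export([update_db/2, pmap/3]).

%% needed for the #doc{} record used in update_db/2; adjust the
%% include path to wherever couch_db.hrl sits in your checkout
-include("couch_db.hrl").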
pmap(F, L, Db) ->
    S = self(),
    %% make_ref() returns a unique reference
    %% we'll match on this later
    Ref = erlang:make_ref(),
    Pids = lists:map(fun(I) ->
                         spawn(fun() -> do_f(S, Ref, F, I, Db) end)
                     end, L),
    %% gather the results
    gather(Pids, Ref).

do_f(Parent, Ref, F, I, Db) ->
    Parent ! {self(), Ref, (catch F(I, Db))}.

gather([Pid|T], Ref) ->
    receive
        {Pid, Ref, Ret} -> [Ret|gather(T, Ref)]
    end;
gather([], _) ->
    [].
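One thing worth noting: because do_f/5 wraps the call in a catch, a
write that fails does not crash pmap; whatever catch returns (for
instance an {'EXIT', ...} tuple from the badmatch in update_db/2) just
ends up in the gathered list next to the {ok, Rev} results. A small
helper I sketched (name is made up) to split the two:

%% sketch: separate successful writes from anything do_f/5 caught
partition_results(Results) ->
    lists:partition(fun({ok, _Rev}) -> true;
                       (_) -> false
                    end, Results).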
and then I invoke this with
{FormResp} = my_update_query_servers:render_update(Lang, UpFun,
                                                   nil, nil, Req, Db),
%% proplists:get_value/2 defaults to 'undefined', which is what the
%% first clause below matches on
case proplists:get_value(<<"results">>, FormResp) of
    undefined ->
        throw({error, no_result});
    ResultList ->
        Res = pmap(fun update_db/2, ResultList, Db)
end,
Is there anything wrong with this? my_update_query_servers returns a
collection of JSON objects, and this looks to me like a parallel
write. And since gather/2 collects one result per spawned process, I
should be able to reply to the caller with the list of created
documents, shouldn't I?
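To be concrete, I was picturing extending the ResultList branch above
along these lines (sketch only, untested; it assumes update_db/2 is
changed to return the DocId instead of {ok, _NewRev}, and that
couch_httpd:send_json/3 is the right call for answering the request):

    ResultList ->
        %% Res would then be a list of binary uuids, one per written doc
        Res = pmap(fun update_db/2, ResultList, Db),
        couch_httpd:send_json(Req, 201, {[{<<"ids">>, Res}]})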
Norman