Is concept_val of type double? If not, you might want to cast it to the appropriate type. MAX assumes that a bytearray is really a double, and it throws an exception if the conversion (Double.valueOf(dba.toString())) fails. (An exception on type-conversion failure is not the norm; Pig usually just increments a warning counter when that happens.)
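One way to avoid the failed cast is to declare the type in the load schema, so Pig converts concept_val up front instead of MAX tripping over a bytearray at runtime. A minimal sketch, reusing the load statement from your script (the only change is the :double annotation; adjust if concept_val can hold non-numeric values, since those would become null with a warning):

  -- declare concept_val as double at load time so MAX receives a real
  -- double rather than a bytearray that may fail Double.valueOf()
  oc1 = load 'observation_concept.csv' as (obs_id, concept_id, concept_val:double);

With the field typed in the schema, rows whose concept_val cannot be parsed as a double turn into nulls (and bump the warning counter) rather than killing the job inside MAX.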

-Thejas



On 8/19/11 8:33 AM, Thejas Nair wrote:
The full exception stack trace will be useful. It should be in the Pig
log file on the client side, or in the MR task log file.
-thejas.
typed on a tiny virtual keyboard

On Aug 19, 2011 7:42 AM, "ipshita chatterji" <[email protected]> wrote:
 > Hi,
 > Please see the code snippet below:
 >
 > register pig.jar;
 > register piggybank.jar;
 >
 > o1 = load 'observations.csv' as (obs_id, encounter_id, sub_form_id,
 > observed_by, verified_by, remark);
 >
 > oc1 = load 'observation_concept.csv' as (obs_id, concept_id,
 > concept_val);
 >
 > l1 = load 'locations.csv' as (location_id, longitude, latitude, address1,
 > address2, village, town,city, state_province, postal_code, country,
 > is_person_address);
 >
 > e1 = load 'encounters.csv' as (encounter_id, person_id, location_id,
 > encounter_date_time);
 >
 > p1 = load 'persons.csv' as (person_id, gender, given_name, middle_name,
 > family_name, birth_date,birth_date_estimated, birth_place, mothers_name,
 > spouses_name,death_date, death_date_estimated, location_id,
 > marriage_date,marriage_date_estimated, entry_date, marriage_status,
 > contact_number,father_name);
 >
 > pid = load 'person__patient_id_type.csv' as (patient_id_type_id,
 > person_id, patient_id);
 >
 > oc1 = filter oc1 by concept_id == 317;
 >
 > temp = join oc1 by obs_id, o1 by obs_id;
 >
 > temp = join temp by o1::encounter_id, e1 by encounter_id;
 >
 > temp = join p1 by person_id, temp by e1::person_id;
 >
 > temp = join l1 by location_id, temp by p1::location_id;
 >
 > temp = join pid by person_id, temp by p1::person_id;
 >
 >
 > temp = group temp by (p1::person_id);
 > temp = foreach temp generate flatten(temp), MAX(temp.oc1::concept_val) as
 > DeliveryDate;
 >
 > Every time I try to execute it, I get the following error:
 >
 > 2011-08-19 04:11:29,561 [main] INFO
 > org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 > - Some jobs have failed! Stop running all dependent jobs
 > 2011-08-19 04:11:29,572 [main] ERROR org.apache.pig.tools.grunt.Grunt -
 > ERROR 2997: Unable to recreate exception from backed error:
 > org.apache.pig.backend.executionengine.ExecException: ERROR 2103: Problem
 > while computing max of doubles.
 >
 > Any clue why this happens?
 >
 > Thanks,
