[email protected] wrote:
> One of the main tasks our applications have is to import testing results from
> data files in a variety of formats.
>
> In VFP, the basic operation is:
>
> 1. read data file and append or insert data into a VFP cursor.
> 2. validate/massage/interpret data contained in cursor.
> 3. scan cursor and insert validated data into VFP tables.
If it were me, I'd use basic Python to get the data from the text files into a
Python list of dictionaries, and then I'd instantiate a Dabo bizobj and iterate
over that list, like:
{{{
# assume rows is the list of dicts
# assume every key in each row dict exactly matches a field name in the bizobj
for row_num, row_dict in enumerate(rows):
    print "Processing row %s" % row_num
    biz.new()
    biz.setFieldVals(**row_dict)
print "attempting to save"
biz.saveAll()
if biz.isAnyChanged():
    print "something went wrong in biz.saveAll()"
}}}
This keeps the simple stuff simple, and still lets the bizobj validate every
row and roll back the entire operation if something didn't validate.
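For the first step (getting a text file into that list of dicts), the stdlib
csv module would do for delimited formats. A sketch, with invented field names
(in practice they'd have to match the bizobj's field names for
setFieldVals(**row_dict) to work):

```python
import csv
import io

# Stand-in for one of the test-result data files (contents are made up).
sample_file = io.StringIO("serial,reading\nA001,3.2\nA002,4.7\n")

# DictReader keys each data row by the header line, producing exactly
# the list-of-dicts shape the bizobj loop above iterates over.
rows = list(csv.DictReader(sample_file))

print(rows[0]["serial"])  # first row's serial field
```

For fixed-width or other oddball formats you'd write a small parser per
format, but the goal is the same: end up with one dict per row.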
Paul
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: http://leafe.com/archives/byMID/[email protected]