Hi again Clinton,
 
I got some sleep, looked at my code again, and I now stand corrected on a few issues.  The insertCampusCourse method was working properly.  There is a simple rule about batching that I had forgotten: you cannot query for data that is currently being batched, because none of the batched statements have been committed yet.  So, I modified the insertCampusCourse method accordingly.  It now looks like the following:
 
Map<String, CampusCourseVO> campusCourseMap = new HashMap<String, CampusCourseVO>();

public CampusCourseVO insertCampusCourse(BaseExam exam, String protectedCourseId) throws Exception {
    final TermVO term = new TermVO();
    term.setTermId(DEFUALT_SYS_TERM_ID);

    // Look up the campus and course by their protected ids.
    CampusVO campus = campusService.findByProtectedId(exam.getEscoId());
    CourseVO course = courseService.findByProtectedId(protectedCourseId);

    CampusCourseVO campusCourse = new CampusCourseVO();
    campusCourse.setTerm(term);
    campusCourse.setCampus(campus);
    campusCourse.setCourse(course);

    // Build a unique term|campus|course key for the cache.
    String key = term.getTermId() + "|" + campus.getCampusId() + "|" + course.getCourseId();

    // If an insert for this combination has already been batched, return the
    // cached VO instead of querying for a row that hasn't been committed yet.
    final CampusCourseVO temp = campusCourseMap.get(key);
    if (temp == null) {
        campusCourseService.insert(campusCourse);
        campusCourseMap.put(key, campusCourse);
        return campusCourse;
    }
    return temp;
}


In essence, I'm now caching CampusCourse value objects in a HashMap after they have been inserted.  This works quite well because, out of the 400,000 student exams taken for the given course, only 1500 campuses gave the exam, so the map stays small.
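
Just to illustrate the idea, my load loop ends up looking roughly like this (the loadExams name and the exams collection are placeholders, not my actual code); repeated campus/course combinations come back from the map instead of triggering a mid-batch query:

public void loadExams(List<BaseExam> exams, String protectedCourseId) throws Exception {
    for (BaseExam exam : exams) {
        // Only the first occurrence of a term|campus|course combination
        // batches an insert; every repeat returns the cached VO.
        CampusCourseVO campusCourse = insertCampusCourse(exam, protectedCourseId);
        // ... build and insert the student exam row using campusCourse ...
    }
}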
 
With this being the case, I'm going on the assumption that my BatchControllerService is also working properly.  BatchControllerService calls BatchControllerDao, and BatchControllerDao uses getSqlMapExecutor().startBatch() and getSqlMapExecutor().executeBatch() for batching.  So, if I've understood things correctly, SqlMap can batch all SQL statements issued between sqlMap.startBatch() and sqlMap.executeBatch() (see code below):
 
public class BatchControllerServiceImpl extends ServiceBase implements BatchControllerService {
  BatchControllerDao dao;

  public BatchControllerServiceImpl() {
    dao = (BatchControllerDao) daoManager.getDao(BatchControllerDao.class);
  }

  public void startBatch() {
    dao.startBatch();
  }

  public int executeBatch() {
    return dao.executeBatch();
  }
}

public class SqlMapBatchControllerDao extends SqlMapBaseDao implements BatchControllerDao {

  public SqlMapBatchControllerDao(DaoManager daoManager) {
    super(daoManager);
  }

  public void startBatch() {
    try {
       getSqlMapExecutor().startBatch();
    } catch (SQLException e) {
      throw new DaoException("Unable to startBatch", e);
    }
  }

  public int executeBatch() {
    try {
       return getSqlMapExecutor().executeBatch();
    } catch (SQLException e) {
      throw new DaoException("Unable to startBatch", e);
    }
  }
}
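
In case it helps to see the whole flow, here is roughly how I drive a run (the studentExamService call and the surrounding loop are simplified stand-ins for my real load logic; the transaction calls are the usual DaoManager ones):

daoManager.startTransaction();
try {
    batchControllerService.startBatch();
    for (BaseExam exam : exams) {
        CampusCourseVO campusCourse = insertCampusCourse(exam, protectedCourseId);
        studentExamService.insert(exam, campusCourse);  // stand-in for my real insert
    }
    // Flush everything queued since startBatch() in one shot.
    int rowsAffected = batchControllerService.executeBatch();
    daoManager.commitTransaction();
} finally {
    daoManager.endTransaction();
}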

So, I now have only two questions:

1) Are my above assumptions correct?
2) Is this a good way to handle batching with iBATIS DAOs?

Thank you for your patience.

Hycel
