liferoad commented on issue #35286: URL: https://github.com/apache/beam/issues/35286#issuecomment-2973022085
I think the `NullPointerException` during display data population is a classic symptom of using an anonymous inner class for a transform's logic, in this case the `JdbcIO.RowMapper`. Anonymous inner classes keep an implicit reference to their enclosing instance, which trips up Beam's serialization and display-data machinery. To fix this, the `RowMapper` should be refactored into a static nested class. Try this diff:

```diff
--- Original user code (problematic)
+++ Corrected code
- // Anonymous inner class for RowMapper
- .withRowMapper(
-     new JdbcIO.RowMapper() {
-       private static final long serialVersionUID = 1L;
-       public Customer1 mapRow(ResultSet rs) throws SQLException {
-         Customer1 cust = new Customer1();
-         cust.setCustomerId(rs.getInt("customerId"));
-         cust.setName(rs.getString("name"));
-         cust.setEmail(rs.getString("email"));
-         cust.setAddress(rs.getString("address"));
-         return cust;
-       }
-     })
- .withCoder(SerializableCoder.of(Customer1.class));
+ // Define the RowMapper as a static nested class elsewhere in the file
+ private static class CustomerRowMapper implements JdbcIO.RowMapper<Customer1> {
+   @Override
+   public Customer1 mapRow(ResultSet rs) throws SQLException {
+     Customer1 cust = new Customer1();
+     cust.setCustomerId(rs.getInt("customerId"));
+     cust.setName(rs.getString("name"));
+     cust.setEmail(rs.getString("email"));
+     cust.setAddress(rs.getString("address"));
+     return cust;
+   }
+ }

+ // In the main pipeline construction:
- JdbcIO.ReadWithPartitions readTransform = JdbcIO.readWithPartitions()
+ // Parameterize the transform with the output type <Customer1>
+ // (the second type parameter is the partition column type, Long by default)
+ JdbcIO.ReadWithPartitions<Customer1, Long> readTransform = JdbcIO.<Customer1>readWithPartitions()
+     // ... (DataSourceConfiguration, etc.)
+     // Use the new static nested class for the RowMapper
+     .withRowMapper(new CustomerRowMapper());
+ // The .withCoder() call can usually be dropped, since Beam can now
+ // infer the coder from the RowMapper's type parameter.

- pipeline.apply("ReadFromMySQL", readTransform)
-     .apply("ToCSV", MapElements.via(new CustomerToCSV()))
+ pipeline.apply("ReadFromMySQL", readTransform)
+     // Add a type hint for robustness
+     .apply("ToCSV", MapElements.into(TypeDescriptors.strings()).via(new CustomerToCSV()))
```
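To see why the anonymous class is the culprit, here is a minimal, self-contained sketch of the hidden-reference problem (no Beam dependency; `CaptureDemo`, `RowMapper`, and the mapper names are all hypothetical stand-ins for the user's code). An anonymous inner class that touches enclosing state carries a synthetic reference to the outer instance, so Java serialization fails on it, while the static nested version serializes cleanly:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for JdbcIO.RowMapper, just to demonstrate capture.
public class CaptureDemo {
  interface RowMapper<T> extends Serializable {
    T map(String row);
  }

  private final String tableName = "customers"; // enclosing, non-serializable state

  // Anonymous inner class: referencing `tableName` forces it to keep a hidden
  // this$0 reference to the enclosing CaptureDemo, which is NOT Serializable.
  RowMapper<Integer> anonymousMapper() {
    return new RowMapper<Integer>() {
      public Integer map(String row) {
        return tableName.length() + row.length();
      }
    };
  }

  // Static nested class: no hidden reference to the enclosing instance.
  static class StaticMapper implements RowMapper<Integer> {
    public Integer map(String row) {
      return row.length();
    }
  }

  static boolean serializes(Object o) {
    try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
      out.writeObject(o);
      return true;
    } catch (IOException e) { // NotSerializableException for the anonymous case
      return false;
    }
  }

  public static void main(String[] args) {
    System.out.println(serializes(new CaptureDemo().anonymousMapper())); // false
    System.out.println(serializes(new StaticMapper()));                  // true
  }
}
```

The same hidden reference is what Beam walks into when it serializes the transform or extracts display data, which is why moving the mapper to a static nested class makes the symptom disappear.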