rohdesamuel commented on a change in pull request #11163: [BEAM-9548] Add better error handling to the TestStreamServiceController
URL: https://github.com/apache/beam/pull/11163#discussion_r397344411
##########
File path: sdks/python/apache_beam/runners/interactive/caching/streaming_cache.py
##########

@@ -202,14 +207,24 @@ def _emit_from_file(self, fh, tail):
         # The first line at pos = 0 is always the header. Read the line without
         # the new line.
         to_decode = line[:-1]
-        if pos == 0:
-          header = TestStreamFileHeader()
-          header.ParseFromString(self._coder.decode(to_decode))
-          yield header
+        proto_cls = TestStreamFileHeader if pos == 0 else TestStreamFileRecord
+        msg = self._try_parse_as(proto_cls, to_decode)
+        if msg:
+          yield msg
         else:
-          record = TestStreamFileRecord()
-          record.ParseFromString(self._coder.decode(to_decode))
-          yield record
+          break
+
+  def _try_parse_as(self, proto_cls, to_decode):
+    try:
+      msg = proto_cls()
+      msg.ParseFromString(self._coder.decode(to_decode))
+    except DecodeError:
+      _LOGGER.error(
+          'Could not parse as %s. This can indicate that the cache is '
+          'corrupted. Please restart the kernel. '
+          '\nfile: %s \nmessage: %s', proto_cls, self._path, to_decode)
+      msg = None
+    return msg

Review comment:
   No, if we re-throw the exception it will get handled by the gRPC layer above and turned into gobbledygook. This does stop processing because the helper returns msg = None, which breaks out of the loop.
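For readers following along, here is a minimal, self-contained sketch of the pattern the comment describes (the names `try_parse` and `read_records` are illustrative stand-ins, not the actual Beam code): the parse helper logs the failure and returns None instead of re-raising, and the reading generator treats None as the signal to stop yielding.

```python
import logging

_LOGGER = logging.getLogger(__name__)


def try_parse(raw):
  """Stand-in for _try_parse_as: returns the parsed value, or None on failure."""
  try:
    # In the real code this would be proto_cls().ParseFromString(coder.decode(raw)).
    return int(raw)
  except ValueError:
    # Log locally and return None rather than re-raising; re-raising would
    # surface through the gRPC layer as an unreadable wrapped error.
    _LOGGER.error('Could not parse %r; the cache may be corrupted.', raw)
    return None


def read_records(lines):
  """Stand-in for _emit_from_file: yields records until one fails to parse."""
  for raw in lines:
    msg = try_parse(raw)
    if msg is None:
      break  # Stop processing cleanly at the first corrupted entry.
    yield msg


print(list(read_records(['1', '2', 'garbage', '3'])))  # prints [1, 2]
```

Letting the exception propagate out of the generator would instead reach the user through the gRPC servicer as an opaque error, which is what the comment is arguing against.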