On Mon, 2023-02-27 at 06:28 +0000, Jan Bilek wrote:
> Our customer was able to sneak some Unicode data into a column of JSON 
> type, and now that record fails on select.
> Would you be able to suggest any way out of this? E.g. finding infringing 
> row, updating its data ... ?

I'd be curious to know how the customer managed to do that.
Perhaps there is a loophole in PostgreSQL that needs to be fixed.

First, find the table that contains the column.
Then you can try something like

  DO
  $$DECLARE
     pkey bigint;
  BEGIN
     FOR pkey IN SELECT id FROM jsontab LOOP
        BEGIN  -- inner block, so the exception can be trapped per row
           -- extracting a key forces the stored JSON text to be parsed
           PERFORM jsoncol -> 'creationDateTime'
           FROM jsontab
           WHERE id = pkey;
        EXCEPTION
           WHEN untranslatable_character THEN  -- SQLSTATE 22P05
              RAISE NOTICE 'bad character in row with id = %', pkey;
        END;
     END LOOP;
  END;$$;
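Once you have the offending id, one possible repair (untested, and assuming
the culprit is a \u0000 escape, which "json" accepts on input but cannot
always process on output) is to cast the value to text, strip the escape,
and cast back.  The table name "jsontab", column "jsoncol" and id 42 are
placeholders:

  -- back up the row first, e.g. with CREATE TABLE ... AS SELECT
  BEGIN;
  UPDATE jsontab
  SET jsoncol = replace(jsoncol::text, '\u0000', '')::json
  WHERE id = 42;  -- the id reported by the DO block
  -- verify with a SELECT before you COMMIT
  COMMIT;

If the problem is a different untranslatable character, adjust the
replace() accordingly once you have seen the raw text of the value.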

Yours,
Laurenz Albe
-- 
Cybertec | https://www.cybertec-postgresql.com