[ https://issues.apache.org/jira/browse/CASSANDRA-14982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Shuler updated CASSANDRA-14982:
---------------------------------------
    Fix Version/s:     (was: 3.11.3)
                       3.11.x

> PicklingError: Can't pickle <class 'cqlshlib.copyutil.ImmutableDict'>:
> attribute lookup cqlshlib.copyutil.ImmutableDict failed
> ------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: CASSANDRA-14982
>                 URL: https://issues.apache.org/jira/browse/CASSANDRA-14982
>             Project: Cassandra
>          Issue Type: Bug
>          Components: CQL/Interpreter
>         Environment: cqlsh --version
>                      cqlsh 5.0.1
>            Reporter: Vaclav
>            Priority: Normal
>             Fix For: 3.11.x
>
>
> Hello,
> I am trying to load records from a file containing some very long lines (up
> to 180,000 characters). In some cases the order of lines in the file causes
> the error 'Error from server: code=2200 [Invalid query] message="Batch too
> large"', which is caught and printed in copyutil.py (class SendingChannel,
> method feed()). However, the error is only printed: the records are not
> loaded, no error file with unimported rows is created, and the import
> continues. At the end, cqlsh returns exit code 0 even though not all rows
> were imported. The reported number of imported rows is also wrong: it shows
> the number of records in the file, but in fact fewer records were loaded.
> So I cannot rely on the return code to tell whether the COPY command loaded
> all rows. The only way to find out that something went wrong and the data
> were not loaded correctly is to watch for the error message on screen, which
> is a problem when this happens inside a very long import script (a script
> loading many tables).
> I think that when not all rows are loaded correctly the return code should
> not be 0, or the command should exit with an error immediately when records
> fail to load.
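Until the exit code is fixed, one possible workaround for long import scripts is to capture cqlsh's output and scan it for error markers instead of trusting the return code. A minimal sketch (the helper name and the marker list are illustrative, not part of cqlsh):

```python
import re

# Workaround sketch (hypothetical helper, not part of cqlsh): since COPY can
# exit 0 even when rows fail to import, scan the captured cqlsh output for
# known error markers instead of trusting the process exit code.
ERROR_MARKERS = re.compile(r"PicklingError|Batch too large|Failed to import")

def copy_succeeded(output: str) -> bool:
    """Return False if the captured cqlsh output contains an error marker."""
    return ERROR_MARKERS.search(output) is None

# In a real import script the output would come from something like:
#   subprocess.run(["cqlsh", "127.0.0.1", "-e", "copy ks.tbl from '...'"],
#                  capture_output=True, text=True).stdout
print(copy_succeeded("38420 rows imported from 1 files in 5.584 seconds"))
print(copy_succeeded("PicklingError: Can't pickle ..."))
```

Running the two sample checks prints True for a clean-looking import and False once an error marker appears, so the surrounding script can abort explicitly.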
> $ cqlsh 127.0.0.1 --request-timeout="3600" -e "copy woc.item_container from
> '/tmp/cexport/woc/item_container.csv' with escape='\"' and null=null and
> header=True"
> Reading options from the command line: {'header': 'True', 'null': 'null',
> 'escape': '"'}
> Using 7 child processes
> Starting copy of woc.item_container with columns [container_id, capacity,
> classes, instances, owner_id, type].
> PicklingError: Can't pickle <class 'cqlshlib.copyutil.ImmutableDict'>:
> attribute lookup cqlshlib.copyutil.ImmutableDict failed
> (the PicklingError line above repeats 10 times in total)
> Processed: 38420 rows; Rate: 7455 rows/s; Avg. rate: 6881 rows/s
> 38420 rows imported from 1 files in 5.584 seconds (0 skipped).
> $ echo $?
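The PicklingError in the transcript is the generic failure pickle raises when a class cannot be looked up under its recorded module path, which happens when COPY hands objects to its child worker processes. A minimal reproduction of the same failure mode, unrelated to cqlsh internals (the class name here just mirrors the one in the traceback):

```python
import pickle

def make_class():
    # A class created at function scope is not reachable as an attribute of
    # its module, so pickle's lookup by qualified name fails when an
    # instance is serialized (e.g. to send it to a child process).
    class ImmutableDict(dict):
        pass
    return ImmutableDict

cls = make_class()
try:
    pickle.dumps(cls())
except (pickle.PicklingError, AttributeError) as e:
    # Python 2 raised PicklingError here (as in the cqlsh output);
    # Python 3 raises AttributeError with a similar "Can't pickle" message.
    print(type(e).__name__, e)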
> 0 -- This message was sent by Atlassian JIRA (v7.6.3#76005) --------------------------------------------------------------------- To unsubscribe, e-mail: commits-unsubscr...@cassandra.apache.org For additional commands, e-mail: commits-h...@cassandra.apache.org