Question:
When importing a record with a large field (longer than 124214 characters), I am getting the error
"field larger than field limit (131072)"
I saw from other posts how to solve this in Python, but I don't know whether it is possible in cqlsh.
Thanks
Answer 1:
Take a look at this answer:
_csv.Error: field larger than field limit (131072)
You will need to apply that fix near the top of the cqlsh script itself. After the existing imports:
import csv
import getpass
add the line:
csv.field_size_limit(sys.maxsize)
(cqlsh already imports sys, so no additional import is needed.)
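The effect of that one line can be demonstrated with the standard library alone. The snippet below is a standalone sketch, not cqlsh itself; it shows the default limit that COPY FROM runs into and a raised limit that lets an oversized field parse:

```python
import csv
import io

# csv.field_size_limit() with no argument returns the current limit;
# the stdlib default of 131072 bytes is exactly the limit cqlsh hits.
default_limit = csv.field_size_limit()

# Raise it, as the patched cqlsh would. (The linked answer uses
# sys.maxsize; on some platforms that overflows the C long the csv
# module stores internally, so an explicit large value is a safer sketch.)
csv.field_size_limit(10 * 1024 * 1024)  # 10 MiB

# A field longer than the old default now parses without _csv.Error.
big_field = "x" * 200000
rows = list(csv.reader(io.StringIO(big_field + ",y\r\n")))
print(len(rows[0][0]))  # 200000
```

Note that csv.field_size_limit(new_value) returns the previous limit, so you could save and restore it if you only need the larger limit temporarily.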
Answer 2:
Rather than hacking the cqlsh script, Cassandra provides a standard option for changing field_size_limit. A tarball distribution of Cassandra includes a cqlshrc.sample file in its conf directory; the field_size_limit option can be found and changed in that file. To make cqlsh read its options from it, copy cqlshrc.sample from the conf directory to the hidden .cassandra folder in your user home directory and rename it to cqlshrc.
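The relevant fragment of the resulting cqlshrc would look roughly like this (the [csv] section name follows the cqlshrc.sample shipped with Cassandra; the limit value below is an illustrative choice, not a recommended setting):

```
[csv]
; Raise the per-field size limit used by COPY.
; The default matches Python's csv module default of 131072 bytes.
field_size_limit = 1000000000
```

After saving the file, restart cqlsh so it picks up the new option.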
Cassandra documentation contains more details about it: http://docs.datastax.com/en/cql/3.1/cql/cql_reference/cqlsh.html?scroll=refCqlsh__cqlshUsingCqlshrc
Source: https://stackoverflow.com/questions/24168235/cassandra-cqlsh-text-field-limit-on-copy-from-csv-field-larger-than-field-limit