I am trying to read a CSV file that has barcodes in the first column, but when R gets it into a data.frame, it converts 1665535004661 to 1.67E+12.
Picking up on what you said in the comments, you can import the text directly as character by specifying colClasses in read.table(). For example:
num <- "1665535004661"
dat.char <- read.table(text = num, colClasses="character")
str(dat.char)
#------
'data.frame': 1 obs. of 1 variable:
$ V1: chr "1665535004661"
dat.char
#------
V1
1 1665535004661
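Applied to the original problem, the same approach works with read.csv(). Here is a minimal sketch, assuming a hypothetical file barcodes.csv with a header row whose barcode column is literally named "barcode" (adjust the file name and column name to match your data):

# force the barcode column to character so the digits are never rounded
dat <- read.csv("barcodes.csv", colClasses = c(barcode = "character"))
str(dat$barcode)   # chr, e.g. "1665535004661"

Because that column never passes through numeric conversion, the barcodes keep every digit.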
Alternatively (and for other uses), you can set the digits option via options(). The default is 7 digits and the acceptable range is 1-22. To be clear, setting this option in no way changes or alters the underlying data; it merely controls how the data are displayed on screen when printed. From the help page for ?options:
controls the number of digits to print when printing numeric values. It is a suggestion only. Valid values are 1...22 with default 7. See the note in print.default about values greater than 15.
Example illustrating this:
options(digits = 7)
dat <- read.table(text = num)
dat
#------
V1
1 1.665535e+12
options(digits = 22)
dat
#------
V1
1 1665535004661
To flesh this out completely, and to account for cases where changing a global setting is not preferable, you can pass digits directly as an argument to print(foo, digits = bar). You can read more about this under ?print.default. This is what John describes in his answer, so credit should go to him for illuminating that nuance.
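For instance, with the global option back at its default, the digits argument can still be passed to print() for the dat object created above:

options(digits = 7)
print(dat, digits = 22)
#------
V1
1 1665535004661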
You can also use the numerals argument when calling read.csv(). So, for example, where x is your data file:

read.csv(x, sep = ";", numerals = c("no.loss"))

This preserves the value of the long integers and doesn't mess with their representation when you import the data.
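As a rough illustration, using inline text instead of a file and a hypothetical 18-digit barcode (numerals only changes the outcome when converting the number to double precision would actually lose accuracy):

# hypothetical barcode with more digits than a double can store exactly
long <- "x\n166553500466112345"
dat <- read.csv(text = long, numerals = "no.loss")
str(dat)
#------
'data.frame': 1 obs. of 1 variable:
 $ x: chr "166553500466112345"

With "no.loss", the column is kept as character rather than being rounded to the nearest representable double.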