Using the ff package of R, I imported a csv file into a ffdf object, but was surprised to find that the object occupied some 700MB of RAM. Isn't ff supposed to keep data on disk rather than in RAM?
I had the same problem and posted a question about it, and there is a possible explanation for your issue. When you read a file, character columns are treated as factors, and if they have many unique levels, those levels go into RAM. ff appears to always keep factor levels in RAM. See this answer from jwijffels in my question:
Loading ffdf data take a lot of memory
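As a quick way to check whether this is what is happening to you, here is a minimal sketch (the file name "big.csv" and the data are hypothetical) that imports a csv with read.csv.ffdf and then counts the factor levels per column. A column with millions of unique strings, such as an ID, can account for hundreds of MB of RAM even though the values themselves are stored on disk.

    library(ff)

    # Data goes to disk, but factor levels of character columns stay in RAM
    dat <- read.csv.ffdf(file = "big.csv", header = TRUE)

    # Number of levels per column; large counts point to the columns
    # responsible for the memory use
    sapply(physical(dat), function(x) {
      if (is.factor(x)) length(levels(x)) else NA_integer_
    })

    # RAM used by the ffdf wrapper, including the factor level tables
    object.size(dat)

If a high-cardinality column is not needed, skipping it at import time (for example via colClasses = "NULL" for that column) should avoid loading its levels at all.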
best, miguel.