I am using saveAsTextFile() to store the results of a Spark job in the folder dbfs:/FileStore/my_result.
I can access the different "part-xxxxx" files in a web browser, but I would like to download them to my local machine.
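For context, the write itself looks roughly like this (a minimal sketch; the sample data is purely illustrative and `sc` is the SparkContext a Databricks notebook provides):

```python
# Illustrative only: write an RDD as text files into dbfs:/FileStore/my_result.
# The data here is made up; `sc` is the notebook's SparkContext.
rdd = sc.parallelize(["alpha", "beta", "gamma"])
rdd.saveAsTextFile("dbfs:/FileStore/my_result")  # produces part-00000, part-00001, ...
```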
Using a browser, you can access individual files in the FileStore, but you cannot access or even list directories. So you first have to put a file into the FileStore. If you have a file "example.txt" at "/FileStore/example_directory/", you can download it via the following URL:
https://community.cloud.databricks.com/files/example_directory/example.txt?o=###
In that URL, "###" has to be replaced by the long number you find at the end of your Community Edition URL (after you have logged into your Community Edition account).
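As a small illustration, the URL can be assembled like this (a sketch only; the org number and file path below are placeholders, not real values):

```python
# Build the Community Edition download URL for a file stored under dbfs:/FileStore/.
# org_id is the long number after "?o=" in your workspace URL; both values below
# are placeholders for illustration.
org_id = "1234567890123456"
file_path = "example_directory/example.txt"   # path relative to /FileStore/

url = f"https://community.cloud.databricks.com/files/{file_path}?o={org_id}"
print(url)
# -> https://community.cloud.databricks.com/files/example_directory/example.txt?o=1234567890123456
```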
There are a few options for downloading FileStore files to your local machine.
Easier options:
- Use the Databricks CLI's dbfs cp command. For example: dbfs cp dbfs:/FileStore/test.txt ./test.txt. If you want to download an entire folder of files, you can use dbfs cp -r.
- Point your browser at https://<YOUR_DATABRICKS_INSTANCE_NAME>.cloud.databricks.com/files/. If you are using Databricks Community Edition then you may need to use a slightly different path. This download method is described in more detail in the FileStore docs.

Advanced options:
- Use the DBFS REST API. For large files you may need to issue multiple read calls to access chunks of the full file (see the sketch after this list).
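To make that advanced option concrete, here is a rough sketch of a chunked download using the DBFS read endpoint. It assumes a regular (non-Community) workspace and a personal access token; the host, token, and file name below are placeholders rather than values from this post:

```python
import base64
import requests

# Sketch only: fetch a DBFS file by issuing repeated /api/2.0/dbfs/read calls,
# each of which returns up to 1 MB of base64-encoded data.
# HOST, TOKEN, and the file path are placeholders; substitute your own.
HOST = "https://<YOUR_DATABRICKS_INSTANCE_NAME>.cloud.databricks.com"
TOKEN = "<YOUR_PERSONAL_ACCESS_TOKEN>"
CHUNK = 1024 * 1024  # maximum bytes returned per read call

def download_dbfs_file(dbfs_path, local_path):
    offset = 0
    with open(local_path, "wb") as out:
        while True:
            resp = requests.get(
                f"{HOST}/api/2.0/dbfs/read",
                headers={"Authorization": f"Bearer {TOKEN}"},
                params={"path": dbfs_path, "offset": offset, "length": CHUNK},
            )
            resp.raise_for_status()
            payload = resp.json()
            n = payload["bytes_read"]
            if n > 0:
                out.write(base64.b64decode(payload["data"]))
                offset += n
            if n < CHUNK:
                break  # a short (or empty) read means the end of the file was reached

download_dbfs_file("/FileStore/my_result/part-00000", "part-00000.txt")
```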