Converting CSV to JSON in bash

梦毁少年i · 2021-02-04 03:12

Trying to convert a CSV file into JSON.

Here are two sample lines:

-21.3214077;55.4851413;Ruizia cordata
-21.3213078;55.4849803;Cossinia pinnata
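
For reference, the jq answer below produces a nested structure along these lines for the two sample rows (an inferred illustration, not part of the original question; the coordinates come out as JSON strings because that filter splits each line on ";"):

{"occurrences":
  [{"position": ["-21.3214077", "55.4851413"],
    "taxo": {"espece": "Ruizia cordata"}},
   {"position": ["-21.3213078", "55.4849803"],
    "taxo": {"espece": "Cossinia pinnata"}}]}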


        
9 Answers
  • 2021-02-04 03:40

    Because the jq solution does not handle CSV escaping, column names on the first line, commented-out lines, and other common CSV "features", I have extended the CSV Cruncher tool to allow reading CSV and writing it as JSON. It's not exactly "Bash", but neither is jq :)

    It's primarily a CSV-as-SQL processing app, so it's not completely trivial, but here is the trick:

    ./crunch -in myfile.csv -out output.csv --json -sql 'SELECT * FROM myfile'
    

    It also allows output as a JSON object per line or as a proper JSON array. See the documentation.

    It's in beta quality, so all feedback or pull requests are welcome.

  • 2021-02-04 03:41

    Here's a Python one-liner that'll do the trick:

    cat my.csv | python -c 'import csv, json, sys; print(json.dumps([dict(r) for r in csv.DictReader(sys.stdin)]))'
    
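    Note that csv.DictReader defaults to comma-separated input with a header row, whereas the sample data in the question is semicolon-delimited and has no header. A variant along the following lines should handle that case (the field names longitude/latitude/species are made up here purely for illustration):

    cat my.csv | python -c 'import csv, json, sys; print(json.dumps([dict(zip(["longitude", "latitude", "species"], row)) for row in csv.reader(sys.stdin, delimiter=";")]))'
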
  • 2021-02-04 03:48

    In general, if your jq has the inputs built-in filter (available since jq 1.5), then it is better to use it rather than the -s command-line option.

    In any case, here is a solution using inputs. This solution is also variable-free.

    {"occurrences":
      [inputs
       | select(length > 0)
       | . / ";"
       | {"position": [.[0], .[1]], 
          "taxo": {"espece": .[2]}} ]}
    

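    Since the filter uses inputs to read raw text lines, it should be invoked with both -n (so the first line is not consumed as the initial input) and -R (raw string input). For example, assuming the filter above is saved as csv2json.jq and the data is in input.csv:

    jq -n -R -f csv2json.jq input.csv
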
    SSV, CSV, and all that

    The above of course assumes that the file has semicolon-separated fields in each line, and that there are none of the complications associated with CSV files.

    If the input has fields that are strictly delimited by a single character, then jq should have no problems handling it. Otherwise, it might be best to use a tool that can reliably convert to the TSV (tab-separated value) format, which jq can handle directly.
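
    For example, one way to get such a TSV is a small filter around Python's csv module (a rough sketch; the file names data.csv and data.tsv are placeholders, and it assumes the fields do not themselves contain tabs or newlines, since jq will not undo any quoting the writer adds):

    # read comma-separated CSV on stdin, write tab-separated lines on stdout
    python -c 'import csv, sys; csv.writer(sys.stdout, delimiter="\t", lineterminator="\n").writerows(csv.reader(sys.stdin))' < data.csv > data.tsv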
