Replace 'csv_file.csv' with your filename.
Using the CSV tool `miller` you can transform a CSV file into a JSON array of objects, where the keys are taken from the CSV header line and the values from the subsequent lines.
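A minimal sketch of such an invocation, assuming the input file is named file.csv (the exact flags of the original command are not shown here):
mlr --icsv --ojson cat file.csv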
More of the same but with more elaborate perl-fu :-)
Will handle pretty much all types of CSV files. The ^M character is typed on the command line using Ctrl-V Ctrl-M and can be replaced with any character that does not appear inside the CSV. Tip for simpler CSV files: if newlines never occur inside a CSV cell, you can replace `map(repr, r)` with `r`.
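A rough sketch of the kind of one-liner being described, assuming Python's csv module does the parsing and ^M (typed as described above) serves as the temporary separator; this is a guess at the shape of the command, not the original:
python3 -c 'import csv,sys; [print("^M".join(map(repr, r))) for r in csv.reader(sys.stdin)]' < file.csv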
Useful for CSV files. In the command, the file in question is comma-delimited but contains double-quoted fields that themselves contain commas, and contains no @ symbols (as confirmed with http://www.commandlinefu.com/commands/view/9998/delimiter-hunting). The command converts the delimiting commas to @s while preserving the commas inside the quoted fields, using the "uniqueString" to mark the ends of lines.
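A different way to get the same effect, offered only as a sketch using GNU awk's FPAT feature rather than the uniqueString trick described above (file.csv is a placeholder):
gawk -v FPAT='[^,]*|"[^"]*"' -v OFS='@' '{$1=$1; print}' file.csv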
Define a function that applies bc, the *nix calculator, with the specified expression to all rows of the input CSV. The first column is mapped to {1}, the second to {2}, and so forth. This function uses all available cores thanks to GNU Parallel. Requires GNU Parallel.
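A minimal sketch of such a function, assuming GNU Parallel's --colsep option does the column splitting; the function name and exact quoting here are guesses, not the original:
csvcalc () { parallel --colsep ',' "echo \"$1\" | bc -l"; }
Usage example: csvcalc '{1} * {2}' < file.csv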
Based on / inspired by malathion's command below: http://www.commandlinefu.com/commands/view/20528/convert-csv-to-json. It is written for python3 and is very easy to use: csv2json *.csv will convert all files ending in .csv to JSON, e.g. csv2json file.csv will write its output to file.json. Validity of the JSON was tested in python3 and on https://jsonformatter.curiousconcept.com/
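A minimal sketch of how such a csv2json function might look, assuming Python's csv.DictReader and json modules do the conversion; this is a guess at the shape of the command, not the original:
csv2json () { for f in "$@"; do python3 -c 'import csv,json,sys; json.dump(list(csv.DictReader(open(sys.argv[1]))), open(sys.argv[2], "w"), indent=2)' "$f" "${f%.csv}.json"; done; }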
This little command (function) shows the CSV header fields (field names separated by commas) as an ordered list, clearly showing the fields and their order.
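One plausible way to do this, shown only as a sketch (the original function is not reproduced here, and file.csv is a placeholder):
head -1 file.csv | tr ',' '\n' | nl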
Similar output to using MySQL with \G at the end of a query: one value per line. Other modes include:
-column  Query results will be displayed in a table-like form, using whitespace characters to separate the columns and align the output.
-html    Query results will be output as simple HTML tables.
-line    Query results will be displayed with one value per line, rows separated by a blank line. Designed to be easily parsed by scripts or other programs.
-list    Query results will be displayed with the separator (|, by default) character between each field value. The default.
From inside the command line this can also be changed using the mode command:
.mode MODE ?TABLE?  Set output mode where MODE is one of:
  csv      Comma-separated values
  column   Left-aligned columns. (See .width)
  html     HTML code
  insert   SQL insert statements for TABLE
  line     One value per line
  list     Values delimited by .separator string
  tabs     Tab-separated values
  tcl      TCL list elements
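A minimal example of the -line mode described above, with made-up database and table names:
sqlite3 -line mydatabase.db 'SELECT * FROM mytable;'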
Produces a CSV file containing the fronts/backs of cards with the specified tag ("mytag" above). This command pulls these cards from different card databases and allows them to be merged into one (by importing the resulting CSV file). The CSV file is not produced directly; instead of commas, "||" separators are inserted. In your editor of choice, modify the resulting file to put quotes around the text before and after each ||, then change || to a comma (on every line).
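The manual editing step could also be scripted; a rough sketch with sed, assuming the intermediate file is named cards.txt and each line contains exactly one || separator:
sed 's/^/"/; s/||/","/; s/$/"/' cards.txt > cards.csv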
This command will "su" the execution of the command to the postgres user (it implies you are already logged in as root), and export the result of the query to a file in CSV format. You'll need to adapt the fields and database information to your own needs.
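A sketch of what such an export might look like, assuming psql's COPY ... TO STDOUT is used; the database, table, column, and output file names here are placeholders:
su postgres -c "psql -d mydb -c \"COPY (SELECT id, name FROM mytable) TO STDOUT WITH CSV HEADER\"" > /tmp/output.csv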
In addition, one can evaluate the formulas in the venerable spreadsheet command sc, with an additional command:
function csvev () { cat "$1" | sed -e '1i,,,,,,,' | sed -e 's/=sum/@sum/g' -e 's/=SUM/@SUM/g' | psc -k -d, | sed -e 's/\"@SUM(/@SUM(/' -e 's/)"/)/' | sed '/@SUM/ { s/rightstring/let/; }' | sed -e '/= "=/s/rightstring/let/' -e '/= "=/s/"//g' | sed 's/= =/= /g' | sc ; }
I will post this command separately as well.
Requires psc, sed, sc, and cat. For working with CSV spreadsheets that contain formulas: evaluate the formulas using sc, and view them in a numbered and lettered format on the command line:
function sheet () { cat "$1" | sed '1s/^/a,b,c,d,e,f,g,h,j,k,l,m,n,o,p\n/' | column -s , -tn | nl -v 0 ; }
sample.csv:
79.36,94.93,10.92,27.33,95.90
3.57, 20.80,67.06,2.16, 79.23
48.45,27.95,7.66, 56.71,59.97
69.02,89.59,33.88,42.73,22.60
10.15,44.86,70.86,98.45,22.23
Inspired by Tatsh's comment.
I always forget this one and find all kinds of complex solutions on Google. It also works great while piping data, e.g. cat data | process-data | tr -d "\"" > processed-data-without-quotes
It's part of the great csvkit suite: https://csvkit.readthedocs.io/en/1.0.2/scripts/csvjson.html
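The basic invocation, per the csvkit documentation linked above (file.csv is a placeholder):
csvjson file.csv > file.json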
For all lines, sum the columns following the first one, then print the first column followed by the sum of all the other columns. Example input:
1,2,3,4,5
2,2,3,4,5
becomes
1,14
2,14
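One way this can be done with awk, shown here as a sketch rather than the original command:
awk -F, '{s=0; for(i=2;i<=NF;i++) s+=$i; print $1","s}' file.csv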