Commands using tr (349)

  • This little function shows the CSV header fields (the comma-separated field names on the first line) as a numbered list, making the fields and their order easy to see.
    function headers { head -1 $* | tr ',' '\12' | pr -t -n ; }
    totoro · 2009-03-25 20:07:47
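    A quick usage sketch (sample.csv and its header line are made up for illustration; the exact numbering format depends on pr):
    echo 'id,name,email' > sample.csv
    headers sample.csv
    #    1  id
    #    2  name
    #    3  email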
  • Logs a timestamp plus the system, CPU and AUX temperatures from sensors to datemp.log. Sample output of cat datemp.log: 04/01/0902:11:42 Sys Temp: +11.0°C CPU Temp: +35.5°C AUX Temp: +3.0°C
    date +%m/%d/%y%X|tr -d '\n' >>datemp.log&& sensors|grep +5V|cut -d "(" -f1|tr -d '\n'>> datemp.log && sensors |grep Temp |cut -d "(" -f1|tr -d '\n'>>datemp.log
    f241vc15 · 2009-03-31 18:13:23

  • cal -y | tr '\n' '|' | sed "s/^/ /;s/$/ /;s/ $(date +%e) / $(date +%e | sed 's/./#/g') /$(date +%m | sed s/^0//)" | tr '|' '\n'
    luishka · 2009-05-26 20:31:26
  • Lowercases a string and capitalizes its first letter. Helpful when we want to do mass file renaming (especially of mp3s).
    echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
    mohan43u · 2009-06-23 21:11:34
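    A hypothetical run, with STRING set to a made-up file name:
    STRING='my favourite song.mp3'
    echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
    # -> My favourite song.mp3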
  • gemInst.sh:
    #!/bin/bash
    for i in $@; do
      if [ "$1" != "$i" ]; then
        echo /newInstall/gem install $1 -v=\"$i\"
        /newInstall/gem install $1 -v="$i"
        if [ "$?" != "0" ]; then
          echo -e "\n\nGEM INSTALL ERROR: $1\n\n"
          echo "$1" > gemInst.err
        fi
      fi
    done

    /originalInstall/gem list | tr -d '(),' | xargs -L 1 sudo ./gemInst.sh
    snakerdlk · 2009-07-09 21:46:06
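    To see why the tr and xargs stages are there, assuming gem list prints lines like "rake (10.0.4, 0.9.6)":
    echo 'rake (10.0.4, 0.9.6)' | tr -d '(),'
    # -> rake 10.0.4 0.9.6
    # xargs -L 1 then runs:  sudo ./gemInst.sh rake 10.0.4 0.9.6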
  • Save the script as sort_file. Usage: sort_file < sort_me.csv > out_file.csv. This script was originally posted by Admiral Beotch on the Linux-Software forum at LinuxQuestions.org; I modified it to make it more portable.
    infile=$1
    for i in $(cat $infile)
    do
      echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"
      echo
    done
    iframe · 2009-07-12 21:23:37
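    What one pass of the loop body does to a single made-up line:
    echo '3,1,2' | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"
    # -> 1,2,3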

  • dd if=/dev/urandom count=200 bs=1 2>/dev/null | tr "\n" " " | sed 's/[^a-zA-Z0-9]//g' | cut -c-16
    amaymon · 2009-08-07 06:32:55
  • Renames all the jpg files in the current directory to their EXIF timestamps, keeping the ".jpg" extension.
    ls -1 *.jpg | while read fn; do export pa=`exiv2 "$fn" | grep timestamp | awk '{ print $4 " " $5 ".jpg"}' | tr ":" "-"`; mv "$fn" "$pa"; done
    axanc · 2009-08-10 00:52:22
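    The renaming step, assuming exiv2 prints a summary line like "Image timestamp : 2009:08:09 12:34:56":
    echo 'Image timestamp : 2009:08:09 12:34:56' | awk '{ print $4 " " $5 ".jpg"}' | tr ":" "-"
    # -> 2009-08-09 12-34-56.jpg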

  • printf '%*s\n' 20 | tr ' ' '#'
    twfcc · 2009-08-15 22:38:01
  • du's --files0-from option only accepts file names terminated with a NUL, which can be a pain to create. This solves that issue.
    cat filename | tr '\n' '\0' | du -hsc --files0-from=-
    Diluted · 2009-08-21 18:36:49
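    With GNU find you can produce the NUL-terminated list directly and skip the tr step; a sketch (assumes GNU find and du, and a made-up file pattern):
    find . -name '*.log' -print0 | du -hsc --files0-from=-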
  • Another possibility for sorting the characters of a string.
    echo sortmeplease|sed 's/./&\n/g'|sort|tr -d '\n'
    foob4r · 2009-09-03 10:37:57
  • Scrambles the text through a random letter-substitution cipher. This is just for fun.
    echo "Decode this"| tr [a-zA-Z] $(echo {a..z} {A..Z}|grep -o .|sort -R|tr -d "\n ")
    dennisw · 2009-09-18 06:38:28

  • echo $PATH|tr : '\n'|sort|uniq -d
    haivu · 2009-09-24 17:22:45

  • seq 4|xargs -n1 -i bash -c "echo -n 164.85.216.{} - ; nslookup 164.85.216.{} |grep name"|tr -s ' ' ' '|awk '{print $1" - "$5}'|sed 's/.$//'
    Waldirio · 2009-10-14 19:57:24
  • A not-so-simple but functional way to print the command line of the process that is listening on a specific port. I get the pid from lsof because I think it's more portable, but netstat could be used instead: netstat -tlnp
    port=8888;pid=$(lsof -Pan -i tcp -i udp | grep ":$port"|tr -s " " | cut -d" " -f2); ps -Afe|grep "$pid"|grep --invert-match grep | sed "s/^\([^ ]*[ ]*\)\{7\}\(.*\)$/\2/g"
    glaudiston · 2010-01-11 17:49:22
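    For comparison, a shorter route on systems with iproute2's ss or lsof alone (a sketch; the output format differs from the command above):
    port=8888; ss -tlnp | grep ":$port "
    # or:
    lsof -Pan -i ":$port"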
  • Replace the ... in the URLs with: www.census.gov/genealogy/www/data/1990surnames (the full command couldn't fit in 256 characters). Created on Ubuntu 9.10, but nothing out of the ordinary; it should work anywhere with a little tweaking. 5163 is the number of unique first names you get when you combine the male and female first-name files from http://www.census.gov/genealogy/www/data/1990surnames/names_files.html
    paste -d "." <(curl http://.../dist.female.first http://.../dist.male.first | cut -d " " -f 1 | sort -uR) <(curl http://..../dist.all.last | cut -d " " -f 1 | sort -R | head -5163) | tr "[:upper:]" "[:lower:]" | sed 's/$/@test.domain/g'
    connorsy · 2010-01-21 19:52:28
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. This version was designed not only to run fast, but to work using BusyBox. BusyBox is a collection of basic Unix tools that have been compiled into a single binary to save space on tiny installations of Unix. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped-down versions of wget, tr, and sed. It turns out that those tools suffice for many tasks. Known bugs: this script does not handle HTML entities at all. I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could do it. Also, this script can only define a single word, not phrases. (Well, you could if you typed in %20, but that'd be gross.) Lastly, this script does not show the URL where definitions were found. Given the randomness of the Net, that last bit of information is often key.
    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47
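    On the HTML-entity caveat: a rough, BusyBox-compatible sed sketch that decodes only a handful of common entities (illustration only; the decode_entities name is made up):
    decode_entities() { sed -e 's/&lt;/</g' -e 's/&gt;/>/g' -e 's/&quot;/"/g' -e "s/&#39;/'/g" -e 's/&amp;/\&/g' ; }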
  • Creates a sample /etc/hosts file based on your router's DHCP client list. I know this won't work on most routers, so please don't downvote it just because it doesn't work for you.
    curl -s -u $username:$password http://192.168.1.1/DHCPTable.htm | grep '<td>.* </td>' | sed 's|\t<td>\(.*\) </td>\r|\1|' | tr '\n' ';' | sed 's/\([^;]*\);\([^;]*\);/\2\t\1\n/g'
    matthewbauer · 2010-02-16 02:27:11
  • When someone mails you an SSH public key whose lines are broken with '\n', you can reconstruct a new file with one key per line with this command.
    cat authorized_keys_with_broken_lines | sed 's,^ssh,%ssh,' | tr '\n' '\0' | tr '%' '\n' | sed '1d' | sed "/^$/d" > authorized_keys
    pepin · 2010-02-19 08:32:35
  • The wherepath function will search all the directories in your PATH and print a unique list of locations in the order they are first found in the PATH. (PATH often has redundant entries.) It will automatically use your 'ls' alias if you have one, or you can hardcode your favorite 'ls' options in the function to get a long listing or color output, for example. Alternatives: 'whereis' only searches certain fixed locations. 'which -a' searches all the directories in your path but prints duplicates. 'locate' is great but isn't installed everywhere (and it's often too verbose).
    function wherepath () { for DIR in `echo $PATH | tr ":" "\n" | awk '!x[$0]++ {print $0}'`; do ls ${DIR}/$1 2>/dev/null; done }
    mscar · 2010-04-02 20:32:36
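    Hypothetical usage (the paths shown are invented; actual output depends on your PATH):
    wherepath python
    # /usr/local/bin/python
    # /usr/bin/python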
  • Get a list of all the unique hostnames from the apache configuration files. Handy to see what sites are running on a server.
    cat /etc/apache2/sites-enabled/* | egrep 'ServerAlias|ServerName' | tr -s " " | sed 's/^[ ]//g' | uniq | cut -d ' ' -f 2 | sed 's/www.//g' | sort | uniq
    chronosMark · 2010-04-08 08:51:17
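    What the core of the pipeline does to one made-up config line (the egrep, uniq and sort stages are omitted here):
    echo '    ServerName www.example.com' | tr -s " " | sed 's/^[ ]//g' | cut -d ' ' -f 2 | sed 's/www.//g'
    # -> example.com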

  • logfile=/var/log/gputemp.log; timestamp=$( date +%T );temps=$(nvidia-smi -lsa | grep Temperature | awk -F: ' { print $2 } '| cut -c2-4 | tr "\n" " ");echo "${timestamp} ${temps}" >> ${logfile}
    purehate · 2010-05-28 10:14:47
  • First off, if you just want a random UUID, here's the actual command to use: uuidgen. Your chances of finding a duplicate after running it nonstop for a year are about the same as being hit by a meteorite before finishing this sentence. The reason for the command I posted is that it's more provably unique than the one uuidgen creates: uuidgen creates a random one by default, or an unencrypted one based on time and network address if you give it the -t option. Mine uses the MAC address of the ethernet interface, the process id of the caller, and the system time down to nanosecond resolution, which is provably unique across all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address. Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does mess up, just type reset.
    printf $(( echo "obase=16;$(echo $$$(date +%s%N))"|bc; ip link show|sed -n '/eth/ {N; p}'|grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'|head -c 17 )|tr -d [:space:][:punct:] |sed 's/[[:xdigit:]]\{2\}/\\x&/g')|sha1sum|head -c 32; echo
    camocrazed · 2010-07-14 14:04:53
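    The simple routes mentioned above, for reference:
    uuidgen        # random UUID (the default)
    uuidgen -t     # time- and MAC-address-based UUID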

  • TTY=$(tty | cut -c 6-);who | grep "$TTY " | awk '{print $6}' | tr -d '()'
    sharfah · 2010-08-06 13:42:17
  • Another way to lowercase a string, with slightly fewer characters. It doesn't work on Russian characters; please don't vote it down because of that. :p It's very handy for those of us working in ASCII. :)
    echo StrinG | tr 'A-Z' 'a-z'
    randy909 · 2010-08-12 15:42:56
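    A slightly more portable spelling uses POSIX character classes (though most tr implementations still won't handle multibyte characters):
    echo StrinG | tr '[:upper:]' '[:lower:]'
    # -> string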