Commands using tr (349)

  • This small shell function prints a CSV file's header fields (the comma-separated field names on the first line) as a numbered list, making the fields and their order easy to see.


    0
    function headers { head -1 $* | tr ',' '\12' | pr -t -n ; }
    totoro · 2009-03-25 20:07:47 14
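A quick way to see the pipeline in action, using a throwaway CSV (the file name and fields are made up for illustration):

```shell
# Create a small sample CSV.
printf 'name,age,city\nalice,30,paris\n' > /tmp/headers_demo.csv

# Same pipeline as the function body: '\12' is octal for newline, so tr
# splits the header on commas, and pr -t -n numbers the resulting lines.
head -1 /tmp/headers_demo.csv | tr ',' '\12' | pr -t -n
```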
  • Logs sensor temperatures with a timestamp. Sample output of cat datemp.log: 04/01/0902:11:42 Sys Temp: +11.0°C CPU Temp: +35.5°C AUX Temp: +3.0°C


    0
    date +%m/%d/%y%X|tr -d '\n' >>datemp.log&& sensors|grep +5V|cut -d "(" -f1|tr -d '\n'>> datemp.log && sensors |grep Temp |cut -d "(" -f1|tr -d '\n'>>datemp.log
    f241vc15 · 2009-03-31 18:13:23 4

  • 0
    cal -y | tr '\n' '|' | sed "s/^/ /;s/$/ /;s/ $(date +%e) / $(date +%e | sed 's/./#/g') /$(date +%m | sed s/^0//)" | tr '|' '\n'
    luishka · 2009-05-26 20:31:26 713
  • Helpful when we want to do mass file renaming (especially MP3s): lowercases the whole string, then capitalizes its first letter.


    0
    echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
    mohan43u · 2009-06-23 21:11:34 119
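For example, with an assumed input string:

```shell
STRING="MY MUSIC FILE"   # illustrative value
# tr folds everything to lower case, then awk upper-cases only the first character.
echo "${STRING}" | tr '[A-Z]' '[a-z]' | awk '{print toupper(substr($0,1,1))substr($0,2);}'
# → My music file
```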
  • The gemInst.sh helper script (the first argument is the gem name, the rest are its versions):

        #!/bin/bash
        for i in $@; do
          if [ "$1" != "$i" ]
          then
            echo /newInstall/gem install $1 -v=\"$i\"
            /newInstall/gem install $1 -v="$i"
            if [ "$?" != "0" ]
            then
              echo -e "\n\nGEM INSTALL ERROR: $1\n\n"
              echo "$1" > gemInst.err
            fi
          fi
        done


    0
    /originalInstall/gem list | tr -d '(),' | xargs -L 1 sudo ./gemInst.sh
    snakerdlk · 2009-07-09 21:46:06 4
  • Save the script as sort_file. Usage: sort_file < sort_me.csv > out_file.csv. Originally posted by Admiral Beotch in the Linux-Software forum on LinuxQuestions.org; modified here to be more portable.


    0
    infile=$1; for i in $(cat $infile); do echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"; echo; done
    iframe · 2009-07-12 21:23:37 6
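The per-line pipeline is easy to try on a single CSV line:

```shell
# Sort the comma-separated values on one line numerically,
# then glue them back together and drop the trailing comma.
echo "3,1,10,2" | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"
# → 1,2,3,10
```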

  • 0
    dd if=/dev/urandom count=200 bs=1 2>/dev/null | tr "\n" " " | sed 's/[^a-zA-Z0-9]//g' | cut -c-16
    amaymon · 2009-08-07 06:32:55 5
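The same idea captured into a variable; the LC_ALL=C is an extra precaution (not in the original one-liner) so tr and sed treat the raw urandom bytes as plain bytes:

```shell
# 200 random bytes yield roughly 48 alphanumeric characters on average,
# so trimming to 16 practically always succeeds.
pw=$(dd if=/dev/urandom count=200 bs=1 2>/dev/null | LC_ALL=C tr "\n" " " | LC_ALL=C sed 's/[^a-zA-Z0-9]//g' | cut -c-16)
echo "$pw"   # a 16-character alphanumeric string
```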
  • Renames all the JPG files to their EXIF timestamps, keeping the .jpg extension.


    0
    ls -1 *.jpg | while read fn; do export pa=`exiv2 "$fn" | grep timestamp | awk '{ print $4 " " $5 ".jpg"}' | tr ":" "-"`; mv "$fn" "$pa"; done
    axanc · 2009-08-10 00:52:22 3
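The tr step exists because EXIF timestamps contain colons, which are awkward in file names; in isolation:

```shell
# Swap every ':' in the timestamp-derived name for '-'.
echo "2009:08:10 00:52:22.jpg" | tr ":" "-"
# → 2009-08-10 00-52-22.jpg
```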

  • 0
    printf '%*s\n' 20 | tr ' ' '#'
    twfcc · 2009-08-15 22:38:01 3
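printf '%*s' pads a missing (hence empty) string argument to the given width with spaces, and tr turns each space into a '#', so the number is the repeat count. The '*' width is supported by bash's printf builtin and GNU printf:

```shell
# Print a bar of 12 '#' characters; change 12 to taste.
printf '%*s\n' 12 | tr ' ' '#'
# → ############
```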
  • du's --files0-from option only accepts file names terminated with a NUL byte, which can be a pain to produce. This solves that issue.


    0
    cat filename | tr '\n' '\0' | du -hsc --files0-from=-
    Diluted · 2009-08-21 18:36:49 4
  • Another possibility for sorting the characters of a string.


    0
    echo sortmeplease|sed 's/./&\n/g'|sort|tr -d '\n'
    foob4r · 2009-09-03 10:37:57 3
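The same pipeline on a different word; GNU sed understands \n in the replacement, which is what splits the string into one character per line before sorting:

```shell
# Split into one character per line, sort the lines, rejoin.
echo banana | sed 's/./&\n/g' | sort | tr -d '\n'
# → aaabnn
```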
  • This is just for fun: encodes the input with a randomly generated substitution cipher.


    0
    echo "Decode this"| tr [a-zA-Z] $(echo {a..z} {A..Z}|grep -o .|sort -R|tr -d "\n ")
    dennisw · 2009-09-18 06:38:28 36
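The command substitution builds a randomly shuffled alphabet (sort -R is a GNU extension) to use as tr's target set. With a fixed target set, the same mechanism gives ROT13, which is easier to verify:

```shell
# Map each letter 13 places forward in the alphabet, wrapping around.
echo "Decode this" | tr 'a-zA-Z' 'n-za-mN-ZA-M'
# → Qrpbqr guvf
```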

  • 0
    echo $PATH|tr : '\n'|sort|uniq -d
    haivu · 2009-09-24 17:22:45 3
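With a made-up PATH value the duplicate detection is easy to see: tr puts one entry per line, sort groups identical entries, and uniq -d prints only the repeated ones:

```shell
demo_path="/usr/bin:/bin:/usr/local/bin:/usr/bin"   # illustrative value
echo "$demo_path" | tr : '\n' | sort | uniq -d
# → /usr/bin
```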

  • 0
    seq 4|xargs -n1 -i bash -c "echo -n 164.85.216.{} - ; nslookup 164.85.216.{} |grep name"|tr -s ' ' ' '|awk '{print $1" - "$5}'|sed 's/.$//'
    Waldirio · 2009-10-14 19:57:24 3
  • A not-so-simple but functional way to print the command line of the process listening on a specific port. I got the PID from lsof because I think it's more portable, but netstat -tlnp could be used instead.


    0
    port=8888;pid=$(lsof -Pan -i tcp -i udp | grep ":$port"|tr -s " " | cut -d" " -f2); ps -Afe|grep "$pid"|grep --invert-match grep | sed "s/^\([^ ]*[ ]*\)\{7\}\(.*\)$/\2/g"
    glaudiston · 2010-01-11 17:49:22 8
  • Replace the ... in the URLs with www.census.gov/genealogy/www/data/1990surnames (the full command wouldn't fit in 256 characters). Created on Ubuntu 9.10, but nothing out of the ordinary; it should work anywhere with a little tweaking. 5163 is the number of unique first names you get when you combine the male and female first-name files from http://www.census.gov/genealogy/www/data/1990surnames/names_files.html


    0
    paste -d "." <(curl http://.../dist.female.first http://.../dist.male.first | cut -d " " -f 1 | sort -uR) <(curl http://..../dist.all.last | cut -d " " -f 1 | sort -R | head -5163) | tr "[:upper:]" "[:lower:]" | sed 's/$/@test.domain/g'
    connorsy · 2010-01-21 19:52:28 3
  • This is a minimalistic version of the ubiquitous Google definition screen scraper. It was designed not only to run fast, but to work using BusyBox, the collection of basic Unix tools compiled into a single binary to save space on tiny Unix installations. For example, although my phone doesn't have perl or the GNU utilities, it does have BusyBox's stripped-down versions of wget, tr, and sed, and those tools suffice for many tasks.

    Known bugs: this script does not handle HTML entities at all. I don't think there's an easy way to do that within BusyBox, but I'd love to see it if someone could. It can also only define a single word, not phrases (well, you could type %20, but that'd be gross). Lastly, it does not show the URL where definitions were found; given the randomness of the Net, that last bit of information is often key.


    0
    wget -q -U busybox -O- "http://www.google.com/search?ie=UTF8&q=define%3A$1" | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1\n/p'
    hackerb9 · 2010-02-01 13:01:47 9
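The tr '<' '\n' trick is a generic, BusyBox-friendly way to split an HTML stream into one tag per line so sed can pick out the interesting ones; here with an inline snippet instead of a live page:

```shell
# Each '<' starts a new line, so list items become lines beginning with "li>".
echo '<ul><li>first</li><li>second</li></ul>' | tr '<' '\n' | sed -n 's/^li>\(.*\)/\1/p'
# → first
#   second
```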
  • Will create a sample /etc/hosts file based on your router's DHCP table. I know this won't work on most routers, so please don't downvote it just because it doesn't work for you.


    0
    curl -s -u $username:$password http://192.168.1.1/DHCPTable.htm | grep '<td>.* </td>' | sed 's|\t<td>\(.*\) </td>\r|\1|' | tr '\n' ';' | sed 's/\([^;]*\);\([^;]*\);/\2\t\1\n/g'
    matthewbauer · 2010-02-16 02:27:11 3
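The final sed consumes the semicolon-joined stream two fields at a time and swaps each pair into hosts-file order (GNU sed's \t and \n replacement escapes); with made-up name/address data it prints each address, a tab, and its hostname on its own line:

```shell
# Join lines with ';', then rewrite each "name;address;" pair as "address<TAB>name".
printf 'host-a\n192.168.1.10\nhost-b\n192.168.1.11\n' | tr '\n' ';' | sed 's/\([^;]*\);\([^;]*\);/\2\t\1\n/g'
```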
  • When someone mails you an SSH public key whose lines have been broken with '\n', you can reconstruct a file with one key per line using this command.


    0
    cat authorized_keys_with_broken_lines | sed 's,^ssh,%ssh,' | tr '\n' '\0' | tr '%' '\n' | sed '1d' | sed "/^$/d" > authorized_keys
    pepin · 2010-02-19 08:32:35 3
  • The wherepath function will search all the directories in your PATH and print a unique list of locations in the order they are first found in the PATH. (PATH often has redundant entries.) It will automatically use your 'ls' alias if you have one or you can hardcode your favorite 'ls' options in the function to get a long listing or color output for example. Alternatives: 'whereis' only searches certain fixed locations. 'which -a' searches all the directories in your path but prints duplicates. 'locate' is great but isn't installed everywhere (and it's often too verbose).


    0
    function wherepath () { for DIR in `echo $PATH | tr ":" "\n" | awk '!x[$0]++ {print $0}'`; do ls ${DIR}/$1 2>/dev/null; done }
    mscar · 2010-04-02 20:32:36 17
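The heart of the function is awk's '!x[$0]++' idiom, which prints each line only the first time it appears while preserving order, which is exactly what PATH deduplication needs:

```shell
# First occurrence of each entry increments the counter from 0 (falsy,
# negated to true, so the line prints); repeats are suppressed.
echo "/usr/bin:/bin:/usr/bin:/bin" | tr ":" "\n" | awk '!x[$0]++'
# → /usr/bin
#   /bin
```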
  • Get a list of all the unique hostnames from the apache configuration files. Handy to see what sites are running on a server.


    0
    cat /etc/apache2/sites-enabled/* | egrep 'ServerAlias|ServerName' | tr -s " " | sed 's/^[ ]//g' | uniq | cut -d ' ' -f 2 | sed 's/www.//g' | sort | uniq
    chronosMark · 2010-04-08 08:51:17 5

  • 0
    logfile=/var/log/gputemp.log; timestamp=$( date +%T );temps=$(nvidia-smi -lsa | grep Temperature | awk -F: ' { print $2 } '| cut -c2-4 | tr "\n" " ");echo "${timestamp} ${temps}" >> ${logfile}
    purehate · 2010-05-28 10:14:47 6
  • First off, if you just want a random UUID, the actual command to use is uuidgen. Your chances of finding a duplicate after running it nonstop for a year are about the same as being hit by a meteorite before finishing this sentence.

    The reason for this command is that it is more provably unique than the one uuidgen creates: uuidgen makes a random UUID by default, or an unencrypted one based on time and network address if you give it the -t option. This one uses the MAC address of the ethernet interface, the process ID of the caller, and the system time down to nanosecond resolution, which is provably unique across all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address.

    Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does mess up, just type reset.


    0
    printf $(( echo "obase=16;$(echo $$$(date +%s%N))"|bc; ip link show|sed -n '/eth/ {N; p}'|grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'|head -c 17 )|tr -d [:space:][:punct:] |sed 's/[[:xdigit:]]\{2\}/\\x&/g')|sha1sum|head -c 32; echo
    camocrazed · 2010-07-14 14:04:53 10

  • 0
    TTY=$(tty | cut -c 6-);who | grep "$TTY " | awk '{print $6}' | tr -d '()'
    sharfah · 2010-08-06 13:42:17 6
  • Another way to do it with slightly fewer characters. It doesn't work on Russian characters, so please don't vote it down for that; it's very handy for those of us working in ASCII. :)


    0
    echo StrinG | tr 'A-Z' 'a-z'
    randy909 · 2010-08-12 15:42:56 3
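The POSIX character-class spelling does the same thing and behaves more predictably in non-C locales:

```shell
echo StrinG | tr '[:upper:]' '[:lower:]'
# → string
```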