Commands by nnsense (8)

  • Docker's local man pages often cover only half of what the online documentation has, so I wanted the online reference available as a local man-style page. Install lynx, define this one-liner, and use it as dockpage followed by the subcommand you want to look up. Adjust lynx's page width (-width=180) at will.


    0
    dockpage() { lynx -width=180 --dump "https://docs.docker.com/v1.11/engine/reference/commandline/$1/" | sed -n '/^Usage/,/On this page/{/On this page/b;p}'; }
    nnsense · 2017-09-18 23:53:34 2
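    A usage sketch, assuming lynx is installed and the function above has been sourced; "run" here is just whichever docker subcommand you want to look up:

    # Print the Usage section of the online reference for "docker run"
    dockpage run
    # If your terminal is wider or narrower, adjust -width=180 in the function accordingly
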
  • Useful, for example, when many ports are exposed and the docker ps output looks cluttered.


    0
    alias dockps='docker ps --format "table {{.ID}}\t{{.Image}}\t{{.Status}}\t{{.Names}}"'
    nnsense · 2017-09-18 23:46:47 0
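    A variant sketch, not from the original author: the same compact table with the Ports column added back for the times you do want it. The alias name dockpsp is made up; the placeholders follow docker ps --format:

    # Compact table plus ports (hypothetical companion alias)
    alias dockpsp='docker ps --format "table {{.ID}}\t{{.Image}}\t{{.Status}}\t{{.Ports}}\t{{.Names}}"'
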
  • Sometimes it's useful to output just the IP address, or some other field, by changing "ipv4.addresses" in the command. The power of awk! Show all the possible "greps" with nmcli connection show [yourInterfaceNameHere]


    2
    showip() { nmcli connection show "$1" | grep ipv4.addresses | awk '{print $2}'; }
    nnsense · 2015-05-13 16:24:28 3
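    A usage sketch; "eth0" is a hypothetical connection name and the address shown is illustrative. ipv4.gateway is just one of the other fields you could grep for:

    showip eth0          # prints something like 192.168.1.10/24
    # Same idea for another field, e.g. the default gateway:
    nmcli connection show eth0 | grep ipv4.gateway | awk '{print $2}'
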
  • This is a common use of bind. Pressing ctrl+v and then any key prints that key's character sequence, which you can then paste into a bind command. So pressing ctrl+v and then F2 prints "^[[12~"; once bound, every time you press the function key F2 it will execute your command. The \n is added so the command is executed as well, not just typed.


    1
    bind '"<ctrl+v><functionKey>":"command\n"'
    nnsense · 2015-05-11 17:59:09 2
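    A concrete sketch of the recipe, assuming your terminal sends ^[[12~ for F2 (escape sequences vary between terminals); df -h is just a stand-in command:

    # \e is the escape character that ctrl+v displays as ^[
    bind '"\e[12~":"df -h\n"'
    # From now on, pressing F2 in this shell types and runs "df -h"
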
  • I copied this (let's be honest) from somewhere on the internet and just wrapped it in a function, ready to be used as an alias. It shows the 10 most used commands from history. This may seem like just another "most used commands from history", but hey.. this one is a function!!! :D


    1
    mosth() { history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10; }
    nnsense · 2015-05-11 17:41:55 3
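    A usage sketch; the column description is inferred from the pipeline above (nl adds the rank, awk prints count, percentage and command name), not from captured output:

    mosth                # top 10: rank, count, percentage, command
    # Widen the list by changing head -n10, e.g. to head -n20
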
  • Many times I run the same commands in a row to find information about a file. I use this as an alias to summarize that information in a single command. Now with variables! :D


    2
    fileinfo() { RPMQF=$(rpm -qf "$1"); RPMQL=$(rpm -ql "$RPMQF"); echo "man page:"; whatis "$(basename "$1")"; echo "Services:"; echo -e "$RPMQL\n" | grep -P "\.service"; echo "Config files:"; rpm -qc "$RPMQF"; echo "Provided by:" "$RPMQF"; }
    nnsense · 2015-05-11 16:46:01 3
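    A usage sketch on an RPM-based system; /usr/sbin/sshd is only an illustrative path, and the annotated lines paraphrase the function's echo labels rather than showing real output:

    fileinfo /usr/sbin/sshd
    # man page:      whatis output for sshd
    # Services:      any *.service files shipped by the owning package
    # Config files:  rpm -qc of the owning package
    # Provided by:   the package name returned by rpm -qf
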
  • I use this as an alias to get all the .service files related to a single installed file/config (if the owning package ships services, of course). For RPM-based systems ;)


    1
    qf2s() { rpm -ql "$(rpm -qf "$1")" | grep -P "\.service"; }
    nnsense · 2015-05-11 16:32:16 0
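    A usage sketch with a hypothetical path; the file must belong to an installed RPM package for rpm -qf to resolve it:

    qf2s /usr/sbin/httpd
    # e.g. /usr/lib/systemd/system/httpd.service (only printed if the owning package ships services)
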
  • Not really an alternative, just a different behaviour: it lists the current directory if no directory is given.


    0
    cdls() { if [[ -n "$1" ]]; then cd "$1" && ls; else ls; fi; }
    nnsense · 2015-05-11 15:52:09 2
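    A usage sketch; /tmp is just a convenient example target:

    cdls /tmp    # change into /tmp and list it
    cdls         # no argument: just list the current directory
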
