Commands tagged ksh (12)

  • If you ever need some binary numbers, this is a quick way to generate them: just add another "{0..1}" sequence for each digit you need. You can also assign the results to an array and index it by a decimal value for a quick decimal-to-binary lookup (for larger values another method is probably better). Note: this works in bash, ksh and zsh, but in zsh you'll need to issue setopt KSH_ARRAYS to make the array zero-based. binary=({0..1}{0..1}{0..1}{0..1}); echo ${binary[9]} (see the sketch below the entry).


    18
    echo {0..1}{0..1}{0..1}{0..1}
    dennisw · 2009-06-23 17:30:20 6
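    A minimal sketch of the array lookup described above; the setopt line applies only to zsh:

    binary=({0..1}{0..1}{0..1}{0..1})   # 16 elements: 0000, 0001, ... 1111
    echo ${binary[9]}                   # prints 1001 (arrays are zero-based in bash/ksh)
    # in zsh, run "setopt KSH_ARRAYS" first so the array is zero-based too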
  • SH

    cat mod_log_config.c | shmore or shmore < mod_log_config.c Most pagers like less, more and most require additional processes to be loaded and additional CPU time, and, worse, most of them modify the output in ways that can be undesirable. What I wanted was a "more"-style pager that behaves like running cat file, without modifying the output and without creating additional processes. Normally, to scroll the output of cat file without it being modified, I would have to scroll back my terminal or screen buffer, because less modifies the output. After looking over many examples, ranging from builtin cat functions written for csh, zsh, ksh, sh and bash in the 80s and 90s to more recent examples shipped with bash 4, and after much trial and error, I finally came up with something that satisfied my objective. It automatically adjusts to the size of your terminal window by using the LINES variable (or 80 lines if that is empty). This is a great function that will work as long as your shell works, so it will work just fine if you are booted in single-user mode and your /usr/bin directory (where less and other pagers live) is missing. Using builtins like this is fantastic and is comparable to how busybox works: as long as your shell works, this will work. One caveat/note: I always have access to a color terminal and always set up both the termcap and terminfo packages for color terminals (and/or ncurses and slang), so I put the tput setab 4; tput setaf 7 commands at the beginning of the function, where they run only once, giving the --- SHMore --- prompt a blue background and bright white text. This is one of hundreds of functions in my .bash_profile (http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html) at AskApache.com (http://www.askapache.com/), though it won't be included until the next update. If you can improve this in any way at all, please let me know; I would be very grateful. (For instance, I would like to be able to continue to the next screen by pressing any key instead of having to press Enter.) A readable version of the function follows the entry below.


    6
    shmore(){ local l L M="`echo;tput setab 4&&tput setaf 7` --- SHMore --- `tput sgr0`";L=2;while read l;do echo "${l}";((L++));[[ "$L" == "${LINES:-80}" ]]&&{ L=2;read -p"$M" -u1;echo;};done;}
    AskApache · 2010-04-21 00:40:37 4
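    The same function expanded for readability (behavior identical to the one-liner above):

    shmore(){
        local l L M="`echo; tput setab 4 && tput setaf 7` --- SHMore --- `tput sgr0`"
        L=2
        while read l; do
            echo "${l}"                                  # pass each line through unmodified
            ((L++))
            [[ "$L" == "${LINES:-80}" ]] && {            # pause once a screenful has been printed
                L=2
                read -p "$M" -u1                         # wait for Enter (reads fd 1, the terminal)
                echo
            }
        done
    }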
  • Truncates each line of input to the current terminal width instead of letting it wrap. Works on all Unices (usage example below).


    4
    function nowrap { export COLS=`tput cols` ; cut -c-$COLS ; unset COLS ; }
    mobidyc · 2009-09-11 15:07:00 9
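    For example, once the function has been sourced into your shell:

    dmesg | nowrap          # kernel messages wider than the terminal are cut, not wrapped
    ps auxww | nowrap       # long command lines are truncated at the terminal width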
  • Convert decimal numbers to binary. You could also build a general base converter: function convBase { echo "ibase=$1; obase=$2; $3" | bc; } and then write function decToBin { convBase 10 2 $1; } (examples below).


    4
    function decToBin { echo "ibase=10; obase=2; $1" | bc; }
    woxidu · 2009-11-24 22:57:58 0
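    A few sample invocations, assuming both functions above have been defined:

    decToBin 13             # -> 1101
    convBase 10 8 64        # decimal 64 -> octal 100
    convBase 10 16 255      # decimal 255 -> hex FF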
  • Specify the size in bytes with the 'c' suffix on find's -size test; the + sign reads as "bigger than". Then run du on each match, sort in reverse numeric order and show the first 10 entries (a variant is sketched below).


    2
    find /myfs -size +209715200c -exec du -m {} \; |sort -nr |head -10
    arlequin · 2011-07-07 21:12:46 0
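    A variant of the same idea with a 100 MB threshold and human-readable sizes (assumes GNU sort, which understands -h):

    find /myfs -size +104857600c -exec du -h {} \; | sort -hr | head -10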
  • fcd : file change directory. A bash function that takes a fully qualified file path and cd's into the directory where it lives. Useful on the command line when you have a file name in a variable and you'd like to cd to its directory to check it into RCS or look at other files associated with it. Will run on any ksh or bash, likely sh, maybe zsh.


    1
    function fcd () { [ -f $1 ] && { cd $(dirname $1); } || { cd $1 ; } pwd }
    relay · 2009-09-03 18:58:13 3
  • Creates the .ssh directory on the remote host with proper permissions, if it doesn't already exist. Appends your public key to authorized_keys and makes sure that file has proper permissions (if it didn't exist, it may have been created with undesirable permissions). *Korn shell syntax; may or may not work with bash. See the ssh-copy-id note below.


    0
    ssh <user>@<host> 'mkdir -m 700 ~/.ssh; echo ' $(< ~/.ssh/id_rsa.pub) ' >> ~/.ssh/authorized_keys ; chmod 600 ~/.ssh/authorized_keys'
    Halki · 2011-10-03 15:59:43 1
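    On systems that ship OpenSSH's ssh-copy-id helper, the same result can usually be achieved with:

    ssh-copy-id -i ~/.ssh/id_rsa.pub <user>@<host>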
  • Had trouble with the other function because of missing semicolons (according to my bash on OS X); this version adds them (usage example below).


    0
    function cdf () { [ -f $1 ] && { cd $(dirname $1); } || { cd $1 ; }; pwd; };
    Josso · 2012-09-08 10:50:58 1
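    For example:

    cdf /etc/ssh/sshd_config    # cd into /etc/ssh, then print the new working directory
    cdf /tmp                    # a plain directory argument works too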
  • Verifies a sha256sum-formatted hash list on IBM AIX or any other UNIX-like OS that has openssl but lacks sha256sum. Steps: 1: Save to the filesystem a script that: A: receives as arguments the two parts of one line of a sha256sum listing; B: feeds the file into openssl's SHA-256 digest mode on standard input and captures the result; C: compares the calculated hash against the one received as an argument; D: prints the result in a sha256sum-like format. 2: Make the script executable. 3: Feed the sha256sum listing to xargs, running the script with 2 arguments at a time. The generated helper script is written out for readability below.


    0
    echo '#! /usr/bin/ksh\ncat $2 | openssl dgst -sha256 | read hashish; if [[ $hashish = $1 ]]; then echo $2: OK; else echo $2: FAILED; fi;' > shacheck; chmod +x shacheck; cat hashishes.sha256 | xargs -n 2 ./shacheck;
    RAKK · 2013-09-18 21:51:20 2
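    The helper script that the echo above writes to ./shacheck, expanded here for readability (hashishes.sha256 holds lines of the form "<hash> <filename>", and xargs -n 2 hands each pair to the script):

    #! /usr/bin/ksh
    # usage: ./shacheck <expected-hash> <file>
    cat $2 | openssl dgst -sha256 | read hashish
    if [[ $hashish = $1 ]]; then echo $2: OK; else echo $2: FAILED; fi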
  • tput rmam disables line wrapping so that long lines are truncated to the width of the terminal ($COLUMNS); tput smam re-enables wrapping (see the sketch below). I've always used tput in bash scripts, but it works on the command line too. Doesn't work in all terminals. See http://www.gnu.org/software/termutils/manual/termutils-2.0/html_chapter/tput_1.html


    0
    tput rmam
    kennyld · 2014-02-26 07:06:37 1
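    A minimal before/after sketch:

    tput rmam       # disable automatic margins: long lines are now truncated
    dmesg           # anything wider than the terminal is cut, not wrapped
    tput smam       # restore normal line wrapping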
  • With this command you can use shell variables inside sed scripts. This is useful if the script MUST remain in an external file; otherwise you could simply pass the expression inline with sed -e. A hypothetical illustration follows the entry below.


    -1
    expanded_script=$(eval "echo \"$(cat ${sed_script_file})\"") && sed -e "${expanded_script}" your_input_file
    giuseppe_rota · 2009-05-07 14:21:14 2
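    A hypothetical illustration (the file name external.sed and the variable names are made up):

    # external.sed contains the single line:  s/${pattern}/${replacement}/g
    pattern=foo
    replacement=bar
    sed_script_file=external.sed
    expanded_script=$(eval "echo \"$(cat ${sed_script_file})\"")    # becomes: s/foo/bar/g
    sed -e "${expanded_script}" your_input_file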
  • Searches $PATH for entries matching the argument; accepts grep expressions. Without arguments it lists every binary found in $PATH (usage example below).


    -1
    function sepath { echo $PATH |tr ":" "\n" |sort -u |while read L ; do cd "$L" 2>/dev/null && find . \( ! -name . -prune \) \( -type f -o -type l \) 2>/dev/null |sed "s@^\./@@" |egrep -i "${*}" |sed "s@^@$L/@" ; done ; }
    mobidyc · 2009-09-11 15:03:22 2
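    For example:

    sepath              # list every file and symlink reachable through $PATH
    sepath '^gcc'       # only entries whose name starts with gcc (case-insensitive)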
