Commands tagged pwd (16)

  • I love this function because it tells me everything I want to know about files: more than stat, more than ls. It's very useful and infinitely expandable.

    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%

Sample output:

    00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp

The key is the -printf format:

    -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'

which, believe it or not, took hundreds of tweaks before I was happy with the output. You can easily use it within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions:

    lsl(){ O="-maxdepth 1"; sed -n '/-r/!Q1' <<< $@ && O=; find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%; }

Personally I'm using this function instead:

    lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g' <<< $@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -k$KS -bS 50%; }

because I can sort by any column:

    lll -sort=3      # sort by user
    lll -sort=4 -r   # sort by group, reversed
    lll -sort=6      # sort by modification time

If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it; something very minimal would be awesome, maybe like: for a; do lll $a; done (see the sketch after this entry). Note this uses the latest version of GNU find built from source, which is easy to build from the GNU FTP tarball. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
    AskApache · 2010-06-10 22:03:08 4
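
A minimal sketch of the multiple-directory handling the author asks for, assuming the lll function above is already defined in the shell; the wrapper name lll_multi is made up here and is not part of the original:

    lll_multi () {  # hypothetical wrapper: run lll once per directory argument (default: current directory)
        local d
        for d in "${@:-.}"; do
            if [ -d "$d" ]; then
                ( cd "$d" && lll )   # subshell, so the caller's working directory is untouched
            else
                echo "lll_multi: not a directory: $d" >&2
            fi
        done
    }
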
  • This is freaking sweet!!! Here is the full alias (I didn't want to cause display problems on commandlinefu.com's homepage):

    alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); S=$SECONDS; tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" && logger -s "Tarred $D to $F in $(($SECONDS-$S)) seconds" ) & )'

Creates a .tgz archive of whatever directory it is run from, in the background, detached from the current shell, so if you log out it will still complete. You can also run it as many times as you want; if the .tgz archive already exists, it just moves it aside to a numbered backup ('--backup=numbered').

The coolest part is the transformation performed by tar and sed so that the archive file names are created automatically, and extracting the archive is completely safe thanks to the --transform option. If you archive, say, /home/tombdigger/new-stuff-to-backup/, it creates the archive /home/#home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz. Then when you extract it, e.g. tar -xvzf #home#tombdigger#new-stuff-to-backup#-2010-11-18.tgz, instead of overwriting an existing /home/tombdigger/new-stuff-to-backup/ directory it extracts to /home/tombdigger/new-stuff-to-backup.2010-11-18/. Basically, the archive filename is the PWD with every '/' replaced by '#', and the date is appended so that multiple archives are easy to manage.

This example saves all archives as $HOME/archive-name.tgz, but I have a $BKDIR variable with my backup location for each shell user, so I just replaced HOME with BKDIR in the alias. So when I ran this in /opt/askapache/SOURCE/lockfile-progs-0.1.11/ the archive was created at /askapache-bk/#opt#askapache#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz

Upon completion, the universal logger tool reports completion to syslog and stderr (printed to your terminal); just remove that part if you don't want it, or drop the '-s' option from logger to keep the logs only in syslog and off your terminal. Here's how my syslog server recorded this:

    2010-11-18T00:44:13-05:00 gravedigger.askapache.com (127.0.0.5) [user] [notice] (logger:) Tarred /opt/askapache/SOURCE/lockfile-progs-0.1.11 to /askapache-bk/tarred/#opt#SOURCE#lockfile-progs-0.1.11#-2010-11-18.tgz in 4 seconds

Caveats: Really this is very robust and foolproof; the only issue I ever have with it (I've been using this for years on my web servers) is that if you run it in a directory and a file then changes in that directory, you get a warning message and your archive might have a problem for the changed file. This happens when running it in a logs directory, a temp dir, etc. That's the only issue I've ever had, really nothing more than a heads up.

Advanced: This is a simple alias, and very useful as it works on basically every Linux box with a semi-current tar plus GNU coreutils, bash, and sed. But if you want to customize it or pass parameters (like a directory to back up instead of the PWD), check out this function I use; it is what I created the alias from, replacing my aa_status function with logger and adding the $SECONDS runtime instead of using tar's --totals:

    function tarred () { local GZIP='--fast' PWD=${1:-`pwd`} F=$(date +${BKDIR}/%m-%d-%g-%H%M-`sed -u 's/[\/\ ]/#/g' <<< ${PWD}`.tgz); [[ ! -r "$PWD" ]] && echo "Bad permissions for $PWD" 1>&2 && return 2; ( ( tar --totals --ignore-failed-read --transform "s@^${PWD%/*}@`date +${PWD%/*}.%m-%d-%g`@S" -czPf $F $PWD && aa_status "Completed Tarp of $PWD to $F" ) & ) }

From my .bash_profile: http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    alias tarred='( ( D=`builtin pwd`; F=$(date +$HOME/`sed "s,[/ ],#,g" <<< ${D/${HOME}/}`#-%F.tgz); tar --ignore-failed-read --transform "s,^${D%/*},`date +${D%/*}.%F`,S" -czPf "$F" "$D" &>/dev/null ) & )'
    AskApache · 2010-11-18 06:24:34 0
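
Not part of the original entry: a rough sketch of the same alias written as a readable function, assuming GNU tar, date, sed, and logger are available; the function name tar_pwd is made up here:

    # hypothetical readable variant of the 'tarred' alias above; archives $PWD to $HOME in the background
    tar_pwd () {
        local dir stamp dest start
        dir=$(builtin pwd)
        stamp=$(date +%F)
        dest="$HOME/$(printf '%s' "${dir/${HOME}/}" | sed 's,[/ ],#,g')#-${stamp}.tgz"
        start=$SECONDS
        ( ( tar --ignore-failed-read --backup=numbered \
                --transform "s,^${dir%/*},${dir%/*}.${stamp},S" \
                -czPf "$dest" "$dir" \
            && logger -s "Tarred $dir to $dest in $((SECONDS - start)) seconds" ) & )   # double subshell detaches the job, like the alias
    }
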
  • Another way of doing it that's a bit clearer. I'm a fan of readable code.


    7
    script_path=$(cd $(dirname $0);pwd)
    jgc · 2009-10-14 16:04:03 3
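
Not from the original: the same one-liner with quoting added, as a sketch, so directories with spaces in their names do not trip word splitting:

    script_path=$(cd "$(dirname "$0")" && pwd)
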
  • Jumps back up to a named ancestor directory of the current path. Usage: upto directory


    4
    upto() { cd "${PWD/\/$@\/*//$@}" }
    sharfah · 2011-10-06 07:35:14 2
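
A short usage sketch with made-up directory names, plus a variant of the same one-liner that quotes the expansion and adds the semicolon bash needs before the closing brace:

    cd /usr/local/share/doc/foo/bar     # hypothetical starting point
    upto share                          # now in /usr/local/share
    upto usr                            # now in /usr

    # quoted bash variant; does nothing if no argument is given
    upto () { [ -n "$1" ] && cd "${PWD/\/$1\/*//$1}"; }
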
  • This will copy a file from your current directory to the same location on another machine. Handy for configuring HA, copying your resolv.conf, .bashrc, anything in /usr/local, etc.


    3
    scp filename root@remote:`pwd`
    shadus · 2011-10-29 07:12:17 0
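
A small quoting variation, not from the original: expanding $PWD inside double quotes keeps the local argument in one piece if the path contains spaces (classic scp may still need extra escaping on the remote side):

    scp filename "root@remote:$PWD/"
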
  • Also resolves symlinks, showing the full path of the link target


    3
    BASEDIR=$(dirname $(readlink -f $0))
    mackaz · 2011-12-01 09:10:21 2
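
readlink -f is a GNU coreutils feature; as a sketch, a quoted version plus a rough fallback for systems without it (assuming perl is installed, as in the abs_path entry further down):

    # GNU readlink, quoted so paths with spaces survive
    BASEDIR=$(dirname "$(readlink -f "$0")")

    # rough fallback where readlink -f is unavailable
    BASEDIR=$(dirname "$(perl -MCwd=abs_path -e 'print abs_path(shift)' "$0")")
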
  • Put the function in your .bashrc and use "map [alias]" to create the alias you want. Just be careful not to override an existing alias.


    1
    map() { if [ "$1" != "" ]; then alias $1="cd `pwd`"; fi }
    javidjamae · 2011-07-11 15:46:19 3

  • A variation on the function above: with an argument it creates the alias; with no argument it lists the existing "cd" aliases.


    1
    function map() { [ -n "$1" ] && alias $1="cd `pwd`" || alias | grep "'cd "; }
    b1067606 · 2013-01-11 13:32:26 0
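
A sketch combining the two versions above, with quoting so a directory name containing spaces still yields a working alias; the behaviour is otherwise unchanged:

    map () {
        if [ -n "$1" ]; then
            alias "$1=cd '$PWD'"      # e.g. 'map proj' makes 'proj' jump back to the current directory
        else
            alias | grep "'cd "       # no argument: list the aliases created this way
        fi
    }
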
  • I've seen a lot of overly complicated attempts at figuring out "where am I?". Part of the problem is that pwd is usually a shell built-in; you can check with: type -a pwd. Calling /bin/pwd forces the binary version of pwd instead of the built-in, and the -P option gives the absolute physical path to the present working directory. For the overly cautious: $(which pwd) -P


    1
    /bin/pwd -P
    forestb · 2015-11-21 22:26:57 0
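
A quick illustration of the logical/physical difference, using a throwaway symlink (paths here are only an example):

    ln -s /var/tmp /tmp/link && cd /tmp/link
    pwd            # logical path:  /tmp/link
    /bin/pwd -P    # physical path: /var/tmp
    pwd -P         # the bash built-in accepts -P as well
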
  • I previously submitted a command like this without the $0 fallback for when $BASH_SOURCE is unset. Therefore it only worked when run as ./script, not as 'sh script'. This version handles both and sets $mydir to the directory containing the script. It works on Linux, OS X, and probably BSD.


    0
    mydir=$(cd $(dirname ${BASH_SOURCE:-$0});pwd)
    xeor · 2011-04-27 16:33:38 0
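
The same line with quoting, as a minimal sketch; it behaves identically whether the file is executed directly or run with 'sh script':

    mydir=$(cd "$(dirname "${BASH_SOURCE:-$0}")" && pwd)
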
  • Shows the size of the directory the command is run in, printed in human-readable units (K, M, G). There is no need to type the path; it uses the current working directory.


    0
    du -sh `pwd`
    djkee · 2011-10-30 08:47:23 0
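
The command substitution is not strictly needed; these two report the same total and differ only in the label printed:

    du -sh "$PWD"   # prints the full path as the label
    du -sh .        # same total, label is just "."
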
  • Sometimes you need the full path to your script, regardless of the directory it was executed from, in order to maintain other relative paths inside the script. If you use something simple like STARTING_DIR="${0%/*}", you only get a path relative to wherever you launched the script. dirname gives you that relative path to the script, but you actually have to change into the directory and print the working directory to get the absolute full path.


    0
    STARTING_DIR=$(cd $(dirname $0) && pwd)
    bbbco · 2011-11-30 17:35:15 2
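
A quick sketch of why the simpler expansion is not enough; the invocation and paths below are made up:

    # invoked as:  ./scripts/deploy.sh   from /home/user/project
    echo "${0%/*}"                          # ./scripts  (relative to wherever you started)
    echo "$(cd "$(dirname "$0")" && pwd)"   # /home/user/project/scripts  (absolute)
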
  • Since none of the systems I work on have readlink, this works cross-platform (everywhere has perl, right?). Note: This will resolve links.


    0
    FULLPATH=$(perl -e "use Cwd 'abs_path';print abs_path('$0');")
    follier · 2013-02-01 20:09:34 0
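
Interpolating $0 into the Perl source breaks if the path contains a single quote; a sketch that passes it as an argument instead:

    FULLPATH=$(perl -MCwd=abs_path -e 'print abs_path(shift)' "$0")
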
  • Works for an executed script file: prints the directory containing it, resolving symlinks.


    -1
    dirname $(readlink -f ${BASH_SOURCE[0]})
    unixyangg · 2011-12-02 16:10:38 0
  • This command appends the following two lines to ~/.bash_aliases:

    alias exit='pwd > ~/.lastdir;exit'
    [ -n "$(cat .lastdir 2>/dev/null)" ] && cd "$(cat .lastdir)"

(or redirect it to ~/.bashrc if you like). I find it useful: exit records the last working directory, and a new shell jumps back to it. You may also define an alias for 'cd ~', e.g. alias cdh='cd ~'


    -1
    echo -e 'alias exit='\''pwd > ~/.lastdir;exit'\''\n[ -n "$(cat .lastdir 2>/dev/null)" ] && cd "$(cat .lastdir)"' >> ~/.bash_aliases
    ichbins · 2014-01-28 18:02:04 0
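
As written, the startup line reads .lastdir relative to wherever the new shell starts, while the exit alias writes to ~/.lastdir; a sketch with both sides pointing at the home directory:

    alias exit='pwd > ~/.lastdir; exit'            # record the last directory on exit
    [ -s ~/.lastdir ] && cd "$(cat ~/.lastdir)"    # restore it when a new shell starts
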
  • Use the -a flag to display all files, including hidden files. If you only want the non-hidden entries, use -1 (yes, that is the number one), which prints one name per line. Got this by RTFM and adding some sed magic.

    [bbbco@bbbco-dt ~]$ ls -a | sed "s#^#${PWD}/#"
    /home/bbbco/.
    /home/bbbco/..
    /home/bbbco/2011-09-01-00-33-02.073-VirtualBox-2934.log
    /home/bbbco/2011-09-10-09-49-57.004-VirtualBox-2716.log
    /home/bbbco/.adobe
    /home/bbbco/.bash_history
    /home/bbbco/.bash_logout
    /home/bbbco/.bash_profile
    /home/bbbco/.bashrc
    ...

    [bbbco@bbbco-dt ~]$ ls -1 | sed "s#^#${PWD}/#"
    /home/bbbco/2011-09-01-00-33-02.073-VirtualBox-2934.log
    /home/bbbco/2011-09-10-09-49-57.004-VirtualBox-2716.log
    /home/bbbco/cookies.txt
    /home/bbbco/Desktop
    /home/bbbco/Documents
    /home/bbbco/Downloads
    ...


    -9
    ls -a | sed "s#^#${PWD}/#"
    bbbco · 2011-12-16 22:19:06 2
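
The same listing can come straight from find, which avoids prefixing ls output with sed; a sketch assuming GNU find for -mindepth/-maxdepth:

    find "$PWD" -mindepth 1 -maxdepth 1                   # full paths, hidden entries included
    find "$PWD" -mindepth 1 -maxdepth 1 -not -name '.*'   # non-hidden entries only
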

