Commands by pauli (2)

  • You can use svn_find just like the regular find command, except that subdirectories named .svn will be ignored. Example: svn_find . -mtime -1 -size +200k -ls lists all files modified within the last day and bigger than 200 KiB, while ignoring subdirectories named .svn. (A raw find equivalent is sketched below the entry.)


    svn_find () { local a=$1; shift; find "$a" -not \( -name .svn -prune \) "$@"; }
    pauli · 2011-08-17 09:16:02
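For comparison, the same filtering can be written as a raw find call without the helper; this is only a sketch of an equivalent invocation, with an extra .git pattern added as an illustration (not part of the original function):

    # prune .svn (and, here, also .git) subdirectories, then apply the remaining tests
    find . \( -name .svn -o -name .git \) -prune -o -mtime -1 -size +200k -ls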
  • '-mtime -10' syncs only files newer than 10 days (-mtime is just one example; use whatever find expressions you need). printf %P prints each file's name with the command-line argument under which it was found removed, so you can use any source directory and don't need to cd into it first. Using \0 in the printf format and the corresponding --from0 in rsync ensures that even filenames with newline characters work (thanks syssyphus for #3808). Both #1481 and #3808 only work if you copy either the current directory (.) or the filesystem root (/); otherwise the output from find and the source directory given to rsync don't match. #7685 works with an arbitrary source directory. (A dry-run preview is sketched after the entry.)


    find /src/dir/ -mtime -10 -printf '%P\0' | rsync --files-from=- --from0 /src/dir/ /dst/dir/
    pauli · 2011-01-18 22:23:47
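Before copying anything, the same pipeline can be previewed; the --dry-run and --itemize-changes flags below are standard rsync options, but treat this as a sketch rather than part of the original entry:

    # list what would be transferred without actually copying any files
    find /src/dir/ -mtime -10 -printf '%P\0' | rsync --dry-run --itemize-changes --files-from=- --from0 /src/dir/ /dst/dir/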

Check These Out

pipe commands from a textfile to a telnet-server with netcat
Sends the commands specified in $commandfile to the telnet server specified by $telnetserver. To have newlines in $commandfile interpreted as ENTER, save the file in CR+LF (aka "Windows text file") format. If you want to save the output in a separate file, use: nc $telnetserver 23 < $commandfile > $resultfile
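A minimal sketch of the idea, with a made-up host and command sequence; printf writes the CR+LF line endings directly, so no separate Windows-format editor is needed:

    # build a CR+LF command file (hypothetical commands) and replay it against a telnet server
    printf 'user admin\r\npass secret\r\nshow status\r\nquit\r\n' > commandfile.txt
    nc 192.0.2.10 23 < commandfile.txt > resultfile.txt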

files and directories in the last 1 hour
Added as an alias in ~/.bashrc: alias lf='find ./* -ctime -1 | xargs ls -ltr --color'
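Note that -ctime -1 actually matches the last 24 hours rather than the last hour; a variant restricted to the last 60 minutes (my adjustment, using GNU find's -cmin) might look like:

    # files and directories whose status changed within the last 60 minutes
    alias lf='find ./* -cmin -60 | xargs ls -ltr --color'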

Generate MD5 of string and output only the hash checksum
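The command itself isn't shown on this page; one plausible way to do it (a sketch, not necessarily the entry's exact command) is:

    # printf '%s' avoids hashing a trailing newline; awk keeps only the checksum column
    printf '%s' 'some string' | md5sum | awk '{print $1}'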

Extract ip addresses with sed
Extracts IP addresses from a file using sed. Uses a tag (ip) to grep the IP lines after extracting. There must be a way to output only the regex match with sed itself.
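For comparison, a sketch that prints only the matched addresses using grep -oE instead of the sed/tag approach described above:

    # print each IPv4-looking token on its own line, de-duplicated
    grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}' file | sort -u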

Randomly run command
Randomly decide whether to run a command, or fail. It's useful for testing purposes. Usage: ran PERCENTAGE COMMAND [ARGS]. Note: in this version the percentage is required. This is like @sesom42's and @snipertyler's commands, but in a usable form. For example, in a complicated shell script, put "ran 99" before a crucial component; it will then fail 1% of the time, allowing you to test the failure code path. $ ran 99 my_complex_program arg1 arg2
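The function itself isn't reproduced here; a minimal sketch of how such a helper could be written (my own guess using bash's $RANDOM, not the original definition):

    # run "$@" with roughly $1 percent probability, otherwise return failure
    # usage: ran 99 my_complex_program arg1 arg2
    ran () { [ "$((RANDOM % 100))" -lt "$1" ] || return 1; shift; "$@"; }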

Ease your directory exploration
Usage: tt [OCCURRENCE]. tt alone displays a tree from your current path; tt .svn displays only the lines containing .svn.
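The helper isn't shown on this page; a rough sketch of the described behaviour (assuming the tree command is installed, and guessing at the implementation) could be:

    # with no argument, print the whole tree; with an argument, keep only matching lines
    tt () { tree -a | grep -- "${1:-}"; }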

Both view and pipe the file without saving to disk
This is a cool trick to view the contents of the file on /dev/pts/0 (or whatever terminal you're using), and also send the contents of that file to another program by way of an unnamed pipe. All the while, you've not bothered saving any extra data to disk, like you might be tempted to do with sed or grep to filter output.
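A sketch of the trick using /dev/tty, which always refers to the current terminal; the file path and the grep stage are placeholders for whatever you actually want to view and pipe:

    # show the file on the current terminal while also passing it on through the pipe
    cat /var/log/syslog | tee /dev/tty | grep -i error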

draw honeycomb
$ tput setaf 1 && tput rev && seq -ws "___|" 81 | fold -69 | tr "0-9" "_" && tput sgr0
$ # (brick wall)

netstat with group by IP address
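No command is attached to this entry; a common way to get such a per-address count (a sketch assuming Linux netstat output and IPv4 addresses) is:

    # take the foreign-address column, strip the port, then tally connections per remote address
    netstat -ntu | awk 'NR>2 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn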

Replace spaces in filenames with underscores
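No command is shown here either; a simple bash sketch for the current directory could be:

    # rename every entry containing a space, replacing each space with an underscore
    for f in *\ *; do [ -e "$f" ] && mv -- "$f" "${f// /_}"; done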

