file(1) can print details about certain devices in the /dev/ directory (block devices in this example). This helped me know at a glance the location and revision of my bootloader, UUIDs, filesystem status, which partitions were primary / logical, etc., without running several commands.
See also:
file -s /dev/dm-*
file -s /dev/cciss/*
etc.
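The base command is presumably along these lines (an assumption; adjust the device names to your system): file -s /dev/sd*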
Can be run as a script `ftrace` if my_command is substituted with "$@". It is useful when running a command that fails and you have the feeling it is accessing a file you are not aware of.
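A minimal sketch of such a wrapper script, assuming an strace-based approach (the original command is not reproduced here, so treat this as an illustration):
#!/bin/sh
# ftrace: show which files a command (and its children) touch; assumes strace is installed
strace -f -e trace=file "$@"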
Copy files to an SSH server with gzip compression
Useful for moving many files (thousands or millions of files) over ssh. Faster than scp because this way you save a lot of TCP connection establishments (SYN/ACK packets). If using a fast LAN (I have just tested gigabit Ethernet) it is faster not to compress the data, so the command would be: tar -cf - /home/user/test | ssh user@sshServer 'cd /tmp; tar xf -'
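For slower links, the gzip-compressed variant the title refers to would look like this (a sketch following the example above): tar -czf - /home/user/test | ssh user@sshServer 'cd /tmp; tar xzf -'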
Best results when the file size is less than half the RAM size.
Upload/download a newer version of any file with a smaller transfer size and higher speed.
To remake the new file, use:
bspatch <oldfile> <newfile> <patchfile>
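To create the patch file in the first place (not shown above), the companion command is:
bsdiff <oldfile> <newfile> <patchfile>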
Counts the total (recursive) number of files in the immediate (depth 1) subdirectories as well as the current one and displays them sorted. Fixed, as per ashawley's comment.
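A sketch of one way to produce such a listing (not necessarily the original command):
for d in . ./*/; do printf '%s\t%s\n' "$(find "$d" -type f | wc -l)" "$d"; done | sort -n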
- Where $URL is the URL of the file.
- Replace $2 with $3 at the end to get a human-readable size.
Credits to svanberg @ ArchLinux forums for the original idea.
Edit: Replaced the command with a better version by FRUiT (removed an unnecessary grep).
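A sketch of the kind of command being described, assuming a wget --spider plus awk approach (the exact original is not shown here); in wget's output the "Length:" line carries the byte count in field 2 and the human-readable size in field 3:
wget --spider "$URL" 2>&1 | awk '/Length:/ {print $2}'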
It's common to want to split up large files, and the usual method is to use split(1).
If you have a 10GiB file, you'll need 10GiB of free space.
Then the OS has to read 10GiB and write 10GiB (usually on the same filesystem).
This takes AGES.
.
The command uses a set of loop block devices to create fake chunks, but without making any changes to the file.
This means the file splitting is nearly instantaneous.
The example creates a 1GiB file, then splits it into 16 x 64MiB chunks (/dev/loop0 .. loop15).
.
Note: This isn't a drop-in replacement for using split. The results are block devices.
tar and zip won't do what you expect when given block devices.
.
These commands will work:
hexdump /dev/loop4
.
gzip -9 < /dev/loop6 > part6.gz
.
cat /dev/loop10 > /media/usb/part10.bin
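.
A minimal sketch of how the loop-device chunks can be set up (an assumption, not the original command; loop device numbers may already be in use on your system):
dd if=/dev/zero of=bigfile bs=1M count=1024   # the 1GiB example file
for i in $(seq 0 15); do sudo losetup --offset $((i * 64 * 1024 * 1024)) --sizelimit $((64 * 1024 * 1024)) /dev/loop$i bigfile; done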
This command lists and sorts files by size in reverse order. The reverse order is very helpful when you have a very long list and want the biggest files at the bottom so you don't have to scroll up. The file size info is in human-readable output, e.g. 1K, 234M, 3G. Tested on Linux (Red Hat Enterprise Edition).
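One way to get this kind of listing (a sketch, not necessarily the original command): ls -lhSr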
Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy-to-read list. What you do with them after that is *ahem* no concern of mine. ;)
Creates one-letter folders in the current directory and moves each file into the folder matching its first letter.
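A sketch of one way to do this in bash (an assumption, not the original command):
for f in *; do [ -f "$f" ] || continue; mkdir -p "${f:0:1}"; mv -- "$f" "${f:0:1}/"; done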
If you try to move a large list of files with the mv command, you will get the following error: /bin/mv: Argument list too long. An alternative with exec: find /source/directory -mindepth 1 -maxdepth 1 -name '*' -exec mv {} /target/directory \;
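A slightly more efficient variant (assuming GNU mv, which supports -t; -exec ... + batches many files per mv invocation): find /source/directory -mindepth 1 -maxdepth 1 -exec mv -t /target/directory {} +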
Create a file with the current date as its filename.
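For example (the date format here is just an illustration): touch "$(date +%Y-%m-%d)"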
This shows every bit of information that stat can get for any file, dir, fifo, etc. It's great because it also shows the format and explains each format option.
If you just want stat help, create this handy alias 'stath' to display all format options with explanations.
alias stath="stat --h|sed '/Th/,/NO/!d;/%/!d'"
To display on 2 lines:
( F=/etc/screenrc N=c IFS=$'\n'; for L in $(sed 's/%Z./%Z\n/'<<<`stat --h|sed -n '/^ *%/s/^ *%\(.\).*$/\1:%\1/p'`); do G=$(echo "stat -$N '$L' \"$F\""); eval $G; N=fc;done; )
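As a quick illustration of the format strings the alias documents (my own example, not from the original post): stat -c '%n %s %U %y' /etc/screenrc prints the name, size in bytes, owner, and last modification time.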
For a similarly powerful stat-like function optimized for pretty output (and can sort by any field), check out the "lll" function
http://www.commandlinefu.com/commands/view/5815/advanced-ls-output-using-find-for-formattedsortable-file-stat-info
From my .bash_profile ->
http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
All words of the filenames except "a", "of", "that" and "to" are capitalized.
To also match words which begin with a specific string, you can use this:
rename 's/\b((?!hello\b|t)[a-z]+)/\u$1/g' *
This will capitalize all words except "hello" and words beginning with "t".
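For reference, the base command being described presumably follows the same pattern (an assumption, reconstructed from the variant above, and assuming the Perl rename):
rename 's/\b((?!a\b|of\b|that\b|to\b)[a-z]+)/\u$1/g' *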
bs = block size (basically defines the size of a "unit" used by count and skip)
count = the number of blocks to copy (16M * 32 = 1/2 gig)
skip = (32 * 2); we are grabbing piece 3, which means 2 pieces have already been written, so skip (2 * count)
I will edit this later if I can to make this all more understandable.
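A sketch of the kind of dd invocation being described (file names are illustrative): dd if=bigfile of=piece3 bs=16M count=32 skip=64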
Grab a list of MP3s (with full path) out of Firefox's cache.
Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the *full path* of those tunes straight out of FF's cache in a clean list. A shorter and more intuitive version of the command submitted by TuxOtaku.
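A sketch of the general approach (the cache location varies by Firefox version, and this is not the submitted command):
find ~/.mozilla/firefox/*/Cache -type f -exec file {} + | awk -F: '/MPEG|MP3|ID3/ {print $1}'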
In a folder with many files and folders, you want to move all files where the date is >= the file olderFilesNameToMove and
Retrieve the absolute path name from a relative path.
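For example (one common way, assuming GNU coreutils): readlink -f ./some/relative/path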
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: