Wraps text lines at the specified width (90 here). The -s option forces wrapping at blank characters; -b counts bytes instead of columns.
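A quick sketch of those two flags on a sample line piped in via echo:

```shell
# -s: break only at blanks; -w 20: wrap at 20 columns
echo "the quick brown fox jumps over the lazy dog" | fold -s -w 20

# -b: count bytes instead of screen columns
echo "the quick brown fox jumps over the lazy dog" | fold -b -w 20
```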
Sorts the contents of a file in place, without needing a second file to take the sorted output. This was previously posted as `sort -g list.txt -o $_`, but as others have pointed out, $_ references the previous command's last argument, so that would only have worked as the second part of a command joined with &&, like: cat list.txt && sort -g list.txt -o $_. The user below me, Robin, had the most correct command.
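Because sort reads all of its input before -o opens the output, the output file can safely be the input file itself (the filename here is hypothetical):

```shell
# Create a sample file
printf '10\n2\n1\n' > /tmp/list.txt

# Sort numerically (-g) and write the result back to the same file;
# a shell redirection like "> /tmp/list.txt" would truncate it first,
# but -o is handled by sort itself, so this is safe
sort -g /tmp/list.txt -o /tmp/list.txt
```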
Usage: lower [STRING]...
Usage: upper [STRING]...
If you just want to write or append some text to a file without having to run a text editor, run this command. After running it, start typing away. To exit, type . on a line by itself. Replacing the >> with a single > will overwrite the file instead of appending.
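The original command isn't shown here; the "type . on a line by itself" behaviour suggests a here-document with . as its delimiter, roughly like this (the filename is hypothetical, and the heredoc body stands in for interactive typing):

```shell
# Append typed lines to the file until a line containing only "." is entered
cat <<. >> /tmp/notes.txt
remember to buy milk
.
```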
Reads stdin and outputs each line only once, without sorting first. This does use more memory than your system's sort utility.
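The command itself isn't quoted above; a common awk idiom that behaves exactly this way (a guess at what was meant, not necessarily the original) remembers every line in an array and prints it only on first sight:

```shell
# Print each line only the first time it appears, preserving input order
printf 'b\na\nb\na\n' | awk '!seen[$0]++'
# outputs: b, then a
```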
random(6) - random lines from a file or random numbers
Used this command recently to remove the trailing ?> from all the files in a PHP project, which was having some unnecessary whitespace issues. Obviously, change *.php to whatever you'd like.
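The command itself isn't shown; a hedged reconstruction with GNU sed might look like this (the example file is made up, and a real project would want a backup suffix on -i first):

```shell
# Sample PHP file with a trailing close tag
printf '<?php\necho 1;\n?>\n' > /tmp/example.php

# Strip a "?>" (plus any trailing whitespace) at the end of each line, in place
sed -i 's/?>[[:space:]]*$//' /tmp/example.php
```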
Randomizes a file. The opposite of sort is sort -R!
Faster than the other method using wget.
To obtain all commands, use:
nu=`curl http://www.commandlinefu.com/commands/browse |grep -o "Terminal - All commands -.*results$" | grep -oE "[[:digit:],]{4,}" | sed 's/,//'`;
curl http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-"$nu":25] | grep -vE "_curl_|\.com by David" > clf-ALL.txt
For a version-specific filename:
nu=`curl http://www.commandlinefu.com/commands/browse |grep -o "Terminal - All commands -.*results$" | grep -oE "[[:digit:],]{4,}" | sed 's/,//'`;
curl http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-"$nu":25] | grep -vE "_curl_|\.com by David" > clf-ALL_"$nu".txt
This is the combination of options that works best for compressing my database dumps. It's possible that other options or values might improve the compression ratio, but these are the ones that worked; the syntax for 7zr is a little messy...
Displays some text on the wallpaper; especially useful for warning messages.
This is not just printing; it is real editing, using the text editor.
List all text files in the current directory.
The sort utility is well used, but sometimes you want a little chaos. This will randomize the lines of a text file.
BTW, on OS X there is no | sort -R option! There is also no | shuf; these are only in the newer GNU coreutils. This is also faster than the alternative of:
| awk 'BEGIN { srand() } { print rand() "\t" $0 }' | sort -n | cut -f2-
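The awk fallback above works anywhere: it prefixes each line with a random key, sorts on that key, then cuts the key back off. A self-contained run (sample input supplied via printf):

```shell
# Portable shuffle for systems without sort -R or shuf
printf 'one\ntwo\nthree\n' |
  awk 'BEGIN { srand() } { print rand() "\t" $0 }' | sort -n | cut -f2-
```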
Prints the lines of a file in randomized order.
View non-printable characters; view binary files.
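The exact command isn't quoted above; two standard tools that do this are cat -v (caret notation for non-printing bytes) and od (a byte-level dump):

```shell
# cat -v renders non-printing bytes visibly: byte 0x01 shows as ^A
printf 'a\001b\n' | cat -v

# od -c gives a character-by-character octal dump, handy for binary files
printf 'a\001b\n' | od -c
```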
Removes duplicate lines in a text file.
By default, this will output the whole line on which 'word' has been found.
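For contrast, grep's -o flag prints only the matched text rather than the whole line (the sample file here is made up):

```shell
printf 'a word here\nno match\n' > /tmp/grep_demo.txt

# Default: the entire matching line
grep word /tmp/grep_demo.txt      # a word here

# -o: only the part that matched
grep -o word /tmp/grep_demo.txt   # word
```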
Good for use in your ~/.bash_profile or a script.
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: