I use OS X, which does not ship with the shuf command, so this is what I came up with. The command assumes /usr/share/dict/words does not surpass 137,817,948 lines, and line selection is NOT uniformly random.
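One portable alternative can be sketched like this (a hypothetical stand-in, not the original command; unlike the original, awk's rand() makes the selection approximately uniform):

```shell
# Hypothetical replacement for "shuf -n 1": buffer the file in awk and
# print one pseudo-random line. Point it at /usr/share/dict/words in
# real use; a tiny demo list is used here.
randline() {
  awk 'BEGIN { srand() }
       { line[NR] = $0 }
       END { print line[int(rand() * NR) + 1] }' "$1"
}

printf 'alpha\nbeta\ngamma\n' > /tmp/demo_words.txt
randline /tmp/demo_words.txt
```

Buffering the whole file in memory is fine for a dictionary file, but would not scale to the 137-million-line worst case the description mentions.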
You need the Brightbox Cloud CLI tools installed.
This finds every regular file under the current directory and overwrites it with data from /dev/urandom, writing the same number of blocks as the original file. It can't handle file names containing spaces or odd characters (who does that anyway?).
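A sketch of that approach (hypothetical paths; the original one-liner is not shown). Using find's -exec form instead of a bare loop also survives spaces in file names:

```shell
# Demo setup (hypothetical directory); in real use, start from "find ." instead.
mkdir -p /tmp/shred_demo
printf 'secret data' > '/tmp/shred_demo/a file.txt'

# Overwrite every regular file with the same number of 512-byte blocks
# (rounded up) of random data; -exec passes names safely, spaces and all.
find /tmp/shred_demo -type f -exec sh -c '
  for f; do
    blocks=$(( ($(wc -c < "$f") + 511) / 512 ))
    dd if=/dev/urandom of="$f" bs=512 count="$blocks" 2>/dev/null
  done' sh {} +
```

Note this is a single overwrite pass; tools like shred(1) do multiple passes if that matters to you.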
Large wordlists are cumbersome. Password-cracking programs with rule support, such as Hashcat or John the Ripper, are much more effective. To use them we often need to "clean" a wordlist, removing all numbers, special characters, whitespace and other garbage. This command converts an entire wordlist to lowercase with no garbage.
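A sketch of such a cleanup pipeline (not the original command), shown on inline sample input; in real use, feed it the wordlist file instead:

```shell
# Lowercase everything, keep only letters and newlines, drop blank lines,
# and de-duplicate the result.
printf 'P@ssw0rd!\nHELLO world\n' |
  tr '[:upper:]' '[:lower:]' |
  tr -cd '[:alpha:]\n' |
  grep -v '^$' |
  sort -u
```

Here "P@ssw0rd!" becomes "psswrd" and "HELLO world" becomes "helloworld"; adjust the tr character classes if you want to keep digits for rule-based mangling.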
My script lists all users and the number of commits each made in the period, sorted alphabetically. To sort by number of commits instead, append '| sort' to the command. The script depends on the output format of svn log; the original command did not work for me because the string 'user' never appeared in my output.
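The counting can be sketched like this (hypothetical user names, run here on canned svn log header lines; in real use, pipe "svn log -r {from}:{to}" into the awk instead):

```shell
# svn log revision headers look like "r123 | user | date | N lines".
# Count commits per user from field 2 of the pipe-separated header.
printf 'r1 | alice | 2012-01-05 | 1 line\nr2 | bob | 2012-01-06 | 1 line\nr3 | alice | 2012-01-07 | 1 line\n' |
  awk -F'|' '/^r[0-9]+ \|/ { gsub(/ /, "", $2); commits[$2]++ }
             END { for (u in commits) print u, commits[u] }' |
  sort
```

Matching on the "r<number> |" header shape avoids depending on any literal string like 'user' appearing in the log.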
LC_ALL=C ensures we can always grep on "differ" regardless of your language environment. xargs -n 2 runs gvim -d with two arguments at a time, and gvim --nofork keeps everything in a single gvim instance.
Since awk is already in the pipeline, it can replace the two greps. It might not be faster, but it is fast enough.
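The two comments above can be sketched together (hypothetical directory names, with throwaway directories created for the demo). The final xargs step that launches gvim is left as a comment so the example stays non-interactive:

```shell
mkdir -p /tmp/cmp/dirA /tmp/cmp/dirB
echo one > /tmp/cmp/dirA/f.txt
echo two > /tmp/cmp/dirB/f.txt

# Emit each differing pair of paths; the awk both filters on "differ"
# and picks the two file names, replacing the separate greps.
# To diff each pair in one gvim instance, append:
#   | xargs -n 2 gvim -d --nofork
LC_ALL=C diff -rq /tmp/cmp/dirA /tmp/cmp/dirB |
  awk '/differ/ { print $2, $4 }'
```

With LC_ALL=C, diff reliably prints lines of the form "Files A and B differ", so $2 and $4 are the two paths.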
This will get all links from a given URL, remove any duplicates, and output the result.
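One way to build such a pipeline (a sketch with a hypothetical helper name; the original command is not shown). The extraction is demonstrated on inline HTML; in real use, feed it a page with curl:

```shell
# Hypothetical helper: pull href values out of HTML on stdin and dedupe.
# Real-world use:  curl -s "$url" | extract_links
extract_links() {
  grep -o 'href="[^"]*"' | sed 's/^href="//; s/"$//' | sort -u
}

printf '<a href="http://a.com">x</a> <a href="http://a.com">y</a> <a href="/b">z</a>\n' |
  extract_links
```

This is a regex approximation; it only catches double-quoted href attributes, which is usually good enough for a quick link inventory.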
This shell function uses curl(1), as it is more portable than wget(1) across Unices, to show which site a shortened URL points to, even when shortened URLs are nested. It refines www.commandlinefu.com/commands/view/9515/expand-shortened-urls for use in scripts: only the final result is displayed.
expandurl http://t.co/LDWqmtDM
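One way to write such a function (a sketch, not necessarily the original implementation): let curl follow every nested redirect itself and report the URL it finally lands on:

```shell
# Hypothetical implementation: -L follows redirects (nested shorteners
# included), -s keeps it quiet, and -w '%{url_effective}' prints only
# the final resolved URL.
expandurl() {
  curl -sL -o /dev/null -w '%{url_effective}\n' "$1"
}
```

So `expandurl http://t.co/LDWqmtDM` would print just the fully resolved address, making it easy to capture in a script variable.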
Sometimes you unzip an archive that has no root folder and it spews files all over the place. This will clean up by deleting all of those files.
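A sketch of the idea (hypothetical function name; the original command is not shown): use the archive's own listing to know exactly which files it spewed, then remove them:

```shell
# Hypothetical helper: delete from the current directory every file the
# archive contains. "unzip -Z1" prints the member names one per line;
# review that listing before running the removal!
clean_extracted() {
  unzip -Z1 "$1" | while IFS= read -r f; do
    if [ -f "$f" ]; then rm -- "$f"; fi
  done
}
```

Usage would be `clean_extracted messy.zip` from the directory you extracted into; directories the archive created are left behind and can be pruned with rmdir.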
Gets the system user running the process with the specified PID.
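One portable way to do it (a sketch, not necessarily the original command):

```shell
# Print only the owning user of a PID; "user=" suppresses the header.
# $$ (the current shell's PID) is used here as a demo.
ps -o user= -p $$
```

Substitute any PID for `$$`; the `=` after the column name is what strips the USER header line so the output is script-friendly.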
Whether you can use this depends on which NetApp you have; this one is a FAS2020.
svn log -v takes the full log, then:
Filter 1: -r {from}:{to} selects the revision range.
Filter 2: awk matches the 'r<number>' header lines and assigns user = 3rd column (the username).
Filter 3: if username = George, print the details.
Filter 4: print only lines starting with M/U/G/C/A/D (A added, D deleted, U updated, G merged, C conflicted).
Filter 5: sort all files.
Filter 6: print only the unique file names.
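The filter chain above can be sketched as follows (run here on canned "svn log -v" output from a hypothetical repository; in real use, pipe "svn log -v -r {from}:{to}" into the awk):

```shell
svn_log='r1 | george | 2012-01-01 | 1 line
Changed paths:
   M /trunk/foo.c
r2 | alice | 2012-01-02 | 1 line
Changed paths:
   M /trunk/bar.c
r3 | george | 2012-01-03 | 1 line
Changed paths:
   M /trunk/foo.c
   A /trunk/baz.c'

# Remember the author of each revision header, keep the M/U/G/C/A/D path
# lines belonging to george, then sort and de-duplicate the file names.
printf '%s\n' "$svn_log" |
  awk '/^r[0-9]+ \|/ { user = $3 }
       /^   [MUGCAD] / && user == "george" { print $2 }' |
  sort -u
```

Carrying the current author in an awk variable collapses filters 2-4 into one pass; sort -u covers filters 5-6.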
A better awk example, using only mplayer, grep, cut, and awk.
This sums the page counts of multiple PDF files, without the useless use of grep and sed found in other commandlinefu entries.
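The summing step can be sketched with awk alone, shown here on canned pdfinfo-style lines (in real use, feed it something like `for f in *.pdf; do pdfinfo "$f"; done`):

```shell
# pdfinfo prints a "Pages: N" line per file; accumulate and print the total.
printf 'Pages:          10\nPages:          32\n' |
  awk '/^Pages:/ { total += $2 } END { print total }'
```

awk's pattern match replaces grep and its field handling replaces sed, so one process does the whole job.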
This command shows how many connections are open to a given MySQL server, which hosts are connected, and how many connections each host is making.
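The per-host counting can be sketched like this (hypothetical host addresses, run on canned process-list-style lines; in real use, feed it e.g. `mysql -Ne 'SHOW PROCESSLIST'` output, where the host is in column 3 as host:port):

```shell
# Strip the port from the host field, count connections per host,
# and list the busiest hosts first.
printf '1 app 10.0.0.5:3306\n2 app 10.0.0.5:3306\n3 web 10.0.0.9:3306\n' |
  awk '{ split($3, a, ":"); n[a[1]]++ }
       END { for (h in n) print n[h], h }' |
  sort -rn
```

The same awk works on `netstat -an | grep 3306` output if you adjust which field holds the remote address.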