Commands by gnif (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

"Pretty print" $PATH, show directories in $PATH, one per line with replacement pattern using shell parameter expansion
Can be used to create path alias. From: https://www.cyberciti.biz/tips/bash-aliases-mac-centos-linux-unix.html. #9
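The command itself is not reproduced on this page; a minimal bash sketch of the idea replaces every ':' separator in $PATH with a newline via the ${var//pattern/replacement} expansion:

$ echo -e ${PATH//:/\\n}

The same expansion can be wrapped in an alias (e.g. alias path='echo -e ${PATH//:/\\n}'), which is presumably the "path alias" the note refers to.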

sum a column of numbers
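The one-liner itself is not shown here; a common awk form, summing the first column of a hypothetical numbers.txt, looks like this:

$ awk '{ sum += $1 } END { print sum }' numbers.txt

paste -sd+ numbers.txt | bc achieves the same with coreutils and bc.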

Find the package that installed a command
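The original command is not shown; the usual approach, with lsof as a stand-in for the command you are curious about, depends on the package manager:

$ dpkg -S "$(which lsof)"    # Debian/Ubuntu
$ rpm -qf "$(which lsof)"    # Fedora/RHEL/openSUSE

dpkg -S and rpm -qf both map a file path back to the package that owns it.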

Back up and remove files with an access time older than 5 days.
Create an archive of the files whose access time is older than 5 days, then remove the original files.
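A sketch of one way to do this with GNU find and GNU tar (old-files.tar.gz is a placeholder name; --remove-files deletes each file once it has been archived):

$ find . -type f -atime +5 -print0 | tar --null -T - -czf old-files.tar.gz --remove-files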

Rename files in batch
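No command is given here; a portable bash loop, renaming *.txt to *.bak purely as an illustration, uses the ${f%.txt} suffix-stripping expansion:

$ for f in *.txt; do mv -- "$f" "${f%.txt}.bak"; done

On systems where rename is the Perl script, rename 's/\.txt$/.bak/' *.txt does the same in one call.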

Working random fact generator
Though without infinite time and knowledge of how the site will be designed in the future this may stop working, it still serves as a simple, straightforward starting point. It uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact. If future revisions of the page show failure, or intermittent failure, one may simply alter the above to read:

$ wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

The file lastfact can then be examined whenever the command fails.
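The base command that the description tells you to alter is not itself rendered on this page; judging from the description (grep for the line marked <strong>, then let sed strip everything outside the <i>…</i> italics), it was roughly:

$ wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"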

Limit memory usage per script/program
When I'm testing scripts or programs, they sometimes end up using more memory than anticipated. In that case the computer nearly halts due to swap usage, and sometimes I have to press Magic SysRq+REISUB to reboot. So I was looking for a way to limit memory usage per script and found out that ulimit can do it. If you run it this way:

$ ulimit -v 1000000
$ scriptname

then the new memory limit is valid for that shell. I think changing the limit within a subshell is much more flexible, and it won't interfere with your current shell's ulimit settings. Note: -v 1000000 corresponds to approximately 1 GB of RAM, since the limit is given in kilobytes.
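A sketch of the subshell variant the author prefers (scriptname again stands for your own program); the limit applies only inside the parentheses, so the parent shell is untouched:

$ ( ulimit -v 1000000; scriptname )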

duration of the DNS query
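The command itself is missing from this page; one common way to see the timing, using dig from the BIND utilities and example.com as a stand-in domain, is to filter its ";; Query time:" line:

$ dig example.com | grep "Query time"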

drill holes on image

Release memory used by the Linux kernel for caches
The Linux kernel uses otherwise unused memory for caches, so when you run "free" you never see the "real" amount of available memory.
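The command is not shown here; the standard way to drop the page cache, dentries and inodes is via /proc/sys/vm/drop_caches (root required; sync first so dirty pages are written out):

$ sync; echo 3 | sudo tee /proc/sys/vm/drop_caches

Note that this only frees cache the kernel would reclaim on demand anyway, so it is mainly useful for benchmarking.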


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
