On the Mac, the 'ls' command can sort by month/day/time, but it seems to lack the ability to filter on the Year field (field #9 in the long listing). The sorted list keeps increasing the START year for the most recently accessed set of files; the final month printed is the latest month that appeared in that START year. The command operates on the current directory and discards all entries that are themselves directories. If you expect files dating from before 2002, change the START year accordingly.
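The command itself isn't shown above, so here is a hedged sketch of the filtering idea: with BSD/macOS `ls -lT` the full timestamp is printed and the year lands in field 9, so an awk filter can drop directories and pre-START years. The function name and the START value are placeholders, not the original command.

```shell
# Sketch only: assumes BSD/macOS ls, where -lT prints the full timestamp
# and field 9 of each long-listing line is the year.
filter_from_year() {
  start=$1
  # skip directory entries (lines starting with 'd'); keep year >= start
  awk -v start="$start" '!/^d/ && $9 >= start'
}
# Usage (macOS): ls -lT | filter_from_year 2002
```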
Long listing alternative.
zsh globbing and glob qualifiers: '**/*' = recursive, 'om' = ordered by modification time (newest first), '[1,20]' = the first twenty matches. The '-t' switch is passed to ls so that the files are ordered with the most recent at the top. For a more 'find'-like output the following can be used: print -rl **/*(om[1,20])
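For non-zsh shells, a rough equivalent of `print -rl **/*(om[1,20])` can be sketched with GNU find (the `-printf '%T@'` format is a GNU extension, not part of the original tip):

```shell
# Newest 20 files under the current directory, one path per line:
# %T@ prints the mtime as a sortable epoch timestamp, which we then strip.
find . -type f -printf '%T@ %p\n' | sort -rn | head -n 20 | cut -d' ' -f2-
```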
Sometimes you need to monitor a file or directory for changes in size or other attributes. This command prints the attributes of a file (called myfile in the example) at the top of the screen, updating every second. You can change the update interval, the command (e.g., ls -al), or the target (myfile, mydir, etc.).
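The command itself isn't reproduced above; a minimal sketch of the idea, where `myfile` is a placeholder (a temporary file stands in for it here):

```shell
target=$(mktemp)                  # stand-in for "myfile"
snapshot=$(ls -l "$target")       # one reading of its attributes
printf '%s\n' "$snapshot"
# To refresh every second at the top of the screen, wrap it with watch:
#   watch -n 1 ls -l "$target"
```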
A way to display directory structure.
List all files greater than 10 MB. Borrowed from: http://www.tippscout.de/linux-grosze-dateien-finden_tipp_1653.html
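The linked tip is presumably the usual find incantation; a sketch (the `+10M` size suffix works with GNU and modern BSD find, and the threshold is just the one named above):

```shell
# Files strictly larger than 10 MB under the current directory:
find . -type f -size +10M
```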
This one has better performance, as it is a one-pass count with awk. For this script it might not matter, but for others it is a good optimization.
Rename all files in the current directory using names from the text file 'zzz'.
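The original command is not shown, so this is a hypothetical reconstruction: pair each file in the current directory with the corresponding line of the name list, in shell-glob order. The list file itself is skipped, and name collisions are not handled.

```shell
# Assumes one new name per line in the list file, in glob order.
rename_from_list() {
  list=$1
  i=0
  set -- *
  for f in "$@"; do
    [ "$f" = "$list" ] && continue   # don't rename the list itself
    i=$((i + 1))
    new=$(sed -n "${i}p" "$list")    # i-th line = new name for this file
    if [ -n "$new" ]; then
      mv -- "$f" "$new"
    fi
  done
}
# Usage: rename_from_list zzz
```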
Use awk to sum and print the space used by a group of files. It works well as long as the space used is not bigger than 79094548.80... I found that upper limit when trying to find out the total amount of recoverable space from a set of directories:

user@servername:/home/user/scripts> for dirName in aleph_bin aleph_sh aleph_work dailycheck INTERFAZ ; do echo "${dirName} = $(cat /tmp/purge_ocfs_dir.*.log | awk '{sum+=$5} END {printf "%4.2f", sum}') "; done
aleph_bin = 79094548.80
aleph_sh = 79094548.80
aleph_work = 79094548.80
dailycheck = 79094548.80
INTERFAZ = 79094548.80

In the worst-case scenario, the total might be almost 137G:

user@servername:/home/user/scripts> df -h /ocfs/*
Filesystem                   Size  Used Avail Use% Mounted on
//argalephfsprod/aleph_bin$  137G   38G   99G  28% /ocfs/aleph_bin
//argalephfsprod/aleph_sh$   137G   38G   99G  28% /ocfs/aleph_sh
//argalephfsprod/aleph_work$ 280G  135G  146G  49% /ocfs/aleph_work
//argalephfsprod/dailycheck$ 137G   38G   99G  28% /ocfs/dailycheck
//argalephfsprod/INTERFAZ/   137G   38G   99G  28% /ocfs/INTERFAZ

Any suggestion on how to get the correct amount of space for totals over 80 MB?
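A hedged observation: awk sums in double precision, so totals far beyond 79094548.80 are representable, and "%4.2f" only sets a *minimum* field width. The identical per-directory totals above more likely come from cat'ing every /tmp/purge_ocfs_dir.*.log on each iteration. A sketch of a per-directory variant (the one-log-per-directory naming is an assumption):

```shell
# Sum field 5 of the given files (or stdin) and print it with two decimals:
sum_col5() { awk '{ sum += $5 } END { printf "%.2f\n", sum }' "$@"; }
# Hypothetical per-directory loop (log naming is a guess):
#   for dirName in aleph_bin aleph_sh aleph_work dailycheck INTERFAZ; do
#     echo "${dirName} = $(sum_col5 /tmp/purge_ocfs_dir.${dirName}.log)"
#   done
```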
I use this with an alias: alias lsl="ls -1F | grep @ | sed 's/@//' | column"
I use this with an alias: alias lsl="ls -1F | grep @$ | sed 's/@//' | column" Limitation: this will also list files that happen to have an @ at the end of the filename.
Finds the number of files in a directory matching the given mask, looping over the days of the month.
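A hypothetical reconstruction of such a count; the date-stamped filename mask and the function name are assumptions:

```shell
# Count, for each day of the month, the files whose names start with
# PREFIX followed by the zero-padded day number:
count_by_day() {
  prefix=$1
  for day in $(seq -w 1 31); do
    printf '%s %s\n' "$day" "$(ls "${prefix}${day}"* 2>/dev/null | wc -l)"
  done
}
# Usage: count_by_day log.2024-01-
```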
"find ./ ..." could be replaced with "find $PWD ..." to display absolute paths instead of relative paths.
Display a list of local shell scripts soft-linked into /usr/local/bin. Put local shell scripts in your ~/bin/ directory and soft-link them into /usr/local/bin/, which is in $PATH, so they can be run from anywhere.
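A minimal sketch of that layout with stand-in directories (`BIN` and `LOCALBIN` play the roles of ~/bin and /usr/local/bin so nothing system-wide is touched; `-lname` is a GNU find test):

```shell
BIN=$(mktemp -d)       # stand-in for ~/bin
LOCALBIN=$(mktemp -d)  # stand-in for /usr/local/bin
touch "$BIN/myscript.sh"
ln -s "$BIN/myscript.sh" "$LOCALBIN/myscript"
# List the entries of LOCALBIN that are soft links pointing into BIN:
find "$LOCALBIN" -type l -lname "$BIN/*"
```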
This one works without an external program (watch), which is not always available. HINT: use Ctrl+C to exit the loop.
alias lst="ls -ls -tr | tail"