Commands tagged ulimit (7)

  • Toggle shell tracing and verbosity (set -xv) on or off in any Bourne-type shell. If either -x or -v is set, the function turns them both off; if neither is on, both are turned on. (A usage sketch follows this entry.)


    4
    xv() { case $- in *[xv]*) set +xv;; *) set -xv ;; esac }
    cfajohnson · 2010-02-14 20:57:29 3
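A minimal usage sketch (assuming the xv function above has been defined in the current shell, e.g. sourced from ~/.bashrc):

    xv                      # neither -x nor -v is set, so both are turned on
    ls /tmp > /dev/null     # this command is now echoed and traced
    xv                      # -x and -v are set, so both are turned off again
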
  • It is helpful to know the current limits placed on your account, and this shortcut is a quick way to figure out which values to change for optimization or security. The alias is:

    alias ulimith="command ulimit -a|sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/;s/^/ulimit /'|tr '\n' ' ';echo"

    Here's the result of this command:

    ulimit -c 0 -d unlimited -e 0 -f unlimited -i 155648 -l 32 -m unlimited -n 8192 -p 8 -q 819200 -r 0 -s 10240 -t unlimited -u unlimited -v unlimited -x unlimited

    For comparison, the raw ulimit -a output it was built from:

    core file size (blocks, -c) 0
    data seg size (kbytes, -d) unlimited
    scheduling priority (-e) 0
    file size (blocks, -f) unlimited
    pending signals (-i) 155648
    max locked memory (kbytes, -l) 32
    max memory size (kbytes, -m) unlimited
    open files (-n) 8192
    pipe size (512 bytes, -p) 8
    POSIX message queues (bytes, -q) 819200
    real-time priority (-r) 0
    stack size (kbytes, -s) 10240
    cpu time (seconds, -t) unlimited
    max user processes (-u) unlimited
    virtual memory (kbytes, -v) unlimited
    file locks (-x) unlimited

    (A short sketch of the sed transform follows this entry.)


    3
    echo "ulimit `ulimit -a|sed -e 's/^.*\([a-z]\))\(.*\)$/-\1\2/'|tr "\n" ' '`"
    AskApache · 2010-03-12 06:46:54 4
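A quick sketch of what the sed expression does to a single line of ulimit -a output (the sample line is illustrative; values differ per system):

    $ echo 'stack size (kbytes, -s) 10240' | sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/'
    -s 10240

The capture groups pick out the option letter before the closing parenthesis and everything after it; the outer echo "ulimit `...`" then prepends the command name once (the alias form instead prefixes every line via s/^/ulimit /).
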
  • When I'm testing scripts or programs, they sometimes end up using more memory than anticipated. In that case the computer nearly halts because of swap usage, and sometimes I have to press Magic SysRq+REISUB to reboot. So I was looking for a way to limit memory usage per script and found out that ulimit can limit memory. If you run $ ulimit -v 1000000 followed by $ scriptname, the new memory limit is valid for that shell. I think changing the limit within a subshell, as the command below does, is much more flexible, and it won't interfere with your current shell's ulimit settings. Note: -v 1000000 corresponds to approximately 1 GB of RAM (the value is in kilobytes). (A sketch follows this entry.)


    2
    (ulimit -v 1000000; scriptname)
    alperyilmaz · 2011-01-27 21:30:59 6
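A sketch of the subshell form (memtest.sh is a hypothetical script name):

    $ (ulimit -v 1000000; ./memtest.sh)   # ~1 GB virtual-memory cap applies only inside the subshell
    $ ulimit -v                           # the parent shell still reports its original limit
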
  • The default stack size is 10 MB, which makes a multithreaded app fill memory rapidly. On my PC I was able to create only about 300 threads with the default stack size. Lowering the default stack size to the amount your threads actually use lets you create more: with 64 kB I was able to create more than 10,000 threads. Obviously, your threads then shouldn't need more than 64 kB of stack each! (A usage sketch follows this entry.)


    1
    ulimit -s 64
    ioggstream · 2009-08-06 10:40:25 5
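A hedged sketch: lower the stack limit in a subshell so only the thread-heavy program is affected (my_threaded_app is a placeholder name):

    $ ulimit -s                            # show the current soft stack limit, e.g. 8192 or 10240 kB
    $ (ulimit -s 64; ./my_threaded_app)    # threads created here get a 64 kB default stack
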
  • Running this command turns shell tracing and shell verbose debugging on or off. Not only does it do that, it also uses your terminal's built-in method of setting colors (tput) to make debugging much easier. It looks at the current shell options contained in the special bash variable $-, which lets the function set the opposite of the current value. So from the shell you could do: setx; echo "y" | ( cat -t ) | echo "d"; setx and the middle pipeline runs with debugging turned on. This is an amazingly useful function that is perfect to add system-wide by putting it in /etc/profile or /etc/bashrc. You can run it from the shell, and you can also use it in your shell scripts like my .bash_profile - http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html (A usage sketch follows this entry.)


    1
    function setx(){ sed '/[xv]/!Q2' <<< $- && { set +xv; export PS4=">>> "; } || { export PS4="`tput setaf 3`>>> `tput sgr0`"; set -xv; }; }
    AskApache · 2010-02-14 01:25:44 5
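A usage sketch, assuming setx has been added to ~/.bash_profile or /etc/bashrc:

    setx                                 # turns on -xv with a colored ">>> " trace prompt (PS4)
    grep root /etc/passwd > /dev/null    # this pipeline is echoed and traced
    setx                                 # turns -xv off again and sets a plain ">>> " PS4
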
  • There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:

    OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

    This is very useful in scripts because it is a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes a user is currently running. The number of currently running processes is subtracted from the upper limit configured for the account (see limits.conf, pam, initscripts). An easy-to-understand example: this searches the current directory for shell scripts, running up to 100 'file' commands at the same time, which greatly speeds up the command:

    find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

    I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download them all quickly with curl, you could fetch 100 at a time (check ps output on a separate tty for proof) like this:

    cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

    I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jailed shells with a 20-process limit, some with 200, some with 8000, so on the jailed shells a hard-coded xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works. If you were building a process killer (very common on cheap hosting) this would also be handy. Note that if you are only allowed about 20 processes, you should just use -P1 with xargs. (A consolidated sketch follows this entry.)


    1
    echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
    AskApache · 2010-03-12 08:42:49 6
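A consolidated sketch of the pattern described above (url-list.txt and $GROUPNAME are placeholders from the description):

    # half of the remaining process head-room for this account
    OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))
    # throttle parallel downloads to that number of curl processes
    xargs -I '{}' -P $OPTIMUM_P curl -O '{}' < url-list.txt
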

  • 0
    echo $(($(ulimit -u)-$(pgrep -u $USER|wc -l)))
    h3xx · 2011-07-30 05:03:36 3
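A sketch relating this to the previous entry (both report roughly how many more processes $USER may start):

    $ echo $(( $(ulimit -u) - $(pgrep -u "$USER" | wc -l) ))
    $ pgrep -c -u "$USER"     # pgrep can also count matching processes directly, without wc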
