Commands tagged limits.conf (3)

  • It is helpful to know the current limits placed on your account, and this shortcut is a quick way to figure out which values to change for optimization or security. The alias is:

        alias ulimith="command ulimit -a|sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/;s/^/ulimit /'|tr '\n' ' ';echo"

    Sample output (the reformatted one-liner first, then plain ulimit -a for comparison):

        ulimit -c 0 -d unlimited -e 0 -f unlimited -i 155648 -l 32 -m unlimited -n 8192 -p 8 -q 819200 -r 0 -s 10240 -t unlimited -u unlimited -v unlimited -x unlimited

        ulimit -a
        core file size (blocks, -c) 0
        data seg size (kbytes, -d) unlimited
        scheduling priority (-e) 0
        file size (blocks, -f) unlimited
        pending signals (-i) 155648
        max locked memory (kbytes, -l) 32
        max memory size (kbytes, -m) unlimited
        open files (-n) 8192
        pipe size (512 bytes, -p) 8
        POSIX message queues (bytes, -q) 819200
        real-time priority (-r) 0
        stack size (kbytes, -s) 10240
        cpu time (seconds, -t) unlimited
        max user processes (-u) unlimited
        virtual memory (kbytes, -v) unlimited
        file locks (-x) unlimited

    A small script built on the same trick is sketched after this entry.


    3
    echo "ulimit `ulimit -a|sed -e 's/^.*\([a-z]\))\(.*\)$/-\1\2/'|tr "\n" ' '`"
    AskApache · 2010-03-12 06:46:54 4
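
    Since the output is itself a valid ulimit invocation, one handy use is snapshotting the current limits so a later shell can reapply them. A minimal sketch, assuming bash; the save_limits name and the ~/.ulimits path are illustrative additions, not part of the original command:

        #!/bin/bash
        # Sketch: save the current limits as one re-runnable ulimit command.
        # Assumes bash; ~/.ulimits is an illustrative path.
        save_limits() {
            # Same sed trick as above: "core file size (blocks, -c) 0" -> "-c 0"
            echo "ulimit $(ulimit -a | sed 's/^.*\([a-z]\))\(.*\)$/-\1\2/' | tr '\n' ' ')" > ~/.ulimits
        }

        save_limits
        cat ~/.ulimits
        # Later, in another shell:
        #   . ~/.ulimits   # reapplies the saved limits (you can lower them freely;
        #                  # raising hard limits requires root or limits.conf)
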
  • There is a limit on how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200, the following sets OPTIMUM_P to half of the remaining headroom (roughly 100):

        OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

    This is very useful in scripts because it is a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for a given user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

    An easy-to-understand example: this searches the current directory for shell scripts, running up to 100 'file' commands at the same time, which greatly speeds up the command:

        find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

    I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 urls in a text file and want to download all of them fast with curl, you can download 100 at a time (check ps output on a separate [pt]ty for proof):

        cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

    I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jailed shells with a 20-process limit, some with 200, some with 8000. In the jailed shells a hard-coded xargs -P10 would kill my shell or dump core; setting the -P value dynamically, as above, means xargs always works. If you were building a process killer (very common on cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs. A self-contained sketch of this technique follows after this entry.


    1
    echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
    AskApache · 2010-03-12 08:42:49 6
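
    A self-contained version of the idea, as referenced above. The optimum_p name, the floor of 1, and the fallback when ulimit -u reports "unlimited" are defensive additions of this sketch, not part of the original command (which also matches /proc entries by group):

        #!/bin/bash
        # Sketch: derive a safe xargs -P value from the account's process limit.
        # Assumes bash and a Linux-style /proc.
        optimum_p() {
            local max cur p
            max=$(ulimit -u)    # max user processes (limits.conf, pam, initscript)
            [ "$max" = unlimited ] && max=8000    # arbitrary cap for this sketch
            # Counting our /proc entries is far cheaper than ps/who/lsof:
            cur=$(find /proc -maxdepth 1 -user "$USER" -type d | wc -l)
            p=$(( (max - cur) / 2 ))
            echo $(( p > 0 ? p : 1 ))             # never print 0 or a negative
        }

        # Usage from the post: fetch a list of URLs in parallel.
        cat url-list.txt | xargs -I '{}' -P "$(optimum_p)" curl -O '{}'
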

  • 0
    echo $(($(ulimit -u)-$(pgrep -u $USER|wc -l)))
    h3xx · 2011-07-30 05:03:36 3
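
    The same headroom figure works as a guard in scripts. A minimal sketch using the pgrep variant above; the threshold of 5 is an arbitrary illustration:

        #!/bin/bash
        # Sketch: refuse to fork more work when few process slots remain.
        max=$(ulimit -u)
        [ "$max" = unlimited ] && { echo "no per-user process limit"; exit 0; }
        headroom=$(( max - $(pgrep -u "$USER" | wc -l) ))
        if [ "$headroom" -lt 5 ]; then            # threshold is illustrative
            echo "only $headroom process slots left; not forking" >&2
            exit 1
        fi
        echo "ok: $headroom slots free"
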
