All commands (14,187)

  • Worth a try: a nice shell-style interface for searching Google from the command line. Visit http://goosh.org in your browser.


    -4
    http://goosh.org
    unixbhaskar · 2009-08-29 12:19:34 3
  • This command reveals who has logged in to the system and when reboots occurred. It reads /var/log/wtmp, which records every successful login and reboot. It also takes many switches that show who logged in and how long they stayed; a couple of hedged examples follow this entry.


    -3
    last
    unixbhaskar · 2009-08-29 12:08:30 4
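    A couple of hedged examples of those switches (names as documented in last(1); the user name is a placeholder):
        last reboot      # show only the reboot records from wtmp
        last -10 alice   # the last ten logins for a given user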
  • The important thing to note in this command is the "-n" flag: it makes ssh read stdin from /dev/null, so the remote command cannot swallow the rest of servers.txt that the read loop is consuming. A usage sketch follows this entry.


    10
    while read server; do ssh -n user@$server "command"; done < servers.txt
    sharfah · 2009-08-29 06:52:34 8
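    A minimal usage sketch, assuming a servers.txt with one hostname per line (the user name and remote command are placeholders):
        printf '%s\n' web1 web2 db1 > servers.txt
        while read server; do ssh -n admin@"$server" "uptime"; done < servers.txt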
  • This is a wonderful Perl script for checking web server security and vulnerabilities. Get it from http://www.cirt.net/nikto2. Key features of Nikto include:
    - Uses rfp's LibWhisker as a base for all network functionality
    - Main scan database in CSV format for easy updates
    - Determines "OK" vs "NOT FOUND" responses for each server, if possible
    - Determines CGI directories for each server, if possible
    - Switches HTTP versions as needed so that the server understands requests properly
    - SSL support (Unix with OpenSSL, or possibly Windows with ActiveState's Perl/NetSSL)
    - Output to file in plain text, HTML or CSV
    - Generic and "server type" specific checks
    - Plugin support (standard Perl)
    - Checks for outdated server software
    - Proxy support (with authentication)
    - Host authentication (Basic)
    - Watches for "bogus" OK responses
    - Attempts educated guesses for authentication realms
    - Captures/prints any cookies received
    - Mutate mode to "go fishing" on web servers for odd items
    - Builds mutate checks based on robots.txt entries (if present)
    - Scans multiple ports on a target to find web servers (can integrate nmap for speed, if available)
    - Multiple IDS evasion techniques
    - Users can add a custom scan database
    - Supports automatic code/check updates (with web access)
    - Multiple host/port scanning (scan list files)
    - Username guessing plugin via the cgiwrap program and Apache ~user methods
    A hedged multi-port example follows this entry.


    0
    nikto.pl -h yourwebserver
    unixbhaskar · 2009-08-29 04:54:43 8
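    A hedged example using Nikto 2's documented -p/-port option to scan more than one port (verify the option names against your installed version):
        nikto.pl -h yourwebserver -p 80,443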
  • Sometimes you need to run Firefox from the command line just to fix something about it. If an add-on or theme has broken your Firefox settings, fall back to the shell and type the command shown. It opens a small dialog with a few checkboxes and starts the browser in safe mode, deactivating all add-ons and themes except the defaults. Once you have fixed things, close that session and reopen the browser normally; it should work again.


    0
    firefox --safe-mode
    unixbhaskar · 2009-08-29 04:36:19 3
  • Once ssh has connected to the remote server, the command you append runs on that server and its output is returned to your local terminal. Two examples follow this entry.


    -3
    ssh user@remotehost [anycommand]   (e.g. uptime, w)
    unixbhaskar · 2009-08-29 04:27:37 7
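    Two concrete hedged examples (the user and host names are placeholders):
        ssh admin@remotehost uptime
        ssh admin@remotehost 'df -h /'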
  • If you followed my other postings on "vipw" and "vigr", no explanation is needed: this does the same thing for /etc/sudoers. It opens a temporary copy of the sudoers file in vi and holds a lock on it; once you are done, the lock is released and the changes are written back to the original file, so you can edit sudoers securely. visudo parses the file after the edit and will not save the changes if there is a syntax error. On finding an error, it prints the line number(s) where the error occurred and shows a "What now?" prompt. At that point you may enter "e" to re-edit the sudoers file, "x" to exit without saving, or "Q" to quit and save anyway. Use "Q" with extreme care: if visudo believes there is a parse error, so will sudo, and no one will be able to sudo again until the error is fixed. If you type "e" after a parse error has been detected, the cursor is placed on the offending line (if the editor supports this). PS: I have seen it misbehave on some distributions, notably SLES, but the problem can be worked around with a little caution. A syntax-check example follows this entry.


    -3
    visudo
    unixbhaskar · 2009-08-29 04:06:11 3
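    A hedged addition: visudo can also just check the file without opening an editor, which is handy in scripts:
        visudo -c    # parse /etc/sudoers and report syntax errors, no editing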
  • If you followed my previous posting on "vipw", no explanation is needed; the same mechanism is behind this command. It opens a temporary copy of /etc/group in vi and, most importantly, attaches a lock to it. Once you are done, the lock is released and the changes are written back to the original file, so you can safely edit the group file without fear of it being tampered with mid-edit. A related option is noted after this entry.


    -3
    vigr
    unixbhaskar · 2009-08-29 03:56:07 3
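    A hedged note: on most Linux systems vigr also takes -s to edit the shadowed group file under the same lock:
        vigr -s    # edit /etc/gshadow instead of /etc/group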
  • A bit of explanation is needed here. The command opens a temporary copy of the password file in vi, and nothing is written to /etc/passwd until you save it. The reason to prefer it over other ways of editing the password file is that it locks the file while you work, so nobody can tamper with it in the meantime; when you save the temporary file the lock is released. It is the safer way to handle sensitive files like passwd; do not edit them directly with cat, nano or other tools. A related option is noted after this entry.


    -3
    vipw
    unixbhaskar · 2009-08-29 03:46:42 3
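    A hedged note: vipw likewise takes -s to edit the shadow file under the same lock:
        vipw -s    # edit /etc/shadow instead of /etc/passwd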
  • After you install slocate, the first thing to do is initialise its database with "slocate -u". From then on, passing a file or directory name as an argument to slocate prints the matching paths found on the system. It is also a more secure way of searching, since results are limited to files the calling user is allowed to see. A short usage sketch follows this entry.


    -3
    slocate filename/dirname
    unixbhaskar · 2009-08-29 03:28:08 3
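    A minimal usage sketch (the filename is only an example):
        sudo slocate -u        # build or refresh the database first
        slocate sshd_config    # then search by file name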
  • Copy the link to an HD movie trailer into this command. It is more elegant if put into a script that takes the URL as input. The sed expression rewrites the resolution part of the filename (see the illustration after this entry), and the -U option fakes a QuickTime user agent so Apple's server accepts the request.


    5
    wget -U "QuickTime/7.6.2 (qtver=7.6.2;os=Windows NT 5.1Service Pack 3)" `echo http://movies.apple.com/movies/someHDmovie_720p.mov | sed 's/\([0-9][0-9]\)0p/h\10p/'`
    deadrabbit · 2009-08-29 00:29:40 21
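    A quick illustration of what the sed expression does to the URL (the filename here is made up):
        echo http://movies.apple.com/movies/example_720p.mov | sed 's/\([0-9][0-9]\)0p/h\10p/'
        # prints http://movies.apple.com/movies/example_h720p.mov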

  • 14
    rename -v 's/ /_/g' *
    fsilveira · 2009-08-28 17:28:57 7

  • 23
    tr : '\n' <<<$PATH
    foob4r · 2009-08-28 15:45:09 7
  • Another method: awk '{gsub(/:/, "\n"); print}'. A bash-only variant follows this entry.


    1
    sed 's/:/\n/g' <<<$PATH
    twfcc · 2009-08-28 14:12:07 3
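    A hedged bash-only alternative using parameter expansion (no external command):
        nl=$'\n'; echo "${PATH//:/$nl}"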
  • Compile each *.c file in the current directory with "gcc -Wall", using the file name without its extension as the output file. A variant that avoids parsing ls follows this entry.


    -7
    ls *.c | while read F; do gcc -Wall -o `echo $F | cut -d . -f 1 - ` $F; done
    pichinep · 2009-08-28 13:01:56 9
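    A hedged variant that avoids piping ls and copes with unusual file names, using bash parameter expansion:
        for F in *.c; do gcc -Wall -o "${F%.c}" "$F"; done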

  • 13
    URL="http://www.google.com";curl -L --w "$URL\nDNS %{time_namelookup}s conn %{time_connect}s time %{time_total}s\nSpeed %{speed_download}bps Size %{size_download}bytes\n" -o/dev/null -s $URL
    adminix · 2009-08-28 12:30:56 8
  • First, we convert the VMware avi (VMnc codec) to a standard avi. Next, we convert that avi to FLV. You can play around with the -r switch (frames per second) and the -b switch (bitrate), but the larger they get, the larger your FLV file. A hedged variation follows this entry.


    0
    mencoder -of avi -ovc lavc movie.avi -o movie2.avi; ffmpeg -i movie2.avi -r 12 -b 100 movie.flv
    dcabanis · 2009-08-28 11:05:21 3
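    A hedged variation with a higher frame rate and bitrate (expect a larger FLV; the "k" suffix assumes an ffmpeg build that accepts it):
        ffmpeg -i movie2.avi -r 25 -b 500k movie.flv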

  • 0
    find . -type f -exec dos2unix {} +
    pavanlimo · 2009-08-28 08:44:11 6
  • It is a much better tool than nslookup for getting information about any site, with greater capabilities too. For reverse lookups use the -x switch with an IP address. A couple of examples follow this entry.


    -1
    dig google.com
    unixbhaskar · 2009-08-28 04:32:52 3
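    A couple of hedged examples (the IP address is only an illustration):
        dig -x 208.67.222.222      # reverse lookup for an IP address
        dig google.com MX +short   # only the mail-exchanger records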
  • This is a command-line utility that reports a fair amount of information about the attached network card. Two useful sub-switches are noted after this entry.


    0
    ethtool eth0
    unixbhaskar · 2009-08-28 04:22:03 3
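    A hedged note on two common sub-switches:
        ethtool -i eth0    # driver and firmware information
        ethtool -S eth0    # NIC statistics, if the driver exposes them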
  • A wonderful command-line utility for checking network usage. It has many useful switches to display exactly the data you want; see the man page for all of them. Get it from http://humdi.net/vnstat. A few common switches follow this entry.


    0
    vnstat
    unixbhaskar · 2009-08-28 04:14:42 4
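    A few of those switches, hedged (the interface name is a placeholder):
        vnstat -d -i eth0    # daily totals for one interface
        vnstat -m            # monthly totals
        vnstat -l            # live traffic rate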
  • Shows TCP connection information with extended detail: -t selects TCP, -p adds the owning process, and -e adds extended fields such as the user and inode (use it twice for maximum detail). A listening-only variant follows this entry.


    3
    sudo /bin/netstat -tpee
    unixbhaskar · 2009-08-28 04:02:10 5
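    A hedged variant restricted to listening TCP sockets, with numeric addresses:
        sudo netstat -tlnpe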
  • As mentioned in the summary, this is a powerful command for monitoring system activity. It combines the capabilities of vmstat, iostat, mpstat, df, free and sar: instead of firing each command separately, one command gives you all the information at once. There is also a way to ask for individual pieces only; see the man page. You can get it from http://dag.wieers.com/home-made/dstat/. An example follows this entry.


    4
    dstat -afv
    unixbhaskar · 2009-08-28 03:53:24 9
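    A hedged example asking for individual pieces only, refreshing every five seconds:
        dstat -cdn 5    # cpu, disk and network columns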

  • -2
    du -s `find . -maxdepth 1 \! -name '.'` | sort -n | tail
    hydhaval · 2009-08-28 00:24:11 6
  • Of course you need to be able to access host A for this ;-) A chained and a more modern variant are noted after this entry.


    40
    ssh -t hostA ssh hostB
    0x89 · 2009-08-27 21:35:19 19
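    A hedged note: the same trick chains across more hops, and OpenSSH 7.3+ has a dedicated jump-host option:
        ssh -t hostA ssh -t hostB ssh hostC
        ssh -J hostA hostB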