Commands tagged login (14)

  • This tells lynx to read keystrokes from the specified file, which can be used in a cron job to auto-login on websites that give you points for logging in once a day *cough cough* (which is why I used -accept_all_cookies). To create your keystroke file, use: lynx -cmd_log yourfile


    26
    lynx -accept_all_cookies -cmd_script=/your/keystroke-file
    Alanceil · 2009-03-17 00:38:36 13
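
    A minimal cron sketch of the idea; the keystroke file, URL and schedule are examples, not from the original post (you may also need the full path to lynx in cron):

    # record the keystrokes of one manual login session
    lynx -cmd_log=/home/you/.lynx-login.keys http://example.com/login
    # crontab entry: replay the recorded session every day at 07:00
    0 7 * * * lynx -accept_all_cookies -cmd_script=/home/you/.lynx-login.keys http://example.com/login
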
  • I'm annoyed by the boilerplate "don't log in unless you are supposed to" messages in our environment - this shuts them up.


    6
    touch ~/.hushlogin
    elofland · 2012-12-05 18:03:41 6
  • This parses a random command JSON entry from http://commandlinefu.com - a must-have in your .bash_profile to learn new shell goodies at login!


    4
    echo -e "`curl -sL http://www.commandlinefu.com/commands/random/json|sed -re 's/.*,"command":"(.*)","summary":"([^"]+).*/\\x1b[1;32m\2\\n\\n\\x1b[1;33m\1\\x1b[0m/g'`\n"
    xenomuta · 2012-08-17 11:47:20 15
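
    One way to wire this into ~/.bash_profile; the only change from the command above is a 3-second curl timeout (my addition, tune to taste) so a dead network cannot hang the login:

    # ~/.bash_profile: print a random commandlinefu tip at login, giving curl at most 3 seconds
    echo -e "`curl -s -m 3 -L http://www.commandlinefu.com/commands/random/json|sed -re 's/.*,"command":"(.*)","summary":"([^"]+).*/\\x1b[1;32m\2\\n\\n\\x1b[1;33m\1\\x1b[0m/g'`\n"
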
  • You need to install "sshpass" for this to work: apt-get install sshpass


    4
    sshpass -p "YOUR_PASSWORD" ssh -o StrictHostKeyChecking=no YOUR_USERNAME@SOME_SITE.COM
    o0110o · 2013-05-24 14:33:38 12
  • One of my favorite ways to impress newbies (and old hats) with the power of the shell is to give them an incredibly colorful and amazing version of the top command that runs once upon login, just like running fortune on login. It's pretty sweet, believe me. Just add this one-liner to your ~/.bash_profile -- and of course you can set the height to anything, from 1 line to 1000:

    G=$(stty -g);stty rows $((${LINES:-50}/2));top -n1; stty $G;unset G

    It doesn't take more than the toprc file mentioned in the notes below, and you get all 4 top windows showing output at the same time, each with a different color scheme and each showing different info. Each window would normally take up 1/4 of your screen when run like that - top is designed as a full-screen program. But here's where you might learn something new today on this great site: by using the stty command to change the terminal's internal idea of your window size, you force top to think that way as well.

    # save the correct settings to the G variable
    G=$(stty -g)
    # change the number of rows to half the actual amount (falling back to 50 if LINES is unset)
    stty rows $((${LINES:-50}/2))
    # run top non-interactively for one iteration; the output stays on the screen (half of it at least)
    top -n1
    # reset the terminal back to the correct values, and clean up after yourself
    stty $G;unset G

    This trick is from my bash_profile ( http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html ), though the online version will be updated soon. Just think what else you could run like this!

    Note 1: I had to edit the toprc file out because this site can't handle uploads/included code, so grab it from my site ( http://www.askapache.com/linux-unix/bash-power-prompt.html ).
    Note 2: I had to come back and edit again because the links weren't being correctly parsed.


    3
    G=$(stty -g);stty rows $((${LINES:-50}/2));top -n1; stty $G;unset G
    AskApache · 2010-04-22 18:52:49 13
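
    The same trick wrapped as a ~/.bash_profile function; the function name is made up, and it assumes the custom toprc from the linked page is already in place:

    # show a half-height snapshot of top at login, then restore the terminal geometry
    login_top() {
      local saved rows
      saved=$(stty -g)                 # remember the current terminal settings
      rows=$(( ${LINES:-50} / 2 ))     # half the terminal height, or 25 if LINES is unset
      stty rows "$rows"                # make top believe the terminal is that short
      top -n1                          # one non-interactive iteration of top
      stty "$saved"                    # put the real geometry back
    }
    login_top
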
  • Also with optional message: echo "no login for you" > /etc/nologin (This doesn't affect your current X session - you're already logged in!)


    2
    touch /etc/nologin
    udim · 2009-04-29 19:43:14 4
  • Optionally, pipe the output through http://sed.sourceforge.net/grabbag/scripts/html2iso.sed (with sed -f) instead of recode; a sketch of that variant follows this entry.


    2
    wget -qO - http://www.asciiartfarts.com/random.cgi | sed -n '/<pre>/,/<\/pre>/p' | sed -n '/<table*/,/<\/table>/p' | sed '1d' | sed '$d' | recode html..ascii
    krunktron · 2013-08-17 19:42:47 9
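
    A sketch of the html2iso.sed variant the description mentions; the local path for the downloaded script is an assumption:

    # fetch the sed helper once, then use it in place of recode
    wget -q -O /tmp/html2iso.sed http://sed.sourceforge.net/grabbag/scripts/html2iso.sed
    wget -qO - http://www.asciiartfarts.com/random.cgi | sed -n '/<pre>/,/<\/pre>/p' | sed -n '/<table*/,/<\/table>/p' | sed '1d' | sed '$d' | sed -f /tmp/html2iso.sed
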
  • The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). By default the file is readable and writable by root only. The list can get very long, with lots of bots hammering away at your machine, so sometimes it is more important to see the scale of things - in this case the volume of failed logins tied to each source IP.

    The awk statement checks whether the 3rd field looks like an IP address and, if so, increments the running count of failed attempts for that address; at the end it prints each IP and its count. The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, which is why the count is column 3, and that -n and -k can be combined into -nk.

    Be aware that the btmp file will contain every failed login unless it is explicitly rolled over. It should be safe to delete or archive the file after you've processed it.


    1
    sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
    sgowie · 2012-09-11 14:51:10 4
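
    The same pipeline spread across lines with comments; behaviour should match the one-liner above:

    sudo lastb |                                   # failed logins from /var/log/btmp (root only)
      awk '$3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/ {  # field 3 looks like an IPv4 address
             a[$3]++                                          # count attempts per source IP
           }
           END { for (i in a) print i " : " a[i] }' |         # emit "IP : count"
      sort -nk 3                                   # ":" is column 2, so the count is column 3
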

  • 0
    ac -p | sort -nk 2 | awk '/total/{print x};{x=$1}'
    pmbuko · 2011-12-14 15:47:02 3
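
    No description was posted for this one. As far as I can tell it prints the user with the most accumulated connect time (ac comes from the acct/psacct package). A commented breakdown of the same pipeline - my reading, not the author's:

    ac -p |                           # per-user connect time in hours, ending in a "total" line
      sort -nk 2 |                    # sort by the hours column; the grand total sorts last
      awk '/total/{print x}; {x=$1}'  # remember each username; when "total" appears, print the previous one
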
  • The above is OK if you're not worried about security. As the sshpass man page puts it: "The -p option should be considered the least secure of all of sshpass's options. All system users can see the password in the command line with a simple "ps" command." So instead I use the -e option: "-e  The password is taken from the environment variable SSHPASS."


    0
    SSHPASS='your_password' sshpass -e ssh me@myhost.com
    djkadu · 2013-06-03 12:26:40 6
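
    Even with -e, an inline SSHPASS='...' assignment still lands in your shell history. A slightly safer sketch that prompts for the password instead (host and user copied from the example above):

    read -rs -p "Password: " SSHPASS && echo && export SSHPASS   # typed interactively, never written to history
    sshpass -e ssh me@myhost.com
    unset SSHPASS                                                # drop the password from the environment when done
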
  • Display IPs that unsuccessfully attempted to log in 5 or more times today. You may want to filter out any trusted IPs and the localhost. Useful for obtaining a list of IP addresses to block on the firewall.


    -1
    lastb -i | grep "$(date '+%a %b %d')" | awk '{ print $3 }' | sort | uniq -c | awk '{ if ($1 >= 5) print $2; }'
    forestb · 2015-11-20 07:19:20 10
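
    A hedged sketch of turning that list into firewall rules; the DROP-on-INPUT policy is just an example, and it assumes root privileges and that you've reviewed the list first:

    lastb -i | grep "$(date '+%a %b %d')" | awk '{ print $3 }' | sort | uniq -c | awk '{ if ($1 >= 5) print $2; }' |
      while read -r ip; do
        iptables -A INPUT -s "$ip" -j DROP   # block each offending source address
      done
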
  • Explanation: https://stackoverflow.com/questions/2257441/random-string-generation-with-upper-case-letters-and-digits/23728630#23728630
    Why 16 characters: https://www.wired.com/story/7-steps-to-password-perfection/


    -1
    python -c "import string; import random;print(''.join(random.SystemRandom().choice(string.ascii_uppercase + string.digits + string.ascii_lowercase) for _ in range(16)))"
    rootduck · 2019-06-14 17:35:12 39
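
    On Python 3.6+ the secrets module is designed for this; a roughly equivalent one-liner with the same length and character set:

    python3 -c "import secrets, string; print(''.join(secrets.choice(string.ascii_letters + string.digits) for _ in range(16)))"
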
  • Change :alnum: to :graph: for all printable characters (written out after this entry).


    -2
    cat /dev/urandom |tr -c -d '[:alnum:]'|head -c 16;echo
    AndrewM · 2019-06-17 17:51:04 36
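
    The :graph: variant from the description, written out (also reading /dev/urandom directly instead of through cat):

    tr -cd '[:graph:]' < /dev/urandom | head -c 16; echo
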
  • This command reveals which logins have been made to the system, as well as when reboots occurred. It reads /var/log/wtmp, which records all the successful logins and reboots. It has many switches, with which you can get an idea of when people logged in and how long they stayed.


    -3
    last
    unixbhaskar · 2009-08-29 12:08:30 4
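
    A few of the switches the description alludes to (standard in util-linux last; check your man page):

    last -F           # show full login and logout times and dates
    last -x           # also show shutdown and runlevel change entries
    last reboot       # just the reboot history from wtmp
    last -n 10 alice  # the last 10 sessions for user "alice" (the username is an example)
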
