Commands by unixmonkey57804 (1)

  • In place of "output-filename.mp4", put the name you want the file saved as. In place of "youtube-video-link", put the link of the video page, e.g. http://www.youtube.com/watch?v=AclA-7YntvE. In place of "format-number", put the number of the file format you would like. To get the format number, first run youtube-dl -F "youtube-video-link"; it will list all the available formats with their format numbers (for example, to download the 360p MP4, use the number "18"). To let it automatically fetch the best quality available, just remove -f "format-number" and you are good to go. A worked example follows the command below.


    wget -O "output-filename.mp4" $( youtube-dl -g -f "format-number" "youtube-video-link" )
    unixmonkey57804 · 2013-05-19 16:25:30
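
For example, to grab the 360p MP4 (format 18) of the sample link above (a worked illustration; the actual -F listing varies per video, and myvideo.mp4 is a placeholder name):

    youtube-dl -F "http://www.youtube.com/watch?v=AclA-7YntvE"
    wget -O myvideo.mp4 $( youtube-dl -g -f 18 "http://www.youtube.com/watch?v=AclA-7YntvE" )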

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Get AWS temporary credentials ready to export based on an MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass temporary credentials generated with that MFA token. This command asks you for the MFA code and retrieves the credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include: * $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one * the TTL for the credentials
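
The command itself isn't reproduced on this page; a minimal sketch of the likely underlying call (the ARN and the 900-second TTL are assumptions):

    # hypothetical ARN of the virtual MFA device
    MFA_ID="arn:aws:iam::123456789012:mfa/my-user"
    read -p "MFA code: " MFA_CODE
    # request temporary credentials valid for 900 seconds
    aws sts get-session-token --serial-number "$MFA_ID" --token-code "$MFA_CODE" \
        --duration-seconds 900 \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' --output text

Its text output is three whitespace-separated fields, which is what the awk snippet above expects.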

add lyrics

Create a local compressed tarball from remote host directory
The command uses ssh(1) to get to a remote host, uses tar(1) to archive a remote directory, prints the result to STDOUT, which is piped to gzip(1) to compress to a local file. In other words, we are archiving and compressing a remote directory to our local box.
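
A minimal sketch, with the host and paths as placeholders:

    # archive remotely, write to STDOUT, compress locally
    ssh user@remotehost 'tar cf - /path/to/dir' | gzip > dir.tar.gz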

Convert Squid unixtime logs in human-readable ones
On-the-fly conversion of Unix time to human-readable timestamps in Squid's access.log
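
A minimal sketch of such a conversion (assumes GNU awk, which provides strftime; the log path is illustrative):

    # rewrite the first field (epoch seconds) as a readable timestamp
    tail -f /var/log/squid/access.log | gawk '{ $1 = strftime("%Y-%m-%d %H:%M:%S", $1); print }'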

back ssh from firewalled hosts
Host B (you) redirects a modem port (62220) to its local ssh. Host A is the remote machine (the one that issues the ssh command). Once connected, port 5497 is in listening mode on host B; host B then just does ssh 127.0.0.1 -p 5497 -l user and reaches the remote host's ssh. This can also be used for VNC and so on.
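
A minimal sketch (host names and the sshd ports are assumptions; 62220 is the redirected modem port from the description):

    # on host A: connect out to B and open a reverse tunnel back to A's sshd
    ssh -p 62220 -R 5497:127.0.0.1:22 user@host-B
    # then, on host B: reach host A's ssh through the tunnel
    ssh 127.0.0.1 -p 5497 -l user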

Build an exhaustive list of maildir folders for mutt
Of course, you can adjust "Maildir" to your config...
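
A minimal sketch of one way to build such a list (assumes GNU find, a maildir tree under ~/Maildir, and "set folder=~/Maildir" in your muttrc):

    # every maildir folder contains a cur/ subdirectory; list their parents
    find ~/Maildir -type d -name cur -printf '%h\n' | sed "s|^$HOME/Maildir/|+|" | xargs echo mailboxes >> ~/.muttrc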

Remount root in read-write mode.
Saved my day when my hard drive got stuck in read-only mode.
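
The classic form (run as root):

    # remount the root filesystem read-write in place
    mount -o remount,rw /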

I finally found out how to use notify-send with at or cron
You can write a script that does this: $remind []
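
The usual catch is that at and cron jobs don't inherit your session environment, so DISPLAY must be set explicitly (newer desktops may also need DBUS_SESSION_BUS_ADDRESS). A minimal sketch, with the display number and message as assumptions:

    # schedule a desktop notification three minutes from now
    echo 'DISPLAY=:0 notify-send "Reminder" "Tea is ready"' | at now + 3 minutes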

Merge files, joining each line in one line
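
The description is empty; a plausible reading uses paste, which joins corresponding lines of its inputs into one line (filenames are placeholders):

    # line N of file1 and line N of file2 become one output line
    paste -d ' ' file1.txt file2.txt
    # or flatten a single file into one line
    paste -s -d ' ' file1.txt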

Wait for file to stop changing
Here's a way to wait for a file (a download, a logfile, etc.) to stop changing, then do something. As written it will just return to the prompt, but you could add a "; echo DONE" or whatever at the end.

It simply compares the full output of "ls" every 10 seconds and keeps going as long as that output has changed since the last interval. If the file is being appended to, the size will change, and if it's being modified without growing, the timestamp from the "--full-time" option will have changed. The output of just "ls -l" isn't sufficient, since by default it doesn't show seconds, just minutes.

Waiting for a file to stop changing is not a very elegant or reliable way to tell that some process is finished; if you know the process ID there are much better ways. This method will also give a false positive if the changes to the target file are delayed longer than the sleep interval for any reason (network timeouts, etc.). But sometimes the process writing the file doesn't exit, rather it continues on doing something else, so this approach can be useful if you understand its limitations.
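
The command itself isn't reproduced here; a minimal sketch matching the description (GNU ls for --full-time; the filename is a placeholder):

    # loop until two consecutive 10-second checks of the ls output match
    f=download.iso; prev=; while [ "$(ls -l --full-time "$f")" != "$prev" ]; do prev=$(ls -l --full-time "$f"); sleep 10; done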


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).
