Commands tagged Stremio (1)

  • I couldn't find the movie library in any of Stremio's SQLite databases, but in ~/.config/stremio/backgrounds2 the background-image filenames correspond to IMDb title IDs. So I loop over those files, fetch each movie's HTML title with wget, and append it to a file. Note that this retrieves all movie names, not just the Library.


    time for movie in $(ls -1 $HOME/.config/stremio/backgrounds2 | sort -u);do wget -qO- --header="Accept-Language: en" "https://www.imdb.com/title/$movie/" | hxselect -s '\n' -c 'title' 2>/dev/null | tee -a ~/movie-list.txt ; done
    pabloab · 2018-08-16 06:11:41
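
If hxselect (from html-xml-utils) isn't available, here is a rough sketch of the same idea using sed instead; like the original, it assumes the filenames in backgrounds2 are bare IMDb title IDs:

    # fetch each IMDb page and pull the contents of its <title> tag
    for movie in "$HOME"/.config/stremio/backgrounds2/*; do
      wget -qO- --header="Accept-Language: en" "https://www.imdb.com/title/$(basename "$movie")/" \
        | sed -n 's/.*<title>\([^<]*\)<\/title>.*/\1/p'
    done | tee -a ~/movie-list.txt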



Check These Out

Find files that were modified by a given command
This has helped me numerous times when trying to find log or tmp files that get created by a command, and it is quite eye-opening to see how active a given process really is. Play around with -anewer, -cnewer & -newerXY.
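
A minimal sketch of the idea, using a timestamp marker file (the command and paths are illustrative):

    # create a marker, run the command, then list files modified after it
    touch /tmp/marker
    make install                       # the command under investigation
    find / -xdev -type f -newer /tmp/marker 2>/dev/null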

send substituted text to a command without echo, pipe
zsh only - this avoids the need for echo "message" |, which creates an entire subshell. Also, the text you are most likely to edit sits at the very end of the line, which, in my opinion, makes it slightly easier to edit.
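
A sketch using a here-string, which keeps the editable text at the very end of the line (zsh; bash supports <<< as well):

    # feeds the string to grep's stdin with no echo and no pipe
    grep -i error <<< "the text you keep editing"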

Show the command line of a process that uses a specific port (Ubuntu)
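
The command itself wasn't preserved here; one way to do it on Ubuntu might be (port 8080 is illustrative):

    # ss prints the PID and process name listening on the port
    sudo ss -ltnp 'sport = :8080'
    # or resolve the PID with lsof and read its full command line from /proc
    pid=$(sudo lsof -t -iTCP:8080 -sTCP:LISTEN)
    tr '\0' ' ' < "/proc/$pid/cmdline"; echo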

Reinstall a Synology NAS from the command line without losing any data.
I've seen many questions about how to reinstall a Synology NAS (DSM) without losing data, so here you go. Wait a few minutes, then head over to http://nasip and set up your freshly installed NAS.

use wget to check if a remote file exists
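
The command wasn't preserved either; presumably something along the lines of wget's --spider option, which checks the URL without downloading it (the URL is illustrative):

    # exit status 0 means the remote file exists
    wget -q --spider https://example.com/file.iso && echo exists || echo missing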

Create a mirror of a local folder, on a remote server
Create an exact mirror of the local folder "/root/files" on the remote server 'remote_server' over SSH (port 22). All files and folders on the destination that are not present locally will be deleted.
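
A sketch of such a mirror with rsync over SSH; --delete is what makes the destination an exact mirror (the destination path is an assumption):

    rsync -az --delete -e 'ssh -p 22' /root/files/ remote_server:/root/files/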

Kill all processes that listen on ports beginning with 50 (50, 50x, 50xxx, ...)
Run netstat as root (via sudo) to get the PID of the process listening on the desired socket. Use awk to 1) match the entry that is the listening socket, 2) match the exact port (bounded by the leading colon and the end of the column), 3) strip the trailing slash and process name from the last column, and finally 4) use the system(…) command to call kill and terminate the process. That is two direct commands, netstat & awk, and one forked call to kill. Note that, despite the title, this kills the process on one specific port rather than on every port starting with 50, which I consider safer.
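
A sketch matching that description (port 50005 is illustrative):

    # $4 is the local address, $7 is "PID/name": keep LISTEN sockets on
    # exactly port 50005, strip the "/name" suffix, and kill the PID
    sudo netstat -tlnp | awk '$6 == "LISTEN" && $4 ~ /:50005$/ { sub(/\/.*/, "", $7); system("kill " $7) }'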

Use tee to process a pipe with two or more processes
Tee can be used to split a pipe into multiple streams for one or more processes to work on. You can add more ">()" substitutions for even more fun.
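
For example (bash/zsh process substitution; the filenames are illustrative):

    # one copy is line-counted, one is hashed, and the original stream continues to gzip
    seq 1 100000 | tee >(wc -l > lines.txt) >(sha256sum > sha.txt) | gzip > data.gz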

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
      declare -i SS="$1"
      D=$(( SS / 86400 ))
      H=$(( SS % 86400 / 3600 ))
      M=$(( SS % 3600 / 60 ))
      S=$(( SS % 60 ))
      [ "$D" -gt 0 ] && echo -n "${D}:"
      # print hours whenever days were printed, even if the hour count is zero
      [ "$D" -gt 0 -o "$H" -gt 0 ] && printf "%02d:" "$H"
      printf "%02d:%02d\n" "$M" "$S"
    }
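
A quick check (93784 s = 1 day, 2 hours, 3 minutes, 4 seconds):

    sec2dhms 93784    # prints 1:02:03:04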

TCPDUMP & Save Capture to Remote Server w/ GZIP
This is useful when the local machine where you need to run the packet capture with tcpdump doesn't have enough room to save the file, but a remote host does. PC1 captures its eth0 interface and streams the raw capture over SSH to the remote host, which gzips it to /tmp/eth0.pcap.gz:

    tcpdump -i eth0 -w - | ssh forge.remotehost.com -c arcfour,blowfish-cbc -C -p 50005 "cat - | gzip > /tmp/eth0.pcap.gz"

NOTE: when opening the file you might need to strip the very top line with notepad++, as it is a stray header. More info @: http://www.kossboss.com/linuxtcpdump1
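
Later, on the remote host, the capture can be decompressed and replayed (filenames match the example above):

    gunzip /tmp/eth0.pcap.gz
    tcpdump -nn -r /tmp/eth0.pcap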

