Commands tagged musicbrainz (1)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Convert JSON to YAML
Convert JSON to YAML. Note that you'll need to have PyYAML installed.
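
The command itself is not shown on this page; a minimal one-liner sketch, assuming Python with PyYAML installed and file.json as a placeholder input:

    python -c 'import sys, json, yaml; print(yaml.safe_dump(json.load(sys.stdin)))' < file.json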

Create a .png from a command output and upload to ompdlr.org, give URI
Create a .png from a command's output (or whatever), then upload it to ompdlr.org and print the URI.
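
The command is not reproduced here. A sketch of the rendering half using ImageMagick's convert, which turns stdin text into a PNG; the upload URL is a placeholder assumption, as ompdlr.org's API is not described on this page:

    ls -l | convert label:@- output.png                    # render command output as a PNG
    curl -F 'file1=@output.png' http://ompdlr.org/upload   # placeholder endpoint, not verified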

Display a list of committers sorted by the frequency of commits
Use this command to get a list of committers sorted by the frequency of their commits.
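
The command is not reproduced here; assuming a git repository, a sketch that produces such a list:

    git shortlog -s -n        # or: git log --format='%aN' | sort | uniq -c | sort -rn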

Convert a video's audio track to Ogg Vorbis
Assumes you have ffmpeg and oggenc. Similar to other scripts here, but this time outputting to Ogg Vorbis. I added the variable assignment for a nice output name. This is part of an interactive bash script I have with a few little multimedia tasks in it. http://www.dward.us/software/VSAK.sh
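
The script itself is not shown here. A sketch of the described pipeline, assuming ffmpeg and oggenc are installed; the input file name and quality setting are placeholders:

    input=video.avi                                   # placeholder source file
    ffmpeg -i "$input" -vn -f wav - | oggenc -q 5 -o "${input%.*}.ogg" -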

Export log to html file
Logtool is a nice tool that can export log files to various formats, but its strength lies in its capacity to colorize logs. This command takes a log as input, colorizes it, then exports it to an HTML file for a more comfortable view. Logtool is part of the logtool package. Tested on Debian.
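
The command itself is missing from this listing. A plausible invocation, assuming logtool reads the log on stdin and takes an output-format option; the exact flag is an assumption, consult logtool(1):

    logtool -o HTML < /var/log/syslog > syslog.html   # '-o HTML' is assumed, not verified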

defragment files
Thanks to flatcap for optimizing this command. This command takes advantage of the ext4 filesystem's resistance to fragmentation: files that were previously fragmented are copied / deleted / pasted, essentially giving the filesystem another chance to save each file contiguously (unlike FAT / NTFS, *nix filesystems always try to save a file without fragmenting it). My command only affects the home directory, and only those files with your R/W (read / write) permissions.

There are two issues with this command:

1. It really won't help much. It works, but Linux doesn't suffer much (if any) fragmentation, and even fragmented files have fast I/O.
2. It doesn't discriminate between fragmented and non-fragmented files, so a large ~/ directory with no fragments will take almost as long as an equally sized fragmented ~/ directory.

The benefits I managed to work into the command:

1. It only defragments files under 16 MB, because a large file with fragments isn't as noticeable as a small fragmented file, and copy / delete / paste of large files would take too long.
2. It gives a nice countdown in the terminal so you know how much progress is being made, and just like other defragmenters you can stop at any time (use Ctrl+C).
3. It's fast! I can defrag my ~/ directory in 11 seconds, thanks to the ramdrive powering the command's temporary storage.

Bottom line:

1. It's only an experiment. It's safe (I've used it several times for testing) but probably not very effective (unless you somehow have a fragmentation problem on Linux). It might be a placebo for recent Windows converts who are looking for a defrag utility on Linux and won't accept no for an answer.
2. It's my first commandlinefu command.
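
The command itself is not shown on this page. A rough sketch of the copy / delete / paste technique it describes, assuming /dev/shm as the RAM-backed temporary storage and limiting the scan to writable files under 16 MB (the original's countdown display is omitted):

    # rewrite each small file via a ramdrive so the filesystem can reallocate it contiguously
    find "$HOME" -type f -writable -size -16M -print0 |
    while IFS= read -r -d '' f; do
        cp -p "$f" /dev/shm/defrag.$$ && rm -- "$f" && cp -p /dev/shm/defrag.$$ "$f"
    done
    rm -f /dev/shm/defrag.$$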

Find files created today
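
No description or command accompanies this entry. A sketch using GNU find; note that classic Unix filesystems track modification rather than creation time, so -mtime is the usual proxy:

    find . -type f -daystart -mtime 0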

list any Linux files without users or groups
Suspicious/anomalous ownership may indicate a system breach; this should return no results.
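
The command is not shown here; a sketch using find's -nouser and -nogroup tests, with -xdev keeping the search on one filesystem:

    find / -xdev \( -nouser -o -nogroup \) -print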

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
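
The command is missing from this listing; two common sketches, assuming lsof or the net-tools netstat is available (run as root to see other users' processes):

    lsof -i :80                  # also accepts named ports, e.g. lsof -i :http
    netstat -tlnp | grep ':80 '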

Fastest segmented parallel sync of a remote directory over ssh
Mirror a remote directory using some tricks to maximize network speed.

lftp: the coolest file transfer tool ever
-u: username and password (the password is merely a placeholder if you have ~/.ssh/id_rsa)
-e: execute internal lftp commands
set sftp:connect-program: use a specific command instead of plain ssh

ssh options:
-a -x -T: disable unneeded features
-c arcfour: use the most efficient cipher specification
-o Compression=no: disable compression to save CPU

mirror: copy the remote directory subtree to a local directory
-v: be verbose (a cool progress bar and speed meter, one for each file in parallel)
-c: continue interrupted file transfers if possible
--loop: repeat mirror until no differences are found
--use-pget-n=3: transfer each file with 3 independent parallel TCP connections
-P 2: transfer 2 files in parallel (6 TCP connections in total)

sftp://remotehost:22: use the sftp protocol on port 22 (you can give any other port if appropriate)

You can play with the values of --use-pget-n and/or -P to achieve maximum speed on a particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. It's better to create an alias for the command.
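
The command itself is not reproduced in this listing, but every flag is described above. Reassembling them (user, pwd, remotehost and the directory paths are placeholders) gives something like:

    lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir /local/dir; exit" sftp://remotehost:22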


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes; that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
