Commands tagged xml (28)

  • Diffs two XML files by formatting them first using xmllint and then invoking diff. Usage: diffxml XMLFile1 XMLFile2


    13
    diffxml() { diff -wb <(xmllint --format "$1") <(xmllint --format "$2"); }
    sharfah · 2011-10-06 07:36:13 1
  • This will indent the input to make it more readable. Warnings and messages are not sent to STDOUT, so you can simply use a pipe to create the formatted output file, like: tidy -i -xml in.xml > out.xml


    8
    tidy -i -xml <inputfile>
    Testuser_01 · 2012-11-03 18:10:58 0
  • Limited, but useful construct to extract text embedded in XML tags. This will only work if the whole <foo>bar</foo> element is on one line. If nobody posts an alternative for the multiline sed version, I'll figure it out later...


    4
    sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
    recursiverse · 2009-07-23 07:59:30 0
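    The sed extraction above can be sanity-checked offline by feeding it an inline sample (the input document here is made up):

    ```shell
    # Hypothetical single-line document; sed prints only the captured text
    printf '<root><foo>hello world</foo></root>\n' |
      sed -n 's/.*<foo>\([^<]*\)<\/foo>.*/\1/p'
    # → hello world
    ```

    The -n plus the p flag means nothing is printed for lines that don't contain a complete <foo>…</foo> pair, which is why the multiline case silently fails.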
  • poor man's xml parser :)


    4
    xml2 < file.xml | grep ^/path/to/element | cut -f2- -d=
    bandie91 · 2011-12-19 18:51:17 1
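    xml2 flattens a document into one /path=value line per node, which is what makes the grep|cut pair work. A sketch with simulated xml2 output (the paths and values here are hypothetical; in real use the text would come from xml2 < file.xml):

    ```shell
    # Simulated xml2 output: pick out only the entry titles
    printf '/feed/title=My Feed\n/feed/entry/title=Post 1\n/feed/entry/title=Post 2\n' |
      grep ^/feed/entry/title | cut -f2- -d=
    # → Post 1
    # → Post 2
    ```

    cut -f2- (rather than -f2) matters: it keeps values that themselves contain an = sign.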
  • If everything validates, there's no output. Can be handy to run on a cron job set up to email output.


    2
    find -type f -name "*.xml" -exec xmllint --noout {} \;
    bradbeattie · 2011-01-25 18:26:57 2
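    Since cron mails any non-empty output, wiring the validator above into a crontab gives free email alerts on broken files. A hypothetical crontab fragment (schedule and path are assumptions):

    ```shell
    # m h dom mon dow  command — validate nightly at 02:00; xmllint --noout is silent on success,
    # so cron only sends mail when some file fails to parse
    0 2 * * * find /srv/data -type f -name "*.xml" -exec xmllint --noout {} \;
    ```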

  • 2
    echo '<foo><bar/></foo>' | xmllint --format -
    akavel · 2012-01-12 09:39:56 0
  • The difference between the original version provided and this one is that this one works instead of outputting a wget error.


    2
    curl $1 | grep -E "http.*\.mp3" | sed "s/.*\(http.*\.mp3\).*/\1/" | xargs wget
    theodric · 2015-09-17 13:19:53 21
  • Like `tidy`, `xmllint` can be used to prettify XML files. The --nsclean option is also useful to remove redundant namespaces.


    1
    xmllint --format --xmlout --nsclean <file>
    seb1245 · 2012-11-27 06:13:23 0
  • Neither of the others worked for me. This does.


    1
    curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
    dakira · 2016-05-29 12:07:21 11
  • Formats the output from `ioreg` into XML, then parses the XML with `xmllint`'s xpath feature.


    1
    ioreg -ad2 -c IOPlatformExpertDevice | xmllint --xpath '//key[.="IOPlatformUUID"]/following-sibling::*[1]/text()' -
    n8felton · 2018-08-18 21:19:47 23
  • Directly download all mp3 files of the desired podcast


    1
    curl http://radiofrance-podcast.net/podcast09/rss_14726.xml | grep -Eo "(http|https)://[a-zA-Z0-9./?=_%:-]*mp3" | sort -u | xargs wget
    pascalv · 2021-08-09 13:40:26 79
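    The regex and the sort -u dedup step can be tried without the network by substituting a printf for the curl (the enclosure lines below are fabricated):

    ```shell
    # Two enclosures pointing at the same file; sort -u collapses the duplicate
    printf '<enclosure url="https://cdn.example.org/ep1.mp3"/>\n<enclosure url="https://cdn.example.org/ep1.mp3"/>\n' |
      grep -Eo "(http|https)://[a-zA-Z0-9./?=_%:-]*mp3" | sort -u
    # → https://cdn.example.org/ep1.mp3
    ```

    Note the character class deliberately excludes the quote character, so the match stops at the closing quote of the url attribute.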
  • This one will work a little better: the regular expression is not 100% accurate for XML parsing, but it will suffice for any valid XML document.


    0
    grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' * | sort -u | cut -c2-
    inkel · 2009-08-05 21:54:29 0
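    A quick dry run of the element-name listing, with a pipe standing in for the files (the sample document is invented; the character classes are written [a-zA-Z_][a-zA-Z0-9_:-] so that GNU grep accepts the ranges):

    ```shell
    # Distinct element names, namespaced or not; closing tags are skipped
    # because / is not a valid name-start character
    printf '<a:note id="1"><to>Tove</to><to>Jill</to></a:note>\n' |
      grep -Eo '<[a-zA-Z_][a-zA-Z0-9_:-]*' | sort -u | cut -c2-
    # → a:note
    # → to
    ```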
  • Might be able to do it in fewer steps with xmlstarlet, although whether that would end up being shorter overall I don't know - xmlstarlet syntax confuses the heck out of me. Prompts for your password, or if you're a bit mental you can add your password into the command itself in the format "-u user:password".


    0
    curl -u <username> http://app.boxee.tv/api/get_queue | xml2 | grep /boxeefeed/message/description | awk -F= '{print $2}'
    Strawp · 2010-01-20 16:17:19 6
  • Ever wanted to stream your favorite podcast across the network? Well, now you can. This command will parse the iTunes-enabled podcast feed and stream the latest episode across the network through ssh encryption.


    0
    curl -L -s "$(curl -s http://www.2600.com/oth-broadband.xml | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1)" | ssh -t [user]@[host] "mpg123 -"
    denzuko · 2010-07-30 23:20:50 0
  • Gets the latest podcast show from your favorite podcast. Uses curl and xmlstarlet. Make sure you change out the items between brackets.


    0
    curl -L -s "$(curl -s [http://podcast.com/show.rss] | xmlstarlet sel -t -m "//enclosure[1]" -v "@url" -n | head -n 1)" | ssh -t [user]@[host] "mpg123 -"
    denzuko · 2010-07-31 00:17:47 0
  • This simply curls the feed and runs an XPath query on it...


    0
    atomtitles () { curl --silent "$1" | xmlstarlet sel -N atom="http://www.w3.org/2005/Atom" -t -m /atom:feed/atom:entry -v atom:title -n; }
    Seebi · 2010-12-15 11:03:31 3
  • This function uses xmllint to evaluate xpaths. Usage: xpath /some/xpath XMLfile


    0
    xpath () { xmllint --format --shell "$2" <<< "cat $1" | sed '/^\/ >/d'; }
    sharfah · 2011-10-05 07:45:16 1
  • The XML document can be transformed to text, XML, HTML or anything else. The --stringparam option allows you to set XSL variables externally.


    0
    xsltproc --stringparam name value <xsl_stylesheet> <xml_document>
    seb1245 · 2012-11-09 15:54:46 0
  • OpenDocument documents from OpenOffice.org, LibreOffice and other applications are actually ZIP archives, and the useful information in these archives is in XML format. Like it or not, the XML files have the unfortunate tendency not to be indented, and for good reason: they consist of only one line! To solve the problem and use a proper editor on the content, I proceed as follows. Requires xmlindent. To repack the edited file you can use: zip document.odt content.xml. And it works with vi instead of nano!


    0
    unzip document.odt content.xml && xmlindent -w content.xml && nano content.xml
    arthurdent · 2012-12-01 17:05:28 1
  • Set BLOCK to "title" or any other HTML / RSS / XML tag and curl the URL to get everything in between, e.g. <title>some text</title> yields some text


    0
    curl ${URL} 2>/dev/null|grep "<${BLOCK}>"|sed -e "s/.*<${BLOCK}>\(.*\)<\/${BLOCK}>.*/\1/g"
    c3w · 2013-08-31 14:53:54 0
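    The same pattern can be dry-run by replacing curl with printf, with the angle brackets written literally in the sed expression (the page content below is invented):

    ```shell
    # Extract the <title> of a page without touching the network
    BLOCK=title
    printf '<html><head><title>My Page</title></head></html>\n' |
      grep "<${BLOCK}>" | sed -e "s/.*<${BLOCK}>\(.*\)<\/${BLOCK}>.*/\1/g"
    # → My Page
    ```

    As with the other single-line extractors on this page, this assumes the opening and closing tags sit on the same line.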
  • This script can be used to download enclosed files from an RSS feed. For example, it can be used to download mp3 files from a podcast's RSS feed.


    0
    wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*" | xargs wget -c
    talha131 · 2013-09-24 12:38:08 14
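    The two grep -o stages can be tested without the network by feeding a fabricated enclosure line:

    ```shell
    # Stage 1 keeps the <enclosure …> run up to the closing >; stage 2 keeps the bare URL,
    # stopping at the quote that ends the url attribute
    printf '<enclosure url="http://example.com/ep1.mp3" length="123"/>\n' |
      grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*"
    # → http://example.com/ep1.mp3
    ```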
  • Don't want to open up an editor just to view a bunch of XML files in an easy to read format? Now you can do it from the comfort of your own command line! :-) This creates a new function, xmlpager, which shows an XML file in its entirety, but with the actual content (non-tag text) highlighted. It does this by setting the foreground to color #4 (red) after every tag and resets it before the next tag. (Hint: try `tput bold` as an alternative). I use 'xmlindent' to neatly reflow and indent the text, but, of course, that's optional. If you don't have xmlindent, just replace it with 'cat'. Additionally, this example shows piping into the optional 'less' pager; note the -r option which allows raw escape codes to be passed to the terminal.


    0
    xmlpager() { xmlindent "$@" | awk '{gsub(">",">'`tput setf 4`'"); gsub("<","'`tput sgr0`'<"); print;} END {print "'`tput sgr0`'"}' | less -r; }
    hackerb9 · 2015-07-12 09:22:10 1
  • Just added a little URL encoding with sed - URLs with spaces don't work well. This also works against link tags instead of enclosure, and adds a sample to show that you can filter for links at a certain domain.


    0
    wget -q -O- http://www.yourfeed.com/rss | grep -o "<link[ -~][^>]*" | grep -o "http://www.myfeed.com[ -~][^\"]*" | sed "s: :%20:g" | xargs wget -c
    dermidgen · 2015-10-30 22:13:43 16

  • 0
    wget `curl -s <podcast feed URL> | grep -o 'https*://[^"]*mp3' | head -1`
    tbon3r · 2017-07-16 23:02:03 9
  • For Debian-likes, that's in the python-xml package.


    -1
    xmlproc_parse.python-xml &>/dev/null <FILE> || exit 1
    sputnick · 2009-12-11 17:30:03 1