Commands tagged jar (3)

  • This command finds which of your zip (or jar) files contains a file you're searching for. It's useful when you have many zip (or jar) archives and need to know which one holds a particular file, most commonly which .jar contains the Java class you need. To search jar files instead of zip files, change "zip" to "jar" in the find expression. Replace [internal file name] with the name of the file you expect to be archived inside one of the zip/jar files. Run the command from the directory that contains the zip or jar files (a jar-specific variant is sketched below this entry).


    2
    find . -iname '*.zip' | while read file; do unzip -l "$file" | grep -q [internal file name] && echo $file; done
    ricardofunke · 2012-03-23 18:08:35 11
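    A minimal jar-specific sketch of the same approach, following the "change zip to jar" note above; `MyClass.class` is a placeholder for the file you're looking for, not part of the original entry.

    # List each jar's contents and print the jar's path if it contains the placeholder class file
    find . -iname '*.jar' | while read file; do unzip -l "$file" | grep -q 'MyClass.class' && echo "$file"; done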

  • 0
    find <directory> -print -iname "*.jar" -exec jar -ftv '{}' \;|grep -E "jar|<classname>"
    vivek_saini07 · 2014-11-22 20:17:38 8
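    A possible refinement, sketched here as a guess at the intent rather than the author's original: moving -print after -iname so only .jar paths are echoed, and anchoring the grep pattern. <directory> and <classname> remain placeholders to fill in.

    # Print each jar's path, list its contents, and keep only the path lines plus matching class entries
    find <directory> -iname "*.jar" -print -exec jar -tvf '{}' \; | grep -E '\.jar$|<classname>'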
  • Plain old `unzip` can't extract an archive piped to it on standard input, because the ZIP file format puts its directory (index) at the end of the archive. That directory records where, within the archive, each file is located, which allows quick random access without reading the whole archive. This poses a problem when reading a ZIP archive through a pipe: the index isn't reached until the very end, so individual members can't be located until the entire stream has been read and is no longer available. Unsurprisingly, most ZIP decompressors simply fail when the archive is supplied through a pipe.

    The directory at the end of the archive is not the only place where file metadata is stored, though. Each entry also carries a local file header with the same information, for redundancy. From the `jar` manpage:

    > The jar command is a general-purpose archiving and compression tool, based on ZIP and the ZLIB compression format.

    jar is smart enough to fall back on these local file headers when the index is unavailable because the archive is being read through a pipe. (Most of this explanation is taken from https://serverfault.com/a/589528/314226 , which recommends `bsdtar` instead, but bsdtar is not always available; see the sketch after this entry.)


    0
    cat foo.zip | jar xv
    bbbco · 2019-01-14 22:08:19 36
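    A hedged sketch of the same trick applied to a download that never touches disk; the URL is a placeholder, and the bsdtar line applies only where bsdtar is installed.

    # Stream a remote zip straight into jar, which extracts using the per-entry local file headers
    curl -sL https://example.com/archive.zip | jar xv
    # Equivalent with bsdtar, as the linked serverfault answer suggests
    curl -sL https://example.com/archive.zip | bsdtar -xvf -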

