unjar () { mkdir -p "/tmp/unjar/$1" ; unzip -d "/tmp/unjar/$1" "$1" "*.class" 1>/dev/null && find "/tmp/unjar/$1" -name "*.class" -type f | xargs jad -ff -nl -nonlb -o -p -pi99 -space -stat ; rm -r "/tmp/unjar/$1" ; }

Decompiler for jar files using jad.


0
By: return13
2011-01-18 16:54:42
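
For readability, here is the same pipeline written out as a commented function. This is only a sketch of the one-liner above: it assumes jad is on your PATH, and myapp.jar in the usage line is just a placeholder name.

    unjar () {
        # unpack only the .class files into a scratch directory named after the jar,
        # then decompile each one with jad using the same flags as the one-liner
        mkdir -p "/tmp/unjar/$1"
        unzip -d "/tmp/unjar/$1" "$1" "*.class" 1>/dev/null &&
            find "/tmp/unjar/$1" -name "*.class" -type f |
                xargs jad -ff -nl -nonlb -o -p -pi99 -space -stat
        # remove the scratch directory afterwards
        rm -r "/tmp/unjar/$1"
    }

    # usage (placeholder jar name):
    unjar myapp.jar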

These Might Interest You

  • There are many ways to do a batch search-and-replace; a for loop with sed is my favourite, because both are basic tools in every *nix environment. sed does not by default write its output back to the same file, so each result goes to a temporary file and is then moved over the original with mv. A whitespace-safe variant is sketched after this list.


    -3
    for files in $(ls -A directory_name); do sed 's/search/replaced/g' "directory_name/$files" > "directory_name/$files.new" && mv "directory_name/$files.new" "directory_name/$files"; done
    bassu · 2009-05-07 20:13:07 6
  • Rather than typing out all 10 file names, you can let brace expansion generate them for you. This is useful for backup files, numbered files, or any files that follow a repeating pattern, and it gives more control than 'rm file*' when you want to keep other matching files around. A quick way to preview the expansion is sketched after this list.


    3
    rm file{1..10}
    atoponce · 2009-03-02 14:42:05 2
  • A quick find command that locates every TAR file under a given path, lists the files contained in each archive, and searches that listing for a given string. The output is each TAR file name found (enclosed in []) followed by any matching entries inside that archive. TAR can easily be swapped for JAR if required.


    1
    find . -type f -name "*.tar" -printf "[%f]\n" -exec tar -tf {} \; | grep -iE "[\[]|<filename>"
    andrewtayloruk · 2011-01-06 13:01:38 0
  • ./* copies the regular files; the ./ prefix keeps names that start with - from being treated as options. .[!.]* copies hidden files while avoiding the . and .. entries of the parent directory. ..?* copies files whose names start with .. (and still avoids the .. directory itself). /path/to/dir is the destination directory for the copies. The line can also be used as a script, with /path/to/dir as its input argument. In tcsh, replace .[!.]* with .[^.]*.


    1
    cp ./* .[!.]* ..?* /path/to/dir
    ako · 2009-03-16 13:27:36 0
  • Thanks to GREP_COLOR, the output highlights the first 4 digits of each size. If all the files are only a few MB, this gives a quick overview of how many powers of 10 larger than 1 MB they really are: a logarithmic scale. The same trick works for files over 1 GB if you replace the "4" with a "7"; I usually use "5" when deciding manually which files to delete. The gigabyte variant is sketched after this list.


    0
    du -sc .[!.]* * | grep -E '^[0-9]{4}'
    gander · 2015-04-24 10:51:13 1
  • Take a folder full of files and split it into smaller folders containing at most a fixed number of files; in this case, 100 files per directory. find creates the list of files, xargs breaks the list into groups of 100, and for each group a new numbered directory is created and the files are copied into it. Note: this command won't work if there is whitespace in the filenames (but then again, neither do the alternative commands :-). A whitespace-safe variant is sketched after this list.


    -1
    find . -type f | xargs -n100 | while read l; do mkdir $((++f)); cp $l $f; done
    flatcap · 2011-02-15 23:15:16 1
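
A whitespace-safe take on the sed batch replace above. This is only an illustration: directory_name, search, and replaced are the same placeholders as in that entry, and files are overwritten in place.

    # loop over the directory's entries with a glob instead of parsing ls,
    # and quote every expansion so names with spaces survive
    for f in directory_name/* directory_name/.[!.]*; do
        [ -f "$f" ] || continue                      # skip subdirectories and unmatched globs
        sed 's/search/replaced/g' "$f" > "$f.new" && mv "$f.new" "$f"
    done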
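
Because the shell performs brace expansion before rm ever runs, you can preview exactly which names will be generated by echoing them first. The config.txt name below is just a placeholder.

    # echo shows exactly which names would be passed to rm
    echo file{1..10}
    # -> file1 file2 file3 file4 file5 file6 file7 file8 file9 file10

    # the same mechanism handles suffix pairs, e.g. making a backup copy
    cp config.txt{,.bak}    # expands to: cp config.txt config.txt.bak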
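
The gigabyte variant of the du digit-count trick mentioned above, as a sketch assuming GNU du's default 1 KB block size and GNU grep's -E interval syntax.

    # sizes are reported in 1 KB blocks, so 4+ leading digits means roughly 1 MB and up
    du -sc .[!.]* * | grep -E '^[0-9]{4}'

    # 7+ leading digits singles out entries of roughly 1 GB and up
    du -sc .[!.]* * | grep -E '^[0-9]{7}'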
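
The whitespace caveat in the last entry can be worked around with NUL-delimited names. A rough sketch, assuming bash and a find that supports -print0; the -maxdepth 1 restriction is an assumption about the intent (it also keeps find from descending into the newly created numbered directories).

    #!/bin/bash
    # NUL-delimit the file list so names containing spaces or newlines survive
    n=0
    find . -maxdepth 1 -type f -print0 |
    while IFS= read -r -d '' file; do
        if (( n % 100 == 0 )); then        # start a new directory every 100 files
            dir=$(( n / 100 + 1 ))
            mkdir -p "$dir"
        fi
        cp -- "$file" "$dir/"
        n=$(( n + 1 ))
    done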

What Others Think

But jad is closed-source.
kaedenn · 382 weeks and 5 days ago
Yeah, but it's the best decompiler I know - or do you know a better one?
return13 · 382 weeks and 4 days ago
