Check These Out
This command runs fine on my Ubuntu machine, but on Red Hat I had to change the awk command to `awk '{print $10}'`.
tar doesn't support wildcards for unpacking (so you can't use `tar -xf *.tar`), and this is shorter and simpler than
`for i in *.tar; do tar -xf $i; done` (or even `for i in *.tar; tar -xf $i` in zsh).
The `-i` flag tells tar not to stop after the first EOF marker, so it keeps reading through all the concatenated archives.
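The idiom described above can be sketched like this (assuming GNU tar, where `-i` is `--ignore-zeros`):

```shell
# Concatenate all the archives on stdin and extract them in one tar run;
# -i makes tar skip the zeroed EOF blocks between the individual archives.
cat *.tar | tar -xf - -i
```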
I want to count how many regexes I have used in Vim over a long period,
so I made a directory on an SVN host and post a record to that directory.
Of course I don't want to post manually, so I wrote a script to do it,
and this is the core of it.
Using this command you can track the moment when a USB device was attached.
This command utilizes 'pv' to show dd's progress.
Notes on use with dd:
-- dd's block size (bs=...) is a widely debated command-line switch; it should usually be between 1024 and 4096. You won't see much performance improvement beyond 4096, but regardless of the block size, dd will transfer every bit of data.
-- pv's '-s' switch should be as close to the size of the data source as possible.
-- dd's output file ('of=...') can be named anything, since the data within it is the same regardless of the filename or extension.
This tells you whether the LHC has destroyed the world. Run it in a loop to monitor the state of the Earth. It might not work reliably if the world has actually been destroyed.
It only encodes non-basic-ASCII characters, as they are the only ones not reliably read in UTF-8 and ISO-8859-1 (Latin-1).
It converts all
* C3 X (some Latin symbols, like the ASCII-extended ones)
* C2 X (some punctuation symbols, like the inverted exclamation mark)
...UTF-8 double-byte symbols to the escaped form that every parser understands when forming URLs. I didn't encode spaces and the rest of the basic punctuation, but supposedly space and the others are coded as \x20, for example, in UTF-8, Latin-1, and Windows-cp1252, so they are read correctly.
Please feel free to correct this; the application for which I designed the function works as expected under my assumptions.
Note: I specify w=999 because I didn't find a flag for an unlimited value.
I just assume it is very improbable that a URL surpasses the de facto 255 characters (* 3 bytes max) = 765 bytes.
Reads stdin and outputs each line only once, without sorting ahead of time. This does use more memory than your system's sort utility.
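The classic one-liner with exactly this behavior is awk's seen-array idiom (likely what is being described): it prints a line only the first time it appears, keeping a copy of every distinct line in memory:

```shell
# !seen[$0]++ is true only on a line's first occurrence, so duplicates
# are dropped while the original input order is preserved.
printf 'a\nb\na\nc\nb\n' | awk '!seen[$0]++'
# → a
#   b
#   c
```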
Replace 'csv_file.csv' with your filename.
Best to try it first with the -n flag, to preview the changes.