This is a read-only archive. Find the latest Linux articles, documentation, and answers at the new Linux.com!

Linux.com

Feature: Shell & CLI

CLI Magic: For geek cred, try these one-liners

By Sergio Gonzalez Duran on July 23, 2008 (9:00:00 AM)


In this context, a one-liner is a set of commands normally joined through a pipe (|). When joined by a pipe, the command on the left passes its output to the command on the right. Simple or complex, you can get useful results from a single line at the bash command prompt.

For example, suppose you want to know how many files are in the current directory. You can run:

ls | wc -l
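A quick way to sanity-check this is with a throwaway directory holding a known number of files (the names here are made up for illustration):

```shell
# Throwaway directory with exactly three files.
mkdir -p count_demo
touch count_demo/a count_demo/b count_demo/c
# When ls writes to a pipe it prints one name per line, so wc -l counts files.
ls count_demo | wc -l
```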

That's a very simple example -- you can get more elaborate. Suppose you want to know about the five processes that are consuming the most CPU time on your system:

ps -eo user,pcpu,pid,cmd | sort -r -k2 | head -6

The ps command's -o option lets you specify the columns that you want shown. sort -r does a reverse-order sort with the second column (pcpu) as the key (-k2). head keeps only the first six lines of the sorted list, which includes the header line. You could instead place pcpu as the first column and omit the -k2 option, because sort uses the first column by default. That illustrates how you may have to try several approaches on some one-liners; different versions of the tools and different combinations of options may produce different results.
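Since pcpu is a number, adding -n makes sort compare numerically rather than character by character. A self-contained sketch with fabricated ps-style data (the users and PIDs are invented) shows why that matters:

```shell
# Fabricated ps-style data: user, pcpu, pid, cmd.
printf 'alice 2.5 101 vim\nbob 10.1 102 cc1\ncarol 0.3 103 sh\n' > procs.txt
# Lexicographically "10.1" sorts before "2.5"; -n compares the numbers instead,
# so the heaviest CPU consumer comes out on top.
sort -nr -k2 procs.txt | head -1
```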

A common situation for Linux administrators on servers with several users is to get quick ordered user lists. One simple way to get that is with the command:

cat /etc/passwd | sort

If you just need the username, the above command returns too much information. You can fix it with something like this:

cat /etc/passwd | sort | cut -d":" -f1

The sorted list is passed to cut, whose -d option specifies the field delimiter character. cut breaks each line into fields, and -f1 selects the first field, which is the one you need to display. That's better; it shows only usernames now. But you may not want to see all the system usernames, like apache, bin, and lp. If you just want human users, try this:

cat /etc/passwd | sort | gawk '$3 >= 500 {print $1 }' FS=":"

gawk evaluates each line from the output piped to it. If the third field -- the UID -- is greater than or equal to 500 (most modern distros start numbering normal users at 500 or 1000), then the action is performed. The action, indicated between braces, is to print the first field, which is the username. The field separator for the gawk command is a colon, as specified by the FS variable.
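Both the cut and the awk steps can be tried on a small, made-up sample in /etc/passwd format (using POSIX awk here, which accepts the same program; -F: is shorthand for FS=":"):

```shell
# Hypothetical two-entry sample in /etc/passwd format.
printf 'root:x:0:0:root:/root:/bin/bash\nalice:x:500:500::/home/alice:/bin/bash\n' > passwd.sample
# Sorted usernames only:
sort passwd.sample | cut -d":" -f1
# Human users only: keep lines whose third field (the UID) is 500 or more.
awk -F: '$3 >= 500 {print $1}' passwd.sample
```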

Now suppose you have a directory with lots of files with different extensions, and you want to back up only the .php files, calling them filename.bkp. The next one-liner should do the job:

for f in *.php; do cp "$f" "$f.bkp"; done

This command loops over all the files in the current directory that have a .php extension. Each file's name is held in the $f variable, and a simple copy command then makes the backup. Notice that in this example we used semicolons to execute the commands one after another, rather than piping output between them.
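A self-contained sketch of the loop, with invented file names; quoting "$f" keeps the loop safe for names that contain spaces:

```shell
mkdir -p phpdemo
echo '<?php ?>' > phpdemo/index.php
echo '<?php ?>' > 'phpdemo/my page.php'    # a name with a space
# cp -- guards against names that begin with a dash; quoting handles spaces.
for f in phpdemo/*.php; do cp -- "$f" "$f.bkp"; done
ls phpdemo
```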

What about bulk copy? Consider this:

tar cf - . | (cd /usr/backups/; tar xfp -)

It creates a tar archive of the current directory, recursively, then pipes this archive to the next command. The parentheses create a temporary subshell, which changes to a different directory and then extracts the contents of the archive -- the whole original directory. The p option on the second tar command preserves file properties such as timestamps and permissions. After completion, the shell is still in the original directory.
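The same pattern can be exercised end to end with throwaway directories (srcdir and dstdir are invented for this sketch):

```shell
mkdir -p srcdir dstdir
echo hello > srcdir/file.txt
# The cd happens inside subshells, so the caller's directory is unchanged.
(cd srcdir && tar cf - .) | (cd dstdir && tar xfp -)
cat dstdir/file.txt
```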

A variant on the previous one-liner lets you do the same kind of backup on a remote server:

tar cf - . | ssh smith@remote.server tar xfp - -C /usr/backup/smith

Here, the command opens an SSH session on the remote host and untars the archive there; the -C option changes to the given directory, in this case /usr/backup/smith, before the extraction is made.
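Trying the SSH variant needs a remote server, but the effect of -C can be seen locally with two scratch directories (the names here and there are made up):

```shell
mkdir -p here there
echo data > here/notes.txt
# -C switches directory before archiving (first tar) or extracting (second).
tar cf - -C here . | tar xf - -C there
cat there/notes.txt
```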

grep and gawk and uniq, oh my!

Text processing is a common use for one-liners. You can accomplish marvelous things with the right set of commands. In the next example, suppose you want a report on incoming email messages that look like this:

cat incoming_email
2008-07-01 08:23:17 user1@example.com
2008-07-01 08:25:20 user2@someplace.com
2008-07-01 08:32:41 somebody@server.net
2008-07-01 08:35:03 spam not received, filtered
2008-07-01 08:39:57 user1@example.com
...

You are asked for a report with an ordered list of who received incoming messages. Many recipients would be repeated in the output of the cat command. This one-liner resolves the problem:

grep '@' incoming_email | gawk '{print $3}' | sort | uniq

grep filters for the lines that contain a @ character, which indicates an email address. Next, gawk extracts the third field, which contains the email address, and passes it to the sort command. Sorting is needed to group identical recipients together, because the last command, uniq, omits repeated adjacent lines from the sorted list. The output is shown below. Most text-processing one-liners use a combination of grep, sed, awk, sort, tr, cut, uniq, and other related commands.

somebody@server.net
user1@example.com
user2@someplace.com
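The whole pipeline can be reproduced with a small sample file (POSIX awk stands in for gawk here; the addresses are the made-up ones from the log above):

```shell
cat > incoming_sample <<'EOF'
2008-07-01 08:23:17 user1@example.com
2008-07-01 08:25:20 user2@someplace.com
2008-07-01 08:32:41 somebody@server.net
2008-07-01 08:35:03 spam not received, filtered
2008-07-01 08:39:57 user1@example.com
EOF
# grep drops the line without an @; sort groups duplicates; uniq drops them.
grep '@' incoming_sample | awk '{print $3}' | sort | uniq
```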

If you like any of these one-liners but think they're too long to type often, you can create an alias for the command and put it in your .bashrc file. That file is read every time you log in to a session, so your personal aliases will be ready at any time.

alias p5="ps -eo pcpu,user,pid,cmd | sort -r | head -6"

You can certainly create better and simpler variations of all of the commands in this article, but they're a good place to start. If you are a Linux system administrator, it's good practice to collect, create, and modify your own one-liners and keep them handy; you never know when you are going to need them. If you have a good one-liner, feel free to share it with other readers in a comment below.

Sergio Gonzalez Duran is a Linux administrator, systems developer, and network security counselor who also teaches Linux courses and publishes the Spanish-oriented Linux and open source Web site linuxtotal.com.mx.


Comments

on CLI Magic: For geek cred, try these one-liners

Note: Comments are owned by the poster. We are not responsible for their content.

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: unknown] on July 23, 2008 09:26 AM
"cat file | command" is mostly useless.
Use either "command < file" or "command file"

See also here: http://partmaps.org/era/unix/award.html

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 203.166.139.196] on July 24, 2008 05:56 AM
one place it might be useful .. suppose you want to grep a file with many patterns
"cat file | grep pattern" helps you go back and edit the pattern more quickly

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 82.9.199.69] on July 23, 2008 12:08 PM
That's a brilliant set of examples -- very handy for a relative novice like me in Linux!

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 195.241.240.72] on July 23, 2008 12:23 PM
In case the ps example does not work correctly, it might help to add the option '-n' to the sort command. The -n tells sort that the sorting key is a number. Thus:

ps -eo user,pcpu,pid,cmd | sort -n -r -k2 | head -6

#

CLI Magic: For geek cred, try these one-liners

Posted by: klo on July 23, 2008 01:04 PM
Oouh, uniq - Had I known that a week ago, would have saved me quite a bit of time... Thanks!

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 12.47.208.34] on July 23, 2008 02:15 PM
you can also do 'sort -u' which removes duplicate lines

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 195.135.221.2] on July 23, 2008 02:09 PM
Instead of using sort | uniq, you could also just use sort with sort -u -
Instead of using gawk, you could use cut in this case (which is way smaller):

grep '@' incoming_email | cut -d ' ' -f 3 | sort -u

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 12.47.208.34] on July 23, 2008 02:24 PM
One of my favorite commands is xargs. It takes multi-line output from one command and converts each line to arguments for another command. For example to use grep and find together:
> find . -name "*.txt" | xargs grep foo
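A self-contained sketch of the same pattern, with made-up file names (grep -l prints only the names of files that match):

```shell
mkdir -p xargsdemo
echo foo > xargsdemo/a.txt
echo bar > xargsdemo/b.txt
# find emits one path per line; xargs turns those lines into grep arguments.
find xargsdemo -name '*.txt' | xargs grep -l foo
```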

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 206.218.218.57] on July 23, 2008 03:02 PM
awk '/@/ {print $3}' incoming_email | sort -u

Less typing, fewer processes.

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 97.88.198.245] on July 23, 2008 03:01 PM
Sort will take a file name so "cat /etc/passwd | sort" can be replaced by "sort /etc/passwd".

Also, I tend to push sort as far to the end of the pipeline as possible so that I don't have to sort as much data.
For instance "gawk '$3 >= 500 {print $1 }' FS=":" /etc/passwd | sort"

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 192.91.173.36] on July 23, 2008 03:49 PM
Why use uniq if you need to sort? Just use the -u option of sort instead of piping through another process

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 72.81.217.119] on July 24, 2008 07:30 PM
I often use the -c option of uniq, which sort does not have. This gives a count of the unique lines. For example, to find out who is sending me the most mail, I could run:

cat mail_list | sort | uniq -c | sort -nr | head -n 5

63 foo1@verizon.net
24 support@fbar.com
22 bar1@comcast.net
20 foo2@verizon.net
9 abcxyz123@verizon.net

Later . . . Jim

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 216.241.105.6] on July 23, 2008 03:52 PM
Recently, I had to convert a bunch of image files (137,000 total) from JPG to GIF and reduce them to 8 colors.

This is what I came up with:
for file in $(ls *.jpg) ; do convert -verbose -colors 8 ${file} ${file%.jpg}.gif ; done

8 hours later, it was done.

Next I had to convert a bunch of WAV files to MP3s and set the bit rate to 32. (all in multiple directories).

for DIR in $(ls -1) ; do for FILE in $(ls -1 ${DIR}/*.wav); do lame -b 32 "${FILE}" "${FILE%.wav}.mp3" ; done ; done

Remember loops for anything you have to do more than once.

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 195.135.221.2] on July 24, 2008 12:47 PM
In times when you have systems with several cores, you might want to use make for this task. This could work as follows (not tested, just written):
# cat Makefile
.SUFFIXES:
.SUFFIXES: .gif .jpg

JPGS=$(wildcard *.jpg)
GIFS=$(subst jpg,gif,$(JPGS))

default: $(GIFS)

.jpg.gif:
convert -verbose -colors 8 $< $@

then:
make -j <number of CPUs>
(note that there must be a tabulator before the convert line in order to make this work)

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 67.152.57.254] on July 24, 2008 03:08 PM
Well, this doesn't work if the top level dir has regular files in it :-/

I'd propose using find(1)

find . -name '*.wav' -maxdepth 2 | while read FILE; do lame -b 32 "${FILE}" "${FILE%.wav}.mp3"; done

The use of -maxdepth constrains the find(1) to the current dir and 1 level down.

Using $(ls -1) as an element list to a for loop runs the risk of overflow should there be a large number of files in a given directory.

Use of {xxx%yyy} is a nice construct.

I didn't know lame can convert from wav to mp3 (I've been using audacity for this) -- thanks for the hint.


#

Re(1): CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 10.64.2.143] on July 29, 2008 12:39 PM
"while read" constructions are good, but many of them can be replaced by xargs.

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 144.195.6.10] on July 23, 2008 04:19 PM
I use the following one liners as aliases so they are always available with a short name. These are very good when you are trying to clean up a system.

alias findtar='find ./ -name '*.tar''
alias bigone='du | sort -rn | more'

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 144.195.6.10] on July 23, 2008 04:40 PM
Posting the lines again. The two single quotes on the end of the first line look like a double quote. It should look like this:

alias findtar=' find ./ -name ' *.tar' '
alias bigone=' du | sort -rn | more '

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 66.92.90.34] on July 23, 2008 05:45 PM
One other thing: "cat | anything" is almost always wrong. Most apps will let you do "anything filename", or if that fails you can always use bash redirection: "anything < filename". Use cat for its intended purpose: combining multiple input files into a single output.

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 72.65.79.220] on July 24, 2008 01:51 AM
cat | anything is never wrong; there are many situations in which it is indeed the preferred form, newb

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 89.220.107.198] on July 23, 2008 07:11 PM
Danger Will Robinson!

Watch out with the cp php -> .php.bkp example; if you do this in a place where your web-server (with php-module enabled) can access it you could have a code leak! (.php will be interpreted and .bkp will serve the plain text contents of the file!)

A possible example; http://server/config.php.bkp would disclose sensitive data (a lot of php applications store their database credentials here, among other things...)

Also look out when using editors with a backup function; for example the joe editor leaves its backup files with a trailing ~, backup of config.php will be config.php~ causing the same effect.

-----
news at eleven

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 72.65.79.220] on July 24, 2008 01:55 AM
another gotcha is .* or more likely from windows users *.* as this will match the current directory and the parent directory. Lots of fun seeing an rm command go very wrong.

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 66.151.59.138] on July 23, 2008 08:52 PM
A few things here...
This:
grep '@' incoming_email | gawk '{print $3}' | sort | uniq

Can become:
gawk '/@/{print $3}' incoming_email | sort -u

The trick for showing the top 5 processes is pretty handy. Here are a few of my own in scripts I've written:
http://www.digitalprognosis.com/opensource/scripts/top-open-files
http://www.digitalprognosis.com/opensource/scripts/top-disk-users

These scripts might help someone else

---
Jeff Schroeder
http://www.digitalprognosis.com/blog

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 83.233.180.215] on July 23, 2008 09:53 PM
1993 just called and wanted back the time when "clean" CLI-commands/piping - and people arguing for them - actually mattered.
Jeez, just when you thought Vista was the ultimate resource hog, just wait until you enter that totally unnecessary cat command and watch 3Kb of memory just vaporize. BEWARE n00bs!

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 206.248.171.94] on July 24, 2008 02:03 AM
Oh i love it

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 199.227.154.26] on July 23, 2008 10:48 PM
Sometimes you need uniqueness but also want to preserve order so sorting is out.
In those cases you can use awk.

Arrays in awk can be indexed by strings. $0 is the entire line and $1, $2, etc. are fields within the line.
So...
awk '/@/ && !un[$3]++ {print $3}' incoming_email

The ++ increments the array element indexed by $3, and the test succeeds only while that count is still zero, so only the first occurrence of each address is printed. Of course you can print other fields or combinations of fields as well.
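A runnable sketch of the order-preserving dedup, on a made-up sample (POSIX awk behaves the same as gawk here):

```shell
cat > mail_sample <<'EOF'
2008-07-01 08:23:17 b@example.net
2008-07-01 08:25:20 a@example.com
2008-07-01 08:39:57 b@example.net
EOF
# un[$3]++ is 0 (false) only the first time each address is seen,
# so duplicates are dropped without disturbing the original order.
awk '/@/ && !un[$3]++ {print $3}' mail_sample
```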

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 172.17.100.87] on July 24, 2008 01:44 AM
Wonderful article and great comments! I'm bookmarking this one!

If you ever find the need to erase sensitive data from a linux system try this:

$ find -type f -execdir shred -fuzv -n7 '{}' \; && rm -rf *

Now note that you can leave off the end and erase the empty directories manually. Not trying to get anyone to erase their files!

There is also debate on whether the shred command is even useful on journaling filesystems, since the metadata is still left behind. I'm not going to argue this point since I usually use this when I am erasing old floppies I have lying around (yes, I still use them sometimes).

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 190.55.101.115] on July 24, 2008 03:47 AM
I think the parentheses in 'tar cf - . | (cd /usr/backups/; tar xfp -)' are redundant. From bash(1):
"Each command in a pipeline is executed as a separate process (i.e., in a subshell)."

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: David Tangye on July 24, 2008 05:55 AM
Hmmm, I always thought that the parentheses are not redundant: the subshell is needed so you do not change the current directory of the shell you are in, just the one the output goes to, and so the output of the tar create becomes input to the shell where the tar extract runs, not where the cd statement is executed. If you remove the parentheses you are piping the tar output to a cd command, which is nonsense, and then doing a tar x whose input stream is your terminal.

[Modified by: David Tangye on July 24, 2008 06:07 AM]

#

CLI Magic: For geek cred, try these one-liners

Posted by: jhansonxi on July 24, 2008 04:28 AM
Whenever I hear "one-liner" I think back to my Apple II days and Nibble Magazine. They had a column in each issue that featured BASIC programs that performed some useful function, like a pop-up ASCII chart or game, on one or two line numbers. There were some really wild tricks to typing those in. The magazine is available on-line: http://www.nibblemagazine.net/Nibble_Magazines.htm

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 189.187.7.170] on July 24, 2008 06:45 AM

Traducción al español de este artículo - Spanish translation of this article
http://www.linuxtotal.com.mx/index.php?cont=info_shell_007

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 76.94.213.82] on July 24, 2008 07:56 AM
In my opinion, system administrators need to first understand how the different commands work, how awk/sed/grep etc. work individually, and then take real-life scenarios and try to use them effectively themselves. I have seen admins who just keep a separate file with a lot of one-liners, which they copy-paste and use without understanding what it exactly does, which is a pity.

Ramesh
http://www.thegeekstuff.com

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: unknown] on July 24, 2008 08:00 AM
There's another way of "maintaining" these one-liners, if the shell used is bash. Edit your profile file (or the system-wide '/etc/profile' to do the same for all your users) and set the value of HISTSIZE to something large enough. Then you can type your preferred one-liners and later use the emacs line-editing mode to "recall" them: hit CTRL-R and start typing part of one. The shell will search your history in reverse and offer previous commands. It also allows you to edit a match (e.g. change the extension of the files on which you need to work) before hitting ENTER.

The one-liners will then sit in your .bash_history file, which can be later used to search for less used commands that did something useful for you (when you don't quite remember what the command looked like).

#

CLI Magic: For geek cred, try these one-liners

Posted by: Johannes Truschnigg on July 24, 2008 11:28 AM
I doubt that following the instructions in this article will contribute much to this fellow's geek cred, given the numerous useless-use-of-cat awards it would earn, as many readers before me pointed out ;-)

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 89.8.51.119] on July 24, 2008 02:38 PM
for f in *.php; do cp $f{,.bkp}; done

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 66.195.168.2] on July 24, 2008 04:37 PM
I use this command to find largest file:

find <dir> -type f -print | xargs ls -s | sort -rn | head

largest dir:

du -sk * |sort -n

#

Why use tar?

Posted by: Anonymous [ip: 208.111.221.85] on July 24, 2008 05:51 PM
What is that advantage of: tar cf - . | (cd /usr/backups/; tar xfp -)
over: cp -pr . /usr/backups

#

Re: Why use tar?

Posted by: Anonymous [ip: 217.229.2.195] on July 24, 2008 10:03 PM
I wouldn't see a direct advantage, but you could e.g. use the tar together with netcat to do a fast copy (without scp) over the network:

on the target machine:
netcat -l -p 2345 | tar xf -

on the sending machine:
tar cpf - . | netcat target_ip 2345

Advantage is that you get maximum speed with low CPU utilization. However in a hostile environment you may prefer scp...

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 70.81.193.196] on July 25, 2008 12:51 AM
There are lot of similar examples of stuff like this at http://www.colcommands.com

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 194.237.142.7] on July 28, 2008 05:52 PM
"For example, suppose you want to know how many files are in the current directory. You can run:

ls | wc -l"

You can, but the result would most likely be BS. Hows about:

ls -1 | wc -l

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 75.108.40.110] on July 30, 2008 09:28 AM
On Ubuntu, at least, the ls command defaults to a single column whenever the output is redirected, so

ls | wc -l

and

ls -1 | wc -l

provide the same count every time. I suppose YMMV.

#

CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 129.125.20.211] on July 29, 2008 12:47 PM
What's the proper way to run a one-liner (with pipe or i/o-redirection) through sudo? normally sudo will only "extend" until the first pipe or re-direction symbol and the rest of the command will often not have the proper permissions e.g. to write a file. Obviously, I want to avoid doing a 'su' first or login as root, which is trivial.

#

Re: CLI Magic: For geek cred, try these one-liners

Posted by: Anonymous [ip: 75.108.40.110] on July 30, 2008 09:46 AM
I don't know about "proper", but something like this would work:

sudo bash -c "echo 'test' > test1.txt && echo 'test' > test2.txt"

The -c parameter for bash takes a string and executes it. Since we've sudo'ed the bash, it has root permissions, and therefore, the commands listed in the quoted string will run with root permissions.
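The same pattern can be tried without sudo (the file names t1.txt and t2.txt are just placeholders):

```shell
# The quoted string runs as a single shell command, redirections included.
bash -c "echo test > t1.txt && echo test > t2.txt"
cat t1.txt t2.txt
```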

#

This story has been archived. Comments can no longer be posted.



 