
# Bash Tips and Tricks


The introduction below was adapted from the article "Unix Scripting: Some Traps, Pitfalls and Recommendations" by Marc Dobson.

BASH has some very unintuitive behavior if you source a script without providing a path to it: by default, when the argument of the "source filename" command does not contain a slash (i.e., does not include a path for the file), BASH searches the $PATH environment variable for the filename, and only if it does not find it there does it search the current directory!!!! Furthermore, this happens whether or not the file has the executable bit set. This is counterintuitive to say the least; TCSH, as a counterexample, does NOT do this search. One can disable this behavior with the BASH "shopt" built-in command:

shopt -u sourcepath

Recommendation 1: Create a special bash-related log file or notebook where you write down your findings. The environment is now so complex that you will definitely forget some of your most useful findings if you do not write them down and periodically browse the content. For the same purpose, create and maintain a separate file with aliases (say .aliases) and a file with functions (say .functions), where you record all the best ideas you have found or invented yourself, which might never visit you a second time unless you write them down the first time. Just don't overdo it: too many aliases are as bad as too few, and excessive zeal here is really destructive. But even if you do not use them, revising your .aliases and .functions files is a very useful exercise that refreshes some long-forgotten skills that at one point you used to have ;-)

Recommendation 2: In order to force bash to append lines to the history file on exit, put the line

shopt -s histappend

into your .bash_profile or a similar file (e.g. .profile) that executes for interactive sessions only. Without this option bash's behaviour is simply stupid and is a source of a lot of grief.
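A minimal sketch of such a setup (the HISTSIZE/HISTFILESIZE values and the PROMPT_COMMAND line are my own conventions, not from the article):

```shell
# In ~/.bash_profile or ~/.profile (interactive sessions only):
shopt -s histappend          # append to the history file on exit instead of overwriting it
HISTSIZE=10000               # assumed value: in-memory history size
HISTFILESIZE=20000           # assumed value: on-disk history size
# Optionally flush each command to the history file as soon as it is entered:
PROMPT_COMMAND='history -a'
```

With histappend set, parallel sessions no longer clobber each other's history on exit; the history -a in PROMPT_COMMAND makes commands visible to newly opened sessions immediately.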
If you use multiple shell sessions, write the history manually using the command history -a, which appends the current session's new lines to the history file. (history -w also writes the history, but it overwrites the file with the current session's list instead of appending, so -a is usually what you want for preserving history across sessions.)

Recommendation 3: The ls command has the option -h which, as in df, produces "human readable" sizes. So the most famous shell alias

alias ll='ls -la'

might better be written as

alias ll='ls -halF'

Recommendation 4: If you prefer a light background for your terminal, you are generally screwed: it is very difficult to select proper colors for a light background. The default colors work well on a black or dark blue background, but that's it. For a light background you need to limit yourself to three basic colors (black, red and blue) and forget about all the other nuances. Actually they do not matter much anyway; too many colors is just another sign of the overcomplexity of the Linux environment, as people simply stop paying attention to them. To disable or simplify the color scheme, create your own DIR_COLORS file or use the option --nocolor.

Recommendation 5: When sourcing a script, always use a path name for the file, or at least the "./" prefix. By default Bash searches PATH first for unqualified names. You can disable this behaviour with

shopt -u sourcepath

but if you work on multiple boxes where you are not the primary administrator, you can't just put this option into /etc/bashrc. Sourcing a script from the wrong directory can lead to disasters/horror stories, especially if you are working as root.

Recommendation 6: Always choose a unique script name. There is nothing wrong with long names if they help to prevent a SNAFU. Unique script names can easily be obtained by prefixing the script name with the project name (e.g. gpfs_setup). Bad generic names, where multiple scripts with the same name might exist in multiple directories, are, for example, setup, configure, install, etc.
Recommendation 7: While this page is about clever, ingenious tips and tricks, you should never try to be too clever or too bold. Always play safe and test your commands, such as find with the -exec option, by printing the set of files they operate on before applying them to a production server (especially if it is a remote server). System administration is a very conservative profession, and the absence of SNAFU is more important than a demonstration of excessive cleverness or boldness.

### Sourcing versus Executing

In a sourced file, an EXIT command terminates the whole script into which it was sourced (the shell that invoked the script), not just the sourced sub-script where the EXIT command was issued. In contrast, a standalone script is executed in a subshell, and in that case the EXIT command exits the shell/interpreter which was started to execute the "main" script: the executed script file simply stops and returns to the shell which called it. As an example take the following two scripts. Script 1 is:

#!/bin/bash

echo "Executing script2"
./script2
if [ $? -eq 0 ]; then
echo "Executing ls in /tmp/md"
ls -l /tmp/md
else
echo "Exiting"
exit 1
fi

And script 2 is:

#!/bin/bash

echo "In script 2"
if [ -e "/tmp/md" ]; then
echo "/tmp/md exists"
else
echo "/tmp/md does not exist"
exit 1
fi

Both scripts should have the execute bit set. Start a BASH shell by typing bash, and at the next prompt execute script 1. The following output is produced:

If the directory /tmp/md exists:

Executing script2
In script 2
/tmp/md exists
Executing ls in /tmp/md
total 0

If the directory /tmp/md does not exist:

Executing script2
In script 2
/tmp/md does not exist
Exiting

Now change script 1 to source script 2 instead of executing it (source ./script2 instead of ./script2). When script 1 is executed, the following output is produced:

If the directory /tmp/md exists:

Executing script2
In script 2
/tmp/md exists
Executing ls in /tmp/md
total 0

If the directory /tmp/md does not exist:

Executing script2
In script 2
/tmp/md does not exist

If the directory /tmp/md exists then the output is the same and exactly the same commands were executed. If, however, the directory /tmp/md does not exist, then script 2 hits its EXIT and, as it was sourced from script 1, it is actually script 1 which exits, without the desired effect, i.e. printing "Exiting". In this case it is not very important, but it could have very profound consequences with complex scripts.

The ambiguity in this case is compounded by the difference in coding between the two branches of the IF statement of script 2. When the directory exists, the EXIT command is implicit (the script runs to the end and exits normally), whereas when the directory does not exist, the EXIT command is explicit (this is the one which causes the exit from script 1).

If the programmer wishes to exit from a sourced script file (as would be done with the EXIT command in an executed script), this can be achieved with:

return [n]
where n is the return value, which can be tested in the script/shell which sourced the file (just as with the exit status of an EXIT command). Beware, though, that the RETURN command is also used to exit a function, so make sure that the RETURN command is placed in the appropriate place for the desired effect.
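A tiny sketch of the difference, using a throwaway file (/tmp/checks.sh and the tested path are made-up names for illustration):

```shell
# A sourced fragment that signals failure with return, not exit:
cat > /tmp/checks.sh <<'EOF'
if [ ! -e /nonexistent_path_for_demo ]; then
    return 3    # ends only the sourced file; the calling shell survives
fi
EOF

. /tmp/checks.sh
echo "sourced file returned $?"   # prints: sourced file returned 3
```

Had the fragment used exit 3 instead of return 3, the echo line would never run, because the calling shell itself would have exited.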

### Do not use sourcing as a poor man's subroutine

If the same functionality (i.e. the same set of commands) needs to be executed multiple times, it is better to use shell functions or standalone scripts than to source the same fragment multiple times. If you do use such "multiple sourcing" as a poor man's subroutine, always print a banner to remind yourself what is happening:

#!/bin/bash

echo "We have been executed"
echo "Sourcing the external commands from the file /root/bin/standard_gpfs_setup_actions..."
. /root/bin/standard_gpfs_setup_actions
echo "Exiting"

If the set of commands is written as a file that needs to be sourced, use the full path, or at least specify the "./" (dot-slash) prefix if it resides in the current directory. Never use "naked", non-qualified names. For example:

. ./sourced_script



## Old News ;-)

#### [Mar 13, 2017] 6.3 Arrays

###### Mar 13, 2017 | docstore.mik.ua

So far we have seen two types of variables: character strings and integers. The third type of variable the Korn shell supports is an array. As you may know, an array is like a list of things; you can refer to specific elements in an array with integer indices, so that a[i] refers to the ith element of array a.

The Korn shell provides an array facility that, while useful, is much more limited than analogous features in conventional programming languages. In particular, arrays can be only one-dimensional (i.e., no arrays of arrays), and they are limited to 1024 elements. Indices start at 0.

There are two ways to assign values to elements of an array. The first is the most intuitive: you can use the standard shell variable assignment syntax with the array index in brackets ([]). For example:

nicknames[2]=bob
nicknames[3]=ed


puts the values bob and ed into the elements of the array nicknames with indices 2 and 3, respectively. As with regular shell variables, values assigned to array elements are treated as character strings unless the assignment is preceded by let .

The second way to assign values to an array is with a variant of the set statement, which we saw in Chapter 3, Customizing Your Environment. The statement:

set -A aname val1 val2 val3 ...

creates the array aname (if it doesn't already exist) and assigns val1 to aname[0], val2 to aname[1], etc. As you would guess, this is more convenient for loading up an array with an initial set of values.
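set -A is ksh-specific syntax; as a point of comparison (an addition of mine, not from the book), bash spells the same bulk assignment with parentheses:

```shell
# ksh:  set -A nicknames bob ed
# bash equivalent of the same bulk assignment:
nicknames=(bob ed)
echo "${nicknames[0]}"   # bob
echo "${nicknames[1]}"   # ed
```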

To extract a value from an array, use the syntax ${aname[i]}. For example, ${nicknames[2]} has the value "bob". The index i can be an arithmetic expression (see above). If you use * in place of the index, the value will be all elements, separated by spaces. Omitting the index is the same as specifying index 0.

Now we come to the somewhat unusual aspect of Korn shell arrays. Assume that the only values assigned to nicknames are the two we saw above. If you type print "${nicknames[*]}", you will see the output:

bob ed

In other words, nicknames[0] and nicknames[1] don't exist. Furthermore, if you were to type:

nicknames[9]=pete
nicknames[31]=ralph

and then type print "${nicknames[*]}", the output would look like this:

bob ed pete ralph


This is why we said "the elements of nicknames with indices 2 and 3" earlier, instead of "the 2nd and 3rd elements of nicknames ". Any array elements with unassigned values just don't exist; if you try to access their values, you will get null strings.
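The sparse behavior described above is easy to reproduce (echo is substituted for ksh's print here so the sketch also runs under bash):

```shell
nicknames[2]=bob
nicknames[3]=ed
nicknames[9]=pete
nicknames[31]=ralph
echo "${nicknames[*]}"     # bob ed pete ralph  (only assigned elements appear)
echo "${#nicknames[*]}"    # 4
echo "x${nicknames[0]}x"   # xx  (unassigned elements read as null strings)
```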

You can preserve whatever whitespace you put in your array elements by using "${aname[@]}" (with the double quotes) instead of ${aname[*]}, just as you can with "$@" instead of $*.

The shell provides an operator that tells you how many elements an array has defined: ${#aname[*]}. Thus ${#nicknames[*]} has the value 4. Note that you need the [*] because the name of the array alone is interpreted as the 0th element. This means, for example, that ${#nicknames} equals the length of nicknames[0] (see Chapter 4). Since nicknames[0] doesn't exist, the value of ${#nicknames} is 0, the length of the null string.

To be quite frank, we feel that the Korn shell's array facility is of little use to shell programmers. This is partially because it is so limited, but mainly because shell programming tasks are much more often oriented toward character strings and text than toward numbers. If you think of an array as a mapping from integers to values (i.e., put in a number, get out a value), then you can see why arrays are "number-dominated" data structures.

Nevertheless, we can find useful things to do with arrays. For example, here is a cleaner solution to Task 5-4, in which a user can select his or her terminal type ( TERM environment variable) at login time. Recall that the "user-friendly" version of this code used select and a case statement:

print 'Select your terminal type:'
PS3='terminal? '
select term in \
'Givalt GL35a' \
'Tsoris T-2000' \
'Shande 531' \
'Vey VT99'
do
case $REPLY in
1 ) TERM=gl35a ;;
2 ) TERM=t2000 ;;
3 ) TERM=s531 ;;
4 ) TERM=vt99 ;;
* ) print "invalid." ;;
esac
if [[ -n $term ]]; then
print "TERM is $TERM"
break
fi
done

We can eliminate the entire case construct by taking advantage of the fact that the select construct stores the user's number choice in the variable REPLY. We just need a line of code that stores all of the possibilities for TERM in an array, in an order that corresponds to the items in the select menu. Then we can use $REPLY to index the array. The resulting code is:

set -A termnames gl35a t2000 s531 vt99
PS3='terminal? '
select term in \
'Givalt GL35a' \
'Tsoris T-2000' \
'Shande 531' \
'Vey VT99'
do
if [[ -n $term ]]; then
TERM=${termnames[REPLY-1]}
print "TERM is $TERM"
break
fi
done

This code sets up the array termnames so that ${termnames[0]} is "gl35a", ${termnames[1]} is "t2000", etc. The line TERM=${termnames[REPLY-1]} essentially replaces the entire case construct by using REPLY to index the array.

Notice that the shell knows to interpret the text in an array index as an arithmetic expression, as if it were enclosed in (( and )), which in turn means that a variable need not be preceded by a dollar sign ($). We have to subtract 1 from the value of REPLY because array indices start at 0, while select menu item numbers start at 1.

6.3.1 typeset

The final Korn shell feature that relates to the kinds of values that variables can hold is the typeset command. If you are a programmer, you might guess that typeset is used to specify the type of a variable (integer, string, etc.); you'd be partially right. typeset is a rather ad hoc collection of things that you can do to variables that restrict the kinds of values they can take. Operations are specified by options to typeset; the basic syntax is:

typeset -o varname[=value]

Options can be combined; multiple varnames can be used. If you leave out varname, the shell prints a list of variables for which the given option is turned on. The options available break down into two basic categories:

1. String formatting operations, such as right- and left-justification, truncation, and letter case control.

2. Type and attribute functions that are of primary interest to advanced programmers.

6.3.2 Local Variables in Functions

typeset without options has an important meaning: if a typeset statement is inside a function definition, then the variables involved all become local to that function (in addition to any properties they may take on as a result of typeset options). The ability to define variables that are local to "subprogram" units (procedures, functions, subroutines, etc.) is necessary for writing large programs, because it helps keep subprograms independent of the main program and of each other. If you just want to declare a variable local to a function, use typeset without any options. For example:

function afunc {
typeset diffvar
samevar=funcvalue
diffvar=funcvalue
print "samevar is $samevar"
print "diffvar is $diffvar"
}

samevar=globvalue
diffvar=globvalue
print "samevar is $samevar"
print "diffvar is $diffvar"
afunc
print "samevar is $samevar"
print "diffvar is $diffvar"

This code will print the following:

samevar is globvalue
diffvar is globvalue
samevar is funcvalue
diffvar is funcvalue
samevar is funcvalue
diffvar is globvalue

Figure 6.1 shows this graphically.

#### [Mar 13, 2017] Learning the Korn shell: Chapter 6, Integer Variables and Arithmetic

###### Mar 13, 2017 | docstore.mik.ua

6.2 Integer Variables and Arithmetic

The expression $(($OPTIND - 1)) in the last example gives a clue as to how the shell can do integer arithmetic. As you might guess, the shell interprets words surrounded by $(( and )) as arithmetic expressions. Variables in arithmetic expressions do not need to be preceded by dollar signs, though it is not wrong to do so.

Arithmetic expressions are evaluated inside double quotes, like tildes, variables, and command substitutions. We're finally in a position to state the definitive rule about quoting strings: When in doubt, enclose a string in single quotes, unless it contains tildes or any expression involving a dollar sign, in which case you should use double quotes.
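The quoting rule can be verified in a couple of lines (echo stands in for ksh's print so this also runs in bash):

```shell
name=world
echo 'Hello $name'            # single quotes suppress expansion: Hello $name
echo "Hello $name"            # double quotes expand variables: Hello world
echo "2 + 3 = $(( 2 + 3 ))"   # arithmetic is evaluated inside double quotes: 2 + 3 = 5
```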

The date(1) command on System V-derived versions of UNIX accepts arguments that tell it how to format its output. The argument +%j tells it to print the day of the year, i.e., the number of days since December 31st of the previous year.

We can use +%j to print a little holiday anticipation message:

print "Only $(( (365-$(date +%j)) / 7 )) weeks until the New Year!"

We'll show where this fits in the overall scheme of command-line processing in Chapter 7, Input/Output and Command-line Processing .

The arithmetic expression feature is built into the Korn shell's syntax; in the Bourne shell (most versions) it was available only through the external command expr(1). Thus it is yet another example of a desirable feature provided by an external command (i.e., a syntactic kludge) being better integrated into the shell. [[ ... ]] and getopts are also examples of this design trend.

Korn shell arithmetic expressions are equivalent to their counterparts in the C language. [5] Precedence and associativity are the same as in C. Table 6.2 shows the arithmetic operators that are supported. Although some of these are (or contain) special characters, there is no need to backslash-escape them, because they are within the $(( ... )) syntax.

[5] The assignment forms of these operators are also permitted. For example, $((x += 2)) adds 2 to x and stores the result back in x.

Table 6.2: Arithmetic Operators
Operator Meaning
+ Plus
- Minus
*  Times
/ Division (with truncation)
% Remainder
<< Bit-shift left
>> Bit-shift right
& Bitwise and
| Bitwise or
~ Bitwise not
^ Bitwise exclusive or

Parentheses can be used to group subexpressions. The arithmetic expression syntax also (like C) supports relational operators as "truth values" of 1 for true and 0 for false. Table 6.3 shows the relational operators and the logical operators that can be used to combine relational expressions.

Table 6.3: Relational Operators
Operator Meaning
< Less than
> Greater than
<= Less than or equal
>= Greater than or equal
== Equal
!= Not equal
&& Logical and
|| Logical or

For example, $((3 > 2)) has the value 1; $(( (3 > 2) || (4 <= 1) )) also has the value 1, since at least one of the two subexpressions is true.

The shell also supports base N numbers, where N can be up to 36. The notation B#N means "N base B". Of course, if you omit the B#, the base defaults to 10.
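A few base-N literals, checked with $(( ... )) (echo is used so the lines also run in bash):

```shell
echo $(( 2#1010 ))   # 10  ("1010" in base 2)
echo $(( 16#ff ))    # 255 ("ff" in base 16)
echo $(( 36#z ))     # 35  (digits beyond 9 are the letters a-z)
```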

6.2.1 Arithmetic Conditionals

Another construct, closely related to $((...)), is ((...)) (without the leading dollar sign). We use this for evaluating arithmetic condition tests, just as [[...]] is used for string, file attribute, and other types of tests. ((...)) evaluates relational operators differently from $((...)) so that you can use it in if and while constructs. Instead of producing a textual result, it just sets its exit status according to the truth of the expression: 0 if true, 1 otherwise. So, for example, ((3 > 2)) produces exit status 0, as does (( (3 > 2) || (4 <= 1) )), but (( (3 > 2) && (4 <= 1) )) has exit status 1 since the second subexpression isn't true.

You can also use numerical values for truth values within this construct. It's like the analogous concept in C, which means that it's somewhat counterintuitive to non-C programmers: a value of 0 means false (i.e., returns exit status 1), and a non-0 value means true (returns exit status 0), e.g., (( 14 )) is true. See the code for the kshdb debugger in Chapter 9 for two more examples of this.
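These exit-status conventions can be checked directly at the prompt:

```shell
(( 3 > 2 )); echo $?                 # 0: the relational test is true
(( (3 > 2) && (4 <= 1) )); echo $?   # 1: the second subexpression is false
(( 0 )); echo $?                     # 1: a zero value means false
(( 14 )); echo $?                    # 0: any non-zero value means true
```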

6.2.2 Arithmetic Variables and Assignment

The (( ... )) construct can also be used to define integer variables and assign values to them. The statement:

(( intvar=expression ))

creates the integer variable intvar (if it doesn't already exist) and assigns to it the result of expression .

That syntax isn't intuitive, so the shell provides a better equivalent: the built-in command let . The syntax is:

let intvar=expression

It is not necessary (because it's actually redundant) to surround the expression with $(( and )) in a let statement. As with any variable assignment, there must not be any space on either side of the equal sign (=). It is good practice to surround expressions with quotes, since many characters are treated as special by the shell (e.g., *, #, and parentheses); furthermore, you must quote expressions that include whitespace (spaces or TABs). See Table 6.4 for examples.

Table 6.4: Sample Integer Expression Assignments

Assignment (let x=...)    Value of x
1+4                       5
'1 + 4'                   5
'(2+3) * 5'               25
'2 + 3 * 5'               17
'17 / 3'                  5
'17 % 3'                  2
'1<<4'                    16
'48>>3'                   6
'17 & 3'                  1
'17 | 3'                  19
'17 ^ 3'                  18
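A couple of the table's rows, replayed with let:

```shell
let x='(2+3) * 5'
echo $x   # 25: parentheses group, and the quotes protect * and the spaces
let y='17 % 3'
echo $y   # 2: remainder of integer division
```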

Here is a small task that makes use of integer arithmetic.

Write a script called pages that, given the name of a text file, tells how many pages of output it contains. Assume that there are 66 lines to a page but provide an option allowing the user to override that.

We'll make our option -N, a la head. The syntax for this single option is so simple that we need not bother with getopts. Here is the code:

if [[ $1 = -+([0-9]) ]]; then
let page_lines=${1#-}
shift
else
let page_lines=66
fi

let file_lines="$(wc -l < $1)"

let pages=file_lines/page_lines
if (( file_lines % page_lines > 0 )); then
let pages=pages+1
fi

print "$1 has $pages pages of text."


Notice that we use the integer conditional (( file_lines % page_lines > 0 )) rather than the [[ ... ]] form.

At the heart of this code is the UNIX utility wc(1), which counts the number of lines, words, and characters (bytes) in its input. By default, its output looks something like this:

8      34     161  bob


wc's output means that the file bob has 8 lines, 34 words, and 161 characters. wc recognizes the options -l, -w, and -c, which tell it to print only the number of lines, words, or characters, respectively.

wc normally prints the name of its input file (given as argument). Since we want only the number of lines, we have to do two things. First, we give it input from file redirection instead, as in wc -l < bob instead of wc -l bob . This produces the number of lines preceded by a single space (which would normally separate the filename from the number).

Unfortunately, that space complicates matters: the statement let file_lines=$(wc -l < $1) becomes "let file_lines= N" after command substitution; the space after the equal sign is an error. That leads to the second modification, the quotes around the command substitution expression. The statement let file_lines=" N" is perfectly legal, and let knows how to remove the leading space.

The first if clause in the pages script checks for an option and, if it was given, strips the dash ( - ) off and assigns it to the variable page_lines . wc in the command substitution expression returns the number of lines in the file whose name is given as argument.

The next group of lines calculates the number of pages and, if there is a remainder after the division, adds 1. Finally, the appropriate message is printed.
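The ceiling-division step can be checked in isolation (the line counts below are arbitrary numbers for illustration):

```shell
file_lines=100
page_lines=66
let pages=file_lines/page_lines      # integer division truncates: 1
if (( file_lines % page_lines > 0 )); then
    let pages=pages+1                # a partial last page rounds up
fi
echo $pages   # 2
```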

As a bigger example of integer arithmetic, we will complete our emulation of the C shell's pushd and popd functions (Task 4-8). Remember that these functions operate on DIRSTACK , a stack of directories represented as a string with the directory names separated by spaces. The C shell's pushd and popd take additional types of arguments, which are:

• pushd +n takes the nth directory in the stack (starting with 0), rotates it to the top, and cds to it.

• pushd without arguments, instead of complaining, swaps the two top directories on the stack and cds to the new top.

• popd +n takes the nth directory in the stack and just deletes it.

The most useful of these features is the ability to get at the n th directory in the stack. Here are the latest versions of both functions:

function pushd { # push current directory onto stack
dirname=$1
if [[ -d $dirname && -x $dirname ]]; then
cd $dirname
DIRSTACK="$dirname ${DIRSTACK:-$PWD}"
print "$DIRSTACK"
else
print "still in $PWD."
fi
}

function popd { # pop directory off the stack, cd to new top
if [[ -n $DIRSTACK ]]; then
DIRSTACK=${DIRSTACK#* }
cd ${DIRSTACK%% *}
print "$PWD"
else
print "stack empty, still in $PWD."
fi
}


To get at the nth directory, we use a while loop that transfers the top directory to a temporary copy of the stack n times. We'll put the loop into a function called getNdirs that looks like this:

function getNdirs { # copy the first $1 directories from DIRSTACK to stackfront
stackfront=''
let count=0
while (( count < $1 )); do
stackfront="$stackfront ${DIRSTACK%% *}"
DIRSTACK=${DIRSTACK#* }
let count=count+1
done
}


The argument passed to getNdirs is the n in question. The variable stackfront is the temporary copy that will contain the first n directories when the loop is done. stackfront starts as null; count , which counts the number of loop iterations, starts as 0.
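The two parameter expansions that do the real work here can be watched in isolation (a fresh variable STACK is used instead of DIRSTACK so the sketch does not collide with bash's own directory stack):

```shell
STACK="/usr /var /tmp"
echo "${STACK%% *}"   # /usr      : the top of the stack (text before the first space)
echo "${STACK#* }"    # /var /tmp : the stack minus its top (text after the first space)
```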

The first line of the loop body appends the top of the stack (${DIRSTACK%% *}) to stackfront; the second line deletes the top from the stack. The last line increments the counter for the next iteration. The entire loop executes N times, for values of count from 0 to N-1. When the loop finishes, the last directory in $stackfront is the Nth directory. The expression ${stackfront##* } extracts this directory. Furthermore, DIRSTACK now contains the "back" of the stack, i.e., the stack without the first n directories. With this in mind, we can now write the code for the improved versions of pushd and popd:

function pushd {
if [[ $1 = ++([0-9]) ]]; then
# case of pushd +n: rotate n-th directory to top
let num=${1#+}
getNdirs $num

newtop=${stackfront##* }
stackfront=${stackfront%$newtop}

DIRSTACK="$newtop $stackfront $DIRSTACK"
cd $newtop
elif [[ -z $1 ]]; then
# case of pushd without args; swap top two directories
firstdir=${DIRSTACK%% *}
DIRSTACK=${DIRSTACK#* }
seconddir=${DIRSTACK%% *}
DIRSTACK=${DIRSTACK#* }
DIRSTACK="$seconddir $firstdir $DIRSTACK"
cd $seconddir
else
# normal case of pushd dirname
dirname=$1
if [[ -d $dirname && -x $dirname ]]; then
cd $dirname
DIRSTACK="$dirname ${DIRSTACK:-$PWD}"
print "$DIRSTACK"
else
print still in "$PWD."
fi
fi
}

function popd { # pop directory off the stack, cd to new top
if [[ $1 = ++([0-9]) ]]; then
# case of popd +n: delete n-th directory from stack
let num=${1#+}
getNdirs $num
stackfront=${stackfront% *}
DIRSTACK="$stackfront $DIRSTACK"
else
# normal case of popd without argument
if [[ -n $DIRSTACK ]]; then
DIRSTACK=${DIRSTACK#* }
cd ${DIRSTACK%% *}
print "$PWD"
else
print "stack empty, still in $PWD."
fi
fi
}


These functions have grown rather large; let's look at them in turn. The if at the beginning of pushd checks if the first argument is an option of the form +N. If so, the first body of code is run. The first let simply strips the plus sign (+) from the argument and assigns the result, as an integer, to the variable num. This, in turn, is passed to the getNdirs function.

The next two assignment statements set newtop to the Nth directory, i.e., the last directory in $stackfront, and delete that directory from stackfront. The final two lines in this part of pushd put the stack back together again in the appropriate order and cd to the new top directory.

The elif clause tests for no argument, in which case pushd should swap the top two directories on the stack. The first four lines of this clause assign the top two directories to firstdir and seconddir, and delete these from the stack. Then, as above, the code puts the stack back together in the new order and cds to the new top directory. The else clause corresponds to the usual case, where the user supplies a directory name as argument.

popd works similarly. The if clause checks for the +N option, which in this case means delete the Nth directory. A let extracts the N as an integer; the getNdirs function puts the first n directories into stackfront. Then the line stackfront=${stackfront% *} deletes the last directory (the Nth directory) from stackfront. Finally, the stack is put back together with the Nth directory missing.

The else clause covers the usual case, where the user doesn't supply an argument.

Before we leave this subject, here are a few exercises that should test your understanding of this code:

1. Add code to pushd that exits with an error message if the user supplies no argument and the stack contains fewer than two directories.

2. Verify that when the user specifies +N and N exceeds the number of directories in the stack, both pushd and popd use the last directory as the Nth directory.

3. Modify the getNdirs function so that it checks for the above condition and exits with an appropriate error message if true.

4. Change getNdirs so that it uses cut (with command substitution), instead of the while loop, to extract the first N directories. This uses less code but runs more slowly because of the extra processes generated.

#### [Feb 14, 2017] MS-DOS style aliases for Linux

##### I think alias ipconfig='ifconfig' is really useful for people who work with Linux from a Windows PC desktop/laptop.
###### Feb 14, 2017 | bash.cyberciti.biz
# MS-DOS / XP cmd like stuff
alias edit=$VISUAL
alias copy='cp'
alias cls='clear'
alias del='rm'
alias dir='ls'
alias md='mkdir'
alias move='mv'
alias rd='rmdir'
alias ren='mv'
alias ipconfig='ifconfig'

#### [Feb 04, 2017] Quickly find differences between two directories

##### You will be surprised, but GNU diff as used in Linux understands the situation when its two arguments are directories and behaves accordingly

###### Feb 04, 2017 | www.cyberciti.biz

The diff command compares files line by line. It can also compare two directories:

# Compare two folders using diff
diff /etc /tmp/etc_old

Rafal Matczak September 29, 2015, 7:36 am

§ Quickly find differences between two directories
And quicker:

diff -y <(ls -l ${DIR1}) <(ls -l ${DIR2})

#### [Feb 04, 2017] Restoring deleted /tmp folder

###### Jan 13, 2015 | cyberciti.biz

As my journey continues with Linux and Unix shell, I made a few mistakes. I accidentally deleted the /tmp folder. To restore it, all you have to do is:

mkdir /tmp
chmod 1777 /tmp
chown root:root /tmp
ls -ld /tmp

#### [Feb 04, 2017] Use CDPATH to access frequent directories in bash - Mac OS X Hints

###### Feb 04, 2017 | hints.macworld.com

##### The variable CDPATH defines the search path for the directory containing directories, so it serves much like a "home for directories". The danger is in creating too complex a CDPATH; often a single directory works best. For example, export CDPATH=/srv/www/public_html. Now, instead of typing cd /srv/www/public_html/CSS I can simply type: cd CSS

Use CDPATH to access frequent directories in bash
Mar 21, '05 10:01:00AM • Contributed by: jonbauman

I often find myself wanting to cd to the various directories beneath my home directory (i.e. ~/Library, ~/Music, etc.), but being lazy, I find it painful to have to type the ~/ if I'm not in my home directory already. Enter CDPATH, as described in man bash:

The search path for the cd command.
This is a colon-separated list of directories in which the shell looks for destination directories specified by the cd command. A sample value is ".:~:/usr".
Personally, I use the following command (either on the command line for use in just that session, or in .bash_profile for permanent use):

CDPATH=".:~:~/Library"

This way, no matter where I am in the directory tree, I can just type cd dirname, and it will take me to the directory that is a subdirectory of any of the ones in the list. For example:

$ cd
$ cd Documents
/Users/baumanj/Documents
$ cd Pictures
$ cd Preferences
/Users/username/Library/Preferences
etc...

[robg adds: No, this isn't some deeply buried treasure of OS X, but I'd never heard of the CDPATH variable, so I'm assuming it will be of interest to some other readers as well.]

cdable_vars is also nice
Authored by: clh on Mar 21, '05 08:16:26PM
Check out the bash command shopt -s cdable_vars
From the man bash page:
cdable_vars
If set, an argument to the cd builtin command that is not a directory is assumed to be the name of a variable whose value is the directory to change to.
With this set, if I give the following bash command:
export d="/Users/chap/Desktop"
I can then simply type cd d to change to my Desktop directory. I put the shopt command and the various export commands in my .bashrc file.

#### [Feb 04, 2017] Copy file into multiple directories
###### Feb 04, 2017 | www.cyberciti.biz
Instead of running:

cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy the file into multiple dirs:

echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file

#### [Feb 04, 2017] 20 Unix Command Line Tricks – Part I
###### Feb 04, 2017 | www.cyberciti.biz
Locking a directory
For privacy of my data I wanted to lock down /downloads on my file server. So I ran:

chmod 0000 /downloads

The root user still has access, but ls and cd commands will not work for anyone else.
To go back:

chmod 0755 /downloads

Clear gibberish all over the screen
Just type:

reset

Becoming human
Pass the -h or -H (and other) command line options to GNU or BSD utilities to get the output of commands like ls, df, du in human-understandable formats:

ls -lh             # print sizes in human readable format (e.g., 1K 234M 2G)
df -h
df -k              # show output in bytes, KB, MB, or GB
free -b
free -k
free -m
free -g            # print sizes in human readable format (e.g., 1K 234M 2G)
du -h
stat -c %A /boot   # get file system perms in human readable format
sort -h -a file    # compare human readable numbers
lscpu              # display the CPU information in human readable format on Linux
lscpu -e
lscpu -e=cpu,node
tree -h            # show the size of each file in a more human readable way
tree -h /boot

Show information about known users in the Linux based system
Just type:

## linux version ##
lslogins
## BSD version ##
logins

Sample outputs:

UID USER PWD-LOCK PWD-DENY LAST-LOGIN GECOS
0 root 0 0 22:37:59 root
1 bin 0 1 bin
2 daemon 0 1 daemon
3 adm 0 1 adm
4 lp 0 1 lp
5 sync 0 1 sync
6 shutdown 0 1 2014-Dec17 shutdown
7 halt 0 1 halt
8 mail 0 1 mail
10 uucp 0 1 uucp
11 operator 0 1 operator
12 games 0 1 games
13 gopher 0 1 gopher
14 ftp 0 1 FTP User
27 mysql 0 1 MySQL Server
38 ntp 0 1
48 apache 0 1 Apache
68 haldaemon 0 1 HAL daemon
69 vcsa 0 1 virtual console memory owner
72 tcpdump 0 1
74 sshd 0 1 Privilege-separated SSH
81 dbus 0 1 System message bus
89 postfix 0 1
99 nobody 0 1 Nobody
173 abrt 0 1
497 vnstat 0 1 vnStat user
498 nginx 0 1 nginx user
499 saslauth 0 1 "Saslauthd user"

Confused by top command output? Seriously, you need to try out htop instead of top:

sudo htop

Want to run the same command again? Just type !! . For example:

/myhome/dir/script/name arg1 arg2
# To run the same command again
!!
## To run the last command again as root user
sudo !!

The !! repeats the most recent command. To run the most recent command beginning with "foo":

!foo
# Run the most recent command beginning with "service" as root
sudo !service

The !$ designator runs a command with the last argument of the most recent command:

# Edit nginx.conf
sudo vi /etc/nginx/nginx.conf
# Test nginx.conf for errors
/sbin/nginx -t -c /etc/nginx/nginx.conf
# After testing a file with "/sbin/nginx -t -c /etc/nginx/nginx.conf", you
# can edit the file again with vi
sudo vi !$

Get a reminder when you have to leave

If you need a reminder to leave your terminal, type the following command:

leave +hhmm

Where,

• hhmm – The time of day is in the form hhmm where hh is a time in hours (on a 12 or 24 hour clock), and mm are minutes. All times are converted to a 12 hour clock, and assumed to be in the next 12 hours.
Home sweet home

Want to go back to the directory you were just in? Run:

cd -
The variable CDPATH defines the search path for the cd command:

export CDPATH=/var/www:/nas10

Now, instead of typing cd /var/www/html/ I can simply type the following to cd into /var/www/html path:

cd html

Editing a file being viewed with less pager

To edit a file being viewed with the less pager, press v. The file will be opened for editing under $EDITOR:

less *.c
less foo.html
## Press v to edit file ##
## Quit the editor and you will return to the less pager again ##

List all files or directories on your system
To see all of the directories on your system, run:

find / -type d | less
# List all directories in your $HOME
find $HOME -type d -ls | less

To see all of the files, run:

find / -type f | less
# List all files in your $HOME
find $HOME -type f -ls | less

Build directory trees in a single command
You can create directory trees in one step using the mkdir command by passing the -p option:

mkdir -p /jail/{dev,bin,sbin,etc,usr,lib,lib64}
ls -l /jail/

Copy file into multiple directories
Instead of running:

cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy the file into multiple dirs:

echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file

Creating a shell function is left as an exercise for the reader.

Quickly find differences between two directories
The diff command compares files line by line. It can also compare two directories:

ls -l /tmp/r
ls -l /tmp/s
# Compare two folders using diff ##
diff /tmp/r/ /tmp/s/

#### [Feb 04, 2017] List all files or directories on your system
###### Feb 04, 2017 | www.cyberciti.biz
To see all of the directories on your system, run:

find / -type d | less
# List all directories in your $HOME
find $HOME -type d -ls | less

To see all of the files, run:

find / -type f | less
# List all files in your $HOME
find $HOME -type f -ls | less

#### basic ~/.bashrc ~/.bash_profile tips thread
###### Arch Linux Forums
I added some comments explaining each piece. Misc stuff:

# My prompt, quite basic, decent coloring, shows the value of $?
# (exit value of last command, useful sometimes):
C_DEFAULT="\[\033[0m\]"
C_BLUE="\[\033[0;34m\]"
export PS1="$C_BLUE($C_DEFAULT\$?$C_BLUE)[$C_DEFAULT\u$C_BLUE@$C_DEFAULT\h$C_BLUE:$C_DEFAULT\w$C_BLUE]$C_DEFAULT"
export PS2="$C_BLUE>$C_DEFAULT"

# If you allow Ctrl+Alt+Backspace to kill the X server but are paranoid,
# then this alias will ensure that there will be no shell open afterwards.
alias startx="exec startx"

# Let grep colorize the search results
alias g="egrep --color=always"
alias gi="egrep -i --color=always"

# Hostname appended to bash history filename
# Hostname appended to bash history filename
export HISTFILE="$HOME/.bash_history_$(hostname -s)"

# Don't save repeated commands in bash history
export HISTCONTROL="ignoredups"

# Confirm before overwriting something
alias cp="cp -i"

# Disable ^S/^Q flow control (does anyone like/use this at all?)
stty -ixon

# If your resolution gets messed up, use this to reset (requires XRandR)
alias resreset="xrandr --size 1280x1024"

And some small but handy functions:

# mkmv - creates a new directory and moves the file into it, in 1 step
# Usage: mkmv <file> <directory>
mkmv() {
    mkdir "$2"
    mv "$1" "$2"
}
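As a quick sanity check, the mkmv helper above can be exercised in a scratch directory (the file and directory names here are invented for the demo):

```shell
#!/bin/sh
# mkmv: make a directory and move a file into it in one step (from above)
mkmv() {
    mkdir "$2"
    mv "$1" "$2"
}

cd "$(mktemp -d)"        # throwaway working directory
touch notes.txt          # hypothetical file
mkmv notes.txt archive   # creates ./archive and moves notes.txt into it
ls archive               # shows: notes.txt
```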

# sanitize - set file/directory owner and permissions to normal values (644/755)
# Usage: sanitize <file>
sanitize() {
    chmod -R u=rwX,go=rX "$@"
    chown -R ${USER}.users "$@"
}

# nh - run command detached from terminal and without output
# Usage: nh <command>
nh() {
    nohup "$@" &>/dev/null &
}

# run - compile a simple c or cpp file, run the program, afterwards delete it
# Usage: run <file> [params]
run() {
    filename="${1%%.*}"
    extension="${1##*.}"
    file="$1"
    shift
    params="$@"
command=""

if [ $extension = "cc" -o$extension = "cpp" -o $extension = "c++" ]; then command="g++" elif [$extension = "c" ]; then
command="gcc"
else
echo "Invalid file extension!"
return 1
fi

    $command -Wall -o $filename $file
    chmod a+x $filename
    ./$filename $params
    rm -f $filename 2>/dev/null
}

Offline
... ... ...

function mktar() { tar czf "${1%%/}.tar.gz" "${1%%/}/"; }
function mkmine() { sudo chown -R ${USER} ${1:-.}; }
alias svim='sudo vim'
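The mktar one-liner above is easy to try; here is a sketch with an invented directory name, showing that the ${1%%/} expansion makes a trailing slash harmless:

```shell
#!/bin/sh
# mktar: wrap a directory into <name>.tar.gz (from above)
mktar() { tar czf "${1%%/}.tar.gz" "${1%%/}/"; }

cd "$(mktemp -d)"
mkdir project && touch project/a.txt   # hypothetical contents
mktar project/                         # trailing slash stripped by ${1%%/}
tar tzf project.tar.gz                 # lists project/ and project/a.txt
```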


alias un='tar -zxvf'
alias mountedinfo='df -hT'
alias ping='ping -c 10'
alias openports='netstat -nape --inet'
alias ns='netstat -alnp --protocol=inet | grep -v CLOSE_WAIT | cut -c-6,21-94 | tail +2'
alias du1='du -h --max-depth=1'
alias da='date "+%Y-%m-%d %A %T %Z"'
alias ebrc='pico ~/.bashrc'

# Alias to multiple ls commands
alias la='ls -Al'                # show hidden files
alias ls='ls -aF --color=always' # add colors and file type extensions
alias lx='ls -lXB'               # sort by extension
alias lk='ls -lSr'               # sort by size
alias lc='ls -lcr'               # sort by change time
alias lu='ls -lur'               # sort by access time
alias lr='ls -lR'                # recursive ls
alias lt='ls -ltr'               # sort by date
alias lm='ls -al | more'         # pipe through 'more'

# Alias chmod commands
alias mx='chmod a+x'
alias 000='chmod 000'
alias 644='chmod 644'
alias 755='chmod 755'

#### What are some useful Bash tricks - Quora

Gaurav Gada, Master's Information Management, University of Washington Information School (2018)
sudo !!

http://www.commandlinefu.com/com...

Mattias Jansson, I like cats
Some things I used to use often... and not so often:

o Comment the line you're currently on (Esc-#).

o Send stuff to a host/port using bash builtins: echo foo > /dev/tcp/host/port. For example, quick and dirty file transfer from a minimal Linux install to some place with nc installed:
On destination machine: nc -l 7070 > newfile
On source machine: cat somefile > /dev/tcp/somehostname/7070

o C-x C-e to edit current line in your $EDITOR (all readline-enabled programs have this - I really needed this often when writing an SQL query which ended up being very long)

o I've got this simple shell function to take a config file (which uses the hash as a comment initiator) and dump all the contents which do not start with a comment or whitespace:
unc () {
    grep -vE "^[ ]*\#" "$1" | grep .
}

o A tiny no-nonsense webserver to share the directory you're standing in:
alias webshare='python -c "import SimpleHTTPServer;SimpleHTTPServer.test()"'

I wouldn't classify the following as tricks but things that every developer writing a bash script should know.

Every shell script should start with set -o nounset and set -o errexit

nounset means that using a variable that is not set will raise an error. In the following example, without nounset, calling the script with no arguments would expand $CONTAINER_ROOT to nothing and delete all the files in /var/log.

1. #!/bin/bash
2. set -o nounset
3. CONTAINER_ROOT=$1
4. ...
5. rm $CONTAINER_ROOT/var/log/*
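The effect is easy to demonstrate with throwaway subshells (the variable name is hypothetical and deliberately unset):

```shell
#!/bin/sh
# Without nounset, an unset variable silently expands to an empty string,
# so the rm above would operate on /var/log/* itself.
unset DEMO_ROOT
bash -c 'echo "${DEMO_ROOT}/var/log"'    # prints: /var/log
# With nounset, the same expansion aborts the script instead.
bash -c 'set -o nounset; echo "${DEMO_ROOT}/var/log"' || echo "script aborted"
```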

errexit: when this option is set, the bash script will exit if a command fails. In the following example, if the directory /bigdisk/temp doesn't exist, mktemp will fail, but without errexit the script would continue and call generate_big_data with an empty $TEMPFILE.

1. #!/bin/bash
2. set -o errexit
3. TEMPDIR=/bigdisk/temp
4. TEMPFILE=$(mktemp $TEMPDIR/app.XXXXXX)
5. generate_big_data -out $TEMPFILE


In the previous case you can write TEMPFILE=$(mktemp $TEMPDIR/app.XXXXXX) || exit 1, but it is always good to exit on error, to be sure there will be no surprising side effects when a command called in the middle of your script fails.
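A minimal demonstration of the difference (pure illustration, no real commands involved):

```shell
#!/bin/sh
# Without errexit the script keeps going after a failed command...
bash -c 'false; echo "still running"'    # prints: still running
# ...with errexit it stops at the first failure.
bash -c 'set -o errexit; false; echo "still running"' || echo "stopped early"
```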

Variable substitution

When you do string manipulation, use variable substitution. It is faster and saves you from doing useless forks.

Stop writing things like FILENAME=$(basename "$1"); instead write FILENAME=${1##*/}, and DIRNAME=${1%/*} instead of DIRNAME=$(dirname "$1"). You'll find more information on variable substitution in the bash manual in the paragraph Manipulating Strings.
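For instance (path invented), the two expansions mirror basename and dirname exactly for simple paths:

```shell
#!/bin/sh
p=/srv/www/public_html/index.html   # hypothetical path
echo "${p##*/}"   # index.html           -- like basename, no fork
echo "${p%/*}"    # /srv/www/public_html -- like dirname, no fork
# Same results as the external commands:
[ "${p##*/}" = "$(basename "$p")" ] && echo match
```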

Nick Shelly, Stanford CS PhD candidate, Apple, Air Force captain, Rhodes scholar

Ctrl+R to reverse search through your Bash history. Ctrl+R again keeps searching, Ctrl+G cancels the search.

Though GNU's Readline package is not unique to Bash (Python's interactive shell has this as well), reverse search is one of the most useful aspects of command line shells over GUIs.

Dan Fango, www.danfango.co.uk

Written Apr 21, 2015

A couple I haven't seen (or missed) in the previous answers:

Alt+. : brings back the last word from the previous line. If your previous line was "ls somefile.txt", then "vi Alt+." will translate to "vi somefile.txt". Hitting Alt+. multiple times will cycle back through your history.

Alt+# : translates to adding # (comment) to the start of your current command line and hitting return

Written Dec 19, 2011

How about C-x C-e to open your favorite editor for editing the current command line.

Written Feb 27, 2014

When I cd to some/long/path, then I type
$ here=$(pwd)
then I cd back/to/some/other/path, then for example
$ there=$(pwd)
Now I can do stuff like...
$ cd $here
$ cp file.txt $there
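The same idea, written out as a runnable sketch (directory names invented); note the assignments use here=$(pwd), with no leading $:

```shell
#!/bin/sh
mkdir -p /tmp/bookmark_demo/src /tmp/bookmark_demo/dst   # hypothetical dirs
cd /tmp/bookmark_demo/src
here=$(pwd)
cd /tmp/bookmark_demo/dst
there=$(pwd)
touch "$here/file.txt"
cp "$here/file.txt" "$there"   # copy between the two bookmarked directories
ls "$there"                    # shows: file.txt
```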

Chris Rutherford, 20 years of unix admin

Written Aug 15, 2012

Check out this guy's stuff, the best bash script tricks I've seen in my 20 years of scripting: http://www.catonmat.net/blog/bas...

Michael Rinus, every day basher

Written Jun 5, 2015

I strongly recommend spending some time at http://www.commandlinefu.com/com...

There is some quite awesome and helpful stuff out there :)

#### What are some useful .bash_profile and .bashrc tips - Quora

function cl() { cd "$@" && la; }
function cdn() { for i in $(seq $1); do cd ..; done; }
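cdn climbs N levels up; a quick check with an invented directory tree (the $(seq …) call assumes the backticks stripped from the original post):

```shell
#!/bin/sh
cdn() { for i in $(seq $1); do cd ..; done; }

mkdir -p /tmp/cdn_demo/a/b/c   # hypothetical nesting
cd /tmp/cdn_demo/a/b/c
cdn 2                          # go up two levels
pwd                            # prints: /tmp/cdn_demo/a
```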

 

PROMPT_COMMAND="${PROMPT_COMMAND:+$PROMPT_COMMAND ; }"'echo $(date) $(pwd) $USER "$(history 1)" >> ~/.bash_eternal_history'

if [ -f /etc/bash_completion ]; then
    . /etc/bash_completion
fi

alias 'dus=du -sckx * | sort -nr'   # directories sorted by size
alias lsdirs="ls -l | grep '^d'"

Gaurav Gada, Master's Information Management, University of Washington Information School (2018)

Written Dec 17, 2011
I have these lines:

shopt -s histappend
PROMPT_COMMAND="history -n; history -a"
unset HISTFILESIZE
HISTSIZE=2000

 The first 2 lines keep the history between multiple bash sessions synced and the last two increase the history size from the default 500.

Yaniv Ng, researcher, atheist, ex-gamer, terminalist vimmer

This is something I found very useful when working with multiple terminals on different directories. Sometimes the new terminal opens in the home directory instead of the current working directory (depending on the terminal program).

Use gg in the terminal where you want to go. Then go to the new terminal and use hh.


gg() { pwd > /tmp/last_path; }
hh() { cd $(cat /tmp/last_path); }

# Easy extract
extract () {
    if [ -f $1 ] ; then
        case $1 in
            *.tar.bz2) tar xvjf $1 ;;
            *.tar.gz)  tar xvzf $1 ;;
            *.bz2)     bunzip2 $1 ;;
            *.rar)     rar x $1 ;;
            *.gz)      gunzip $1 ;;
            *.tar)     tar xvf $1 ;;
            *.tbz2)    tar xvjf $1 ;;
            *.tgz)     tar xvzf $1 ;;
            *.zip)     unzip $1 ;;
            *.Z)       uncompress $1 ;;
            *.7z)      7z x $1 ;;
            *)         echo "don't know how to extract '$1'..." ;;
        esac
    else
        echo "'$1' is not a valid file!"
    fi
}

alias top-commands='history | awk "{print \$2}" | sort | uniq -c | sort -rn | head'

Ch Huang

When bash is invoked as an interactive login shell, or as a non-interactive shell with the --login option, it first reads and executes commands from the file /etc/profile, if that file exists. After reading that file, it looks for ~/.bash_profile, ~/.bash_login, and ~/.profile, in that order, and reads and executes commands from the first one that exists and is readable.
#in case you rm a file by mistake
alias rm=safe_rm

safe_rm () {
local d t f s

    [ -z "$PS1" ] && { /bin/rm "$@"; return; }

    d="${TRASH_DIR:=$HOME/.__trash}/$(date +%W)"
    t=$(date +%F_%H-%M-%S)
    [ -e "$d" ] || mkdir -p "$d" || return

    for f do
        [ -e "$f" ] || continue
        s=$(basename "$f")
        /bin/mv "$f" "$d/${t}_$s" || break
    done

    echo -e "[$? $t $(whoami) $(pwd)] $@\n" >> "$d/00rmlog.txt"
}

Akhil Ravidas


1. alias Cd='cd -'


#### My Favorite bash Tips and Tricks Linux Journal

However, you can use spaces if they're enclosed in quotes outside the braces or within an item in the comma-separated list:

$ echo {"one ","two ","red ","blue "}fish
one fish two fish red fish blue fish
$ echo {one,two,red,blue}" fish"
one fish two fish red fish blue fish


You also can nest braces, but you must use some caution here too:

$ echo {{1,2,3},1,2,3}
1 2 3 1 2 3
$ echo {{1,2,3}1,2,3}
11 21 31 2 3
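Besides nesting, adjacent brace groups expand as a cross product, which is what makes them handy for generating sets of file names:

```shell
#!/bin/bash
echo {a,b}{1,2}            # a1 a2 b1 b2
echo img{1..3}.{png,jpg}   # img1.png img1.jpg img2.png img2.jpg img3.png img3.jpg
```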

#### [Dec 19, 2016] Unknown Bash Tips and Tricks For Linux Linux.com The source for Linux information

The type command looks a lot like the command builtin, but it does more:

$ type ll
ll is aliased to `ls -alF'
$ type -t grep
alias

Bash Functions

Run declare -F to see a list of Bash's builtin function names. declare -f prints out the complete functions, and declare -f [function-name] prints the named function. type alone won't list functions, but once you know a function name it will also print it:

$ type quote
quote is a function
quote ()
{
    echo \'${1//\'/\'\\\'\'}\'
}


This even works for your own functions that you create, like this simple example testfunc that does one thing: changes to the /etc directory:

$ function testfunc
> {
>     cd /etc
> }

Now you can use declare and type to list and view your new function just like the builtins.

#### [Dec 19, 2016] Bash Tricks » Linux Magazine

Graham Nicholls • 2 years ago
Oh FGS, alias rm="rm -i" -- what a crock. I have _never_ needed this. Unix/Linux is expert friendly, not fool friendly. Possibly useful if you're root, otherwise just an incredible irritant. OTOH, I think that history time-stamping should be the default. So useful for auditing, and for "I know I did something the other day" stuff. I use '%c' for my HISTTIMEFORMAT.

marnixava > Graham Nicholls • 2 years ago
I fully agree that the 'rm="rm -i"' alias and similar aliases are irritating. I think it might also lull newcomers into a false sense of security that it's pretty safe to do that command. One day they might be on a system without such an alias. It's good to learn early on that "rm" means it's going to be removed, no ifs or buts. One needs to make a habit of reviewing the command line before hitting enter.

Graham Nicholls > marnixava • 2 years ago
That's a really good point, which I'd not considered.

John Lockard • 2 years ago
The "HISTIGNORE" is interesting for other purposes, but the option for ignoring commands which start with space is actually a setting in bash using "export HISTCONTROL=ignorespace". If you want to eliminate duplicate entries you can use "ignoredups" or "erasedups". "ignoreboth" does both "ignoredups" and "ignorespace".

Ryan • 2 years ago
I like that using chattr +a is mentioned as a possible security fix for .bash_history, when the next talked-about item is HISTIGNORE: someone could just export HISTIGNORE="*" and then it doesn't matter that .bash_history is append-only, because the commands are not logged in the first place to be deleted later.
edit: But good post overall. Enjoyed it :)

marnixava > Ryan • 2 years ago
Even if the history file is chattr'ed to append-only mode, wouldn't the user still be able to simply remove that history file? IMHO there are too many workarounds for a determined user to make it worthwhile, except perhaps as a gentle reminder that we'd like the history file not to be altered.

#### Linux secrets most users don't know about | ITworld

J1r1k: "Alt + . (dot) in bash. Last argument of previous command. It took me a few years to discover this."

#### [Dec 06, 2015] Bash For Loop Examples
##### A very nice tutorial by Vivek Gite (created October 31, 2008, last updated June 24, 2015). His mistake is putting the new for loop syntax too far inside the tutorial. It should be emphasized, not hidden.
###### June 24, 2015 | cyberciti.biz

... ... ...

Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:

#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
do
    echo "Welcome $i times"
done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

... ... ...

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))
do
    command1
    command2
    command3
done

A representative three-expression example in bash is as follows:

#!/bin/bash
for (( c=1; c<=5; c++ ))
do
    echo "Welcome $c times"
done
... ... ...
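The two step-loop styles from the tutorial, side by side in a compact form (the brace range with an increment needs bash 4+, as discussed below):

```shell
#!/bin/bash
for i in {0..10..2}; do printf '%s ' "$i"; done; echo      # 0 2 4 6 8 10
for ((c=0; c<=10; c+=2)); do printf '%s ' "$c"; done; echo # 0 2 4 6 8 10
```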

Jadu Saikia, November 2, 2008, 3:37 pm

Nice one. All the examples are explained well, thanks Vivek.

seq 1 2 20
output can also be produced using jot

jot - 1 20 2

The infinite loops as everyone knows have the following alternatives.

while true
or
while :

Andi Reinbrech, November 18, 2010, 7:42 pm
I know this is an ancient thread, but thought this trick might be helpful to someone:

For the above example with all the cuts, simply do

set $(echo $line)

This will split the line into positional parameters, and after the set you can simply say F1=$1; F2=$2; F3=$3

I used this a lot many years ago on Solaris with "set $(date)"; it neatly splits the whole date string into variables and saves lots of messy cutting :-)

… no, you can't change the FS, if it's not space, you can't use this method

Peko, July 16, 2009, 6:11 pm
Hi Vivek,
Thanks for this useful topic.

IMNSHO, there may be something to modify here
=======================
Latest bash version 3.0+ has inbuilt support for setting up a step value:

#!/bin/bash
for i in {1..5}
=======================
1) The increment feature seems to belong to the version 4 of bash.
Reference: http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
Accordingly, my bash v3.2 does not include this feature.

BTW, where did you read that it was 3.0+ ?
(I ask because you may know some good website of interest on the subject).

2) The syntax is {from..to..step} where from, to, step are 3 integers.
Your code is missing the increment.

Note that GNU Bash documentation may be bugged at this time,
because on GNU Bash manual, you will find the syntax {x..y[incr]}
which may be a typo. (missing the second ".." between y and increment).

The Bash Hackers page
(again, see http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace)
seems to be more accurate,
but who knows? Anyway, at least one of them may be right… ;-)

Keep on the good work of your own,
Thanks a million.

- Peko

Michal Kaut July 22, 2009, 6:12 am
Hello,

is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that
for x in $(seq 0 0.1 1) gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others. Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the keyboard to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding x=$(echo $x | sed s/,/./) as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)
Thanks, Michal

Peko July 22, 2009, 7:27 am
To Michal Kaut: Hi Michal, such output format is configured through LOCALE settings. I tried:

export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1

and it works as desired. You just have to find the exact value for LC_CTYPE that fits your system and your needs.
Peko

Peko July 22, 2009, 2:29 pm
To Michal Kaut [2]: Ooops ;-) Instead of LC_CTYPE, LC_NUMERIC should be more appropriate (although LC_CTYPE actually yields the same result -- I tested both). By the way, Vivek has already documented the matter: http://www.cyberciti.biz/tips/linux-find-supportable-character-sets.html

Philippe Petrinko October 30, 2009, 8:35 am
To Vivek: Regarding your last example, that is, running a loop through arguments given to the script on the command line, there is a simpler way of doing this:

# instead of:
# FILES="$@"
# for f in $FILES
# use the following syntax
for arg
do
    # whatever you need here -- try:
    echo "$arg"
done

Of course, you can use any variable name, not only "arg".

Philippe Petrinko November 11, 2009, 11:25 am

To tdurden:

Why wouldn't you use

1) either a [for] loop
for old in * ; do mv ${old} ${old}.new; done

2) or the [rename] command?
excerpt from "man rename":

RENAME(1) Perl Programmers Reference Guide RENAME(1)

NAME
rename – renames multiple files

SYNOPSIS
rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

DESCRIPTION
"rename" renames the filenames supplied according to the rule specified
as the first argument. The perlexpr argument is a Perl expression
which is expected to modify the $_ string in Perl for at least some of the filenames specified. If a given filename is not modified by the expression, it will not be renamed. If no filenames are given on the command line, filenames will be read via standard input.

For example, to rename all files matching "*.bak" to strip the extension, you might say

rename 's/\.bak$//' *.bak

To translate uppercase names to lower, you'd use

rename 'y/A-Z/a-z/' *

- Philippe

Philippe Petrinko November 11, 2009, 9:27 pm

If you set the shell option extglob, Bash understands some more powerful patterns. Here, a pattern-list is one or more patterns, separated by the pipe symbol (|).

?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns

Philippe Petrinko November 12, 2009, 3:44 pm

To Sean:
Right, the sharper a knife is, the easier it can cut your fingers…

I mean: there are side-effects to the use of file globbing (like in [ for f in * ]) when the globbing expression matches nothing: the globbing expression is not substituted.

Then you might want to consider using [ nullglob ] shell extension,
to prevent this.
see: http://www.bash-hackers.org/wiki/doku.php/syntax/expansion/globs#customization

Devil hides in detail ;-)

Dominic January 14, 2010, 10:04 am

There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )); do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"

You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, you can subsequently test the value of $c to see whether the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am
Sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:

inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2

Philippe Petrinko March 9, 2010, 2:34 pm
@Dmitry
And, again, as stated many times up there, using [seq] is counter-productive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internal functions:

for ((c=1; c<21; c+=2)); do echo "Welcome $c times"; done
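Dominic's observation in a minimal form, showing the counter's value after each loop style:

```shell
#!/bin/bash
for (( c=1; c<=2; c++ )); do :; done; echo "$c"   # 3 -- the final increment ran
for c in {1..2}; do :; done; echo "$c"            # 2 -- keeps the last list value
```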

(and I wonder why Vivek is sticking to that old solution which should be presented only for historical reasons when there was no way of using bash internals.
By the way, this historical recall should be placed only at topic end, and not on top of the topic, which makes newbies sticking to the not-up-to-date technique ;-) )

Sean March 9, 2010, 11:15 pm

I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:

builtin_count.sh:
#!/bin/bash
for ((i=1;i<=1000000;i++))
do
    echo "Output $i"
done

seq_count.sh:

#!/bin/bash
for i in $(seq 1 1000000)
do
    echo "Output $i"
done

And here were the results that I got:

time ./builtin_count.sh
real 0m22.122s
user 0m18.329s
sys 0m3.166s

time ./seq_count.sh
real 0m19.590s
user 0m15.326s
sys 0m2.503s

The performance difference isn't too significant, especially when you are probably going to be doing something a little more interesting inside of the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm
The reason why the external seq is faster is because it is executed only once, and returns a huge splurb of space-separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.

The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow, etc. Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluated. I.e., bottom line, the internal one has more overhead; the "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading them once from a text file with a cat. Point being that it gets executed only once and becomes static.

OK, blah blah fishpaste, past my bed time :-)
Cheers, Andi

Anthony Thyssen June 4, 2010, 6:53 am
The {1..10} syntax is pretty useful, but you cannot directly use a variable with it!

limit=10
echo {1..${limit}}
{1..10}

You need to eval it to get it to work!

limit=10
eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example), and BASH is not available on all systems either. You are better off using the old while-expr method for maximum compatibility!

limit=10; n=1
while [ $n -le $limit ]; do
    echo $n; n=`expr $n + 1`
done


Alternatively, use a seq() function replacement…

# seq_count 10
seq_count() {
    i=1; while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}

# simple_seq 1 2 10
simple_seq() {
    i=$1; while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}

seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
        1) i=1  inc=1  end=$1 ;;
        2) i=$1 inc=1  end=$2 ;;
        *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]; do
        printf "$format\n" $i
        i=`expr $i + $inc`
    done
}

Edited: by Admin – added code tags.

TheBonsai June 4, 2010, 9:57 am

The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable towards Korn and Z.

The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.
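TheBonsai's $(( )) suggestion applied to the simple_seq function above might look like this; the name seq_posix is made up for this sketch:

```shell
# a POSIX-sh sketch of simple_seq using $(( )) arithmetic expansion
# instead of expr: seq_posix START INCREMENT END
seq_posix() {
    i=$1
    while [ "$i" -le "$3" ]; do
        echo "$i"
        i=$((i + $2))
    done
}

seq_posix 1 2 7   # prints 1 3 5 7, one per line
```

Unlike `expr`, the arithmetic expansion forks no process at all, which is what makes the loop cheap in any POSIX shell.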

Philippe Petrinko June 4, 2010, 10:15 am

But FOR C-style does not seem to be POSIXLY-correct…

Top is here, http://www.opengroup.org/onlinepubs/009695399/mindex.html

and the Shell and Utilities volume (XCU) T.OC. is here
http://www.opengroup.org/onlinepubs/009695399/utilities/toc.html
doc is:
http://www.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap01.html

Anthony Thyssen June 6, 2010, 7:18 am

TheBonsai wrote: "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."

I am not certain it is in POSIX. It was NOT part of the original Bourne Shell, and on some machines I deal with the Bourne Shell only: not Ksh, Bash, or anything else. Bourne Shell syntax works everywhere! But as 'expr' is a builtin in more modern shells, it is not a big loss or slowdown. This is especially important if writing a replacement command, such as for "seq", where you want your "just-paste-it-in" function to work as widely as possible. I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me. MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. Two years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that need to also work on this system.

TheBonsai June 6, 2010, 12:35 pm

Yea, the question was if it's POSIX, not if it's 100% portable (which is a difference). The POSIX base more or less is a subset of the Korn features (88, 93); pure Bourne is something "else", I know. Real portability, which means a program can go wherever UNIX went, only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am

And if you want to get rid of double-quotes, use this one-liner code:

while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data

script code, added of some text to better see record and field breakdown:
#!/bin/bash
while read
do
    echo "New record"
    record=${REPLY}
    echo ${record}|while read -d ,
    do
        field="${REPLY#\"}"
        field="${field%\"}"
        echo "Field is :${field}:"
    done
done<data

Does it work with your data? - PP

Philippe Petrinko November 22, 2010, 9:01 am

Of course, all the above code was assuming that your CSV file is named "data". If you want to use any name with the script, replace:

done<data

With:

done

And then use your script file (named for instance "myScript") with standard input redirection:

myScript < anyFileNameYouWant

Enjoy!

Philippe Petrinko November 22, 2010, 11:28 am

Well no, there is a bug: the last field of each record is not read. It needs a workout, and maybe an IFS modification. After all, that's what it was built for… :O)

Anthony Thyssen November 22, 2010, 11:31 pm

Another bug is that the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo. This does not help when you have commas within the quotes, though! Which is why you needed quotes in the first place. In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.

Philippe Petrinko November 24, 2010, 6:29 pm

Anthony, would you try this one-liner script on your CSV file? This one-liner assumes that the CSV file named [data] has __every__ field double-quoted.

while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data

Here is the same code, but for a script file, not a one-liner tweak.

#!/bin/bash
# script csv01.sh
#
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
#
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
#   (not the first field of course, no comma before the first field)
#
while read
do
    echo "New record" # this is not mandatory, just for explanation
    #
    # store REPLY and remove opening double quote
    record="${REPLY#\"}"
    #
    # replace every "," by a single double quote
    record=${record//\",\"/\"}
    #
    echo ${record}|while read -d \"
    do
        # store REPLY into variable "field"
        field="${REPLY}"
        #
        echo "Field is :${field}:" # just for explanation
    done
done

This script, named here [csv01.sh], must be used so:

csv01.sh < my-csv-file-with-doublequotes

Philippe Petrinko November 24, 2010, 6:35 pm

@Anthony,
By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug. As long as you know what you do, this is no problem; you just have to store the [REPLY] value conveniently, as this script shows.

TheBonsai March 8, 2011, 6:26 am

for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am

+1 for printf due to portability, but you can use bashy .. syntax too

for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am

Well, it isn't portable per se; it makes it portable to pre-4 Bash versions. I think a more or less "portable" (in terms of POSIX, at least) code would be:

i=0
while [ "$((i >= 20))" -eq 0 ]; do
    printf "%02d\n" "$i"
    i=$((i+1))
done
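The %02d zero-padding used in these loops is plain printf formatting and works on any values, for example:

```shell
# printf's %02d pads each number with leading zeros to two digits
for i in 1 5 20; do
    printf "%02d\n" "$i"
done
# prints: 01 05 20, one per line
```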

Philip Ratzsch April 20, 2011, 5:53 am

I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:

for num in {{1..10},{15..20}}; do echo $num; done

Great reference article!

Philippe Petrinko April 20, 2011, 8:23 am

@Philip
Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this. Anyway, Keep It Short and Simple (KISS). Here is a simple solution I already gave above:

xstart=1;xend=10;xstep=1
for (( x = $xstart; x <= $xend; x += $xstep )); do echo $x; done

Actually, POSIX compliance allows you to forget the $ in for (( )); as said before, you could also write:

xstart=1;xend=10;xstep=1
for (( x = xstart; x <= xend; x += xstep )); do echo $x; done

Philippe Petrinko May 6, 2011, 10:48 am

Sanya,
Actually brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.

Nevertheless, you could overcome this this way:

max=10; for i in $(eval echo {1..$max}); do echo $i; done

Sanya May 6, 2011, 11:42 am

Hello, Philippe

Thanks for your suggestions. You basically confirmed my findings that bash constructions are not as simple as zsh ones. But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh, where my simple for-loop works.

Cheers, Sanya

Philippe Petrinko May 6, 2011, 12:07 pm

Sanya,
First, you got it wrong: the solutions I gave are not related to POSIX. I just pointed out that POSIX allows not to use $ in for (( )), which is just a little bit more readable – sort of.

Second, why do you see this less readable than your [zsh] [for loop]?

for (( x = start; x <= end; x += step )); do
    echo "Loop number ${x}"
done

It is clear that it is a loop; loop increments and limits are clear. IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D

BFN

Anthony Thyssen May 8, 2011, 11:30 pm

If you are going to do… $(eval echo {1..$max}); you may as well use "seq" or one of the many other forms. See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm

I am trying to use the variable I set in the for line to set another variable with a different extension. Couldn't get this to work and couldn't find it anywhere on the web… Can someone help?

Example:

FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`


my goal is to take the values from the All_Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…
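The question above goes unanswered in the thread; for what it's worth, an assignment whose variable name is itself computed needs eval (or bash's declare / printf -v). A minimal sketch with made-up token values, not the actual /tmp/All_Tokens.txt data:

```shell
# build variable names at run time with eval; the token values here
# are invented for illustration only
for token in alpha beta; do
    eval "A1_${token}=value_of_${token}"
done

echo "$A1_alpha"   # prints value_of_alpha
```

In bash specifically, `declare "A1_${token}=..."` avoids the quoting pitfalls of eval, and `${!var}` reads such a variable back indirectly.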

#### [Nov 08, 2015] Get timestamps on Bash's History

###### nickgeoghegan.net
One of the annoyances of Bash is that searching through your history has no context. When did I last run that command? What commands were run at 3am, while on the clock?

The following single line, run in the shell, will provide date and time stamps for your Bash history the next time you log in or run bash.

echo  'export HISTTIMEFORMAT="%h/%d - %H:%M:%S "' >>  ~/.bashrc
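HISTTIMEFORMAT takes an strftime(3) format string, so you can preview what the stamps will look like by handing the same string to date; the fixed epoch and -u below are used only to make the output reproducible:

```shell
# preview the history timestamp format: %h is the abbreviated month,
# %d the day, %H:%M:%S the time
date -u -d @1234567890 +"%h/%d - %H:%M:%S"   # prints: Feb/13 - 23:31:30
```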

#### [May 08, 2014] 25 Even More – Sick Linux Commands UrFix's Blog

6) Display a cool clock on your terminal

watch -t -n1 "date +%T|figlet"

This command displays a clock on your terminal which updates the time every second. Press Ctrl-C to exit.

A couple of variants:

A little bit bigger text:

watch -t -n1 "date +%T|figlet -f big"

You can try other figlet fonts, too.

Big sideways characters:

watch -n 1 -t '/usr/games/banner -w 30 $(date +%M:%S)'

This requires a particular version of banner and a 40-line terminal, or you can adjust the width ("30" here).

7) intercept stdout/stderr of another process

strace -ff -e trace=write -e write=1,2 -p SOME_PID

8) Remove duplicate entries in a file without sorting.

awk '!x[$0]++' <file>


Using awk, find and remove duplicate lines in a file without sorting, which would reorder the contents. awk keeps the lines in their original order while dropping the duplicates, and you can redirect the result into another file.
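A quick way to convince yourself that awk preserves order while deduplicating, using printf to fabricate some input:

```shell
# x[$0]++ is 0 (false) the first time a line is seen, so !x[$0]++ is
# true exactly once per distinct line; awk's default action prints it
printf 'b\na\nb\nc\na\n' | awk '!x[$0]++'
# prints: b a c, one per line, in first-seen order
```

By contrast, `sort -u` on the same input would emit a, b, c.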

9) Record a screencast and convert it to an mpeg
ffmpeg -f x11grab -r 25 -s 800x600 -i :0.0 /tmp/outputFile.mpg


Grab X11 input and create an MPEG at 25 fps with the resolution 800×600

10) Mount a .iso file in UNIX/Linux
mount /path/to/file.iso /mnt/cdrom -oloop


“-o loop” lets you use a file as a block device

11) Insert the last command without the last argument (bash)
!:-


/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.google.com/

then

!:- http://www.urfix.com/

is the same as

/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.urfix.com/

12) Convert seconds to human-readable format

date -d@1234567890


This command, for example, produces the output "Fri Feb 13 18:31:30 EST 2009" (1234567890 seconds after the epoch; the exact text depends on your timezone).

13) Job Control
^Z
$ bg
$ disown


You’re running a script, command, whatever… You don’t expect it to take long, but now 5pm has rolled around and you’re ready to go home… Wait, it’s still running… You forgot to nohup it before running it… Suspend it, send it to the background, then disown it… The output won’t go anywhere, but at least the command will still run…

14) Edit a file on a remote host using vim
vim scp://username@host//path/to/somefile

15) Monitor the queries being run by MySQL
watch -n 1 mysqladmin --user=<user> --password=<password> processlist


watch is a very useful command for periodically running another command, in this case using mysqladmin to display the processlist. This is useful for monitoring which queries are causing your server to clog up.

16) escape any command aliases
\[command]

e.g. if rm is aliased to ‘rm -i’, you can escape the alias by prepending a backslash:

rm [file]  # WILL prompt for confirmation per the alias
\rm [file] # will NOT prompt for confirmation per the default behavior of the command

17) Show apps that use internet connection at the moment. (Multi-Language)

ss -p

for one line per process: ss -p | cat
for established sockets only: ss -p | grep STA
for just process names: ss -p | cut -f2 -sd\"
or: ss -p | grep STA | cut -f2 -d\"

18) Send pop-up notifications on Gnome

notify-send ["<title>"] "<body>"

The title is optional. Options: -t: expire time in milliseconds. -u: urgency (low, normal, critical). -i: icon path. On Debian-based systems you may need to install the ‘libnotify-bin’ package. Useful to advise when a wget download or a simulation ends. Example:

wget URL ; notify-send "Done"

19) quickly rename a file

mv filename.{old,new}

20) Remove all but one specific file

rm -f !(survivior.txt)

21) Generate a random password 30 characters long

strings /dev/urandom | grep -o '[[:alnum:]]' | head -n 30 | tr -d '\n'; echo

Find random strings within /dev/urandom. Using grep, filter to just alphanumeric characters, then print the first 30 and remove all the line feeds.

22) Run a command only when load average is below a certain threshold

echo "rm -rf /unwanted-but-large/folder" | batch

Good for one-off jobs that you want to run at a quiet time. The default threshold is a load average of 0.8, but this can be set using atrun.

23) Binary Clock

watch -n 1 'echo "obase=2;`date +%s`" | bc'

Create a binary clock.

24) Processor / memory bandwidth in GB/s

dd if=/dev/zero of=/dev/null bs=1M count=32768

Read 32GB of zeros and throw them away. How fast is your system?
25) Backup all MySQL Databases to individual files

for I in $(mysql -e 'show databases' -s --skip-column-names); do mysqldump $I | gzip > "$I.sql.gz"; done

#### [May 08, 2014] 25 Best Linux Commands UrFix's Blog

25) sshfs name@server:/path/to/folder /path/to/mount/point

Mount a folder/filesystem through SSH. Install SSHFS from http://fuse.sourceforge.net/sshfs.html. It will allow you to mount a folder securely over a network.

24) !!:gs/foo/bar

Runs the previous command, replacing foo with bar every time that foo appears. Very useful for rerunning a long command changing some arguments globally. As opposed to ^foo^bar, which only replaces the first occurrence of foo, this one changes every occurrence.

23) mount | column -t

Currently mounted filesystems in a nice layout. Particularly useful if you're mounting different drives: this will show all the filesystems currently mounted on your computer and their respective specs, with the added benefit of nice formatting.

22) <space>command

Execute a command without saving it in the history. A command prepended with one or more spaces won't be saved in history. Useful for passwords on the command line.

21) ssh user@host cat /path/to/remotefile | diff /path/to/localfile -

Compare a remote file with a local file. Useful for checking if there are differences between local and remote files.

20) mount -t tmpfs tmpfs /mnt -o size=1024m

Mount a temporary ram partition. Makes a partition in RAM, which is useful if you need a temporary working space, as read/write access is fast. Be aware that anything saved in this partition will be gone after your computer is turned off.

19) dig +short txt <keyword>.wp.dg.cx

Query Wikipedia via console over DNS. Query Wikipedia by issuing a DNS query for a TXT record. The TXT record will also include a short URL to the complete corresponding Wikipedia entry.
18) netstat -tlnp

Lists all listening ports together with the PID of the associated process. The PID will only be printed if you're holding a root-equivalent ID.

17) dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp

Output your microphone to a remote computer's speaker. This will output the sound from your microphone port to the ssh target computer's speaker port. The sound quality is very bad, so you will hear a lot of hissing.

16) echo "ls -l" | at midnight

Execute a command at a given time. This is an alternative to cron which allows a one-off task to be scheduled for a certain time.

15) curl -u user:pass -d status="Tweeting from the shell" http://twitter.com/statuses/update.xml

Update twitter via curl.

14) ssh -N -L2001:localhost:80 somemachine

Start a tunnel from some machine's port 80 to your local port 2001. Now you can access the website by going to http://localhost:2001/

13) reset

Salvage a borked terminal. If you bork your terminal by sending binary data to STDOUT or similar, you can get your terminal back using this command rather than killing and restarting the session. Note that you often won't be able to see the characters as you type them.

12) ffmpeg -f x11grab -s wxga -r 25 -i :0.0 -sameq /tmp/out.mpg

Capture video of a linux desktop.

11) > file.txt

Empty a file. For when you want to flush all content from a file without removing it (hat-tip to Marc Kilgus).

10) ssh-copy-id user@host

Copy ssh keys to user@host to enable password-less ssh logins. To generate the keys, use the command ssh-keygen.

9) ctrl-x e

Rapidly invoke an editor to write a long, complex, or tricky command. Next time you are using your shell, try typing ctrl-x e (that is, holding the control key, press x and then e). The shell will take what you've written on the command line thus far and paste it into the editor specified by $EDITOR. Then you can edit at leisure using all the powerful macros and commands of vi, emacs, nano, or whatever.
8) !whatever:p

Check command history, but avoid running it. !whatever will search your command history and execute the first command that matches ‘whatever’. If you don't feel safe doing this, put :p on the end to print without executing. Recommended when running as superuser.

7) mtr google.com

mtr: better than traceroute and ping combined. mtr combines the functionality of the traceroute and ping programs in a single network diagnostic tool. As mtr starts, it investigates the network connection between the host mtr runs on and HOSTNAME by sending packets with purposely low TTLs. It continues to send packets with low TTL, noting the response time of the intervening routers. This allows mtr to print the response percentage and response times of the internet route to HOSTNAME. A sudden increase in packet loss or response time is often an indication of a bad (or simply overloaded) link.

6) cp filename{,.bak}

Quickly backup or copy a file with bash.

5) ^foo^bar

Runs the previous command, replacing foo with bar. Really useful for when you have a typo in a previous command. Also, arguments default to empty, so if you accidentally run echo “no typozs” you can correct it with ^z.

4) cd -

Change to the previous working directory.

3) :w !sudo tee %

Save a file you edited in vim without the needed permissions. I often forget to sudo before editing a file I don't have write permissions on. When you come to save that file and get the infamous "E212: Can't open file for writing", just issue that vim command in order to save the file, without the need to save it to a temp file and then copy it back again.

2) python -m SimpleHTTPServer

Serve the current directory tree at http://HOSTNAME:8000/

1) sudo !!

Run the last command as root. Useful when you forget to use sudo for a command. "!!" grabs the last run command.
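The cp filename{,.bak} trick here and the earlier mv filename.{old,new} are the same brace-expansion idea: write the filename only once. A self-contained sketch in a throwaway directory (nothing real is touched):

```shell
# brace expansion repeats the common prefix for you
dir=$(mktemp -d)
cd "$dir"
echo data > config.txt
cp config.txt{,.bak}      # expands to: cp config.txt config.txt.bak
mv config.txt.{bak,old}   # expands to: mv config.txt.bak config.txt.old
ls                        # config.txt  config.txt.old
```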
#### [Dec 16, 2012] bash - how do I list the functions defined in my shell - Stack Overflow

Function names and definitions may be listed with the -f option to the declare or typeset builtin commands (see Bash Builtins). The -F option to declare or typeset will list the function names only (and optionally the source file and line number).

#### Unknown Bash Tips and Tricks For Linux Linux.com

Bash Builtins

Bash has a bunch of built-in commands, and some of them are stripped-down versions of their external GNU coreutils cousins. So why use them? You probably already do, because of the order of command execution in Bash:

1. Bash aliases
2. Bash keywords
3. Bash functions
4. Bash builtins
5. Scripts and executable programs that are in your PATH

So when you run echo, kill, printf, pwd, or test, most likely you're using the Bash builtins rather than the GNU coreutils commands. How do you know? By using one of the Bash builtins to tell you, the command command:

$ command -V echo
echo is a shell builtin
$ command -V ping
ping is /bin/ping

The Bash builtins do not have man pages, but they do have a backwards help builtin command that displays syntax and options:

$ help echo
echo: echo [-neE] [arg ...]
    Write arguments to the standard output.
    Display the ARGs on the standard output followed by a newline.
    Options:
      -n    do not append a newline
      -e    enable interpretation of the following backslash escapes
[...]

I call it backwards because most Linux commands use a syntax of commandname --help, where help is a command option instead of a command.

The type command looks a lot like the command builtin, but it does more:

$ type -a cat
cat is /bin/cat
$ type -t cat
file
$ type ll
ll is aliased to `ls -alF'
$ type -a echo
echo is a shell builtin
echo is /bin/echo
$ type -t grep
alias

The type utility identifies builtin commands, functions, aliases, keywords (also called reserved words), and also binary executables and scripts, which it calls file.
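The classifications type reports can be checked non-interactively; a sketch, noting that scripts load no aliases, so ls resolves to the binary (assuming it is in PATH):

```shell
# classify a few names with the type builtin
type -t cd     # builtin
type -t for    # keyword
type -t ls     # file (in a script; interactively it may be an alias)
command -V pwd # pwd is a shell builtin
```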
At this point, if you are like me, you are grumbling "How about showing me a LIST of the darned things." I hear and obey, for you can find these delightfully documented in the GNU Bash Reference Manual indexes. Don't be afraid, because unlike most software documentation this isn't a scary mythical creature like Sasquatch, but a real live complete command reference.

The point of this little exercise is so you know what you're really using when you type a command into the Bash shell, and so you know how it looks to Bash. There is one more overlapping Bash builtin, and that is the time keyword:

$ type -t time
keyword

So why would you want to use Bash builtins instead of their GNU cousins? Builtins may execute a little faster than the external commands, because external commands have to fork an extra process. I doubt this is much of an issue on modern computers because we have horsepower to burn, unlike the olden days when all we had were tiny little nanohertzes, but when you're tweaking performance it's one thing to look at.

When you want to use the GNU command instead of the Bash builtin, use its whole path, which you can find with command, type, or the good old not-Bash command which:

$ which echo
/bin/echo
$ which which
/usr/bin/which

Bash Functions

Run declare -F to see a list of Bash's builtin function names. declare -f prints out the complete functions, and declare -f [function-name] prints the named function. type won't list functions, but once you know a function name it will also print it:

$ type quote
quote is a function
quote ()
{
    echo \'${1//\'/\'\\\'\'}\'
}

This even works for your own functions that you create, like this simple example testfunc that does one thing: changes to the /etc directory:

$ function testfunc
> {
> cd /etc
> }

Now you can use declare and type to list and view your new function just like the builtins.

Bash's Violent Side

Don't be fooled by Bash's calm, obedient exterior, because it is capable of killing.
There have been a lot of changes to how Linux manages processes, in some cases making them more difficult to stop, so knowing how to kill runaway processes is still an important bit of knowledge. Fortunately, despite all this newfangled "progress", the reliable old killers still work.

I've had some troubles with bleeding-edge releases of KMail; it hangs and doesn't want to close by normal means. It spawns a single process, which we can see with the ps command:

$ ps axf|grep kmail
2489 ?  Sl  1:44 /usr/bin/kmail -caption KMail

You can start out gently and try this:

$ kill 2489

This sends the default SIGTERM (signal terminate) signal, which is similar to the SIGINT (signal interrupt) sent from the keyboard with Ctrl+c. So what if this doesn't work? Then you amp up your stopping power and use SIGKILL, like this:

$ kill -9 2489

This is the nuclear option and it will work. As the relevant section of the GNU C manual says: "The SIGKILL signal is used to cause immediate program termination. It cannot be handled or ignored, and is therefore always fatal. It is also not possible to block this signal."

This is different from SIGTERM and SIGINT and other signals that politely ask processes to terminate. They can be trapped and handled in different ways, and even blocked, so the response you get to a SIGTERM depends on how the program you're trying to kill has been programmed to handle signals. In an ideal world a program responds to SIGTERM by tidying up before exiting, like finishing disk writes and deleting temporary files. SIGKILL knocks it out and doesn't give it a chance to do any cleanup. (See man 7 signal for a complete description of all signals.)

So what's special about Bash kill over GNU /bin/kill? My favorite is how it looks when you invoke the online help summary:

$ help kill

Another advantage is it can use job control numbers in addition to PIDs.
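The difference between the trappable SIGTERM and the untrappable SIGKILL is easy to see in a throwaway subshell, so nothing real gets killed (the handler text is made up):

```shell
# SIGTERM can be caught: the trap handler runs instead of default death;
# a trap on KILL would simply never fire
bash -c 'trap "echo caught SIGTERM, cleaning up; exit 0" TERM
kill -s TERM $$
echo "never printed"'
# prints: caught SIGTERM, cleaning up
```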
In this modern era of tabbed terminal emulators, job control isn't the big deal it used to be, but the option is there if you want it. The biggest advantage is you can kill processes even if they have gone berserk and maxed out your system's process number limit, which would prevent you from launching /bin/kill. Yes, there is a limit, and you can see what it is by querying /proc:

$ cat /proc/sys/kernel/threads-max
61985

With Bash kill there are several ways to specify which signal you want to use. These are all the same:

$ kill 2489
$ kill -s TERM 2489
$ kill -s SIGTERM 2489
$ kill -n 15 2489

kill -l lists all supported signals. If you spend a little quality time with man bash and the GNU Bash Manual, I daresay you will learn more valuable tasks that Bash can do for you.

#### My Favorite Bash Substitution Tricks Drastic Code

August 01, 2009

Here's a few tricks that I often use on the command line to save time. They take advantage of some variables that the bash shell uses to store various aspects of your history.

#### Repeating the last command with !!

Sometimes I run a command that requires sudo access, but forget the sudo. This is a great opportunity to use !!, which holds the last command you ran.

$ tail /var/log/mail.log
tail: cannot open `/var/log/mail.log' for reading: Permission denied
$ sudo !!
sudo tail /var/log/mail.log
# output of command

#### The last argument of the last command using !$

Sometimes it's handy to be able to reference the last argument of your last command. This can make certain operations safer, by preventing a fat-fingered typo from deleting important files.

$ ls *.log
a.log b.log
$ rm -v !$
removed `a.log'
removed `b.log'

Similarly you can use !* to reference all of the last command's arguments.

$ touch a.log b.log
$ rm -v !*
rm -v a.log b.log
removed `a.log'
removed `b.log'

#### Correcting mistakes with ^^

This is a nifty trick that performs a substitution on your last command.
It's great for correcting typos, or running similar commands back to back. It looks for a match with whatever is after the first caret, and replaces it with whatever is after the second.

$ cmhod a+x my_script.sh
-bash: cmhod: command not found
$ ^mh^hm
chmod a+x my_script.sh

I use this one all the time doing rails development if I make a mistake on a script/generate command.

$ script/generate model Animal species:string sex:string birthday:date
    exists  app/models/
    exists  test/unit/
    exists  test/fixtures/
    create  app/models/animal.rb
    create  test/unit/animal_test.rb
    create  test/fixtures/animals.yml
    create  db/migrate
    create  db/migrate/20090801180754_create_animals.rb

$ ^generate^destroy
script/destroy model Animal species:string sex:string birthday:date
    notempty  db/migrate
    notempty  db
    rm  db/migrate/20090801180754_create_animals.rb
    rm  test/fixtures/animals.yml
    rm  test/unit/animal_test.rb
    rm  app/models/animal.rb
    rmdir  test/fixtures
    notempty  test
    rmdir  test/unit
    notempty  test
    rmdir  app/models
    notempty  app

$ ^destroy ^generate rspec_
script/generate rspec_model Animal species:string sex:string birthday:date
    create  app/models/
    create  spec/models/
    create  spec/fixtures/
    create  app/models/animal.rb
    create  spec/models/animal_spec.rb
    create  spec/fixtures/animals.yml
    create  db/migrate
    create  db/migrate/20090801180937_create_animals.rb

Hope someone else finds these as handy as I do.

Tagged with: bash command-line tips

#### Comments

1. Sam August 02, 2009 @ 11:20 AM

One more tip: You can also echo a specific number of arguments off the end of the last command using !:n*, where n is the number of the first argument to echo. For example:

$ touch 1.log 2.log 3.log 4.log 5.log
$ rm -v !:3*
rm -v 3.log 4.log 5.log
removed `3.log'
removed `4.log'
removed `5.log'

I don't use this one too much in practice but it could come in handy in certain situations.

2. Kirsten August 03, 2009 @ 05:15 PM

Thanks Sam, I didn't know about !* and the ^^ substitution, those will be useful!
#### Re: New line in bash variables pain

Maxim Vexler Tue, 14 Nov 2006 12:40:28 -0800

On 11/14/06, Oded Arbel <[EMAIL PROTECTED]> wrote:
[snip]

(IFS="$(echo)"; \
for pair in `awk '/^[^[].+[^\n]/ {print $1,$3}' passwd.fake`; do
    echo "$pair"
done)

In the second example, I force the record separator to be only the new line character (the output from 'echo'; I can probably use \n, but I wanted to play it safe). Do mind the wrapping of the second form in parentheses, otherwise you clobber your global IFS, which is something you want to avoid.

-- Oded

::.. We make a living by what we get, but we make a life by what we give. -- Winston Churchill

Thanks to everyone for the help, all solutions worked. To sum up the tips:

By Oded Arbel:
a. Use a subshell to avoid mistakenly overriding your shell variables.
b. Use "$(echo)" as a portable(?) newline variable scripting style.

By Ehud Karni:
a. Piping into a bash subshell can be accepted inside the shell with read.
b. Using a "while read VAR1 VAR2 VAR3..." is a convenient method of accepting stdin data.
c. awk has system() !!

By Amos Shapira:
a. A general workaround is to construct the whole command as text, then use either piping to sh or the bash builtin "eval".

By Omer Shapira:
a. The xargs -n switch can be used to "collect" variables separated by any of [\n\t ].

By Valery Reznic:
a. set -- "space delimited word list" can be used as a quick method for assigning values to the numbered variables ($1..$9). [question: Really? this does not seem to work for me]
b. A bash while loop can get stdin from file IO redirection.

Ariel Biener doesn't understand the need for voodoo in modern life... ;)

Thanks guys for an educational thread.
#### [Mar 17, 2010] Power Shell Usage Bash Tips & Tricks

Searching the Past

• There are several bad ways of finding previous lines from history
• Many people go for pressing Up lots (and lots)
• A tad inefficient, perhaps
• Ctrl+R searches previous lines
• But Ctrl+R zip Esc doesn't find the last zip command; it also matches any line that copied, deleted, unzipped, or did anything else with a zip file
• Those of a gambling bent can chance ! and a command name
• Irritating when !gv opens gvim instead of gv

Sane Incremental Searching

• Bash can cycle through lines starting in a particular way
• Just type in a few characters then press Up
• Don't need to press Up so many times
• Don't see lines that merely contain those letters
• Don't have to chance executing the wrong line
• Incremental searching with Up and Down is configured in .inputrc:

"\e[A": history-search-backward
"\e[B": history-search-forward

• Old behavior still available with Ctrl+P and Ctrl+N
• If that prevents Left and Right from working, fix them like this:

"\e[C": forward-char
"\e[D": backward-char

Repeating Command Arguments

• Commonly want to repeat just bits of commands
• Very often the previous command's last argument
• Meta+. retrieves the last argument.
Press repeatedly to cycle through the final argument from earlier commands • Magic Space • A magic space inserts a space character as normal • And also performs history expansion in the line • See what you type before you commit to it • Press Space before Enter if necessary • Magic Space Set-Up • Magic space is configured in .inputrc • Redefine what Space does • There are other readline-based programs without this feature, so make it only apply in Bash: $if Bash Space: magic-space $endif • Forgetting Options • Common to forget an option from a command • Want to rerun the command with the option • Go to the previous history line, then move just after the command name to type the option • Can set up a keyboard macro to do this • Insert-Option Macro • Meta+O can be made to load the previous command and position the cursor for typing an option • Defined in .inputrc: "\M-o": "\C-p\C-a\M-f " • Ctrl+P: previous line • Ctrl+A: start of line • Meta+F: forward a word, past the command • trailing space: insert a space • 17 unused keystrokes with just Ctrl or Meta modifiers #### [Aug 9, 2009] My Favorite bash Tips and Tricks One last tip I'd like to offer is using loops from the command line. The command line is not the place to write complicated scripts that include multiple loops or branching. For small loops, though, it can be a great time saver. Unfortunately, I don't see many people taking advantage of this. Instead, I frequently see people use the up arrow key to go back in the command history and modify the previous command for each iteration. If you are not familiar with creating for loops or other types of loops, many good books on shell scripting discuss this topic. A discussion on for loops in general is an article in itself. You can write loops interactively in two ways. The first way, and the method I prefer, is to separate each line with a semicolon.
A simple loop to make a backup copy of all the files in a directory would look like this:  for file in *; do cp $file $file.bak; done Another way to write loops is to press Enter after each line instead of inserting a semicolon. bash recognizes that you are creating a loop from the use of the for keyword, and it prompts you for the next line with a secondary prompt. It knows you are done when you enter the keyword done, signifying that your loop is complete:  for file in * > do cp $file $file.bak > done  #### [Aug 4, 2009] Tech Tip View Config Files Without Comments Linux Journal I've been using this grep invocation for years to trim comments out of config files. Comments are great but can get in your way if you just want to see the currently running configuration. I've found files hundreds of lines long which had fewer than ten active configuration lines; it's really hard to get an overview of what's going on when you have to wade through hundreds of lines of comments.  grep ^[^#] /etc/ntp.conf  The regex ^[^#] matches the first character of any line, as long as that character is not a #. Because blank lines don't have a first character they're not matched either, resulting in a nice compact output of just the active configuration lines. #### The Various bash Prompts by Juliet Kemp It's fairly likely that you already have a personalized setting for PS1, the default bash interaction prompt. But what about the others available: PS2, PS3, and PS4? PS1 is the default interaction prompt. To set it to give you username@host:directory use  export PS1="\u@\h \w " in your ~/.bashrc. \u is the current username, \h the current host, and \w the working directory.
There's a list of escape codes you can use in the bash man page, or in the Bash Prompt HOWTO. PS2 is the prompt you get when you extend a command over multiple lines by putting \ at the end of a line and hitting return. By default it's just >, but you can make this a little more obvious with: export PS2="more -> " so it looks like:  juliet@glade:~$ very-long-command-here \ more -> -with -lots -of -options PS3 governs the prompt that shows up if you use the select statement in a shell script. The default is #?, so if you do nothing to change that, the select statement will print out the options and then just leave that prompt. Alternatively, use this:  PS3="Choose an option: " select i in yes maybe no do # code to handle reply done  which will output:  1) yes 2) maybe 3) no Choose an option:  Far more readable for the user! Finally, PS4 is the prompt shown when you set the debug mode on a shell script using set -x at the top of the script. This echoes each line of the script to STDOUT before executing it. The default prompt is +. More usefully, you can set it to display the line number, with: export PS4='$LINENO+ ' All of these can be made to be permanent changes by setting them in your ~/.bash_profile or ~/.bashrc file. (Note that this probably makes little sense to do for PS3, which is better to set per-script.) Recovering Deleted Files With lsof By Juliet Kemp One of the more neat things you can do with the versatile utility lsof is use it to recover a file you've just accidentally deleted. A file in Linux is a pointer to an inode, which contains the file data (permissions, owner and where its actual content lives on the disk). Deleting the file removes the link, but not the inode itself – if another process has it open, the inode isn't released for writing until that process is done with it. To try this out, create a test text file, testing.txt, save it and then type less testing.txt. Open another terminal window, and type rm testing.txt.
If you try ls testing.txt you'll get an error message. But! less still has a reference to the file. So:  > lsof | grep testing.txt less 4607 juliet 4r REG 254,4 21 8880214 /home/juliet/testing.txt (deleted) The important columns are the second one, which gives you the PID of the process that has the file open (4607), and the fourth one, which gives you the file descriptor (4). Now, we go look in /proc, where there will still be a reference to the inode, from which you can copy the file back out:  > ls -l /proc/4607/fd/4 lr-x------ 1 juliet juliet 64 Apr 7 03:19 /proc/4607/fd/4 -> /home/juliet/testing.txt (deleted) > cp /proc/4607/fd/4 testing.txt.bk  Note: don't use the -a flag with cp, as this will copy the (broken) symbolic link, rather than the actual file contents. #### [Jul 7, 2009] xclip Command-Line Clipboard xclip (available as a package for Debian and Ubuntu) enables you to interact with the X clipboard directly from the command-line — without having to use the mouse to cut and paste. This is particularly useful if you're trying to get command-line output over to an e-mail or web page. Instead of scrolling around in the terminal to cut and paste with the mouse, screen by screen, you can use this:  command --arg | xclip Then go to whichever graphical program you want to paste the input into, and paste with the middle mouse button or the appropriate menu item. You can also enter the contents of a file straight into xclip:  xclip /path/to/file and again, can then paste that directly wherever you want it. The -o option enables you to operate it the other way around: output the contents of the clipboard straight onto the command line. So, you could, for example, copy a command line from a web page, then use  xclip -o to output it. To output to a file, use  xclip -o > /path/to/file Use the -selection switch to use the buffer-cut or one of the other selection options, rather than the clipboard default.
You can also hook it up to an X display other than the default one (e.g., if you're logged on as a different user on :1) with  xclip -d localhost:1 #### [Jun 29, 2009] sudo !! !! provides the ability to rerun long commands which cannot be executed on your current account without prefixing them with sudo:  $ whoami  $ sudo !! #### [Mar 14, 2009] How to Be Faster at the Linux Command Line ###### 02/05/2009 | hacktux.com Want to be faster at the Linux command line interface? Since most Linux distributions provide Bash as the default CLI, here are some Bash tricks that will help cut down the amount of typing needed to execute commands. Feel free to comment and share your own speed tricks. Control-R Through Your History This is my most used shortcut. Hit Control-R and begin to type a string. You immediately get the last command in your Bash history with that string. Hit Control-R again to cycle further backwards in your history. For instance, type the following and hit Enter. grep root /etc/passwd Then hit Control-R and begin to type 'grep'. Control-R (reverse-i-search)`gre': grep root /etc/passwd When you see the original command listed, hit Enter to execute it. Alternatively, you can also hit the Right-Arrow to edit the command before running it. Use History Expansion Bash's command history can be referenced using the exclamation mark. For instance, typing two exclamation marks (!!) will re-execute the last command. The next example executes date twice: date !! If you are interested in more than just the last command executed, type history to see a numbered listing of your Bash's history. history 39 grep root /etc/passwd 40 date 41 date 42 history Since grep root /etc/passwd is command number 39, you can re-execute it like so: !39 You can also reference Bash's history using a search string. For instance, the following will run the last command that started with 'grep'. !grep Note, you can set the number of commands stored in your history by setting HISTSIZE.
export HISTSIZE=1000 You can also wipe your history clear with the -c switch. history -c Use History Quick Substitution Historical commands can be edited and reused with quick substitution. Let's say you grep for 'root' in /etc/passwd: grep root /etc/passwd Now, you need to grep for 'root' in /etc/group. Substitute 'passwd' for 'group' in the last command using the caret (^). ^passwd^group The above command will run: grep root /etc/group Comments Sun, 02/08/2009 - 2:25pm — Anonymous (not verified) ### For my backup function, I pass %F-%R to my date command. This allows me to make multiple backup copies of a file in one day and have them ordered by date/time. Keith Thu, 02/05/2009 - 2:58pm — Anonymous (not verified) ### Thank you for Ctrl-R. I have been using the command line for two years and this was one of my biggest gripes. I am now flying around the command line, thanks. Wed, 02/04/2009 - 5:59pm — Max (not verified) ### Nice set of tricks. I knew most of them already but it refreshed my memory. Thanks. I find it even more handy to have this in ~/.inputrc : # -------- Bind page up/down with history search --------- "\e[5~": history-search-backward "\e[6~": history-search-forward I'll take the same example: on the bash prompt, type "gre" and Page up, this will give you "grep root /etc/passwd", the last command that started with "gre". Enter Page up again and it'll show you the previous one. Page down is obviously used to show the next one. I just noticed that the "set -o vi" trick is messing with this one ^_^ Can't tell you why. Thu, 02/05/2009 - 5:43am — MaximB (not verified) ### Nice stuff... There are some GNU/Linux distributions that already use aliases "built-in", like rm, which is "rm -i" in RHEL5. So if you want to ignore the alias for known commands like rm, just type: command rm It will ignore the alias for the command.
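To illustrate the `command rm` tip from that last comment, here are the usual ways to bypass an alias; the alias and file names below are illustrative:

```shell
# Suppose an interactive shell has: alias rm='rm -i'
# Ways to run the real command without the alias:
command rm -f /tmp/somefile   # 'command' skips alias (and function) lookup
\rm -f /tmp/somefile          # a leading backslash suppresses alias expansion
'rm' -f /tmp/somefile         # any quoting of the word does, too
unalias rm                    # or drop the alias for the rest of the session
```

Quoting works because bash only expands aliases on an unquoted first word of a command.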
#### [Feb 22, 2009] 10 shortcuts to master bash - Program - Linux - Builder AU By Guest Contributor, TechRepublic | 2007/06/25 18:30:02 If you've ever typed a command at the Linux shell prompt, you've probably already used bash -- after all, it's the default command shell on most modern GNU/Linux distributions. The bash shell is the primary interface to the Linux operating system -- it accepts, interprets and executes your commands, and provides you with the building blocks for shell scripting and automated task execution. Bash's unassuming exterior hides some very powerful tools and shortcuts. If you're a heavy user of the command line, these can save you a fair bit of typing. This document outlines 10 of the most useful tools: 1. Easily recall previous commands Bash keeps track of the commands you execute in a history buffer, and allows you to recall previous commands by cycling through them with the Up and Down cursor keys. For even faster recall, "speed search" previously-executed commands by typing the first few letters of the command followed by the key combination Ctrl-R; bash will then scan the command history for matching commands and display them on the console. Type Ctrl-R repeatedly to cycle through the entire list of matching commands. 2. Use command aliases If you always run a command with the same set of options, you can have bash create an alias for it. This alias will incorporate the required options, so that you don't need to remember them or manually type them every time. For example, if you always run ls with the -l option to obtain a detailed directory listing, you can use this command: bash> alias ls='ls -l'  To create an alias that automatically includes the -l option. Once this alias has been created, typing ls at the bash prompt will invoke the alias and produce the ls -l output. You can obtain a list of available aliases by invoking alias without any arguments, and you can delete an alias with unalias. 3. 
Use filename auto-completion Bash supports filename auto-completion at the command prompt. To use this feature, type the first few letters of the file name, followed by Tab. bash will scan the current directory, as well as all other directories in the search path, for matches to that name. If a single match is found, bash will automatically complete the filename for you. If multiple matches are found, you will be prompted to choose one. 4. Use key shortcuts to efficiently edit the command line Bash supports a number of keyboard shortcuts for command-line navigation and editing. The Ctrl-A key shortcut moves the cursor to the beginning of the command line, while the Ctrl-E shortcut moves the cursor to the end of the command line. The Ctrl-W shortcut deletes the word immediately before the cursor, while the Ctrl-K shortcut deletes everything immediately after the cursor. You can undo a deletion with Ctrl-Y. 5. Get automatic notification of new mail You can configure bash to automatically notify you of new mail, by setting the MAILPATH variable to point to your local mail spool. For example, the command: bash> MAILPATH='/var/spool/mail/john' bash> export MAILPATH  Causes bash to print a notification on john's console every time a new message is appended to John's mail spool. 6. Run tasks in the background Bash lets you run one or more tasks in the background, and selectively suspend or resume any of the current tasks (or "jobs"). To run a task in the background, add an ampersand (&) to the end of its command line. Here's an example: bash> tail -f /var/log/messages & [1] 614 Each task backgrounded in this manner is assigned a job ID, which is printed to the console. A task can be brought back to the foreground with the command fg jobnumber, where jobnumber is the job ID of the task you wish to bring to the foreground. Here's an example: bash> fg 1 A list of active jobs can be obtained at any time by typing jobs at the bash prompt. 7. 
Quickly jump to frequently-used directories You probably already know that the PATH variable lists bash's "search path" -- the directories it will search when it can't find the requested file in the current directory. However, bash also supports the CDPATH variable, which lists the directories the cd command will look in when attempting to change directories. To use this feature, assign a directory list to the CDPATH variable, as shown in the example below: bash> CDPATH='.:~:/usr/local/apache/htdocs:/disk1/backups' bash> export CDPATH Now, whenever you use the cd command, bash will check all the directories in the CDPATH list for matches to the directory name. 8. Perform calculations Bash can perform simple arithmetic operations at the command prompt. To use this feature, simply type in the arithmetic expression you wish to evaluate at the prompt within double parentheses, as illustrated below. Bash will attempt to perform the calculation and return the answer. bash> echo $((16/2)) 8 9. Customise the shell prompt You can customise the bash shell prompt to display -- among other things -- the current username and host name, the current time, the load average and/or the current working directory. To do this, alter the PS1 variable, as below: bash> PS1='\u@\h:\w \@> ' bash> export PS1 root@medusa:/tmp 03:01 PM> This will display the name of the currently logged-in user, the host name, the current working directory and the current time at the shell prompt. You can obtain a list of symbols understood by bash from its manual page. 10. Get context-specific help Bash comes with help for all built-in commands. To see a list of all built-in commands, type help. To obtain help on a specific command, type help command, where command is the command you need help on. Here's an example: bash> help alias ...some help text... Obviously, you can obtain detailed help on the bash shell by typing man bash at your command prompt at any time.
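A few more arithmetic sketches in the same vein as tip 8 (the values are chosen arbitrarily):

```shell
echo $((16/2))      # integer division: 8
echo $((16%3))      # remainder: 1
echo $((2**10))     # exponentiation: 1024
n=7
echo $((n * 3))     # variables need no leading $ inside $(( )): 21
```

Note that $(( )) does integer arithmetic only; for floating point you would pipe an expression through bc or awk.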
#### How to Be Faster at the Linux Command Line Use Vi or Emacs Editing Mode You can further enhance your ability to edit previous commands using Vi or Emacs keystrokes. For example, the following sets Vi style command line editing: set -o vi After setting Vi mode, try it out by typing a command and hitting Enter. grep root /etc/passwd Then, Up-Arrow once to the same command: Up-Arrow grep root /etc/passwd Now, move the cursor to the 'p' in 'passwd' and hit Esc. grep root /etc/passwd ^ Now, use the Vi cw command to change the word 'passwd' to 'group'. grep root /etc/group For more Vi mode options, see this list of commands available in Vi mode. Alternatively, if you prefer Emacs, use Bash's Emacs mode: set -o emacs Emacs mode provides shortcuts that are available through the Control and Alt keys. For example, Control-A takes you to the beginning of the line and Control-E takes you to the end of the line. Here is a list of commands available in Bash's Emacs mode. Use Aliases and Functions Bash allows for commands, or sets of commands, to be aliased into a single instruction. Your interactive Bash shell should already load some useful aliases from /etc/profile.d/. For one, you probably have ll aliased to ls -l. If you want to see all loaded aliases, run the alias Bash builtin. alias To create an alias, use the alias command: alias ll='ls -l' Here are some other common aliases: alias ls='ls --color=tty' alias l.='ls -d .* --color=auto' alias cp='cp -i' alias mv='mv -i' Note that you can also string together commands. The following will alias gohome to cd, then ls. Note that running cd without any arguments changes directory to your HOME directory. alias gohome='cd; ls' Better yet, only run ls if the cd is successful: alias gohome='cd && ls || echo "error(?) with cd to HOME"' More complex commands can be written into a Bash function. Functions allow you to provide input parameters for a block of code.
For instance, let's say you want to create a backup function that puts a user inputted file into ~/backups. backup() { file=${1:?"error: I need a file to backup"} timestamp=$(date '+%m%d%y') backupdir=~/backups [ -d ${backupdir} ] || mkdir -p ${backupdir} cp -a ${file} ${backupdir}/$(basename ${file}).${timestamp} return $? } Like the example above, use functions to automate small, daily tasks. Here is one I use to set my xterm title. xtitle() { unset PROMPT_COMMAND echo -ne "\033]0;${@}\007" } Of course, you can use functions together with aliases. Here is one I use to set my xterm title to 'MAIL' and then run Mutt. alias mutt='xtitle "MAIL" && /usr/bin/mutt' Finally, to ensure that your custom aliases and functions are available each login, add them to your .bashrc. vim ~/.bashrc
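The `${1:?...}` guard used in a function like backup() above is worth a closer look. A small sketch (the function name and message here are hypothetical):

```shell
# ${1:?msg} aborts the expansion with msg on stderr when no argument is given
need_file() {
    file=${1:?"error: I need a file to backup"}
    echo "would back up: $file"
}
need_file /etc/hosts                          # prints: would back up: /etc/hosts
( need_file ) 2>/dev/null || echo "caught missing argument"
```

The subshell in the last line matters: in a non-interactive shell a failed `${1:?}` expansion exits the whole script, so wrapping the call in `( ... )` confines the exit and lets `||` handle it.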
#### [Mar 30, 2008] Bash tips and tricks « Richard's linux, web design and e-learning collection # Bash tips and tricks for History related preferences # see http://richbradshaw.wordpress.com/2007/11/25/bash-tips-and-tricks/ # == 1 Lost bash history == # the bash history is only saved when you close the terminal, not after each command. fix it.. shopt -s histappend PROMPT_COMMAND='history -a' # == 2. Stupid spelling mistakes == # This will make sure that spelling mistakes such as ect instead of etc are ignored. shopt -s cdspell # == 3. Duplicate entries in bash history == # This will ignore duplicates, as well as ls, bg, fg and exit, making for a cleaner bash history. export HISTIGNORE="&:ls:[bf]g:exit" # == 4 Multiple line commands split up in history == # this will change multiple line commands into single lines for easy editing. shopt -s cmdhist #### My Favorite bash Tips and Tricks One thing you can do is redirect your output to a file. Basic output redirection should be nothing new to anyone who has spent a reasonable amount of time using any UNIX or Linux shell, so I won't go into detail regarding the basics of output redirection. To save the useful output from the find command, you can redirect the output to a file:  find / -name foo > output.txt  You still see the error messages on the screen but not the path of the file you're looking for. Instead, that is placed in the file output.txt. When the find command completes, you can cat the file output.txt to get the location(s) of the file(s) you want. That's an acceptable solution, but there's a better way. Instead of redirecting the standard output to a file, you can redirect the error messages to a file. This can be done by placing a 2 directly in front of the redirection angle bracket. If you are not interested in the error messages, you simply can send them to /dev/null:  find / -name foo 2> /dev/null  This shows you the location of file foo, if it exists, without those pesky permission denied error messages.
I almost always invoke the find command in this way. The number 2 represents the standard error output stream. Standard error is where most commands send their error messages. Normal (non-error) output is sent to standard output, which can be represented by the number 1. Because most redirected output is the standard output, output redirection works only on the standard output stream by default. This makes the following two commands equivalent:  find / -name foo > output.txt  find / -name foo 1> output.txt  Sometimes you might want to save both the error messages and the standard output to file. This often is done with cron jobs, when you want to save all the output to a log file. This also can be done by directing both output streams to the same file:  find / -name foo > output.txt 2> output.txt  This works, but again, there's a better way to do it. You can tie the standard error stream to the standard output stream using an ampersand. Once you do this, the error messages go to wherever you redirect the standard output:  find / -name foo > output.txt 2>&1  One caveat about doing this is that the tying operation goes at the end of the command generating the output. This is important if piping the output to another command. This line works as expected: find -name test.sh 2>&1 | tee /tmp/output2.txt  but this line doesn't: find -name test.sh | tee /tmp/output2.txt 2>&1  and neither does this one: find -name test.sh 2>&1 > /tmp/output.txt  I started this discussion on output redirection using the find command as an example, and all the examples used the find command. This discussion isn't limited to the output of find, however. Many other commands can generate enough error messages to obscure the one or two lines of output you need. Output redirection isn't limited to bash, either. All UNIX/Linux shells support output redirection using the same syntax. #### Bash Tip #2 Bash bang commands can be used for shortcuts too. • !! = last line in history • !* = all args from last line in history • !$ = last arg from last line in history • !^ = first arg from last line in history I really only use !$ with the cd command. Here are some examples, although some are not really useful. Just to give you an idea of what it does: 1. which php (maybe it outputs /usr/local/bin/php) 2. `!!` /path/to/php_script.php (executes php on the script) #### Bash Tips and Tricks 'cd' with style Something you may have seen before in other systems (the much maligned SCO OSes, for example) is this handy option: shopt -s cdspell "This will correct minor spelling errors in a 'cd' command, so that instances of transposed characters, missing characters and extra characters are corrected without the need for retyping." #### [Mar 20, 2008] bash Tricks From the Developers of the O'Reilly Network - O'Reilly ONLamp Blog No more worrying about cases The best bash tip I can share is very helpful when working on systems that don't allow filenames to differ only in case (like OSX and Windows): create a file called .inputrc in your home directory and put this line in it: set completion-ignore-case on Now bash tab-completion won't worry about case in filenames. Thus 'cd sit[tab]' would complete to 'cd Sites/' Last argument You can also use Esc-period and get the last parm of the previous line. You can repeatedly use Esc-period to scroll back through time with them. That turns out to be even better than !$ because you can edit it once it shows up on your command line. Should be !$ Instead of $!, use !$, it works much better. :)  echo asdf asdf  echo $! (prints nothing)  echo asdf asdf  echo !$ echo asdf asdf  So $! is an empty variable (it holds the PID of the last background job, if any), while !$ brings back the last argument from the last command. Command substitution  for s in `cat server.list`; do ssh $s uptime; done; Command substitution is also done using $(command) notation, which I prefer to the backquotes.
It allows commands to be nested (backquotes allow that too, but the inner backquotes must be escaped using backslashes, which gets messy). For example:  for s in $(cat server.list); do echo "$s: $(ssh $s uptime)"; done; or: # get the uptime for just the first server  echo "$(date): $(ssh $(head -1 server.list) uptime)" ===== More key bindings and tricks Bash will keep a history of the directories you visit, you just have to ask. You can also always go back to the previous directory you were in by typing cd - without the need to pushd the current directory. Using it more than once cycles between the current and previous directory. CTRL-A takes you to the beginning of the line and CTRL-E takes you to the end of the line. This is probably basic shell knowledge; it's actually common readline/emacs knowledge, and it works in many more programs than just Bash or a terminal. For instance, you can enable them in Gnome applications by adding the line gtk-key-theme-name = "Emacs" to the ~/.gtkrc-2.0 file. Other handy key bindings you can use are: • ctrl-u : Cut everything on the current line before the cursor. • ctrl-y : 'Yank' (paste) text that was cut using ctrl-u. • ctrl-w : Delete the word on the left of the cursor There's so much useful knowledge hidden in Bash that, if you spend any time at the command line, you should really get acquainted with it. It saves incredible amounts of time. Take for example something I wanted to do yesterday. I wanted to know the number of hits on a certain website. I could have installed a tool to parse the Apache access.log, but this was much easier:  cat access.log | cut -d"[" -f2 | cut -d"]" -f1 | cut -d"/" -f2 | uniq -c 28905 Mar 16554 Apr Takes no more than a couple of seconds to write, but saves so much time. Try reading through the Bash man page. It's huge, but think of all the stuff you'll learn! Or read some online Bash scripting tutorials.
Everything from gathering statistics from files to creating thumbnails of images (from the top of my head: for A in *; do convert "$A" -resize 140x140 "th_$A"; done) becomes a cinch.

#### [Dec 9, 2007] Cool Solutions: Bash - Making use of your .bashrc file

Good sample bashrc file (Novell Cool Solutions: Cool Tool). In Brief: A sample .bashrc file.
Vitals: Product Categories: Open Enterprise Server, SUSE Linux, SUSE Linux Enterprise Desktop, SUSE Linux Enterprise Server. Functional Categories: BASH, Shortcuts, Workgroup. Updated: 23 Oct 2006. File Size: 6.9KB. License: GPL. Download: /coolsolutions/tools/downloads/bashrc.txt. Publisher: David Crouse.

Details

I was playing with my .bashrc file again, and was once again impressed by how you can tweak Linux to do what YOU want it to do so easily. I am sure there are tons of other tweaks you can do to your .bashrc file, but I really like some of mine, and thought I would share them. Some of the alias's I created, some I found on the net, and some things in my .bashrc file are just there for fun, like the "# WELCOME SCREEN"; although it does serve a purpose for me at the same time, it might not be something everyone would want or need. For those that don't know what a .bashrc file does: "The ~/.bashrc file determines the behavior of interactive shells." Quoted from: The Advanced Bash Scripting Guide. Basically, it allows you to create shortcuts (alias's) and interactive programs (functions) that run on the startup of the bash shell or that are used when running an interactive shell. For example, it's much easier to just type ebrc instead of pico ~/.bashrc (I used the alias ebrc, and it stands for "Edit Bash RC file". I could have also aliased it to just use one letter, making it a VERY fast shortcut.) The bashrc file allows you to create alias's (shortcuts) to almost anything you want. My list is pretty long, but I'm sure there is someone with a longer list ;) I have my .bashrc file setup in sections. The following is the breakdown by section of how I keep my list of alias's and functions separated. This is just how I do this; your .bashrc file can be modified to suit YOUR needs, that's the interesting part about the .bashrc file. It's VERY customizable and very easy to change.
Header (So I know when i modified it last and what i was running it on) Exports (So I can set history size, paths , editors, define colors, etc,) Sourced Alias's (So I can find those hidden alias's faster) Workstation Alias's (so i can ssh to local machines quickly) Remote Server Alias's (so i can ssh to remote servers easily) Script Alias's (quick links to some of my bashscripts) Hardware control alias's (so I can control cd/dvd/scanners/audio/etc) Modified commands (Alias's to normal linux commands with special flags) Chmod Alias's (makes changing permissions faster) Alias's for GUI programs (start firefox, etc from command line) Alias's for xterm and others (open xterm with special settings) Alias's for Lynx (open lynx with urls - kind of a bash bookmark ;) ) UNused Alias's (Alias's that aren't in use on the system, but that i might use later) Special functions (more of a function than just an alias..it goes here) Notes (that should be self explanatory ;) ) Welcome Screen (code to make my bash shell display some stuff as it starts up) That's how I lay out my .bashrc files. It may not be perfect, but it works well for me. I like making changes in just my .bashrc file and not the global files. I like the .bashrc file because you don't need root permissions to make changes that make your life easier at the bash shell. The following is my .bashrc file (with some things obviously commented out for security... but most of it should be self explanatory). Anyone with comments/suggestions/ideas feel free to let me know. I'm always looking for new and interesting things to do with the .bashrc file. Want to know what alias's your bash shell has? Simply type the word alias at the command line. The shell will then print out the list of active alias's to the standard output (normally your screen). 
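As the text says, a bare alias lists everything that is active; a quick sketch (ebrc here follows the author's naming, but nano is an arbitrary editor choice for the demo):

```shell
# Define a couple of shortcuts, then inspect them.
alias ebrc='nano ~/.bashrc'   # "Edit Bash RC file"
alias ll='ls -l'

alias          # prints every active alias definition
alias ebrc     # prints just this one definition
unalias ll     # removes an alias again
```

Typing alias with no arguments at any interactive prompt gives the same listing.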
#######################################################
# Dave Crouse's .bashrc file
# www.bashscripts.org
# www.usalug.org
#
# Last Modified 04-08-2006
# Running on OpenSUSE 10
#######################################################

# EXPORTS
#######################################################
PATH=$PATH:/usr/lib/festival/ ; export PATH
export PS1="[\[\033[1;34m\]\w\[\033[0m\]]\n[\t \u] "
export EDITOR=/usr/bin/pico
export HISTFILESIZE=3000 # the bash history should save 3000 commands
export HISTCONTROL=ignoredups # don't put duplicate lines in the history.
alias hist='history | grep $1' # Requires one input

# Define a few Color's
BLACK='\e[0;30m'
BLUE='\e[0;34m'
GREEN='\e[0;32m'
CYAN='\e[0;36m'
RED='\e[0;31m'
PURPLE='\e[0;35m'
BROWN='\e[0;33m'
LIGHTGRAY='\e[0;37m'
DARKGRAY='\e[1;30m'
LIGHTBLUE='\e[1;34m'
LIGHTGREEN='\e[1;32m'
LIGHTCYAN='\e[1;36m'
LIGHTRED='\e[1;31m'
LIGHTPURPLE='\e[1;35m'
YELLOW='\e[1;33m'
WHITE='\e[1;37m'
NC='\e[0m' # No Color

# Sample Command using color:
echo -e "${CYAN}This is BASH ${RED}${BASH_VERSION%.*}${CYAN} - DISPLAY on ${RED}$DISPLAY${NC}\n"

# SOURCED ALIAS'S AND SCRIPTS
#######################################################
### Begin insertion of bbips alias's ###
source ~/.bbips/commandline/bbipsbashrc
### END bbips alias's ###

# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi

# enable programmable completion features
if [ -f /etc/bash_completion ]; then
. /etc/bash_completion
fi

# ALIAS'S OF ALL TYPES SHAPES AND FORMS ;)
#######################################################
# Alias's to local workstations
alias tom='ssh 192.168.2.102 -l root'
alias jason='ssh 192.168.2.103 -l root'
alias randy='ssh 192.168.2.104 -l root'
alias bob='ssh 192.168.2.105 -l root'
alias don='ssh 192.168.2.106 -l root'
alias counter='ssh 192.168.2.107 -l root'

# ALIAS TO REMOTE SERVERS
alias ANYNAMEHERE='ssh YOURWEBSITE.com -l USERNAME -p PORTNUMBERHERE'
# My server info removed from above for obvious reasons ;)

# Alias's to TN5250 programs.
# AS400 access commands.
alias d1='xt5250 env.TERM=IBM-3477-FC env.DEVNAME=D1 192.168.2.5 &'
alias d2='xt5250 env.TERM=IBM-3477-FC env.DEVNAME=D2 192.168.2.5 &'
alias tn5250j='nohup java -jar /home/crouse/tn5250j/lib/tn5250j.jar 2>>error.log &'

# Alias's to some of my BashScripts
alias bics='sh /home/crouse/scripts/bics/bics.sh'
alias backup='sh /home/crouse/scripts/usalugbackup.sh'
alias calc='sh /home/crouse/scripts/bashcalc.sh'
alias makepdf='sh /home/crouse/scripts/makepdf.sh'
alias phonebook='sh /home/crouse/scripts/PHONEBOOK/baps.sh'
alias pb='sh /home/crouse/scripts/PHONEBOOK/baps.sh'
alias ppe='/home/crouse/scripts/passphraseencryption.sh'
alias scripts='cd /home/crouse/scripts'

# Alias's to control hardware
alias cdo='eject /dev/cdrecorder'
alias cdc='eject -t /dev/cdrecorder'
alias dvdo='eject /dev/dvd'
alias dvdc='eject -t /dev/dvd'
alias scan='scanimage -L'
alias playw='for i in *.wav; do play $i; done'
alias playo='for i in *.ogg; do play $i; done'
alias playm='for i in *.mp3; do play $i; done'
alias copydisk='dd if=/dev/dvd of=/dev/cdrecorder' # Copies bit by bit from dvd to cdrecorder drives.
alias dvdrip='vobcopy -i /dev/dvd/ -o ~/DVDs/ -l'

# Alias's to modified commands
alias ps='ps auxf'
alias home='cd ~'
alias pg='ps aux | grep' # requires an argument
alias un='tar -zxvf'
alias mountedinfo='df -hT'
alias ping='ping -c 10'
alias openports='netstat -nape --inet'
alias ns='netstat -alnp --protocol=inet | grep -v CLOSE_WAIT | cut -c-6,21-94 | tail +2'
alias du1='du -h --max-depth=1'
alias da='date "+%Y-%m-%d %A %T %Z"'
alias ebrc='pico ~/.bashrc'

# Alias to multiple ls commands
alias la='ls -Al' # show hidden files
alias ls='ls -aF --color=always' # add colors and file type extensions
alias lx='ls -lXB' # sort by extension
alias lk='ls -lSr' # sort by size
alias lc='ls -lcr' # sort by change time
alias lu='ls -lur' # sort by access time
alias lr='ls -lR' # recursive ls
alias lt='ls -ltr' # sort by date
alias lm='ls -al |more' # pipe through 'more'

# Alias chmod commands
alias mx='chmod a+x'
alias 000='chmod 000'
alias 644='chmod 644'
alias 755='chmod 755'

# Alias Shortcuts to graphical programs.
alias kwrite='kwrite 2>/dev/null &'
alias firefox='firefox 2>/dev/null &'
alias gaim='gaim 2>/dev/null &'
alias kate='kate 2>/dev/null &'
alias suk='kdesu konqueror 2>/dev/null &'

# Alias xterm and aterm
alias term='xterm -bg AntiqueWhite -fg Black &'
alias termb='xterm -bg AntiqueWhite -fg NavyBlue &'
alias termg='xterm -bg AntiqueWhite -fg OliveDrab &'
alias termr='xterm -bg AntiqueWhite -fg DarkRed &'
alias aterm='aterm -ls -fg gray -bg black'
alias xtop='xterm -fn 6x13 -bg LightSlateGray -fg black -e top &'
alias xsu='xterm -fn 7x14 -bg DarkOrange4 -fg white -e su &'

# Alias for lynx web browser
alias bbc='lynx -term=vt100 http://news.bbc.co.uk/text_only.stm'
alias nytimes='lynx -term=vt100 http://nytimes.com'
alias dmregister='lynx -term=vt100 http://desmoinesregister.com'

# SOME OF MY UNUSED ALIAS's
#######################################################
# alias d=echo "Good Morning Dave.
today's date is" | festival --tts; date +'%A %B %e' | festival --tts
# alias shrink84='/home/crouse/shrink84/shrink84.sh'
# alias tl='tail -f /var/log/apache/access.log'
# alias te='tail -f /var/log/apache/error.log'

# SPECIAL FUNCTIONS
#######################################################
netinfo () {
echo "--------------- Network Information ---------------"
/sbin/ifconfig | awk /'inet addr/ {print $2}'
echo ""
/sbin/ifconfig | awk /'Bcast/ {print $3}'
echo ""
/sbin/ifconfig | awk /'inet addr/ {print $4}'
# /sbin/ifconfig | awk /'HWaddr/ {print $4,$5}'
echo "---------------------------------------------------"
}

spin () {
echo -ne "${RED}-"
echo -ne "${WHITE}\b|"
echo -ne "${BLUE}\bx"
sleep .02
echo -ne "${RED}\b+${NC}"
}

scpsend () {
scp -P PORTNUMBERHERE "$@" USERNAME@YOURWEBSITE.com:/var/www/html/pathtodirectoryonremoteserver/;
}

# NOTES
#######################################################
# To temporarily bypass an alias, we precede the command with a \
# EG: the ls command is aliased, but to use the normal ls command you would
# type \ls
# mount -o loop /home/crouse/NAMEOFISO.iso /home/crouse/ISOMOUNTDIR/
# umount /home/crouse/NAMEOFISO.iso
# Both commands done as root only.

# WELCOME SCREEN
#######################################################
clear
for i in `seq 1 15` ; do spin; done ; echo -ne "${WHITE} USA Linux Users Group ${NC}"; for i in `seq 1 15` ; do spin; done ; echo "";
echo -e ${LIGHTBLUE}`cat /etc/SUSE-release` ;
echo -e "Kernel Information: " `uname -smr`;
echo -e ${LIGHTBLUE}`bash --version`; echo ""
echo -ne "Hello $USER today is "; date
echo -e "${WHITE}"; cal ; echo "";
echo -ne "${CYAN}"; netinfo ; mountedinfo ; echo ""
echo -ne "${LIGHTBLUE}Uptime for this computer is "; uptime | awk /'up/ {print $3,$4}'
for i in `seq 1 15` ; do spin; done ; echo -ne "${WHITE} http://usalug.org ${NC}"; for i in `seq 1 15` ; do spin; done ; echo ""; echo ""; echo ""

The following belong under the "function" section in my .bashrc.
Usable as separate programs, I've integrated them simply as functions for my .bashrc file in order to make them quick to use and easy to modify and find. These are functions that are used to symmetrically encrypt and to decrypt files and messages. Some are completely command line, and the last two create gui interfaces to locate the files to encrypt/decrypt. If you create a program out of the functions, creating a link via a shortcut/icon on the desktop would create a completely gui based interface to locate and encrypt/decrypt files. Either way, it's an easy way to use gpg. Requires: zenity, gpg

################### Begin gpg functions ##################
encrypt () {
# Use ascii armor
gpg -ac --no-options "$1"
}

bencrypt () {
# No ascii armor
# Encrypt binary data. jpegs/gifs/vobs/etc.
gpg -c --no-options "$1"
}

decrypt () {
gpg --no-options "$1"
}

pe () {
# Passphrase encryption program
# Created by Dave Crouse 01-13-2006
# Reads input from text editor and encrypts to screen.
clear
echo " Passphrase Encryption Program";
echo "--------------------------------------------------"; echo "";
which $EDITOR &>/dev/null
if [ $? != "0" ];
then
echo "It appears that you do not have a text editor set in your .bashrc file.";
echo "What editor would you like to use ?
" ; read EDITOR ; echo ""; fi
echo "Enter the name/comment for this message :"
read comment
$EDITOR passphraseencryption
gpg --armor --comment "$comment" --no-options --output passphraseencryption.gpg --symmetric passphraseencryption
shred -u passphraseencryption ; clear
echo "Outputting passphrase encrypted message"; echo "" ; echo "" ;
cat passphraseencryption.gpg ; echo "" ; echo "" ;
shred -u passphraseencryption.gpg ;
read -p "Hit enter to exit" temp; clear
}

keys () {
# Opens up kgpg keymanager
kgpg -k
}

encryptfile () {
zenity --title="zcrypt: Select a file to encrypt" --file-selection > zcrypt
encryptthisfile=`cat zcrypt`; rm zcrypt
# Use ascii armor
# --no-options (for NO gui usage)
gpg -acq --yes ${encryptthisfile}
zenity --info --title "File Encrypted" --text "${encryptthisfile} has been encrypted"
}

decryptfile () {
zenity --title="zcrypt: Select a file to decrypt" --file-selection > zcrypt
decryptthisfile=`cat zcrypt`; rm zcrypt
# NOTE: This will OVERWRITE existing files with the same name !!!
gpg --yes -q ${decryptthisfile}
zenity --info --title "File Decrypted" --text "${decryptthisfile} has been decrypted"
}
################### End gpg functions ##################

Reader Comments: cool man, really cool. i love such stuffs you know. working in the command line makes you feel like a real linux geek it's really cool. good job.
#### xargs, find and several useful shortcuts

See also Unix Xargs and Unix Find Command pages.

Re: pushd and popd (and other tricks) (Score:2) by Ramses0 (63476) on Wednesday March 10, @07:39PM (#8527252)

My favorite "Nifty" was when I spent the time to learn about "xargs" (I pronounce it zargs), and brush up on "for" syntax.

 ls | xargs -n 1 echo "ZZZ> "

Basically indents (prefixes) everything with a "ZZZ" string. Not really useful, right? But since it invokes the echo command (or whatever command you specify) n times (where n is the number of lines passed to it) this saves me from having to write a lot of crappy little shell scripts sometimes. A more serious example is:

 find . -name \*.jsp | sed 's/^\./http:\/\/127.0.0.1/g' | xargs -n 1 wget

...will find all your jsp's, map them to your localhost webserver, and invoke a wget (fetch) on them. Voila, precompiled JSP's.
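The "ZZZ>" prefix trick is easy to reproduce in a scratch directory (directory and file names below are made up for the demo):

```shell
# Set up a throwaway directory with a few files.
mkdir -p /tmp/xargs-demo && cd /tmp/xargs-demo
touch alpha beta gamma

# xargs -n 1 invokes echo once per input line, so every
# file name comes back prefixed.
ls | xargs -n 1 echo "ZZZ>"
```

Each line of ls output appears as "ZZZ> alpha", "ZZZ> beta", and so on; swap echo for any other command to run it once per file.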
Another:

 for f in `find -name \*.jsp` ; do echo "==> $f" >> out.txt ; grep "TODO" $f >> out.txt ; done

...this searches JSP's for "TODO" lines and appends them all to a file with a header showing what file they came from (yeah, I know grep can do this, but it's an example. What if grep couldn't?) ...and finally...

 ( echo "These were the command line params"
 echo "---------"
 for f in $@ ; do echo "Param: $f" ; done ) | mail -s "List" you@you.com

...the parentheses let you build up lists of things (like interestingly formatted text) and it gets returned as a chunk, ready to be passed on to some other shell processing function. Shell scripting has saved me a lot of time in my life, which I am grateful for. :^)

#### [May 7, 2007] To strip file extensions in bash, like this.rbl --> this

 fname=${file%.rbl}

#### Last argument reuse

 tail -f /tmp/foo
 rm !$ # !$ is the last argument to the previous command.

#### Correction sed style

 grep 'wibble' afile | lwss # typo: meant to type less
 !!:s/lw/le # !! is last command string, :s does sed-style modification. :gs does a global replace
 # or for simpler corrections
 cat .bash_profilx # typo - meant the x to be an e
 ^x^e # repeat last command, substituting x for e
 touch a{1,2,3,4}b # brace gets expanded to a1b a2b a3b a4b so 4 files get touched
 cp file{,.old} # brace gets expanded to file file.old , thus creating a backup.

• shell variables

CDPATH This is a little known and very underrated shell variable. CDPATH does for the cd built-in what PATH does for executables. By setting this wisely, you can cut down on the number of key-strokes you enter per day. Try this:

 export CDPATH=.:~:~/docs:~/src:~/src/ops/docs:/mnt:/usr/src/redhat:/usr/src/redhat/RPMS:/usr/src:/usr/lib:/usr/local:/software:/software/redhat

Using this, cd i386 would likely take you to /usr/src/redhat/RPMS/i386 on a Red Hat Linux system.
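CDPATH behaves exactly like PATH for cd, which a throwaway directory tree makes easy to verify (the paths below are invented for the sketch):

```shell
# Build a demo tree and point CDPATH at part of it.
mkdir -p /tmp/cdpath-demo/src/ops/docs
export CDPATH=.:/tmp/cdpath-demo/src/ops

# "docs" does not exist in the current directory, so bash
# searches the CDPATH entries and lands in the demo tree.
cd docs
pwd
```

When cd resolves a directory through CDPATH rather than the current directory, it also prints the full path it chose, so you always know where you ended up.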
HISTIGNORE Set this to avoid having consecutive duplicate commands and other not so useful information appended to the history list. This will cut down on hitting the up arrow endlessly to get to the command before the one you just entered twenty times. It will also avoid filling a large percentage of your history list with useless commands. Try this:

 export HISTIGNORE="&:ls:ls *:mutt:[bf]g:exit"

Using this, consecutive duplicate commands, invocations of ls, executions of the mutt mail client without any additional parameters, plus calls to the bg, fg and exit built-ins will not be appended to the history list. MAILPATH bash will warn you of new mail in any folder appended to MAILPATH. This is very handy if you use a tool like procmail to presort your e-mail into folders. Try adding the following to your ~/.bash_profile to be notified when any new mail is deposited in any mailbox under ~/Mail.

 MAILPATH=/var/spool/mail/$USER
 for i in `echo ~/Mail/[^.]*`
 do
 MAILPATH=$MAILPATH:$i
 done
 export MAILPATH
 unset i

If you use mutt and many of those folders don't receive automatically filtered mail, you may prefer to have bash alert you only when new e-mail arrives in a folder that you also track in mutt. In that case, try something like the following in your ~/.bash_profile:

 export `perl -ne 's/^mailboxes /MAILPATH=/ && tr/ /:/ && print && exit' < ~/.muttrc`

TMOUT If you set this to a value greater than zero, bash will terminate after this number of seconds have elapsed if no input arrives. This setting is useful in root's environment to reduce the potential security risk of someone forgetting to log out as the superuser. • set options ignoreeof Ordinarily, issuing Ctrl-D at the prompt will log you out of an interactive shell. This can be annoying if you regularly need to type Ctrl-D in other situations, for example, when trying to disconnect from a Telnet session. In such a situation, hitting Ctrl-D once too often will close your shell, which can be very frustrating.
This option disables the use of Ctrl-D to exit the shell. • shopt options You can set each of the options below with shopt -s <option>. cdspell This will correct minor spelling errors in a cd command, so that instances of transposed characters, missing characters and extra characters are corrected without the need for retyping. cmdhist This is very much a matter of taste. Defining this will cause multi-line commands to be appended to your bash history as a single line command. This makes for easy command editing. dotglob This one allows files beginning with a dot ('.') to be returned in the results of path-name expansion. extglob This will give you ksh-88 egrep-style extended pattern matching or, in other words, turbo-charged pattern matching within bash. The available operators are: ?(pattern-list) Matches zero or one occurrence of the given patterns *(pattern-list) Matches zero or more occurrences of the given patterns +(pattern-list) Matches one or more occurrences of the given patterns @(pattern-list) Matches exactly one of the given patterns !(pattern-list) Matches anything except one of the given patterns Here's an example. Say, you wanted to install all RPMs in a given directory, except those built for the noarch architecture. You might use something like this: rpm -Uvh /usr/src/RPMS/!(*noarch*) These expressions can be nested, too, so if you wanted a directory listing of all non PDF and PostScript files in the current directory, you might do this: ls -lad !(*.p?(df|s)) ## readline Tips and Tricks The readline library is used by bash and many other programs to read a line from the terminal, allowing the user to edit the line with standard Emacs editing keys. • set show-all-if-ambiguous on If you have this in your /etc/inputrc or ~/.inputrc, you will no longer have to hit the <Tab> key twice to produce a list of all possible completions. A single <Tab> will suffice. This setting is highly recommended. 
• set visible-stats on Adding this to your /etc/inputrc or ~/.inputrc will result in a character being appended to any file-names returned by completion, in much the same way as ls -F works. • If you're a fan of vi as opposed to Emacs, you might prefer to operate bash in vi editing mode. Being a GNU program, bash uses Emacs bindings unless you specify otherwise. Set the following in your /etc/inputrc or ~/.inputrc: set editing-mode vi set keymap vi and this in your /etc/bashrc or ~/.bashrc: set -o vi

#### Set Vi Mode in Bash

 # set -o vi

Vi mode allows for the use of vi-like commands when at the bash prompt. When set to this mode initially you will be in insert mode (you will be able to type at the prompt, unlike when you enter vi). Hitting the escape key takes you into command mode. Commands to take advantage of bash's Vi Mode:

 h Move cursor left
 l Move cursor right
 A Move cursor to end of line and put in insert mode
 0 (zero) Move cursor to beginning of line (doesn't put in insert mode)
 i Put into insert mode at current position
 a Put into insert mode after current position
 dd Delete line (saved for pasting)
 D Delete text after current cursor position (saved for pasting)
 p Paste text that was deleted
 k Move up through history commands
 j Move down through history commands
 u Undo

### Useful Commands and Features

The commands in this section are non-mode specific, unlike the ones listed above.

#### Flip the Last Two Characters

If you type like me your fingers spit characters out in the wrong order on occasion. ctrl-t swaps the order that the last two characters appear in.

#### Searching Bash History

As you enter commands at the CLI they are saved in a file ~/.bash_history. From the bash prompt you can browse the most recently used commands through the least recently used commands by pressing the up arrow. Pressing the down arrow does the opposite. If you have entered a command a long time ago and need to execute it again you can search for it.
Type the command 'ctrl-r' and enter the text you want to search for.

#### Dealing with Spaces

First, I will mention a few ways to deal with spaces in directory names, file names, and everywhere else.

#### Using the Backslash Escape Sequence

One option is to use bash's escape character \. Any space following the backslash is treated as being part of the same string. These commands create a directory called "foo bar" and then remove it.

 # mkdir foo\ bar
 # rmdir foo\ bar

The backslash escape sequence can also be used to decode commands embedded in strings which can be very useful for scripting or modifying the command prompt as discussed later.

#### Using Single/Double Quotes with Spaces and Variables

Single and double quotes can also be used for dealing with spaces.

 # touch 'dog poo'
 # rm "dog poo"

The difference between single and double quotes is that in double quotes the $, \, and ` characters still preserve their special meanings. Single quotes will take the $ and \ literally and regard the ' as the end of the string. Here's an example:

 # MY_VAR='This is my text'
 # echo $MY_VAR
 This is my text
 # echo "$MY_VAR"
 This is my text
 # echo '$MY_VAR'
 $MY_VAR

The string following the $ character is interpreted as being a variable except when enclosed in single quotes as shown above.

#### Lists Using { and }

The characters { and } allow for list creation. In other words you can have a command be executed on each item in the list. This is perhaps best explained with examples:

 # touch {temp1,temp2,temp3,temp4}

This will create/modify the files temp1, temp2, temp3, and temp4, and as in the example above when the files share common parts of the name you can do:

 # mv temp{1,2,3,4} ./foo\ bar/

This will move all four of the files into a directory 'foo bar'.

#### Executing Multiple Commands in Sequence

This is a hefty title for a simple task.
Consider that you want to run three commands, one right after the other, and you do not want to wait for each to finish before typing the next. You can type all three commands on a line and then start the process:

 # ./configure; make; make install
 OR
 # ./configure && make && make install

(The difference: with ; each command runs unconditionally, while with && each command runs only if the previous one succeeded.) I often use these formats in crontab files for commands that need to be executed in sequence if I choose not to make a script.

#### Piping Output from One Command to Another

Piping allows the user to do several fantastic things by combining utilities. I will cover only very basic uses for piping. I most commonly use the pipe operator, |, to pipe text that is output from one command through the grep command to search for text. Examples:

 See if a program, centericq, is running:
 # ps ax | grep centericq
 25824 pts/2 S 0:18 centericq

 Count the number of files in a directory (nl numbers lines):
 # ls | nl
 1 #.emacs#
 2 BitchX
 3 Outcast double cd.lst
 4 bm.shader
 5 bmtexturesbase.pk3

 If my memory serves, using RPM to check if a package is installed:
 # rpm -qa | grep package_name

 A more advanced example:
 # cat /etc/passwd | awk -F: '{print $1 "\t" $6}' | sort > ./users

This sequence takes the information in the file passwd, pipes it to awk, which takes the first and sixth fields (the user name and home directory respectively), pipes these fields separated by a tab ("\t") to sort, which sorts the list alphabetically, and puts it into a file called users.

### Aliasing Commands

Once again I like how this topic is covered on freeunix.dyndns.org:8088 in "Customizing your Bash environment". I will quote the section entitled "Aliasses": If you have used UNIX for a while, you will know that there are many commands available and that some of them have very cryptic names and/or can be invoked with a truckload of options and arguments. So, it would be nice to have a feature allowing you to rename these commands or type something simple instead of a list of options.
Bash provides such a feature: the alias. Aliasses can be defined on the command line, in .bash_profile, or in .bashrc, using this form:

 alias name=command

This means that name is an alias for command. Whenever name is typed as a command, Bash will substitute command in its place. Note that there are no spaces on either side of the equal sign. Quotes around command are necessary if the string being aliassed consists of more than one word. A few examples:

 alias ls='ls -aF --color=always'
 alias ll='ls -l'
 alias search=grep
 alias mcd='mount /mnt/cdrom'
 alias ucd='umount /mnt/cdrom'
 alias mc='mc -c'
 alias ..='cd ..'
 alias ...='cd ../..'

The first example ensures that ls always uses color if available, that dotfiles are listed as well, that directories are marked with a / and executables with a *. To make ls do the same on FreeBSD, the alias would become:

 alias ls='/bin/ls -aFG'

To see what aliasses are currently active, simply type alias at the command prompt and all active aliasses will be listed. To "disable" an alias type unalias followed by the alias name.

### Altering the Command Prompt Look and Information

Bash has the ability to change how the command prompt is displayed in information as well as colour. This is done by setting the PS1 variable. There is also a PS2 variable. It controls what is displayed after a second line of prompt is added and is usually by default '> '. The PS1 variable is usually set to show some useful information by the Linux distribution you are running but you may want to earn style points by doing your own modifications. Here are the backslash-escape special characters that have meaning to bash:

 \a an ASCII bell character (07)
 \d the date in "Weekday Month Date" format (e.g., "Tue May 26")
 \e an ASCII escape character (033)
 \h the hostname up to the first '.'
 \H the hostname
 \j the number of jobs currently managed by the shell
 \l the basename of the shell's terminal device name
 \n newline
 \r carriage return
 \s the name of the shell, the basename of $0 (the portion following the final slash)
 \t the current time in 24-hour HH:MM:SS format
 \T the current time in 12-hour HH:MM:SS format
 \@ the current time in 12-hour am/pm format
 \u the username of the current user
 \v the version of bash (e.g., 2.00)
 \V the release of bash, version + patchlevel (e.g., 2.00.0)
 \w the current working directory
 \W the basename of the current working directory
 \! the history number of this command
 \# the command number of this command
 \$ if the effective UID is 0, a #, otherwise a $
 \nnn the character corresponding to the octal number nnn
 \\ a backslash
 \[ begin a sequence of non-printing characters, which could be used to embed a terminal control sequence into the prompt
 \] end a sequence of non-printing characters

Colours In Bash:

 Black    0;30  Dark Gray  1;30 Blue     0;34  Light Blue 1;34 Green    0;32  Light Green 1;32 Cyan     0;36  Light Cyan 1;36 Red      0;31  Light Red  1;31 Purple   0;35  Light Purple 1;35 Brown    0;33  Yellow     1;33 Light Gray 0;37  White      1;37 
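The codes in the table are not limited to prompts; echo -e interprets them anywhere, which makes them easy to try out (variable names below mirror the colour definitions used earlier in this page):

```shell
# Same escape codes as in the colour table above.
RED='\e[0;31m'
LIGHTGREEN='\e[1;32m'
NC='\e[0m'    # reset back to the terminal default

echo -e "${RED}error:${NC} build failed"
echo -e "${LIGHTGREEN}ok:${NC} all tests passed"
```

The text between a colour code and the reset is rendered in that colour; forgetting the trailing ${NC} leaves the rest of the terminal output coloured too.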

Here is an example borrowed from the Bash-Prompt-HOWTO:

    PS1="\[\033[1;34m\][\$(date +%H%M)][\u@\h:\w]\[\033[0m\] "

This turns the text blue, displays the time in brackets (very useful for not losing track of time while working), and displays the user name, host, and current directory enclosed in brackets. The trailing "\[\033[0m\]" returns the colour to the default foreground colour. How about a command prompt modification that's a bit more "pretty":

    PS1="\[\033[1;30m\][\[\033[1;34m\]\u\[\033[1;30m\]@\[\033[0;35m\]\h\[\033[1;30m\]] \[\033[0;37m\]\W \[\033[1;30m\]\$\[\033[0m\] "

This one sets up a prompt like this: [user@host] directory $

Breakdown:

    \[\033[1;30m\]   sets the colour for the characters that follow it; here 1;30 sets them to Dark Gray
    \u \h \W \$      see the table above
    \[\033[0m\]      sets the colours back to the default

Each user on a system can have their own customized prompt by setting the PS1 variable in either the .bashrc or .profile files located in their home directories.
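As a minimal sketch of such a per-user customization (the colour choices here are arbitrary, not a recommendation), the following lines in ~/.bashrc give a green user@host and a blue working directory. The \[ and \] wrappers tell readline that the enclosed escape sequences occupy no screen columns, so line editing and history recall do not misplace the cursor:

```shell
# Green user@host, blue working directory, then $ (or # for root).
# \[ ... \] mark the escape sequences as zero-width for readline.
PS1='\[\033[1;32m\]\u@\h\[\033[0m\]:\[\033[1;34m\]\w\[\033[0m\]\$ '
export PS1
```

Leaving out the \[ \] pair is the most common cause of prompts that "eat" characters when you recall long commands from history.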

#### FUN STUFF!

A quick note about Bashish: it allows for adding themes to a terminal running under a GUI. Check out its site for some screenshots of what it can do.

Also, the program fortune is a must [at least I have considered it so ever since my Slackware days, since Slackware included it by default]. It doesn't have anything to do with bash; it is a program that outputs a quote to the screen. Several add-ons are available to make it quote programming lore, The X-Files, Futurama, Star Wars, and more. Just add a line like this to your /etc/profile to brighten your day when you log into your computer:

 echo;fortune;echo
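One caveat: anything a login script prints can break non-interactive sessions such as scp or sftp, which choke on unexpected output. A safer variant (maybe_fortune is just an illustrative name) guards the call with an interactivity test:

```shell
# Run fortune only when stdin is a terminal and the program exists,
# so non-interactive sessions (scp, sftp, rsync) are not broken by
# stray output from the login script.
maybe_fortune () {
    if [ -t 0 ] && command -v fortune >/dev/null 2>&1; then
        echo; fortune; echo
    fi
}
maybe_fortune
```

The [ -t 0 ] test is true only when standard input is a terminal, which is exactly the interactive-login case the quote is meant for.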

### Basic and Extended Bash Completion

Basic Bash Completion will work in any bash shell. It allows for completion of:

1. File Names
2. Directory Names
3. Executable Names
4. User Names (when they are prefixed with a ~)
5. Host Names (when they are prefixed with a @)
6. Variable Names (when they are prefixed with a $)

This is done simply by pressing the Tab key after enough of the word you are trying to complete has been typed in. If the word is not completed when you hit Tab, there are probably multiple possibilities for the completion; press Tab again and it will list them. Sometimes on my machine I have to hit it a third time.

Extended Programmable Bash Completion is a package that you can install to complete much more than the names of the things listed above. With extended bash completion you can, for example, complete the name of a computer you are trying to connect to with ssh or scp. It achieves this by looking through the known_hosts file and using the hosts listed there for the completion. This is greatly customizable, and the package and more information can be found here. Configuration of Programmable Bash Completion is done in /etc/bash_completion. Here is a list of completions that are in my bash_completion file by default:

    completes on signal names
    completes on network interfaces
    expands tildes in pathnames
    completes on process IDs
    completes on process group IDs
    completes on user IDs
    completes on group IDs
    ifconfig(8) and iwconfig(8) helper function
    bash alias completion
    bash export completion
    bash shell function completion
    bash complete completion
    service completion
    chown(1) completion
    chgrp(1) completion
    umount(8) completion
    mount(8) completion
    Linux rmmod(8) completion
    Linux insmod(8), modprobe(8) and modinfo(8) completion
    man(1) completion
    renice(8) completion
    kill(1) completion
    Linux and FreeBSD killall(1) completion
    GNU find(1) completion
    Linux ifconfig(8) completion
    Linux iwconfig(8) completion
    RedHat & Debian GNU/Linux if{up,down} completion
    Linux ipsec(8) completion (for FreeS/WAN)
    Postfix completion
    cvs(1) completion
    rpm completion
    apt-get(8) completion
    chsh(1) completion
    chkconfig(8) completion
    user@host completion
    host completion based on ssh's known_hosts
    ssh(1) completion
    scp(1) completion
    rsync(1) completion
    Linux route(8) completion
    GNU make(1) completion
    GNU tar(1) completion
    jar(1) completion
    Linux iptables(8) completion
    tcpdump(8) completion
    autorpm(8) completion
    ant(1) completion
    mysqladmin(1) completion
    gzip(1) completion
    bzip2(1) completion
    openssl(1) completion
    screen(1) completion
    lftp(1) bookmark completion
    ncftp(1) bookmark completion
    gdb(1) completion
    Postgresql completion
    psql(1) completion
    createdb(1) completion
    dropdb(1) completion
    gcc(1) completion
    Linux cardctl(8) completion
    Debian dpkg(8) completion
    Debian GNU dpkg-reconfigure(8) completion
    Debian Linux dselect(8) completion
    Java completion
    PINE address-book completion
    mutt completion
    Debian reportbug(1) completion
    Debian querybts(1) completion
    update-alternatives completion
    Python completion
    Perl completion
    rcs(1) completion
    lilo(8) completion
    links completion
    FreeBSD package management tool completion
    FreeBSD kernel module commands completion
    FreeBSD portupgrade completion
    FreeBSD portinstall completion
    Slackware Linux removepkg completion
    look(1) completion
    ypcat(1) and ypmatch(1) completion
    mplayer(1) completion
    KDE dcop completion
    wvdial(1) completion
    gpg(1) completion
    iconv(1) completion
    dict(1) completion
    cdrecord(1) completion
    mkisofs(8) completion
    mc(1) completion
    yum(8) completion
    yum-arch(8) completion
    ImageMagick completion

#### Learn About Bash Scripting

#### unixtips.org bash tips

bash Nicolas Lidzborski at 19 February, 23:54:09

If you want your xterm or rxvt title bar to show the username, hostname and current directory, and you use bash, you can set the PROMPT_COMMAND shell variable. Personally, I use the following command in my /etc/profile:

    if [ "$TERM" = "xterm" ]; then
        export PROMPT_COMMAND='echo -ne "\033]0;${USER}@${HOSTNAME}: ${PWD}\007"'
    fi

The test around the export command is done in order to avoid causing problems in text terminals.
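The completion machinery described in this section can also be probed non-interactively with bash's compgen built-in, which prints the candidates Tab would offer. The word list here is invented purely for illustration:

```shell
# Ask bash which words from a fixed list match the prefix "pro".
# compgen prints one candidate per line, exactly what Tab completion
# would offer for a completion registered with: complete -W "..." cmd
bash -c 'compgen -W "staging production rollback" -- pro'
```

This is handy when writing your own completion functions, because you can test the candidate generation without pressing Tab in a live shell.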
bash sRp at 19 February, 05:23:23

You can execute a bash command a certain number of times by using something similar to the following:

    n=0; while [ "$n" -lt 10 ]; do echo "n=$n"; n=$((n+1)); done

That code will print "n=0", "n=1", and so on, 10 times. (The old $[...] arithmetic syntax still works in bash but is deprecated in favour of $((...)).)

bash sRp at 30 January, 07:18:30

You can use CTRL-_ or CTRL-X CTRL-U to undo edits at the bash prompt.

bash Ian Eure at 29 January, 12:55:02

Bash supports tab-completion. That is, you type the first few characters of a command (or file / directory) and hit Tab, and bash automagically completes it for you. For example, if you wanted to run the program WPrefs (the Window Maker preferences utility), all you have to do is type WP and bash will fill in the rest plus a trailing space.

bash sRp at 28 January, 01:06:05

Hitting META-P in bash will allow you to search through the bash history.

bash sRp at 27 January, 13:24:42

If you find yourself having to cd back and forth between long directory names, bash's pushd is the perfect solution. Start in one of the directories, and then type pushd directory2 to go to the second directory. Now if you type dirs you should see the two directories listed. To switch between these two directories, just type pushd +1.

bash sRp at 27 January, 13:16:39

While using bash, if you have typed a long command and then realize you don't want to execute it yet, don't delete it. Simply prepend a # to the beginning of the line and hit Enter. Bash will not execute the command, but will store it in history so later you can go back, remove the # from the front, and execute it.

bash sRp at 27 January, 13:10:05

In the bash shell, CTRL-U will delete everything to the left of the cursor.

bash sRp at 27 January, 13:08:22

CTRL-T in bash will transpose two characters; great for typos.

bash sRp at 21 January, 05:39:18

Hitting CTRL-W in bash will delete the word just before your cursor. CTRL-Y will yank back the last deleted word (or words, if they were deleted consecutively).
If you deleted words after you deleted what you wanted to yank back, and already pressed CTRL-Y, you can use ALT-Y to cycle through those earlier deletions.

bash Mike Lowrie at 19 January, 07:17:36

Here's another way to change into long directory names in bash. Take, for example, the directory samba-2.0.0beta2. You can type cd samb* and it will change to the directory that matches the wildcard.

bash Sid Boyce at 19 January, 05:00:05

In the bash shell, you can utilize shortcuts. If your last command starting with l was less xxx, then !l will re-execute it. However, if you had been using lpr and ln as well, and you wanted to run less again, then !le would execute it.

bash sRp at 18 January, 21:25:17

In bash, hitting ALT-b will move you back a word, and hitting ALT-f will move you forward a word.

bash sRp at 18 January, 21:25:10

Typing CTRL-l at a bash prompt will clear the screen and put the current line at the top of the screen.

bash schvin at 17 January, 12:46:44

Turning on Scroll Lock in a console will pause or suspend the current command in progress in bash, such as ls, du or mpg123.

bash mulo at 30 September, 21:43:22

To lowercase all file names in the current directory ($PWD):

    #!/bin/sh
    for x in *
    do
        newx=$(echo "$x" | tr '[:upper:]' '[:lower:]')
        mv "$x" "$newx"
        echo "$x --> $newx"
    done

bash Jose at 30 September, 21:43:33

For one fast and effective 'clear', use echo -e '\ec'. It does more than 'clear': the \ec escape (ESC c) fully resets the terminal.

bash nexz at 30 September, 21:44:50

Want to find out all the commands installed on your box? At the prompt, press Tab twice and bash will ask if you want to see all the commands. Say y and it will show you all the commands installed on your box, including shell keywords. A very easy way to discover and familiarize yourself with commands you don't know (by the way, this only searches the directories in the PATH variable set in your bash login files). But be careful if you are root; try --help or the man page first before blindly typing an unknown command.
If all the commands are listed in a single column and you can't see the top, edit .bash_profile or .bashrc to include this alias: alias ls="ls -C". Then you should be able to see them all. Another alternative is to increase the scrollback buffer of the terminal so that it holds more lines. Hope this helps!

bash Daniel Giribet at 30 September, 21:46:07

Would you like to list only directories (without a long -l listing)?

    dirs () { ls -F "$1" | grep '/$' | sed -e 's/\/$//'; }

Use 'dirs' in your bash shell and enjoy!

bash sRp at 31 July, 19:23:44

The readline support in the bash shell defaults to emacs editing mode. You can easily switch to vi mode by issuing the following command: set -o vi.

bash Antonio at 8 February, 12:42:20

If you use bash, you can search backwards through its history: hit CTRL-R and start typing what you want to search for (it works exactly as in Emacs). If there are lots of similar lines in your history, repeatedly typing CTRL-R will browse through them.

bash irfan ahmed at 23 December, 19:36:34

bash allows you to move between the current directory and the previous directory using the hyphen after the cd command. Say you were in /home/john/pies/american. You give the command cd /home/jack/steak/grilled. Now you can go back to the ../../american directory using cd -

bash hictio at 18 January, 02:30:17

You can clear the screen when you log out of bash by adding this to the ~/.bash_logout file:

    setterm -clear

If you don't have a .bash_logout file, just make one.

bash johnnycal at 18 January, 02:31:14

I use cd bla; ls -l bla so much I made a function for it, see:

    function see () { cd "$1"; ls . ; }
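The lowercasing loop a few tips back will happily clobber files when two names differ only in case. A more defensive sketch (run here in a throwaway directory under /tmp, so it touches nothing of yours) skips names that are already lowercase and refuses to overwrite existing files:

```shell
# Demonstrate a safer bulk lowercasing in a throwaway directory.
demo=/tmp/lowercase-demo
mkdir -p "$demo" && cd "$demo" || exit 1
touch README.TXT notes.txt

for x in *; do
    newx=$(printf '%s' "$x" | tr '[:upper:]' '[:lower:]')
    [ "$x" = "$newx" ] && continue                    # already lowercase
    if [ -e "$newx" ]; then                           # avoid clobbering
        echo "skip $x: $newx already exists" >&2
        continue
    fi
    mv -- "$x" "$newx"
done
ls
```

The -- after mv stops option parsing, so file names beginning with a dash cannot be mistaken for flags.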
bash Nate Fox at 27 December, 04:32:07

In bash, if you add this:

    complete -d cd

to your ~/.bash_profile or /etc/profile file, then completion after cd will only search for directories. So if you have a file called "jiggy" and a directory called "joogy", and those are the only things in the directory, typing cd and pressing Tab will complete to "joogy".

bash sRp at 5 September, 17:02:58

Under bash or zsh, if you would like to edit a previous command in a text editor instead of on the command line, use the fc command.

bash frodo at 10 April, 04:15:04

Aliasing dir to list just directories can be useful. To do so, do the following:

    alias dir='ls -l | grep ^d'

grep in this case matches lines with a d in the first column of each line, i.e. directories.

bash HellHound at 14 January, 04:31:20

Another search-in-bash thingy: CTRL-R. This is more "realtime": as you enter a character or string, it shows a matching history entry directly.

bash Joerg Tretter at

If you want to switch off the "beep" during command-line completion, add an entry either in your ~/.inputrc or system-wide in /etc/inputrc. For a visual signal:

    set bell-style visible

For absolutely no signal:

    set bell-style none

bash Jason P. Stanford at 20 May, 05:21:59

This is a variation on the "colorful directory listing" hint that works better under bash. Put the following in $HOME/.bashrc or $HOME/.bash_profile:

    function v () { ls -l --color=auto "$@"; }
    function d () { ls --color=auto "$@"; }

HINT: Think of 'v' as "verbose" and 'd' as "directory". They're much quicker to type (only a single character), so this should satisfy most unix junkies.
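Functions like v and d above also cover a case aliases cannot: an alias is plain text substitution at the start of a command line, so it cannot place an argument in two spots. A classic illustration is a make-and-enter-directory helper (the name mkcd is a common convention, not a standard command):

```shell
# Make a directory (and any missing parents) and change into it in one step.
# An alias cannot do this, because the argument is needed in two places.
mkcd () { mkdir -p -- "$1" && cd -- "$1"; }

mkcd /tmp/mkcd-demo/sub    # creates the whole path and enters it
pwd
```

The && ensures cd runs only if mkdir succeeded, so a failed mkdir never leaves you in an unexpected directory.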

### Softpanorama Recommended

Good advice and "tutorials" for .profile can be found here:

More generally, go to github.com and check out the bash dotfiles published there; there are some really good ones.

## Etc

FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.

ABUSE: IPs or network segments from which we detect a stream of probes might be blocked for no less than 90 days. Multiple types of probes increase this period.


The Last but not Least

Copyright © 1996-2016 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable Javascript for this site. This site is perfectly usable without Javascript.

Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.


This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...

You can use PayPal to make a contribution, supporting development of this site and speeding up access. In case softpanorama.org is down, you can use the mirror at softpanorama.info.

Disclaimer:

The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.