| Language | Logical left shift | Logical right shift |
| --- | --- | --- |
| C, C++, Go, Swift (unsigned types only); Standard ML, Verilog | << | >> |
| Object Pascal, Delphi | shl | shr |
| F# (unsigned types only) | <<< | >>> |
In computer science, a logical shift is a bitwise operation that shifts all the bits of its operand. The two base variants are the logical left shift and the logical right shift. A shift is further specified by the number of bit positions the value is to be moved, as in "shift left by 1" or "shift right by n". Unlike an arithmetic shift, a logical shift does not preserve a number's sign bit or distinguish a number's exponent from its mantissa; every bit in the operand is simply moved a given number of bit positions, and the vacant bit positions are filled in, usually with zeros (contrast with a circular shift).
A logical shift is often used when its operand is being treated as a sequence of bits rather than as a number.
Logical shifts can be useful as efficient ways of performing multiplication or division of unsigned integers by powers of two. Shifting left by n bits on a signed or unsigned binary number has the effect of multiplying it by 2^n. Shifting right by n bits on an unsigned binary number has the effect of dividing it by 2^n (rounding towards 0).
The programming languages C, C++, and Go have only one right shift operator, >>. Most C and C++ implementations, and the Go language, choose which right shift to perform based on the type of the integer being shifted: signed integers are shifted using the arithmetic shift, and unsigned integers are shifted using the logical shift.
All currently relevant C standards (ISO/IEC 9899:1999 through 2011) leave the result undefined when the shift count is greater than or equal to the width of the operand in bits. This allows C compilers to emit efficient code for various platforms by using the native shift instructions directly, even though those instructions behave differently in this case. For example, shift-left-word in PowerPC chooses the more intuitive behavior, where shifting by the bit width or above gives zero, whereas SHL in x86 masks the shift amount to the lower bits "to reduce the maximum execution time of the instructions", so a shift by the bit width leaves the value unchanged.
Some environments, such as the .NET Framework and LLVM, likewise leave shifting by the bit width and above "unspecified" (.NET) or "undefined" (LLVM). Others choose to specify the behavior of their most common target platforms; C#, for example, specifies the x86 behavior, masking the shift count.
If the bit sequence 0001 0111 (decimal 23) were subjected to a logical shift of one bit position, then:

- a shift to the left would yield 0010 1110 (decimal 46)
- a shift to the right would yield 0000 1011 (decimal 11)