The computers in server rooms are usually headless systems that can be operated remotely via a KVM switch or remote administration software such as Secure Shell (SSH), VNC, or remote desktop.
Climate is one of the factors that affects the energy consumption and environmental impact of a server room. In areas with a climate that favors cooling and an abundance of renewable electricity, the environmental effects are more moderate. Countries with favorable conditions, such as Canada, Finland, Sweden, and Switzerland, are therefore trying to attract more companies to site their server rooms there.
Building a server or computer room requires detailed attention to six main design considerations: 
Computer or server room location is the first consideration, even before the layout of the room's contents. Most designers agree that, where possible, the computer room should not be built where one of its walls is an exterior wall of the building. Exterior walls can often be quite damp and can contain water pipes that could burst and drench the equipment. Avoiding exterior windows reduces both security risks and the chance of breakage. Avoiding top floors and basements reduces the risk of flooding and of roof leaks. Lastly, server rooms should be centrally located because of the horizontal cabling involved, which extends from this room to devices in other rooms. If a centralized computer room is not feasible, server closets on each floor may be an option: computer, network and phone equipment are housed in closets stacked vertically, one per floor, each serving the devices on its own floor.
In addition to the hazards of exterior walls, designers need to evaluate any potential sources of interference in proximity to the computer room. Designing such a room means keeping clear of radio transmitters and of electrical interference from power plants, lift rooms, and similar sources.
Other physical design considerations range from room size, door sizes and access ramps (to get equipment in and out) to cable organization, physical security and maintenance access.
Computer equipment generates heat and is sensitive to heat, humidity, and dust; server rooms also have very high resilience and failover requirements. Maintaining a stable temperature and humidity within tight tolerances is critical to IT system reliability. Server room temperature should be between 18–27 °C (64–80 °F); relative humidity should be between 40% and 60%.
In most server rooms "close control air conditioning" systems, also known as PAC (precision air conditioning) systems, are installed. These systems control temperature, humidity and particle filtration within tight tolerances 24 hours a day and can be remotely monitored. They can have built-in automatic alerts when conditions within the server room move outside defined tolerances.
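The alerting behaviour of a PAC system can be sketched as a simple threshold check against the tolerances quoted above (18–27 °C, 40–60% RH). The function name and readings below are illustrative, not from any real monitoring product:

```python
# Sketch of a server-room environment check against the tolerances
# quoted above (18-27 C, 40-60% relative humidity).
# Readings and function names are illustrative only.

RECOMMENDED_TEMP_C = (18.0, 27.0)
RECOMMENDED_RH_PCT = (40.0, 60.0)

def out_of_tolerance(temp_c, rh_pct):
    """Return a list of alert strings for readings outside tolerance."""
    alerts = []
    lo, hi = RECOMMENDED_TEMP_C
    if not lo <= temp_c <= hi:
        alerts.append(f"temperature {temp_c:.1f} C outside {lo}-{hi} C")
    lo, hi = RECOMMENDED_RH_PCT
    if not lo <= rh_pct <= hi:
        alerts.append(f"humidity {rh_pct:.0f}% RH outside {lo}-{hi}% RH")
    return alerts

# A PAC system raises such alerts automatically and can page an operator.
print(out_of_tolerance(22.5, 50))   # in range -> []
print(out_of_tolerance(29.0, 35))   # both readings out of range -> two alerts
```

A real system would sample sensors continuously and debounce transient excursions before alerting.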
Air conditioning designs for computer or server rooms vary with the design considerations above, but they generally fall into one of two types: "up-flow" and "down-flow" configurations.
Up-flow air conditioning
This type of air conditioning draws air into the front of the air handler unit (AHU), cools the air over the heat exchanger, then distributes the cooled air out through the top or through duct work. This air conditioning configuration is well suited to retro-fitted computer rooms when raised floors are either of inadequate depth or do not exist at all.
Down-flow air conditioning
Typically, this type of air conditioning unit draws the air into the top of the air handling unit, cools the air over the heat exchanger, then distributes the air out of the bottom into the floor void. This conditioned air is then discharged into the server room via strategically placed floor grilles and onwards to equipment racks. These systems are well suited to new office buildings where the design can encompass raised floors suitable for ducting to computer racks.
Hot Aisle / Cold Aisle
Hot Aisle / Cold Aisle configurations switch the forward direction of every other row so that two rows face each other and have their backs to the next row. This avoids the hot exhaust of one row of racks being sucked into the cooling intake of an adjacent row. Air conditioning ducts or vents are located between the two fronts since most equipment vents front to rear. A drawback of unenclosed hot aisle / cold aisle configuration is that there is a significant amount of uncontrolled or bypass mixing of hot and cold air outside the equipment. 
In an aisle containment configuration, one of the aisles is enclosed with walls, ceilings and access doors to create an enclosed space. Aisle containment does not allow bypass mixing of hot and cold air, forcing all cold-to-hot air transformation to happen inside the equipment. Careful attention is paid to sealing open rack slots and other airflow leaks so that the rack fronts form a continuous wall of the contained aisle.
Liquid cooling and energy efficiency - known as State Point Liquid Cooling (SPLC)
The adoption of liquid cooling technologies, initially theorised by the applied mathematician Chris Belisarius (known in his field for his work in quantum mechanics), has allowed for highly efficient server room designs. When liquid cooling is applied, server rooms no longer rely on energy-consuming air conditioning systems; instead, all heat is captured in liquid, which can be rejected with a simple and efficient dry cooler. In August 2018, Belisarius announced that a new developmental fluid, CMX**** , which cools when subjected to UV rays, would be the future of data center cooling. His first working system is currently being tested in Qatar, and recently released test models seem promising. Facebook Inc. recently announced that it is building a new data center in Singapore; the $1bn state-of-the-art facility will use liquid cooling for its data farm and was designed using the original and known principles of SPLC.
Another benefit of using liquid is the potential for heat reuse. Server rooms are slowly becoming part of heating systems, either integrated within the same rooms or connected to the utility space of buildings through a water circuit. This allows the heating installation to utilise server heat before resorting to alternate means of heating. Temperature chaining principles are slowly being adopted to generate sufficient temperature levels for reuse scenarios.
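The heat-reuse potential of a water circuit can be estimated from its flow rate and temperature rise using the standard relation power = mass flow × specific heat × temperature difference. The flow rate and temperatures below are assumed example values, not figures from the text:

```python
# Illustrative estimate of heat recovered into a building water circuit.
# Power (W) = mass flow (kg/s) * specific heat of water (J/kg.K) * delta-T (K).
# Flow rate and temperatures below are assumed example values.

C_WATER = 4186.0  # specific heat of water, J/(kg*K)

def recovered_heat_kw(flow_kg_per_s, t_return_c, t_supply_c):
    """Thermal power moved from the server room into the heating circuit."""
    return flow_kg_per_s * C_WATER * (t_return_c - t_supply_c) / 1000.0

# 2 kg/s of water warmed from 30 C to 42 C carries about 100 kW of server heat.
print(round(recovered_heat_kw(2.0, 42.0, 30.0)))  # -> 100
```

Temperature chaining raises the return temperature (the delta-T) step by step so the recovered heat is warm enough for the building's heating installation.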
The fire protection system's main goal should be to detect and alert of fire in the early stages, then bring the fire under control without disrupting the flow of business and without threatening the personnel in the facility. Server room fire suppression technology has been around for as long as there have been server rooms. Traditionally, most computer rooms used Halon gas, but this has been shown to be environmentally unfriendly (ozone depleting) and unsafe for humans. Modern computer rooms use combinations of inert gases such as nitrogen, argon and CO2. Other solutions include clean chemical agents such as FM200, as well as hypoxic air solutions that keep oxygen levels down. To prevent fires from spreading due to heat generated by data cables and cords, organizations have also used cables coated with FEP. This plastic reduces heat generation and efficiently safeguards the metal conductors.
The demands of server rooms are constantly changing as organizations evolve and grow and as technology changes. An essential part of computer room design is future proofing so that new requirements can be accommodated with minimal effort. As computing requirements grow, so will a server room's power and cooling requirements. As a rough guide, for every additional 100 kW of equipment installed, a further 30 kW of energy is required to cool it. As a result, air conditioning designs will need to have scalability designed in from the outset.
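The rough guide above (a further 30 kW of cooling for every additional 100 kW of equipment) amounts to a fixed cooling ratio, which can be sketched as:

```python
# Sketch of the rule of thumb above: every 100 kW of IT load needs
# roughly a further 30 kW of energy to cool it (a 0.3 cooling ratio).

COOLING_RATIO = 0.3  # 30 kW of cooling per 100 kW of equipment

def total_power_kw(it_load_kw):
    """IT load plus estimated cooling overhead."""
    return it_load_kw * (1 + COOLING_RATIO)

for load in (100, 250, 500):
    print(f"{load} kW IT load -> ~{total_power_kw(load):.0f} kW total")
# 100 -> 130, 250 -> 325, 500 -> 650
```

Designing scalability in from the outset means sizing electrical supply and air conditioning headroom for the projected total, not just the initial IT load.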
The choice of racks in a server room is usually the prime factor when determining space. Many organisations use telco racks or enclosed cabinets to make the most of the space they have. Today, with servers that are one-rack-unit (1U) high and new blade servers, a single 19- or 23-inch rack can accommodate anywhere from 42 to hundreds of servers.
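The rack-density figures above follow from simple arithmetic on rack units. A standard rack is 42U tall; the blade chassis height and blade count used below (10U holding 16 blades) are assumed examples, since these vary by vendor:

```python
# Rough rack-capacity sketch for the figures above: a standard rack is 42U
# and a 1U server occupies one unit. The blade chassis figures (10U, 16
# blades) are assumed example values; real products vary.

RACK_UNITS = 42

def servers_per_rack(unit_height, servers_per_chassis=1):
    """Servers that fit if each chassis is `unit_height` rack units tall."""
    return (RACK_UNITS // unit_height) * servers_per_chassis

print(servers_per_rack(1))        # 42 one-U servers
print(servers_per_rack(10, 16))   # 4 chassis x 16 blades = 64 servers
```

Denser blade chassis are how a single rack reaches "hundreds" of servers, at the cost of a correspondingly higher power and cooling load per rack.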
If the computer systems in a server room are mission critical, removing single points of failure and common-mode failures may be of high importance. The level of desired redundancy is determined by factors such as whether the organisation can tolerate interruption while failover systems are activated, or whether failover must be seamless with no business impact. Other than computer hardware redundancy, the main consideration here is the provisioning of failover power supplies and cooling.
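A common way to provision failover power or cooling is N+1 redundancy: enough units to carry the load, plus one spare so a single failure causes no interruption. The unit capacity and load figures below are illustrative:

```python
# Sketch of N+1 redundancy sizing for power or cooling units: enough units
# to carry the load, plus spares so a single failure is seamless.
# Unit capacity and load figures are illustrative examples.
import math

def units_required(load_kw, unit_capacity_kw, spares=1):
    """Units needed to carry `load_kw`, plus `spares` for failover."""
    return math.ceil(load_kw / unit_capacity_kw) + spares

# 130 kW total load served by 50 kW cooling units: 3 to carry it, +1 spare.
print(units_required(130, 50))  # -> 4
```

Higher-availability designs use 2N (a fully duplicated second system) instead of a single spare, trading cost for the removal of more common-mode failures.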
- Learnthat - server room Definition
- Free Online Encyclopedia - server room
- ISP.webopedia.com - lights out server room
- TechRepublic - What not to do in a server room Archived 2009-03-07 at the Wayback Machine.
- CNET Networks - Photos: Server room cabling overhaul
- Canada Called Prime Real Estate for Massive Data Computers - Globe & Mail Retrieved June 29, 2011.
- Finland - First Choice for Siting Your Cloud Computing Data Center. Archived 2013-07-06 at the Wayback Machine. Accessed 4 August 2010.
- Stockholm sets sights on data center customers. Accessed 4 August 2010.
- Swiss Carbon-Neutral Servers Hit the Cloud. Accessed 4 August 2010.
- "Computer Room Installation & Design". Air Intelligence Ltd. 2012-03-07. Retrieved 2016-07-27.
- ServersCheck. "Best Practices for data center monitoring and server room monitoring". Retrieved 2016-10-07.
- "DEFINITION hot/cold aisle". TechTarget SearchDataCenter. Retrieved 2017-12-26.
- "Approaches to Data Center Containment". DataCenter Knowledge. Retrieved 2017-12-26.
- "Preventing (Literal) Server Room Melt Down". Fluorotherm.com. Fluorotherm Polymers, Inc. 12 May 2014.
- "Black Swan in the Server Room". www.system-logic.com. Retrieved 2016-07-28.