Datacenters today have become an integral part of our lives. A Datacenter, as the name suggests, is a facility where organizations store, deploy, manage, and monitor their massive amounts of crucial industrial and corporate data and information. It comprises computer systems and hardware such as racks, storage and associated devices, servers, networks, backup, power supply, and security.

Businesses, industries, and organizations generate massive amounts of data and information, structured and unstructured, and generating it is a continuous, recurring process. It is very difficult for these businesses to store, deploy, manage, and monitor this data on their own infrastructure, and this is where Datacenters come into the picture. A Datacenter service provider stores, manages, maintains, and secures data on behalf of an organization. It is the service provider's responsibility to keep the data and information, along with the company's applications, available and up to date whenever they are required, with zero or negligible downtime.

Definition of Datacenter

“A Datacenter is an infrastructure facility that stores, manages, deploys, and monitors an organization's massive data, information, and IT applications at a centralized location, ensuring business continuity.”

Evolution of Datacenters

The evolution of the computer has its roots in the early 19th century, when Charles Babbage, considered the father of the computer, invented the first mechanical computer. The earliest computing facilities were quite similar to today's Datacenters; in the early '90s these were known as traditional, or 'siloed', Datacenters. Traditional Datacenters offered reliability and flexibility, but they were astonishingly slow to change: an organization could take months to deploy a new application on one.

Evolution of Cloud Technology

Virtualisation technology, which came into existence between 2003 and 2010, played a major role in the evolution of today's Datacenters. It enables us to virtualise and pool computing resources such as processing, memory, storage, and network to create a more flexible, reliable, and resourceful infrastructure. Thanks to cloud computing, we are now able to utilise resources better, with greater flexibility, reliability, and speed.

Software-defined Datacenters (SDDC) / Software-Led Infrastructure (SLI)

Virtualisation of siloed Datacenters did bring flexibility and reliability to traditional Datacenters, but it had detrimental impacts on the storage and network components. Server virtualisation was not the perfect solution, only a stepping stone towards one. It enabled better utilisation of resources in the ideal case, but it had a shortfall: if one application on a shared host consumes more resources than expected, the other applications relying on the same host are starved of computing resources, voiding the 100% uptime SLA between the organisation and the service provider. This is where software-defined virtualisation, or Software-Led Infrastructure (SLI), came into the picture. SLI gives total control over resource utilisation via software, centralising deployment, automation, regulation, provisioning, and configuration. It also allocates resources according to application demand, automatically scaling them up and down as required, handling dynamic loads and relieving the IT burden on organisations. It is cost-effective too, as organisations pay only for the resources they actually use. According to the RightScale 2017 survey, 95% of respondents were using cloud computing in one way or another.
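The automatic scale-up/scale-down behaviour described above can be sketched as a simple threshold rule. This is a minimal, hypothetical illustration of the kind of decision an SDDC control plane might make; the function name and thresholds are illustrative assumptions, not any vendor's real API.

```python
# Hypothetical sketch of threshold-based autoscaling in an SDDC:
# keep utilization inside a target band by adding or releasing servers.
# Thresholds (80% / 30%) are assumed values for illustration only.

def desired_servers(current_servers, cpu_utilization,
                    scale_up_at=0.80, scale_down_at=0.30):
    """Return the server count after one autoscaling decision."""
    if cpu_utilization > scale_up_at:
        return current_servers + 1   # demand spike: add capacity to protect the SLA
    if cpu_utilization < scale_down_at and current_servers > 1:
        return current_servers - 1   # idle capacity: release it, so you pay less
    return current_servers           # within the target band: no change

print(desired_servers(4, 0.92))  # spike  -> 5
print(desired_servers(4, 0.10))  # idle   -> 3
print(desired_servers(4, 0.55))  # steady -> 4
```

Real control planes act on many more signals (memory, I/O, queue depth) and scale in larger steps, but the pay-for-what-you-use economics follow from exactly this kind of feedback loop.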


Datacenter Types

The Telecommunications Industry Association (TIA) is a trade association accredited by ANSI (the American National Standards Institute). In 2005, it published ANSI/TIA-942, Telecommunications Infrastructure Standard for Data Centers, which defined four levels of data centers in a thorough, quantifiable manner. TIA-942 was amended in 2008, 2010, 2014, and 2017. TIA-942: Data Center Standards Overview describes the requirements for data center infrastructure. The simplest is a Level 1 data center, which is basically a server room following basic guidelines for the installation of computer systems. The most stringent is a Level 4 data center, which is designed to host the most mission-critical computer systems, with fully redundant subsystems and the ability to operate continuously for an indefinite period during primary power outages.

The Uptime Institute, a data center research and professional-services organization based in Seattle, WA, defined what is commonly referred to today as “Tiers”, or more accurately the “Tier Standard”. Uptime's Tier Standard levels describe the availability of data processing from the hardware at a location: the higher the Tier level, the greater the expected availability. The Uptime Institute Tier Standards are shown below.

[Table: Uptime Institute Tier Standards. Source: Wikipedia]

For the 2014 TIA-942 revision, the TIA organization and Uptime Institute mutually agreed that TIA would remove any use of the word “Tier” from their published TIA-942 specifications, reserving that terminology to be solely used by Uptime Institute to describe its system.

Datacenters – An Integral Part of Our Lives

You might not realize it yet, but Datacenters have become an integral part of our lives and affect them in one way or another. Datacenters today have evolved into far more than just stacks of servers. Here are some interesting facts about them:

  • Datacenters account for 17% of the carbon footprint of the global information and communication technology sector.
  • For every watt of computing power consumed by a Datacenter, it takes roughly another watt to cool it.
  • You need a whole power plant just to cool a large Datacenter.
  • A large Datacenter can use as much electricity as a small town.
  • Every Datacenter has to be uptime certified.
  • Google has around 15 Datacenters all over the world with about 900,000 servers, and Google alone accounts for roughly 0.013% of the world's energy use.
  • Facebook has around 6 Datacenters across the world with 60,000 servers and consumes approximately 678 million kWh of energy every year.
  • Amazon has Datacenters at 7 locations in the world with 158,000 servers, and its 30 million customers stream around 40 PB of data per month.
  • Microsoft has around 20 facilities across the world with 100,000 servers, and its southern Virginia Datacenter cost a whopping $1 billion.