Last Updated on 26/02/2022 by Nidhi Khandelwal
There’s something special about handiwork. It’s personal, it’s artisanal, and it can be extremely effective at achieving its goals. Mass-market production, on the other hand, excels in other respects: speed, efficiency, and cost savings.
Data centers have evolved from handcrafting – where each individual machine is a passion project, meticulously maintained – to mass production in large server farms where individual units are utterly disposable.
In this post, we’ll look at how data centers have evolved over the years. We’ll examine the consequences for data center workloads and for the personnel who manage them – who no longer have machines of their own to care for. We’ll also go through the cybersecurity ramifications of the new data center landscape.
For any sysadmin who started their career before the arrival of virtualization, cloud, and automation technologies, systems were exquisitely constructed pieces of hardware – and treated with the same love as a pet.
It all began with the advent of computer rooms in the 1940s, where large machines were manually connected by miles of wiring in what could only be described as a labor of love. These computer rooms housed the steam engines of the computing era, soon to be replaced by more sophisticated technology thanks to the silicon revolution. What about security? All that was required was a large lock on the door.
Mainframes, the forerunners of today’s data centers, were similarly meticulously built, with a single machine taking up an entire room and requiring ongoing, specialized craftsmanship to keep running. This demanded both hardware and coding skills, as mainframe operators often had to code on the fly to keep their workloads running.