
Data center capacity planning & modeling

What goes around comes around. Do you know how old virtualization really is? Although it’s a buzzword today thanks to the rapid growth of cloud computing and vendors like Amazon with its Elastic Compute Cloud (EC2) offering, virtualization is over 30 years old. It was invented as a data center capacity planning solution to share lumpy (and expensive) mainframe resources between different applications. Then the PC arrived, hardware prices plummeted, and the virtualization problem was no longer relevant, until the need to address reliability, availability, and security across multiple systems brought it back.

A simple model of how virtualization works

Image source: vmware.com

The promise of virtualization and the need to plan

Virtualization also offers great potential for cutting energy consumption. This is a factor of growing importance in data centers, not only because of the cost it represents, but also because of the ecological impact. Even if you build solar-powered data centers, the fewer solar panels you have to install, the better. However, the move from programs running natively on separate physical servers to virtual machines sharing one or several networked servers needs proper planning, not just for energy consumption, but also for the total amount of processing power and storage. Planning can be significantly improved by modeling virtualization to predict outcomes or requirements under different scenarios (computing use cases).
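As a rough illustration of what such scenario modeling looks like, the sketch below compares a native deployment (one application per physical server) with a consolidated, virtualized deployment, and estimates server count, power draw, and total storage. All workload sizes, host specifications, and power figures are hypothetical placeholders rather than data from this article; a real model would use measured utilization.

```python
# A rough scenario comparison with hypothetical figures: one application per
# physical server (native) versus consolidation onto shared virtualized hosts.
from dataclasses import dataclass
import math

@dataclass
class Workload:
    name: str
    cpu_cores: float   # average CPU cores consumed
    storage_tb: float  # storage required, in TB

# Assumed host characteristics and workload mix (placeholders, not real data).
HOST_CORES = 32
HOST_IDLE_KW = 0.15          # power drawn by an idle host, kW
HOST_PEAK_KW = 0.45          # power drawn by a fully loaded host, kW
HYPERVISOR_OVERHEAD = 0.10   # fraction of capacity lost to the virtualization layer

workloads = [
    Workload("web frontend", 4, 0.5),
    Workload("database", 12, 4.0),
    Workload("batch analytics", 20, 8.0),
    Workload("internal apps", 6, 1.0),
]

def virtualized_hosts(workloads):
    """Hosts needed when workloads are packed onto shared, virtualized servers."""
    usable_cores = HOST_CORES * (1 - HYPERVISOR_OVERHEAD)
    return math.ceil(sum(w.cpu_cores for w in workloads) / usable_cores)

def power_kw(n_hosts, utilization):
    """Simple linear power model between idle and peak draw."""
    return n_hosts * (HOST_IDLE_KW + utilization * (HOST_PEAK_KW - HOST_IDLE_KW))

total_storage = sum(w.storage_tb for w in workloads)
n_native = len(workloads)                # one dedicated server per application
n_virtual = virtualized_hosts(workloads)

# Dedicated servers typically idle at low utilization; consolidated hosts run hotter.
print(f"Native:      {n_native} servers, ~{power_kw(n_native, 0.15):.2f} kW, {total_storage} TB")
print(f"Virtualized: {n_virtual} servers, ~{power_kw(n_virtual, 0.70):.2f} kW, {total_storage} TB")
```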

When modeling virtualization goes awry

Image source: technet.com

Not such plain sailing for datacenter capacity planning

Cloud computing is here, but at least one survey says that data centers are lagging badly behind. Planning and modeling seem to have been forgotten in the large number of centers that are already 20 years old. In fact, Gartner says that if your data center is more than seven years old, it is already obsolete. As a result, data centers and their networks also score low on user confidence, something that realistic models of replacement installations could perhaps address. However, those involved in planning will need to get their models right in the first place. Data center planning is also dangerously rich in the potential for adopting the wrong models or making capacity planning mistakes.

Getting the models right

Modeling virtualization for data center capacity planning can be done properly with a common-sense approach and by comparing against existing resources for modeling virtual environments. With data centers interlinked via intranets or extranets, information is also available for modeling networking and network virtualization. Models will likely need to be constructed over timelines of, say, five years to take account of amortization periods and the evolution of data-processing needs. Uncertainty in the growth of the number of users and the quantity of applications will need to be handled appropriately too.
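To make the uncertainty handling concrete, here is a minimal Monte Carlo sketch of a five-year capacity projection in which annual user growth is treated as a probability distribution rather than a single number. The growth rate, per-user demand, and host sizing are assumed values for illustration only; a tool like Analytica expresses the same idea by placing distributions directly on model inputs.

```python
# A minimal Monte Carlo sketch of a five-year capacity projection under
# uncertain user growth. All input figures are illustrative assumptions.
import math
import random

YEARS = 5
N_TRIALS = 10_000

USERS_TODAY = 2_000
CORES_PER_100_USERS = 3.0   # assumed average demand per 100 users
HOST_CORES = 32
HEADROOM = 0.30             # keep 30% spare capacity for peaks

def one_trial():
    """Simulate one possible growth path and return hosts needed in the final year."""
    users = USERS_TODAY
    for _ in range(YEARS):
        growth = max(random.gauss(0.20, 0.10), -0.05)  # uncertain annual user growth
        users *= 1 + growth
    cores_needed = users / 100 * CORES_PER_100_USERS
    return math.ceil(cores_needed / (HOST_CORES * (1 - HEADROOM)))

results = sorted(one_trial() for _ in range(N_TRIALS))

# Plan against a range, not a single best guess.
print("Year-5 hosts, median:          ", results[N_TRIALS // 2])
print("Year-5 hosts, 95th percentile: ", results[int(N_TRIALS * 0.95)])
```

Reporting a median alongside a high percentile is what lets a plan budget for amortization-period risk instead of anchoring on a single forecast.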

A more complex model of virtualization

Image source: commons.wikimedia.org

Basic building blocks for visualizing virtualization

While the concept of virtualization is generally the same (sharing computing resources without physical constraints), the practical manifestations differ. There are a number of architectures that organizations can use, each with its own impact on computing power and energy consumption. Possibilities range from multiple logical universes all running on one base operating system to users accessing databases distributed across a variety of different machines and IT environments. Virtualization may also cost less in terms of staffing if systems automation technologies are used.
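The sketch below extends the earlier example to show how the choice of architecture can change the arithmetic: the same set of applications is sized under three assumed options, one application per physical server, hypervisor-based virtual machines, and containers sharing one base operating system. The overhead fractions and power figures are illustrative assumptions, not benchmarks.

```python
# Illustrative comparison (all figures assumed): how the choice of architecture
# changes host counts and annual energy for the same set of applications.
import math

N_APPS = 40
AVG_CORES_PER_APP = 6
HOST_CORES = 32
HOST_AVG_KW = 0.35      # assumed average draw per powered-on host, kW
HOURS_PER_YEAR = 8760

def hosts_needed(consolidated: bool, overhead: float) -> int:
    """One host per app without consolidation; otherwise pack apps onto shared
    hosts after subtracting the virtualization layer's assumed overhead."""
    if not consolidated:
        return N_APPS
    usable_cores = HOST_CORES * (1 - overhead)
    return math.ceil(N_APPS * AVG_CORES_PER_APP / usable_cores)

options = {
    "one app per physical server": hosts_needed(False, 0.0),
    "hypervisor-based VMs":        hosts_needed(True, 0.12),
    "containers on one base OS":   hosts_needed(True, 0.04),
}

for name, hosts in options.items():
    annual_mwh = hosts * HOST_AVG_KW * HOURS_PER_YEAR / 1000
    print(f"{name:30s} {hosts:3d} hosts, ~{annual_mwh:.0f} MWh/year")
```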

If you’d like to know how Analytica, the modeling software from Lumina, can help you to model computing installations, then try the free trial of Analytica to see what it can do for you.

