Data Center News: Accurate Analytics

With corporate data surging – research estimates it’s doubling every 14 months and will reach 10.5 ZB by 2020 – and consumers constantly connected to their smart technology and devices, businesses have become increasingly reliant on their data centres to deliver high quality services and continuously improve operational performance. As a result, data centres now have a significant impact on business strategies and goals and have become a high priority for the C-suite.

Read the full article here

How to curb data centers’ hunger for energy and resources

The mantra ‘keep the lights on, keep it cold’ is often quoted by data center engineers when defining their roles, but is this the right approach to energy efficiency and the environmental impact of what are, essentially, utility-hungry sheds that consume the equivalent power of small towns every day?

In January 2016 it was estimated that the global data center industry was using about 416.2 terawatt-hours per year. To put that into perspective, the entire UK, with a population of over 65m people and over 5.7m businesses, consumes about 300 terawatt-hours in a year.

There have been many opinions on how best to curb the use of such huge amounts of energy. However, with billions of people watching cat videos, streaming films, gaming and ‘liking’ photos on social media, an ‘end of pipe’ solution to data center usage would be foolhardy at best, and would probably cause global uproar.

The accessibility and ease of use of data is a great example of the Jevons paradox – “technological progress increases the efficiency with which a resource is used, but the rate of consumption of that resource rises because of increasing demand”. In other words, the easier you make it to consume a product, the greater the consumption will be!

Is green energy the answer?

The most appropriate place to look at energy efficiency is not at the domestic end-user, but rather at the data center, colo or hyperscale, where energy is brought in and consumed. A potential solution to lessen the environmental impact of these energy hungry giants could be renewable energy. But is this practical?

Some companies will pay a higher ‘green rate’ for renewable energy when in reality the actual energy consumed has come from a traditional (coal, oil, gas or nuclear) power generation plant. Is this promotion of renewable energy or just a tax under another name?

Forcing organisations to use solely renewable energy supplies hinges on renewables being able to provide enough energy for constant demand, as well as to service the peaks and troughs of supply and generation.

However, until the renewables industry plays catch-up with its technology, there are plenty of alternative strategies data center operators can pursue to reduce their energy consumption – and improve their bottom line in the process.

So, what can data center operators do?

Margins in colocation are now lower than they used to be and clients want to pay via a PUE mechanism. Therefore, more efficient operations are a must.

In Romonet’s experience, significant efficiencies can be gained from CAPEX-neutral changes to the data center, or from minimal capital outlay combined with best practice.

Through our modeling activities we invariably see fixed-speed fans and set points that are too low. Air flow management is often poor, with a lack of blanking panels and containment. Long rows of racks may seem efficient but often are not: if an operator has CRAHs (Computer Room Air Handlers) at only one end of the row, the engineers will have to overpressurise the floor to make sure cool air reaches the furthest point.
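
Much of the fan saving comes from the fan affinity laws: fan power scales roughly with the cube of speed, so even a modest speed reduction yields a large energy saving. The sketch below is purely illustrative (hypothetical fan rating and speed, not figures from any Romonet model):

    # Illustrative sketch of the fan affinity law: power scales roughly with
    # the cube of fan speed. All numbers are hypothetical.

    def fan_power_kw(rated_kw, speed_fraction):
        """Approximate fan power draw at a given fraction of rated speed."""
        return rated_kw * speed_fraction ** 3

    rated_kw = 15.0                           # one CRAH fan at full speed
    fixed = fan_power_kw(rated_kw, 1.0)       # fixed-speed fan runs flat out
    variable = fan_power_kw(rated_kw, 0.7)    # VSD fan trimmed to 70% speed

    print(f"fixed-speed: {fixed:.1f} kW, variable at 70%: {variable:.1f} kW")
    print(f"fan energy saving: {1 - variable / fixed:.0%}")   # roughly 66%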

The ability to increase free cooling hours is vital to reduce cost and improve efficiency.

All these little issues can increase a facility’s energy consumption. But how can you understand what the effect will be of changing the set points within a cooling system, or of changing the UPS to a more efficient type?

Predictive analytics for increased efficiency

Many data centers are still managed with outdated and time-consuming tools like spreadsheets. However, spreadsheets are prone to errors and can become so complex that the chance of tracking down a mistake is very slim.

This is where predictive modeling and ‘what if’ analysis come into play to support data center managers in making data-led decisions.

By analysing trends and data from hundreds of different data centers, along with meteorological information spanning various climate regions, Romonet can simulate, calibrate and validate its models to 98% accuracy.
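
To make the accuracy claim concrete, a calibration check of this kind boils down to comparing model-predicted consumption against metered consumption and keeping the error within 2%. A minimal sketch with invented monthly figures:

    # Hypothetical calibration check: compare model-predicted monthly energy
    # against metered values and report the mean absolute percentage error.
    predicted_mwh = [1210, 1185, 1100, 980, 905, 870]   # model output (invented)
    metered_mwh   = [1198, 1202, 1087, 995, 918, 859]   # meter readings (invented)

    errors = [abs(p - m) / m for p, m in zip(predicted_mwh, metered_mwh)]
    mape = sum(errors) / len(errors)
    print(f"mean absolute error: {mape:.1%}")   # ~1.3%, i.e. ~98.7% accuracy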

This data is also used to model proposed changes and predict the future savings and PUE values that such changes will bring about.

Machine learning can process the data produced by BMS and EPMS systems far more efficiently than humans could ever hope to.

The ability to produce consistent results relies not only on a good program, but also on one designed by people who actually know data centers and have operated them at the sharp end.

Through the use of our predictive analytics engine and database of information, between January 2016 and December 2017 Romonet identified utility cost savings of £3,068,500, 47,928 MWh of energy and 11,555 tonnes of CO2 across 21 facilities.

The best part is that the majority of these identified savings required minimal CAPEX.

The answers to making these savings already lie within the data centers; the information is being produced daily. However, the systems employed are often unable to drill into this data, or instead make the broad, false assumption that everything is fine.

Very often we find that although a site may have extensive metering, much of it is either broken, flatlining or simply uncalibrated, so the resulting data is, quite frankly, worthless.

We spend countless hours cleaning data so it is usable, only to find that meters have not recorded load for some time, or that the BMS is unable to produce trended or historic information.
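
One of the simpler cleaning rules can be illustrated directly: flag meters whose readings have not changed for a suspiciously long run. A sketch, assuming half-hourly samples (invented data, with a one-day threshold chosen arbitrarily):

    # Sketch: flag flatlined meters -- readings that never change for at
    # least `min_run` consecutive samples (48 half-hourly samples = one day).
    def flatline_runs(readings, min_run=48):
        """Return (start, end) index pairs where the value never changes."""
        runs, start = [], 0
        for i in range(1, len(readings) + 1):
            if i == len(readings) or readings[i] != readings[start]:
                if i - start >= min_run:
                    runs.append((start, i - 1))
                start = i
        return runs

    # A meter stuck at 42.0 for a full day gets flagged:
    samples = [41.8, 42.1] + [42.0] * 50 + [41.9]
    print(flatline_runs(samples))   # [(2, 51)]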

Perhaps the BMS or EPMS system has been replaced and all historic data has been lost. Whilst this is frustrating, and may cause your data scientist to go off on a rant, work can be done to get a facility back into line and start driving efficiency that can be documented to show savings, reduced PUE and better resilience.

In conclusion, if you want to maintain the uptime of a facility while running a more cost-efficient and environmentally friendly operation, adopt new technologies and make predictive, data-driven decisions!

Romonet customers save £3m+ and decrease CO2 emissions by over 11,550 tons

For many years data centers have been seen as highly complex facilities, difficult to understand and under the sole responsibility of IT departments. Most senior managers saw these facilities as financial black holes without any consideration of their contribution to business profitability. On top of this, environmental organizations were flagging the huge amount of energy consumed by data centers at a global level and the increasing levels of CO2 emissions that were damaging our environment.

In recent years, due to technological advancements, data centers have started to become a critical priority for many organizations as a failure can bring a business to its knees. These complex facilities now underpin our lifestyle, from our daily commute, to financial transactions, business and personal communications and grocery shopping.

As a result more and more tech companies have started to invest significant resources into increasing data center efficiency and reducing the amount of energy these facilities consume in order to keep our society going 24/7, 365 days a year. 

Improved performance, reduced environmental impact

Since its inception Romonet’s vision has been to ensure data centers are both sustainable and efficient. Our mission is to improve performance and reduce the impact on our environment by enabling data center stakeholders to make faster and more accurate data driven decisions.

This year we are proud to announce that in 2016-2017 we enabled our clients to save £3m+ on utility costs, reduce energy usage by 48,000 MWh and decrease CO2 emissions by over 11,550 tons.

By accessing Romonet’s predictive analytics solution and gaining unparalleled insights across the data center lifecycle, our clients were able to improve their decision-making process and significantly increase their facilities’ profitability.

From a financial perspective we helped our customers maximize productivity and minimize expense by:

- Reducing the initial capital investment by accurately analyzing and predicting the most suitable design.

- Decreasing operating expenses by continually analyzing metered data against predictive models.

- Eliminating unnecessary ongoing capital projects and choosing the optimal, most cost-effective and efficient equipment to meet current and future workload requirements.

Enabling the running and building of environmentally-friendly facilities is also a major priority for our company and we helped our customers reach their CSR targets by:

- Measuring and comparing site design and equipment specifications to actual metered data to identify energy and water inefficiencies.

- Calibrating and reporting sustainability metrics including Power Usage Effectiveness (PUE) and carbon emissions.

- Highlighting potential future risks to availability or service levels.

Danny Reeves, CEO, Romonet: “If the entire data center industry and tech companies came together to use smart analytics in running facilities, we could see amazing widespread improvements in both profitability and environmental impact. As it stands, there are still many organizations using labor-intensive, mistake-prone Excel sheets to manage critical assets. However, the industry is progressing at light speed and industry leaders are already forging the way towards long term sustainability in the data center space. We are very proud to have customers that are aware of these crucial aspects and are investing resources into changing the data center market and creating a better environment for all players.”

Moving forward, Romonet plans to publish quarterly reports regarding savings, energy usage and CO2 emission reductions achieved by our clients.

Stay tuned for our next update!

What will move and shake the data center industry in 2018

Zahl Limbuwala, Co-Founder & Executive Director

2017 was eventful for the tech industry and data center market, and 2018 should bring just as many interesting developments that shake the status quo and increase both competitiveness and efficiency for all players involved. Some of the main trends expected to make their mark and shift the industry in new directions are:

Edge computing

Edge data centers will start playing a critical role in the data center and cloud computing market, because the IoT industry is developing at light speed and customers and companies are generating huge amounts of data.

Edge data centers are not small versions of the same highly resilient data centers businesses have been building already, but rather localized compute and caching nodes for stateless applications. These facilities are important because businesses and data center operators need to support the increased pressure of IoT and the increase in edge applications and processing that 5G enables. Also, the demand for connectivity, bandwidth and low latency will increase exponentially in the following years as smart houses, offices, driverless cars and artificial intelligence become the new norm in our society.

RISC compute

IT hardware ownership will continue its decline in 2018. Centralized cloud providers such as Google, AWS and Azure will continue to pull more and more of the growing compute, storage and application capability out of the traditional enterprise data center.

Scale is king in the cloud game and most companies aren’t big enough to achieve scale at competitive levels. AI and machine learning will also increase the demand for RISC (Reduced Instruction Set Computer) compute, and 2018 will see more specialised compute-as-a-service. Whether it’s Google’s TensorFlow or AWS’s addition of FPGAs (Field Programmable Gate Arrays) for AI applications, this specialist compute-as-a-service market will put an even bigger gap between the big cloud players and the rest of the pack.

Microsoft and Google are leading the way towards making the data center industry more energy efficient and adopting modern green technological advancements.

Environmental sustainability

The tech industry’s awareness of environmental sustainability has increased significantly over recent years and data centers are also facing more stringent compliance requirements. As a result, a major trend that will influence the data center landscape will be the rising demand for new technologies that increase facilities’ energy efficiency and decrease their overall impact on the environment.

We’ve already seen many technological innovations and design developments that are enabling data centers to cut down CO2 emissions, energy, and water consumption, with tech leaders such as Amazon, Facebook, Microsoft and Google among those driving the change.

In 2016 Romonet issued a strong call to action for the world’s executives, enterprises and data center operators to act on one of the most urgent challenges facing the global economy and environment today – water consumption.

Through Romonet’s platform we have enabled our clients to precisely measure water efficiency, capacity, consumption and cost. As a result, many customers have significantly decreased their water consumption and overall costs.

Furthermore, by accurately modelling and simulating data center energy and total cost of ownership, in 2016-2017 Romonet’s platform helped customers save £3m+ on utility costs, reduce energy usage by 48,000 MWh and decrease CO2 emissions by over 11,550 tons.

I’m looking forward to seeing the data center industry develop further in 2018, and to whatever other trends emerge in the near future.

Director of Finance: Why CFOs need the right data centre

The fast-paced global economy and escalating competitiveness are driving many companies to focus their attention on new technologies that empower their workforce, satisfy the ever increasing demands of customers, and improve their ability to quickly change direction based on market trends and conditions.

Read the full article here

Winner of Infrastructure Innovation of the Year, UK IT Industry Awards

The Romonet team have won another prestigious award - Infrastructure Innovation of the Year at the UK IT Industry Awards 2017.

The award recognised Romonet’s ability to provide unrivalled insight through the use of modeling and predictive analytics, from initial design through to end of life. With Romonet’s help companies are making better informed investment decisions and increasing operational efficiency to new levels.

Its main competitors in this category were BT, Capgemini & Rolls Royce, Cradlepoint, DWP Digital & IBM, Geeks, Hyperoptic, Silver Peak, Sudlows and Tegile Systems.

Independently selected as one of IDC’s top three innovative companies under $100 million in the data center facility industry, Romonet is constantly innovating to improve how organizations purchase, manage and monitor their data center infrastructure.

Danny Reeves, CEO of Romonet, said, “The data center industry is evolving fast and Romonet is always striving to develop cutting edge technologies and keep the pace with our customers’ needs, ensuring they have the most accurate model of their facility and operations, so they can make the best possible data driven decisions when investing, designing, building and operating their data center assets.

It’s great to see Romonet’s team being recognized for their hard work in this fast paced industry and we are delighted to be able to help our clients maximize their efficiency potential.”

Romonet's Platform accurately models, simulates and predicts data center power and cooling efficiency, total cost of ownership (TCO), operational and financial performance, and use of natural resources. It helps customers better control capital expenditure, manage operating expenses and independently assess the impact of technologies on the data center and the environment, so they can make better investment decisions.

Datacenter Dynamics: Why clean data is as important as clean energy

Renewable energy initiatives have been on the news agenda the last couple of weeks. According to Bloomberg, a large proportion of the Fortune 500 has set clean energy goals in response to the savings generated by renewable power. As companies amass huge amounts of data, a significant part of their strategy for reaching their ambitious goals will involve data centers, no matter if a business owns, builds or uses them in the Cloud.

Apple is leading the way in this area. The company recently released its Annual Environmental Responsibility Report which provides a detailed outline of the steps it is taking to ensure its data centers are environmentally friendly.   

Read the full article here

bobsguide: How CFOs Can Assess the Efficiency of Data Centres

As businesses transform their organisations to meet increasing customer demands for always-available online and mobile services, data centres become a critical element in a CFO’s growth strategy.

We recently wrote an article for bobsguide on what CFOs should be looking for to ensure they are getting the best return from their data centre investment.

Read the full article here

Romonet delivers vital predictive analysis for Global Switch’s data center in Sydney

Data center owner and operator Global Switch turned to Romonet to analyze one of the most energy efficient and sustainable data centers in the Asia-Pacific region.

The company’s new Sydney East facility has been a huge undertaking, increasing its entire West and East data center campus to 73,000 sq m of space with 83MVA of utility power capacity. Once completed, it will be Australia’s first hyperscale data center facility.

Because Sydney’s climate patterns are challenging for low-cost data center operations, Global Switch, which has a strong commitment to environmental sustainability and social responsibility, had to build at significant scale so customers could benefit from reduced Total Cost of Occupancy through energy savings. It also wanted cost effective solutions that would enable it to deliver independently verified Power Usage Effectiveness (PUE) over the lifecycle of the building.

Romonet’s patented predictive analytics platform was able to provide Global Switch with a detailed model of the design options for the Sydney East facility. This played a crucial part in establishing how the design could be enhanced to provide efficiency improvements at every stage. Romonet’s analysis, which aimed to identify designs that would achieve an annualized PUE of 1.33 for the building, gave Global Switch confidence that the design options were suitably tested before expensive detailed design and construction work was initiated.

The modeling captured multiple variations of the cooling system design: a water-based thermal transport system using water-cooled chillers and cooling towers. This was rigorously tested against annual weather data to establish the annualised PUE.
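
One detail worth spelling out: annualised PUE is energy-weighted, i.e. total facility energy over total IT energy for the year, not an average of instantaneous PUE readings, which would over-weight lightly loaded hours. A small sketch with invented per-period figures:

    # Annualised PUE = total facility energy / total IT energy.
    # Averaging per-period PUE values instead would over-weight lightly
    # loaded (high-PUE) periods. All figures below are invented.
    facility_kwh = [1330.0, 700.0, 1410.0, 1350.0]   # facility energy per period
    it_kwh       = [1000.0, 450.0, 1020.0, 1010.0]   # IT energy per period

    weighted = sum(facility_kwh) / sum(it_kwh)
    naive = sum(f / i for f, i in zip(facility_kwh, it_kwh)) / len(it_kwh)
    print(f"energy-weighted PUE: {weighted:.3f}")    # 1.376
    print(f"naive average PUE:   {naive:.3f}")       # 1.401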

Romonet’s ongoing insight is being used to support modifications to the existing cooling system to further improve PUE. As a result, Global Switch’s customers, who include international and national telecommunications companies, cloud providers, corporates and government agencies will receive a significantly more cost-effective solution for housing their equipment at the East Sydney facility.

In addition, because Global Switch can demonstrate its commitment to energy efficient data centers, its customers can be confident that the declared PUE values are accurate and validated by an independent party.

Matthew Winter, Global Switch’s Regional Project Director for Europe commented: “Romonet’s predictive analytics has given what we believe is reliable and accurate insight into the future of the facility. With this model, the Operations team are able to forecast and better understand the next steps of occupation for the remaining stages of Sydney East. We have analyzed many of our sites across the globe with this tool and we value the information it provides as it assists the decision making when considering the manner of data center investments.”

Read the case study

Datacenter Dynamics: The birth of the learning data center

Machine learning is arguably the most powerful tool we have seen so far for unlocking the potential of Big Data, and one of the sectors that stands to benefit most from it is the data center industry.

Given that most companies are highly reliant on the performance of their data centers to ensure services can be delivered and productivity maintained, there is a constant, acute pressure on facilities managers to predict, manage and react to any change that could adversely affect operational availability and performance.

Read the full article here

Information Age: Is data accurate enough for high impact decisions?

How can data centre managers verify the accuracy of their data, strengthen their company’s commitment to improving energy efficiency and track ROI on critical infrastructure investments?

Analytics tools are being increasingly used to help organisations make crucial strategic decisions. The major challenge for organisations using analytic tools stems from the incorrect assumption that the source data is of an acceptable quality.

Read the full article here

Machine Learning enhances data center performance

Machine Learning is arguably the most powerful tool we have seen so far for unlocking the potential of Big Data, and one of the sectors that stands to benefit most from it is the data center industry.

Given that most companies are highly reliant on the performance of their data centers to ensure services can be delivered and productivity maintained, there is a constant, acute pressure on facilities managers to predict, manage and react to any change that could adversely affect operational availability and performance.

As a result, data center operators are starting to realise the potential of Machine Learning technologies to automate and improve their ability to keep these facilities performing optimally. Furthermore, even though many companies still use rudimentary tools to track and oversee data center performance, it is becoming increasingly difficult for humans to manually analyse data directly from DCIM systems and verify whether the data collected by the vast sensor networks monitoring these facilities is accurate.

Only last month, the Royal Society published a report that looked at the power and promise of Machine Learning. Its focus was on how this area of computing will reshape the UK economy and people’s lives. The report also asked questions about who will be affected, how the benefits will be distributed and where the opportunities for growth lie.

The benefits for data centers have already been proven and the knock-on effects are many and various. Google recently explained that it is using its DeepMind Machine Learning technology to manage power consumption at its data centers by dynamically tuning their performance to reduce their operational energy consumption. In one of Facebook's data centers meanwhile, its Big Sur servers are training Machine Learning systems to ‘read’ images and videos to the blind and display over two million translated stories every day.

In the same way, Machine Learning has a big part to play in further enhancing analytics tools for data centers. The ability to generate validation models for performance analysis, investment analysis and even for assessing the suitability of a location for a new data center offers immense advantages such as reducing costs and improving operating performance.

Our own analytics platform identifies anomalies and can help track down the causes behind a symptom, based on comparison to a calibrated predictive model as well as aggregated data ‘knowledge’ collected over years from more than 350 data centers around the world. In this way we’re able to provide our customers with powerful and actionable insight on their operations and systems, so they can maintain peak performance at all times and increase ROI.
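
In spirit, this kind of model-based anomaly detection compares metered values against what the calibrated model expects and flags deviations outside a tolerance band. A minimal sketch (invented readings, and an assumed 5% band rather than any actual Romonet threshold):

    # Sketch: flag periods where metered power deviates from the calibrated
    # model's expectation by more than a tolerance band. Values are invented.
    expected_kw = [820, 815, 840, 860, 855, 830]   # calibrated model output
    metered_kw  = [824, 818, 901, 864, 851, 833]   # site meter readings
    TOLERANCE = 0.05                               # assumed 5% band

    for hour, (e, m) in enumerate(zip(expected_kw, metered_kw)):
        deviation = (m - e) / e
        if abs(deviation) > TOLERANCE:
            print(f"hour {hour}: metered {m} kW vs expected {e} kW "
                  f"({deviation:+.1%}) -> investigate")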

Inside Big Data: Validated Data Supports Accurate Decision Making and Rapid ROI

In this special guest feature, Zahl Limbuwala, Co-founder & Executive Director at Romonet, explores how data center managers could verify the accuracy of their data, strengthen the company’s commitment to improving energy efficiency and track ROI on critical infrastructure investments. As co-founder of Romonet, Zahl is deeply passionate about the data center and IT industries. Educated as an engineer in Analogue and Digital Electronics, Zahl’s early career was spent at Microsoft, Cisco and one of the City of London’s first B2B Internet service providers. He was the founding chairman of BCS Data Centre Specialist Group and consultant to the EU Code of Conduct for Data Centers. He is a regular keynote speaker at industry events around the world and holds board advisory positions with a number of other European and US based technology companies.

Read the full article here

Why clean data is as important as clean energy

Zahl Limbuwala, Co-Founder & Executive Director

Renewable energy initiatives have been on the news agenda the last couple of weeks. 

According to Bloomberg, a large proportion of the Fortune 500 has set clean energy goals in response to the savings generated by renewable power. As companies amass huge amounts of data, a significant part of their strategy for reaching their ambitious goals will involve data centers, no matter if a business owns, builds or uses them in the Cloud.

Apple is leading the way in this area. The company recently released its Annual Environmental Responsibility Report which provides a detailed outline of the steps it is taking to ensure its data centers are environmentally friendly.   

Of course, in order to assess progress and success these companies will also need to track and report against sustainability and energy efficiency metrics too.

Accurate, reliable data is critical

But metrics are only as good as the accuracy of the data feeding into them. If companies put sustainability at the core of their business strategies, the metrics they set will be heavily scrutinized.

So, what happens if raw data from data centers is not properly cleaned and validated, leading to weeks and even months of incorrect and misleading information?

The result will be an embarrassing anomaly in the operational report and a lot of awkward explaining to managers, stakeholders and potentially customers and shareholders.

Accurate, reliable data is central to a serious sustainability initiative; collecting raw data and presenting it is simply not enough. After all, important decisions about a facility’s environmental profile are made on the basis of that data, so it needs to be spot-on.

The key to meeting environmental goals is to collect, clean, validate and then analyze the data relating to energy efficiency, carbon emissions and water consumption, so the business can have confidence in it. In this way, data center managers can quickly understand which areas need adjustment and remove the risk of making poorly informed decisions based on bad data when planning changes or improvements for each facility.

Another crucial point for organizations with clean energy objectives is the planning of data centers. A report containing incorrect data could lead to a design that struggles to meet the business requirements, or to large budgets being spent without a verifiable return on investment. Analysis of available data can help to ascertain the most economic, sustainable and cost-effective design options and locations before a spade even hits the ground.

It is reassuring to see so many renewable and environmentally-minded projects being initiated by world leading organizations. Let’s hope they pay as much attention to clean data as they do to clean energy.  

Data Informed: Picking Wisely: Choosing Your Data Center Site with Analytics

Before you buy a new house, you measure its cost against its potential future value. You assess utility bills and calculate whether you could afford a yearly increase. You even decide what furniture you need or whether there are better-styled alternatives to existing items.

This is essentially the same process a business should take when selecting a site for a new data center. How much does land cost? What about the unrecoverable costs of construction and are they all manageable? Is there renewable power available? Will hardware from one vendor meet environmental and efficiency requirements over another supplier?

Read the full article here

Mission Critical: How Much Water Do Data Centers Drink?

Ask most data center staff how their professional performance is measured and it will be on availability, nothing else. Financial or operational efficiency doesn’t matter. Their main responsibility is ensuring workers and customers can access the data and business applications they rely on every day.

If the engineering manager requests a batch of new servers or hardware critical to maintaining availability, then the CFO has little choice but to comply. After all, the data center can never fail. It powers everything our world runs on – national economies and banking, transportation, social media.

Read the full article here

The VAR Guy: Data Validation: The Opportunity for the Channel

Plenty of enterprises are unsure which data are appropriate to meet their business objectives. Fortunately for channel players, this complexity equals new sales opportunities.

Data is generated by everything – from buildings and machinery to employees and customers. Each of these data sources has the power to change the relative success of a company, though only if data is understood, accurate, analyzed and acted upon.

Read the full article here

Romonet named an Innovator by IDC Report

We’re one of only three vendors in IDC’s latest report recognising innovative companies in the data center industry. Our platform offers complete data center lifecycle analytics and, as IDC states, “provides a single, accurate way of reporting data to key decision makers.”

"Running an agile IT environment requires an equally agile physical facility that is prepared to accommodate demanding and fluctuating IT loads. Technologies that improve the ability to manage the physical environment are essential, especially as data center resources become more distributed to support digital transformation and IoT initiatives," said Jennifer Cooke, research director, Datacenter Trends & Strategies at IDC.

Read the full report.

Sustainability is back on the agenda

It’s been quite a while since my last blog; I’ve been pretty busy at work and have been keeping abreast of the latest in analytics, machine learning and AI technologies. There has been a pretty big resurgence in the world of sustainability, especially in the data center sector. Initially I was a little surprised, but there’s more substance to the movement than there was pre-2009, before the economic crash pretty much killed sustainability and green initiatives as a board-level issue.

Before the last recession the green movement in the data center sector had gathered quite some pace. There was a lot of good work done by the BCS, Green Grid, EPA, LBNL, METI and others around metrics and tracking of how green a data center was. Indeed, it was this very movement that gave birth to the J.R.R. Tolkien of metrics, PUE - one ring/metric to rule them all… get it?

Even way back then, when I was chairman of the BCS Data Center Specialist Group, raising awareness of data center energy efficiency (or lack thereof) was best done by talking to environmental lobbyists.

Greenpeace started its Click Green Report back in 2010, naming and shaming companies for how green their data centers weren’t. The Click Green program initially examined energy efficiency in the data center but has evolved to encompass a much broader scope since then.

When the global economic downturn descended upon us most of the less publicly visible corporate world (which back then included most data center companies) put green on the back burner and focused on saving money instead.

I have to say that this always seemed an unwise move to me, because in most cases any green initiative worth its salt, especially in the energy efficiency arena, should have a good financial ROI and not just a ‘green brownie points’ ROI. The issue was that many didn’t have the tools, context or knowledge to assess and build strong green and financial business cases that could stand up to scrutiny or any sort of third-party validation.

Thus I was very happy this year when my conversations with both customers and other industry pundits once again started to include the green agenda, now more often referred to as sustainability. I prefer the term sustainability as it is much more encompassing of the broader agenda beyond energy efficiency, such as water consumption, embodied carbon, sustainable construction practices, etc.

Before everyone jumps back on the ‘we need more metrics’ bandwagon, let me say: no we don’t! Stop, put down the white paper draft on the new ‘super all-encompassing one sustainable metric to rule them all’! Please just keep it simple: collect, track and analyse data indicating your energy efficiency, carbon emissions (a calculation from energy) and water consumption, and you’ll be well on your way to improving your data center’s sustainability efforts.
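
And the carbon figure really is just a calculation from energy. A short sketch, assuming an illustrative grid intensity factor (real factors vary by grid and year):

    # Carbon emissions derived from metered energy via a grid intensity
    # factor. The factor below is illustrative, not a published figure.
    energy_mwh = 4_200.0                 # metered annual consumption
    kg_co2_per_mwh = 233.0               # assumed grid carbon intensity
    tonnes_co2 = energy_mwh * kg_co2_per_mwh / 1000.0
    print(f"{tonnes_co2:,.0f} tonnes CO2")   # ~979 tonnes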

Oh, and by the way, remember that raw data from sensors and instrumentation points is often not an accurate representation of what’s going on, so don’t simply believe it (trust me, we clean and validate data for a living). And if you are a service provider, not being able to allocate a fair share of your overall carbon emissions to your customers is less than ideal.

Sustainability is a board level issue again and claiming your IT is zero carbon because it’s all in the cloud is not going to cut it as far as brand value is concerned, at least in the public’s eyes.

Zahl Limbuwala, Romonet CEO

How we became a Data Center Knowledge Startup to Watch

We passionately believe in the potential of Big Data and the power of our Platform and it is flattering when organizations in our industry recognise what we are doing.

Earlier this year Data Center Knowledge named us as a 2016 Startup to Watch. The editorial team chose a selection of companies addressing some of the most significant challenges facing data center managers and executives today.

Most of these companies have emerged from humble beginnings and evolved rapidly, us included. Just a few years ago, being able to compare expected and actual performance of the data center, model, simulate, predict and control a data center’s energy consumption, capacity, total cost of ownership and environmental risks was considered an elusive concept.

We believe we are changing that perception. Take Intel as an example. With our Platform the company is providing its clients with the operational understanding they need to make more informed decisions.

Intel's theory was that, given the significant energy costs and capital expenses associated with cooling technology, there must be opportunities to run data center facilities at higher ambient temperatures. Our Platform proved Intel's theory to be correct.

Intel's challenge is mirrored by many other enterprises posing questions such as how do you model capacity and predict technology inflexion points? How do you know when to implement the right technologies to deliver the greatest return on investment? What happens to IT performance if you challenge accepted operational parameters and push boundaries?

In the last seven years, we have modeled 500 data centers with 98% accuracy, justified $800 million worth of investment and answered the above-mentioned questions for enterprise data centers and cloud and colocation providers.

Finance and Operations Working in Harmony

At the crux of these 'what if' scenarios is one financial question – how much does it truly cost to run a data center?

In another recent Data Center Knowledge article, the publication explained how complex that question is to answer without tools such as Romonet.

This is where Romonet adds value to an organization. With the power of predictive analytics, both enterprise-class data centers and businesses providing hosting, colocation and cloud services (multi-tenant data centers) can address inefficiencies, uncover significant savings, increase infrastructure performance and maximize profitability.

That said, sometimes the information we deliver is used for alternative purposes. For example, Iceotope manufactures servers for cloud service providers and HPC environments. Its liquid-cooled server platform has been modeled and engineered to ensure it harvests as much heat from electronics as possible in the most efficient way. As a result, organizations can reduce data center cooling costs by up to 97%, ICT power load by up to 20% and overall ICT infrastructure costs by up to 50%.

Iceotope used Romonet to analyze and prove the performance benefits of its technology compared to traditional, air-cooled servers. Armed with this accurate, quantifiable data, the company secured $10 million in funding to continue developing its technology.

The challenges facing data center operators and managers extend far beyond simplistic energy targets. They include everything from profit & loss (P&L) targets, return on investment (ROI), total cost of ownership objectives and Corporate Social Responsibility (CSR), to regulatory compliance and how a company sources its natural resources.

Designing a Platform that solves this multitude of challenges is an exciting path; however, it is always made more satisfying when those in the industry agree with what we’re attempting to achieve.

To PUE or not to PUE? Is that the question?

"OMG!" I hear you say! Not another blog wanting to debate the pros and cons of PUE!
I'm not writing this to re-open (did it ever close?!) the debate about PUE. I'm here to talk about how those of you who use it today to track your data center performance can greatly improve its value to you and your business.

But first a little history.....

Many years ago in a land far far away...well it wasn't that far actually; it was Milan in northern Italy. Three guys sat around a dinner table chatting about how the data center industry just needed to start measuring something simple that gave an indication of how efficiently data centers were using energy.

I was one of the three, along with Liam Newcombe (my CTO) and Christian Belady (Mr Data Center at Microsoft). We'd just spent the day together at the European Commission's Joint Research Center (JRC) in Ispra, Italy, a very impressive campus where real science happens, funded mostly by the EU member states.

While it's an impressive site, because of the large number of non-European attendees the meeting didn't actually take place inside the JRC, but rather in the big meeting hall above the JRC tennis club, just outside the high-security fences of the JRC grounds themselves.

Christian had done a good job at that meeting of pitching PUE for use within the European Code of Conduct for data centers.

Luckily, Christian and Liam (who was the primary author of the original code and its best practice guidelines at the time) saw eye to eye about the use of PUE. It was the first time they'd met but it was clear to me (being mostly a spectator during much of the conversation that transpired over dinner) they were both cut from the same cloth.

With the might of the Green Grid and many vendors behind it, PUE went on to become the de facto metric for representing data center infrastructure efficiency (can you spot the irony there?).

Today many people spend many hours of their lives trying to explain to others in their company, usually the senior ranks, why the data center PUE getting "worse" (becoming a larger number) is not necessarily a "bad" thing, and doesn't necessarily mean they haven't done their job properly in terms of looking after the data center.

The problem with PUE (and clearly the industry knows that PUE is far from a perfect metric) is that using the absolute PUE number to track the performance of a site only tells part of the story.

"We know this already" I hear you say...

Yes, you already know that without asking what the corresponding utilization is, you can't really take a view on whether the number in front of you is good, bad or indifferent, or whether it can or should be improved (we already know that every data center reaches an inflection point where you start trading TCO for PUE). In an economized data center, you'd also be well advised to ask what the climate did too.

Most organizations today target their data center managers on an absolute reduction of their site's PUE, but is that really a good plan? How do you know when you've reached that inflection point? If you keep shooting for as low as you can go, you may unknowingly be targeting the DC manager to increase your overall Total Cost of Ownership!

Looking at the absolute PUE value is also a bit of an unfair measure for the site manager, because site managers generally have no control over what happens with the IT load; servers come in, servers go out, their level of utilization fluctuates, and so on. We already know that improving server utilization, through virtualization and consolidation for example, will often make your PUE worse because the total IT load goes down; that is a good thing for any enterprise IT operator, but generally a bad thing for a colo operator.
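
A small worked example makes the point (invented numbers, with the simplifying assumption that facility overhead is entirely fixed):

    # Consolidation lowers IT load, but facility overhead (cooling, UPS
    # losses, lighting) is largely fixed, so PUE rises even though total
    # energy falls. Numbers are invented.
    overhead_kw = 300.0

    def pue(it_kw):
        return (it_kw + overhead_kw) / it_kw

    print(f"before consolidation: IT 1000 kW, PUE {pue(1000):.2f}")   # 1.30
    print(f"after consolidation:  IT  600 kW, PUE {pue(600):.2f}")    # 1.50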

In an economized site the PUE will vary significantly with the outside temperature. I've yet to meet a DC manager who has any control over their local climate, so a particularly warm year may mean it's simply impossible to meet their PUE reduction target for the year.

Dynamic setting of PUE targets and tracking performance against them

With the introduction of predictive system-level modeling for data centers (and no, I don't mean a CFD model) it is possible to build a highly calibrated model (98% calibration accuracy) that will allow you to do a number of things:

  1. Verify that your data center is performing at its optimum PUE, given the way it's been designed and built and the load and climate it's operating with.
  2. Where it's not operating at its optimum PUE, show you why, as well as where and how to improve it.
  3. Using the metered data from the site, and continuously feeding in the actual climate data and actual IT load, the calibrated model of a now fully optimized data center will continuously and dynamically tell the site manager what the PUE "should be" if everything is working as expected - something we call the "expected PUE".

Now, with this dynamically calculated "Expected PUE" to compare against the actual PUE, the target for the site manager should be to keep the "Expected vs Actual PUE" within an acceptable tolerance. Remember, the expected PUE will automatically adjust itself for variation in IT load and climate, so it's a fair and equitable target that more appropriately represents the actual domain of control a site manager can impact.
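
As a sketch of what tracking against such a target might look like, assume a 3% tolerance band and treat the calibrated model as a black box that produces the expected PUE for each period:

    # Sketch: is the actual PUE within tolerance of the model's expected
    # PUE for the period? The 3% band and all values are assumptions.
    TOLERANCE = 0.03

    def within_tolerance(actual_pue, expected_pue):
        return abs(actual_pue - expected_pue) / expected_pue <= TOLERANCE

    # Mild month: model expects 1.28, site measures 1.30 -> within target.
    print(within_tolerance(1.30, 1.28))   # True
    # Hot month: model expects 1.36, site measures 1.45 -> investigate.
    print(within_tolerance(1.45, 1.36))   # False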

Now of course you may say "well, I could still improve the PUE by making more impactful changes" and you'd be right: whether it's increasing set points, changing to a different control strategy or upgrading to more efficient drives or equipment, all of these things could well improve the site's PUE.

Another benefit of having a calibrated predictive model is that you can rapidly try out all the different things you might do to your site to improve its PUE. And if the model is capable of modeling cost as well as PUE, you can make some really well-informed decisions about which actions to take to reduce the absolute PUE of your site, without unknowingly going past that TCO vs PUE inflection point.

Don't sit back and think you've just got to live with being beaten regularly with the internal PUE stick! There is a much smarter and more valuable way to use this important industry metric: one that will help you manage and reduce PUE using meaningful, achievable targets that take account of all the variables that impact the site's performance.

Zahl Limbuwala, CEO of Romonet

The Rise and Rise of the Data Center CFO

The data center market has enjoyed many decades of almost unbridled, recession-proof growth. But this year, more than any other, we can see the inevitable signs of a market that's rapidly maturing and finding its longer-term feet in the form of bigger, stronger and (in theory) more financially sustainable data center businesses: businesses that provide the core underpinning resource of the digital and internet-based economy.

That said, this is a significantly different marketplace than it was just a year ago.

The focus has shifted from top to bottom line performance, and with the ever-increasing challenge of understanding, controlling and managing the financial drivers of these businesses, a long overdue change in operational and financial management is required.

It used to be sufficient to put together an Excel-based financial model for each asset that spoke to the capital requirement, expected operating costs and projected revenues, and thus provided a pretty good macro-level yield model.

Roll many of those together and, so long as your model was relatively conservative and you had salespeople who could sell, it was almost a sure thing that you'd have a free-cash-flow-generative business.

For most operators there were good deals done over the years, but also some less-than-great ones as far as acquiring customer revenue was concerned. Quite often the commercial model was tweaked, and even more often larger customers were sold capacity on "special pricing or special terms".

Thus today most multi-tenant operators have a mix of good and 'bad' customers from a revenue and, more importantly, a margin perspective. However, understanding exactly what the margin per customer actually is, is an amazingly complex and time- and resource-consuming task. In fact it's worse than that: while doing a point-in-time analysis is hard, the reality is dynamic, so by the time you've figured it out the variables have changed and your information is already out of date!
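
To make the shape of the problem concrete, here is a deliberately simplified per-customer margin sketch (all figures invented, and energy is the only cost modelled; a real analysis would also allocate space, maintenance and amortised capital):

    # Per-customer margin sketch for a colo: energy cost is the customer's
    # metered IT load grossed up by site PUE. All figures are invented.
    site_pue = 1.45
    tariff_per_kwh = 0.12            # currency units per kWh
    hours_per_month = 730

    customers = {                     # name: (monthly fee, avg metered IT kW)
        "good_deal":     (90_000, 400),
        "special_terms": (55_000, 380),
    }

    for name, (fee, it_kw) in customers.items():
        energy_cost = it_kw * site_pue * hours_per_month * tariff_per_kwh
        margin = fee - energy_cost
        print(f"{name}: revenue {fee:,}, energy {energy_cost:,.0f}, "
              f"margin {margin:,.0f} ({margin / fee:.0%})")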

CFOs of all competent data center businesses out there will recognize this problem, because it's what they are grappling with right now. If they aren't feeling these issues yet, it's either because of the inertia their business already has (mostly a function of its financial size), or because they are in even bigger trouble than they realize!

CFOs have risen to prominence alongside the CIO within corporate enterprise due to the ever increasing budget and importance of technology to a business's competitive advantage - or even just its continued existence in heavily commoditized markets.

The CFO is about to rise to prominence in a similar way within data center companies.

It's no longer tenable to try and manage capital and operational spend using spreadsheets and a finance system alone. Financial models need to be tied to operational models or one will mislead the other, leading to tears and gnashing of teeth.

Financial planning and modeling can no longer be a once-a-year, static, high-cost, time-intensive exercise. Every deal must be rapidly assessed and its margin understood before a contract is signed. Further, the financial performance of each customer must be automatically tracked, so that when the dynamic of the asset changes, either intentionally or unintentionally, the impact on the financial return is immediately visible.

All of this requires new tools, new capability and automation between operations, engineering and finance that's never existed before, and it is far from easy to create!

Luckily, Romonet exists to solve these problems and meet this need. We've been anticipating it for the last eight years, so guess what: we are ready and able to help the data center industry go through this next stage of its economic maturity.

Zahl Limbuwala, CEO of Romonet