Machine Learning enhances data center performance

Machine Learning is arguably the most powerful tool we have seen so far for unlocking the potential of Big Data, and one of the sectors that stands to benefit most from it is the data center industry.

Given that most companies are highly reliant on the performance of their data centers to ensure services can be delivered and productivity maintained, there is a constant, acute pressure on facilities managers to predict, manage and react to any change that could adversely affect operational availability and performance.

As a result, data centers are starting to realise the potential of Machine Learning technologies to automate and improve their ability to keep these facilities performing optimally. Furthermore, while many companies still use rudimentary tools to track and oversee data center performance, it is becoming increasingly difficult for humans to manually analyse data directly from DCIM systems and verify whether the data collected by the vast sensor networks monitoring these facilities is accurate.

Only last month, the Royal Society published a report that looked at the power and promise of Machine Learning. Its focus was on how this area of computing will reshape the UK economy and people’s lives. The report also asked questions about who will be affected, how the benefits will be distributed and where the opportunities for growth lie.

The benefits for data centers have already been proven and the knock-on effects are many and various. Google recently explained that it is using its DeepMind Machine Learning technology to manage power consumption at its data centers by dynamically tuning their performance to reduce their operational energy consumption. In one of Facebook's data centers meanwhile, its Big Sur servers are training Machine Learning systems to ‘read’ images and videos to the blind and display over two million translated stories every day.

In the same way, Machine Learning has a big part to play in further enhancing analytics tools for data centers. The ability to generate validation models for performance analysis, investment analysis and even for assessing the suitability of a location for a new data center offers immense advantages such as reducing costs and improving operating performance.

Our own analytics platform identifies anomalies and can help track down the causes behind a symptom based on comparison to a calibrated predictive model as well as aggregated data ‘knowledge’ collected over years from over 350 data centers around the world. In this way we’re able to provide our customers with powerful and actionable insight on their operations and systems so they can maintain peak performance at all times and increase ROI.
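The model-versus-measurement comparison described here can be sketched in a few lines. Everything below is purely illustrative, not Romonet's actual platform: the metric names, the 5% tolerance and the shape of the model output are all assumptions.

```python
def find_anomalies(readings, predictions, tolerance=0.05):
    """Flag metrics whose measured value deviates from the model's
    prediction by more than `tolerance` (as a fraction of the prediction)."""
    anomalies = {}
    for metric, measured in readings.items():
        expected = predictions.get(metric)
        if not expected:          # no prediction (or zero) -> can't compare
            continue
        deviation = abs(measured - expected) / expected
        if deviation > tolerance:
            anomalies[metric] = {"measured": measured,
                                 "expected": expected,
                                 "deviation": round(deviation, 3)}
    return anomalies

readings = {"chiller_kw": 118.0, "ups_kw": 410.0}     # metered values
predictions = {"chiller_kw": 100.0, "ups_kw": 405.0}  # calibrated model
print(find_anomalies(readings, predictions))
# chiller_kw deviates by 18% -> flagged; ups_kw by ~1.2% -> within tolerance
```

The value of the calibrated model is in the `predictions` side: a flagged deviation points the operator at a symptom, and the historical 'knowledge' base helps narrow down likely causes.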

Romonet Launches Disruptive Free Tool to Help Companies Choose the Best Data Center Location

Romonet, the award-winning data center analytics company, has launched a new tool – Site Analysis Tool (SAT) – aimed at helping businesses and data center managers to quickly analyze and compare the impact of location and design on the performance of their data centers.

SAT is freely available on Romonet’s website to all business leaders and operational staff who want to understand and compare the differences in performance between the most common data center designs.

The tool enables users to compare the energy, PUE (Power Usage Effectiveness), energy-cost and CO2 performance of each design across many different climates in Europe and North America. Currently Romonet’s tool covers 69 locations in the US and 22 locations in Europe and the company plans to continue adding locations as its data grows.
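As a rough illustration of how these comparison metrics relate, PUE is simply total facility energy divided by IT energy, and cost and CO2 follow from a tariff and a grid emission factor. The figures, tariff and emission factor below are made-up examples, not SAT's real data:

```python
def site_metrics(total_facility_kwh, it_kwh, tariff_per_kwh, kg_co2_per_kwh):
    """Headline comparison metrics for one location/design pair."""
    pue = total_facility_kwh / it_kwh  # PUE = total energy / IT energy
    return {
        "pue": round(pue, 2),
        "annual_cost": round(total_facility_kwh * tariff_per_kwh, 2),
        "annual_co2_kg": round(total_facility_kwh * kg_co2_per_kwh, 1),
    }

# e.g. a facility drawing 12,264 MWh to serve 8,760 MWh of IT load in a year
print(site_metrics(total_facility_kwh=12_264_000, it_kwh=8_760_000,
                   tariff_per_kwh=0.10, kg_co2_per_kwh=0.4))
# -> {'pue': 1.4, 'annual_cost': 1226400.0, 'annual_co2_kg': 4905600.0}
```

Because the tariff and emission factor vary by location while the design fixes the overhead, the same design can rank very differently across regions, which is exactly the trade-off the tool lets users explore.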

Users can either select a specific location and see how the four data center types compare in performance, or they can select a specific data center type to see how it performs across different geographical and climate regions. To ensure the simulation is as accurate as possible, users can also select the estimated IT load of the data center.

Users can also download a PUE surface plot that shows the expected PUE for all load points and external climate conditions. Romonet’s PUE surface plots are among the industry’s most widely recognised visual outputs.

“In many cases, when deciding on the location or design of a data center, companies would invest significant resources and time to get the necessary insight into the options most suitable for their business needs. SAT offers this information for free in a matter of seconds. I hope this will shake up the industry and make many business leaders reconsider the value and power of analytics data in decision-making regarding their data centers,” said Zahl Limbuwala, co-founder, Romonet.

SAT’s results are generated using over 1,600 simulation years’ worth of data processed with Romonet's patented Data Center Analytics technology.

Each simulation model represents a 1MW data center operating at one of four load points for twelve months using TMY (Typical Meteorological Year) data, calculating performance for each hour of the simulated year. These results are then totaled to provide typical annual performance metrics for each climate and energy cost region.
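The hour-by-hour aggregation described above can be sketched as follows. The cooling-overhead model here is a toy stand-in (overhead rising with outdoor temperature), not the calibrated models Romonet uses, and all figures are illustrative:

```python
def simulate_year(hourly_temps_c, it_load_kw):
    """Sum hourly performance over a typical meteorological year (TMY)
    into annual totals, as the press release describes."""
    total_it_kwh = 0.0
    total_facility_kwh = 0.0
    for temp in hourly_temps_c:                # 8,760 hourly TMY records
        overhead = 1.15 + 0.01 * max(temp, 0)  # toy: overhead grows with temp
        total_it_kwh += it_load_kw             # one hour at it_load_kw
        total_facility_kwh += it_load_kw * overhead
    return {"it_kwh": total_it_kwh,
            "facility_kwh": total_facility_kwh,
            "annual_pue": total_facility_kwh / total_it_kwh}

# e.g. a 1 MW (1,000 kW) load in a climate held at a constant 20 °C
result = simulate_year([20.0] * 8760, it_load_kw=1000)
print(result["annual_pue"])   # constant overhead -> annual PUE of ~1.35
```

With real TMY data the hourly temperatures vary, so the annual PUE is a load-weighted average over the whole year rather than any single design point.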

“SAT is a great starting point when looking at the initial design and location choices for a new data center. However, as the industry evolves and innovates data centers become increasingly sophisticated and generic data is simply not enough to model and calibrate a highly efficient, reliable facility. Romonet provides a range of data center lifecycle services that can guide data center decision makers with accurate analytics data based on their specific and unique characteristics.

“Data centers are now under close scrutiny when it comes to their financial, energy and environmental performance. Business leaders need to up their game and realize that they need to analyse data center telemetry constantly and compare their data centers’ performance against a dynamic baseline if these facilities are to support business growth, meet their original business case and not become a risk,” said Limbuwala.

Why clean data is as important as clean energy

Zahl Limbuwala, Co-Founder & Executive Director

Renewable energy initiatives have been on the news agenda over the last couple of weeks.

According to Bloomberg, a large proportion of the Fortune 500 has set clean energy goals in response to the savings generated by renewable power. As companies amass huge amounts of data, a significant part of their strategy for reaching these ambitious goals will involve data centers, whether a business owns them, builds them or uses them in the cloud.

Apple is leading the way in this area. The company recently released its Annual Environmental Responsibility Report which provides a detailed outline of the steps it is taking to ensure its data centers are environmentally friendly.   

Of course, in order to assess progress and success, these companies will also need to track and report against sustainability and energy-efficiency metrics.

Accurate, reliable data is critical

But metrics are only as good as the accuracy of the data feeding into them. If companies put sustainability at the core of their business strategies, the metrics they set will be heavily scrutinized.

So, what happens if raw data from data centers is not properly cleaned and validated, leading to weeks and even months of incorrect and misleading information?

The result will be an embarrassing anomaly in the resulting operational report and a lot of awkward explaining to managers, stakeholders and potentially customers and shareholders.  

Accurate, reliable data is central to a serious sustainability initiative; collecting raw data and presenting it is simply not enough. After all, important decisions about a facility’s environmental profile are made on the basis of that data, so it needs to be spot-on.

The key to meeting environmental goals with confidence is to collect, clean, validate and then analyze the data relating to energy efficiency, carbon emissions and water consumption, so the business can trust it. In this way, data center managers can quickly understand which areas need adjustment and remove the risk of making poorly informed decisions based on bad data when planning changes or improvements for each facility.
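As a minimal sketch of that collect-clean-validate flow, a validation pass might drop readings outside a physically plausible range and fill single-point gaps from their neighbours. The rules, ranges and figures below are illustrative only, not a real validation rule set:

```python
def clean_and_validate(raw_readings, lo, hi):
    """Reject readings outside the plausible range [lo, hi], then
    interpolate single-point gaps from the neighbouring samples."""
    cleaned = [r if (r is not None and lo <= r <= hi) else None
               for r in raw_readings]
    for i, r in enumerate(cleaned):
        if r is None and 0 < i < len(cleaned) - 1:
            left, right = cleaned[i - 1], cleaned[i + 1]
            if left is not None and right is not None:
                cleaned[i] = (left + right) / 2   # fill single-point gap
    return cleaned

raw = [410.0, 409.5, -1.0, 411.0, 9999.0, 410.5, 410.0]  # kW, two glitches
print(clean_and_validate(raw, lo=0.0, hi=2000.0))
# -> [410.0, 409.5, 410.25, 411.0, 410.75, 410.5, 410.0]
```

Without this step, the two glitch readings (-1.0 and 9999.0 kW) would flow straight into the sustainability metrics and produce exactly the kind of embarrassing anomaly described above.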

Another crucial point for organizations with clean energy objectives is the planning of data centers. A report containing incorrect data could lead to a design that struggles to meet the business requirements, or to large budgets being spent without a verifiable return on investment. Analysis of available data can help to ascertain the most economic, sustainable and cost-effective design options and locations before a spade even hits the ground.

It is reassuring to see so many renewable and environmentally-minded projects being initiated by world leading organizations. Let’s hope they pay as much attention to clean data as they do to clean energy.  

Romonet named an Innovator by IDC Report

We’re one of only three vendors in IDC’s latest report recognising innovative companies in the data center industry. Our platform offers complete data center lifecycle analytics and, as IDC states, “provides a single, accurate way of reporting data to key decision makers.”

"Running an agile IT environment requires an equally agile physical facility that is prepared to accommodate demanding and fluctuating IT loads. Technologies that improve the ability to manage the physical environment are essential, especially as data center resources become more distributed to support digital transformation and IoT initiatives," said Jennifer Cooke, research director, Datacenter Trends & Strategies at IDC.

Read the full report.

The unquenchable thirst of the data center

Romonet explains the importance of measuring water in the data center and what can be done to reduce consumption.

What operators need to do is find a balance that satisfies business stakeholders, shareholders, the press and green lobbyists.

Read the full article here on The Stack.

CSR scrutiny has turned to data centers, their management practices and their widespread impact on the environment. Times are changing, and water consumption is now a major component of company CSR policies.

Sustainability is back on the agenda

It’s been quite a while since my last blog; I’ve been pretty busy at work and keeping abreast of the latest in analytics, machine learning and AI technologies. There has been a pretty big resurgence in the world of sustainability, especially in the data center sector. Initially I was a little surprised, but there’s more substance to the movement this time compared to before the 2009 economic crash, which pretty much killed sustainability and green initiatives as a board-level issue.

Before the last recession the green movement in the data center sector had gathered quite some pace. There was a lot of good work done by the BCS, Green Grid, EPA, LBNL, METI and others around metrics and tracking of how green a data center was. Indeed, it was this very movement that gave birth to the J.R.R. Tolkien of metrics, PUE - one ring/metric to rule them all… get it?

Even way back then, when I was chairman of the BCS Data Center Specialist Group, raising awareness of data center energy efficiency (or lack thereof) was best done by talking to environmental lobbyists.

Greenpeace started its Click Green Report back in 2010, naming and shaming companies for how green their data centers weren’t. The Click Green program initially examined energy efficiency in the data center but has evolved to encompass a much broader scope since then.

When the global economic downturn descended upon us most of the less publicly visible corporate world (which back then included most data center companies) put green on the back burner and focused on saving money instead.

I have to say that this always seemed an unwise move to me, because in most cases any green initiative worth its salt, especially in the energy efficiency arena, should have a good financial ROI and not just a ‘green brownie points’ ROI. The issue was that many didn’t have the tools, context or knowledge to assess and build strong green and financial business cases that could stand up to scrutiny or any sort of third-party validation.

Thus I was very happy this year when my conversations with both customers and other industry pundits once again started to include discussions about the green, now referred to more often as sustainability, agenda. I prefer the term sustainability as it is much more encompassing of the broader agenda beyond energy efficiency, such as water consumption, embodied carbon, sustainable construction practices, etc.

Before everyone jumps back on the ‘we need more metrics’ bandwagon, let me say: no we don’t! Stop, put down the white paper draft on the new ‘super all-encompassing one sustainable metric to rule them all’! Please just keep it simple: collect, track and analyse data indicating your energy efficiency, carbon emissions (this is a calculation from energy) and water consumption, and you’ll be well on your way to improving your data center’s sustainability efforts.
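To make the point concrete, here is a hedged sketch of those three headline numbers; note that carbon really is just a calculation from energy. The emission factor and all figures below are examples, not real site data, and WUE here means litres of water per kWh of IT energy:

```python
def sustainability_summary(facility_kwh, it_kwh, water_litres, kg_co2_per_kwh):
    """The three headline numbers: efficiency, carbon, water."""
    return {
        "pue": facility_kwh / it_kwh,                        # energy efficiency
        "co2_tonnes": facility_kwh * kg_co2_per_kwh / 1000,  # carbon from energy
        "wue_l_per_kwh": water_litres / it_kwh,              # water per IT kWh
    }

summary = sustainability_summary(facility_kwh=12_000_000, it_kwh=8_000_000,
                                 water_litres=14_400_000, kg_co2_per_kwh=0.4)
print(summary)   # PUE 1.5, ~4,800 tonnes CO2, WUE 1.8 L/kWh
```

Three simple, trackable numbers; no new super-metric required.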

Oh, and by the way, remember that raw data taken straight from sensors and instrumentation points is not always an accurate representation of what’s going on, so don’t simply believe it (trust me, we clean and validate data for a living). And if you are a service provider, not being able to allocate a fair share of your overall carbon emissions to your customers is less than ideal.

Sustainability is a board level issue again and claiming your IT is zero carbon because it’s all in the cloud is not going to cut it as far as brand value is concerned, at least in the public’s eyes.

Zahl Limbuwala, Romonet CEO

And the Winner is…

BusinessGreen Leaders Award Winner IT Project of the Year

WINNER: Romonet, Slashing Data Centre Water Usage Project

The judges praised Romonet for its commitment to tackling the oft-ignored problem of data centre water use through its Big Data Platform, which provides the data and modelling functionality needed to help businesses manage and optimise their IT-related water use. 

Romonet’s Analytics Platform: New Machine Learning Capabilities to be Launched

Romonet today announced it is filing for a number of new patents for the next phase of its data center intelligence platform, utilizing the applications of Machine Learning. The company is known as the leader in data center analytics and this development enhances the value of Romonet’s already patented solution.

"We have been working on advanced data handling and Machine Learning algorithms for over a year, focusing predominantly on enhancing our solution to learn and become as proficient as our human data scientists are today at identifying anomalies, and tracking down the cause behind the symptom. This capability provides powerful operational and business insight into data center systems and component level performance," said Liam Newcombe, Romonet's co-founder and CTO.

Having modeled, collected data from and analyzed hundreds of data centers over the past eight years, Romonet’s platform has an incredibly detailed and expansive data archive on how facilities of every size perform under different climate, environmental, energy, IT and commercial factors.

While the use of Machine Learning applications is not new, Romonet’s platform is the first to combine metered data, Machine Learning, simulation and predictive analytics.

"In our case, teaching the machine is much faster as we feed it pre-cleansed and calibrated data to recognize and learn patterns, incorporate additional data from outside sources, and teach the software to suggest causes and recommended actions from previously learned results," continued Newcombe.

Like Google, Amazon, Cisco and Netflix, which already use Machine Learning to personalize services and business intelligence, Romonet's industry-leading platform is revolutionizing the booming global data center market.

With Romonet, Hyperscale and Multi-Tenant Data Center (MTDC) operators are improving the services they provide to their customers while strengthening financial management through investment, cost and margin analysis. Enterprise data center owners, whose facilities, while core to service delivery, are also a drain on profitability, are rationalizing their investments, accurately planning a hybrid (owned, colocated and cloud) strategy years into the future, and improving their ability to make socially responsible decisions that impact the environment, shareholders and employees.

Romonet Launches Water Analytics for the Data Center

Romonet has issued a strong call to action for the world’s executives, enterprises and data center operators to immediately act on one of the most urgent challenges facing the global economy and environment today.

To support this campaign, Romonet has developed the world’s only Big Data and predictive analytics Platform that enables organizations to solve their water-related sustainability, financial and operational challenges.

Romonet’s Platform already accurately models and simulates data center energy and total cost of ownership, and now provides the ability to precisely measure water efficiency, capacity, consumption and the cost of water. This lets organisations analyze and understand the trade-off between energy, cost and water consumption.

Data center water consumption is rising rapidly as organizations trade improved power efficiency for unsustainable water usage practices. Massive amounts of water are pumped into data center cooling towers for energy efficiency purposes, a strategy that is particularly prevalent in hot climates and desert regions where public water supplies are under enormous pressure. Furthermore, during the process water is often treated with a cocktail of industrial chemicals and later drained back into municipal sewage systems.

Any organization that does not address its water consumption risks intense scrutiny from the general public, customers and shareholders; environmental lobbying groups such as Greenpeace; and from national and regional governments which are tasked with tackling large-scale industrial usage and identifying ways to secure fresh water reserves for agricultural and national security purposes.

"The efficient and judicious management of resources to power IT will only become more challenging in the next few years, and the ability to automate tasks and leverage analytics to drive decisions will be a competitive differentiator for data center managers. Romonet's Platform meets a critical need in the market for increased visibility into resource usage and management, in particular water consumption and cost," said Jennifer Cooke, research director at IDC.

California is a clear example. The Wall Street Journal reports there are 800 data centers consuming an estimated total of 158,000 Olympic-sized swimming pools of water each year, in an American state recovering from one of its most severe droughts on record.

Organizations in California and across the world must demonstrate a commitment to lowering water usage, both directly and in related processes such as transportation, logistics, filtration, recycling and long-term storage.

Zahl Limbuwala, CEO of Romonet, said, "Water is one of the largest threats to international stability and data centers are voluntarily using fresh water reserves as though they are infinite. With water subsidies ending and public pressure mounting, organizations cannot be frivolous with how they treat our environment. Water is not merely a cost challenge, but a highly sensitive CSR objective. This challenge must be addressed now, not in the future when it is too late. Organizations should act positively before they potentially find themselves under the public spotlight for what some consider corporate mismanagement and environmental indifference."

 

The Rise and Rise of the Data Center CFO

The data center market has enjoyed many decades of almost unbridled, recession-proof growth. But this year, more than any other, we can see the inevitable signs of a market that's rapidly maturing and finding its longer-term feet in the form of bigger, stronger and (in theory) more financially sustainable data center businesses. Businesses that provide the core underpinning resource of the digital and internet-based economy.

That said, this is a significantly different marketplace than it was just a year ago.

The focus has shifted from top to bottom line performance, and with the ever-increasing challenge of understanding, controlling and managing the financial drivers of these businesses, a long overdue change in operational and financial management is required.

It used to be sufficient to put together an Excel-based financial model for each asset that spoke to the capital requirement, expected operating costs and projected revenues, and thus provided a pretty good macro-level yield model.

Roll many of those together and, so long as your model was relatively conservative and you had sales people who could sell, it was almost a sure thing that you'd have a free-cash-flow-generative business.

There were good deals, but also some less-than-great deals, done over the years as far as acquiring customer revenue was concerned for most operators. Quite often the commercial model was tweaked, and even more often larger customers were sold capacity on "special pricing or special terms".

Thus today most multi-tenant operators have a mix of good and 'bad' customers from a revenue and, more importantly, a margin perspective. However, understanding exactly what the margin per customer actually is, is an amazingly complex and time- and resource-consuming task. In fact it's worse than that: a point-in-time analysis is hard enough, but the reality is dynamic, so by the time you've figured it out the variables have changed and your information is already out of date!

CFOs of all competent data center businesses out there will recognize this problem, because it's what they are grappling with right now. If they aren't feeling these issues yet, it's either because of the inertia their business already has (mostly a function of its financial size), or they are in even bigger trouble than they realize!

CFOs have risen to prominence alongside the CIO within corporate enterprise due to the ever increasing budget and importance of technology to a business's competitive advantage - or even just its continued existence in heavily commoditized markets.

The CFO is about to rise to prominence in a similar way within data center companies.

It's no longer tenable to try and manage capital and operational spend using spreadsheets and a finance system alone. Financial models need to be tied to operational models or one will mislead the other, leading to tears and gnashing of teeth.

Financial planning and modeling can no longer be a once-a-year, static, high-cost, time-intensive exercise. Every deal must be rapidly assessed and its margin understood before a contract is signed. Further, the financial performance of each customer must be automatically tracked so that when the dynamic of the asset changes, either intentionally or unintentionally, the impact on the financial return is immediately visible.
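A per-customer margin check of the kind described might look like the sketch below. The allocation key (metered IT energy) is one common choice but an assumption here, as are all the figures:

```python
def customer_margins(customers, site_opex):
    """Allocate site operating cost by metered IT energy and report
    each customer's margin against its revenue."""
    total_kwh = sum(c["it_kwh"] for c in customers)
    report = []
    for c in customers:
        allocated_cost = site_opex * c["it_kwh"] / total_kwh
        margin = c["revenue"] - allocated_cost
        report.append({"name": c["name"],
                       "margin": round(margin, 2),
                       "margin_pct": round(100 * margin / c["revenue"], 1)})
    return report

customers = [
    {"name": "A", "revenue": 120_000, "it_kwh": 500_000},
    {"name": "B", "revenue": 60_000,  "it_kwh": 500_000},  # 'special pricing'
]
print(customer_margins(customers, site_opex=100_000))
# same load, half the revenue: B's margin is far thinner than A's
```

In reality both the load mix and the cost base move continuously, which is why the text argues this calculation has to be automated and re-run rather than done as a one-off spreadsheet exercise.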

All of this requires new tools, new capability and automation between operations, engineering and finance that's never existed before, and it is far from easy to create!

Luckily, Romonet exists to solve these problems and meet this need. We've been anticipating it for the last eight years, so guess what: we are ready and able to help the data center industry go through this next stage of its economic maturity.

Zahl Limbuwala, CEO of Romonet