Datacentrix recently participated in a multicloud feature story running in Brainstorm Magazine.
14 April 2021
With data’s creation only set to increase as more devices come online, the trusty datacentre is in need of an upgrade. But with multicloud offerings so readily available, is it worthwhile to refresh your datacentre?
Rudie Raath, Datacentrix (Karolina Komendera)
Datacentre technology is impacted by the same drivers as the rest of the ICT sector. Trends such as 5G, IoT, as-a-Service and environmental awareness are all likely to impact the datacentre. Naturally, the Covid-19 pandemic and associated lockdown have also had an impact.
So says Jason Osner, sales manager for Rittal South Africa, adding: “Without assistance from systems that feature artificial intelligence, IT datacentre specialists will soon be unable to operate large and complex IT systems in a fail-safe manner.
“According to the IDC, by 2022, half the components in large datacentres will already feature integrated AI functions and operate autonomously. To support this development, IT administrators will have to use predictive analytics and machine learning to streamline IT operations. These tools will provide predictive fault forecasts and support optimised load balancing so that companies can ensure that their IT environment offers high availability.”
There’s also an increasing need for decentralised datacentres, says Osner. “Many SMEs used to keep servers or entire racks on-premises, which were operated and maintained by their IT staff. As offices started to close due to the lockdown, businesses were forced to move their hardware to proper datacentres, and interest in colocation services has increased significantly. And as data volumes grow – driven by 5G and IoT – it’s just not possible to store all of this data in an on-site datacentre. Not only will this place undue pressure on the company’s infrastructure, it will cost a fortune. There’s a growing trend towards companies migrating their data to the cloud – and even to multicloud environments, in some instances. Cloud environments typically come with Everything-as-a-Service, freeing up companies to focus on their core business.”
These and other trends are placing organisations under pressure to modernise their datacentres to allow them to operate at the speed of business. However, modernising a datacentre can not only be extremely costly, it can also be a journey that may never end. Businesses should, instead, look at their workloads and identify what they actually need to put into the datacentre, allowing for a clearer understanding of whether it should be modernised or closed down, says Rudie Raath, Datacentrix’ chief digital officer. He adds that as we move towards hybrid IT, it’s no longer about that rack, or having the latest technology in your datacentre; it’s about the location. Companies need to ask questions like, ‘Am I close enough to where everything happens?’, and ‘Am I close enough for cloud on-ramps?’. The answers may show that it’s no longer a question of modernising a datacentre, but whether it’s even still necessary to own a datacentre.
According to Raath, datacentre modernisation delivers no benefit in itself. “Looking at this statement through the lens of an end-user organisation that has its own datacentre, we may discover that, by blindly spending on modernising the existing infrastructure without rethinking their entire datacentre hosting strategy, they could find themselves on the back foot when compared to competitors. Rival companies may have evolved beyond owning a datacentre, moving closer towards an ecosystem of third-party providers that are better able to harness hybrid IT. What’s important to consider here is that datacentre modernisation is no longer about updating brick and mortar and the technology within it, but, rather, rethinking what a datacentre means to an organisation.”
The automation of tasks from an organisational perspective makes doing business a lot simpler and more predictable, adds Mark Reynolds, director, Commercial Business, Sub-Saharan Africa at VMware. “The reality is that your datacentre doesn’t need to be a physical environment in your building anymore. From a financial perspective, things have changed. Whereas in the past there may have been significant capital outlay, new datacentres are more of an opex outlay, particularly if you’re renting space to host your environment, or running a combination of on- and off-premises solutions.”
There are also risks associated with older infrastructure. Obviously, downtime is a major risk, says Raath. The older the equipment, the more failures you’ll see. That’s the bottom line. Previously, downtime could be managed and was certainly more acceptable. Today, it’s measured right down to the second and has a direct impact on the bottom line. In some cases, it could even endanger lives and risk the failure of businesses. Businesses can no longer hold onto ageing infrastructure with the idea of sweating assets. In most cases, they are, in fact, limited in their ability to harness innovative technologies, like hyperscalers, because of that ageing infrastructure. They end up stifling their own potential growth and allowing their competitors, including startups, to bypass them.
The question of risk often circles back to cost when it comes to ageing infrastructure, and how to realise the value in that underlying cost, says Reynolds. Before investing in technology refreshes, businesses must stop to consider whether they are being held back by what they already have. This is an important question. It’s tricky to futureproof your organisation; you can never be fully sure of what the market is going to look like in a year or three – just consider the last 12 months. But it’s essential to find a way to do so, whether through virtualisation or another means, because it gives a business the flexibility it needs to move workloads and be agile. As much as there’s significant growth in the cloud, some businesses still feel they’d rather control and manage their own environment. This is where the balancing act comes in: aim to get the maximum capacity out of your infrastructure while taking advantage of new technologies.
So, what are the steps to take when embarking on a datacentre modernisation journey? Raath says if you’ve made a strategic decision to modernise your datacentre for a certain reason, it’s critical to start understanding all of your workloads and their dependencies, and then mapping them back to the different hosting platforms. “For instance, you would need to ascertain which workloads depend on which others, and what their runtimes are. It’s about understanding the service you offer, what it encompasses, and where it sits. This is because, in any datacentre discussion, at least 70% of spend is going to go to power and cooling costs. If you don’t have a clear idea of what is running and where, or how you can optimise it, you could end up paying for the datacentre to effectively be idle for up to 60% of the time.”
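Raath’s advice – inventory workloads, map dependencies and runtimes, then quantify idle capacity – can be made concrete with a small sketch. The workload names, figures and idle calculation below are purely illustrative assumptions, not Datacentrix data or methodology:

```python
# Hypothetical sketch: a minimal workload inventory that records dependencies
# and runtimes, then estimates how much of the power-and-cooling bill is
# spent keeping capacity powered but idle. All figures are illustrative.
from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str
    platform: str                  # e.g. "on-prem", "colo", "cloud"
    runtime_hours_per_day: float   # hours per day the workload actually runs
    depends_on: list = field(default_factory=list)

def idle_fraction(workloads):
    """Average fraction of the day the hosted workloads sit idle."""
    if not workloads:
        return 0.0
    busy = sum(w.runtime_hours_per_day for w in workloads)
    return 1 - busy / (24 * len(workloads))

inventory = [
    Workload("billing-db", "on-prem", 24),
    Workload("reporting", "on-prem", 6, depends_on=["billing-db"]),
    Workload("batch-etl", "on-prem", 4, depends_on=["billing-db"]),
]

monthly_power_cooling = 100_000  # currency units, illustrative
idle = idle_fraction(inventory)
print(f"idle fraction: {idle:.0%}")
print(f"spend on idle capacity: {monthly_power_cooling * idle:,.0f}")
```

Even this toy inventory makes the point: two of the three workloads run only a few hours a day, so more than half the powered capacity is idle, and the dependency lists show which workloads must move together if the platform changes.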
Reynolds believes this is different for everyone. “The journey we walk with customers is making the shift towards a software-defined datacentre. Hardware or compute virtualisation is well entrenched in the market, but it isn’t an easy mindset change for everyone. There are considerations such as networking, storage, the management of these and the security around them that need to be taken into account when modernising. Some companies can make the shift with ease, while others take a more calculated approach. Traditionally, the first step tends to be the adoption of compute virtualisation to manage the environment. This is then followed by virtualisation of storage and networking (always considering security), all critical to creating the right environment to take the next step to the cloud. The beauty of being software-defined is that you can carve up what you want to and host it where you need to. Today, the biggest challenge is that IT teams and businesses need to consider all their remote workers. The IT organisation used to control how and when they connected to the environment, to get the most productivity out of them. That won’t always be the case.
“Also, consider the legacy costs that companies have. Organisations or startups scaling into a datacentre for the first time can manage their finances very differently to those that are tied to a legacy environment and carry sunk costs. When navigating how to get around constraints in your datacentre, automation, self-reliance, and the fact that it physically doesn’t need to be in one location are important considerations. Think about how we are going to work in future, the cost of office space and whether you even want to use office space, for example, then factor in datacentre space – all of these should guide your decisions on where your applications should land.”
One of the major pain points to avoid is holding onto the idea that you have to own everything within your environment. This is simply not true, says Raath. Another issue to avoid is remaining on legacy application development frameworks. The takeaway here is that you don’t need to refrain from running legacy workloads, but you should stop using legacy application development frameworks. Application development should be completed within a DevOps framework, where you’re looking at serverless computing or processing. In this day and age, developers cannot develop in legacy terms; they must recognise that the world has fundamentally changed. Companies must also move away from the legacy model of requiring all users to connect centrally. During the Covid-19 pandemic, everyone had to move away from the office environment and work from home, yet organisations were still forcing all of these home connections back to the office, only to route them out to the internet again.
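As a loose illustration of the serverless shape Raath alludes to: instead of an application tied to a server the company owns, a workload is written as a stateless, event-driven function that the hosting platform scales on demand. The handler below is a hypothetical sketch in the style of common serverless platforms; the event shape and names are assumptions, not code from any company quoted here:

```python
# Hypothetical sketch of a serverless-style handler: a stateless function
# that receives an event and returns a response, with no server of its own.
# The event/response shapes here are illustrative assumptions.
import json

def handler(event, context=None):
    """Price one order event; the platform, not the developer, owns hosting and scaling."""
    order = json.loads(event["body"])
    total = sum(item["qty"] * item["price"] for item in order["items"])
    return {"statusCode": 200, "body": json.dumps({"total": total})}

# Because the function is stateless, it can be exercised locally exactly
# as the platform would invoke it:
event = {"body": json.dumps({"items": [{"qty": 2, "price": 50.0}]})}
response = handler(event)
print(response)
```

The design point is the one Raath makes: nothing in the function assumes a particular machine, rack or location, so it can land wherever the hosting decision says it should.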
“Our entire philosophy has changed,” says Raath. “We cannot architect in the same way as before: we now command centrally, but control at the edge. Also critical to consider are compliance principles. Regulations such as PoPI, GDPR and PCI DSS are key when looking at any redesign or modernisation of the datacentre.”
Speaking of the security challenges posed by modern datacentre environments, Reynolds says: “Your datacentre shouldn’t necessarily be situated in one place. There’s on-premises, hybrid or multicloud to think about. But, importantly, it’s not just a physical environment anymore – it’s amorphous. You don’t even have to manage it; it can be under someone else’s control, managed as a service and financed differently. Given the remote workforce, security is a real challenge to your datacentre environment, and you must make sure workers connect or engage with it via a secure VPN or over an SD-WAN. When you’re working with a datacentre or cloud provider, they’re likely certified professionals underwritten by the OEMs providing the technology they use, so one can be assured that they bring their security acumen along with this.”
Raath believes security principles must change. “There’s no longer a traditional boundary or architecture with a single perimeter to protect. You can no longer depend on a firewall; security must run all the way to the edge. In fact, I predict that we will see the death of the traditional firewall in companies within the next five to 10 years. Further, a move away from reactive to proactive security is critical. The incidents we’ve seen in SA alone over the past few months can be directly attributed to the fact that we’re not managing vulnerabilities in a proactive manner. Modern datacentre challenges include the diversification of connections – people are connecting from everywhere – which means that we must also have vulnerability management everywhere. We have to see how we connect, manage and secure all of these connections through proactive vulnerability management with the proper processes and controls, as well as central reporting and management in place.

“‘Noise’ is a huge challenge within modern datacentres. Because we have so many applications with natively built-in security controls, and are therefore seeing so many events per second, the most critical events could potentially be missed. We need to think very carefully about what sort of security measures are deployed, how they can be managed centrally, and, with today’s vast amounts of data, how we make sure that this noise is managed and the critical security elements are attended to.”
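The "noise" problem Raath describes – critical events buried in a flood of routine ones – is, at its simplest, a triage problem: rank events by severity centrally and surface only those above a floor. The sketch below is a toy illustration of that idea; the severity labels, event fields and threshold are assumptions, not any vendor's schema:

```python
# Hypothetical sketch: triaging security events centrally so that critical
# alerts are not drowned out by low-severity noise. Labels and thresholds
# are illustrative assumptions.
from collections import Counter

SEVERITY_RANK = {"critical": 3, "high": 2, "medium": 1, "low": 0}

def triage(events, min_severity="high"):
    """Central view: keep only events at or above min_severity."""
    floor = SEVERITY_RANK[min_severity]
    return [e for e in events if SEVERITY_RANK[e["severity"]] >= floor]

events = [
    {"source": "app-01", "severity": "low", "msg": "heartbeat"},
    {"source": "fw-edge", "severity": "critical", "msg": "exploit attempt"},
    {"source": "app-02", "severity": "medium", "msg": "login retry"},
    {"source": "vpn-gw", "severity": "high", "msg": "unpatched CVE observed"},
]

urgent = triage(events)
print(Counter(e["severity"] for e in urgent))
```

Four events in, two out: the heartbeat and login-retry noise is filtered away while the exploit attempt and vulnerability sighting reach the central console – the "proper processes and controls, as well as central reporting" Raath calls for, in miniature.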