The pace of change within the regulatory and compliance landscape has significantly increased over the last few years, making data centre compliance even more of a challenge.
To keep up, data centre operators need to track and respond to regulatory change constantly, balancing cost, performance and security while remaining compliant with regulatory and auditing standards.

Ideally, this should be a voluntary exercise, one which also audits the organisation’s technology – everything from network infrastructure through to cloud storage. It should cover how data centres manage data, ensuring it can be transferred quickly and securely between data centres and disaster recovery sites. This may be achieved by using WAN Acceleration to mitigate latency and packet loss, and to send encrypted data in ways that WAN Optimisation cannot.

The larger the organisation, the more important it is to consider data centre compliance auditing obligations. After all, a non-compliant data centre could lead to a breach of data protection and privacy regulations. Yet small and medium-sized enterprises (SMEs) also have to consider compliance and regulation. They aren’t isolated from its impact, so they too need to ensure they work with a compliant hosting data centre.
The cost-benefit calculation of private data centres for SMEs has changed with the growth in hosting options – public cloud, private cloud and hybrid private cloud deployments that lower costs. This should lead to increased competition between data centres, along with more flexible and lower hosting costs. Requirements such as the Health Insurance Portability and Accountability Act (HIPAA), PCI DSS, the General Data Protection Regulation (GDPR), the California Privacy Rights Act (CPRA) and the California Consumer Privacy Act (CCPA) mean that data protection regulations now extend to the hosting data centres. They cover the US and the EU, and the UK has its own version of GDPR. Over the past couple of years, Environmental, Social and Governance (ESG) has become very important for larger corporations. They must apply environmental measures across their supply chain, including data centre hosts, and this has been reflected back onto the data centres. For example, many of the larger data centres in the US are sited in areas with low-cost, environmentally sound electricity from solar and wind – and yet they use millions of gallons of water for evaporative cooling systems, which can cause water shortages in those areas.
Adding to the ESG compliance challenge is the rising number of cyber-attacks. Compliance isn’t therefore just about becoming greener, it’s also about protecting data.
“Ransomware is particularly a threat,” says Steven Umbehocker, CEO of OSNEXUS. With generative AI, data protection compliance and privacy regulations are going to become much more challenging.
Says Umbehocker: “The likes of ChatGPT could spawn a significant increase in cyber-attacks.” This, he says, is because cyber-criminals are now using generative AI tools to commit cyber-crime. Regulations and legislation exist to ensure organisations do the necessary business continuity planning. This can be made more complex as data centres are constantly evolving, and because governments inadvertently create uncertainty by changing policies, making it harder to know what data centres need to do to meet regulatory compliance obligations. Umbehocker explains: “As an example, QuantaStor is designed with hybrid cloud in mind so that organisations can protect their data with immutability features and easily move data to a public or private cloud in a different data centre. When the regulations change, you need the right tools in order to do that rapidly.”
Yet the onus is not only on the data centres to implement regulations and to maintain regulatory compliance. Customers too must accommodate them, and this often requires dedicated teams to review and implement new regulations. The smaller the organisation, the harder this is to achieve. Larger organisations may have dedicated teams or may outsource these processes – which may include auditing their wide area network (WAN) performance.
As ever more sensitive data is moved around, data centres and their customers need efficient, reliable and secure transmission paths. Many WAN Optimisation tools, which are meant to improve the efficiency of sending and receiving data over a wide area network, store data in flight and require the encryption keys to function. Encryption is like a padlock on a gate: only with the keys can the data be unlocked, and without them WAN Optimisation tools cannot cope with encrypted traffic. This limits the upper bandwidth capability of these products, and the service level the technology can deliver suffers as a result.
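Why encrypted traffic defeats these tools can be shown in a few lines. The sketch below is illustrative only (not any vendor’s implementation): it compresses a highly repetitive payload, then a random payload standing in for ciphertext, since well-encrypted data is statistically indistinguishable from random bytes and therefore neither compresses nor deduplicates.

```python
import os
import zlib

# Highly repetitive "plaintext" traffic: a WAN Optimisation appliance
# can shrink or deduplicate this kind of payload dramatically.
plaintext = b"customer-record;" * 4096

# Stand-in for ciphertext: random bytes, because good encryption output
# looks random, so compression and deduplication gain nothing.
ciphertext = os.urandom(len(plaintext))

plain_ratio = len(zlib.compress(plaintext)) / len(plaintext)
cipher_ratio = len(zlib.compress(ciphertext)) / len(ciphertext)

print(f"plaintext compresses to {plain_ratio:.1%} of its original size")
print(f"ciphertext compresses to {cipher_ratio:.1%} of its original size")
```

Running this, the repetitive payload shrinks to a tiny fraction of its size, while the random "ciphertext" does not shrink at all – which is exactly why optimisation appliances demand the keys, or fail on encrypted links.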
Data security perspective
To protect data, data centres conduct audits: inspections of all the processes, physical and virtual security measures, and procedures in place to ensure that data is kept securely. Audits also enable data centres to uncover weaknesses that could render them inoperable – causing what is known as downtime.
Audits enable data centres to create plans to maintain operations, or to recover from a disaster such as a security breach, so that they remain functional and operable. They also cover the screening of employees and contractors who access equipment and sensitive data. Umbehocker sums up why auditing is crucial from a data security perspective: “It’s important to audit to ensure that encryption is used end-to-end. You need to keep up with the latest encryption technology because vulnerabilities will be found.
“Even if there is encryption you need to ensure that it stays within compliance. This includes within the WAN and network infrastructure. Routers and switches have firmware that has to be updated regularly to protect against cyber-attacks. A cyber-criminal getting access to these is like attacking the heart of the data centre.” WAN Acceleration uses artificial intelligence and machine learning, as well as data parallelisation, to mitigate the effects of latency and packet loss. Latency and packet loss can, for example, lead to what’s known as “jitter” – variation in the time packets take to arrive – which slows the transfer of data across a WAN, making downloads drag on or fail altogether. Artificial intelligence is the development of computer systems able to learn as humans do, performing complex tasks such as visual perception, speech recognition, decision-making – or the acceleration of data transfers across a network.
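The arithmetic behind latency’s drag on a WAN, and why parallelisation helps, can be sketched with the standard bandwidth-delay relationship: a single TCP stream cannot exceed its window size divided by the round-trip time, however fast the link. The figures below are illustrative assumptions, not measurements of any particular product.

```python
# Simplified model: one TCP stream is capped at window / RTT,
# regardless of raw link capacity. Numbers are illustrative.
window_bytes = 64 * 1024          # classic 64 KiB receive window
rtt_seconds = 0.08                # 80 ms round trip, e.g. transatlantic

single_stream_bps = window_bytes * 8 / rtt_seconds
print(f"one stream:  {single_stream_bps / 1e6:.1f} Mbit/s")

# Data parallelisation: n independent streams each carry their own
# window in flight, so the aggregate ceiling scales with n
# (until the physical link is saturated).
streams = 16
aggregate_bps = single_stream_bps * streams
print(f"{streams} streams: {aggregate_bps / 1e6:.1f} Mbit/s")
```

On these assumed numbers a lone stream tops out at roughly 6.6 Mbit/s across an 80 ms path, while sixteen parallel streams lift the ceiling to around 105 Mbit/s – which is the effect data parallelisation exploits.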
Using WAN Acceleration, the requirement for storing data and providing encryption keys disappears, because WAN Acceleration takes a “no touch, no change, no store” approach to customer data. This removes the concerns that sit at the heart of WAN Optimisation devices: storing data and holding the keys to its encryption. Joshua Newington-Blake, principal support engineer for EMEA at OSNEXUS, adds: “In a WAN Acceleration solution such as PORTrockIT you can understand how fast the data is going, and if it isn’t up to speed, you can investigate why.” Umbehocker says the audits should include monitoring the packets coming in and out of the WAN to ensure everything is being encrypted, and checking that hardware and software are regularly updated with the latest security patches.
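One small, concrete piece of the encryption audit described above is verifying that systems refuse legacy protocol versions. The sketch below uses Python’s standard-library ssl module to pin a client’s floor at TLS 1.2 – a hedged illustration of the kind of check an audit script might run, not any vendor’s tooling.

```python
import ssl

# Build a client context and pin the floor to TLS 1.2: connections
# that try to negotiate older, vulnerable protocol versions will be
# refused outright.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# An audit assertion: legacy protocols must be off the table.
assert context.minimum_version >= ssl.TLSVersion.TLSv1_2
print("minimum negotiated protocol:", context.minimum_version.name)
```

In practice such a check would be run against every client and server configuration in the estate, and revisited as "the latest encryption technology" moves on – for instance, raising the floor to TLS 1.3 where all endpoints support it.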
Data centre compliance
“Companies such as Intel have created software to help data centres monitor their hardware and ensure it is in compliance; these include data centre infrastructure management [DCIM] products, which are critical to audits and to ensuring compliance,” he says. He adds that there are many standards, such as Redfish, which is managed by DMTF (formerly known as the Distributed Management Task Force). DMTF creates open manageability standards spanning diverse emerging and traditional IT infrastructures, including cloud, virtualisation, network, servers and storage; member companies and alliance partners worldwide collaborate on standards to improve the interoperable management of information technologies. “It is a standards body involving companies such as Dell and Cisco, working together to enable their products to do these data centre checks to ensure compliance, and to see if there is a need for security patches,” Umbehocker explains. Newington-Blake adds that we should not forget HIPAA, a US federal law which requires national standards to protect sensitive patient health information from being disclosed without the patient’s consent or knowledge.
For HIPAA compliance, organisations have to ensure that the features of the systems they use meet regulatory requirements. This means that data centres and their customers must do some additional work to make sure they procure software and hardware that will enable them to meet their regulatory obligations. How data is transferred, stored and used needs to be considered – including how to do these things securely and at speed. Umbehocker explains: “If you don’t have fast data movement, you are not able to respond to regulatory changes or various forms of instability. If it’s slow to move data from one site to another, then that’s a risk. A natural disaster requires fast action, and so it’s important to look at agility as a factor in an IT audit. WAN Acceleration can mitigate some of these challenges. Your whole organisation can then run more efficiently, particularly when you are having to consider large data migrations.”
Agility to address issues
For data centres and their customers to achieve and maintain regulatory compliance, they need the agility to address issues quickly. Newington-Blake says this is where an audit of WAN performance comes into play: without one, it’s impossible to understand what’s needed, to know where the gaps are in the business continuity plan (BCP), or to tell whether it’s broken. Once those gaps have been identified, he advises data centres – and perhaps even their customers – to perform a technology evaluation.
Umbehocker says you need to consider whether the technology you are using has the encryption features required by regulatory compliance standards. He says WAN Acceleration solutions such as PORTrockIT offer a distinct advantage in that they accelerate data no matter how it is compressed or encrypted, getting to the heart of the WAN problem: latency. Data moves efficiently, as if it were local within a data centre, and the less time data spends in flight, the better from an attack point of view. Nor is that the end of it: the ability to move large amounts of data rapidly allows data centres to become disaster recovery sites for their customers, or for other data centres. That way, they can meet evolving data centre auditing and regulatory compliance obligations more efficiently and easily.