Data Integrity: The foundation of true data freedom

  • Organizations should prepare for migrations the way pilots train to handle the unexpected
  • Data freedom is about ensuring data remains accurate, secure, and usable during migrations

With businesses managing increasingly complex data ecosystems, maintaining data integrity when moving data has never been more critical. Yet ensuring data integrity has also become harder, leaving organizations at risk of data loss and corruption. For businesses to truly embrace data freedom, they must not only be able to move data but also ensure that data remains accurate, complete, and reliable during every migration.

The impact of data migration on data integrity

The movement of data poses one of the most significant risks to data integrity, with the lack of pre-migration testing the main cause of issues such as data corruption and data loss. These failures can cause unexpected downtime, reputational damage, and the loss of important information. The recent global CrowdStrike incident is an example of how a single fault can have a significant impact across a business and its stakeholders. It sends a clear message: testing before implementation is essential, enabling organizations to identify potential issues and implement corrective measures.

The role of awareness and preparation in data integrity

Data integrity begins with awareness. Many organizations do not fully understand what data they have, when it was added, or what was updated over time, making it challenging to conduct data audits or integrity checks. Building awareness of data assets is the first step towards validating data and detecting abnormalities based on historical analyses.
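One way to build this kind of awareness in practice is to maintain an inventory of data assets with checksums and timestamps, then compare inventories before and after a migration to detect missing or altered files. The sketch below is illustrative only, not a Veeam feature; the function names and the flat file-based model are assumptions for the example.

```python
import hashlib
import os
from datetime import datetime, timezone


def file_checksum(path, algo="sha256", chunk_size=65536):
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()


def build_inventory(root):
    """Record size, checksum, and audit timestamp for every file under root."""
    inventory = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            inventory[os.path.relpath(path, root)] = {
                "size": os.path.getsize(path),
                "sha256": file_checksum(path),
                "recorded_at": datetime.now(timezone.utc).isoformat(),
            }
    return inventory


def diff_inventories(before, after):
    """Compare pre- and post-migration inventories: report files that
    disappeared and files whose contents changed in transit."""
    missing = [p for p in before if p not in after]
    corrupted = [p for p in before
                 if p in after and before[p]["sha256"] != after[p]["sha256"]]
    return {"missing": missing, "corrupted": corrupted}
```

Running `build_inventory` against the source before migration and against the target afterwards, then calling `diff_inventories`, gives a concrete integrity check; the historical timestamps also support the kind of abnormality detection described above.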


Then, rigorous and ongoing testing for migration is crucial. This includes testing for both functionality and economics. Functionality refers to how well the system operates after migration, ensuring that it continues to meet expectations; economics refers to the cost-effectiveness of the system or application, which is particularly important with cloud-based migrations. Economics testing involves examining resource consumption, service costs and overall scalability to ascertain whether the solution is economically sustainable for the business.

Organizations should approach migration preparation the way pilots train to handle the unexpected. By planning for the problems businesses may encounter during the transfer of data across systems and platforms, the risk and impact of compromised data can be minimized.

Most importantly, companies should prepare for migrations even if they don't anticipate immediate changes. Just as pilots do not wait for poor flying conditions to train for an emergency landing, businesses should not wait to be notified of imminent change before initiating data checks and testing. In a volatile, fast-paced technological environment, organizations must always be prepared to avoid being caught off guard.

Data integrity and secure backups enable data freedom

Finally, a robust data backup and recovery plan forms the last line of defense. Veeam's 2024 Ransomware Trends Report found that 65 percent of organizations did not have a recovery plan for a site-level crisis, and only 50 percent had immutable backups. This significantly impairs a business's ability to restore corrupted or lost data efficiently. Fortunately, there are simple steps that companies can take to safeguard their data and enhance data resilience. By following the 3-2-1-1-0 rule, which recommends keeping three copies of data across at least two different media types, with one copy offsite, one copy air-gapped, and zero errors found by backup verification, businesses can significantly boost their data resilience.
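The 3-2-1-1-0 rule can be expressed as a simple compliance check over an organization's backup copies. The data model below is a hypothetical sketch for illustration, not a Veeam API; the field names are assumptions.

```python
from dataclasses import dataclass


@dataclass
class BackupCopy:
    """One copy of the data set and its protection properties (hypothetical model)."""
    media_type: str           # e.g. "disk", "tape", "object-storage"
    offsite: bool             # stored at a different physical location
    air_gapped: bool          # offline or otherwise immutable/isolated
    verified_error_free: bool # passed backup verification with zero errors


def satisfies_3_2_1_1_0(copies):
    """Check the 3-2-1-1-0 rule: at least 3 copies, on at least 2 media
    types, with at least 1 offsite, at least 1 air-gapped, and 0 errors."""
    return (
        len(copies) >= 3
        and len({c.media_type for c in copies}) >= 2
        and any(c.offsite for c in copies)
        and any(c.air_gapped for c in copies)
        and all(c.verified_error_free for c in copies)
    )
```

A check like this can run as part of regular backup audits, flagging data sets whose protection has drifted below the recommended baseline.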

Data freedom is not just about having the ability to move data; it is about ensuring data remains accurate, secure, and usable during migrations or platform changes. Regular testing and data assessments help maintain both integrity and freedom, ensuring businesses can rely on their data when it matters most. Lastly, a solid backup and recovery plan provides companies with peace of mind and a safety net, ensuring they can bounce back, and move forward, efficiently if anything does go wrong.

This Op-Ed was penned by: Rick Vanover, Vice President, Product Strategy, Veeam