More Than a Solution: Tightening Backup and Recovery Helps Financial Services Companies Innovate

We all know the risks. Ransomware is a huge threat, and critical transactional data is under constant attack. Meanwhile, financial services organizations are under pressure from all sides as regulators tighten legislation, from SOX to CCPA, GDPR, and global data privacy laws like PIPL. In this firestorm, it has never been more important for financial services organizations to improve their data protection and risk mitigation strategies.

What makes financial services data so complex?

There are four main reasons:

  • Sophisticated data models
  • Strict security requirements
  • Extremely large data volumes
  • Risk-prone test environments

Financial services under attack

Data threats come from many places, from human error to malicious activity, and ransomware attacks are chief among them. Sophos’s The State of Ransomware in Financial Services 2022 report found that ransomware attacks against financial services are on an upward trajectory: 55% of organizations were affected in 2021, compared to 34% in 2020.

Still, according to the report, financial services reported the second-lowest data encryption rate, at 54%, compared to a global average of 65%.

Of the financial services organizations that were affected, 52% paid the ransom to restore their data, which is higher than the global average of 46%. The survey also found that the average cost of remediation in financial services was $1.59 million, above the global average of $1.4 million.

Response times are too slow

It follows that securing this data is a huge challenge that requires constantly evolving innovations. Technological solutions exist, but their implementation is not always simple. Hence a sobering statistic: 277 days is the average time to identify and contain a data breach, according to IBM Security’s Cost of a Data Breach Report 2022. That’s more than three-quarters of a year spent containing a single breach.

Complex data challenges

Financial services data is high-frequency and spans complex relationships, which makes it difficult to restore. It demands the most extensive security measures: encryption key management, data residency controls, vendor data risk assessments, and compliance oversight of offshore development. Without these strict measures, the data is simply not secure enough.
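To make the key-ownership point concrete, here is a minimal sketch of client-side encryption with a customer-held key, so a backup vendor only ever stores ciphertext. It uses Python’s open-source cryptography package; the record payload and inline key generation are simplified placeholders, not a production design.

```python
# Minimal sketch: encrypt records with a key you own before they leave
# your environment, so a backup provider only ever sees ciphertext.
from cryptography.fernet import Fernet

# In practice this key lives in your own KMS or HSM, never with the
# backup provider. Generating it inline here is for illustration only.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

# A placeholder record; real payloads would be serialized backup data.
record = b'{"AccountId": "001xx000003DGb0", "Balance": "10500.00"}'

ciphertext = cipher.encrypt(record)    # what the vendor stores
restored = cipher.decrypt(ciphertext)  # possible only with your key
assert restored == record
```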

Massive data volumes bring their own challenges. Queries slow down and records become harder for agents to access, leading to customer service issues. With large objects and external data, data skew creeps in easily. Perhaps more importantly, removing large volumes of data from SaaS applications is difficult, so the problem snowballs.

Another challenge is creating and maintaining proper test environments. The data must be complete enough to run robust performance tests, data masking requirements must be fulfilled, and sandboxes can become congested when full-copy data is used.

So what do financial services organizations need to do better?

Here is a four-step plan:

1. Develop a strategy to back up and restore your most critical data. Being able to retrieve your data at any time is a necessity for financial organizations, and just because you can back data up doesn’t mean you can restore it within your recovery time objectives (RTOs), or at all; regular, timed restore drills (sketched after this list) are how you verify it.
2. Have a clear archiving strategy that enforces consistent rules controlling what data stays on the platform, what leaves it, and what is deleted, all based on your business needs and sector regulations. You also need to make sure you can retrieve archived data in the future if needed (for example, in the event of an audit).
3. Choose third-party service providers that pose minimal risk, such as “no sight” providers: if they suffer a breach or harbor a malicious actor, your data is unaffected because they never had access to it. Minimize the number of service and technology providers with access to your data. Also ensure data is encrypted at rest, in transit, and in use, and own the encryption keys yourself instead of depending on your backup and restore provider.
4. Anonymize data in sandbox environments to limit data exposure during development and testing.
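To make step 1 concrete, here is a minimal sketch of a timed restore drill. The restore and verification helpers are hypothetical stand-ins for whatever tooling you use; the point is that RTOs should be measured, not assumed.

```python
# Minimal sketch of a timed restore drill: restore into a scratch
# environment, verify the result, and fail loudly if the drill misses
# the recovery time objective. Both helpers are hypothetical stubs.
import time

RTO_SECONDS = 4 * 60 * 60  # example objective: 4 hours (an assumption)

def restore_backup(backup_id: str, target: str) -> dict:
    """Hypothetical stand-in for your backup tool's restore call."""
    return {"backup_id": backup_id, "target": target, "rows": 1_000_000}

def verify_integrity(restored: dict) -> bool:
    """Hypothetical check, e.g. row counts or checksums against source."""
    return restored["rows"] > 0

def restore_drill(backup_id: str) -> float:
    start = time.monotonic()
    restored = restore_backup(backup_id, target="scratch-env")
    elapsed = time.monotonic() - start
    if not verify_integrity(restored):
        raise RuntimeError(f"backup {backup_id} failed verification")
    if elapsed > RTO_SECONDS:
        raise RuntimeError(f"restore took {elapsed:.0f}s, over the RTO")
    return elapsed

print(f"restore completed in {restore_drill('daily-2022-10-01'):.2f}s")
```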

The good news? By following this four-step plan, organizations can innovate in several ways:

1. Protect business continuity. Create a backup and restore process for your data that will withstand even the most catastrophic data loss scenarios. In the case of Salesforce, it is essential to understand that data models can be very complex and to choose a tool that handles the extremes of possible customization. Selecting the most flexible toolkits will pay off, especially when it comes to recovering from data loss.
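As an illustration of why model complexity matters, here is a simplified sketch of a relationship-aware restore: parent records are recreated first, the old-to-new ID mapping is captured, and child lookups are rewritten before insertion. The object and field names are illustrative only, and the insert function is a fake stand-in for an org’s create call.

```python
# Simplified sketch of relationship-aware restore with ID remapping.
# Real orgs have many objects and deeper hierarchies; the principle
# (parents first, then remap child lookups) is the same.

def restore_with_remap(parents, children, insert):
    """`insert(object_name, record)` is a stand-in for your org's
    create call; it must return the newly assigned record ID."""
    id_map = {}
    for record in parents:
        old_id = record.pop("Id")
        id_map[old_id] = insert("Account", record)
    for record in children:
        record.pop("Id")
        # Point the child at the *new* parent ID, not the backed-up one.
        record["AccountId"] = id_map[record["AccountId"]]
        insert("Contact", record)
    return id_map

# Demo with an in-memory fake insert that hands out fresh IDs.
counter = iter(range(1000))
fake_insert = lambda obj, rec: f"new-{next(counter)}"

parents = [{"Id": "old-1", "Name": "Acme Bank"}]
children = [{"Id": "old-9", "LastName": "Diaz", "AccountId": "old-1"}]
print(restore_with_remap(parents, children, fake_insert))  # {'old-1': 'new-0'}
```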

2. Operationalize regulatory compliance. Companies use Salesforce for more than just customer relationship management. Salesforce data is more important than ever, as many financial services organizations rely on it for material use cases, managing the transactions and interactions at the very heart of the business. But archiving this data requires the right tools: you can archive data, yet the law may require you to retain it for a certain period. Part of the problem is deciding where in your business to store it. You need to reevaluate the security of that location, including which country it sits in, so that the right data privacy laws are followed, and which provider hosts the information. This adds an extra layer of complexity, and many businesses are looking for the right archiving solution to alleviate these issues.
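One way to make such a strategy enforceable is to encode the retention rules explicitly, so every record gets an unambiguous disposition. A minimal sketch follows; the retention periods are illustrative assumptions, not legal guidance for any jurisdiction.

```python
# Minimal sketch of explicit retention rules: each record either stays
# live on the platform, moves to an archive you control, or is deleted.
# The periods are illustrative assumptions, not legal guidance.
from datetime import date, timedelta

KEEP_LIVE = timedelta(days=2 * 365)      # stays on the platform
KEEP_ARCHIVED = timedelta(days=7 * 365)  # retrievable, e.g. for audits

def disposition(last_activity: date, today: date) -> str:
    age = today - last_activity
    if age <= KEEP_LIVE:
        return "live"     # keep on the platform
    if age <= KEEP_ARCHIVED:
        return "archive"  # move off-platform, keep retrievable
    return "delete"       # retention period expired

today = date(2025, 1, 1)
print(disposition(date(2024, 6, 1), today))  # live
print(disposition(date(2019, 6, 1), today))  # archive
print(disposition(date(2015, 6, 1), today))  # delete
```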

3. Improve performance. If data can be archived and deleted from the central SaaS environment, performance automatically improves. Millions of records slow down the entire system, and agents notice delays in accessing records, which leads to poor customer service. To provide excellent service, teams need fast, efficient systems. A continuous archiving strategy based on the entire data lifecycle avoids this scenario, and with it the snowball effect of having too much data and being unable to delete any of it.
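As a simplified sketch of that archive-then-delete pattern, the following uses the open-source simple_salesforce package; the credentials, SOQL filter, and object names are placeholders, and any real deletion should of course run only after the archive has been verified.

```python
# Sketch: export records past their live-retention window to storage
# you control, then delete them from the org in batches so queries stay
# fast. Credentials and object/field names are placeholders.
import json
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...",
                security_token="...")  # placeholder credentials

# 1. Pull records past their live-retention window (illustrative SOQL).
soql = ("SELECT Id, Name, CloseDate FROM Opportunity "
        "WHERE CloseDate < LAST_N_YEARS:2")
records = sf.query_all(soql)["records"]

# 2. Archive them somewhere you control *before* touching the org.
with open("opportunity_archive.json", "w") as f:
    json.dump(records, f)

# 3. Delete from the org in modest batches via the Bulk API.
ids = [{"Id": r["Id"]} for r in records]
for start in range(0, len(ids), 5000):
    sf.bulk.Opportunity.delete(ids[start:start + 5000])
```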

4. Gain data agility. The key lies in turning data into fuel for innovation by moving it between production and non-production environments. One thing finance departments want to do is safely test new configurations of their Salesforce or CRM data, and one way to do this at scale is to test against real-world data. How can businesses do that safely? Techniques such as sandbox anonymization and sandbox seeding let them innovate faster: they can test scenarios close to the real world without any of the risks of playing with production data.
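Here is a minimal sketch of the anonymization half of that: deterministic masking keeps records internally consistent (the same input always yields the same pseudonym, so joins and duplicates survive), and only a subset of records is seeded into the sandbox. Field names are illustrative.

```python
# Minimal sketch of sandbox anonymization plus seeding: mask identifying
# fields deterministically, keep non-sensitive fields realistic, and
# seed a subset rather than a congested full copy. Fields are illustrative.
import hashlib

PII_FIELDS = {"Name", "Email", "Phone"}

def mask(value: str) -> str:
    # Deterministic: identical inputs give identical pseudonyms, which
    # preserves relationships and duplicate detection across objects.
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def anonymize(record: dict) -> dict:
    return {k: mask(v) if k in PII_FIELDS else v for k, v in record.items()}

production = [
    {"Id": "003A", "Name": "Maria Diaz", "Email": "maria@example.com", "Balance": 4200},
    {"Id": "003B", "Name": "Li Wei", "Email": "li@example.com", "Balance": 180},
]

# Seed the sandbox with an anonymized subset instead of a full copy.
sandbox_seed = [anonymize(r) for r in production[:1000]]
print(sandbox_seed[0])  # PII masked; structure and amounts intact
```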
