Data Governance and Security – How Multishoring’s Integration Governance plan prevents data security vulnerabilities

Data security and governance matter. Safeguarding sensitive information is essential if you are to avoid fines and reputation-killing headlines in the press. Data security-related legislation and standards such as the General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), the Sarbanes-Oxley Act (SOX), and the Payment Card Industry Data Security Standard (PCI DSS) require organizations to take appropriate measures to protect data, prevent unauthorized access and misuse, and implement information governance and data security controls.

Beyond these, there is a myriad of further US federal and state-level regulations, such as the California Consumer Privacy Act (CCPA), the Gramm-Leach-Bliley Act (GLBA), the Children’s Online Privacy Protection Act (COPPA), the Electronic Communications Privacy Act (ECPA), the New York State Department of Financial Services (NYDFS) Cybersecurity Regulation, and the Illinois Biometric Information Privacy Act (BIPA). While the details vary from one piece of legislation to the next, the fundamental principles are consistent: organizations have a duty of care towards the owners of the data they handle and must ensure that sensitive information is protected as far as is reasonably practicable.

Most businesses are investing in data security as a priority, and the data governance tools marketplace is growing rapidly as a consequence. However, many focus their efforts only on the back-office systems of record where the data ultimately resides, and overlook the direct and indirect integration touchpoints that could be used to access or manipulate that data.

But what is data governance and how does it relate to your integration project?

There is no universally accepted definition of data governance. The term describes a wide variety of processes, policies, and controls that aim to improve data security and assure privacy. While data governance is top of mind for many IT leaders, they often fail to consider the impact that systems integration has on their data governance compliance.

The most commonly observed integration issues that materially affect data governance and data security are:

  • Integrations are often built on the unproven assumption that all transactions are authorized; they fail to perform appropriate security checks or to provide visibility into the triggering event that initiated the data update or transfer.
  • Integrations sometimes use proxy account credentials to perform actions in external systems, and consequently the traceability of updates initiated by the integration may be diminished.
  • Vulnerability assessments and penetration tests performed on integration solutions are often cursory or superficial. Integration solutions need to be tested and hardened to ensure they are fit for purpose.
  • Integrations are rarely designed with security and data governance in mind. Many of the controls needed to minimize risk must be designed into the integration from the start to be effective; retrofitting security into an integration is often costly and ineffectual.
  • Integrations are not maintained or updated in light of the ever-changing threat landscape. Just because an integration was secure when it was originally deployed does not mean it will remain so as new attack vectors and vulnerabilities come to light.

Multishoring’s 12-step data governance plan to ensure system integration projects don’t introduce cybersecurity and data governance vulnerabilities

Data governance doesn’t happen on its own. It is the result of a coordinated set of actions and decisions taken by those designing, developing, and operating an integration solution. Drawing on its experience across hundreds of integration projects, Multishoring developed the following 12-step approach, which has been proven to mitigate and control data security risks.

Step 1 – Perform a comprehensive data governance audit

Undertake a data audit to identify what data resides where, its relative level of sensitivity, and its potential attractiveness or value to unauthorized third parties. Not all data is created equal, and not all data requires the same level of governance. Assuring data quality and security comes at a price, so businesses must evaluate the risks and take appropriate measures to mitigate them within the commercial and technical constraints they operate under.
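As a minimal sketch of what the audit’s output might look like in machine-readable form (the asset names, locations, and 1-to-4 scoring scales below are illustrative assumptions, not recommendations), a simple inventory can be ranked so that governance effort follows risk:

```python
from dataclasses import dataclass


@dataclass
class DataAsset:
    """One row of a hypothetical data inventory produced by the audit."""
    name: str
    location: str
    sensitivity: int      # 1 = public ... 4 = restricted (illustrative scale)
    attractiveness: int   # 1 = low ... 4 = high value to an attacker


# Illustrative inventory entries; a real audit would enumerate every store.
INVENTORY = [
    DataAsset("customer_pii", "crm_db", sensitivity=4, attractiveness=4),
    DataAsset("product_catalog", "cms", sensitivity=1, attractiveness=1),
    DataAsset("payment_tokens", "billing_db", sensitivity=4, attractiveness=4),
    DataAsset("order_history", "warehouse", sensitivity=3, attractiveness=2),
]

# Rank assets so that scarce governance budget goes to the riskiest data first.
for asset in sorted(INVENTORY, key=lambda a: a.sensitivity * a.attractiveness, reverse=True):
    print(f"{asset.name:16} @ {asset.location:12} risk score {asset.sensitivity * asset.attractiveness}")
```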

Step 2 – Categorize data and define a control hierarchy

Classify data and define policies and procedures covering the controls, permitted use cases, and retention periods associated with each category. Different classifications of information need to be handled in different ways. Context also matters when evaluating datasets: “live” in-process data may be incredibly sensitive because it directly affects immediate actions and decisions, whereas once the related transaction or activity has completed, the criticality of the historical information may be much lower.
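For illustration only, such a classification scheme and its control hierarchy can be captured as a small, machine-readable policy table. The tier names, retention periods, and control flags in this Python sketch are hypothetical placeholders rather than recommended values:

```python
from dataclasses import dataclass
from enum import Enum


class Sensitivity(Enum):
    """Illustrative classification tiers; real tiers should mirror your policy."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4


@dataclass(frozen=True)
class HandlingPolicy:
    encryption_at_rest: bool
    retention_days: int
    permitted_in_nonprod: bool


# Hypothetical control hierarchy: each tier maps to its minimum controls.
POLICIES = {
    Sensitivity.PUBLIC: HandlingPolicy(False, 3650, True),
    Sensitivity.INTERNAL: HandlingPolicy(True, 1825, True),
    Sensitivity.CONFIDENTIAL: HandlingPolicy(True, 730, False),
    Sensitivity.RESTRICTED: HandlingPolicy(True, 365, False),
}

print(POLICIES[Sensitivity.RESTRICTED])
```

Encoding the hierarchy this way also lets the contextual element be handled explicitly: a record can be reclassified to a lower tier once its related transaction completes.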

Step 3 – Perform discovery to determine the data governance threat surface area

Understand the data flows throughout the end-to-end ecosystem. Look beyond the production systems and consider the additional intermediate or transient repositories where data, or subsets of it, are stored. Staging tables, transaction logs, reporting repositories, and backup systems all need to be considered if the overall integrity of data is to be assured. Attackers rarely attempt to force the front door or access the vault directly; instead, they often target secondary or feeder systems where security controls and monitoring regimes may be less stringent.

Step 4 – Document end-to-end data flows and trickle-down data update timelines

Map out dependencies and data update propagation paths to gain insight into the sequencing of, and lags in, data updates trickling down through systems across the environment. Understanding how changes propagate and how long they take to flow through the systems is invaluable when dealing with a data breach or data pollution event. Being able to quickly halt the spread of contaminated data throughout the corporate back office is essential for damage limitation and shorter service restoration times.
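As a sketch of how a documented data-flow map can be put to work during an incident, the following assumes a purely hypothetical set of systems and propagation delays, and walks the graph to estimate how quickly polluted data could spread from a given source:

```python
from collections import deque

# Hypothetical data-flow map: system -> [(downstream_system, delay_minutes)].
DATA_FLOWS = {
    "crm": [("staging_db", 5), ("reporting", 60)],
    "staging_db": [("erp", 15)],
    "erp": [("warehouse", 120), ("backup", 1440)],
    "reporting": [],
    "warehouse": [],
    "backup": [],
}


def downstream_exposure(source):
    """Walk the flow graph and return, for every system reachable from
    `source`, the minimum cumulative delay before polluted data arrives."""
    earliest = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for target, delay in DATA_FLOWS.get(node, []):
            arrival = earliest[node] + delay
            if target not in earliest or arrival < earliest[target]:
                earliest[target] = arrival
                queue.append(target)
    return earliest


for system, minutes in sorted(downstream_exposure("crm").items(), key=lambda kv: kv[1]):
    print(f"{system}: polluted data could arrive within ~{minutes} min")
```

Run against a real flow map, this kind of query tells responders which downstream feeds to pause first.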

Step 5 – Profile datasets in terms of data quality and update characteristics

Identify duplicate data sources and analyze their respective levels of data quality in terms of completeness, validity, update frequency, provenance, and variance. Data duplication is a fact of corporate life; it creates, or contributes to, many issues and needs to be carefully considered in the design of any integration solution.
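A minimal profiling sketch, assuming a toy customer extract and a deliberately simplistic email validator (both hypothetical), that measures two of the characteristics mentioned above, completeness and validity, per field:

```python
import re

# Hypothetical extract from a duplicated customer table; None marks a missing value.
ROWS = [
    {"email": "ada@example.com", "country": "PL"},
    {"email": None, "country": "DE"},
    {"email": "not-an-email", "country": None},
]

# Toy validity rule; production profiling would use stricter checks.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def profile(rows, field, validator=None):
    """Return completeness (non-null share) and validity (share of non-null
    values passing the validator) for one field."""
    values = [row.get(field) for row in rows]
    present = [v for v in values if v is not None]
    completeness = len(present) / len(values) if values else 0.0
    validity = None
    if validator is not None and present:
        validity = sum(1 for v in present if validator(v)) / len(present)
    return completeness, validity


print("email:", profile(ROWS, "email", lambda v: bool(EMAIL_RE.match(v))))
print("country:", profile(ROWS, "country"))
```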

Step 6 – Analyze data granularly to determine the most trusted source or optimal mix of sources

Determine hierarchies of trust for duplicated datasets to feed into reconciliation processes and aggregated view definitions. Not every dataset is accurate, and accuracy within a dataset is not always consistent. A data source may be strong in one element of the data record in terms of completeness and accuracy, yet weaker in other areas. In such cases, it is useful to understand which elements of records within different, but duplicated, datasets should be used to create the best possible aggregated view or consolidated record.
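One common way to implement such a hierarchy is field-level survivorship: for each attribute, take the first usable value from the most trusted source downwards. The source names and trust ordering in this sketch are illustrative assumptions:

```python
# Hypothetical duplicated customer records keyed by source system.
RECORDS = {
    "crm":     {"name": "Ada Lovelace", "phone": None, "address": "12 Byron St"},
    "billing": {"name": "A. Lovelace", "phone": "+44 20 7946 0000", "address": None},
}

# Per-field trust hierarchy: for each attribute, which source wins first.
TRUST = {
    "name":    ["crm", "billing"],
    "phone":   ["billing", "crm"],
    "address": ["crm", "billing"],
}


def golden_record(records, trust):
    """Build a consolidated view by taking, for each field, the first
    non-empty value from the most trusted source downwards."""
    return {
        field: next(
            (records[s][field] for s in sources if records.get(s, {}).get(field)),
            None,
        )
        for field, sources in trust.items()
    }


print(golden_record(RECORDS, TRUST))
# -> {'name': 'Ada Lovelace', 'phone': '+44 20 7946 0000', 'address': '12 Byron St'}
```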

Step 7 – Consolidate disparate data sources where practical

Evaluate data repository consolidation options to reduce the volume of duplicated data where possible. Not every dataset can be consolidated, but many can, and reducing the volume of duplicated data in an environment reduces the potential for mistakes and misinformation.

Step 8 – Use data governance tools to implement robust access controls

Ensure that data access events are traceable and that an audit trail of who saw what, and when, can be established for sensitive information.
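In production this is usually delegated to IAM tooling and a tamper-evident log store, but the principle can be sketched in a few lines: every access attempt, granted or denied, leaves a who/what/when audit event. The role grants and log format below are hypothetical:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("audit")

# Hypothetical role grants; a real deployment would query your IAM system.
GRANTS = {"analyst": {"orders"}, "dpo": {"orders", "customers"}}


def read_dataset(user, role, dataset):
    """Gate access on role grants and emit an audit event either way."""
    allowed = dataset in GRANTS.get(role, set())
    audit_log.info(
        "ts=%s user=%s role=%s dataset=%s outcome=%s",
        datetime.now(timezone.utc).isoformat(), user, role, dataset,
        "granted" if allowed else "denied",
    )
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read {dataset}")
    return f"<contents of {dataset}>"


read_dataset("jan", "analyst", "orders")         # logged as granted
try:
    read_dataset("jan", "analyst", "customers")  # logged as denied
except PermissionError as exc:
    print(exc)
```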

Step 9 – Deploy decoy data to help identify and track unauthorized incursions

Consider deploying decoy data into production datasets to help with breach identification and to spot possible misuse scenarios. Decoy records, also known as “trapdoor data” or “honeytokens,” are fake records deliberately included within a dataset to help detect data breaches. If an unauthorized individual accesses the dataset, they will also access the decoy records, allowing the organization to quickly identify the breach and take appropriate action.
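The detection side can be as simple as scanning access logs and outbound data for values that exist only in the decoys. The canary addresses below are hypothetical:

```python
# Hypothetical decoy values planted in the production dataset; they exist
# nowhere else, so any sighting of them signals unauthorized access.
DECOYS = {"canary-0172@example.net", "canary-0588@example.net"}


def scan_for_decoys(log_lines):
    """Yield any log line that touches a decoy record; every hit warrants
    an immediate incident-response follow-up."""
    for line in log_lines:
        if any(decoy in line for decoy in DECOYS):
            yield line


sample_log = [
    "2024-05-01T10:02Z SELECT ... email='ada@example.com'",
    "2024-05-01T10:03Z SELECT ... email='canary-0172@example.net'",
]
for hit in scan_for_decoys(sample_log):
    print("ALERT possible breach:", hit)
```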

Step 10 – Incorporate hashing to help safeguard against data tampering

Evaluate the potential to use techniques such as hashing to help identify instances where data has been tampered with. Algorithms such as SHA-256 can be used to calculate a hash of the data and store it alongside the data. Later, when the data is accessed, the hash is calculated again and compared with the stored hash. If the hashes match, it is likely that the data has not been tampered with.
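A minimal Python sketch of that write-time/read-time comparison:

```python
import hashlib
import hmac


def fingerprint(record: bytes) -> str:
    """SHA-256 digest calculated at write time and stored alongside the record."""
    return hashlib.sha256(record).hexdigest()


def is_untampered(record: bytes, stored_digest: str) -> bool:
    """Recompute on read; compare_digest avoids timing side channels."""
    return hmac.compare_digest(fingerprint(record), stored_digest)


row = b"order=1017;amount=250.00;currency=EUR"
digest = fingerprint(row)
print(is_untampered(row, digest))                          # True
print(is_untampered(row.replace(b"250", b"999"), digest))  # False
```

One caveat worth noting: an attacker who can rewrite the data can often rewrite an adjacent hash too, so storing digests separately or using a keyed HMAC provides stronger tamper evidence.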

Step 11 – Monitor network traffic for unexpected systems connectivity

Use network traffic analysis to identify anomalies in data access activity that may be indicative of a potential breach. Uncharacteristic peaks in network traffic, or unusual resource-contention-related performance degradation, may help identify elements of the infrastructure that are being accessed by external parties. Such analysis can also help identify undocumented, and perhaps unsanctioned, shadow-IT integrations that are accessing corporate systems.
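Dedicated network-monitoring tooling does the heavy lifting here, but the underlying idea can be illustrated with a simple statistical baseline. The traffic samples and three-sigma threshold below are illustrative assumptions:

```python
import statistics

# Hypothetical bytes-per-hour samples for one integration endpoint.
BASELINE = [1200, 1350, 1280, 1190, 1420, 1310, 1275, 1330]


def is_anomalous(observed, baseline, threshold=3.0):
    """Flag an observation more than `threshold` standard deviations from
    the baseline mean; crude, but enough to surface unexpected connectivity."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return stdev > 0 and abs(observed - mean) / stdev > threshold


print(is_anomalous(1300, BASELINE))   # a normal hour -> False
print(is_anomalous(98000, BASELINE))  # a sudden bulk transfer -> True
```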

Step 12 – Plug potential data leaks from non-production systems

Review development, test, and pre-production environments to ensure that production data is not being used inadvertently. Where real-world data is used for testing, consider obfuscation tools or, better yet, eliminate its use by deploying synthetic data modeling tools.
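One common obfuscation technique is keyed pseudonymization: each real value is replaced with a stable, irreversible stand-in, so joins between test tables still line up without exposing the original data. A minimal sketch, with a hypothetical masking key:

```python
import hashlib
import hmac

# Hypothetical masking key; in real use, keep it out of source control.
MASKING_KEY = b"rotate-me"


def pseudonymize_email(email: str) -> str:
    """Replace a real address with a deterministic stand-in so test data
    keeps referential integrity without exposing the original value."""
    digest = hmac.new(MASKING_KEY, email.lower().encode(), hashlib.sha256)
    return f"user-{digest.hexdigest()[:12]}@masked.example"


print(pseudonymize_email("ada.lovelace@example.com"))
# The same input always yields the same mask, so joins across tables survive.
print(pseudonymize_email("ada.lovelace@example.com"))
```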

To conclude…

Data governance and security matter. Many business processes (and businesses) depend on the integrity of the data they consume. Preventing data leakage and being able to demonstrate good data stewardship are important for building customer confidence and avoiding regulatory penalties. And yet many systems integrators fail to treat integration-related data governance with the seriousness it deserves, or to allocate the resources needed to deliver it. This is always a false economy. The quality and integrity of the data flowing between systems underpin every process they serve: if the data cannot be trusted, then the process outputs that rely upon it cannot be trusted either.

Being aware of data governance good practices helps you ask the right questions and scope the project appropriately. When planning your integrations, we strongly recommend using the above 12-step plan when evaluating proposals and project plans, to ensure that the most significant data security risks are managed.

The Multishoring approach to data governance and security

Integration is what we do. Our team has seen the data governance issues outlined above many times and understands how to mitigate the risks, avoid common pitfalls, and ensure your integration project is successful. Whether we’re in a turn-around situation helping to get an off-the-rails project back on track, or working on a net-new integration, we focus on the things that truly matter: securing your data and implementing demonstrable data governance controls within the constraints you define.

Depending on your appetite for risk, available budget, and timeline, we will work with you to define the optimal schedule of works so that you feel confident that your data is secure and that the delivered solution will protect your sensitive information, issue-free, for as long as you need it. We don’t over-engineer projects to artificially inflate the scope of works, and we don’t skimp on activities that materially affect the integrity of the delivered integration.

Integrations may be complex, but they do not need to be complicated. We use our experience and expertise to take challenging integration scenarios and make them happen. We do what is needed to make your integrations stable, secure and scalable so that you can focus on other things.
