In today’s complex IT environment, maintaining data consistency across distributed copies of data is challenging, and data discrepancies are an unfortunate reality. If not discovered and addressed, bad data can lead to incorrect decision-making; failed service-level agreements; and ultimately, operational, financial, and legal risk.
No matter how carefully you restrict changes to the distributed data in your enterprise, discrepancies will sneak in and potentially wreak havoc on your reporting.
A key requirement in maintaining data consistency across your enterprise is to catch any discrepancy as soon as possible and fix it.
Oracle GoldenGate Veridata provides an easy-to-use yet powerful solution for identifying out-of-sync data before it negatively impacts the business.
I wrote a white paper that covers the challenges of keeping data consistent in an enterprise and how Oracle GoldenGate Veridata can address this critical issue. Please find the link to the white paper below:
In an enterprise, the data sets being compared can be very large, so you will need a good grasp of the system requirements when configuring Veridata. I have produced a technical handbook that goes into the details of configuring Veridata and provides detailed formulas and examples for evaluating system needs (threads, memory, disk) based on the size and growth of the data. It is available on the Oracle Support site.
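To give a feel for the kind of sizing arithmetic involved, here is a purely illustrative sketch: the constants and formula are my own assumptions for demonstration, not the actual formulas from the handbook or Oracle documentation.

```python
# Hypothetical capacity-planning sketch: estimate compare threads, memory,
# and disk from table size and growth. All constants are illustrative
# assumptions, NOT Oracle GoldenGate Veridata's documented formulas.

def estimate_resources(rows, avg_row_bytes, annual_growth_pct, years=1):
    # Project the row count out over the planning horizon.
    projected_rows = rows * (1 + annual_growth_pct / 100) ** years

    # Assume one compare thread per ~50M rows (illustrative).
    threads = max(1, round(projected_rows / 50_000_000))

    # Assume each thread buffers ~64 MB of row data (illustrative).
    memory_mb = threads * 64

    # Assume out-of-sync staging needs ~10% of the table size on disk (illustrative).
    disk_mb = projected_rows * avg_row_bytes * 0.10 / (1024 * 1024)

    return {"threads": threads, "memory_mb": memory_mb, "disk_mb": round(disk_mb)}

print(estimate_resources(rows=200_000_000, avg_row_bytes=250, annual_growth_pct=20))
# → {'threads': 5, 'memory_mb': 320, 'disk_mb': 5722}
```

The point of a worked model like this is that thread count, memory, and disk all scale with projected (not current) data volume, which is why the handbook ties its formulas to growth as well as size.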
Let me know what strategy and tools you use to maintain data consistency in your enterprise.