
Originally Prepared by Peter Fugere

The one part of any project involving volumes of data that can never be underestimated is data reconciliation. It will always take much longer than you planned. It will always be much more work than you think. The fact is, until you go through a process that forces you to evaluate your data, you do not know how clean it is.

That is one of the most understated benefits of doing an implementation with any reporting tool. You will find all of your data problems. You will see them because you will be taking the data you see now and looking at it in new ways. If there is a flaw in the data, it comes out when you do this data reconciliation. So take some comfort in knowing that when you are done with the process, your data will be in pretty good shape.

Your project’s biggest risk.

So, why is it so much work? The biggest reason is that this is the first time, as a user of the system, you have to sit down and navigate through the database. Do not minimize the fact that you are going to be validating data in a new system, and that will present challenges.

 

The next reason this will take longer is that, frankly, this is not glamorous work. In fact, it is tedious and time-consuming, so it is easy to get distracted and pulled away to do something else.

The only way to get through the data reconciliation is to roll up your sleeves and get to work. But you need a plan. You first need to decide at what level you will be reconciling. Is checking every account necessary when you know they sum to the parent? Or do you only need to do that once or twice and then move on? I would recommend, for each of the account and custom dimensions, going to the parent entity after consolidation and ensuring the children are rolling to the correct parents and the data values are flowing correctly. Once this is done, there is no sense in continuing to do it. It is safe to assume it now works for all entities. You have tested the functionality. Now you can focus on the data.

 

But first you need to build some tools to help you in the process. The first is an Excel spreadsheet that includes all accounts for the OneStream application. These accounts will include all levels of the hierarchy and will be organized into groups based on what they provide (P&L, B/S, Calculations, and Statistical). This spreadsheet will be the primary tool you use to reconcile. Why use a spreadsheet and not a grid or report? You can build quick calculations right on the face of the spreadsheet. That helps you find parents that are not rolling correctly or adjustments that need to be made. Spreadsheets are also common tools for other products, from which you might be able to pull data to help move the reconciliation along. This flexibility makes it much easier to build adjustments and make corrections. OneStream provides a very versatile and intuitive spreadsheet add-in that makes this the preferred tool.
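To make the idea concrete, here is a minimal sketch of the same rollup check expressed in Python with pandas rather than Excel formulas; the account names, parent groupings, and amounts are invented for illustration.

```python
import pandas as pd

# Hypothetical extract: one row per base account with its parent and loaded amount.
data = pd.DataFrame({
    "Parent":  ["GrossProfit", "GrossProfit", "OpEx", "OpEx"],
    "Account": ["Revenue", "COGS", "Salaries", "Rent"],
    "Amount":  [1000.0, -600.0, -250.0, -50.0],
})

# Hypothetical report of parent-level values pulled from the new application.
parents = pd.DataFrame({
    "Parent": ["GrossProfit", "OpEx"],
    "ReportedAmount": [400.0, -310.0],
})

# Sum the children and compare to what the application reports for each parent.
rollup = data.groupby("Parent", as_index=False)["Amount"].sum()
check = parents.merge(rollup, on="Parent", how="left")
check["Difference"] = check["ReportedAmount"] - check["Amount"]

# Any non-zero difference flags a parent that is not rolling up as expected.
print(check[check["Difference"].abs() > 0.005])
```

Whether the formula lives in Excel or in a script, the point is the same: the moment a parent stops agreeing with the sum of its children, you see exactly where.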

 

The second tool you will need is a set of audited reports, or information you have gotten from the source system, to which you can reconcile. It absolutely needs to be verified as the same data you are loading. It also needs to be at the correct level of detail for the reconciliation you are doing. Let me tell you, it is a nightmare to try to reconcile to something you cannot drill into. The more of this material you can get in a form that works with the spreadsheets you built above, the better. You can use simple functions in Excel to speed up the tedious ticking and tying.

 

Now you are ready to reconcile the data. You want to stick to your plan. You can break this up across a team, but you should follow these steps. The first step of the reconciliation is to validate all base accounts and base customs for a data load file until you are comfortable the mapping from your ETL is correct. This is usually no more than two to three periods of data. Once you are sure all the base accounts are correct and mapped, you are ready to see the data aggregate and consolidate in OneStream.
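As an illustration of that first step, here is a small hypothetical sketch of checking a source extract against the base-level data pulled back out of the new application after the ETL mapping; the account codes, mapping table, and amounts are made up.

```python
import pandas as pd

# Hypothetical source-system extract and the base-level data pulled back out of the
# new application after the load; the codes and names exist only for this sketch.
source = pd.DataFrame({
    "SourceAccount": ["4000", "5000", "6100"],
    "Amount":        [1000.0, -600.0, -250.0],
})
loaded = pd.DataFrame({
    "Account": ["Revenue", "COGS", "Salaries"],
    "Amount":  [1000.0, -600.0, -255.0],
})

# The ETL mapping from source accounts to target base accounts (illustrative only).
mapping = {"4000": "Revenue", "5000": "COGS", "6100": "Salaries"}
source["Account"] = source["SourceAccount"].map(mapping)

# Compare source and loaded amounts account by account and flag differences.
compare = source.groupby("Account", as_index=False)["Amount"].sum().merge(
    loaded, on="Account", how="outer", suffixes=("_source", "_loaded")
)
compare["Difference"] = compare["Amount_loaded"].fillna(0) - compare["Amount_source"].fillna(0)
print(compare[compare["Difference"].abs() > 0.005])
```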

 

The second step is to validate parent accounts and customs for a set of entities. This is, as discussed, to ensure the structures you have built are in fact what you need. You may need to build subtotals using Excel functionality where the consolidation points differ between the old and new hierarchies; a sketch of that idea follows. The calculation accounts will also be aligned to ensure the old and new are correct and that any changes to the calculation methodology can be identified and explained. Key accounts that are the same between the applications at the higher levels will be identified (with auditor assistance) and used to validate the consolidation points.
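Where the old and new hierarchies consolidate differently, the subtotal you rebuild in Excel amounts to something like the following hypothetical sketch, which reconstructs an old consolidation point from the new base members and compares it to the old report; all names and figures are illustrative.

```python
import pandas as pd

# Hypothetical base-level balances from the new application, and the subtotal the
# old hierarchy reported as "TotalOperatingCosts".
new_base = pd.Series({"Salaries": -250.0, "Rent": -50.0, "Depreciation": -30.0})
old_report = {"TotalOperatingCosts": -330.0}

# Map each old consolidation point to the new base members that make it up, rebuild
# the old subtotal from the new data, and compare it to the old report.
old_to_new_members = {"TotalOperatingCosts": ["Salaries", "Rent", "Depreciation"]}

for subtotal, members in old_to_new_members.items():
    rebuilt = new_base[members].sum()
    difference = old_report[subtotal] - rebuilt
    print(f"{subtotal}: old {old_report[subtotal]:.2f}, rebuilt {rebuilt:.2f}, diff {difference:.2f}")
```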

 

The third step is to make any base-level adjustments. You will likely find issues at this point. Hierarchies might have been reported incorrectly, or changed over time. Calculations may have changed. There are always one-off adjustments. Do not make the mistake of building things in OneStream for the way things used to be. Those types of complex structures will make it harder to make changes in the future. They will make general maintenance and upgrades harder and create work that has no value going forward. You should not waste time building a structure for old reporting or old issues. Post a journal and move on. Given the audit features and reporting around journals, you can see why this is the preferred way to get the data into OneStream. It is less work and yields a better result.

 

The fourth step is to validate the translation. Depending on how you were doing translation before, you may see differences. It is important to remember that it is not just the rate but the methodology that can change. The methodology of translation only affects the flow accounts. All balance sheet accounts will use the rate at the end of the period the balance is in. The flow accounts (revenue, expense and cash flow) should translate using an average rate. The average rate can be an average for the year or for the month. Fortunately, OneStream can translate using both a year-to-date average and a periodic average. Periodic translation is done each month, so the sum of the months yields the year-to-date translated amount. You have to have all of the months of a given year loaded to see the correct translated amounts. For example, if you were translating a full year using the periodic method, you would need all twelve months.
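A simple worked illustration of the difference between the two methodologies, using made-up rates and amounts, looks like this:

```python
# Hypothetical local-currency revenue by month and average rates for three months.
local_amounts = [100.0, 120.0, 110.0]          # local currency, by period
periodic_rates = [1.10, 1.12, 1.08]            # average rate for each month
ytd_rate = 1.10                                # year-to-date average rate at month 3

# Periodic method: translate each month at its own average rate, then sum the months.
periodic_ytd = sum(a * r for a, r in zip(local_amounts, periodic_rates))

# Year-to-date method: translate the cumulative balance at the YTD average rate.
ytd_method = sum(local_amounts) * ytd_rate

print(f"Periodic method YTD: {periodic_ytd:.2f}")   # 110.00 + 134.40 + 118.80 = 363.20
print(f"YTD-average method:  {ytd_method:.2f}")     # 330.00 * 1.10 = 363.00

# The two methodologies give slightly different answers; that is exactly the kind of
# difference you need to be able to identify and explain during reconciliation.
```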

 

Certain accounts add difficulty because, at the end of the previous year, the data was translated at last year's closing rate, so the current period's opening balances need to show the same value as the prior year's closing balances. Other accounts need to translate at historical rates. Dividends in the profit and loss account use the rate at the date they are declared.

 

It is possible you are taking advantage of an upgrade or rebuild project of OneStream to redefine your translation methodology. If that is the case, you would expect translation differences. You have a couple of ways to handle the changes in translation for history. You can build something for the way the accounts used to be translated, but that is not a best practice. As with the base members, it will make general maintenance and upgrades harder. You could try to post adjustments, but depending on the level you choose, you will find there are many entries that need to be made. Or you could simply load translated values for the historical periods. I would recommend that approach since it will be easier to reconcile. You can then turn on translation for the periods where the methodology is what you will use going forward. It is possible in OneStream to load parent currency, and by simply using the Flow dimension you can see how straightforward it is to handle historical currency issues.

 

Translation is another place where reconciling more than a handful of periods gives you diminishing returns. The translation is just a set of rules. Once you have ensured the translation functionality is working, you should move on. Then you can be sure the only variables you have are the currency rates and how you decided to handle historical overrides for equity accounts. Doing any more than this is like making sure 1 plus 1 equals 2, over and over again.

 

The next step of the reconciliation is to reconcile parent entities. You should start with a group of lower entities, where you can see the data you loaded consolidate up to the parent entity. Then you can move to entities higher in the structure. The historical reconciliations will take place at all top-level members within the entity structure and at specific legal entities within the structures you defined. This is one place where I would reconcile every period of data.

 

A second area of testing occurs when the reports are converted to the new chart of accounts. Hard copies of the reports are printed out, tied out with tick marks, and collected in a binder, which is reviewed and signed by either internal audit or the controller.

These steps will validate both the base-level data mappings and the consolidation points in the financial statements. It is a thorough approach that minimizes working over the same data again and again. Note that it may not be possible to validate every aggregation or consolidation point, because structure changes mean different accounts may roll together or some points may no longer exist. Those spots will need extra effort.

 

Also, calculations may have changed due to the new chart or a new standardization of the calculation criteria, and these differences will need to be explained. Many of the old calculations are also invalid or no longer used and have been deleted from the functional chart of accounts. For these instances, you should have a reconciling report that shows and explains the differences. Fortunately, the spreadsheet you use to reconcile can also be used to document this.

Once you have your reports and the data is signed off, you should lock the historical periods. You do not want anyone making changes to these periods, even by accident. You should also make full data extracts of all the periods and save them, along with a set of the reconciliation reports, as PDFs. This, with the binder and sign-offs, should satisfy any audit or review of the project.

 

How much historical data should I have in my application?

When considering how much work is involved in building out the application, you need to ask yourself, "How much historical data should I load and reconcile?" Yes, it is work, but this reconciliation step serves two significant purposes. First, it trains your core team to support the application. Just the process of reconciling historical data helps people understand the system and troubleshoot most of the common issues. Working through those common issues makes sure they are ready to support the application after the reconciliation is done.

The second benefit of reconciling more periods of data is that it functions as a test of the rules and the data load process. The more periods you load and reconcile, the more confident you can be that the application is ready for the close and that you can handle data mapping issues. While OneStream has real error handling in the rules, and compiles the rules before they are run, the business rules you wrote were written based on assumptions and expected data sets. What happens when those expected data sets change? What happens when the assumptions change? More than likely the rules will fail. So the best thing you can do is run more periods of data through the system. This is also true for the data integration you built. Structures do change over time, to varying degrees depending on your business and process. So when new accounts are added, how does your system respond? Even if the data integration handles these changes with dynamic mapping, it is still helpful for the team to see this work and to be ready to reconcile the data in these new members.
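One small, hypothetical example of the kind of check this exercise gives you: flagging source accounts that appear in a new extract but have no mapping yet. The account codes and column names below are invented for the sketch.

```python
import pandas as pd

# Hypothetical monthly extract and the mapping table the integration uses; in a real
# load these would come from the source system and your ETL configuration.
extract = pd.DataFrame({
    "SourceAccount": ["4000", "5000", "6100", "6200"],   # 6200 is a newly added account
    "Amount":        [1000.0, -600.0, -250.0, -40.0],
})
mapping = pd.DataFrame({
    "SourceAccount": ["4000", "5000", "6100"],
    "TargetAccount": ["Revenue", "COGS", "Salaries"],
})

# Flag any source account that has no target before the data is loaded, so the team
# sees new members the moment the source structure changes.
merged = extract.merge(mapping, on="SourceAccount", how="left")
unmapped = merged[merged["TargetAccount"].isna()]
if not unmapped.empty:
    print("Unmapped source accounts found:")
    print(unmapped[["SourceAccount", "Amount"]])
```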

Still, you may find for audit or tax reasons that you want seven years of history in your application. There is no reason why you cannot start with a number of periods and then backfill from there. With that said, many people elect to load and consolidate two years of data for the application and build back from there. The benefit is that you reduce risk to the project because you have fewer data values to work through. Two years for the implementation seems to be the best option for most people. It meets the majority of required reporting for go-live, and it is not so little that there is uncertainty about whether the application is ready for production use.

For some historical changes, people want to see the data as if the change had not happened: changes like an acquisition, a merger, or a restructuring of entities or debt. One way to accomplish this is to build a Pro Forma scenario type. This is really just another scenario where you can load data with different attributes. For example, you could load a full-year profit and loss statement instead of only the months that were loaded in the actual scenario. Now you can see what the company would look like if you had owned that company the whole year.
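As a trivial, made-up illustration of the idea, assume a company acquired on 1 October, so only the last three months hit the actual scenario while the pro forma scenario carries the full year:

```python
# Hypothetical monthly net income for the acquired company, in local books.
acquired_monthly_pnl = [80.0] * 12          # full-year monthly P&L
months_owned = 3                            # October through December

actual_scenario = sum(acquired_monthly_pnl[-months_owned:])    # what was actually loaded
pro_forma_scenario = sum(acquired_monthly_pnl)                 # as if owned all year

print(f"Actual scenario (post-acquisition months only): {actual_scenario:.1f}")
print(f"Pro forma scenario (full year):                 {pro_forma_scenario:.1f}")
```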

 

So why is data reconciliation so difficult?

So if there is a proven and reliable approach to reconciliation, why do people fail here? Why is it so hard for some people to reconcile their data? Often it is a combination of factors. When there are data issues, the consulting team can only go so far. The business users need to take control of this process. You cannot bring in a ton of external people to work on the data: they will take longer than you think to get up to speed, they will leave with the knowledge, and hiring consultants is very expensive. You need to assign the people who will be supporting the application and put them on this phase full time. If dates start to slip, you need to make sure there are no bottlenecks, for example, users finding issues and all taking them to one person to resolve. The group needs to make a concentrated effort to complete this critical task.

 

It is true this is where most people struggle. But with a good plan and procedure, any team can work through these issues. At OneStream we use the PRIME implementation methodology. A repeatable, defined methodology is critical to minimizing risk.
