Forum Discussion
Hi,
Thanks for getting back to me with your use case!
Adding data directly to the data tables using SQL is strongly discouraged. It risks system integrity because you only update that one table and not all of the related tables, e.g. the audit tables. You should always create your data in a business rule (for instance with the help of SQL and a DataTable or DataSet) and then save the data to the database using the native OneStream functions (APIs).
Using SQL directly also risks leaving the solution unsupported by OneStream Support if something breaks.
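To illustrate that pattern, here is a minimal sketch (C#, in the same style as Fred's more complete example further down in this thread): the source rows are read through BRApi.Database and written back through the api.Data functions inside a Finance rule's Calculate pass, instead of inserting rows with raw SQL. The table name MySourceData and its columns are placeholders, and api.Data.Calc is used here only to keep the illustration short.
case FinanceFunctionType.Calculate:
    using (var dbConn = BRApi.Database.CreateApplicationDbConnInfo(si))
    {
        // Read the source rows through the OneStream database API (placeholder table/columns).
        var dt = BRApi.Database.GetDataTable(dbConn, "SELECT Amount, AccountName FROM MySourceData",
            new List<DbWhere>(), new List<DbOrderBy>(), false);

        foreach (DataRow row in dt.Rows)
        {
            var account = row["AccountName"].ToString();
            var amount = Convert.ToDecimal(row["Amount"]).ToString(System.Globalization.CultureInfo.InvariantCulture);

            // Write through the native data API so audit and related tables stay consistent.
            api.Data.Calc($"A#{account} = {amount}");
        }
    }
    break;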
Thank you, Henning, for this information.
I will check how to do it using a business rule as mentioned.
Do you have any idea if there is a possibility to add consolidated data using a business rule without going through the OneStream consolidation process?
Thank you for your help.
Regards,
- FredLucas · 7 months ago · Contributor III
Hi aabirached,
Am I right in assuming that you are trying to migrate consolidated historical data from an existing source system into OneStream, and that is why you need to be able to load parent-level / consolidated data instead of consolidating it in OneStream?
If that is the case you could follow these steps:
1 - Load data into Stage
Can Load Unrelated Entities must be set to True. There is no need to assign any entity to this workflow.
The resulting load will look something like this:
Because it is Stage only, the loaded data is only validated against pure metadata (meaning it only checks whether a member with that name exists). This allows you to load and validate parent entities too.
2 - Write a Finance rule that gets the data from Stage, converts it into data buffers and writes it to the cube. See the sample snippet below (in C#).
Please do read the comments, as some are very important to take into account:
case FinanceFunctionType.Calculate:
    var entityDbw = new DbWhere("EtT", DbOperator.IsEqualTo, api.Pov.Entity.Name);
    var scenarioDbw = new DbWhere("SnT", DbOperator.IsEqualTo, api.Pov.Scenario.Name);
    var timeDbw = new DbWhere("TmT", DbOperator.IsEqualTo, api.Pov.Time.Name);

    using (var appdb = BRApi.Database.CreateApplicationDbConnInfo(si))
    {
        // Only select the columns for the dimensions you care about, to avoid loading a bunch of empty strings.
        // Also keep in mind that the column order is respected in the data table, which indexes columns zero-based.
        // In fact, later we simply use the column ordinal, not the column name.
        var df = BRApi.Database.GetDataTable(appdb, "SELECT Am,AcT,IcT,U1T,U2T FROM StageSummaryTargetData",
            new List<DbWhere> {entityDbw, scenarioDbw, timeDbw}, new List<DbOrderBy>(), false);

        // Create an empty buffer
        var db = new DataBuffer();

        // Iterate over the rows and populate the data buffer
        foreach (DataRow row in df.Rows)
        {
            // Create a new cell (essentially a record)
            var dbCell = new DataBufferCell();

            // Populate all the used dimensions and the data (if you have a lot of data to populate this way,
            // it is better to use the column ordinal rather than the column name)
            dbCell.SetData((decimal)row[0], DataCellExistenceType.IsRealData, DataCellStorageType.DurableCalculation);
            dbCell.SetAccount(api, row[1].ToString());
            dbCell.SetIC(api, row[2].ToString());
            dbCell.SetOrigin(api, "AdjConsolidated"); // Here we write into AdjConsolidated, but you can write to Import or Forms too
            dbCell.SetUD1(api, row[3].ToString());
            dbCell.SetUD2(api, row[4].ToString());

            // Add the cell to the data buffer; aggregate if the cell is duplicated
            // (should not happen, because the data would already have been aggregated in Stage)
            db.SetCell(si, dbCell, true);
        }

        // You can use this to debug the content of the data buffer in the error log
        //db.LogDataBuffer(api, "ParentLoadTest", 1000);

        // Here we set the data unit destination, including the target view set to YTD.
        // Bear in mind that a Force Calculate on all the entities will run in parallel, so if you
        // force calculate many entities the database is going to be hit quite hard.
        // Even if you use a separate application, there is only one database server per environment.
        // Take this into consideration to avoid draining the database of resources during data import.
        var povCell = new DataCellPk(api.Pov.GetDataUnitPk(), DimConstants.YTD, db.CommonDataBufferCellPk);
        var destInfo = new ExpressionDestinationInfo(povCell, povCell, false);
        api.Data.SetDataBuffer(db, destInfo);
    }
    break;
3 - Assign this business rule to the relevant cubes.
4 - Trigger a Force Calculate; a simple way to do this is via a Data Management step:
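For reference, a rough sketch of what that could look like. The step properties below are written from memory and may be labelled differently in your release, the sequence name ForceCalcParentLoad is just a placeholder, and the BRApi call signature is an assumption to verify in your environment.
// Illustrative only: a Data Management "Calculate" step configured roughly as follows
// (exact property labels may vary by release):
//   Cube:            the target cube
//   Entity Filter:   the parent entities loaded in step 1
//   Scenario / Time: the data units loaded in step 1
//   Force Calculate: True
//
// The sequence containing that step can also be started from an Extender rule,
// e.g. (signature assumed, please verify):
BRApi.Utilities.StartDataMgmtSequence(si, "ForceCalcParentLoad", string.Empty);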
- aabirached · 6 months ago · New Contributor II
Hello Fred,
Thank you for your reply. Your assumption is right 😊
I am trying to import consolidated data from an external table, and I would like to test adding this data to OneStream.
I haven't had time to test your solution yet. I will get back to you next week once I finish my tests.
Thank you again for your help.
Regards,
- Henning · 6 months ago · Valued Contributor II
Hi, am I right in assuming that this data is just a historical "data dump", i.e. not part of the reconciled data for the statutory consolidation?
The code that has been shared here allows you to copy data to parent entities, but if data is also loaded to all entities, the data from the base entities will still translate and consolidate up to the parent entities. So, at the very least, the consolidation percentage should be set to 0 to prevent this.
If this is indeed statutory consolidated historical data that is also used to provide closing balances as the starting point in OneStream (so that the correct opening balances are pulled from there), I recommend not following this approach. The typical approach is to load the data to the base entities and then let the system translate, calculate share and eliminate the data. Additional adjustments should be posted via journals. I know this takes more time, but it gives control to the accountants and ensures data integrity in the statutory actual scenario, as all data is based on the same consolidation logic. It also helps validate the solution, in the sense that once historical data has been reconciled in OneStream, you know the system works as per your requirements.
Historical journals can be loaded directly into OneStream as well, in case you have a lot of them and posting each one individually seems too time-consuming.
- Henning · 6 months ago · Valued Contributor II
Hi, I am with Fred on this one. Did Fred assume your requirement / use case correctly? If not, may I ask for some further details about what you would like to achieve? Maybe we can point you in the right direction. At the very least, we should be able to put some more options on the table.
- aabirached · 6 months ago · New Contributor II
Yes Henning, that is exactly the requirement I am looking for.
I will test the solution and get back to you.
Regards,