Forum Discussion
Hi aabirached,
Am I right in assuming that you are trying to migrate consolidated historical data from an existing source system into OneStream, and that is why you need to be able to load parent-level / consolidated data instead of consolidating it in OneStream?
If that is the case you could follow these steps:
1 - Load data into Stage
Can Load Unrelated Entities must be set to True. There is no need to assign any entity to this workflow.
The resulting load will look something like this:
Because it is a stage-only load, the data is validated only against pure metadata (meaning it only checks whether a member with that name exists). This allows parent entities to be loaded and validated too.
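For illustration, a delimited load file for such a stage-only load might look like the hypothetical rows below (the entity TotalGroup is a parent member, and the layout and member names are made up for the example; your transformation rules would map the columns accordingly):

Entity,Account,IC,UD1,UD2,Amount
TotalGroup,Revenue,None,None,None,1250000
TotalGroup,NetIncome,None,None,None,310000

Because the load stops in Stage and Can Load Unrelated Entities is True, the parent member TotalGroup passes validation even though no data is stored against it in the cube at this point.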
2 - Write a Finance rule that gets the data from Stage, converts it into data buffers and writes it to the cube. See the sample snippet below (in C#).
Please read the comments, as some are very important to take into account:
case FinanceFunctionType.Calculate:
    var entityDbw = new DbWhere("EtT", DbOperator.IsEqualTo, api.Pov.Entity.Name);
    var scenarioDbw = new DbWhere("SnT", DbOperator.IsEqualTo, api.Pov.Scenario.Name);
    var timeDbw = new DbWhere("TmT", DbOperator.IsEqualTo, api.Pov.Time.Name);
    using (var appdb = BRApi.Database.CreateApplicationDbConnInfo(si))
    {
        // Select only the columns for the dimensions you care about, to avoid loading a bunch of empty strings.
        // Also keep in mind that the order of the columns is respected in the data table, which indexes columns zero-based.
        // In fact, later on we simply use the column ordinal, not the column name.
        var df = BRApi.Database.GetDataTable(appdb, "SELECT Am,AcT,IcT,U1T,U2T FROM StageSummaryTargetData", new List<DbWhere> { entityDbw, scenarioDbw, timeDbw }, new List<DbOrderBy>(), false);
        // Create an empty data buffer
        var db = new DataBuffer();
        // Iterate over the rows and populate the data buffer
        foreach (DataRow row in df.Rows)
        {
            // Create a new cell (essentially a record)
            var dbCell = new DataBufferCell();
            // Populate all the used dimensions and the data (if you have a lot of data to populate this way, it is better to use the column ordinal rather than the column name)
            dbCell.SetData((decimal)row[0], DataCellExistenceType.IsRealData, DataCellStorageType.DurableCalculation);
            dbCell.SetAccount(api, row[1].ToString());
            dbCell.SetIC(api, row[2].ToString());
            dbCell.SetOrigin(api, "AdjConsolidated"); // Here we write into AdjConsolidated, but you can write to Import or Forms too
            dbCell.SetUD1(api, row[3].ToString());
            dbCell.SetUD2(api, row[4].ToString());
            // Add the cell to the data buffer, aggregating if the cell is duplicated (should not happen, because data has already been aggregated in stage)
            db.SetCell(si, dbCell, true);
        }
        // You can use this to debug the content of the data buffer in the error log
        //db.LogDataBuffer(api, "ParentLoadTest", 1000);
        // Here we set the data unit destination, including the target view set to YTD.
        // Bear in mind that a force calculate on all the entities will run in parallel; therefore, if you force calculate many entities, the database is going to be hit quite hard.
        // Even if you use a separate application, there is only one database server per environment. Keep this in mind to avoid draining the database of resources during the data import.
        var povCell = new DataCellPk(api.Pov.GetDataUnitPk(), DimConstants.YTD, db.CommonDataBufferCellPk);
        var destInfo = new ExpressionDestinationInfo(povCell, povCell, false);
        api.Data.SetDataBuffer(db, destInfo);
    }
    break;
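For context, here is a minimal sketch of where that Calculate case would sit, assuming the standard Finance business rule template (the namespace/class scaffolding and using statements that the rule editor generates are omitted here):

public object Main(SessionInfo si, BRGlobals globals, FinanceRulesApi api, FinanceRulesArgs args)
{
    try
    {
        // The rule is invoked for different function types; we only act on Calculate
        switch (api.FunctionType)
        {
            case FinanceFunctionType.Calculate:
                // ... paste the data buffer logic from the snippet above here ...
                break;
        }
        return null;
    }
    catch (Exception ex)
    {
        // Standard OneStream error handling pattern
        throw ErrorHandler.LogWrite(si, new XFException(si, ex));
    }
}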
3 - Assign this business rule to the relevant cubes.
4 - Trigger a Force Calculate; a simple way to do this is via a Data Management step:
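As a rough illustration (field labels vary slightly by version, and the member names here are hypothetical), the step conceptually just needs to know which data units to force calculate, for example:

Step Type: Calculate
Calculation Type: ForceCalculate
Cube: YourCube
Entities: E#TotalGroup
Scenarios: S#Actual
Time Periods: T#2015M12

As per the comment in the rule above, force calculating many entities in parallel can put significant load on the database server, so scope the entity, scenario and time filters as narrowly as possible.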
Hello Fred,
Thank you for your reply. Your assumption is right 😊
I am trying to import consolidated data from an external table and I would like to test adding this data to OneStream.
I haven't had time to test your solution yet. I will get back to you next week when I finish my tests.
Thank you again for your help.
Regards,
- Henning
Hi, am I right in assuming that this data is just a historical "data dump"? This is not part of the reconciled data for statutory consolidation I assume?
The code that has been shared here allows you to copy data to parent entities, but if data is also loaded to the base entities, that data will still translate and consolidate up to the parent entities. So, at a minimum, the consolidation percentage should be set to 0 to prevent this.
If this is indeed statutory consolidated historical data that is also used to provide closing balances as the starting point in OneStream, so that the correct opening balances are pulled from there, I recommend not following this approach. The typical approach is to load the data to base entities and then let the system translate, calculate share and eliminate the data. Additional adjustments should be posted via journals. I know this takes more time, but it gives control to the accountants and it ensures data integrity in the statutory actual scenario, as all data is based on the same consolidation logic. It also helps validate the solution, in the sense that once historical data has been reconciled in OneStream, you know the system works as per your requirements.
Historical journals can be loaded directly into OneStream as well in case you have a lot of those and posting each one individually seems too time-consuming.
- aabirached
Hello Henning,
Thank you again for your help.
I confirm that this is just a historical static data dump. I am not aiming to use this data to provide closing balances as the starting point in OneStream.
I just need to add these consolidated amounts in OneStream for read only purposes.
Is it possible to load data directly to the Top consolidation member?
Thank you in advance.
Regards,
- Henning
No, C#Top does not store data. Data is stored (amongst others) on C#Local.
You could also consider using technical / historical entities for 'dumping' this data. That way you could load all of your data to a technical base entity in OneStream that represents a parent in your legacy system, and leverage the built-in load functionality via the workflow. But all of that, of course, depends on your solution and requirements.