Forum Discussion
Hello Henning,
Thank you for your fast reply 😊
I am trying to add some data to the data record directly and I was studying the column values to prepare the insert statement.
Is there any SQL function that helps me add the data without having to calculate these columns (UD34, UD56 and UD78)?
The same question for the partition ID. I understood that this column is used by OneStream’s in-memory engines to split up processing by EntityId. Is there a way to know what value I should give to this column?
Thank you in advance for your help.
Regards,
- Henning · 7 months ago · Valued Contributor II
Hi,
Thanks for getting back to me with your use case!
Adding data directly to the data tables using SQL is strongly discouraged. It puts system integrity at risk, because you update only that table and not the related tables (e.g. the audit tables). You should always create your data in a business rule, for instance with the help of SQL and a DataTable or DataSet, and then save the data to the database using native OneStream functions (APIs).
Writing via SQL also risks leaving the solution unsupported by OneStream Support if something breaks.
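For illustration, a minimal sketch of that pattern in a Finance business rule (C#). It uses the same DataBuffer API calls that appear in the snippet later in this thread; the member names ("Sales", "Forms") are hypothetical and the value is hard-coded for brevity:

```csharp
// Minimal sketch (runs inside a Finance rule, where si and api are provided by OneStream;
// the member names below are hypothetical).
case FinanceFunctionType.Calculate:
    var buffer = new DataBuffer();

    var cell = new DataBufferCell();
    cell.SetData(1000m, DataCellExistenceType.IsRealData, DataCellStorageType.DurableCalculation);
    cell.SetAccount(api, "Sales");   // hypothetical account member
    cell.SetOrigin(api, "Forms");    // target origin member
    buffer.SetCell(si, cell, true);  // true = aggregate if the cell already exists

    // Save through the native API so that related tables (audit etc.) stay consistent.
    var pk = new DataCellPk(api.Pov.GetDataUnitPk(), DimConstants.YTD, buffer.CommonDataBufferCellPk);
    api.Data.SetDataBuffer(buffer, new ExpressionDestinationInfo(pk, pk, false));
    break;
```

This is the same save path used in the full example further down; only the source of the values differs (hard-coded here, read from Stage there).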
- aabirached · 7 months ago · New Contributor II
Thank you, Henning, for this information.
I will look into doing it with a business rule, as you mentioned.
Do you know whether it is possible to add consolidated data using a business rule, without going through OneStream's consolidation process?
Thank you for your help.
Regards,
- FredLucas · 7 months ago · Contributor III
Hi aabirached,
Am I right in assuming that you are trying to migrate consolidated historical data from an existing source system into OS, and that is why you need to load parent-level / consolidated data instead of consolidating it in OS?
If that is the case you could follow these steps:
1 - Load data into Stage
Can Load Unrelated Entities must be set to True on the workflow. There is no need to assign any entity to this workflow.
The resulting load will look something like this (screenshot omitted):
Because it is Stage only, the loaded data is validated purely against metadata (meaning it only checks whether a member with that name exists). This allows you to load and validate parent entities too.
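To make step 1 concrete, here is a hypothetical delimited load file; the column layout and member names are assumptions and depend on your data source definition. The first row targets a parent entity, which Stage accepts because only member existence is validated:

```text
Entity,Scenario,Time,Account,Amount
TotalEurope,Actual,2023M12,Sales,1500000
Germany,Actual,2023M12,Sales,900000
France,Actual,2023M12,Sales,600000
```

`TotalEurope` would normally be rejected by a cube load as a parent-level member, but a Stage-only load lets it through for the Finance rule in step 2 to pick up.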
2 - Write a Finance rule that gets the data from Stage, converts it into data buffers, and writes it to the cube. See the sample snippet below (in C#).
Please do read the comments, as some are very important to take into account:
```csharp
case FinanceFunctionType.Calculate:
    var entityDbw = new DbWhere("EtT", DbOperator.IsEqualTo, api.Pov.Entity.Name);
    var scenarioDbw = new DbWhere("SnT", DbOperator.IsEqualTo, api.Pov.Scenario.Name);
    var timeDbw = new DbWhere("TmT", DbOperator.IsEqualTo, api.Pov.Time.Name);

    using (var appdb = BRApi.Database.CreateApplicationDbConnInfo(si))
    {
        // Select only the columns for the dimensions you care about, to avoid loading a bunch
        // of empty strings. Also keep in mind that the column order is preserved in the data
        // table, which indexes columns zero-based. Later we simply use the column ordinal,
        // not the column name.
        var df = BRApi.Database.GetDataTable(appdb, "SELECT Am,AcT,IcT,U1T,U2T FROM StageSummaryTargetData",
            new List<DbWhere> {entityDbw, scenarioDbw, timeDbw}, new List<DbOrderBy>(), false);

        // Create an empty buffer
        var db = new DataBuffer();

        // Iterate over the rows and populate the data buffer
        foreach (DataRow row in df.Rows)
        {
            // Create a new cell (essentially a record)
            var dbCell = new DataBufferCell();

            // Populate all the used dimensions and the data value. (If you have a lot of data
            // to populate this way, it is better to use the column ordinal rather than the
            // column name.)
            dbCell.SetData((decimal)row[0], DataCellExistenceType.IsRealData, DataCellStorageType.DurableCalculation);
            dbCell.SetAccount(api, row[1].ToString());
            dbCell.SetIC(api, row[2].ToString());
            // Here we write into AdjConsolidated, but you can write to Import or Forms too
            dbCell.SetOrigin(api, "AdjConsolidated");
            dbCell.SetUD1(api, row[3].ToString());
            dbCell.SetUD2(api, row[4].ToString());

            // Add the cell to the data buffer, aggregating if the cell is duplicated
            // (should not happen, because the data has already been aggregated in Stage)
            db.SetCell(si, dbCell, true);
        }

        // You can use this to debug the content of the data buffer in the error log
        //db.LogDataBuffer(api, "ParentLoadTest", 1000);

        // Set the data unit destination, including the target view set to YTD.
        // Bear in mind that a force calculate runs all the entities in parallel; if you force
        // calculate many entities, the database is going to be hit quite hard. Even if you use
        // a separate application, there is only one database server per environment. Keep this
        // in mind to avoid draining the database of resources during the data import.
        var povCell = new DataCellPk(api.Pov.GetDataUnitPk(), DimConstants.YTD, db.CommonDataBufferCellPk);
        var destInfo = new ExpressionDestinationInfo(povCell, povCell, false);
        api.Data.SetDataBuffer(db, destInfo);
    }
    break;
```
3 - Assign this business rule to the relevant cubes.
4 - Trigger a Force Calculate. A simple way to do this is via a Data Management step:
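If you prefer to trigger the force calculate from code rather than from the UI, an Extensibility rule can start a Data Management sequence. This is a sketch only: "ForceCalcSequence" is a hypothetical sequence name, and you should verify the BRApi call against your OneStream version:

```csharp
// Sketch, not a definitive implementation: "ForceCalcSequence" is a hypothetical
// Data Management sequence that wraps the Force Calculate step from step 4.
// Verify BRApi.Utilities.StartDataMgmtSequence against your platform version.
BRApi.Utilities.StartDataMgmtSequence(si, "ForceCalcSequence", string.Empty);
```

Keep the parallelism warning from the code comments above in mind before pointing such a sequence at a large set of entities.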