Forum Discussion
Hi, by default each data unit gets cleared during a calculation operation (e.g. during a consolidation). The only exception is when the corresponding scenario setting is set to False; however, in all my years in the OneStream world, I have never seen it set to False.
Your calculation really just seems to pull data, without a filter, from one scenario to another. Just use a standard api.data.calculate("S#destinationScenario = RemoveZeros(S#sourceScenario)"), restricted to local currency, base entities, and whatever else you might want.
Use a scenario member formula for that to make sure this gets executed first.
My high-level guess as to why yours does not work is that the member formula executes during a calculation of the source scenario. Keep in mind, calculations in OneStream are "pull", i.e. you cannot push data from your source scenario to your destination scenario. The calculation has to be executed from the destination scenario in order to pull the data from the source scenario.
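Something along these lines should do it (just a sketch, not a drop-in rule: "FcstSeed" and "Actual" are placeholder scenario names, and helper calls such as api.Cons.IsLocalCurrencyForEntity should be verified against the API reference for your version):

    ' Sketch only: runs on the DESTINATION scenario, pulling the data in.
    ' Restrict to base entities in local currency so consolidation and
    ' translation can rebuild parent and translated values afterwards.
    If (Not api.Entity.HasChildren()) AndAlso api.Cons.IsLocalCurrencyForEntity() Then
        api.Data.Calculate("S#FcstSeed = RemoveZeros(S#Actual)")
    End If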
- Marco, 11 days ago, Contributor II
So, are you recommending that I use a standard calculate function without all the additional logic? I had disabled the 'Clear Calculated Data during calc' option, but it didn’t help much, as the data was not being cleared when running the formula.
- rhankey, 8 days ago, Contributor III
Your script is a little confusing to me.
What is the purpose of your first Calculate() statement, which appears to be zeroing out the data in the target scenario? Those zeros, and even worse if they were written as durable data, are going to get in the way of writing any subsequent values to those cells. Other than in a few isolated instances where the odd zero is required to overcome a derived-data problem, writing zeros to the cube is bad practice.
Then why do you subsequently clear what appears to be the same zeroed out data?
As Henning alludes to, the scenario setting ClearCalculatedDataDuringCalc should normally be True (I have similarly never seen a build where it was set to False).
Then I'd have a two-line BR (see the sketch at the end of this reply):
1. ClearCalculatedData() for all Calculated, Translated, Consolidated and Durable data in the current Data Unit.
2. Calculate() to copy the desired source data into the current Data Unit. I usually write the data as durable, but you need to make sure you are filtering out any data that may need to be recalculated in this Scenario. For filtering, you can add an If test to check which Data Unit is being processed, and/or use filters on the Calculate statement.
Run the above two lines for the destination Scenario/Time/Cons/Entities.
I see nothing in your current buffer handler that actually requires a buffer handler.
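As a rough sketch of those two lines (scenario names are placeholders, and the exact ClearCalculatedData and Calculate overloads and parameter order should be checked against the OneStream API reference for your version):

    ' Hypothetical seed rule, executed on the DESTINATION data unit.
    If api.Pov.Scenario.Name.Equals("FcstSeed", StringComparison.InvariantCultureIgnoreCase) Then
        ' 1) Clear calculated, translated, consolidated and durable data in this
        '    data unit so stale values (including stray zeros) are removed.
        '    (Parameter list shown here is illustrative - check the documented overloads.)
        api.Data.ClearCalculatedData(True, True, True, True)
        ' 2) Re-pull the source data, written as durable so it survives later
        '    calculation passes; RemoveZeros avoids writing zeros to the cube.
        api.Data.Calculate("S#FcstSeed = RemoveZeros(S#Actual)", True)
    End If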
- Henning, 7 days ago, Valued Contributor II
Yes, from what I see, you do not need any additional logic such as data buffers. Maybe just something to clear the durable data, as rhankey suggested.
You can use the scenario member formulas if you want the seeding to run during every single consolidation of the forecast scenario, or use a custom calculate to trigger the seeding e.g. via a workflow process step if the seeding should only happen on demand. There are more options, but these two approaches are probably the most common ones.
Also, try to use Api instead of BRApi where possible for better performance. For example, I assume your TimeName string can be set with Api.Pov.Time.Name.
For everything else, more detailed requirements are necessary.
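For instance (just a sketch, with placeholder variable names), inside a finance business rule the POV members are already available on the api object, so there is no need for extra BRApi lookups:

    Dim timeName As String = api.Pov.Time.Name          ' time member of the current data unit
    Dim scenarioName As String = api.Pov.Scenario.Name  ' scenario member of the current data unit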