Change data buffer based on time
Hello,
The client has a calculation in the planning process that uses prior-period values to derive the current month's values. Here is the data buffer I created:
Dim OpCostsDataBuffer As DataBuffer = api.Data.GetDataBufferUsingFormula("FilterMembers(RemoveZeros(F#Top:O#Top:T#POVPrior1:I#Top:U5#None:U6#None:U7#Top:U8#Top),A#Op_Costs.Base,U1#Top.Base,U2#Top.Base)")
Then the client had another request: if the calculation is for the first month (M1), take a three-month average, and for the subsequent months (M2 – M12) revert to the original logic of trending on the prior month (POVPrior1).
I added this logic with an If statement:
Dim OpCostsDataBuffer As DataBuffer = New DataBuffer()
If POVPERIODNUM = 1 Then
OpCostsDataBuffer = api.Data.GetDataBufferUsingFormula("FilterMembers(RemoveZeros(F#Top:O#Top:T#" & Year1 & "Q4:I#Top:U5#None:U6#None:U7#Top:U8#Top),A#Op_Costs.Base,U1#Top.Base,U2#Top.Base)") / 3
Else
OpCostsDataBuffer = api.Data.GetDataBufferUsingFormula("FilterMembers(RemoveZeros(F#Top:O#Top:T#POVPrior1:I#Top:U5#None:U6#None:U7#Top:U8#Top),A#Op_Costs.Base,U1#Top.Base, U2#Top.Base)")
End If
The formula works, but the consolidation time increased significantly. Is there another workaround for this?
Thanks!
- Don't create a new blank buffer; that's wasted work. Just declare it as "Dim OpCostsDataBuffer As DataBuffer = Nothing".
- Avoid constant-value calculations (" / 3 ") where you can; try leveraging the View dimension instead, which has built-in averages computed as fast as they can be. If you add "V#MonthAvg" to your buffer definition, you should get the right numbers.
- If you really have to do math with constant numbers, do it inside the GetDataBufferUsingFormula clause. Yes, math on buffers also works outside it, but there can be performance differences. GDBUF supports everything you can have on the right side of "=" in a Calculate, so you can do operations and so on there.
- You don't show us how POVPERIODNUM is defined; there might be some slowdown from a "bad" method used there (unlikely, but you never know).
- When benchmarking, always run the rule at least twice and discard the first run's time: the first run is still clearing out numbers from previous runs, so it may not be reflective of the new rule.
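Putting the first three points together, the rule might be restructured roughly like this. This is a sketch only: it assumes the standard OneStream finance rule context where `api` is in scope, reuses `Year1` and `POVPERIODNUM` from the original rule, and assumes V#MonthAvg averages the quarter's months as described above.

```vb
' Declare without allocating an empty buffer; both branches assign it anyway.
Dim OpCostsDataBuffer As DataBuffer = Nothing

If POVPERIODNUM = 1 Then
    ' M1: three-month average via the View dimension instead of dividing by 3.
    OpCostsDataBuffer = api.Data.GetDataBufferUsingFormula( _
        "FilterMembers(RemoveZeros(V#MonthAvg:F#Top:O#Top:T#" & Year1 & "Q4:I#Top:U5#None:U6#None:U7#Top:U8#Top),A#Op_Costs.Base,U1#Top.Base,U2#Top.Base)")
Else
    ' M2-M12: original trend on the prior period.
    OpCostsDataBuffer = api.Data.GetDataBufferUsingFormula( _
        "FilterMembers(RemoveZeros(F#Top:O#Top:T#POVPrior1:I#Top:U5#None:U6#None:U7#Top:U8#Top),A#Op_Costs.Base,U1#Top.Base,U2#Top.Base)")
End If
```

If the View-dimension average turns out not to fit, the fallback per the third point is to move the division inside the formula string (appending " / 3" after the FilterMembers clause) rather than dividing the returned buffer.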
Some of the slowdown is likely inevitable, for the simple reason that data is stored as YTD; so when you look at quarters, you're doing more work just to sum things up. Following the advice above should help you minimize the extra time, but it will likely never be as fast as it was before. You'll have to find gains elsewhere (e.g. executing only on specific Consolidation members, etc).
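As an example of scoping by Consolidation member, a guard at the top of the rule might look like the following. This is hypothetical: the member name "Local" and the exact POV property are assumptions, so check both against your application.

```vb
' Hypothetical guard: skip the trend logic except on the Local consolidation member.
If Not api.Pov.Cons.Name.Equals("Local", StringComparison.OrdinalIgnoreCase) Then
    Return Nothing
End If
```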