Forum Discussion

Arion
New Contributor II
2 days ago

Reduce data unit size

Good afternoon,

After one of our calculations started to run slowly, we were advised to reduce our data unit size.

Consultant's explanation: "The Real Cell Count is relatively low, at 5 million cells. But to aggregate those 5 million cells, the system has to request every monthly value for all 2.6 million rows from the database, hold in memory which ones have data, and keep track of the ones that don't."

From reading the forum, it seems the solutions are any (or all) of the following:

  • Leverage BI Blend to aggregate large data sets for reporting
  • Load into Cube at summarized level
  • Always use RemoveZeros() in api.Data.Calculate and api.Data.GetDataBuffer functions
  • Consider different Entity dimension
  • Vertical extensibility (not horizontal)
  • Consider storing some data outside Cube (stage)
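
For the RemoveZeros() bullet, here is a minimal sketch of what that can look like in a OneStream finance business rule (VB.NET). The account members (A#Revenue, A#Expenses, A#NetIncome) are made-up examples, and you should verify the exact API signatures against your platform version:

```vb
' Hedged sketch: wrapping the source expression in RemoveZeros()
' drops NoData/zero cells from the data buffer before the math runs,
' so the calculation does not write results into cells that were empty
' in the source - which is exactly what keeps the data unit from growing.
' Member names below are hypothetical.
api.Data.Calculate("A#NetIncome = RemoveZeros(A#Revenue) - RemoveZeros(A#Expenses)")

' Same idea when pulling a data buffer for scripted logic:
Dim buffer As DataBuffer = api.Data.GetDataBufferUsingFormula("RemoveZeros(Cube(A#Revenue))")
```

Without RemoveZeros, a calculation like this can materialize a stored (zero) cell for every intersection the source expression touches, even where no data exists, which inflates the row count the consultant described.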

I just wanted to check whether you have applied any other solutions in your projects, as this is the first time we have needed to reduce data unit size.

Thank you,


  • rhankey
    Contributor II

    All of those points can reduce data unit size and, in turn, improve consolidation and reporting performance, and they should be carefully considered and incorporated at design and initial build time.  Some of those items are rather hard to fix after the fact.

    Your list doesn't mention splitting data into additional cubes.  Things like allocation rates or factors, for example, do not need to be stored in the cubes that consolidate up the entity hierarchy.

    Another item not on your list is to ensure calculations do not retain unnecessary detail.  For example, when computing, say, YTD net income in the balance sheet, you can usually strip off all UD detail, and sometimes even IC detail, as users can drill back to that detail if they need to see it.  This can strip out a shocking percentage of data.
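
    To illustrate that last point, a hedged sketch in a OneStream finance business rule (VB.NET); the account and UD member names are hypothetical, and you should confirm the destination-member syntax against your version's documentation:

    ```vb
    ' Hedged sketch: writing the result to the None member on each
    ' user-defined dimension (and IC) collapses that detail at
    ' calculation time, so only one stored row per entity/account
    ' remains instead of one per UD/IC combination. Users can still
    ' drill back to the detailed source accounts when needed.
    ' All member names here are made up.
    api.Data.Calculate("A#YTDNetIncome:U1#None:U2#None:U3#None:I#None = RemoveZeros(A#NetIncome)")
    ```

    The design trade-off is that the summarized destination cannot be reported by UD, so this only works for results (like a retained-earnings roll) where nobody needs that slice.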