Forum Discussion

FrankDK
Contributor
11 months ago

Extended Analytics export

hi folks! 

I'm seeking advice on overcoming the obstacle of exporting data from OneStream (OS) to make it accessible to a Business Intelligence (BI) solution. Currently, my goal is to export data into a STAR schema within an external database. While I know this task is achievable, I'm curious to learn if anyone has leveraged the OneStream WebAPI or any other innovative methods to retrieve data instead. Would appreciate insights or experiences shared on this matter!

Cheers Frank

  • RobbSalzmann
    Valued Contributor II

    Hi FrankDK 

    Here are some ideas in the form of an outline of the steps needed to create an "extended analytics" star schema like HFM has 🙂  :

    1. Choose the dimensions you wish to use in the star schema
    2. Create a fact table with columns:
      1. One column per exported dimension, holding that dimension's member IDs
      2. Twelve data columns, one for each period
    3. Create a dimension table for each exported dimension. The ID column in each dimension table is the primary key, referenced by the matching dimension column in the fact table. This is the "star": dimensions at the points, facts in the center
    4. Choose a default member from each dimension not exported to use in the member filter script; I'm thinking export with FDX
      1. Create a buffer to hold the fact data, with a MemberInfo from each dimension associated with each data value.
    5. Export the Dimensions to the dimension tables
    6. Using the buffer, export the data by member id for each dimension into the fact table.

    Voilà! You have an extended analytics star schema for BI analysis. I think everything here can be coded in an Extender Rule.
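    Robb's outline above can be sketched in plain SQL. Here's a minimal, self-contained illustration using Python's built-in sqlite3 (all table, column, and member names are invented for this example; a real export would target your external BI database and use your own dimension choices):

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # One dimension table per exported dimension; the member ID is the primary key.
    cur.execute("""CREATE TABLE DimEntity (
        EntityId INTEGER PRIMARY KEY, Name TEXT, Description TEXT)""")
    cur.execute("""CREATE TABLE DimAccount (
        AccountId INTEGER PRIMARY KEY, Name TEXT, Description TEXT)""")

    # Fact table: one ID column per exported dimension plus 12 period columns.
    cur.execute("""CREATE TABLE FactData (
        EntityId INTEGER REFERENCES DimEntity(EntityId),
        AccountId INTEGER REFERENCES DimAccount(AccountId),
        M1 REAL, M2 REAL, M3 REAL, M4 REAL, M5 REAL, M6 REAL,
        M7 REAL, M8 REAL, M9 REAL, M10 REAL, M11 REAL, M12 REAL)""")

    # Load a few sample members and one fact row (steps 5 and 6 of the outline).
    cur.execute("INSERT INTO DimEntity VALUES (100, 'E100', 'Denmark')")
    cur.execute("INSERT INTO DimAccount VALUES (500, 'A500', 'Net Sales')")
    cur.execute("INSERT INTO FactData VALUES (100, 500, "
                + ", ".join(["10.0"] * 12) + ")")

    # The "star" join: dimensions at the points, facts in the center.
    row = cur.execute("""
        SELECT e.Description, a.Description, f.M1
        FROM FactData f
        JOIN DimEntity  e ON e.EntityId  = f.EntityId
        JOIN DimAccount a ON a.AccountId = f.AccountId""").fetchone()
    print(row)  # → ('Denmark', 'Net Sales', 10.0)
    ```

    A BI tool pointed at these tables would typically model the same joins itself; the SELECT is just there to show the schema works.
    
    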

    • FrankDK
      Contributor

      hi Robb, thanks for your input, which aligns pretty much with the way we have planned to build and run the process. But do you have any practical experience with how to manage/control/configure what data should be exported to the fact table? A defined Cube View you extract from using FDX?

      Also, I was on the hunt for other solutions and/or experiences where a PULL mechanism was used rather than this PUSH process; don't know if you can chip in here?

      Cheers Frank

  • JackLacava
    Honored Contributor

    BiBlend has a "Create Star Schema" option that can be leveraged for this. The procedure would roughly be:

    1. Create a Data Source Connector that will retrieve the data you need. This will require a bit of coding, but if you know FDX already it should be pretty straightforward.
    2. Set up a "blank" set of Transformation rules that will just pass members as they are (see the note on parents further down).
    3. Create a Workflow Profile of type Import, set the Workflow Name to something containing "Blend", connect it to the Data Source and Transformation rules defined earlier, then configure the BiBlend properties, setting the Create Star Schema option to True; and execute it.
    4. Query the resulting database View (name starts with vBIB) in the external database, or write your own SQL joins over the resulting tables.

    This could then be executed periodically with a DM job that runs a bit of very simple code.

    The only drawback of this approach is that parent values might look different from the cube (no business rules, no aggregation weight or %Cons, simpler translation -> different numbers), and that BiBlend can only accept base-level data. This can be solved by mapping parents to "dummy" base members, but then you will need to do some postprocessing of results - i.e. run some SQL that pulls parent values from the dummy records and saves them back where they belong. (mhh, this is actually shaping up to be an idea for a blog post...)
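    To make that postprocessing idea concrete, here's a hedged sketch in Python/sqlite3 (all names are invented): dummy base members carry a reference to the real parent they stand in for, and a single correlated UPDATE re-points their fact records at the actual parent afterwards.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # Hypothetical dimension table: dummy base members record which
    # real parent member they stand in for (NULL for ordinary members).
    cur.execute("""CREATE TABLE DimAccount (
        AccountId INTEGER PRIMARY KEY, Name TEXT, ParentOfDummy INTEGER)""")
    cur.execute("CREATE TABLE FactData (AccountId INTEGER, M1 REAL)")

    cur.executemany("INSERT INTO DimAccount VALUES (?, ?, ?)", [
        (10,  "TotalSales", None),       # real parent member
        (910, "TotalSales_Base", 10),    # dummy base member mapped to it
        (11,  "Sales_DK", None),         # ordinary base member
    ])
    cur.executemany("INSERT INTO FactData VALUES (?, ?)", [
        (910, 250.0),   # parent value, loaded via the dummy base member
        (11,  100.0),
    ])

    # Postprocessing: move dummy-member records back where they belong,
    # i.e. re-point them at the real parent member.
    cur.execute("""UPDATE FactData SET AccountId = (
            SELECT d.ParentOfDummy FROM DimAccount d
            WHERE d.AccountId = FactData.AccountId)
        WHERE AccountId IN (
            SELECT AccountId FROM DimAccount
            WHERE ParentOfDummy IS NOT NULL)""")

    rows = cur.execute(
        "SELECT AccountId, M1 FROM FactData ORDER BY AccountId").fetchall()
    print(rows)  # → [(10, 250.0), (11, 100.0)]
    ```

    In a real external database you would run the equivalent UPDATE directly in SQL after each BiBlend load; sqlite3 is only used here to keep the example runnable.
    
    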

    As for the pull/push thing: the OneStream REST API has methods to kick off DM jobs and to execute Dashboard Data Adapters; what you choose depends on your needs. You will have to be familiar with coding outside of OneStream to invoke it. The main obstacle is having a proper SSO-enabled environment - not a problem if you're on the OneStream Cloud, but potentially challenging if you're on-prem. One thing to keep in mind is that HTTP calls can be flaky; if you plan to export hundreds of megabytes with each call, it would probably be better to do the export with a DM job, rather than forcing everything to go through a web call. You can trigger the job with a web call anyway. The REST guide has all the available endpoints, is there anything in particular that you're not clear about...?
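    For the pull side, the external client is ordinary HTTP code. Below is a sketch using only the Python standard library that builds (but does not send) a request to trigger a DM job; the endpoint path, payload field names, tenant URL, and job names are all placeholders - check the REST API guide for your release for the real ones, and your SSO setup for how to obtain the bearer token.

    ```python
    import json
    from urllib import request

    # Placeholder tenant URL - replace with your own environment.
    BASE_URL = "https://yourtenant.onestreamcloud.com/OneStreamApi"

    def build_dm_request(app: str, sequence: str, token: str) -> request.Request:
        """Build (but don't send) an HTTP request that triggers a DM job.

        The path and JSON field names below are illustrative placeholders,
        not the documented OneStream endpoint - consult the REST guide.
        """
        payload = json.dumps({
            "ApplicationName": app,
            "SequenceName": sequence,   # placeholder field name
        }).encode("utf-8")
        return request.Request(
            url=f"{BASE_URL}/api/DataManagement/ExecuteSequence",  # placeholder
            data=payload,
            headers={
                "Authorization": f"Bearer {token}",  # SSO / API token
                "Content-Type": "application/json",
            },
            method="POST",
        )

    req = build_dm_request("GolfStream", "Export_StarSchema", "<token>")
    print(req.get_method(), req.full_url)
    # A real client would then call urllib.request.urlopen(req) and poll
    # for job completion before reading the exported tables.
    ```

    This fits Jack's advice above: the web call only kicks off the DM job, and the heavy data movement happens server-side rather than through the HTTP response.
    
    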

    There is also the Smart Integration Connector (SIC), which could be an option if you're on a recent release; but I'm not entirely familiar with it yet, and I'm not 100% sure it applies here. SIC is mostly meant to pull data into OneStream rather than extract it, as far as I know.