Forum Discussion

sahilp
New Contributor II
3 years ago

Automating WF Execution

Hi all,

I have a connector rule set up against an external source table, and it works fine for pulling data into stage. I would now like to automate the execution of the workflow as well, so that the data in the cube is refreshed nightly. I found the API below, but it looks to me like it's meant to load a flat file into stage via an extensibility rule, not data from an external table. I'm not sure how to connect it to the connector/import step I want to automate. Any thoughts?

 

' Resolve the workflow time member name from the current workflow unit's time key
Dim WFTime As String = TimeDimHelper.GetNameFromId(api.WorkflowUnitPk.TimeKey)
' Harvest flat files from the batch folder and process them through the "Actual" scenario workflows for that period
Dim results As WorkflowBatchFileCollection = BRApi.Utilities.ExecuteFileHarvestBatch(si, "Actual", WFTime, True, True, True, True, False, False, False)

  • Cosimo
    Contributor II

    It is certainly possible to automate the import/validate/load of data from an external source. Here are the high-level steps for the setup:

    1) Set up a Data Management job that will execute an Extender Business Rule. You will need to pass a few parameters to the business rule, such as the workflow profile you want to process and the period.

    2) In the extender business rule, you will call the following functions:

    BRApi.Import.Process.ExecuteParseAndTransform <-- executes the workflow import step

    BRApi.Import.Process.ValidateTransformation <-- executes the mapping validation step

    BRApi.Import.Process.ValidateIntersections <-- executes the validate intersections step

    BRApi.Import.Process.LoadCube <-- loads the data to the cube

    BRApi.DataQuality.Process.ExecuteProcessCube <-- runs the workflow calculation, if your workflow profile has this step enabled

     

    Here is how I've got my ExecuteParseAndTransform call set up on my end:

    Dim ImportInfo As LoadTransformProcessInfo = BRApi.Import.Process.ExecuteParseAndTransform(si, wFClusterPk, "", Nothing, TransformLoadMethodTypes.Replace, SourceDataOriginTypes.FromDirectConnection, False)

     

    You will need to determine the wfClusterPk based on the parameters you've passed in from the Data Management step. The Connector rule used by the Data Source will also need to know which workflow profile, scenario and period you are trying to process.

    The result of the import (successful or failed) is stored in the ImportInfo variable. We then check to see if the import was successful and without errors before proceeding to the validation steps:

        If ImportInfo.Status = WorkflowStatusTypes.Completed Then
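
    To tie this together, here is a minimal sketch of what the Extender rule body could look like. The parameter names (WFProfile, Scenario, TimePeriod), the use of args.NameValuePairs for the Data Management step parameters, and the exact argument lists of GetWorkflowUnitClusterPk, ValidateTransformation, ValidateIntersections, LoadCube and ExecuteProcessCube are assumptions to verify against the BRApi reference for your OneStream version; only the ExecuteParseAndTransform call is taken verbatim from above.

    Namespace OneStream.BusinessRule.Extender.AutomatedImport
        Public Class MainClass
            Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
                Try
                    ' Hypothetical parameter names passed from the Data Management "Execute Business Rule" step
                    Dim wfProfileName As String = args.NameValuePairs.XFGetValue("WFProfile", "")
                    Dim scenarioName As String = args.NameValuePairs.XFGetValue("Scenario", "Actual")
                    Dim timeName As String = args.NameValuePairs.XFGetValue("TimePeriod", "")

                    ' Resolve the workflow unit cluster PK for that profile/scenario/period (helper shown is an assumption)
                    Dim wfClusterPk As WorkflowUnitClusterPk = BRApi.Workflow.General.GetWorkflowUnitClusterPk(si, wfProfileName, scenarioName, timeName)

                    ' Import from the direct connection, replacing the previously staged data
                    Dim importInfo As LoadTransformProcessInfo = BRApi.Import.Process.ExecuteParseAndTransform(si, wfClusterPk, "", Nothing, TransformLoadMethodTypes.Replace, SourceDataOriginTypes.FromDirectConnection, False)

                    If importInfo.Status = WorkflowStatusTypes.Completed Then
                        ' Validate mappings and intersections, then load to the cube (argument lists are assumptions)
                        BRApi.Import.Process.ValidateTransformation(si, wfClusterPk)
                        BRApi.Import.Process.ValidateIntersections(si, wfClusterPk)
                        BRApi.Import.Process.LoadCube(si, wfClusterPk)
                        ' Optionally run the workflow calculation step if the profile has one
                        BRApi.DataQuality.Process.ExecuteProcessCube(si, wfClusterPk, True)
                    Else
                        BRApi.ErrorLog.LogMessage(si, "Automated import did not complete for " & wfProfileName & " " & timeName)
                    End If

                    Return Nothing
                Catch ex As Exception
                    Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
                End Try
            End Function
        End Class
    End Namespace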

     

    Hopefully this provides the gist of how to get started down the import automation path!

     

     

    • sahilp
      New Contributor II

      Cosimo, thanks that worked perfectly!

    • SaraA
      New Contributor II

      Update: I found the issue was unrelated to this function.

      ---------

      What parameters do you pass to BRApi.Import.Process.ExecuteParseAndTransform, please? I'm using this function, but it doesn't seem to bring the data in correctly from the source system (for example, numbers entered in the source system in the last few hours are not brought over to OneStream unless I run a manual import). It also only runs if I clear the stage data first when the validation fails.

      Thank you.

  • Cosimo
    Contributor II

    At first, we used the Global Time period to determine the period to process, but then switched to using the System time. My scheduled tasks are set up with the following parameters:

     

    The daysBefore parameter indicates the number of days before the start of a new month that the automation can begin, while daysAfter indicates the number of days after the start of the month that we keep running the process. The excludeWeekend true/false flag tells the business rule whether we should run the process over weekends.

     

    The Extender rule then figures out which workflow period to run based on the current date and whether that date falls within the daysBefore/daysAfter window. The daysBefore/daysAfter logic is optional for determining the workflow period; if you're starting imports on the first day of every month, the workflow period can readily be determined as:

    Dim curDate As Date = Date.Now

    Dim wfPeriod As String = curDate.Year & "M" & curDate.Month ' e.g. "2023M1" for January 2023

     

    Note that I write the wfPeriod to a dashboard parameter, which is then read from within the Connector rule so that the SQL runs against the current wfPeriod.
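
    For illustration, here is a minimal sketch of how that daysBefore/daysAfter window check and the period derivation could look inside the Extender rule's Main function. The parameter values, the reading of the window (target the upcoming month during the daysBefore lead-in, otherwise the month that just started), and the SetLiteralParameterValue call used to write the dashboard parameter are assumptions rather than the exact rule from this thread:

    ' Assumed scheduling parameters (in practice these would come from the Data Management step)
    Dim daysBefore As Integer = 3        ' may start up to 3 days before the new month
    Dim daysAfter As Integer = 10        ' keep running through the 10th of the month
    Dim excludeWeekend As Boolean = True ' skip Saturdays and Sundays

    Dim curDate As Date = Date.Now
    Dim isWeekend As Boolean = (curDate.DayOfWeek = DayOfWeek.Saturday) OrElse (curDate.DayOfWeek = DayOfWeek.Sunday)

    If Not (excludeWeekend AndAlso isWeekend) Then
        Dim targetMonth As Date
        If curDate.AddDays(daysBefore).Month <> curDate.Month Then
            targetMonth = curDate.AddMonths(1)   ' lead-in window: target the upcoming month
        ElseIf curDate.Day <= daysAfter Then
            targetMonth = curDate                ' first daysAfter days: target the month that just started
        Else
            Return Nothing                       ' outside the window, so skip this scheduled run
        End If

        Dim wfPeriod As String = targetMonth.Year & "M" & targetMonth.Month   ' e.g. "2023M2"

        ' Write wfPeriod to the dashboard parameter that the Connector rule reads
        ' (literal parameter call and empty workspace ID are assumptions - adjust for your setup)
        BRApi.Dashboards.Parameters.SetLiteralParameterValue(si, False, Guid.Empty, "CurrentWFPeriod", wfPeriod)
    End If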

     

    Hope this helps!

     

  • sahilp
    New Contributor II

    Quick follow-up: is there a good way to automate the period selection? Or will someone have to update the time parameter each month?

    • BSK
      New Contributor III

      Sahil, I'd appreciate it if you can post the complete Extender rule logic. I'm new to business rules and I'm finding it difficult to write the logic from the inputs above.

      We also have a similar requirement to automate the data load process by triggering a WF instead of executing it manually.

       

      Thanks in advance.