Move data between scenarios - Slice data files on UD in export job

Jones
New Contributor III

Hey,

So I have built a flow to push data from one scenario to another by pressing a dashboard button, which starts a data management job that executes an extender business rule.

The extender business rule executes Import, Validate, and Load with the file created through the referenced data management Export Data step/sequence.

JonesHedlund_0-1693494472615.png

The Export Data step uses a literal parameter on UD1 that is set before pressing the dashboard button.

JonesHedlund_1-1693494756568.png

The UD1 selection above makes it possible to push/seed only a portion of the total data to the other scenario. Over time there will be up to 10 button executions to push all UD1 data to the target scenario.

 

Everything works fine for me except replacing data in the destination scenario/workflow. Currently I use Append in order to load all 10 UD1 areas into one import step in the destination workflow/scenario. If a user were to press the button for the same UD1 selection a second time, I would have the data twice.

What I would love to achieve is to replace data instead of appending, but only for the given UD1 set.

 

Has anyone managed to do something similar and could suggest possible solutions?

 

Thanks in advance!

 

 


5 REPLIES

db_pdx
Contributor III

Hi Jones: are you using the Import, Validate, Load steps because you need to retransform the data? If not, it will likely be significantly easier to manage the data copy with a Finance BR CustomCalculate that does an api.Data.Calculate to copy data between scenarios/cubes/members. You can pass your UD1 parameter directly into the api.Data.Calculate as a filter.
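For anyone reading along, a minimal sketch of what this could look like inside a Finance business rule run as a Custom Calculate against the target scenario. The parameter name ("UD1Sel"), the source scenario name, and the member script itself are illustrative assumptions, not verbatim from any application, so verify them against the OneStream API reference before use:

```
' Hedged sketch only - parameter and member names are assumptions.
Case Is = FinanceFunctionType.CustomCalculate
    ' UD1 member chosen on the dashboard, passed in as a rule parameter
    ' (the parameter name "UD1Sel" is hypothetical).
    Dim ud1 As String = args.CustomCalculateArgs.NameValuePairs.XFGetValue("UD1Sel")

    ' Copy the selected UD1 slice from the source scenario into the
    ' scenario this Custom Calculate runs against. Because calculated
    ' data is recalculated on each run, re-running the same UD1
    ' selection overwrites rather than doubles the data.
    api.Data.Calculate("A#All:U1#" & ud1 & " = S#SourceScenario:A#All:U1#" & ud1)
```

The key design point is that calculated data is owned by the calculation, so the "load twice and get double data" problem from the original question does not arise with this approach.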

EricOsmanski
Valued Contributor

In this case, you could utilize the Source ID field on the Data Source and create a different Source ID for each UD1 you are sending. This will allow you to use Replace, because OneStream will only replace the Source ID you are loading. Look at the "XFR_GetFileNameforSourceID" Parser BR in the GolfStream application for a sample rule.

EricOsmanski_0-1693574160329.png
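For readers without GolfStream at hand, the general shape of such a parser rule is sketched below. This is not the actual XFR_GetFileNameforSourceID source; the signature, types, and property names follow the generic OneStream parser business-rule skeleton and should all be treated as assumptions to check against the GolfStream sample:

```
Namespace OneStream.BusinessRule.Parser.UD1SourceID
    Public Class MainClass
        ' Assumed parser BR signature; verify against the GolfStream sample.
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, _
                             ByVal api As Transformer, ByVal args As ParserDimensionArgs) As Object
            ' Return the parsed field value (here, the row's UD1 member) so that
            ' each UD1 slice loads under its own Source ID, and Replace clears
            ' only that slice on reload.
            Return args.Value
        End Function
    End Class
End Namespace
```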

 

Jones
New Contributor III

Hey, and thanks for your reply. I did check that one out, but when looking at it in the Data Source I managed to accomplish what I wanted by setting the SourceID on the Data Source to UD1, as below.

JonesHedlund_0-1693909428717.png

 

Why would you do this using an import step? Isn't this going to be a bit easier using a Finance Rule? Maybe I'm missing something.

Jones
New Contributor III

Hey, valid question indeed. We try to keep the number of business rules as low as possible from a maintenance perspective; instead, all mapping now happens in Transformation Rules. The extender business rule being used is less than 100 lines and works great, to be honest.