Forum Discussion

kchampion
New Contributor II
3 years ago

Transaction Matching - Load/Match Data Daily

Hello!

We're implementing Transaction Matching, and we have a use case to load transactions and run matching on a daily basis. Our TXM scenario has 12 periods, but we're finding that once the data in the Import step loads to the data set and the transactions are matched, additional files cannot be loaded through the same import step, since the first data set now contains matched transactions.

 

Example:

Import transactions on May 1 into 2022M5 and run matching; the transactions are matched.
Import transactions on May 2 into 2022M5: error, since all transactions in the stage must first be unmatched before reloading.

 

Wondering how others handle this. Do you create a separate import step for each day of the month to load each daily file, then combine them all into one data set? Are there other options?

Thanks!

Kathryn


  • kchampion
    New Contributor II

    I figured this out: the Data Splitting WF Profile allows multiple files to be loaded to the same data set in the same period, and it does not raise errors when transactions have already been matched. It appears that an unlimited number of files can be loaded each period through the Data Splitting workflow profile.

    • EGalanzovsky
      New Contributor III

      Hello,

      The source ID is part of the key: WFTime + WFScenario + WFProfile + SourceID must be unique.

      So you can import any number of files into the same period, even without Data Splitting.
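As a rough sketch of that uniqueness rule (plain Python, not OneStream's API; the function name and values below are illustrative only), each import can be thought of as registering a composite key, and only an exact repeat of all four parts is rejected:

```python
# Sketch of the import key rule described above: the combination
# WFTime + WFScenario + WFProfile + SourceID must be unique.
# All names and values here are hypothetical, for illustration only.

loaded_keys = set()

def register_import(wf_time, wf_scenario, wf_profile, source_id):
    """Record an import; reject a duplicate composite key."""
    key = (wf_time, wf_scenario, wf_profile, source_id)
    if key in loaded_keys:
        raise ValueError(f"Duplicate import key: {key}")
    loaded_keys.add(key)

# Two daily files into the same period succeed because SourceID differs:
register_import("2022M5", "Actual", "TxnMatch", "File_20220501")
register_import("2022M5", "Actual", "TxnMatch", "File_20220502")

# Reloading with an already-used SourceID raises an error:
try:
    register_import("2022M5", "Actual", "TxnMatch", "File_20220501")
except ValueError as e:
    print(e)
```

The point of the sketch is that only the SourceID needs to change between daily loads; the period, scenario, and profile can all stay the same.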

      • kchampion
        New Contributor II

        Thank you, that helps greatly!  I parsed the file name with its datetime stamp as the source ID, and now the uniquely named files load into the same period. Thanks!!
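For anyone with a similar setup, here is a minimal sketch of deriving a unique SourceID from a timestamped file name. The file-name pattern and the ID prefix are assumptions for illustration, not a OneStream convention:

```python
import re
from pathlib import Path

def source_id_from_filename(path: str) -> str:
    """Derive a unique SourceID from a file name's datetime stamp.

    Assumes names like 'Transactions_20220502_093015.csv'
    (an illustrative pattern, not a OneStream requirement).
    """
    stem = Path(path).stem  # e.g. 'Transactions_20220502_093015'
    match = re.search(r"(\d{8}_\d{6})", stem)
    if not match:
        raise ValueError(f"No datetime stamp found in {path!r}")
    # Hypothetical prefix; each daily file yields a distinct SourceID,
    # so repeated imports into the same period do not collide.
    return f"DailyTxns_{match.group(1)}"

print(source_id_from_filename("Transactions_20220502_093015.csv"))
```

Because every daily extract carries a fresh timestamp, the derived SourceID is unique per file, which satisfies the WFTime + WFScenario + WFProfile + SourceID uniqueness rule mentioned above.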