Right now there is no way to have specific batch executions harvest from different folders. It would be great to have an additional parameter for the harvest subfolder, so this is a good one for ideastream.
In the meantime, the workaround I can think of is a semaphore solution:
- have different batch subfolders, one for each system (you can download them to another parent folder, like contents)
- have two DM sequences, one per system
- have one ER which performs the semaphore logic:
1) get current DM sequence name
2) check if the other DM sequence task is running
3) if it is, skip execution (red light)
4) if it is not (green light), move the files from the current source system's subfolder
5) execute harvest
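The semaphore logic above can be sketched in Python. This is only an illustration: `green_light` and `stage_files` are hypothetical helper names, and the check for the other running sequence would wrap whatever your instance's API exposes for listing running tasks.

```python
import shutil
from pathlib import Path


def green_light(running_sequences, other_seq):
    """Semaphore check: proceed only if the other DM sequence is idle.

    `running_sequences` would come from an API call listing running
    tasks (an assumption; adjust to your instance's API).
    """
    return other_seq not in running_sequences


def stage_files(source_subfolder: Path, harvest_folder: Path) -> int:
    """Move the current source system's files into the shared harvest
    folder, returning how many files were staged."""
    moved = 0
    for f in source_subfolder.glob("*"):
        shutil.move(str(f), harvest_folder / f.name)
        moved += 1
    return moved
```

With these helpers, the ER body reduces to: skip if `green_light(...)` is false, otherwise `stage_files(...)` and trigger the harvest.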
This way you make sure each sequence only harvests its own files. If you want more parallelism, you could move the batch harvest execution to an ER in a separate DM and have the semaphore ER execute that DM with the Queue API method.
You can also have a single DM job that takes the system as a parameter and updates the task description to include the system. You can do this through the API.
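A minimal sketch of that description update, building (but not sending) the API request. The endpoint path, field name, and task id are assumptions for illustration; check your product's API reference for the real ones.

```python
import json
import urllib.request


def tagged_description(base: str, system: str) -> str:
    """Embed the source system in the task description."""
    return f"{base} [system={system}]"


def build_update_request(base_url: str, task_id: str,
                         system: str) -> urllib.request.Request:
    """Build a PATCH request updating the task description.

    Hypothetical endpoint shape: PATCH {base_url}/tasks/{task_id}
    with a JSON body carrying the new description.
    """
    body = json.dumps(
        {"description": tagged_description("Batch harvest", system)}
    ).encode()
    return urllib.request.Request(
        f"{base_url}/tasks/{task_id}",
        data=body,
        method="PATCH",
        headers={"Content-Type": "application/json"},
    )
```

The DM job would call this once per run, passing the system parameter it received, so the task list shows which system each execution harvested.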