Exporting Data automatically to outside of OneStream
Hi, I am trying to automate a data extract process to send data from OneStream to another system, e.g. Anaplan. I have a Data Management job that exports the data to a CSV file on the file share. I now want to move that CSV from the file share to another location (my desktop, for example, or any other file storage location). Has anyone built a routine like that? If so, is there a business rule in OneStream that can be used for it? I haven't done this before, but others may have come across it, so I wondered what the best approach would be. Thanks! Tahir
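A common pattern is a small Extensibility rule, added as a second step in the same Data Management sequence, that copies the exported CSV with standard .NET file IO. The sketch below is an assumption-heavy example, not a confirmed recipe: both paths are placeholders for your environment. Note that business rules execute on the application server, so the target must be a location that server can reach (a network share or an SFTP staging folder), not a user's desktop.

```vbnet
' Hypothetical Extensibility rule: copy the exported CSV off the file share.
' Both paths are placeholders -- adjust to your environment.
Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, _
                     ByVal api As Object, ByVal args As ExtenderArgs) As Object
    Try
        Dim sourceFile As String = "\\fileshare\Applications\MyApp\DataManagement\Export\Extract.csv"
        Dim targetFile As String = "\\targetserver\Anaplan\Inbound\Extract.csv"
        ' Overwrite any previous extract at the target location.
        System.IO.File.Copy(sourceFile, targetFile, True)
        Return Nothing
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
```

Running the copy as a DM sequence step keeps the export and the hand-off in one schedulable job.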
Task Scheduler not executing tasks
I have scheduled a Data Management sequence to run every 120 minutes between a specified time range (7 am to 9 am), but the sequence is not being executed. No parameters are needed in the sequence, and I have confirmed that running it manually works without error. When I check the grid view in Task Scheduler, the "Next Start Date/Time" column is correct. However, when the system reaches that time, the task is not executed, the "Count" column remains zero, and the "Next Start Date/Time" column updates to 120 minutes later. I receive no system errors explaining why the task is not executing. I have confirmed that scheduling a one-time task, or even a daily task, works successfully. Any help on this would be great.
Automating WF Execution
Hi all, I have a connector rule set up against an external source table that works fine to pull data into stage, but I would like to automate execution of the workflow as well, so that the data in the cube is refreshed nightly. I found the API below, but it looks to me like it's meant to load a flat file into stage via an extensibility rule, not to pull from an external table. I'm not sure how to connect it to the connector/import step I want to automate. Any thoughts?

```vbnet
Dim WFTime As String = TimeDimHelper.GetNameFromId(api.WorkflowUnitPk.TimeKey)
Dim results As WorkflowBatchFileCollection = BRApi.Utilities.ExecuteFileHarvestBatch(si, "Actual", WFTime, True, True, True, True, False, False, False)
```
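For reference, this is roughly how that snippet gets wrapped so it can run unattended (e.g. from Task Scheduler via a Data Management "Execute Business Rule" step). Treat it as a sketch: outside a workflow there is no api.WorkflowUnitPk, so the period has to be supplied another way; the scenario and time names below are placeholders, and whether ExecuteFileHarvestBatch can drive a connector-based import, rather than only flat files, should be confirmed against the API documentation for your release.

```vbnet
' Hypothetical Extensibility rule wrapping the harvest call for scheduled execution.
Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, _
                     ByVal api As Object, ByVal args As ExtenderArgs) As Object
    Try
        Dim scenarioName As String = "Actual"
        ' No api.WorkflowUnitPk is available here, so supply the period explicitly
        ' (or read it from a substitution variable / DM parameter).
        Dim wfTime As String = "2023M1"
        Dim results As WorkflowBatchFileCollection = _
            BRApi.Utilities.ExecuteFileHarvestBatch(si, scenarioName, wfTime, _
                True, True, True, True, False, False, False)
        Return Nothing
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
```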
How do I delete redundant files from System / File Explorer / File Share when the Delete option is greyed out?
Hi, I have a number of files in my export folder that were created by a Data Management step. However, even though I have system admin rights, the option to delete any or all of the files is greyed out. How do I delete them when I have finished downloading them? And how do I then delete the 'date' folder as well? Thanks, Mark
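When the UI won't allow the delete, one workaround is a short Extensibility rule that removes the files server-side with standard .NET IO. This sketch assumes the export folder is a physical path on the file share that the application server can reach; the path below is a placeholder.

```vbnet
' Hypothetical Extensibility rule: clean up a dated export folder.
Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, _
                     ByVal api As Object, ByVal args As ExtenderArgs) As Object
    Try
        ' Placeholder path: the dated export folder created by the DM step.
        Dim exportFolder As String = "\\fileshare\Applications\MyApp\DataManagement\Export\20240101"
        ' Delete every file in the folder.
        For Each filePath As String In System.IO.Directory.GetFiles(exportFolder)
            System.IO.File.Delete(filePath)
        Next
        ' Remove the now-empty 'date' folder as well.
        System.IO.Directory.Delete(exportFolder)
        Return Nothing
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
```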
Execute Import > Validate > Process using a button on a dashboard
Hi, I have a dashboard that executes an Import > Validate > Process using a business rule. In my workflow I am currently using Workspace, Import, Validate, Process, Confirm. I would like to use just Workspace and Confirm, but that gives me the following error: "Cannot execute step because the specified step classification doesn't exist for the workflow profile." Is there any way around this? Thanks, Mark
Task Scheduler
SOURCE: ONESTREAM CHAMPIONS
Hi all! We just upgraded to 6.4 last week and have migrated our data management jobs, which previously ran via PowerShell script, onto Task Scheduler. A few questions: Do you set these jobs up with a specific person's user name? Do you know what happens if that user is later disabled? (I've submitted a OneStream support ticket for that last question.) If you don't set them up under a specific person's user name, how do you set them up? We have an "Administrator" user and an "OSAutomation" user that I'm considering using to keep these automated jobs separate from my regular OneStream activity. Any insight would be appreciated, as I'm trying to gauge how other users have set up Task Scheduler while also asking OneStream support for their best practices. Thanks! -Nicole
SFTP by Workflow
Hello! I'm trying to set up a few workflows that individually kick off an SFTP load, but only for that given workflow. I created a dashboard button that the user clicks to run the SFTP data management job, but it currently loads the files for all workflows set up for SFTP. I'm trying to find a business rule approach that doesn't use the batch file load. Are there any examples of loading only the file for the workflow that was selected? Thank you!
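One hedged idea: instead of abandoning the batch load, stage only the selected workflow's file before running it, so other workflows' SFTP files are untouched. In the sketch below, the folder layout, the "WFProfile" dashboard parameter, and the file-naming convention (files prefixed with the workflow profile name) are all assumptions to adapt.

```vbnet
' Hypothetical Dashboard Extender rule: copy only the selected workflow's
' files into the harvest folder before the load runs.
Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, _
                     ByVal api As Object, ByVal args As DashboardExtenderArgs) As Object
    Try
        ' "WFProfile" is a placeholder dashboard parameter holding the profile name.
        Dim wfProfileName As String = args.NameValuePairs.XFGetValue("WFProfile", String.Empty)
        Dim sftpFolder As String = "\\fileshare\Inbound\SFTP"
        Dim harvestFolder As String = "\\fileshare\Applications\MyApp\Batch\Harvest"
        ' Copy only files whose names start with this workflow profile's name.
        For Each filePath As String In System.IO.Directory.GetFiles(sftpFolder, wfProfileName & "*.csv")
            Dim fileName As String = System.IO.Path.GetFileName(filePath)
            System.IO.File.Copy(filePath, System.IO.Path.Combine(harvestFolder, fileName), True)
        Next
        Return Nothing
    Catch ex As Exception
        Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
    End Try
End Function
```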
Clearing data for a very specific intersection / Data Unit
Hi all, I'm trying to delete an unnecessary member from the metadata, but it won't delete due to 'existing' data. There isn't actually a value where it says one exists, but OneStream believes there is data there in the non-local currencies. I've tried the Data Management Clear Data job, but it only seems to honor the Cube, Entity, Time, and Scenario filters; it appears to ignore any other dimension specifications I include (i.e. additional UD members), which makes it too broad for our requirements. I tested this, and even though the DM job succeeded for that cube/entity/scenario/time period, the system still thinks there is data there and won't allow the member to be deleted. Hoping someone out there has run into a similar situation and has suggestions on how to truly clear a member completely so it can be removed from the metadata. Thanks
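One suggestion, offered as an assumption to verify rather than a confirmed fix: a Custom Calculate step in a Finance business rule, invoked from a Data Management job, can target a narrower slice than the Clear Data step, because the data unit (Cube/Entity/Scenario/Time) comes from the step's settings while a member filter narrows the rest. The member filter below is a placeholder, the meaning of the boolean flags on ClearCalculatedData should be checked against the API documentation for your release, and a re-translate/re-consolidate may still be needed before the non-local currency values disappear.

```vbnet
' Hypothetical Custom Calculate snippet inside a Finance business rule.
Case Is = FinanceFunctionType.CustomCalculate
    ' The data unit (Cube/Entity/Scenario/Time) comes from the DM step that
    ' invokes this rule; the filter narrows it to the member being removed.
    ' "U1#MemberToDelete" is a placeholder member script.
    api.Data.ClearCalculatedData("U1#MemberToDelete", True, True, True)
```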