Task Scheduler not executing tasks
I have scheduled a data management sequence to run every 120 minutes within a specified time range (7 am to 9 am), but the sequence is not being executed. The sequence takes no parameters, and I have confirmed that running it manually works without error. In the Task Scheduler grid view, the "Next Start Date/Time" column shows the correct value. However, when the system reaches that time, the task is not executed: the "Count" column remains at zero and the "Next Start Date/Time" column simply advances by 120 minutes. I receive no system errors explaining why the task is not executing. I have confirmed that scheduling a one-time task, or even a daily task, works successfully. Any help on this would be great.
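One way to narrow this down is to point the scheduled task at a minimal Extensibility (Extender) rule that does nothing but write to the error log: if log entries appear on schedule, the scheduler itself is firing and the problem is specific to the sequence. A minimal sketch along the lines of the standard Extender template; the rule name and log message are illustrative:

```vb
Imports System
Imports OneStream.Shared.Common
Imports OneStream.Shared.Engine

Namespace OneStream.BusinessRule.Extender.TaskSchedulerProbe
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
            Try
                ' Leave a timestamped trace so every scheduled execution is visible in the error log.
                BRApi.ErrorLog.LogMessage(si, "TaskSchedulerProbe fired at " & DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss"))
                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```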
Clear data using level 1 data unit (rather than level 2) on data load

Hi all, I was wondering whether it is possible to override the default 'clear and load' setting on a data load step from 'Level 2: Workflow Data Unit' to 'Level 1: Cube Data Unit'.

To give a bit of context: we perform a data copy from a Budget scenario (using a Data Management sequence) to a Forecast scenario to give divisions a starting data position when they capture their forecast data. Once the data copy is complete, each division extracts an Excel Fixed File Template, makes its changes and additions over the copied Budget data, and imports that file via its Workflow to the cube. However, the only way to remove an account is to load zeros to it, because the load clears by account and not by entity.

Is it possible to change the load so that it deletes all accounts?

Thanks, Mark
Errors creating New Task in Data Import Schedule Manager

I am getting several errors when setting up a New Task in the Data Import Schedule Manager. We are on OS version 9.01.17403, and I have installed the latest version of DSM, which is 8.4.0_SV100. I suspect this may be a version compatibility issue, so I am curious whether anyone has been able to get this solution to work in a 9.01 application. I have already uninstalled and reinstalled the solution, which didn't resolve the issues. Below are the two errors I am seeing:

1. When choosing Global Scenario from the Scenario(s) drop-down list, I get an immediate error: "Error processing member. The item was not found. Member, 11111111." The details state it was unable to execute Business Rule 'DSM_Paramhelper', where it appears to be trying to resolve the Global Scenario via OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.GetParameterDisplayInfosUsingDashboardNameCompressed(SessionInfo si, LoadDashboardInfo loadDashboardInfo, Boolean isForDashboardUIWithInteractiveComponents, Dictionary`2 custSubstVarsAlreadyResolved).

2. If I pick a specific Scenario instead, I get a different error. It allows me to pick the Scenario and Time, but when I save the Task I get: "The input string '' was not in a correct format." The error details point to Business Rule 'DSM_SolutionHelper' (Conversion from string "" to type 'Double' is not valid) in OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.StartExecuteSelectionChangedServerTaskCompressed(SessionInfo si, Boolean isSystemLevel, Guid primaryDashboardID, Guid embeddedDashboardID, Guid componentID, PageInstanceInfo pageInstanceInfo, XFSelectionChangedServerTaskInfo serverTaskInfo).

Any advice on how to correct these issues would be greatly appreciated.
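As an aside, the second error is the classic VB.NET failure mode when an empty string reaches a narrowing conversion such as CDbl. A minimal standalone sketch of the failing pattern and the usual defensive alternative (the variable names are illustrative, not taken from the DSM source):

```vb
Imports System

Module ConversionDemo
    Sub Main()
        Dim rawValue As String = "" ' e.g. a dashboard parameter that was never populated

        ' Failing pattern: CDbl("") throws
        ' "Conversion from string "" to type 'Double' is not valid."
        ' Dim interval As Double = CDbl(rawValue)

        ' Defensive alternative: TryParse falls back to a default instead of throwing.
        Dim interval As Double = 0
        If Not Double.TryParse(rawValue, interval) Then
            interval = 0 ' default when the string is empty or malformed
        End If
        Console.WriteLine("Parsed value: " & interval.ToString())
    End Sub
End Module
```

Until the solution is patched for 9.x, giving every Task field an explicit value rather than leaving it blank may avoid the empty-string code path.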
Shrink & reindex, and elastic pool usage

Hi, I have recently reduced a PROD data table by several million rows, but the Allocated table size has remained largely unchanged: the reduction in the Used portion has simply shifted to an increase in the Unused portion, and the elastic pool usage has also stayed the same.

I'm assuming that if I run an app copy of PROD into DEV, OneStream will try to copy over the entire elastic pool size rather than differentiating between what is Used and Unused and copying only the Used portion.

A shrink and reindex has been suggested as a way to reduce the Unused portion, but I have been advised that running a shrink is risky because it can cause performance issues. I therefore wanted to check: is a shrink and reindex in the PROD environment, at least during monthly maintenance, widely used and generally beneficial rather than detrimental?
Exporting Data automatically to outside of OneStream

Hi, I am trying to automate a data extract process to send data from OneStream to another system, e.g. Anaplan. I have a Data Management job to export the data to a CSV on the file share. I now want to move that CSV file from the file share to, say, my desktop (or any other file storage location). Has anyone built a routine like that? If so, is there a BR in OneStream that can be used for it? I haven't done this before, but I'm sure others have come across it, so I wondered what the best approach would be. Thanks! Tahir
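A Business Rule runs on the OneStream application server, so it can only write to locations that server can reach; in cloud environments the usual pattern is pushing the file to an sFTP or REST endpoint rather than a desktop. For an install where a target share is reachable from the server, a minimal Extender sketch could look like this (both folder paths and the rule name are assumptions to adapt to your environment):

```vb
Imports System
Imports System.IO
Imports OneStream.Shared.Common
Imports OneStream.Shared.Engine

Namespace OneStream.BusinessRule.Extender.CopyExportToTarget
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
            Try
                ' Assumed paths: where the DM export step writes, and where the file should go.
                Dim sourceFolder As String = "\\osserver\FileShare\Applications\MyApp\DataManagementExport"
                Dim targetFolder As String = "\\fileserver\AnaplanDropZone"

                ' Copy every CSV the export step produced, overwriting older copies.
                For Each sourceFile As String In Directory.GetFiles(sourceFolder, "*.csv")
                    Dim targetFile As String = Path.Combine(targetFolder, Path.GetFileName(sourceFile))
                    File.Copy(sourceFile, targetFile, True)
                Next

                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```

A rule like this could then be added as an Execute Business Rule step at the end of the same Data Management sequence that produces the CSV, so the export and the transfer run as one job.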
How to back up OneStream Data

We have a requirement to keep daily backups of our applications for a period of 15 days. OneStream Cloud offers point-in-time backups up to 7 days and only weekly backups beyond that period. I know how to back up artifacts/metadata in OneStream, but I am not sure how to back up data. Does anyone have ideas on the best way to create a data backup?
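If the fallback is exporting the data yourself (for example, a nightly Data Management export step writing CSV extracts), the 15-day retention could be handled by a small Extender rule that files each extract into a dated folder and prunes anything older. A minimal sketch under those assumptions; the paths and the retention constant are illustrative:

```vb
Imports System
Imports System.IO
Imports OneStream.Shared.Common
Imports OneStream.Shared.Engine

Namespace OneStream.BusinessRule.Extender.RotateDataBackups
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
            Try
                Dim exportFolder As String = "\\osserver\FileShare\Exports"      ' where the DM step writes
                Dim backupRoot As String = "\\backupserver\OneStreamDataBackups" ' retention area
                Dim retentionDays As Integer = 15

                ' File today's extracts into a yyyy-MM-dd subfolder.
                Dim todayFolder As String = Path.Combine(backupRoot, DateTime.Today.ToString("yyyy-MM-dd"))
                Directory.CreateDirectory(todayFolder)
                For Each f As String In Directory.GetFiles(exportFolder, "*.csv")
                    File.Copy(f, Path.Combine(todayFolder, Path.GetFileName(f)), True)
                Next

                ' Prune dated folders older than the retention window.
                For Each folder As String In Directory.GetDirectories(backupRoot)
                    Dim folderDate As DateTime
                    If DateTime.TryParse(Path.GetFileName(folder), folderDate) AndAlso folderDate < DateTime.Today.AddDays(-retentionDays) Then
                        Directory.Delete(folder, True)
                    End If
                Next

                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```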
How do I delete redundant files from System/File Explorer/file share when the Delete option is greyed out?

Hi, I have a number of files in my export folder that were created by a Data Management step. However, even though I have system admin rights, the option to delete any or all of these files is greyed out. How do I delete them once I have finished downloading them? And how do I then delete the 'date' folder as well?

Thanks, Mark
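If the UI keeps the Delete option greyed out, one possible workaround on installs where the file share's physical path is reachable is to remove the dated folder with plain System.IO calls from a small Extender rule (or any script running outside OneStream). A rough sketch; the path and rule name are assumptions, and whether this is appropriate for your environment is worth checking first:

```vb
Imports System
Imports System.IO
Imports OneStream.Shared.Common
Imports OneStream.Shared.Engine

Namespace OneStream.BusinessRule.Extender.PurgeExportFolder
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
            Try
                ' Assumed physical path of the dated export folder on the file share.
                Dim datedFolder As String = "\\osserver\FileShare\Applications\MyApp\DataManagementExport\20240131"

                ' Recursive delete removes the files and the folder itself.
                If Directory.Exists(datedFolder) Then
                    Directory.Delete(datedFolder, True)
                End If

                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```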
Data Management Global Time

Hi all, I thought this would be easy, but I'm not sure what I am missing. I have a Data Management job that runs off this parameter and works just fine: Scenario=[Actual], Time=[|GlobalTime|]. I am trying to get the prior month, so I tried Scenario=[Actual], Time=[|GlobalTimePrior1|], but that is an invalid substitution variable; I have also tried [|GlobalPrior1|] with no luck. Is there another syntax for this? I know you can do it with T# in cube views, but this is a DM job. Thanks in advance!
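I don't believe there is a built-in prior-period substitution variable for DM parameters; a common workaround is an XFBR String rule that takes |GlobalTime| and returns the prior period name, so the parameter becomes something like Time=[XFBR(TimeHelper, GetPriorPeriod, CurrentTime=|GlobalTime|)]. A minimal sketch, assuming period names follow the default yyyyM<n> pattern (e.g. 2024M3); the rule and function names are illustrative:

```vb
Imports System
Imports OneStream.Shared.Common

Namespace OneStream.BusinessRule.DashboardStringFunction.TimeHelper
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As DashboardStringFunctionArgs) As Object
            Try
                If args.FunctionName.Equals("GetPriorPeriod", StringComparison.InvariantCultureIgnoreCase) Then
                    ' e.g. "2024M3" -> year 2024, month 3 (assumes default period naming).
                    Dim currentTime As String = args.NameValuePairs("CurrentTime")
                    Dim parts() As String = currentTime.Split("M"c)
                    Dim yearPart As Integer = Integer.Parse(parts(0))
                    Dim monthPart As Integer = Integer.Parse(parts(1))

                    ' Step back one month, rolling into December of the prior year if needed.
                    monthPart -= 1
                    If monthPart = 0 Then
                        monthPart = 12
                        yearPart -= 1
                    End If
                    Return yearPart.ToString() & "M" & monthPart.ToString()
                End If
                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```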