Recent Discussions
Can we parameterize or set a value for the additional "options" for the Data Export step in DM?
Hello OneStream Experts, I was wondering if it is possible to set a value or retrieve a value (using XFGetValue) from user input for the "options" in the Data Export step of a Data Management job while developing an Extensibility Rule. The goal is to give end users control over Data Exports without granting them access to the Application tab. Below is the code I developed to execute the Data Management sequence. I noticed there is a list of options that can be leveraged; however, I have not been able to set any values for these options. Any help or suggestions would be greatly appreciated. Thanks!
NitishKrishB · 19 hours ago · New Contributor · 8 views · 0 likes · 0 comments

SIC Local Gateway Server Configuration
I have DEV and PROD OS environments. Is it possible to export the Gateway Configuration details from both OS environments and configure them on one single Local Gateway Server, or do I need a dedicated Local Gateway Server for each OS environment?
vmanojrc30 · 2 days ago · Contributor · 10 views · 0 likes · 0 comments

Audit History for Forms - CubeView drill down (v8.4)
I have drilled down to the bottom of both the Origin and UD6 dimensions, which shows the data was loaded by the corp accounting team. However, I cannot view the Audit History for Forms; it continues to show as greyed out. Does anyone know how to enable the audit history? Thank you.
allanaklepko · 6 days ago · New Contributor III · 29 views · 0 likes · 4 comments

Running Validate and Load Steps for Multiple Years
Hi everyone, We are uploading portions of our budgets in Excel templates across multiple years. The data is imported across all the years in one shot, but the Validate and Load steps need to be run year by year if imported through the workflows we have set up. Does anyone know if there is a way to execute the Validate and Load steps across multiple years through a rule? I know we can use the batch harvest folders, but we're hoping to keep the workflow steps as close to those used in other processes as possible. Thanks, James
Solved · James321 · 6 days ago · New Contributor · 49 views · 0 likes · 3 comments

Unable to unlock the Workflow channel
Hi Team, WhatIF_R&O_Adj is a Workflow Channel, and it is assigned under the "Financial_WeeklyProcess" - "Base_Input" profile (refer to the 2nd screenshot). When we try to unlock the channel, we get a pop-up saying "'mnuWFChannelLockWhatif_R&O_Adj' is not a valid value for property 'Name'.", and the channel is not listed under the WF channel list (please refer to the 3rd screenshot). Kindly help us resolve the issue. Thank you, Shiva Prasad.
ShivaPrasad · 6 days ago · New Contributor III · 17 views · 0 likes · 1 comment

Import data using multiple workflow profiles
Hi everyone, I have an issue understanding how to implement multiple import steps within a workflow. My client will load data through a classic import step throughout all the months of the year. In addition, they have asked to set up 3 additional import steps (P13, P14, P15) to be used in December that will serve as "adjustments" to the data previously loaded to the cube. At each import step, however, the data will be loaded in full (with eventual adjustments) and should overwrite the data loaded through previous import steps (this is why we use multiple import steps rather than forms). For example, when P13 is loaded, it should overwrite the data loaded through the Import step; when P14 is loaded, it should overwrite the data loaded through P13; and so on. The problem is that since we are using a different workflow profile at each step, the system does not automatically overwrite the data loaded through the previous workflow profile, so the data is simply added on top of the data loaded in previous steps. Does anyone know whether there is a way to overwrite data from previous loads when a new import step is used? One idea I had was to delete the data on the cube through SQL while executing the BR Connector rule, but I don't know which tables in the database hold the data I want to overwrite. Thanks in advance for any help, and if something is not clear, please ask; I'll be happy to clarify!
Solved · fc · 7 days ago · Contributor · 1.6K views · 0 likes · 7 comments

Orphan members - validation error required
Hello dear community members, Background: we have had situations where month-end workflow load validation errors were fixed with a UD1 mass upload. A few UD1 members ended up in Orphans, as the parent did not yet exist in OneStream at the member mass-upload step. The validation error disappeared and the workflow completed. However, this created an out-of-balance, and it was difficult to trace the issue. Questions/potential solutions where help is required: (1) Would it be possible for the month-end workflow load validation to not ignore Orphan members, so that we still get an error when members are created and end up in Orphans by mistake? (2) Is there a way to delete Orphan members with a BR? (3) Is there a way to stop the Metadata Builder XML file from creating Orphans when a parent is missing from OneStream? Thank you in advance for any idea/suggestion/tip/hint.
Solved · IrinaDragusanu · 7 days ago · Contributor · 60 views · 0 likes · 11 comments

Cube View Comments
Hello All, I have a cube view in which I am using a GetDataCell function at the row level, for example GetDataCell(Account1 + Account2):Name(Trade). At the column level I have a comment using V#Annotation:Name(Comments). However, on the GetDataCell row the comment cell is highlighted in green. Is there a way to allow the user to write a comment on a row that contains a GetDataCell function?
ZAH · 9 days ago · New Contributor · 3.5K views · 0 likes · 6 comments

composite mapping rule by period
Hello, I am new to OneStream and trying to solve something that seemed easy but doesn't work :( We need to map an account by entity and by time, so I used composite mapping. To be exact: A#[651200]:E#[8*]:T#[2019M12] has to be mapped to account A, and A#[651200]:E#[8*]:T#[2019M3] has to be mapped to account B; all other instances of A#651200 for all other entities are also mapped to account B. I have only composite mappings for this account. It doesn't work; it totally ignores the time dimension. Any tips? Thank you, Ola
OlaWidera · 9 days ago · New Contributor · 1.7K views · 0 likes · 4 comments

Is there guidance on the setting for the number of parallel executions for Harvest Batch Loads?
When using the BRApi.Utilities.ExecuteFileHarvestBatchParallel function in an Extender BR, does anyone have any guidance and/or opinions on the proper setting of the parallelBatchCount parameter? My understanding has always been to set it to at most the number of CPUs available, minus 1, on the server performing data management or batch load processing.
114 views · 0 likes · 2 comments
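The "CPU count minus one" rule of thumb described above can be expressed as a small helper. The sketch below is a language-agnostic illustration in Python (OneStream Extender BRs are actually written in VB.NET, where Environment.ProcessorCount would play the same role); the function name and the `reserve` parameter are hypothetical, not part of any OneStream API, and the value it returns is simply what you would pass as parallelBatchCount.

```python
import os

def suggested_parallel_batch_count(cpu_count=None, reserve=1):
    """Heuristic only: leave `reserve` CPUs free for the server's own
    work (web requests, queue management) and never go below 1."""
    if cpu_count is None:
        # Fall back to this machine's CPU count if none is supplied.
        cpu_count = os.cpu_count() or 1
    return max(1, cpu_count - reserve)

# For example, on an 8-CPU data management server:
print(suggested_parallel_batch_count(8))  # -> 7
```

Treat this as a starting point rather than a rule: the right value also depends on how I/O-bound the harvest files are and what else the server is doing, so it is worth load-testing before settling on a number.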