Best Practice to Clear Stage Data Globally and Cube Data for Selected Entities
Hi Team, I’m looking for guidance on the best approach to clear data in OneStream. My requirement is twofold:
- Clear all data from the Stage area
- For a selected set of entities, clear both Stage and Cube data
I’m particularly interested in understanding the most efficient and recommended way to handle this while ensuring data integrity and minimizing any unintended impact. Specifically, I would appreciate insights on:
- Recommended methods to clear Stage data globally (e.g., via Data Management steps, Business Rules, or utilities)
- The best approach to selectively clear Cube data for specific entities
- Any considerations around workflow locks, data unit locking, or audit/history implications
- Performance best practices and recommended sequence of operations
If anyone has implemented a similar requirement, I’d love to hear your approach or any lessons learned. Thanks in advance for your help!
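One pattern that has worked elsewhere is to keep the clears in Data Management (a Clear Data step for the global Stage clear, plus a second Clear Data step whose member filter is driven by an entity parameter for the Cube clear) and drive the entity loop from a small Extender rule. Below is a minimal sketch of that loop only; the sequence name (Clear_EntityData), the substitution variable (EntityName), the entity list, and the ExecuteDataMgmtSequence helper are all assumptions to verify against your release, not a definitive implementation.

```vb
Namespace OneStream.BusinessRule.Extender.ClearSelectedEntities
	Public Class MainClass
		Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
			Try
				' Hypothetical list of entities to clear; in practice this could come from a
				' parameter, a member list, or a staging table.
				Dim entitiesToClear As New List(Of String) From {"Entity_A", "Entity_B"}

				For Each entityName As String In entitiesToClear
					' Pass the entity to the DM sequence as a substitution variable so the
					' Clear Data step's member filter can reference |!EntityName!|.
					Dim custSubstVars As New Dictionary(Of String, String)
					custSubstVars.Add("EntityName", entityName)

					' Assumed helper; verify the exact name and signature in your release
					' (some versions also expose StartDataMgmtSequence for async execution).
					BRApi.Utilities.ExecuteDataMgmtSequence(si, "Clear_EntityData", custSubstVars)
				Next

				Return Nothing
			Catch ex As Exception
				Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
			End Try
		End Function
	End Class
End Namespace
```

Whatever mechanism you end up with, locking the affected workflows first and clearing Stage before the Cube helps keep the two areas consistent.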

Business Rule | One-to-One Transformation - E# Parent to its first E# Base
We need your help importing flat files:
1. > 100k rows across 1,000s of E# Parent members
2. E# is extended
We've been advised to build a BR that generates a one-to-one Transformation Rule file mapping each E# Parent to its first E# Base. Would anyone be willing to share with us/the Community:
- a sample BR to look up the first E# Base member of an E# Parent member
- how to call that BR in an Import Workflow
Thank you, OS SMEs.
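Not a definitive implementation, but here is a minimal sketch of the lookup half of the requirement, assuming the BRApi metadata helpers behave as shown (the GetDimPk, GetMember, and GetBaseMembers names, signatures, and return types should all be verified against your release, and the dimension/member names are placeholders). It takes a parent Entity member name and returns the name of its first base descendant.

```vb
' Sketch only: return the first base (leaf) member underneath a parent Entity member.
' The BRApi helper names/signatures below are assumptions -- confirm in your release.
Private Function GetFirstBaseEntity(ByVal si As SessionInfo, ByVal entityDimName As String, ByVal parentName As String) As String
	' Resolve the (extended) Entity dimension and the parent member.
	Dim dimPk As DimPk = BRApi.Finance.Dim.GetDimPk(si, entityDimName)
	Dim parentMbr As Member = BRApi.Finance.Members.GetMember(si, dimPk.DimTypeId, parentName)
	If parentMbr Is Nothing Then Return String.Empty

	' Get the base descendants of the parent and take the first one returned.
	Dim baseMbrs As List(Of MemberInfo) = BRApi.Finance.Members.GetBaseMembers(si, dimPk, parentMbr.MemberId)
	If (baseMbrs IsNot Nothing) AndAlso (baseMbrs.Count > 0) Then Return baseMbrs(0).Member.Name

	Return String.Empty
End Function
```

For the second part of the question, the same logic could either be called from a Complex Expression on the Entity field of the Data Source (so the mapping happens at parse time), or from an Extender that loops the parent members and writes "parent,firstBase" lines to a CSV that you then load into a One-To-One transformation rule group; which fits better depends on whether you want the mapping visible and auditable in the rule group.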

Excel Template upload - Data Source setup
Hi All, I'm trying to set up some Excel templates for uploading Budget and Forecast data. I'm pretty happy with the Excel template side, with the named range (XFD), specific header formats, etc. However, I'm a bit lost on the Data Source setup. I get that you have to set Allow Dynamic Excel Loads to True, but what about the rest of the setup? Do I choose a Delimited or Fixed file? It feels like the Data Source section is really meant for flat files, since it always wants to know the column number. I've tried importing the Excel file into the Data Source the same way I would a CSV file, but it just shows up as XML gibberish in the top box. It definitely feels like I'm missing something.

How to "Import" "in parallel" via a "OneStream Connector" to a "Data Warehouse"?
Please share your practical advice to help us meet our new integration requirements.
- I couldn't find a OneStream KB article/post on this, nor a good result from free Gemini.
- We're using SaaS on v9.0 and plan to move to v10 upon its upcoming release.
Thank you folks for sharing your expertise.
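For what it's worth, the usual shape is a Connector business rule that returns a field list and then the data, with parallelism coming less from the rule itself and more from how the work is partitioned: sibling import workflow profiles (per region or entity group) each run the same connector against their own slice of the warehouse query, so several imports can execute at the same time. The skeleton below is only a sketch; the external connection name (DW_Connection), the query, and the exact BRApi.Database / api.Parser calls are assumptions to verify against your version.

```vb
' Connector sketch (names/signatures to verify): hand warehouse rows to the Stage engine.
Select Case args.ActionType

	Case Is = ConnectorActionTypes.GetFieldList
		' Column names the Data Source will map to dimensions.
		Return New List(Of String) From {"Entity", "Account", "Period", "Amount"}

	Case Is = ConnectorActionTypes.GetData
		' "DW_Connection" is a hypothetical external connection configured on the app server.
		' Scoping the WHERE clause to the current workflow (entity group, period) is what lets
		' several imports run in parallel without overlapping data.
		Dim sql As String = "SELECT Entity, Account, Period, Amount FROM dbo.GLBalances WHERE Region = 'EMEA'"
		Using dbConn = BRApi.Database.CreateExternalDbConnInfo(si, "DW_Connection")
			Using dt As DataTable = BRApi.Database.ExecuteSql(dbConn, sql, False)
				' Assumed call for pushing the rows into Stage; check the exact overload.
				api.Parser.ProcessDataTable(si, dt, False, api.ProcessInfo)
			End Using
		End Using

End Select
```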

Feb Actuals divided in half to import to Jan
There was no Jan close, so we want to divide Feb actuals by 2 to get a Jan actuals load. The loads are SQL from HFM and the Database Whse, so it isn't done with a flat-file CSV. When I looked at the Feb import, there are 1,214 lines. How can we take that Feb import, divide it by 2, and load it into Jan Actuals? My first thought was to create a QuickView and then upload it back into OneStream. I also looked at a copy Business Rule, but calculations are needed for the data. Any ideas/thoughts?
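Since the data already arrives via SQL, one fairly light option is to run a one-off Jan import pointed at the Feb result set and halve the amount as it parses, for example with a Complex Expression on the Amount field of the Data Source. The sketch below assumes the expression receives the parsed amount in args.Value and returns the value to load; treat the property name as an assumption to check in your release rather than the exact API.

```vb
' Complex Expression sketch for the Amount field: halve every incoming February amount.
' args.Value holding the parsed field text is an assumption -- confirm in your release.
Dim amount As Decimal = 0
If Decimal.TryParse(args.Value, amount) Then
	' Divide the February actuals by two to approximate January.
	Return (amount / 2D).ToString(System.Globalization.CultureInfo.InvariantCulture)
End If
' Pass anything non-numeric through unchanged.
Return args.Value
```

The even simpler alternative, since it is a SQL load anyway, is a throwaway copy of the source query with Amount/2 in the SELECT, used only for the Jan load, leaving the Data Source itself untouched.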

Including Journal Entries from Different Origin Member in Account Reconciliation Discovery
Hi Community, I’m working on an Account Reconciliation solution where I need to ensure that posted journal entry balances are also included in the Account Reconciliation discovery process. Currently, by design, the Discovery process only pulls trial balance information that has already been imported and validated in Stage. I wanted to check whether there’s a recommended or best-practice approach to accomplish this requirement. Any guidance or examples would be greatly appreciated; thanks in advance for your help!
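Not a definitive answer, but one approach that has been prototyped elsewhere is to leave Discovery pulling the trial balance from Stage as designed, and then supplement the reconciliation balance with the journal impact read straight from the cube, since posted journals sit on the AdjInput Origin member and never pass through Stage. The fragment below only illustrates the cube-side read; the helper name, its return shape, and every POV member are placeholders or assumptions to verify in your application.

```vb
' Sketch: read the posted-journal balance (O#AdjInput) for one account so it can be added
' to the Stage-discovered trial balance. Cube name and POV members are placeholders.
Dim cubeName As String = "MyCube"
Dim journalPov As String = "E#MyEntity:C#Local:S#Actual:T#2024M2:V#Periodic:A#10100:" & _
	"O#AdjInput:I#None:F#EndBal:U1#None:U2#None:U3#None:U4#None:U5#None:U6#None:U7#None:U8#None"

' Assumed helper for reading a single cell outside a Finance rule; verify name/signature.
Dim cellInfo = BRApi.Finance.Data.GetDataCellUsingMemberScript(si, cubeName, journalPov)
' Inspect cellInfo for the amount/status (the exact property path differs between releases)
' before adding it to the discovered balance of the matching reconciliation.
```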

Import with Vertical Extensibility or Extensible Dimensions
Hi, I can’t get my data import to work with vertical extensibility, and I’d like to understand how this situation is supposed to be handled. In short, import validation fails on accounts that are extended for an entity, where the entity belongs to a cube that is vertically extended from the primary workflow cube.
Detailed situation: I have extended dimensions and two cubes that use those dimensions. CubeParent contains a parent entity with dimensions applied; the parent entity resides in the parent entity dimension. CubeChild has dimensions that are extended from the parent’s dimensions; the child entity resides in the child entity dimension. As a result, each cube has its own unique entity dimension containing its respective entity.
Additional configuration details:
- CubeChild has Is Top Level Cube For Workflow set to False.
- CubeParent is the top-level cube and has the Workflow Profile created from it.
- The child entity has been added as a child of the parent entity in the parent’s entity dimension (see screenshot).
- CubeParent has the Cube References tab configured correctly.
Data import issue: I have configured a data source that reads from a CSV file. The data source successfully loads the CSV into the staging area, but the transformation step fails. The failure occurs only for the child entity’s extended accounts, not for accounts in the CubeParent account dimension. Transformation rules are associated with a cube when they are created, and if that cube is not the workflow profile’s cube, the rule does not appear in the import configuration step.
Question: how should data be imported for an entity whose workflow has different dimensional detail than the cube it belongs to, because it is using vertical extensibility?

Connect MS Fabric Lakehouse?
Hi all, I'm currently exploring ways to extract data from a Microsoft Fabric Lakehouse into OneStream. The obvious way would be to connect to the SQL endpoint just like any other database. In that case, what would be the best authentication method? I've been testing with my own user in SQL Server Management Studio using MFA, but I'm not sure how best to create credentials for the connection string in OneStream. If connecting to the SQL endpoint with a connection string isn't a good idea, what would be a better practice? Thanks.
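In case it helps: for unattended access the usual pattern is an Entra ID (Azure AD) service principal rather than an interactive MFA user. You grant the app registration access to the workspace/Lakehouse SQL endpoint and authenticate with the Active Directory Service Principal keyword in the connection string. The sketch below uses placeholder server/database/credential values, assumes the SQL client shipped with your OneStream version supports the AAD authentication keywords, and treats the external-connection helper as an assumption to verify; in practice the string would be registered as an external connection on the application server (or kept in a secrets store) rather than hard-coded in a rule.

```vb
' Sketch: connection string for the Fabric Lakehouse SQL analytics endpoint using a
' service principal. All bracketed values are placeholders.
Dim connString As String = _
	"Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;" & _
	"Database=<your_lakehouse>;" & _
	"Authentication=Active Directory Service Principal;" & _
	"User ID=<app-registration-client-id>;" & _
	"Password=<client-secret>;" & _
	"Encrypt=True;TrustServerCertificate=False;"

' If the string is registered on the application server as an external connection named
' "FabricLakehouse", a connector/extender could open it like this (assumed helper names):
Using dbConn = BRApi.Database.CreateExternalDbConnInfo(si, "FabricLakehouse")
	Dim dt As DataTable = BRApi.Database.ExecuteSql(dbConn, "SELECT TOP 10 * FROM dbo.MyLakehouseTable", False)
End Using
```

As far as I know, the Fabric SQL endpoint only accepts Entra ID authentication (no SQL logins), which is why some form of service principal is needed for a server-to-server connection.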

Errors creating New Task in Data Import Schedule Manager
I am getting several errors when setting up a New Task in the Data Import Schedule Manager. We are on OS version 9.01.17403, and I have installed the latest version of DSM, which is 8.4.0_SV100. I suspect this may be a version compatibility issue, so I am curious whether anyone has been able to get this solution to work in a 9.01 application. I have already uninstalled and reinstalled the solution, which didn’t resolve the issues. Below are the two errors I am seeing:
1. When choosing Global Scenario from the Scenario(s) drop-down list, I get an immediate error: “Error processing member. The item was not found. Member, 11111111.” The details state it is unable to execute Business Rule ‘DSM_Paramhelper’ while it appears to be resolving the Global Scenario via OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.GetParameterDisplayInfosUsingDashboardNameCompressed(SessionInfo si, LoadDashboardInfo loadDashboardInfo, Boolean isForDashboardUIWithInteractiveComponents, Dictionary`2 custSubstVarsAlreadyResolved).
2. If I pick a specific Scenario, I get a different error. It allows me to pick the Scenario and Time, but when I save the Task, I get “The input string ‘’ was not in a correct format.” The error details point to Business Rule ‘DSM_SolutionHelper’, where “Conversion from string ‘’ to type ‘Double’ is not valid. The input string ‘’ was not in a correct format.” is raised in OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.StartExecuteSelectionChangedServerTaskCompressed(SessionInfo si, Boolean isSystemLevel, Guid primaryDashboardID, Guid embeddedDashboardID, Guid componentID, PageInstanceInfo pageInstanceInfo, XFSelectionChangedServerTaskInfo serverTaskInfo).
Any advice on how to correct these issues would be greatly appreciated.