Zero cells count
Hi, I wanted to ask about the OneStream guidelines for an acceptable proportion of zero cells. I remember reading something a few years back about a suggested maximum proportion, but I can't find that information now. I know it's important to minimize zero cells and that there are ways to do this, such as using the RemoveZero command in formulas. I also understand that the proportion of zero cells is just one part of overall system performance. Could you share the specific proportion of zero cells to total cells that would typically trigger further investigation, and what threshold would indicate that I'm in the clear? I have one client with an existing implementation where zero cells make up about 20% of the total data cell count at the total entity level for Actual. Is that a reasonable proportion? Thanks
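For clarity, the proportion I mean is simply zero cells divided by total stored cells. Below is a minimal sketch of that check (plain Python, outside OneStream); the 15% trigger in it is purely an assumed placeholder for illustration, not published OneStream guidance.

```python
# Minimal sketch: compute the proportion of zero cells in a data extract.
# The threshold below is an illustrative placeholder, not an official guideline.

def zero_cell_ratio(cell_values):
    """Return the fraction of cells whose stored value is exactly zero."""
    if not cell_values:
        return 0.0
    zeros = sum(1 for v in cell_values if v == 0)
    return zeros / len(cell_values)

if __name__ == "__main__":
    sample = [0, 125.4, 0, 98.2, 0, 0, 310.0, 45.1, 0, 12.7]  # hypothetical extract
    ratio = zero_cell_ratio(sample)
    assumed_trigger = 0.15  # placeholder review threshold; the real guidance is what I'm asking about
    print(f"Zero-cell proportion: {ratio:.0%}")
    if ratio > assumed_trigger:
        print("Above the assumed trigger - worth investigating zero-cell cleanup.")
```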
Lock Out Sold Entities, Keep Their Impact!
Managing Sold Entities Without Losing Consolidation Results. Hi Community, we're looking to hear how others handle this in OneStream. When an entity is sold in a specific period (e.g., 2025M5), the requirement is to:
✅ Retain its posted journal values in group consolidation
❌ Prevent any further use of the entity (journals, data loads, forms)
The challenge we've encountered is that setting In Use = False, even after journals are posted and consolidation has been run, can cause OneStream to drop that entity's data from the group during future reconsolidation. This results in the loss of valid P&L impact from the sold entity. Our workaround:
- Keep In Use = True for the sold period
- Use workflow locks and security restrictions to block future changes
- Only set In Use = False once everything is finalized and locked
💡 Feature suggestion: it would be incredibly useful to have a system property or override flag that lets us set In Use = False to fully disable the entity in the workflow while still retaining its consolidated values during reconsolidation. How are you managing this today? We'd welcome your thoughts. Thanks
Looking for a security report
Hi all, does anyone know if the standard OneStream security audit or application dashboard reports offer a way to determine all of the objects within a OneStream application where a particular security group is used? For example, let's say I have a group called "E_WF_specialUsers" and I want to better understand where it is used within my application (i.e., is it attached to the display group of certain user dimensions, account members, workflows, cube views, etc.). I can't seem to find anything that does this within OneStream, but I want to make sure I'm not missing something here. Thanks in advance, K.
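To illustrate the kind of scan a report like this would save me from doing by hand, here is a rough sketch (plain Python; the export folder path is hypothetical) that searches exported application metadata and dashboard XML files for the group name.

```python
# Sketch: search exported OneStream metadata/dashboard XML files for a security group name.
# Assumes the relevant artifacts have been exported to a local folder; the path is hypothetical.
import pathlib

GROUP_NAME = "E_WF_specialUsers"
EXPORT_DIR = pathlib.Path(r"C:\Exports\MyApp")  # hypothetical export location

def find_group_references(export_dir, group_name):
    """Yield (file name, line number, line text) for every line mentioning the group."""
    for xml_file in export_dir.rglob("*.xml"):
        text = xml_file.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if group_name.lower() in line.lower():
                yield xml_file.name, lineno, line.strip()

for name, lineno, line in find_group_references(EXPORT_DIR, GROUP_NAME):
    print(f"{name}:{lineno}: {line[:120]}")
```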
Hi , Good Day! I'm currently building a dashboard in OneStream that allows users to select all possible points of intersection for dimensions and their members. Once selected, they can click a button to export the data to a file share using a Data Management export sequence. This part is working well, and the dashboard is complete. Now, I want to enhance it by capturing user details (who entered the data) and the timestamp of when the data was entered for each point of intersection. I'm wondering if there's an existing OneStream Marketplace solution or feature that supports this kind of tracking and logging. Has anyone implemented something similar or come across a solution that could help with this? Thanks in advance!12Views0likes0CommentsSpreadsheet tool in v8.5 - does it work?
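To show the shape of the output I'm after, here is a minimal sketch (plain Python, with made-up users, timestamps, and values): each exported intersection annotated with who entered it and when. The open question is where best to source those two fields from in OneStream.

```python
# Sketch of the desired export layout: each intersection carries an "entered by"
# user and an entry timestamp. All values below are made up for illustration;
# the question is how to populate these columns from OneStream.
import csv
import datetime

rows = [
    # (POV intersection, value, entered_by, entered_at) - illustrative only
    ("E#BU_200:A#NetSales:T#2025M5", 125000.0, "jdoe",
     datetime.datetime(2025, 6, 3, 14, 22)),
    ("E#BU_200:A#COGS:T#2025M5", -74000.0, "asmith",
     datetime.datetime(2025, 6, 4, 9, 5)),
]

with open("export_with_audit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["pov", "value", "entered_by", "entered_at"])
    for pov, value, user, ts in rows:
        writer.writerow([pov, value, user, ts.isoformat(timespec="minutes")])
```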
Spreadsheet tool in v8.5 - does it work?
Trying to confirm whether the spreadsheet tool in v8.5 is working for anyone in the following manner: I created a dashboard with the spreadsheet tool, and the page renders fine. However, when doing something as simple as Save As, an error message is generated (screenshot not shown here). The Excel file is defined in the configuration. One client mentioned this, so I tested it on another client with the same results. I've heard this is fixed in v9.0. Is this correct, or am I missing something? Thanks for any input! Ed Puente
View Member Issue on Cash Flow Report
Hi All, I am working on a Cash Flow cube view to show the beginning cash, ending cash, and change in cash (activity) in three separate rows. All the base cash accounts sit under the new parents (beginning cash, ending cash, and change in cash) in an alternate account hierarchy for the cash flow. We are using the BegBal_Dynamic flow member from the Blueprint app for beginning cash, EndBal for ending cash, and Activity_Calc for the change in cash. Figures are flowing in, but for these accounts the columns have a View member set to Periodic, QTD, and YTD. The problem is that the figures do not reflect the View member; they are all the same on the ending cash. Below are the Cell POVs for each of the members (screenshot not shown here). I am happy to provide any other screenshots or additional information to get a resolution on this. Thanks in advance!
MTD (Periodic): Cb#Batesville_Cons:E#LE_TOT:P#?:C#Local:S#Actual:T#2025M5:V#Periodic:A#End_Period:F#EndBal:O#Top:I#Top:U1#CC_TOT:U2#Product_Total_Core:U3#Reported_GAAP_Tot:U4#PayType_Total:U5#Initiative_Total:U6#None:U7#None:U8#None
QTD: Cb#Batesville_Cons:E#LE_TOT:P#?:C#Local:S#Actual:T#2025M5:V#QTD:A#End_Period:F#EndBal:O#Top:I#Top:U1#CC_TOT:U2#Product_Total_Core:U3#Reported_GAAP_Tot:U4#PayType_Total:U5#Initiative_Total:U6#None:U7#None:U8#None
YTD: Cb#Batesville_Cons:E#LE_TOT:P#?:C#Local:S#Actual:T#2025M5:V#YTD:A#End_Period:F#EndBal:O#Top:I#Top:U1#CC_TOT:U2#Product_Total_Core:U3#Reported_GAAP_Tot:U4#PayType_Total:U5#Initiative_Total:U6#None:U7#None:U8#None
Clear ALL Data related to a specific Member.
Hello everyone, I'm wondering if there's a method or a way to delete ALL data for a chosen member with the click of a button. I know about Data Management steps and using them along with a Finance BR. My concept is this: the user selects a member (step 1), or a list of members (if step 1 works well), and then clicks one button to delete all of the related data. I want to do this because it would make it much easier for a user who needs to clear out a particular member. I tried retrieving all the cubes in the application and changing the data via the script builder / data buffer method, setting the cells to 0, but the DataUnit concept does not allow us to do that. So I am here seeking guidance and wisdom from the community. :) Thanks!
Data Cells become invalid when Flow Processing Type is enabled
When the Flow Processing Type "Is Alternate Input Currency for All Accounts" is enabled on a Flow member (screenshot not shown here), data intersections become invalid for that Flow member and the foreign currencies (USD, CAD). Entity BU_200's default currency is EUR. Note that when this property is not enabled, the foreign currency cells are valid. My understanding is that when this setting is enabled, the foreign currency data cells should be enabled for data load or input. I'd appreciate any clarification on this.
Smarter Data Loading: How OneStream Transformation Rules Automatically Handle New Members
In the world of Enterprise Performance Management (EPM), clean and accurate dimensional mapping is essential for a reliable reporting process. Fortunately, OneStream makes this easier through its robust Transformation Rule framework, which intelligently handles new dimension members automatically. Whether it's a new GL account, cost center, or entity, OneStream's transformation engine ensures that new data doesn't derail your process, even before user intervention.
🔍 Key Benefits
Dynamic Member Detection: When a new source member enters the system, say a newly created account in the ERP, OneStream flags it immediately as "Not Mapped" in the staging area. This gives the user or administrator a clear indication that attention is required. Mapping can then be quickly added or adjusted to keep data flowing seamlessly.
Centralized and Flexible Maintenance: Transformation rules support wildcard and dynamic mapping, making it easy to set default routes for unknown members, for example directing all new accounts to a placeholder for unmapped accounts (see the sketch after this post). From there, consultants or admins can reassign the correct mappings when ready. This approach provides control without interrupting the load process.
Built-In Mapping Validation: There's no need to build dashboards or manual reports to detect mapping issues. OneStream's Data Source and Transformation Rule editors include built-in validation tools to identify unmapped or incorrectly mapped items before they affect results. This enables early detection and faster resolution, and reduces dependence on visual checks.
✅ Why This Matters
By leveraging these out-of-the-box capabilities, organizations save time, reduce risk, and streamline their integration workflows. Consultants can focus on higher-value tasks instead of constantly updating mappings, and end users can trust that data loads will highlight issues automatically, before reporting is impacted. Please keep it simple: do you agree that OneStream handles what it's built to do?
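To make the wildcard/default-routing idea concrete, here is a minimal sketch (plain Python, a simplified stand-in for the actual transformation engine, with made-up rule values): explicit one-to-one rules take precedence, wildcard masks are tried in order, and anything left over lands in an "Unmapped" placeholder.

```python
# Sketch: simplified precedence of explicit mappings over wildcard/default rules.
# Illustrates the routing concept only; it is not OneStream's transformation engine.
import fnmatch

ONE_TO_ONE = {"40100": "NetSales", "50100": "COGS"}                 # explicit source -> target
MASK_RULES = [("4*", "OtherRevenue"), ("*", "Unmapped_Accounts")]   # wildcard fallbacks, in order

def map_source_member(source: str) -> str:
    """Resolve a source member: explicit rule first, then wildcard masks in order."""
    if source in ONE_TO_ONE:
        return ONE_TO_ONE[source]
    for pattern, target in MASK_RULES:
        if fnmatch.fnmatch(source, pattern):
            return target
    return "Unmapped_Accounts"  # catch-all for anything no rule matched

for src in ["40100", "40250", "61999"]:
    print(src, "->", map_source_member(src))
# 40100 -> NetSales, 40250 -> OtherRevenue, 61999 -> Unmapped_Accounts
```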
Account Attribute: In Use versus Allow Input
Hi all, we have hundreds of historical (no longer valid) accounts sitting in a cube, and we are looking at ways of de-cluttering the resulting data retrievals so we no longer see them. Additionally, we want to prevent any accidental loading to these accounts from the source system. For reporting, I know we can populate one of the text fields with some text and then combine that with a Where() clause to filter out these old accounts. But from a data load perspective, I'm wondering what the risks and benefits are of utilizing either (or both) of the In Use or Allow Input attributes to prevent accidental loading of data there. We want to be able to continue to see any historical data for these accounts and to drill down from the parent account and see that data. There are no member formulas associated with these accounts; they are all base members. We don't care about the Scenario/Time specificity capability. If anyone is using these attributes, we would very much appreciate any recommendations or hearing what your experience has been (before we go through the effort of testing either of these options). Many thanks, K.