Featured Content
Recent Activity
You’ve Built the Dashboards. Now Get Recognized.
2 MIN READ

If you've completed the Building Dashboards course, you've already done the hard part. You've moved beyond standard reporting. You've learned how to design dashboards that tell a story. You've built views that help stakeholders truly understand what's happening in OneStream. Now there's a clear next step.

Get Certified. Get Recognized.

You've already built the skills; certification is how you make them visible. The OneStream Certified Specialist (OCS) - Reports and Dashboards Exam is designed for practitioners who are already doing the work. It validates your ability to translate business questions into dashboards that drive insight and action. If you completed the instructor-led course, you've already covered the core concepts the exam is built around:

- Designing dashboards with purpose, not just visuals
- Structuring reports to support decision-making
- Applying best practices that scale across an organization

Certification is your opportunity to formalize those skills and showcase them to your team, your leadership, and your organization.

Exclusive Offer for Course Graduates

To help you take the next step, we're offering an exclusive certification promotion for Building Dashboards course graduates. You've already invested in learning; this is your opportunity to complete the journey at a reduced cost and turn your experience into a recognized credential.

Request Your Certification Voucher

Because this offer is tied to course completion, the process is simple:

1. Submit a ServiceNow ticket to the Credentialing team
2. Your course completion will be verified
3. You'll receive a discounted exam voucher to use when you're ready

No promo codes. No guesswork. Just a clear path from learning to certification. This offer is available for a limited time, so if certification is on your roadmap, now is the time to act.

Why Certification Matters, Especially Now

Dashboards are often the most visible part of a OneStream solution. Executives rely on them. Teams make decisions from them. Expectations are high. Certification signals that you don't just know how to build dashboards; you understand why they're built the way they are. It shows that you:

- Follow proven design standards
- Build with consistency and intent
- Can be trusted with high-impact reporting and analytics

In many organizations, certified professionals are the ones trusted with the most visible dashboards and the most critical business insights. Certification doesn't just validate your skills; it elevates your credibility.

Ready for What's Next?

If you've already completed the Building Dashboards course, you're closer to certification than you might think. This is your opportunity to take what you've learned, formalize it, and put a credential behind your experience. Take the next step today. Request your discounted exam voucher and turn your dashboard expertise into a recognized credential. Submit your request and get certified.

tnylin · 3 hours ago · OneStream Employee · 12 Views · 0 likes · 0 Comments

The Dynamic Grid Dashboard Component
Are you frustrated by volume limitations and the inability to edit data in grid view? Do you find the SQL Table Editor restrictive because it only supports a single table? Are your items taking too long to load in these grids? Discover a smarter way to load data on dashboards and grids: the Dynamic Grid dashboard component.

Traditionally, OneStream grids have been limited in customization, flexibility, data retrieval, and data manipulation. The Dynamic Grid changes that traditional pattern: it supports each of those capabilities in a more robust, enhanced way, and it allows multiple data sources from a single component (though that is out of scope for this article). The Dynamic Grid loads only the data you need, nothing more, and delivers unmatched flexibility: you can customize your grid to fit your company's exact requirements. Enjoy traditional column formatting from the SQL Table Editor, now enhanced with powerful row and cell formatting options.

I could go on about how powerful this new dashboard component is, but you are here to learn how to set it up. Let us dive into an example of creating a Dynamic Grid.

The Dynamic Grid relies on two out-of-the-box functions: GetDynamicGridData and SaveDynamicGridData. GetDynamicGridData retrieves the data for display in the grid. SaveDynamicGridData persists changes made in the grid, whether you are adding, updating, or deleting rows. Together, these functions power efficient data retrieval and safe, transactional saves for the Dynamic Grid.

GetDynamicGridData Example Code:

The first line, If args.Component.Name.XFEqualsIgnoreCase("Grid7") Then, acts as a conditional check to segment multiple Dynamic Grids within the same assembly. In other words, it ensures that the logic applies only to the grid named Grid7. This name must exactly match the Dynamic Grid component name defined in your dashboard configuration.
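Based on that description, the opening of the get routine might look like the following minimal sketch. The signature mirrors the call shown later in this article (GetDynamicGridData(si, brGlobals, workspace, args)); any type or member name not quoted in the article should be treated as an assumption rather than confirmed API.

```vb
' Minimal sketch: route the request to the correct Dynamic Grid when
' several grids share one assembly. "Grid7" must exactly match the
' dashboard component name.
Public Function GetDynamicGridData(ByVal si As SessionInfo,
                                   ByVal brGlobals As BRGlobals,
                                   ByVal workspace As Object,
                                   ByVal args As DashboardDynamicGridGetDataArgs) As XFDynamicGridGetDataResult
    If args.Component.Name.XFEqualsIgnoreCase("Grid7") Then
        ' ... query the current page of data and build the result here ...
    End If
    Return Nothing ' no other grid names are handled in this sketch
End Function
```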
Immediately after the opening If condition, you will notice two key arguments: startRowIndex and pageSize. These variables define where the table should begin rendering and how many rows appear per page. Without them, the grid would attempt to render the entire dataset, which would create serious performance issues. Here is what you need to know about pageSize:

- pageSize is driven by the Rows Per Page property on the grid.
- By default, Rows Per Page is set to -1, which means the grid uses your security settings to determine row limits.
- If you specify any value other than -1, that value overrides the security setting.
- The maximum allowed page size is 3,000 rows per page.

Setting these arguments correctly is critical for performance and user experience.

The next few lines use standard OneStream and .NET objects to open a database connection and create a command; nothing unusual there. The real magic happens in the SQL query, which leverages the lesser-known OFFSET and FETCH clauses for efficient paging. These clauses allow you to return only the rows you need, rather than loading the entire dataset. Alternatively, you can implement paging by generating a RowNumber column and filtering with a WHERE clause, such as: WHERE ((RowNumber >= {StartRowNumber}) AND (RowNumber <= {EndRowNumber})). Both approaches achieve the same goal: controlled data retrieval for better performance.

Note: When building a Dynamic Grid with SQL, always follow best practices for writing SQL queries. This ensures your solution is secure, efficient, and maintainable.

Notice the StartRowNumber and EndRowNumber integer variables. These drive the SQL parameters inside the OFFSET and FETCH section of the query. Using parameters is essential for pagination because it ensures your query retrieves only the requested rows while preventing SQL injection.
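The OFFSET/FETCH pattern itself is standard T-SQL. As a sketch (table and column names are placeholders; dbo.sample_data is the sample table used later in this article), the paged query could be parameterized like this:

```vb
' Sketch: page a query with OFFSET/FETCH. Both paging values are bound
' as parameters rather than concatenated into the SQL text, which keeps
' the query plan stable and prevents SQL injection.
Dim sql As String =
    "SELECT Column1, Column2, Column3 " &
    "FROM dbo.sample_data " &
    "ORDER BY Column1 " &
    "OFFSET @StartRowNumber ROWS FETCH NEXT @PageSize ROWS ONLY"

' Equivalent RowNumber-based alternative mentioned above:
' WHERE ((RowNumber >= @StartRowNumber) AND (RowNumber <= @EndRowNumber))
```

Note that OFFSET/FETCH requires an ORDER BY clause; without a stable sort, pages could overlap or skip rows between requests.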
The next section of code handles data retrieval and paging metadata:

- First, a new DataTable is created and populated using the SQL query for the current page of records.
- Then, Dim SQL As String defines a COUNT query to determine the total number of rows in the underlying table.
- A second DataTable is used to execute this count query and retrieve the result.
- Finally, the line result.TotalNumRowsInOriginalDataTable = $"{dt(0)(0)}" assigns the total row count to the XFDataTable result object. This property is critical for paging; without it, the grid cannot calculate how many pages to display. In short: no total count, no paging.

Once all the arguments are in place, the final step is to create the XFDynamicGridGetDataResult object. This constructor requires three key inputs:

1. XFDataTable: the data table you populated earlier (e.g., the result variable).
2. Column definitions: we will cover these in detail later in the article.
3. DataAccessLevel: determines how users interact with the grid data. You can choose from:
   - .AllAccess: full read/write access (most common for editable grids).
   - .ReadOnly: users can view but not modify data.
   - .NoAccess: no data interaction allowed.

For most scenarios involving data manipulation, .AllAccess is the recommended setting.

Note: The ability to sort columns ascending and descending while paging is on the roadmap, but that functionality is not available as of the writing of this article.

SaveDynamicGridData Example Code:

In the previous section, we focused on rendering data. Now, let us shift to saving modified data, whether inserting, updating, or deleting rows. With the SQL Table Editor, save functionality is built in. However, the Dynamic Grid offers far greater flexibility, which means you will need to implement custom saving logic.
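Putting those pieces together, the tail of the get routine might look like this sketch. The constructor order follows the three inputs listed above; treat the exact signature as an assumption.

```vb
' Sketch: attach paging metadata, then build the result object.
' "result" is the XFDataTable holding the current page of rows, and
' "dt" holds the output of the COUNT query.
result.TotalNumRowsInOriginalDataTable = $"{dt(0)(0)}"

Dim gridResult As New XFDynamicGridGetDataResult(
    result,                    ' XFDataTable with the current page of data
    columnDefinitions,         ' list of XFDynamicGridColumnDefinition objects
    DataAccessLevel.AllAccess) ' full read/write access for editable grids
Return gridResult
```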
This approach gives you full control over:

- How data changes are processed
- Custom validation rules
- User messaging and error handling

Custom saving functionality ensures your grid behaves exactly as your business requires. Let us break down the code for the SaveDynamicGridData function step by step.

In the save routine, you will see two argument objects sourced from args:

- Dim getDataArgs As DashboardDynamicGridGetDataArgs = args.GetDataArgs retrieves context from the earlier GetDynamicGridData call (e.g., paging, filters, sort), which is useful if your save logic needs to reference the current view.
- Dim saveDataArgs As DashboardDynamicGridSaveDataArgs = args.SaveDataArgs initializes the save payload: inserted, updated, and deleted rows, plus column metadata.

Next, the code gathers what is editable:

- Dim editedDataRows As List(Of XFDataRow) = saveDataArgs.EditedDataRows gets the list of rows that have been modified, inserted, or deleted. This is often the entire row set, but you can scope it to specific rows when only part of the data should be editable.
- Dim columns As List(Of XFDataColumn) = saveDataArgs.Columns provides the column list (names, types, formats) for the target table, which is critical for validation and parameter binding.

You will also see a simple table name string (e.g., Dim tableName As String = "dbo.sample_data") to identify where the changes are saved, and a Boolean flag used later to control behavior (such as enabling or disabling validation or transactional commits).

Finally, the code re-opens a database connection to perform the actual writes (insert, update, delete). This is done in the save path to keep read and write operations logically separated and to ensure that any transactional logic is scoped to the save operation.

Now we are moving into the fun section of code. It opens with a For Each loop over all of our possible edited data.
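Collected together, the context gathered so far might look like this sketch. The save-args type is written as DashboardDynamicGridSaveDataArgs on the assumption that it mirrors the get-args type; that name and the Boolean flag name are assumptions, not confirmed API.

```vb
' Sketch: gather the save context before writing anything.
Dim getDataArgs As DashboardDynamicGridGetDataArgs = args.GetDataArgs
Dim saveDataArgs As DashboardDynamicGridSaveDataArgs = args.SaveDataArgs

' Rows the user inserted, updated, or deleted, plus column metadata
' for validation and parameter binding.
Dim editedDataRows As List(Of XFDataRow) = saveDataArgs.EditedDataRows
Dim columns As List(Of XFDataColumn) = saveDataArgs.Columns

' Target table for the writes, and a flag to toggle save behavior.
Dim tableName As String = "dbo.sample_data"
Dim performValidation As Boolean = True ' illustrative; name is assumed
```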
Inside the loop, a Select Case on editedDataRow.InsertUpdateOrDelete directs the logic for each row, applying the appropriate insert, update, or delete operation. There is no shortcut here: each operation requires its own parameterized SQL command and validation. As you can see from the previous screenshot, I have created three dedicated functions to run these queries dynamically and keep the code clean and testable:

- InsertDataRow(...)
- UpdateDataRow(...)
- DeleteDataRow(...)

I will include these helpers in the appendix at the end of this blog.

For InsertDataRow, we use a standard SQL INSERT statement with parameters (not string concatenation), ensuring safety and better performance:

INSERT INTO <tableName> (Column1, Column2, Column3) VALUES (Value1, Value2, Value3)

The UpdateDataRow function uses a standard SQL UPDATE statement to modify existing records. Here is the basic pattern:

UPDATE <tableName> SET Column1 = Value1, Column2 = Value2 WHERE <condition>

The DeleteDataRow function uses a standard SQL DELETE statement to remove records. Here is the basic pattern:

DELETE FROM <tableName> WHERE <condition>

Now that we have done all this coding magic, what's next?
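The dispatch loop described above can be sketched as follows. The enum type and value names, the dbConn connection variable, and the helper parameter lists are illustrative assumptions; only editedDataRow.InsertUpdateOrDelete and the three helper names come from the article.

```vb
' Sketch: route each edited row to the matching parameterized helper.
For Each editedDataRow As XFDataRow In editedDataRows
    Select Case editedDataRow.InsertUpdateOrDelete
        Case XFInsertUpdateOrDelete.Insert ' enum names are assumptions
            InsertDataRow(dbConn, tableName, columns, editedDataRow)
        Case XFInsertUpdateOrDelete.Update
            UpdateDataRow(dbConn, tableName, columns, editedDataRow)
        Case XFInsertUpdateOrDelete.Delete
            DeleteDataRow(dbConn, tableName, editedDataRow)
    End Select
Next
```

Keeping each operation in its own helper also makes it easy to add per-operation validation or to wrap the whole loop in a single transaction.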
First, create an empty XFDynamicGridSaveDataResult: Dim result As New XFDynamicGridSaveDataResult(). This step is non-negotiable: you must send data back to the grid after the save. Without it, the grid will lose context and paging will break. That is why this line is essential: result.DataTable = GetDynamicGridData(si, brGlobals, workspace, args)?.DataTable. By calling GetDynamicGridData again, you ensure the save result reflects the latest filters, sort order, and paging logic, keeping the user experience consistent after modifications.

With the empty XFDynamicGridSaveDataResult created and the original data table rehydrated, the next step is to restore paging so the user returns to the correct page after a save. Compute the current page index using VB.NET integer division: result.PageIndex = (getDataArgs.StartRowIndex \ getDataArgs.PageSize).

Finally, before returning the result, there is one last piece of logic: a small but important snippet that ties everything together. This step ensures the save result object is fully populated with the data, paging info, and any additional metadata the grid needs to render correctly after modifications:

result.SaveDataTaskResult = New XFDynamicGridSaveDataTaskResult() With { .IsOK = True, .ShowMessageBox = True, .Message = "Save Finished" }

This assigns the values to the XFDynamicGridSaveDataTaskResult. Notice the With statement: an idiomatic VB.NET construct that lets you set multiple properties on the same object without repeating the variable name.
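Taken together, the closing steps of the save routine might be assembled like this sketch, using the lines quoted above:

```vb
' Sketch: rebuild the result so the grid keeps its context after a save.
Dim result As New XFDynamicGridSaveDataResult()

' Re-run the get routine so filters, sort order, and paging stay current.
result.DataTable = GetDynamicGridData(si, brGlobals, workspace, args)?.DataTable

' Integer division (\) maps the first visible row index back to a page.
result.PageIndex = (getDataArgs.StartRowIndex \ getDataArgs.PageSize)

' Report success and surface a confirmation message to the user.
result.SaveDataTaskResult = New XFDynamicGridSaveDataTaskResult() With {
    .IsOK = True,
    .ShowMessageBox = True,
    .Message = "Save Finished"
}
Return result
```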
In other words, instead of writing result.IsOK = True, result.PageIndex = …, and so forth line by line, you can group them neatly inside one block. Now that we have all this code in place, you can finally return the result.

We've covered how to render data and how to save modified rows, but the Dynamic Grid's most used reporting feature is its ability to apply conditional column formatting. This functionality mirrors the familiar options found in the SQL Table Editor, giving you full control over how columns look and behave based on dynamic conditions. Below is a screenshot of the SQL Table Editor column formatting properties.

All these formatting properties are available in the Dynamic Grid, but only through code. You can apply them directly within the GetDynamicGridData function or encapsulate them in a dedicated formatting function for better organization and reuse. In my implementation, I chose the latter approach, creating a separate function to handle conditional column formatting.

Inside the GetDynamicGridData function (see screenshot above), I added the following line: Dim columnDefinitions As List(Of XFDynamicGridColumnDefinition) = Me.GetColumnDefinitions(). This creates a list of XFDynamicGridColumnDefinition objects, which define the properties and formatting rules for each column in the grid. The call to Me.GetColumnDefinitions() retrieves these definitions from a dedicated function, keeping the logic clean and reusable.

As you can see, this entire block of code is commented out; I wanted to highlight all the possible properties. If you look at the available properties, they match one-for-one with the properties that exist inside the SQL Table Editor component. You must initialize the variable named columnDefinition6, and, as we previously discussed, the With statement is used here. Each of these properties should function exactly as it does within the SQL Table Editor.
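A dedicated formatting function along these lines might look like the following sketch. The individual property names (ColumnName, HeaderText, IsReadOnly) are illustrative assumptions modeled on the SQL Table Editor properties, not confirmed API; only the XFDynamicGridColumnDefinition type and the GetColumnDefinitions call come from the article.

```vb
' Sketch: build the list of column definitions consumed by
' XFDynamicGridGetDataResult. Property names are assumptions.
Private Function GetColumnDefinitions() As List(Of XFDynamicGridColumnDefinition)
    Dim columnDefinitions As New List(Of XFDynamicGridColumnDefinition)

    Dim columnDefinition1 As New XFDynamicGridColumnDefinition() With {
        .ColumnName = "Column1",   ' column in the underlying table
        .HeaderText = "Account",   ' illustrative display caption
        .IsReadOnly = False        ' illustrative edit flag
    }
    columnDefinitions.Add(columnDefinition1)

    ' ... one definition per grid column ...
    Return columnDefinitions
End Function
```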
Once you have set the properties, you need to add them to the collection of column definitions. This is what you see with columnDefinitions.Add(columnDefinition#), which adds each specific column definition to the list of all column definitions. Finally, as with every function, the return clause simply returns the final output.

In conclusion, we recognize that new code-based components can feel intimidating at first. But with the insights from this blog, we hope we've lowered the barrier to entry and given you the confidence to start taking full advantage of this powerful new grid.

InsertDataRow Sub Code Sample:

UpdateDataRow Sub Code Sample:

DeleteDataRow Sub Code Sample:

Eric_Hanson · 5 hours ago · OneStream Employee · 683 Views · 4 likes · 7 Comments

Genesis - Multiple instances
Hi, I was trying to go back to one of the two Genesis pages (we have two instances in our app) using a button, but it seems the following navigation action is not working: XFPage=Dashboard:[container_bag_(App)_(Main)_EU]. I've tried using parameters in embedded dashboards to refresh the content, and there is no way to go back to the home page. It appears that Genesis assigns the same names to all objects, so it's a problem when trying to refer to a home page dashboard tied to only one specific instance. This is the reason why I renamed the home page Genesis dashboard with a suffix. For now I'm just opening the dashboard I want in a new page using OpenInNewXFPage=True, but I would prefer to make the back button work.

Solved · FW · 16 hours ago · New Contributor II · 11 Views · 0 likes · 1 Comment

Text Box Component - Input Mask
Does anyone know if we can assign an input mask to Text Box components? Using version 8.5.4, I have a few dashboards that make use of text boxes for data collection, and I don't see a way to assign an input mask to the text box. An input mask is a string expression that constrains input to valid values. If the text box is expecting a dollar amount, the input mask would format the entry with the number format #,###.00. For capturing a telephone number (North America format), the mask would be (000)000-0000. If this doesn't exist, can the dev team add it to a future release? Thanks, Cosimo

Cosimo · 23 hours ago · Contributor II · 12 Views · 0 likes · 1 Comment

Hierarchy Validation tool Security
We have added a power user who has the ability to manage some dimensions; however, they cannot access the Hierarchy Validation tool and receive the error: Security Access Error. User XXXX is not authorized to access object 'OneStream Database Server'. I tried changing the access in the workspace, but this did not resolve the error. What areas need to be changed to allow access?

tdugas · 23 hours ago · New Contributor III · 9 Views · 0 likes · 2 Comments

POV Action on OS 9.0
Hi all, I'm experiencing a strange issue on OneStream (rel. 9.2.0.18004) with something that worked as expected under the previous release (8.*). In a dashboard I have combos that let me select Scenario and Time, with a "POV Action" that sets them back into the WFScenario and WFTime parameters, respectively. Now, starting from the above-mentioned release, whenever I change a combo, the focus moves away from the dashboard, opening the selected Workflow. So the user experiences something like:

1. Launch the dashboard
2. Select Scenario
3. System opens the current Workflow
4. Open the dashboard again
5. Select Time
6. System opens the current Workflow
7. Open the dashboard again
8. Run the action

In the previous release, the POV Action didn't take the user away from the current dashboard. Have you experienced the same issue? Is there a workaround to let the user stay in the dashboard? Thanks in advance, FabioG

FabioG · 23 hours ago · Contributor II · 19 Views · 0 likes · 1 Comment

Workforce Planning Express PV920-SV100 is Live on Solution Exchange!
1 MIN READ

The Express Development team is excited to share that Workforce Planning Express (PV920-SV100) was released on March 17!

What is Workforce Planning Express? It's a packaged solution designed to help organizations plan and manage their workforce more easily. It includes leading-practice designs and ready-made configurations so teams can get started quickly. With this solution, you can:

- Plan position adjustments such as hiring, transfers, and terminations.
- Manage headcount across the organization.
- Monitor compensation, benefits, and other workforce-related costs over time.

Want to learn more? Review the release notes and product documentation on the Solution Exchange.

If you're working with customers or prospects, here are a few ways you can guide them to get started:

- Access the Solution: Workforce Planning Express is available for download on the Solution Exchange.
- Training Opportunities: End-user training will be available on Navigator in mid-April 2026.

Have feedback? If you have feedback or receive it from customers or prospects, please share it with the team via this link so we can continue improving the solution.

agoralewski · 1 day ago · Community Manager · 30 Views · 0 likes · 0 Comments

Calculate HC in LIM for Mid-Month Start
Is there a way to calculate HC expenses that accounts for when in the month an employee starts and prorates accordingly? As an example, if the employee starts on the 15th, then their monthly salary would be Salary / 12 * (15/30). The syntax available in the blocks doesn't clearly indicate whether this can be done.

tchev · 2 days ago · New Contributor III · 111 Views · 0 likes · 2 Comments

ONC - IntercompanyDetailAllPlugaccounts - Suppress matches
Hi, We have moved to v9, and the previously used IC matching methods are giving us issues. We have a ticket open with support for this, and they have pointed us towards ONC. We use something very similar to the intercompany all plugs view, but suppressing any combinations that are matching. Is there a way to filter the IntercompanyDetailAllPlugAccountsGridView_ONC to suppress the matches? It seems to be working in our development environment, but it would be great if we could suppress the matches, as this is something that will be required. We have tried this:

{WS}{GetIntercompanyDetailAllPlugAccountsGridView}{View = |!Members_View_Numeric_ONC!|, SuppressMatches=True}

But that does not seem to work. With kind regards, Tim

TimVierhout · 12 days ago · New Contributor II · 8 Views · 0 likes · 0 Comments

The OneStream Podcast: Author Series - OneStream Workspaces and Assemblies available now!
1 MIN READ

On this episode of The OneStream Podcast, Peter Fugere is joined by Jessica Toner and Eric Telhiard to discuss their new book 'OneStream Workspaces and Assemblies' and how these evolutions in architecture are reshaping the development workflow and creating new opportunities to optimize user experiences and system efficiencies.

jcooley · 2 days ago · OneStream Employee · 10 Views · 0 likes · 0 Comments