The Dynamic Grid Dashboard Component
Are you frustrated by volume limitations and the inability to edit data in grid view? Does the SQL Table Editor feel restrictive because it only supports a single table? Are your grids taking too long to load? Discover a smarter way to load data on dashboards and grids: the Dynamic Grid dashboard component.

Traditionally, OneStream grids have been limited in customization, flexibility, data retrieval, and data manipulation. The Dynamic Grid changes that traditional pattern, handling each of those areas in a more robust and enhanced way. It also supports multiple data sources from a single component (though that is out of scope for this article). The Dynamic Grid loads only the data you need, nothing more, and it delivers unmatched flexibility: you can customize your grid to fit your company's exact requirements. You keep the traditional column formatting from the SQL Table Editor, now enhanced with powerful row-level and cell-level formatting options.

I could go on about how powerful this new dashboard component is, but you are here to learn how to set it up, so let us dive into an example of creating a Dynamic Grid.

The Dynamic Grid relies on two out-of-the-box functions: GetDynamicGridData and SaveDynamicGridData. GetDynamicGridData retrieves the data for display in the grid. SaveDynamicGridData persists changes made in the grid, whether you are adding, updating, or deleting rows. Together, these functions power efficient data retrieval and safe, transactional saves for the Dynamic Grid.

GetDynamicGridData Example Code:

The first line, If args.Component.Name.XFEqualsIgnoreCase("Grid7") Then, acts as a conditional check to segment multiple Dynamic Grids within the same assembly. In other words, it ensures that the logic applies only to the grid named Grid7. This name must exactly match the Dynamic Grid component name defined in your dashboard configuration.
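As a minimal sketch of the shape such a handler can take: the grid-name check, the args type, and the result type below are quoted in this article, while the exact function signature and argument list are assumptions and will differ in a real workspace assembly.

```vbnet
' Hypothetical sketch of a GetDynamicGridData handler. The "Grid7" check
' and the XFDynamicGridGetDataResult type are from the article; the
' signature itself is an illustrative assumption.
Public Function GetDynamicGridData(si As SessionInfo,
        args As DashboardDynamicGridGetDataArgs) As XFDynamicGridGetDataResult
    ' Run this logic only for the Dynamic Grid component named "Grid7".
    If args.Component.Name.XFEqualsIgnoreCase("Grid7") Then
        ' Paging inputs supplied by the grid (discussed next).
        Dim startRowIndex As Integer = args.StartRowIndex
        Dim pageSize As Integer = args.PageSize
        ' ...query one page of data, build column definitions, and
        ' return a populated XFDynamicGridGetDataResult here...
    End If
    Return Nothing
End Function
```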
Immediately after the opening If condition, you will notice two key arguments: startRowIndex and pageSize. These variables define where the table should begin rendering and how many rows appear per page. Without them, the grid would attempt to render the entire dataset, which would create serious performance issues.

Here is what you need to know about pageSize: it is driven by the Rows Per Page property on the grid. By default, Rows Per Page is set to -1, which means the grid uses your security settings to determine row limits. If you specify any value other than -1, that value overrides the security setting. The maximum allowed page size is 3,000 rows per page. Setting these arguments correctly is critical for performance and user experience.

The next few lines use standard OneStream and .NET objects to open a database connection and create a command; nothing unusual there. The real magic happens in the SQL query, which leverages the lesser-known OFFSET and FETCH clauses for efficient paging. These clauses allow you to return only the rows you need, rather than loading the entire dataset. Alternatively, you can implement paging by generating a RowNumber column and filtering with a WHERE clause, such as: WHERE ((RowNumber >= {StartRowNumber}) AND (RowNumber <= {EndRowNumber})). Both approaches achieve the same goal: controlled data retrieval for better performance.

Note: When building a Dynamic Grid with SQL, always follow best practices for writing SQL queries. This ensures your solution is secure, efficient, and maintainable.

Notice the StartRowNumber and EndRowNumber integer variables. They drive the SQL parameters inside the OFFSET and FETCH section of the query. Using parameters is essential for pagination because it ensures your query retrieves only the requested rows while preventing SQL injection.
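A sketch of what the parameterized OFFSET/FETCH query described above can look like; the table name, column names, and parameter names here are illustrative assumptions, not the article's actual schema.

```vbnet
' Sketch of the paged query: OFFSET skips the rows before the current
' page, FETCH NEXT returns at most one page of rows. Table and column
' names are hypothetical placeholders.
Dim sql As String =
    "SELECT RowId, Account, Amount " &
    "FROM dbo.sample_data " &
    "ORDER BY RowId " &
    "OFFSET @StartRowIndex ROWS " &
    "FETCH NEXT @PageSize ROWS ONLY"
' Bind the paging values as parameters rather than concatenating them
' into the SQL string; this prevents SQL injection.
cmd.Parameters.AddWithValue("@StartRowIndex", startRowIndex)
cmd.Parameters.AddWithValue("@PageSize", pageSize)
```

Note that OFFSET/FETCH requires an ORDER BY clause; without a stable sort order, pages are not guaranteed to be consistent between requests.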
The next section of code handles data retrieval and paging metadata. First, a new DataTable is created and populated using the SQL query for the current page of records. Then "Dim SQL As String" defines a COUNT query to determine the total number of rows in the underlying table. A second DataTable is used to execute this count query and retrieve the result. Finally, the line "result.TotalNumRowsInOriginalDataTable = $"{dt(0)(0)}"" assigns the total row count to the XFDataTable result object. This property is critical for paging; without it, the grid cannot calculate how many pages to display. In short: no total count, no paging.

Once all the arguments are in place, the final step is to create the XFDynamicGridGetDataResult object. This constructor requires three key inputs:

XFDataTable – the data table you populated earlier (e.g., the result variable).
Column definitions – covered in detail later in this article.
DataAccessLevel – determines how users interact with the grid data. You can choose from:
.AllAccess – full read/write access (most common for editable grids).
.ReadOnly – users can view but not modify data.
.NoAccess – no data interaction allowed.

For most scenarios involving data manipulation, .AllAccess is the recommended setting.

Note: As of the writing of this article, built-in paging controls and the ability to sort columns in ascending or descending order are on the roadmap but not yet available.

SaveDynamicGridData Example Code:

In the previous section, we focused on rendering data. Now let us shift to saving modified data, whether inserting, updating, or deleting rows. With the SQL Table Editor, save functionality is built in. The Dynamic Grid, however, offers far greater flexibility, which means you will need to implement custom save logic.
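Taken together, the count query and the three constructor inputs described above can be sketched as follows. The property and type names (TotalNumRowsInOriginalDataTable, XFDynamicGridGetDataResult, DataAccessLevel.AllAccess) are quoted from this article; the COUNT query, the variable names, and the exact constructor argument order are assumptions.

```vbnet
' Sketch of assembling the get-data result. Variable names and the
' constructor argument order are illustrative assumptions.
Dim countSql As String = "SELECT COUNT(*) FROM dbo.sample_data"
' ...execute countSql into the DataTable dt, then store the total so the
' grid can compute how many pages exist:
result.TotalNumRowsInOriginalDataTable = $"{dt(0)(0)}"
' Combine the page of data, the column definitions, and the access level.
Return New XFDynamicGridGetDataResult(result, columnDefinitions,
    DataAccessLevel.AllAccess)
```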
This approach gives you full control over:

How data changes are processed
Custom validation rules
User messaging and error handling

Custom save functionality ensures your grid behaves exactly as your business requires. Let us break down the code for the SaveDynamicGridData function step by step.

In the save routine, you will see two argument objects sourced from args:

Dim getDataArgs As DashboardDynamicGridGetDataArgs = args.GetDataArgs retrieves context from the earlier GetDynamicGridData call (e.g., paging, filters, sort), which is useful if your save logic needs to reference the current view.

Dim saveDataArgs As DashboardDynamicGridSaveDataArgs = args.SaveDataArgs initializes the save payload: inserted, updated, and deleted rows, plus column metadata.

Next, the code gathers what is editable:

Dim editedDataRows As List(Of XFDataRow) = saveDataArgs.EditedDataRows gets the list of rows that have been modified, inserted, or deleted. This is often the entire row set, but you can scope it to specific rows when only part of the data should be editable.

Dim columns As List(Of XFDataColumn) = saveDataArgs.Columns provides the column list (names, types, formats) for the target table, which is critical for validation and parameter binding.

You will also see a simple table name string (e.g., Dim tableName As String = "dbo.sample_data") to identify where the changes are saved, and a Boolean flag used later to control behavior (such as enabling/disabling validation or transactional commits).

Finally, the code re-opens a database connection to perform the actual writes (insert, update, delete). This is done in the save path to keep read and write operations logically separated and to ensure that any transactional logic is scoped to the save operation.

Now we are moving into the fun section of code, which opens with a For Each loop over all our possible edited data.
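Restated as one block, the setup portion of the save routine looks roughly like this. The declarations are the ones quoted above (with the save-args type assumed to be the Save variant rather than the Get variant); the Boolean flag's name is a hypothetical placeholder, since the article does not name it.

```vbnet
' Sketch of the SaveDynamicGridData setup. The flag name
' useDbTransaction is hypothetical; the rest restates the article.
Dim getDataArgs As DashboardDynamicGridGetDataArgs = args.GetDataArgs
Dim saveDataArgs As DashboardDynamicGridSaveDataArgs = args.SaveDataArgs
Dim editedDataRows As List(Of XFDataRow) = saveDataArgs.EditedDataRows
Dim columns As List(Of XFDataColumn) = saveDataArgs.Columns
Dim tableName As String = "dbo.sample_data"
Dim useDbTransaction As Boolean = True  ' hypothetical control flag
' ...then re-open a database connection scoped to the save operation...
```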
Now we get to the save loop. The routine starts with a For Each loop over the collection of edited rows. Inside the loop, a Select Case on editedDataRow.InsertUpdateOrDelete directs the logic for each row, applying the appropriate insert, update, or delete operation. There is no shortcut here; each operation requires its own parameterized SQL command and validation. In the screenshot, you will see I have split the logic into three dedicated functions to keep the code clean and testable:

InsertDataRow(...)
UpdateDataRow(...)
DeleteDataRow(...)

I will include these helpers in the appendix for reference.

For InsertDataRow, we use a standard SQL INSERT statement with parameters (not string concatenation), ensuring safety and better performance:

INSERT INTO <tableName> (Column1, Column2, Column3) VALUES (Value1, Value2, Value3)

The UpdateDataRow function uses a standard SQL UPDATE statement to modify existing records. Here is the basic pattern:

UPDATE <tableName> SET column1 = value1, column2 = value2 WHERE <condition>

The DeleteDataRow function uses a standard SQL DELETE statement to remove records. Here is the basic pattern:

DELETE FROM <tableName> WHERE <condition>

Now that we have done all this coding magic, what's next?
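The loop-and-branch structure described above can be sketched like this. The For Each, the Select Case on editedDataRow.InsertUpdateOrDelete, and the three helper names come from this article; the specific case labels and helper parameter lists are assumptions.

```vbnet
' Sketch of the save loop. The enum member names in the Case labels and
' the helper parameter lists are illustrative assumptions.
For Each editedDataRow As XFDataRow In editedDataRows
    ' Route each edited row to its matching parameterized SQL helper.
    Select Case editedDataRow.InsertUpdateOrDelete
        Case InsertUpdateOrDeleteType.Insert
            InsertDataRow(dbConn, tableName, columns, editedDataRow)
        Case InsertUpdateOrDeleteType.Update
            UpdateDataRow(dbConn, tableName, columns, editedDataRow)
        Case InsertUpdateOrDeleteType.Delete
            DeleteDataRow(dbConn, tableName, columns, editedDataRow)
    End Select
Next
```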
First, create an empty XFDynamicGridSaveDataResult: "Dim result As New XFDynamicGridSaveDataResult()". This step is non-negotiable: you must send data back to the grid after the save. Without it, the grid will lose context and paging will break. That is why this line is essential: "result.DataTable = GetDynamicGridData(si, brGlobals, workspace, args)?.DataTable". By calling GetDynamicGridData again, you ensure the save result reflects the latest filters, sort order, and paging logic, keeping the user experience consistent after modifications.

With the empty XFDynamicGridSaveDataResult created and the original data table rehydrated, the next step is to restore paging so the user returns to the correct page after a save. Compute the current page index using VB.NET integer division: "result.PageIndex = (getDataArgs.StartRowIndex \ getDataArgs.PageSize)".

Finally, before returning the result, there is one last piece of logic, a small but important snippet that ties everything together. This step ensures the save result object is fully populated with the data, paging info, and any additional metadata the grid needs to render correctly after modifications.

result.SaveDataTaskResult = New XFDynamicGridSaveDataTaskResult() With { .IsOK = True, .ShowMessageBox = True, .Message = "Save Finished" }

This assigns values to the XFDynamicGridSaveDataTaskResult. You will notice the With block, an idiomatic VB.NET construct that lets you set multiple related properties on the same object without repeating the variable name.
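Pulled together, the save-result steps above form this closing sequence. The statements are the ones quoted in the text; only the layout and the final Return are mine.

```vbnet
' Sketch of the end of the save routine, combining the quoted statements.
Dim result As New XFDynamicGridSaveDataResult()
' Re-read the data so the grid keeps its filters, sort order, and paging.
result.DataTable = GetDynamicGridData(si, brGlobals, workspace, args)?.DataTable
' Integer division (\) turns a row offset into a zero-based page index;
' e.g. a StartRowIndex of 100 with a PageSize of 50 yields page index 2.
result.PageIndex = (getDataArgs.StartRowIndex \ getDataArgs.PageSize)
' Report success back to the user via a message box.
result.SaveDataTaskResult = New XFDynamicGridSaveDataTaskResult() With {
    .IsOK = True,
    .ShowMessageBox = True,
    .Message = "Save Finished"
}
Return result
```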
In other words, instead of writing result.IsOk = True, result.PageIndex = …, and so forth line by line, you can group them neatly inside one block. Now that we have all this code in place, you can finally return the result.

We have covered how to render data and how to save modified rows, but the Dynamic Grid's most used reporting feature is its ability to apply conditional column formatting. This functionality mirrors the familiar options found in the SQL Table Editor, giving you full control over how columns look and behave based on dynamic conditions. Below is a screenshot of the SQL Table Editor column formatting properties.

All these formatting properties are available in the Dynamic Grid, but only through code. You can apply them directly within the GetDynamicGridData function or encapsulate them in a dedicated formatting function for better organization and reuse. In my implementation, I chose the latter approach, creating a separate function to handle conditional column formatting. Inside the GetDynamicGridData function (see screenshot above), I added the following line: "Dim columnDefinitions As List(Of XFDynamicGridColumnDefinition) = Me.GetColumnDefinitions()". This creates a list of XFDynamicGridColumnDefinition objects, which define the properties and formatting rules for each column in the grid. The call to Me.GetColumnDefinitions() retrieves these definitions from a dedicated function, keeping the logic clean and reusable.

As you can see in the screenshot, this entire line of code is commented out; I wanted to highlight all the possible properties. The available properties match one-for-one with the properties that exist inside the SQL Table Editor component. You must initialize the variable named columnDefinition6, and, as previously discussed, the With statement is used here. Each of these properties should function exactly as it does within the SQL Table Editor.
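A dedicated formatting function like the Me.GetColumnDefinitions() call described above could be shaped as follows. The type name XFDynamicGridColumnDefinition is from this article; the individual property names inside the With block are hypothetical stand-ins for the SQL Table Editor-style formatting options, since the article shows them only in a screenshot.

```vbnet
' Sketch of a column-definition helper. The property names in the With
' block are hypothetical placeholders, not confirmed API members.
Private Function GetColumnDefinitions() As List(Of XFDynamicGridColumnDefinition)
    Dim columnDefinitions As New List(Of XFDynamicGridColumnDefinition)
    ' One definition per grid column, using a With block as discussed.
    Dim columnDefinition1 As New XFDynamicGridColumnDefinition() With {
        .ColumnName = "Account",      ' hypothetical property
        .HeaderText = "Account",      ' hypothetical property
        .IsReadOnly = False           ' hypothetical property
    }
    columnDefinitions.Add(columnDefinition1)
    Return columnDefinitions
End Function
```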
Once you have set the properties, you need to add them to the collection of column definitions. This is what you see with "columnDefinitions.Add(columnDefinition#)", which adds the specific column definition to the list of all column definitions. Finally, as with every function, the return clause simply returns the final output.

In conclusion, we recognize that new code-based components can feel intimidating at first. But with the insights from this blog, we hope we have lowered the barrier to entry and given you the confidence to start taking full advantage of this powerful new grid.

InsertDataRow Sub Code Sample:
UpdateDataRow Sub Code Sample:
DeleteDataRow Sub Code Sample:

Eric_Hanson (OneStream Employee), 1 day ago

Add Reminder for Flip Sign Flag on Account Mapping During Load
Our income, liability, and equity accounts need to have the flip sign flag checked, but our users always forget. Is there a way to add a pop-up when they first get to this screen? (After they save, it is too late.)

mgreenberg (Contributor II), 1 day ago

Data Import Step - Load validated records
We are looking to load data into a financial cube. Mappings are already done in the source system. Is there a way to load the records that pass validation and write the other records to an error protocol? The idea is that the whole load would not be rejected; only the records that contain errors would be written to an error protocol. Many thanks for your support.

juergzbinden (New Contributor II), 2 days ago

Cannot reference member expansion 1 from member expansion 2
Hi, I have a cube view with a straightforward requirement. Below is an example of my requirement using the Golfstream app. My rows are set as follows:

Member filter 1: U2#Clubs.ChildrenInclusive:name(UD2: |MFUD2Desc|)
Member filter 2: T#2011Q1.ChildrenInclusive:name(Time: |MFTime|, UD2: |MFUD2Desc|)

The screenshot below shows what this returns. The issue is that I need to reference the member filter 1 member from member filter 2. When I use |MFUD2| from member filter 2, it reverts to my top UD2 member rather than reading my UD2 member from member filter 1. I need this for a more complicated loop through members, and that loop must take the row's UD2 member into account. Any advice or workarounds? User-created parameters and |CVUD2| do not return what I need.

DRider (OneStream Employee), 2 days ago

Tech Talks After Hours: Data Buffer Manipulation
In this After Hours edition of Tech Talks, join Jon Golembiewski and Tom Linton as they dive deep into advanced data buffer manipulation techniques in OneStream, revealing powerful strategies for optimizing performance and reducing scripting complexity. To receive OnePoints for watching this Tech Talk video, please visit the link below: Tech Talks After Hours: Data Buffer Manipulation

agoralewski (Community Manager), 2 days ago

Genesis - No moveable splits in Dashboard Layout
I cannot find any setting for moveable splits in the dashboard layout or advanced dashboard layout. Is it not possible in Genesis? Does it mean I should just create the dashboard in the client first and then link it into Genesis?

[Solved] JennyTan (New Contributor II), 2 days ago

Button Excel and Report in the Cube View Advanced
Good afternoon, I have a question and I can't understand the system's behavior. In Genesis, I created several Advanced Cube Views, but some show the option to open Excel and Report while others don't. I want them to show up in all of them. How can I do that? The first image has what I want, the Excel and Report buttons. The second image does not have the Excel and Report buttons. Thanks :)

josecarlos (New Contributor II), 2 days ago

GetDataBufferUsingFormula
Can someone check if there is any mistake in the lines below? I am unable to execute the BR.

Dim sourceBuff As DataBuffer
Dim srcED As String = args.CustomCalculateArgs.NameValuePairs("sourceEffectiveDate")
Dim selectedScenario = args.CustomCalculateArgs.NameValuePairs("selectedScenario")
Dim dmTime = api.Pov.Time.Name
Dim account = args.CustomCalculateArgs.NameValuePairs("account")
sourceBuff = api.Data.GetDataBufferUsingFormula($"FilterMembers(T#{dmTime}:S#{selectedScenario}:U4#Approved_Status:U6#Total_Audit:U7#{srcED}, [A#{account}.Base])")

jaideepk26 (New Contributor), 2 days ago

Split hierarchy into 2 dimensions to ensure retroactivity
Hi everyone,

We are currently developing a planning solution in OneStream for a client who has requested that the Cost Center dimension (and its related nodes) be updateable every year, while still maintaining full retroactivity when users need to view reports for prior years. We have considered two possible approaches:

OPTION 1
Each year, load a new Cost Center hierarchy associated only with that specific year. When users want to analyze historical data, they simply select the hierarchy corresponding to the year they are reviewing. The main drawback of this approach is that, over time, multiple versions of the Cost Center hierarchy will accumulate, which could potentially impact maintainability or performance in the long term.

OPTION 2
Separate Cost Centers and their nodes into two distinct dimensions. Introduce a technical member in the Account dimension and use a data entry form (maintained annually) where users enter a value of "1" at the intersection of each Cost Center and its corresponding Node. Source data is initially loaded only at the Cost Center level (using a dummy member for the Node dimension). At the end of each cycle, a Business Rule is executed to populate the Node dimension based on this mapping. This approach avoids unnecessary data duplication, but it introduces additional process complexity and reliance on manual maintenance.

Does anyone have experience with similar requirements, alternative approaches, or feedback on this topic? Thank you in advance!

fc (Contributor), 2 days ago

How to read csv file from Documents/users folder?
Hi, I have been trying to read a CSV file that is in the Documents/Users/userxxxx folder, but I am not able to make it work and am getting an error. Below is the code:

Dim FileName As String = args.NameValuePairs.XFGetValue("Param1", String.Empty)
Dim sourceFolder As String = $"Documents/Users/{StringHelper.RemoveSystemCharacters(si.UserName,False,False)}"
Dim fullPath = sourceFolder & "/" & "abc.csv"

Below is the error:

Could not find file 'P:\Program Files\OneStream Software\OneStreamAppRoot\OneStreamApp\Documents\Users\userxxxx\abc.csv'.

Any idea why this is not working? Thank you, Mikki

Mikki (New Contributor III), 3 days ago