loading sub-ledger data into OneStream
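One common workaround for the situation described in this thread is to consolidate the ten granular extracts into a single load file upstream (or drop them into one batch folder), so end users run "Load and Transform" once. Below is a minimal generic sketch of the merge step — this is plain Python, not OneStream API, and the file layout and column names are assumptions:

```python
import csv
import io

def combine_extracts(extract_texts):
    """Merge several delimited sub-ledger extracts that share one header
    row into a single load file, keeping only the first header."""
    header = None
    rows = []
    for text in extract_texts:
        lines = list(csv.reader(io.StringIO(text.strip())))
        if not lines:
            continue
        if header is None:
            header = lines[0]
        rows.extend(lines[1:])
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    writer.writerow(header)
    writer.writerows(rows)
    return out.getvalue()

# Example: two of the ten granular extracts combined into one load file.
ap_extract = "Entity,Account,Amount\nE100,2100,50.25\n"
ar_extract = "Entity,Account,Amount\nE100,1200,75.00\n"
combined = combine_extracts([ap_extract, ar_extract])
```

The same idea also works the other way around: keep the ten extracts as-is and have a scheduled job concatenate them into the single folder the import reads from.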
Hi Community :) We are currently developing the process for loading sub-ledger data into OneStream, but the balance specifications are so granular that we currently split them into about 10 distinct loads. Because of this setup, our end users must manually click "Load and Transform" 10+ times to get a complete sub-ledger balance loaded for the period. Obviously this is tedious, creates a poor user experience, and leaves a lot of room for human error. We want to streamline this. What is the best practice for doing so?

Excel Template upload - Data Source setup
Hi All, I'm trying to set up some Excel templates for uploading Budget and Forecast data. I'm fairly happy with the Excel template side, with the named range (XFD), specific header formats, etc. However, I'm a bit lost on the Data Source setup. I understand that Allow Dynamic Excel Loads must be set to True, but what about the rest of the setup? Do I choose a Delimited or Fixed file? It feels like this Data Source section is really meant for flat files, since it always asks for a column number. I've tried importing the Excel file into the Data Source the same way I would a CSV file, but it just shows up as XML gibberish in the top box. It definitely feels like I'm missing something.

CAMT.053 Files (ISO20022)
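For teams weighing the question in this thread: camt.053 is plain ISO 20022 XML, so even without a native parser the statement balances can be pre-processed into a delimited file before import. A hedged sketch using only Python's standard library — the namespace and element paths follow message version camt.053.001.02, and your bank's version and level of detail may differ:

```python
import xml.etree.ElementTree as ET

NS = {"c": "urn:iso:std:iso:20022:tech:xsd:camt.053.001.02"}

def extract_balances(camt_xml):
    """Pull (balance code, amount, currency, credit/debit indicator)
    tuples from each <Bal> block of a camt.053 statement."""
    root = ET.fromstring(camt_xml)
    result = []
    for bal in root.findall(".//c:Stmt/c:Bal", NS):
        code = bal.findtext("c:Tp/c:CdOrPrtry/c:Cd", namespaces=NS)
        amt = bal.find("c:Amt", NS)
        ind = bal.findtext("c:CdtDbtInd", namespaces=NS)
        result.append((code, float(amt.text), amt.get("Ccy"), ind))
    return result

# Tiny illustrative fragment; real bank files carry far more detail.
sample = """<Document xmlns="urn:iso:std:iso:20022:tech:xsd:camt.053.001.02">
  <BkToCstmrStmt><Stmt>
    <Bal><Tp><CdOrPrtry><Cd>OPBD</Cd></CdOrPrtry></Tp>
         <Amt Ccy="EUR">1000.00</Amt><CdtDbtInd>CRDT</CdtDbtInd></Bal>
    <Bal><Tp><CdOrPrtry><Cd>CLBD</Cd></CdOrPrtry></Tp>
         <Amt Ccy="EUR">1250.50</Amt><CdtDbtInd>CRDT</CdtDbtInd></Bal>
  </Stmt></BkToCstmrStmt>
</Document>"""
balances = extract_balances(sample)
```

OPBD and CLBD are the standard codes for opening and closing booked balances, which is typically what a reconciliation load needs.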
Has anyone imported CAMT.053 files from their bank into OneStream? We are embarking on an Account Reconciliation project and trying to accommodate our international sites. We've been told this is tricky and to stick with the BAI format, but that will be problematic as we integrate more global entities.

Sequence not performing appropriately in WF
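One generic cause of the symptom described in this thread is step ordering: if the allocate-in calculation reads balances before the allocate-out step has fully written them, the allocated-in total comes out as a fraction of the allocated-out total. A toy illustration of that ordering dependency in Python — this is not OneStream sequence code, and all names are invented:

```python
def allocate_out(bal):
    """Step 1: push the full source cost into the allocation-out pool."""
    bal["AllocOut"] += bal["SourceCost"]

def allocate_in(bal):
    """Step 2: distribute whatever is currently in the out pool."""
    bal["AllocIn"] += bal["AllocOut"]

def run(steps):
    """Run allocation steps in the given order against fresh balances."""
    bal = {"SourceCost": 1000.0, "AllocOut": 0.0, "AllocIn": 0.0}
    for step in steps:
        step(bal)
    return bal

correct = run([allocate_out, allocate_in])  # out completes first: totals match
wrong = run([allocate_in, allocate_out])    # in reads a stale (empty) pool
```

If the standalone runs work but the combined sequence does not, comparing the execution order (and whether any step runs before its predecessor's data is committed) is a reasonable first check.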
Hello, I have Product and Customer allocations that I would like to run on the Process step of the Workflow. The odd challenge I am running into is that when I run each step of the sequence on its own, the allocations execute as expected. When I run all of the allocations with one sequence in the Workflow, my allocations produce errant results (e.g., my allocated-in amount is only a small fraction of my allocated-out amount). Is there a property on a sequence or workflow that may cause problems if not handled appropriately? I am calling the sequence with a No Calculate step in the WF, which executes a handful of Finance BR calculations.

How to "Import" "in parallel" via a "OneStream Connector" to a "Data Warehouse"?
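On the general pattern behind this thread: parallel extraction from a warehouse usually means fanning out one query per table or partition and merging the results. A language-agnostic sketch using Python's `concurrent.futures` — the fetch function and table names are placeholders, not a OneStream connector API:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_table(table):
    """Placeholder for one warehouse query; in practice this would run a
    SELECT against the source system and return rows for staging."""
    return [(table, row_num) for row_num in range(3)]

def extract_in_parallel(tables, max_workers=4):
    """Run one extraction per table concurrently and merge the results."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(tables, pool.map(fetch_table, tables)))

staged = extract_in_parallel(["GL_BALANCES", "AP_DETAIL", "AR_DETAIL"])
```

The worker count is the main tuning knob: it should stay below whatever concurrent-query limit the warehouse and the integration tier can absorb.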
Please share your practical advice to help us meet the new integration requirements.

- I couldn't find a OneStream KB article or post on this, nor a good result from free Gemini.
- We're using SaaS on v9.0 and will move to v10 upon its upcoming release.

Thank you folks for sharing your expertise.

Feb Actuals divided in half to import to Jan
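The transformation asked about in this thread is mechanical once the February records are exported: halve each amount and retag the period before re-import. A generic sketch — the column names and period labels are assumptions, and the rounding policy should match your ledger's conventions:

```python
def derive_period(rows, target_period, factor=0.5):
    """Copy source rows to a new period, scaling each amount by factor."""
    return [
        {**row, "Period": target_period,
         "Amount": round(row["Amount"] * factor, 2)}
        for row in rows
    ]

feb_rows = [
    {"Entity": "E100", "Account": "4000", "Period": "Feb", "Amount": 250.00},
    {"Entity": "E100", "Account": "5000", "Period": "Feb", "Amount": -100.50},
]
jan_rows = derive_period(feb_rows, "Jan")
```

One caveat with halving and rounding row by row: the January total may drift from exactly half the February total by a few cents across 1,214 lines, so a plug row or a check against the expected total is worth adding.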
There was no Jan close, so we want to divide Feb actuals by 2 to get a Jan actuals load. The loads come in via SQL from HFM and a database warehouse, so this isn't done with a flat CSV file. When I looked at the Feb import, there are 1,214 lines. How can we take that Feb import, divide it by 2, and import it into Jan Actuals? Our first thought was to create a Quick View and upload it back into OneStream. We also looked at a copy Business Rule, but calculations are needed for the data. Any ideas or thoughts?

Import with Vertical Extensibility or Extensible Dimensions
Hi, I can’t get my data import to work with vertical extensibility, and I’d like to understand how this situation is supposed to be handled. In short, the import validation fails on accounts that are extended for an entity. The entity belongs to a cube that is vertically extended from the primary workflow cube.

Detailed situation: I have extended dimensions and two cubes that use those dimensions. CubeParent contains a parent entity with dimensions applied; the parent entity resides in the parent entity dimension. CubeChild has dimensions that are extended from the parent’s dimensions; the child entity resides in the child entity dimension. As a result, each cube has its own unique entity dimension containing its respective entity.

Additional configuration details:
- CubeChild has Is Top Level Cube For Workflow set to False.
- CubeParent is the top-level cube and has the Workflow Profile created from it.
- The child entity has been added as a child of the parent entity in the parent’s entity dimension (see screenshot).
- CubeParent has the Cube References tab configured correctly.

Data import issue: I have configured a data source that reads from a CSV file. The data source successfully loads the CSV into the staging area; however, the transformation step fails, and only for the child entity’s extended account, not for accounts in the CubeParent account dimension. Transformation rules are associated with a cube when created, and if that cube is not the workflow profile’s cube, the transformation rule does not appear in the import configuration step.

Question: How should data be imported for an entity whose workflow has different dimensional detail than the cube it belongs to because it is using vertical extensibility?

Data Import Step - Load validated records
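On the partial-load pattern asked about in this thread: the generic approach is to validate each record, load only the passing ones, and write the rest to an error protocol with a reason per record. A minimal sketch — the validator and field names are illustrative, not a specific product feature:

```python
def validate(record):
    """Return None if the record is loadable, else an error message."""
    if not record.get("Account"):
        return "missing Account"
    if not isinstance(record.get("Amount"), (int, float)):
        return "Amount is not numeric"
    return None

def partition_load(records):
    """Split records into a loadable set and an error protocol."""
    loadable, protocol = [], []
    for line_no, rec in enumerate(records, start=1):
        error = validate(rec)
        if error is None:
            loadable.append(rec)
        else:
            protocol.append({"line": line_no, "error": error, "record": rec})
    return loadable, protocol

rows = [
    {"Account": "1000", "Amount": 10.0},
    {"Account": "", "Amount": 5.0},
    {"Account": "2000", "Amount": "x"},
]
loadable, protocol = partition_load(rows)
```

Keeping the line number in the protocol makes it easy for the source-system team to trace each rejected record back to the original extract.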
We are looking at a load into a financial cube; mappings are already done in the source system. Is there a way to load the records that pass validation and write the failing records to an error protocol? The idea is that the whole load would not be rejected; only the records that contain errors would be, with those written to an error protocol. Many thanks for your support.

Connect MS Fabric Lakehouse?
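For the connection question in this thread, a common pattern for unattended access to a SQL endpoint is a Microsoft Entra service principal rather than a personal MFA login. Below is a hedged sketch of assembling an ODBC-style connection string; the keyword names follow the Microsoft ODBC Driver for SQL Server, the server name is illustrative, and you should verify the exact `UID` format your driver version expects:

```python
def build_conn_str(server, database, client_id, client_secret):
    """Assemble an ODBC connection string for service-principal auth.
    The secret should come from a vault, never from source code."""
    parts = {
        "Driver": "{ODBC Driver 18 for SQL Server}",
        "Server": server,
        "Database": database,
        "Authentication": "ActiveDirectoryServicePrincipal",
        "UID": client_id,      # application (client) ID of the principal
        "PWD": client_secret,  # client secret retrieved from a vault
        "Encrypt": "yes",
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn_str = build_conn_str(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # illustrative host
    "MyLakehouse",
    "00000000-0000-0000-0000-000000000000",
    "<secret-from-vault>",
)
```

The principal also needs to be granted access on the workspace or endpoint side; the connection string alone is not sufficient.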
Hi all, I'm currently exploring ways to extract data from a Microsoft Fabric Lakehouse into OneStream. The obvious way would be to connect to the SQL endpoint just like any other database. In this case, what would be the best authentication method? I've been testing it with my own user in MS SQL Server Management Studio using MFA, and I have no idea what the best way would be to create credentials for the connection string in OneStream. In case connecting to the SQL endpoint with a connection string isn't a good idea, what would be a better practice? Thanks.

Errors creating New Task in Data Import Schedule Manager
I am getting several errors when setting up a New Task in the Data Import Schedule Manager. We are on OneStream version 9.01.17403, and I have installed the latest version of DSM, which is 8.4.0_SV100. I suspect this may be a version compatibility issue, so I am curious whether anyone has gotten this solution to work in a 9.01 application. I have already uninstalled and reinstalled the solution, which didn't resolve the issues. Below are the two errors I am seeing:

1. When choosing Global Scenario from the Scenario(s) drop-down list, I get an immediate error: "Error processing member. The item was not found. Member, 11111111." The details state it was unable to execute Business Rule 'DSM_Paramhelper', where it appears to be trying to call the Global Scenario via OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.GetParameterDisplayInfosUsingDashboardNameCompressed(SessionInfo si, LoadDashboardInfo loadDashboardInfo, Boolean isForDashboardUIWithInteractiveComponents, Dictionary`2 custSubstVarsAlreadyResolved).

2. If I pick a specific Scenario, I get a different error. It allows me to pick the Scenario and Time, but when I save the Task, I get "The input string '' was not in a correct format." The error details show it is an issue in the Business Rule 'DSM_SolutionHelper', where conversion from string "" to type 'Double' is not valid: OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.StartExecuteSelectionChangedServerTaskCompressed(SessionInfo si, Boolean isSystemLevel, Guid primaryDashboardID, Guid embeddedDashboardID, Guid componentID, PageInstanceInfo pageInstanceInfo, XFSelectionChangedServerTaskInfo serverTaskInfo)

Any advice on how to correct these issues would be greatly appreciated.