File Explorer Incoming workflow folders missing in copied applications
This post shares what we learned about File Explorer Incoming workflow folders. Based on what we've seen in our environments, when an application is copied, the new application does not contain the folder structure located at:

\File Share\Applications\<Application Name>\Incoming\<Cube Root Name>_Cube\<Workflow Base Input Name>\<Workflow Import Name>

We noticed these folders were missing after our last application copy when we tried to copy files through SIC and drop them into the workflow import folders. Our requirement was to drop the files there so users could import them manually when they choose, instead of loading them automatically through batch import from batch harvest.

To fix the issue, we exported the Workflows and re-imported them into the copied application. This triggered OneStream to create the cube root folders and workflow folders for the imported Workflows. Hope this helps someone else running into the same issue.
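If export/re-import isn't convenient and you have direct access to the file share (for example through SIC file operations), the missing tree can also be pre-created by script. A minimal Python sketch of the folder pattern described above — the names in the commented example call are hypothetical:

```python
import os

def ensure_incoming_folders(file_share_root, app_name, cube_root,
                            base_input, import_name):
    """Recreate the Incoming workflow folder tree that a copied
    application may be missing (path pattern from the post)."""
    path = os.path.join(
        file_share_root, "Applications", app_name, "Incoming",
        f"{cube_root}_Cube", base_input, import_name)
    os.makedirs(path, exist_ok=True)  # no-op if the folders already exist
    return path

# Hypothetical names, for illustration only:
# ensure_incoming_folders(r"\\server\FileShare", "MyAppCopy",
#                         "MainCube", "Import", "Import")
```

Loop this over your workflow profiles and the copied application's folders match the original without touching the Workflow definitions.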
Need to send data from OneStream to Snowflake using an existing SIC ODBC connection

Hello, what is the best way to send data from OneStream to Snowflake using an existing SIC ODBC connection? Currently I use the SIC ODBC connection to query data from Snowflake, and now I would like to send data back to Snowflake.
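Inside OneStream, write-back is typically done from an extender business rule that reuses the same SIC connection by name and issues INSERT/MERGE statements. Outside the platform, the same idea looks like this pyodbc-style sketch — the DSN, table, and column names are assumptions, and the builder targets ODBC's `?` placeholder convention:

```python
def build_insert_sql(table, columns):
    """Build a parameterized INSERT statement for an ODBC driver
    that uses '?' placeholders (as pyodbc does)."""
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"

# Hypothetical usage over the existing ODBC connection (pyodbc assumed):
# import pyodbc
# conn = pyodbc.connect("DSN=SnowflakeSIC")        # DSN name is an assumption
# rows = [("Entity1", "2024M1", 100.0), ("Entity2", "2024M1", 250.0)]
# sql = build_insert_sql("ONESTREAM_EXPORT", ["ENTITY", "PERIOD", "AMOUNT"])
# cur = conn.cursor()
# cur.fast_executemany = True                      # bulk-friendly inserts
# cur.executemany(sql, rows)
# conn.commit()
```

Parameterized statements keep the values out of the SQL text, which matters for both safety and Snowflake's statement caching.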
CAMT.053 Files (ISO 20022)

Has anyone imported CAMT.053 files from their bank into OneStream? We are embarking on an Account Reconciliation project and trying to accommodate our international sites. We've been told this is tricky and to stick with the BAI format, but that will be problematic as we integrate more global entities.
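For what it's worth, CAMT.053 is plain ISO 20022 XML, so the balances a reconciliation load needs can be extracted with a standard XML parser before (or inside) a custom data source parser. A simplified sketch against a heavily trimmed camt.053.001.02 fragment — real statements carry far more structure than this:

```python
import xml.etree.ElementTree as ET

NS = {"c": "urn:iso:std:iso:20022:tech:xsd:camt.053.001.02"}

SAMPLE = """<Document xmlns="urn:iso:std:iso:20022:tech:xsd:camt.053.001.02">
 <BkToCstmrStmt><Stmt><Id>STMT-001</Id>
  <Bal><Tp><CdOrPrtry><Cd>OPBD</Cd></CdOrPrtry></Tp>
       <Amt Ccy="EUR">900.00</Amt><CdtDbtInd>CRDT</CdtDbtInd></Bal>
  <Bal><Tp><CdOrPrtry><Cd>CLBD</Cd></CdOrPrtry></Tp>
       <Amt Ccy="EUR">1250.00</Amt><CdtDbtInd>CRDT</CdtDbtInd></Bal>
 </Stmt></BkToCstmrStmt></Document>"""

def closing_balances(camt_xml):
    """Pull closing-booked (CLBD) balances per statement from a
    CAMT.053 document; debit balances are returned as negatives."""
    root = ET.fromstring(camt_xml)
    out = []
    for stmt in root.findall(".//c:Stmt", NS):
        for bal in stmt.findall("c:Bal", NS):
            if bal.findtext("c:Tp/c:CdOrPrtry/c:Cd", namespaces=NS) == "CLBD":
                amt = bal.find("c:Amt", NS)
                sign = 1 if bal.findtext("c:CdtDbtInd",
                                         namespaces=NS) == "CRDT" else -1
                out.append((stmt.findtext("c:Id", namespaces=NS),
                            amt.get("Ccy"), sign * float(amt.text)))
    return out
```

The hard part in practice is usually bank-to-bank variation within the schema, not the parsing itself, which may be what "tricky" referred to.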
Loading sub-ledger data into OneStream

Hi Community :) We are currently developing our sub-ledger data loads into OneStream, but the balance specifications are so granular that we currently split them into about 10 distinct loads. Because of this setup, our end users have to manually click "Load and Transform" 10+ times to get a complete sub-ledger balance loaded for the period. Obviously this is tedious, creates a poor user experience, and leaves a lot of room for human error. We want to streamline this. What is the best practice for doing so?
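A common approach is a single action (a dashboard button or an extender rule) that loops the import steps so users trigger everything once. Shown here as a generic Python sketch rather than OneStream API calls — the step callables stand in for whatever executes each Load and Transform:

```python
def run_all_loads(load_steps, stop_on_error=True):
    """Run a list of (name, callable) load steps one after another so
    users click once instead of ten times. Collects per-step status."""
    results = {}
    for name, step in load_steps:
        try:
            results[name] = ("ok", step())
        except Exception as exc:
            # Record the failure; either halt or continue per policy.
            results[name] = ("error", str(exc))
            if stop_on_error:
                break
    return results
```

The same shape works whether the steps are sibling workflow imports or a Data Management sequence; the key design choice is whether one failed load should block the rest or be reported at the end.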
Excel Template upload - Data Source setup

Hi All, I'm trying to set up some Excel templates for uploading Budget and Forecast data. I'm pretty happy with the Excel template side, with the named range (XFD), specific header formats, etc. However, I'm a bit lost on the Data Source setup. I understand that Allow Dynamic Excel Loads must be set to True, but what about the rest of the setup? Do I choose a Delimited or Fixed file? It feels like the Data Source section is really meant for flat files, since it always asks for a column number. I've tried importing the Excel file into the Data Source the same way I would a CSV, but it just shows up as XML gibberish in the top box. It definitely feels like I'm missing something.
Sequence not performing appropriately in WF

Hello, I have Product and Customer allocations that I would like to run on the Process step of the Workflow. The odd challenge is that when I run each step of the sequence on its own, the allocations execute as expected. When I run all of the allocations with one sequence in the Workflow, the allocations produce errant results (e.g., my allocated-in amount is only a small fraction of my allocated-out amount). Is there a property I am missing on a sequence or workflow that can cause problems if not handled appropriately? I am calling the sequence with a No Calculate in the WF, which executes a handful of Finance BR calcs.
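One thing worth ruling out is a data dependency between the steps: if the allocation-in calc reads intersections that the allocation-out calc has not finished writing when both run in one pass, the in-total only sees part of the out-total. A toy illustration of that dependency (plain Python, not OneStream code):

```python
def allocate_out(pool, source_amt, drivers):
    """Step 1: distribute the source amount into the pool by driver weight."""
    total_weight = sum(drivers.values())
    for key, weight in drivers.items():
        pool[key] = source_amt * weight / total_weight
    return source_amt

def allocate_in(pool):
    """Step 2: depends entirely on step 1's output being complete."""
    return sum(pool.values())

# Correct order: step 1 fully written, then step 2 reads it.
pool = {}
out_total = allocate_out(pool, 1000.0, {"ProdA": 1, "ProdB": 3})
in_total = allocate_in(pool)  # matches out_total only because step 1 finished
```

If step 2 ran against a half-populated pool it would return a fraction of the out-total, which matches the symptom described; checking that each calc in the sequence completes (and that both steps run against the same data unit and consolidation state) would be my first test.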
How to "Import" "in parallel" via a "OneStream Connector" to a "Data Warehouse"?

Please share your practical advice to help us meet the new integration requirements.
- I couldn't find a OneStream KB article or post on this, nor a good result from free Gemini.
- We're using SaaS on v9.0 and will move to v10 upon its upcoming release.
Thank you folks for sharing your expertise.
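One pattern that comes up for this is partitioning the warehouse query (by entity, period, or ledger) and fanning the partitions out concurrently, then handing the combined rows to a single load. A generic Python sketch of the fan-out; the `query_runner` callable is a placeholder for whatever executes your actual connector/ODBC query:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_import(query_runner, partition_keys, max_workers=4):
    """Run one source query per partition key concurrently, then
    flatten the per-partition row lists into one result set."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        chunks = list(pool.map(query_runner, partition_keys))
    return [row for chunk in chunks for row in chunk]
```

Threads suit this because the work is I/O-bound (waiting on the warehouse); the warehouse itself also parallelizes each partition's scan, so fewer, larger partitions often beat many tiny ones.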
Feb Actuals divided in half to import to Jan

There was no Jan close, so we want to divide Feb actuals by 2 to get a Jan actuals load. The loads are SQL from HFM and the Database Whse, so this isn't done with a flat file CSV. When I looked at the Feb import, there are 1,214 lines. How can we take that Feb import, divide it by 2, and import it into Jan Actuals? My first thought was to create a QuickView and upload it back into OneStream. I also looked at a copy Business Rule, but calculations are needed for the data. Any ideas/thoughts?
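Since the loads are already SQL-based, the cleanest option may be a Jan-specific source query that selects `Amount / 2` and tags the rows to Jan. The same transform as a Python sketch over staged rows — the column positions and time labels (`"2024M2"`/`"2024M1"`) are hypothetical placeholders for whatever your rows actually carry:

```python
def feb_to_jan_rows(feb_rows, amount_idx, time_idx,
                    feb_label="2024M2", jan_label="2024M1"):
    """Copy February staging rows into January with each amount halved.
    Rows not tagged to February pass through unchanged."""
    out = []
    for row in feb_rows:
        new = list(row)
        if row[time_idx] == feb_label:
            new[amount_idx] = round(row[amount_idx] / 2.0, 2)
            new[time_idx] = jan_label
        out.append(tuple(new))
    return out
```

One caveat: rounding each row independently means the Jan and Feb halves can differ from the Feb total by a cent per row, so decide whether that tolerance is acceptable or whether a plug row is needed.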
Import with Vertical Extensibility or Extensible Dimensions

Hi, I can’t get my data import to work with vertical extensibility, and I’d like to understand how this situation is supposed to be handled. In short, the import validation fails on accounts that are extended for an entity. The entity belongs to a cube that is vertically extended from the primary workflow cube.

Detailed situation
I have extended dimensions and two cubes that use those dimensions. CubeParent contains a parent entity with dimensions applied; the parent entity resides in the parent entity dimension. CubeChild has dimensions that are extended from the parent’s dimensions; the child entity resides in the child entity dimension. As a result, each cube has its own unique entity dimension containing its respective entity.

Additional configuration details:
- CubeChild has Is Top Level Cube For Workflow set to False.
- CubeParent is the top-level cube and has the Workflow Profile created from it.
- The child entity has been added as a child of the parent entity in the parent’s entity dimension (see screenshot).
- CubeParent has the Cube References tab configured correctly.

Data import issue
I have configured a data source that reads from a CSV file. The data source successfully loads the CSV into the staging area. However, the transformation step fails, and only for the child entity’s extended accounts, not for accounts in the CubeParent account dimension. Transformation rules are associated with a cube when they are created; if that cube is not the workflow profile's cube, the rule does not appear in the import configuration step.

Question
How should data be imported for an entity whose workflow has different dimensional detail than the cube it belongs to because it is using vertical extensibility?
Data Import Step - Load validated records

We are loading data into a financial cube. Mappings are already done in the source system. Is there a way to load the records that pass validation and write the failing records to an error protocol? The idea is that the whole load is not rejected; only the records that contain errors are, and those get written to an error protocol. Many thanks for your support.
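By default an import with validation errors blocks the whole load, so this usually needs a pre-stage step: validate the rows yourself (for example in a connector or data source parser rule), pass only the clean rows forward, and write the rest to an error log or table. A generic Python sketch of the split — the validator list in the test is illustrative:

```python
def split_for_load(records, validators):
    """Partition records into loadable rows and an error protocol,
    so one bad row does not reject the whole file.

    validators: list of (check_fn, message) pairs; a record fails
    when any check_fn returns False.
    """
    good, errors = [], []
    for line_no, rec in enumerate(records, start=1):
        problems = [msg for check, msg in validators if not check(rec)]
        if problems:
            errors.append({"line": line_no, "record": rec, "errors": problems})
        else:
            good.append(rec)
    return good, errors
```

Keeping the line number and the reasons in each error entry makes the protocol actionable for whoever fixes the source data, and the clean list can feed the normal import unchanged.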