Best practice for filtering vStageSourceAndTargetDataWithAttributes by Workflow
Hi everyone, I am building a query against the vStageSourceAndTargetDataWithAttributes view and want to make sure I am retrieving the latest figures for a specific Entity and Time period. I have a few questions about the best way to filter this:

Filtering by Wfk: I know that hardcoding a GUID for the Wfk (Workflow Key) is risky because it can change between environments or if a profile is reset. What is the recommended way to dynamically filter for a specific Workflow Profile? Should I join on the WorkflowProfileHierarchy table/view and filter by ProfileName?

Latest figures: If multiple imports have occurred, does filtering by a specific Wfk automatically ensure I am seeing the "latest" iteration of the data, or is there a better field to use for partitioning/ranking the results?

Any code snippets or best practices for making these stage queries robust would be highly appreciated.
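Not an official answer, but the usual pattern is to resolve the workflow key by profile name at query time (no hardcoded GUID) and then keep only the newest row per key with a window function. The table and column names below (WorkflowProfile, ProfileName, LoadTime, and so on) are simplified stand-ins for the real OneStream stage schema, so treat this as a sketch of the technique, not the exact SQL; here it is runnable against SQLite:

```python
import sqlite3

# Hypothetical mini-schema standing in for the OneStream stage tables:
# resolve the workflow key by ProfileName instead of hardcoding a GUID,
# then keep only the most recent row per Entity/Account via ROW_NUMBER().
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE WorkflowProfile (Wfk TEXT, ProfileName TEXT);
CREATE TABLE StageData (Wfk TEXT, Entity TEXT, Account TEXT,
                        Amount REAL, LoadTime TEXT);
INSERT INTO WorkflowProfile VALUES ('guid-1', 'Houston.Import');
INSERT INTO StageData VALUES
  ('guid-1', 'E100', 'Cash', 50.0, '2024-02-01'),   -- stale import
  ('guid-1', 'E100', 'Cash', 75.0, '2024-02-05'),   -- latest import
  ('guid-2', 'E100', 'Cash', 99.0, '2024-02-06');   -- different workflow
""")

rows = con.execute("""
SELECT Entity, Account, Amount FROM (
  SELECT s.Entity, s.Account, s.Amount,
         ROW_NUMBER() OVER (PARTITION BY s.Entity, s.Account
                            ORDER BY s.LoadTime DESC) AS rn
  FROM StageData s
  JOIN WorkflowProfile w ON w.Wfk = s.Wfk     -- no hardcoded GUID
  WHERE w.ProfileName = 'Houston.Import'
) WHERE rn = 1
""").fetchall()
print(rows)  # only the latest row for the named profile survives
```

Filtering by Wfk alone narrows you to one workflow but does not by itself deduplicate repeated imports, which is why the PARTITION BY / ORDER BY ... DESC step matters.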
CAMT.053 Files (ISO20022)

Has anyone imported CAMT.053 files from their bank into OneStream? We are embarking on an Account Reconciliation project and trying to accommodate our international sites. We've been told this is tricky and that we should stick with the BAI format, but that will be problematic as we integrate more global entities.
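For what it's worth, camt.053 is plain ISO 20022 XML, so a custom parsing step (for example inside a business rule) is quite tractable even without a packaged solution. A minimal sketch in Python, assuming the camt.053.001.02 namespace and a heavily simplified statement; real files carry far more structure (entry details, charges, batched entries) that would need handling:

```python
import xml.etree.ElementTree as ET

# Namespace for camt.053 version 001.02; other versions differ only in the URN.
NS = {"c": "urn:iso:std:iso:20022:tech:xsd:camt.053.001.02"}

SAMPLE = """<?xml version="1.0"?>
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:camt.053.001.02">
  <BkToCstmrStmt><Stmt>
    <Acct><Id><IBAN>DE89370400440532013000</IBAN></Id></Acct>
    <Ntry><Amt Ccy="EUR">1500.00</Amt><CdtDbtInd>CRDT</CdtDbtInd></Ntry>
    <Ntry><Amt Ccy="EUR">200.00</Amt><CdtDbtInd>DBIT</CdtDbtInd></Ntry>
  </Stmt></BkToCstmrStmt>
</Document>"""

def parse_camt053(xml_text):
    """Return one (iban, amount, currency) row per statement entry.
    Debits are flipped negative so the rows load like signed balances."""
    root = ET.fromstring(xml_text)
    rows = []
    for stmt in root.findall(".//c:Stmt", NS):
        iban = stmt.findtext(".//c:Acct/c:Id/c:IBAN", default="", namespaces=NS)
        for ntry in stmt.findall("c:Ntry", NS):
            amt = ntry.find("c:Amt", NS)
            sign = -1.0 if ntry.findtext("c:CdtDbtInd", namespaces=NS) == "DBIT" else 1.0
            rows.append((iban, sign * float(amt.text), amt.get("Ccy")))
    return rows

print(parse_camt053(SAMPLE))
```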
Europe SWIFT File

Hello, I have a SWIFT bank file from a client based in Europe that I need to import. I thought it would be similar to a BAI file and that I could use the OS BAI Parser solution to grab the necessary information, but the format isn't as similar as I thought. Is there any other tool for parsing a SWIFT file, or will this require some custom logic?
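I'm not aware of a packaged parser for SWIFT statement files the way the BAI one exists, so custom logic is the usual route; fortunately SWIFT MT940 is a simple tagged text format. A hedged Python sketch of that custom logic (the regex covers only the common shape of the :61: statement line: value date, optional entry date, debit/credit mark, amount with a comma decimal — real files need more cases):

```python
import re

SAMPLE = """:20:STMT-001
:25:DE89370400440532013000
:61:2401150115C1234,56NTRFREF001
:86:Customer payment
:61:2401160116D200,00NTRFREF002
:86:Bank fee
:62F:C240116EUR1034,56"""

# :61: = value date (YYMMDD), optional entry date (MMDD), D/C mark,
#        amount with a comma as decimal separator, then type/reference.
LINE_61 = re.compile(r"^:61:(\d{6})(\d{4})?(C|D)(\d+,\d*)")

def parse_mt940(text):
    """Return (value_date, signed_amount) per :61: statement line."""
    txns = []
    for line in text.splitlines():
        m = LINE_61.match(line)
        if m:
            date, _entry_date, mark, amount = m.groups()
            value = float(amount.replace(",", "."))
            txns.append((date, -value if mark == "D" else value))
    return txns

print(parse_mt940(SAMPLE))
```

The :86: narrative lines that follow each :61: usually carry the counterparty/description detail, so a fuller parser would pair them with the preceding transaction.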
Delimited file Import and normalization

Hi everyone, I need to set up a few transformations to normalize a delimited file, uploaded by the users, before feeding it to the data source. I previously worked with FTP folders and used a Connector BR to retrieve and normalize the data before the data source step. I have never worked with delimited files loaded directly by clicking Import. It isn't clear to me whether I can still use a Connector BR or whether I need something else (a Parser BR?). Can anyone help me understand the best practice in this case? Thank you for the support!
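Whatever hook ends up being right on the OneStream side (Connector BRs generally apply when the rule fetches the data itself; user-initiated file imports are typically shaped with the data source's own parsing logic instead — worth confirming with others here), the normalization itself is just row-wise transformation. A generic sketch of the kind of pre-processing involved, with the column names and rules invented purely for illustration:

```python
import csv, io

RAW = """entity;account;amount
 e100 ;CASH;1.234,50
e200;ar ;-10,00
"""

def normalize(delimited_text):
    """Trim fields, upper-case member names, and convert European decimals
    so every row lands in a consistent shape for the data source."""
    reader = csv.DictReader(io.StringIO(delimited_text), delimiter=";")
    rows = []
    for row in reader:
        rows.append({
            "entity": row["entity"].strip().upper(),
            "account": row["account"].strip().upper(),
            # 1.234,50 (thousands dot, decimal comma) -> 1234.50
            "amount": float(row["amount"].strip().replace(".", "").replace(",", ".")),
        })
    return rows

print(normalize(RAW))
```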
Loading sub-ledger data into OneStream

Hi Community :) We are currently in development on loading sub-ledger data into OneStream, but the balance specifications are so granular that we currently have them split into about 10 distinct loads. Because of this setup, our end users are forced to manually click "Load and Transform" 10+ times to get a complete sub-ledger balance loaded for the period. Obviously this is tedious, creates a poor user experience, and leaves a lot of room for human error. We want to streamline this. What is the best practice for doing so?
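One common pattern (hedged — the right answer inside OneStream may instead be an automated batch/Data Management sequence, which others can confirm) is to merge the granular extracts into a single file before the load, so users click "Load and Transform" once. The merging step itself is trivial; a sketch, with the file names invented for illustration:

```python
import tempfile
from pathlib import Path

def combine_extracts(folder, pattern="subledger_*.csv"):
    """Concatenate every sub-ledger extract in `folder` into one load file,
    keeping the header row from the first file only."""
    parts = sorted(Path(folder).glob(pattern))
    combined = []
    for i, part in enumerate(parts):
        lines = part.read_text().splitlines()
        combined.extend(lines if i == 0 else lines[1:])  # skip repeated headers
    return "\n".join(combined)

# Demo with two throwaway extracts in a temp folder.
tmp = tempfile.mkdtemp()
Path(tmp, "subledger_ap.csv").write_text("entity,amount\nE100,10\n")
Path(tmp, "subledger_ar.csv").write_text("entity,amount\nE100,20\n")
print(combine_extracts(tmp))
```

The same idea works upstream too: if the 10 extracts come from a database, a UNION of the 10 queries in one connector removes the manual clicking entirely.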
How to "Import" "in parallel" via a "OneStream Connector" to a "Data Warehouse"?

Please share your practical advice to help us meet the new integration requirements.
- I couldn't find a OneStream KB article or post on this, nor a good result from free Gemini.
- We're using SaaS on v9.0 and will move to v10 upon its upcoming release.
Thank you folks for sharing your expertise.
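Hedged suggestion while waiting for authoritative answers: connector parallelism usually comes from partitioning the warehouse query (by entity, period, or ledger) and fanning the partitions out over worker threads, since the work is I/O bound. A generic sketch of that fan-out pattern (the in-memory "warehouse" dict stands in for per-partition SQL queries):

```python
from concurrent.futures import ThreadPoolExecutor

# Pretend warehouse: in a real connector each partition would be one
# SQL query (e.g. WHERE Entity = ?) run against the warehouse.
WAREHOUSE = {
    "E100": [("E100", "Cash", 10.0)],
    "E200": [("E200", "Cash", 20.0)],
    "E300": [("E300", "Cash", 30.0)],
}

def fetch_partition(entity):
    """Stand-in for one partitioned warehouse query (network/IO bound)."""
    return WAREHOUSE[entity]

def parallel_import(entities, workers=3):
    """Fan the partitioned queries out over a thread pool and
    flatten the results into one load set (map preserves order)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = pool.map(fetch_partition, entities)
    return [row for chunk in chunks for row in chunk]

rows = parallel_import(["E100", "E200", "E300"])
print(rows)
```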
Feb Actuals divided in half to import to Jan

There was no Jan close, so we want to divide Feb actuals by 2 to get a Jan actuals load. The loads come via SQL from HFM and the Database Warehouse, so this isn't done with a flat CSV file. When I looked at the Feb import, there are 1,214 lines. How can we take that Feb import, divide it by 2, and import it into Jan Actuals? My first thought was to create a Quick View and upload it back into OneStream. I also looked at a copy Business Rule, but calculations are needed for the data. Any ideas/thoughts?
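Since the load already comes from SQL, one option (a sketch, not the only approach — a halving expression in the SQL itself or in the transformation step would do the same job) is to transform the February rows in the extraction step: halve every amount and retag the Time member. The column layout below is invented for illustration:

```python
def feb_to_jan(feb_rows):
    """Take February stage rows (entity, account, time, amount) and emit a
    January set with every amount halved and the Time member retagged."""
    jan_rows = []
    for entity, account, time, amount in feb_rows:
        jan_rows.append((entity, account,
                         time.replace("M2", "M1"),   # 2024M2 -> 2024M1
                         round(amount / 2, 2)))
    return jan_rows

feb = [("E100", "Cash", "2024M2", 100.0),
       ("E100", "AR",   "2024M2", 31.50)]
print(feb_to_jan(feb))
```

The same halve-and-retag logic works regardless of whether it runs in the connector, in a derivative transformation, or in a one-off script over the 1,214 extracted lines.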
Including Journal Entries from Different Origin Member in Account Reconciliation Discovery

Hi Community, I'm working on an Account Reconciliation solution where I need to ensure that posted journal entry balances are also included in the Account Reconciliation discovery process. Currently, by design, the Discovery process only pulls trial balance information already imported and validated in the Stage. I wanted to check whether there is a recommended or best-practice approach to accomplish this requirement. Any guidance or examples would be greatly appreciated; thanks in advance for your help!
No valid DataKeys (Scenario / Time) found in data source.

Hello everyone, I'm trying to create my first import from scratch. Unfortunately, it always gives me the same error:

"No valid DataKeys (Scenario / Time) found in data source. Review the source data load processing log and check one-to-one transformation rules to ensure that you have created proper Scenario and Time dimension rules for this data source."

No lines are imported or displayed in the logs. However, I've gone through every parameter and everything seems correct and consistent with what I have in other import files. Could you help me investigate?

My settings: UD1 through UD8 are all set to None.
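For what it's worth, the error usually means exactly what it says: every source row must resolve both a Scenario and a Time member through the one-to-one transformation rules, and if no row resolves both, zero DataKeys exist and nothing imports. Conceptually the check behaves like this sketch (the rule sets are invented for illustration; the real ones live in your transformation rule groups):

```python
# Hypothetical one-to-one rule sets: source value -> target member.
SCENARIO_RULES = {"ACT": "Actual"}
TIME_RULES = {"2024-01": "2024M1"}

def datakeys(rows):
    """Collect the distinct (Scenario, Time) keys that transform cleanly;
    rows whose Scenario or Time has no matching rule yield no DataKey."""
    keys = set()
    for src_scenario, src_time, _amount in rows:
        scenario = SCENARIO_RULES.get(src_scenario)
        time = TIME_RULES.get(src_time)
        if scenario and time:
            keys.add((scenario, time))
    return keys

good = [("ACT", "2024-01", 10.0)]
bad  = [("ACT", "01/2024", 10.0)]   # time format has no matching rule
print(datakeys(good))
print(datakeys(bad))                # empty -> "No valid DataKeys" situation
```

So a mismatch between the Time format in the file and the Time rule (or a Scenario/Time field not mapped at all in the data source definition) is the usual culprit to check first.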
EncryptText And DecryptText Replacement

Does anyone know what replaces the two utility commands below? I get a warning message that these commands are obsolete, but it gives no indication or hint of what the new commands are or what's currently available.

BrApi.Utilities.EncryptText and BrApi.Utilities.DecryptText

Any help is highly appreciated. Thanks, Jun