Recent Discussions
Use 2 fields to determine dimension value in datasource
Hi all, I need to set up a complex expression that assigns a specific value to the U6 dimension based on two source fields. Below is part of the list of those source fields. The code I set up for the complex expression is the following:

    Dim fields As List(Of String) = api.Parser.DelimitedParsedValues()
    BRApi.ErrorLog.LogMessage(si, "Field count (List.Count): " & fields.Count)
    If fields.Count > 4 Then
        Dim loadedAccount As String = fields(4)
        BRApi.ErrorLog.LogMessage(si, "loadedAccount: '" & loadedAccount & "'")
        If Left(loadedAccount, 2) = "00" Then
            Return Right(loadedAccount, 8)
        End If
    Else
        BRApi.ErrorLog.LogMessage(si, "WARNING: only " & fields.Count & " fields; skipping Account logic")
    End If

Every time I launch the loading process, however, it prints the WARNING message for every row, as if no source fields were found. Am I doing something wrong, or is there an alternative way to retrieve the field values that I am not considering? Thank you for the help!
fc · 56 minutes ago · Contributor · 11 Views · 0 likes · 2 Comments
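A minimal diagnostic variant of that expression, using only the calls already shown in the post (api.Parser.DelimitedParsedValues and BRApi.ErrorLog.LogMessage), is sketched below. The idea is to log the whole parsed row once, so the error log shows whether the delimiter is actually splitting each line into the fields you expect, and to always return a value so non-matching rows are not left undefined. Treat it as a sketch under those assumptions, not a confirmed fix.

    ' Sketch only: assumes the usual si / api variables of a data source complex
    ' expression, as in the original post.
    Dim fields As List(Of String) = api.Parser.DelimitedParsedValues()

    ' Log the entire parsed row so the delimiter behaviour is visible in the error log.
    BRApi.ErrorLog.LogMessage(si, "Parsed row (" & fields.Count & " fields): " & String.Join("|", fields))

    If fields.Count > 4 Then
        Dim loadedAccount As String = fields(4).Trim()
        If Left(loadedAccount, 2) = "00" Then
            Return Right(loadedAccount, 8)
        End If
    End If

    ' Fall through for rows that do not match the condition.
    Return String.Empty

Whether an empty string is the right fallback depends on what the U6 field should receive for rows that do not match; returning the original field value instead may be more appropriate.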
Automating Data Access from OneStream to Snowflake: Exporting Financial and Master Data
How can the OneStream system be configured to push transaction and master data (dimensions) to Snowflake without manual intervention? I also need step-by-step information on how to pull data from Snowflake into OneStream.
Sibi_S · 4 days ago · New Contributor · 361 Views · 0 likes · 4 Comments
Journal creation through Excel Addin
Hi everyone, I need to create a few journal templates so that they can be filled out in Excel and subsequently loaded into OneStream. I found screenshots online of some journal templates in Excel, like the following one, but it's not clear to me how to create it. Is there a way to export from OneStream the already precompiled structure of a journal to Excel, or do I need to create it manually in Excel? Any suggestion would help!
Solved · fc · 4 days ago · Contributor · 124 Views · 0 likes · 5 Comments
OneStream and D365 Integration
Hi everyone, I'm currently working on integrating OneStream with Dynamics 365 (D365) and would love to hear about your experiences. Specifically, I'm interested in knowing which connector you used for the integration. Did you opt for OData, Synapse Link, or another method? Any insights, tips, or recommendations would be greatly appreciated! Thanks in advance!
KristenB27501 · 6 days ago · New Contributor · 38 Views · 0 likes · 1 Comment
Matrix Data Load Basics
I have a matrix data load question. I get the concept of how the matrix data load works, but I haven't done many of them, so I want to make sure I'm not missing the basics. The file I have has the year on one of the first lines, in column 9. The next row contains the headers, with periods in 12 separate columns. Is there an easy way to get the year from the first line (without a lot of code), or should I get the year based on the POV? Then I am guessing a complex expression in the time columns to build the OS time key? I know I can do all of this in code, but I don't want to skip over any out-of-the-box functionality or best practices. Any good guides on the best approaches to the matrix load? Thanks, Scott
Solved · 995 Views · 0 likes · 4 Comments
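On the "complex expression in the time columns" idea from the post above, a rough sketch of assembling the time key is shown below. It assumes the target time members follow the usual year-plus-period naming (e.g. 2024M3) and that the period column headers are month names such as Jan through Dec; both are assumptions about this particular file and application, and the helper is shown as a standalone function rather than wired into a specific rule.

    ' Sketch only: builds a time key such as "2024M3" from a year value and a period
    ' column header. The Jan..Dec header names are an assumption about the source file.
    Private Function BuildTimeKey(ByVal yearText As String, ByVal periodHeader As String) As String
        Dim months As New List(Of String) From {"Jan", "Feb", "Mar", "Apr", "May", "Jun",
                                                "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
        Dim monthNum As Integer = months.IndexOf(periodHeader.Trim()) + 1
        If monthNum = 0 Then Return String.Empty   ' header not recognised
        Return yearText.Trim() & "M" & monthNum.ToString()
    End Function

The year argument could come either from the header row parsed out of the file or from the workflow POV, which the post already mentions as an option.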
External Connection for OS App in Same Environment?
Hi there everyone, has anyone ever created a connection between apps in the same OS environment? I'm working on some data movement and have the connector set up, but I'm struggling a bit with the connection string for the external database connection I'm setting up to reference the app I'm pulling data from. Is it the same syntax as the BI Blend connection, or are there other keywords that need to be in there? Thanks!
M_L_Spencer · 8 days ago · New Contributor II · 42 Views · 0 likes · 1 Comment
No valid DataKeys (Scenario / Time) found in data source
Originally posted by Krishna Srinivasan
Hi - I am loading a simple CSV file into my cube.
- Created the Data Source; the Scenario and Time dimensions are set to Current Key.
- Created the Transformation rules.
- Created the Workflow profiles.
- Started the import process and got the error below:
Summary: No valid DataKeys (Scenario / Time) found in data source. Review the source data load processing log and check one-to-one transformation rules to ensure that you have created proper Scenario and Time dimension rules for this data source.
Has anyone encountered this issue? It is so strange, even though I am not passing any values in the Data Source.
Solved · OSAdmin · 11 days ago · Valued Contributor II · 5.9K Views · 1 like · 7 Comments
Scheduling an extract of metadata
I'm being asked to schedule an extract of the OneStream metadata. I'm under the impression that this cannot be done using the Task Scheduler, since a metadata extract is not an option in Data Management. Is that an accurate statement? If it can be scheduled, how? Thanks!
Solved · ChrisFeller · 11 days ago · New Contributor · 2K Views · 0 likes · 7 Comments
SIC w/ v9
Does anyone know if the Smart Integration Connector is FedRAMP compliant with OneStream v9? Thanks.
Dwight · 12 days ago · New Contributor · 17 Views · 0 likes · 1 Comment