Form Update Via Excel is Appending, not Replacing
We have a form in our workflow for budgeting. Users have the option of interacting directly with the form or using a spreadsheet template we provided to update their values. There are pre-seeded values in the cube, and users can then make adjustments to the amounts. When users update via the form, values are replaced as we would want. However, when users update via our provided Excel file, the amounts are appended to the original values. We can't locate where this behavior is triggered on upload to the form via Excel. Is there something in the VB we need to adjust, or is this standard behavior? We'd like to let our users simply replace the values. Let me know if you need any additional clarification. Regards, DK
Matrix Data Load

We are looking to load data from a year-to-date (YTD) consolidated statement of subsidiaries. The statement is in matrix form, with accounts (both balance sheet and profit & loss) in rows and companies in columns. Each company has two sets of columns: one for financial results and one for eliminations. How can we set up the data source to read multiple companies' statements from one file? We would like to load eliminations to a separate origin (Elimination). How can we change the destination for the Elimination columns? Also, how do we set up a bypass row so that the load ignores header and total rows? I noticed those rows can be identified by the Account column: if the account is blank, then bypass.
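Outside of OneStream, the row and column logic described above can be sketched in plain Python. This is an illustration of the parsing idea only, not OneStream data source code; the file layout, company names, and the "Import"/"Elimination" origin labels are all hypothetical:

```python
import csv
import io

# Hypothetical matrix file: an Account column, then a Results column
# and an Eliminations column for each company.
MATRIX_FILE = """Account,CompA Results,CompA Elim,CompB Results,CompB Elim
,Subsidiary statements,,,
Sales,100,-10,200,-20
,Total,,,
COGS,60,-5,120,-15
"""

def parse_matrix(text):
    rows = list(csv.reader(io.StringIO(text)))
    header = rows[0]
    records = []
    for row in rows[1:]:
        account = row[0].strip()
        if not account:  # bypass: blank Account marks a header or total row
            continue
        # Columns alternate Results / Eliminations per company.
        for col in range(1, len(header), 2):
            company = header[col].rsplit(" ", 1)[0]
            records.append((account, company, "Import", float(row[col])))
            records.append((account, company, "Elimination", float(row[col + 1])))
    return records
```

The bypass rule is the `if not account` test: any row with a blank Account column (headers, totals) is skipped, and the alternating Results/Eliminations columns are routed to different origin labels.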
Data source Complex Expression in Logical Operator

I have two data source imports that I want to use the same mapping tables. The first workflow is an actual monthly GL load. The data source is a comma-delimited file and uses two different columns to determine the source value for the UD1 dimension. The second data source will use an Excel matrix to load budget data for multiple periods. There is logic in the first data source to use the Account source value instead of the UD1 source column as defined by position, based on the value in the Account source field. See the code below:

    Dim Accountcol As String = args.Value
    'Reminder: index (2) is the 3rd column in Excel - indexing starts at 0
    Dim CostCentercol As String = api.Parser.DelimitedParsedValues(6)
    'Identify P&L accounts (greater than 39999)
    If CInt(Accountcol) > 39999 Then
        'Need to bring in the Cost Center
        Return CostCentercol
    Else
        'Use Account as the UD2 source
        Return Accountcol
    End If

My question is: how do I create a data source using an Excel matrix load that refers to the columns defined with A# and UD2#? Is there a method, similar to api.Parser.DelimitedParsedValues(6), that I can code into the matrix-based data source to test the source value coming from the Excel column?
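As a side note on the conditional itself, the account test should be numeric rather than textual. A standalone Python sketch (not OneStream API code; the values are made up) shows the routing logic and why the conversion matters:

```python
def ud_source(account: str, cost_center: str) -> str:
    """Route the UD source column: P&L accounts (above 39999) use the
    cost-center value, everything else uses the account itself."""
    if int(account) > 39999:  # numeric comparison, not string comparison
        return cost_center
    return account
```

With a string comparison, "4000" would sort above "39999" and be misclassified as a P&L account; converting to a number first (CInt in VB) avoids that.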
"Detail Cache summarization did not produce any Summary Data Rows" error

Hi, I am working on an Import/Validate/Load step and built a Connector which gets data from an application table, and a Data Source which uses that Connector. The Connector and Data Source seem to work, in the sense that data gets pulled in during the import step. However, when the import step is run I receive the following error: "Summary: Detail Cache summarization did not produce any Summary Data Rows." Does anyone know what could cause the issue and what the error message means? Thank you
Including Journal Entries from Different Origin Member in Account Reconciliation Discovery

Hi Community, I'm working on an Account Reconciliation solution where I need to ensure posted journal entry balances are also included in the Account Reconciliation discovery process. Currently, by design, the Discovery process only pulls trial balance information already imported and validated in the Stage. I wanted to check if there's a recommended or best-practice approach to accomplish this requirement. Any guidance or examples would be greatly appreciated. Thanks in advance for your help!
Delimited data source with skipped rows

Hi, we have a delimited data source implemented in the application. In the data source we have two additional rows on top and one empty row. The file works fine, but now we want to retrieve information ("Chicago") from skipped row number 2. It looks like OneStream reads data starting from row number 5. Could you please help me understand why these first rows were skipped? Thank you in advance.
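For what it's worth, the behavior described above can be mimicked outside OneStream in a few lines of Python. The file content and function name below are made up for illustration; in OneStream the number of skipped rows is governed by the data source's header-row settings, not by code like this:

```python
import csv
import io

# Hypothetical file: two title rows and one empty row precede the
# column headers, so data only starts on row 5.
DELIMITED_FILE = """Entity Report,,
Chicago,,
,,
Account,Description,Amount
1000,Cash,500
"""

def read_after_headers(text, skip_rows=3):
    rows = list(csv.reader(io.StringIO(text)))
    entity = rows[1][0]          # "Chicago" sits on skipped row 2
    columns = rows[skip_rows]    # row 4 holds the column headers
    data = rows[skip_rows + 1:]  # data starts on row 5
    return entity, columns, data
```

The point is that a value such as "Chicago" on a skipped row is never seen by the row-by-row parser, so it has to be captured separately before the data rows are processed.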
NetSuite to OneStream Integration

Hello all, I need to verify and confirm the method of choice to set up a NetSuite (NS) to OneStream (OS) 8.5 integration WITHOUT the SIC in play. Here is what I have uncovered thus far. SuiteAnalytics Connect is being considered via ODBC so OS can pull data from NS. I read a few things about this. One is that a BR would be set up in OS to call a SuiteScript in NS to initiate the data pull. The missing piece is how to set up the connection in OS if there is no ODBC driver installed. Another thing is that SIC would not be needed. I also perused a blog that stated that Support can set up a direct connection to NS from OS, provided they have the NS info. Can anyone please confirm this approach? The REST API could be leveraged, but I am looking for an alternative. Any input would be greatly appreciated. Thanks, Ed P
No valid DataKeys (Scenario / Time) found in data source.

Hello everyone, I'm trying to create my first import from scratch. Unfortunately, it always gives me the same error: "No valid DataKeys (Scenario / Time) found in data source. Review the source data load processing log and check one-to-one transformation rules to ensure that you have created proper Scenario and Time dimension rules for this data source." No line is imported or displayed in the logs. However, I've gone through each parameter and everything seems correct and consistent with what I have in other import files. Could you help me investigate? My settings: UD1 to UD8 are all set to None.