Importing a file with variable columns
We have been tasked with importing a file that has a large number of variable columns. For the sake of easy explanation, let's say the first five columns are standard (time, entity, ud1, ud2, ud3), but the file could have from 50 to 150 additional columns, one for each account. If there is no data for an account, there is no column for it. New accounts could appear in the future without warning. No, we don't have the ability to change the format of the report. (Oh, how I wish.) I have thought up several ways of making this work, but each is fraught with its own type of peril:

1. Create a new, custom table dynamically to stage the data. Parse the column names from the file. Use the column name list to run a new SQL query to unpivot.
2. Parse the file in-memory to manually unpivot by parsing each data column and adding rows to a DataTable, then returning the full DataTable (a rough sketch of this approach appears after the list).
3. Maintain a list of the columns we care about the most, parse the file in advance, and save the column name/position maps to parameters/a lookup table. Use up every possible attribute/value field in a data source to stage to BI Blend and try to unpivot from there. Hope they never need more "important" columns than OS can handle. (This is similar to option 1, but we're not stuck dropping/creating a custom table ourselves and we have more consistent column names.)
4. Write a manual file parser that creates a new, sane text file and then imports that instead. (Seems wasteful. If I can get it this far, I can probably just do it in-memory, i.e., option 2.)
5. Some other, better idea that I haven't thought of yet.
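For what it's worth, here is a minimal sketch of what option 2 could look like as plain .NET code. The comma delimiter, the file-path argument, the output column names, and the assumption that the five standard columns come first are all illustrative guesses about the feed, not its actual layout.

```vb
Imports System.Data
Imports System.IO

Public Module UnpivotSketch

    ' Reads a delimited file whose first five columns are fixed and whose
    ' remaining columns are one-per-account, and returns one output row per
    ' populated account cell.
    Public Function UnpivotFile(filePath As String) As DataTable
        Dim result As New DataTable()
        result.Columns.Add("Time")
        result.Columns.Add("Entity")
        result.Columns.Add("UD1")
        result.Columns.Add("UD2")
        result.Columns.Add("UD3")
        result.Columns.Add("Account")
        result.Columns.Add("Amount", GetType(Decimal))

        Dim lines As String() = File.ReadAllLines(filePath)
        ' Header row: positions 0-4 are the standard columns, 5+ are account names.
        Dim headers As String() = lines(0).Split(","c)

        For i As Integer = 1 To lines.Length - 1
            Dim fields As String() = lines(i).Split(","c)
            For col As Integer = 5 To fields.Length - 1
                ' Skip empty cells; every populated cell becomes its own row.
                If Not String.IsNullOrWhiteSpace(fields(col)) Then
                    result.Rows.Add(fields(0), fields(1), fields(2), fields(3), fields(4),
                                    headers(col), Decimal.Parse(fields(col)))
                End If
            Next
        Next
        Return result
    End Function

End Module
```

The appeal of this shape is that new account columns need no configuration change: whatever appears after column 5 in the header simply becomes an Account value in the output table.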
How to copy a scenario from one application to another?
Hello, I recently recovered a scenario from an application copy restored to a point in time. I now would like to extract that scenario from the application copy and put it in our working application. Does anyone have any idea how this would be done? Thank you, Jeremy Morgan
Can ERP transactional data be loaded into OS?
Hi, my company's OS data integration with the ERP system has been set up to load monthly TBs into OS and for transactional tables to sit outside of OS, but with the ability for OS to query and drill back to view transactions. The transactional drill-back function is limited in that a drill-down first needs to be performed to get to the detailed dimensional intersections at the lowest UD level before a drill-back to view transactions (at that precise lowest-level set of dimensional intersections) can be run. It is therefore easier to use a stand-alone BI viewer outside of OS to view transactions. I am wondering: is it possible to load ERP transactional tables into OS (this may contain millions of rows spanning multiple years) and for OS to then display transactional-level detail across higher-level roll-ups (e.g., viewing transactions across a cost centre parent rather than only at an individual cost centre level), or even display transactional detail in quick views, essentially giving OS functionality similar to a BI viewer?
Loading Annotations Only via a Workflow
Is it possible to create a data source to load annotations only from a flat file? I am tagging the annotations in the flat file with the View dimension. It seems like the data source always wants an "Amount". In this case I have no amount, so when I load the data file, I get the error "No Valid Data Keys in File". It seems that this should be possible. Maybe I should do this via Excel loads.
Audit History for Forms - CubeView drill down (v8.4)
I have drilled down to the bottom of both the Origin and UD6 dimensions, which shows the data was loaded by the corp accounting team. However, I cannot view Audit History for Forms - it continues to show as greyed out. Does anyone know how to enable the audit history? Thank you.
Use 2 fields to determine dimension value in datasource
Hi all, I need to set up a complex expression to assign a specific value to the UD6 dimension based on 2 source fields. Below is part of the list of those source fields. The code I set up for the complex expression is the following:

```vb
Dim fields As List(Of String) = api.Parser.DelimitedParsedValues()
BRApi.ErrorLog.LogMessage(si, "Field count (List.Count): " & fields.Count)
If fields.Count > 4 Then
    Dim loadedAccount As String = fields(4)
    BRApi.ErrorLog.LogMessage(si, "loadedAccount: '" & loadedAccount & "'")
    If Left(loadedAccount, 2) = "00" Then
        Return Right(loadedAccount, 8)
    End If
Else
    BRApi.ErrorLog.LogMessage(si, "WARNING: only " & fields.Count & " fields; skipping Account logic")
End If
```

Every time I launch the loading process, however, it prints the WARNING message for all the rows, as if it didn't find any source fields. Am I doing something wrong, or is there an alternative way to retrieve the field values that I am not considering? Thank you for the help!
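Not an answer, but a hedged diagnostic idea: since the expression already relies on api.Parser.DelimitedParsedValues() and BRApi.ErrorLog.LogMessage (both shown above), logging the parsed values themselves on the failing rows would show whether the parser is actually splitting the line. If the whole source row comes back as a single field, the delimiter configured on the data source probably does not match the file. The empty-string returns below are placeholders only; what to return when the account does not match depends on the rest of the mapping.

```vb
Dim fields As List(Of String) = api.Parser.DelimitedParsedValues()

If fields.Count <= 4 Then
    ' Dump the parsed values for this row. If everything appears as one long
    ' field, the delimiter on the data source does not match the file.
    BRApi.ErrorLog.LogMessage(si, "Parsed " & fields.Count & " field(s): " & String.Join(" | ", fields))
    Return String.Empty   ' placeholder
End If

Dim loadedAccount As String = fields(4)
If Left(loadedAccount, 2) = "00" Then
    Return Right(loadedAccount, 8)
End If
Return String.Empty   ' placeholder
```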
OneStream and D365 Integration
Hi everyone, I'm currently working on integrating OneStream with Dynamics 365 (D365) and would love to hear about your experiences. Specifically, I'm interested in knowing which connector you used for the integration. Did you opt for OData, Synapse Link, or another method? Any insights, tips, or recommendations would be greatly appreciated! Thanks in advance!
Matrix Data Load Basics
I have a matrix data load question. I get the concept of how the matrix data load works. I haven't done too many of them, so I want to make sure I'm not missing the basics. The file I have has the year on one of the first lines, in column 9. The next row contains the headers, with periods in 12 separate columns. Is there an easy way to get the year from the first line (without a lot of code)? Or should I get the year based on the POV? Then I am guessing a complex expression in the time columns to build the OS time key? I know I can do all of this in code, but I don't want to skip over any out-of-the-box functionality or best practices. Any good guides on some of the best approaches with the matrix load? Thanks, Scott
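Purely as an illustration of the time-key part (not the matrix-load setup itself), a complex expression could use a helper along these lines to combine a year value with the period column header into a name like 2024M3. How the year is obtained, whether from the ninth column of the first line or from the workflow POV, is deliberately left out, and the three-letter month headers are an assumption about the file.

```vb
' Builds a OneStream-style time name (e.g. "2024M3") from a year string and a
' period column header. The month abbreviations are assumed; adjust to the file.
Private Function BuildTimeName(year As String, periodHeader As String) As String
    Dim months As String() = {"Jan", "Feb", "Mar", "Apr", "May", "Jun",
                              "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"}
    Dim index As Integer = Array.IndexOf(months, periodHeader.Trim())
    If index < 0 Then Return String.Empty  ' unrecognized period header
    Return year.Trim() & "M" & (index + 1).ToString()
End Function

' Example: BuildTimeName("2024", "Mar") returns "2024M3".
```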
Form Update Via Excel is Appending, not Replacing
We have a form in our workflow for budgeting. Users have the option of interacting directly with the form or using a spreadsheet template we provided to update their values. There are pre-seeded values in the cube, and users can then make adjustments to the amounts. When users update via the form, values are replaced as we would want. However, when users interact with the form and update via our provided Excel file, the amounts are appended to the original value. We can't seem to locate where this behavior is triggered on upload to the form via Excel. Is there something in the VB we need to adjust, or is this 'standard' behavior? We'd like to let our users simply replace the values. Let me know if you need any additional clarifications. Regards, DK