D365 Fixed Assets Subledger Integration
Hi all, just curious if anyone has had success pulling data (via SQL) from D365's Asset Ledger to get the continuity / movements for Fixed Asset accounts? In other words, it would provide the flow movements, like Additions and Disposals, for the Fixed Asset accounts to justify the change in the Balance Sheet account balance. Has anyone ever successfully done this, specifically using SQL from D365? Thx, Mike
FTP File Load Best Practice

When I pull a flat file from an SFTP server, I have been parsing that file in VB.NET into a DataTable and loading it that way. Is this the best way to do this, or is there a way to make the import read the file from the filesystem once it has been downloaded? I feel like parsing a CSV file using Split() is risky. Thanks, Scott
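On the Split() risk: the classic failure is a quoted field that contains the delimiter. Below is a minimal Python sketch (illustrative only; the file contents are made up) showing the difference between a naive split and a real CSV parser. In VB.NET, Microsoft.VisualBasic.FileIO.TextFieldParser gives the same quote-aware behavior.

```python
import csv
import io

# Hypothetical CSV line: the second field is quoted because it contains a comma.
line = 'ACME,"Widgets, Inc.",1000.50'

# Naive approach: tears the quoted field into two pieces.
naive = line.split(",")
# → ['ACME', '"Widgets', ' Inc."', '1000.50']  (4 pieces, quotes left in)

# Quote-aware approach: the csv module respects the quoting.
parsed = next(csv.reader(io.StringIO(line)))
# → ['ACME', 'Widgets, Inc.', '1000.50']  (3 fields, quotes stripped)

print(naive)
print(parsed)
```

The same data produces a different column count depending on the parser, which is exactly the kind of silent corruption Split() invites on real-world files.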
Connect different OneStream SaaS deployments

Hi folks, we're just starting on a OneStream SaaS implementation to replace HFM. We're far from the stage of actively working on any data integrations yet, but I'm already starting to think about how things will hang together and how data will potentially flow in the future state of things. Two of the downstream feeds that we currently receive in our HFM system come from companies which are current users of OneStream SaaS. Thinking at a very high level for now, does anyone know of any way downstream systems could submit data to us without punching out to the internet, e.g. using Azure Private Link and staying within the Microsoft network (or some other network magic)? High level pretty picture :) Obviously we wouldn't be able to configure this ourselves as this is SaaS, but I'm interested to hear if anyone else has any experience exploring this or anything similar with OneStream SaaS, and whether anything like this is even possible in the SaaS world. Regards, Craig
Identifying API Parameters for OneStream Audit Log Export via Postman

Hello all, I joined a company where they have implemented a OneStream solution. I have access to the platform and am still exploring how it was implemented. I am currently working on integrating OneStream with an external platform and need to export audit logs using the product's API. While setting up the request in Postman, I've encountered difficulty identifying the correct values for the following parameters: ApplicationName and AdapterName. Despite reviewing the available documentation, these tags remain unclear. Has anyone successfully queried audit logs via the OneStream API and can share insights on where these parameters are defined or how to retrieve them? Any guidance or examples would be greatly appreciated. Thank you in advance!
Importing a file with variable columns

We have been tasked with importing a file that has a large number of variable columns. For the sake of easy explanation, let's say the first five columns are standard (time, entity, ud1, ud2, ud3), but the file could have from 50 to 150 additional columns, one for each account. If there is no data for the account, there is no column for it. New accounts could appear in the future without warning. No, we don't have the ability to change the format of the report. (Oh, how I wish.) I have thought up several ways of making this work, but each is fraught with its own type of peril.

1. Create a new, custom table dynamically to stage the data. Parse the column names from the file. Use the column name list to run a new SQL query to unpivot.
2. Parse the file in-memory to manually unpivot, by parsing each data column and adding rows to a DataTable, then returning the full data table.
3. Maintain a list of the columns we care about the most, parse the file in advance, and save the column name/position maps to parameters / a lookup table.
4. Use up every possible attribute/value field in a data source to stage to BI Blend and try to unpivot from there. Hope they never need more "important" columns than OS can handle. (This is similar to option 1, but we're not stuck dropping/creating a custom table ourselves and we have more consistent column names.)
5. Write a manual file parser that creates a new, sane text file and then imports that instead. (Seems wasteful. If I can get it this far, I can probably just do it in-memory, i.e., option 2.)
6. Some other, better idea that I haven't thought of yet.
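The in-memory unpivot approach (option 2 above) can be sketched roughly as follows. This is an illustrative Python sketch, not OneStream code — in practice the equivalent would live in a VB.NET/C# connector data source — and the sample column names and values are hypothetical.

```python
import csv
import io

# The five standard leading columns from the post; everything after them
# is a variable account column that we melt into rows.
FIXED = ["time", "entity", "ud1", "ud2", "ud3"]

def unpivot(csv_text):
    """Melt variable account columns into (fixed..., account, amount) rows."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    account_cols = header[len(FIXED):]  # whatever accounts this file happens to have
    rows = []
    for record in reader:
        fixed = record[:len(FIXED)]
        for account, amount in zip(account_cols, record[len(FIXED):]):
            if amount.strip():  # no value means no row for that account
                rows.append(fixed + [account, amount])
    return rows

# Hypothetical two-account file.
sample = "time,entity,ud1,ud2,ud3,Acct100,Acct200\n2024M1,E1,A,B,C,10,20\n"
print(unpivot(sample))
```

Because the account list is read from the header on every load, new accounts appearing without warning are handled automatically, which is the main peril the other options have to engineer around.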
How to copy a scenario from one application to another?

Hello, I recently recovered a scenario from a restored point-in-time application copy. I would now like to extract that scenario from the application copy and put it into our working application. Does anyone have any idea how this would be done? Thank you, Jeremy Morgan
Referencing business rule in data management step

I'm struggling with something elementary here. I can't seem to properly reference a business rule in my data management step. I've tried a couple of options, an explicit workspace name and 'Current', which should both work, but I consistently get the error:

Error processing Data Management Step 'MyDMStep'. Business Rule 'Workspace.MyWorkspace.MyAssy.MyBR' is invalid.

or

Error processing Data Management Step 'MyDMStep'. Business Rule 'Workspace.Current.MyAssy.MyBR' is invalid.

Using that name as a reference, this is how my App Workspace Hierarchy looks: Any input would be greatly appreciated!
Clearing data for a very specific intersection / Data Unit

Hi all, I'm trying to delete an unnecessary member from the metadata, but it won't delete due to 'existing' data. There isn't actually a value where it says a value exists, but OS believes there is data there in currencies that are non-local. I've tried the DM Clear Data job, but it only seems to honor the Cube, Entity, Time periods, and Scenario. It appears to ignore any other dimension specifications I include in any of those filters (i.e. additional UD dimension specifics), which makes it too broad for our requirements. I tried to test this, and even though the DM job succeeded for that cube/entity/scenario/time period, it STILL thinks there's data there and won't allow the deletion of the member. Hoping someone out there has run into a similar situation and has some suggestions on how to truly clear a member completely so it can be removed from the metadata. Thanks