Recent Discussions
Help with moving/copying multiple csv files from file share to external shared Azure storage.
Hello Community - I'm looking for an Extender BR to automate copying a batch of CSV files from a file-share folder onto external Azure storage. Any help is appreciated. Thanks.

Posted by fellow_Ones, 3 days ago
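The copy step being asked about can be sketched roughly as follows. This is plain Python rather than a VB.NET Extender BR, and the container client is assumed to come from the azure-storage-blob package (for example `ContainerClient.from_container_url` with a SAS URL supplied by the external party); folder names and the client wiring are illustrative, not a verified implementation.

```python
# Illustrative sketch only: gather the CSVs from a file-share folder and push
# each one to an external Azure Blob container. The container_client argument
# stands in for an azure-storage-blob ContainerClient; everything here is an
# assumption to adapt, not OneStream or Azure reference code.
from pathlib import Path

def find_csv_files(share_folder):
    """Return the CSV files sitting directly under the file-share folder."""
    return sorted(Path(share_folder).glob("*.csv"))

def copy_csvs_to_blob(share_folder, container_client):
    """Upload every CSV in the folder; returns the number of files copied."""
    files = find_csv_files(share_folder)
    for path in files:
        with open(path, "rb") as data:
            # overwrite=True keeps re-runs of the job idempotent
            container_client.upload_blob(name=path.name, data=data, overwrite=True)
    return len(files)
```

In a real Extender BR the same two phases apply: enumerate the files, then hand each stream to the storage client.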
Identifying API Parameters for OneStream Audit Log Export via Postman

Hello all, I joined a company that has implemented OneStream. I have access to the platform and am still exploring how it was implemented. I am currently working on integrating OneStream with an external platform and need to export audit logs using the product's API. While setting up the request in Postman, I've encountered difficulty identifying the correct values for the following parameters: ApplicationName and AdapterName. Despite reviewing the available documentation, these tags remain unclear. Has anyone successfully queried audit logs via the OneStream API and can share insights on where these parameters are defined or how to retrieve them? Any guidance or examples would be greatly appreciated. Thank you in advance!
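For orientation, the two parameters would travel in the request body as plain JSON fields. The sketch below only shows that shape; the working assumption in the docstring (ApplicationName being the application chosen at logon, AdapterName being a Data Adapter defined inside that application) is exactly that, an assumption to verify against your own implementation, not confirmed endpoint documentation.

```python
import json

def build_adapter_payload(application_name, adapter_name):
    """Assemble the JSON body carrying the two unclear parameters.

    Working assumption (verify in your own environment): ApplicationName is
    the application you pick at logon in OnePlace, and AdapterName names a
    Data Adapter defined inside that application. The field names come from
    the question itself, not from verified endpoint documentation.
    """
    return json.dumps({
        "ApplicationName": application_name,
        "AdapterName": adapter_name,
    })
```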
Composite rule for filled value only

Hi, is there a way to apply a composite rule to filled values only? If I use U8#* it matches every record, filled or not. My intention is to use another rule behind this one to fill a default value for blank sources.

Posted by Pawel, 7 days ago
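The intended precedence (composite rule for filled sources, a later default rule for blanks) looks like this as plain pseudologic. This is Python, not OneStream transformation-rule syntax, and the default member name is made up for illustration.

```python
def transform_ud8(source_value, composite_rule, default="NoMember"):
    """Apply the composite mapping only when the source field is filled;
    blanks fall through to the default (the 'rule behind this one').
    'NoMember' is an illustrative placeholder, not a real member name."""
    if source_value is None or str(source_value).strip() == "":
        return default
    return composite_rule(source_value)
```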
Business Rule | One-to-One Transformation - E# Parent to its first E# Base

We need your help importing flat files:
1. Number of rows: > 100k
2. Organized under E# parent members: a few thousand
3. E# is extended

We've been fortunate to be advised to build a BR that generates a one-to-one Transformation Rule file (E# Parent to its first E# Base). Would you SMEs be generous enough to share with us/the Community:
- a sample BR that looks up the first E# Base member of an E# Parent member, under an extended E#
- how to call that BR in an Import Workflow

Thank you, OS SMEs.

Posted by KH1, 10 days ago
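The lookup at the heart of the request is a depth-first walk: descend a parent's children in order and return the first member that has no children. Sketched here in Python over a plain dict rather than OneStream's member/relationship API, so the structures and names are illustrative; a real BR would be VB.NET against the Entity dimension.

```python
def first_base_member(children, parent):
    """Return the first base (leaf) descendant of parent, or None.
    children maps each member name to its ordered list of child members."""
    for child in children.get(parent, []):
        if not children.get(child):          # no children: a base member
            return child
        found = first_base_member(children, child)
        if found is not None:
            return found
    return None

def one_to_one_rows(children, parents):
    """Emit (parent, first base) pairs for a one-to-one transformation file."""
    return [(p, first_base_member(children, p)) for p in parents]
```

Writing `one_to_one_rows` output to CSV gives the one-to-one transformation file the BR is meant to generate.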
How to convert a raw Excel file into multiple upload-ready OneStream journals. (3 quick steps)

Use case: You generated several journal lines in Excel, saved the file as CSV, and the upload to OneStream failed. Here's how to clean it up and get it posted.

1) Paste your raw lines into Excel
- Paste your raw journal lines into column A.
- Go to Data → Text to Columns → Delimited → Comma → Finish.
- You should now see the first column as H and the next two as D, D, with each field in its own column.

Clean the values (common blockers):
- ConsName must be a valid Consolidation member in your app (e.g., Entity or Local).
- Leave ParentName blank unless you're 100% sure of the direct parent.
- Don't type a literal None unless it's a real member; leave IC/UD blank, or use your member when required.
- Make sure the Scenario → Cube mapping matches your CubeName.
- If JournalBalanceType = Balanced, total Debits must equal total Credits per journal.

2) Save as CSV (UTF-8)
- Keep only the upload rows on the active sheet (no notes above or beside them).
- File → Save As → CSV UTF-8 (Comma delimited) (.csv). Accept "only the active sheet will be saved."

3) Import & post in OneStream
- OnePlace → Journals → Import (or your Workflow step).
- Choose File Type = CSV, select your file, and Import.
- Open the journal, Validate/Balance, then Post.
- Run Consolidation and verify in a Cube View.

Hope this helps some people within the OneStream Universe 😊

Posted by ChrisBriscoe, 16 days ago
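Two of the checks in the walkthrough (the Debits = Credits rule for Balanced journals, and the UTF-8 save) can be scripted if you'd rather not verify them by hand in Excel. A minimal Python sketch, assuming debit and credit sit in fixed columns of the 'D' detail rows; the column positions are placeholders for your own layout.

```python
import csv

def journal_is_balanced(detail_rows, debit_col, credit_col, tol=0.01):
    """For JournalBalanceType=Balanced: total debits must equal total credits.
    Empty cells count as zero."""
    debits = sum(float(row[debit_col] or 0) for row in detail_rows)
    credits = sum(float(row[credit_col] or 0) for row in detail_rows)
    return abs(debits - credits) <= tol

def save_upload_csv(path, rows):
    """Write rows the way Excel's 'CSV UTF-8 (Comma delimited)' does,
    including the byte-order mark (utf-8-sig)."""
    with open(path, "w", newline="", encoding="utf-8-sig") as f:
        csv.writer(f).writerows(rows)
```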
Connect different OneStream SaaS deployments

Hi Folks, we're just starting a OneStream SaaS implementation to replace HFM. We're far from actively working on any data integrations yet, but I'm already starting to think about how things will hang together and how data will flow in the future state. Two of the downstream feeds we currently receive in our HFM system come from companies that are current users of OneStream SaaS. Thinking at a very high level for now: does anyone know of any way downstream systems could submit data to us without punching out to the internet, e.g. using Azure Private Link and staying within the Microsoft network (or some other networkey magic)? High-level pretty picture :) Obviously we wouldn't be able to configure this ourselves as this is SaaS, but I'm interested to hear if anyone has experience exploring this or anything similar with OneStream SaaS, and whether anything like this is even possible in the SaaS world. Regards, Craig

Posted by chuggans, 16 days ago
Referencing business rule in data management step

I'm struggling with something elementary here: I can't seem to properly reference a business rule in my data management step. I've tried a couple of options (an explicit workspace name, and 'Current'), which should both work, but I consistently get the error:

Error processing Data Management Step 'MyDMStep'. Business Rule 'Workspace.MyWorkspace.MyAssy.MyBR' is invalid.

or

Error processing Data Management Step 'MyDMStep'. Business Rule 'Workspace.Current.MyAssy.MyBR' is invalid.

Using that name as a reference, this is how my App Workspace Hierarchy looks. Any input would be greatly appreciated!

Posted by eVoid, 22 days ago
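For what it's worth, the reference string in both errors has four dot-separated parts, which suggests a `Workspace.<workspace>.<assembly>.<rule>` shape. A quick sanity check of that shape in plain Python; the meaning of each segment is inferred from the error text itself, so treat this as a checking aid, not an official grammar.

```python
def parse_workspace_rule_ref(ref):
    """Split a 'Workspace.<workspace>.<assembly>.<rule>' reference into parts.
    Segment meanings are inferred from the error message in the post, not
    from product documentation."""
    parts = ref.split(".")
    if len(parts) != 4 or parts[0] != "Workspace" or not all(parts):
        raise ValueError(f"unexpected business-rule reference: {ref!r}")
    return {"workspace": parts[1], "assembly": parts[2], "rule": parts[3]}
```

Checking each segment against the actual names in the App Workspace hierarchy (including case and stray spaces) is the point of the exercise.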
Shrink & reindex, and elastic pool usage

Hi, I have recently reduced a PROD data table by several million rows, but the Allocated table size has remained largely unchanged: the reduction in the Used portion has simply shifted to an increase in the Unused portion, and the elastic pool usage has also remained unchanged. I'm assuming that if I run an app copy of PROD into DEV, OS will try to copy over the entire elastic pool size rather than differentiating between Used and Unused and only copying the Used portion. A shrink and reindex has been suggested as a way to reduce the Unused portion, but I have been advised that running a shrink is risky because it can cause performance issues. I therefore wanted to check: is a shrink and reindex in the PROD environment, at least during monthly maintenance, widely used, and does it generally benefit the environment rather than harm it?

Posted by IL, 24 days ago
REST API Execute LogonAndReturnCookie

I am trying to determine whether we are using any REST APIs. We were instructed to run LogonAndReturnCookie via Postman. https://documentation.onestream.com/1384528/Content/PDFs/REST_API_Implementation_Guide.pdf I followed the instructions on page 26 and ran into this error message:

Error: Invalid character in header content ["Authorization"]

I entered "{{webapi_access_token}}", similar to the screenshot on page 27. When I hovered over the token, I saw a message stating it's not defined. Where should I set up this variable, or is there an issue with my params/body?

Posted by bjornliow, 28 days ago
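The "not defined" hover message is the key clue: `{{webapi_access_token}}` has no value in the active Postman environment, so no real token is being substituted into the Authorization header; defining the variable in the selected environment (or at collection level) is the usual fix. For comparison, the header the call needs has roughly this shape, sketched with the Python standard library; the URL is a placeholder and the Bearer scheme is an assumption to confirm against the implementation guide.

```python
import urllib.request

def bearer_request(url, access_token):
    """Build a POST whose Authorization header carries the substituted token.
    The 'Bearer' scheme is an assumption to verify against the REST API guide."""
    if "{{" in access_token:
        # An unresolved Postman-style variable would land in the header verbatim.
        raise ValueError("access token looks like an unresolved {{variable}}")
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {access_token}"},
        method="POST",
    )
```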
Can ERP transactional data be loaded into OS?

Hi, my company's OS data integration with the ERP system has been set up to load monthly TBs into OS, with the transactional tables sitting outside OS but with the ability for OS to query and drill back to view transactions. The transactional drill-back function is limited in that a drill-down first needs to be performed to reach the detailed dimensional intersections at the lowest UD level before a drill-back to view transactions (at that precise lowest-level set of intersections) can be run. It is therefore easier to use a stand-alone BI viewer outside OS to view transactions. I am wondering: is it possible to load ERP transactional tables into OS (potentially millions of rows spanning multiple years) and have OS display transaction-level detail across higher-level roll-ups (e.g. viewing transactions across a cost centre parent rather than only at an individual cost centre level), or even display transactional detail in Quick Views, essentially giving OS functionality similar to a BI viewer?

Posted by IL, 2 months ago
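On the "transactions across a cost centre parent" point, the mechanics being asked for are simple to state regardless of which tool ends up serving them: flatten the parent to its base cost centres, then filter the transaction table on that set. A toy Python sketch; the hierarchy and row shapes are made up to illustrate the idea, not drawn from OS or any ERP schema.

```python
def base_cost_centres(children, member):
    """Flatten a cost-centre hierarchy member to its base (leaf) descendants."""
    kids = children.get(member, [])
    if not kids:
        return [member]
    leaves = []
    for kid in kids:
        leaves.extend(base_cost_centres(children, kid))
    return leaves

def transactions_under(parent, children, transactions):
    """Return the transaction rows belonging to any base member under parent."""
    wanted = set(base_cost_centres(children, parent))
    return [t for t in transactions if t["cost_centre"] in wanted]
```

Whether this filtering happens inside OS (against loaded stage/relational data) or in an external BI layer is exactly the design question the post raises.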