Recent Discussions
Business Rule within the Import Step
Hello, we have a custom business rule that does some account reversing, which we run centrally (via a dashboard) at the end of our month-end process. We're putting a new lease process into OneStream and were wondering: instead of giving the front-end user more steps within their workflow, is there a way to attach the business rule to the import step, so that once they run the "Load and Process" portion of the import step the business rule runs? Thanks, Will
Solved · WillVitale · 16 hours ago · Contributor II · 23 Views · 0 Likes · 3 Comments
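One pattern that tends to come up for requests like this is a workflow event handler that fires after the cube-load step finishes and kicks off the same logic the dashboard button runs today. The sketch below shows the general shape only; the type names, the way the finished step is detected, and the data-management call are written from memory and should be checked against the event handler template and documentation in your release, so treat every identifier here as an assumption rather than confirmed API.

```vb
' Sketch of a Workflow Event Handler, NOT verified against a specific OneStream release.
' All type, enum and BRApi names below are assumptions; compare with the template your
' system generates when you create a Workflow Event Handler business rule.
Namespace OneStream.BusinessRule.WorkflowEventHandler.LeaseReversalTrigger
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals,
                             ByVal api As Object, ByVal args As WorkflowEventHandlerArgs) As Object
            Try
                ' Only react once the "Load and Process" part of the import step has finished.
                ' The exact operation to test for is an assumption - log args.OperationName
                ' during a test load to find the event your step actually raises.
                If args.IsAfterEvent Then
                    ' Option A (assumed API): start a Data Management sequence that wraps the
                    ' existing reversal business rule.
                    ' BRApi.Utilities.ExecuteDataMgmtSequence(si, "LeaseReversalSequence", Nothing)

                    ' Option B: call the shared reversal routine directly if it is exposed
                    ' as a public function in another business rule.
                    BRApi.ErrorLog.LogMessage(si, "Lease reversal trigger fired after Load and Process.")
                End If
                Return args.DefaultReturnValue
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace
```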
Default Load Method not sticking to Replace - All Time
Hi all, we have come across an issue where the default load method for a workflow does not change from "Replace" to "Replace All Time". Has anyone experienced the same issue? Thank you for your feedback.
NouriaNou · 18 hours ago · Contributor · 4.8K Views · 0 Likes · 9 Comments
Change Load Method through connector code
Hello everyone, quick question: does anybody know how to change the Load Method for a ProcessDataTable through the connector code? We have a dashboard where the user chooses the Load Method with a radio button. We are able to pass that parameter to the code and change the api.LoadMethod property to Append, but the load still executes as Replace. It works if we change the load method directly in the WF, just not from the code.
Nelly · 18 hours ago · New Contributor · 1.1K Views · 0 Likes · 2 Comments
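For what it's worth, the usual place people attempt this override is inside the connector's "get data" branch, before any rows are returned, mapping the dashboard parameter onto the load-method property mentioned in the post. The fragment below is only a sketch of that shape: api.LoadMethod comes from the post itself, but the action-type enum, the parameter-reading helper, and whether the engine honours an override at this point (the symptom described suggests it may still read the workflow profile setting) are all assumptions to verify.

```vb
' Fragment from a connector rule's Main - a sketch only.
' Enum and helper names are assumptions, not verified API.
If args.ActionType = ConnectorActionTypes.GetData Then      ' assumed enum name
    ' Assumed helper for reading the value passed in from the dashboard.
    Dim requestedMethod As String = args.NameValuePairs.XFGetValue("LoadMethodChoice", "Replace")

    If requestedMethod.Equals("Append", StringComparison.OrdinalIgnoreCase) Then
        ' Property name taken from the original post; the value to assign is an
        ' assumption - check the IntelliSense on api.LoadMethod in the rule editor.
        ' api.LoadMethod = LoadMethodTypes.AppendData
        BRApi.ErrorLog.LogMessage(si, "Connector requested Append load method.")
    End If

    ' ... build and return the data table as usual ...
End If
```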
Validation on Data Source: Restrict Entities to _PX Suffix and Metadata Text Contains PSC^
Hi Community, I'm working on a data load in OneStream and need to enforce two validations on the Entity dimension during the upload process: (1) the entity name must end with _PX, and (2) the entity's metadata Text field (e.g., Text1/Text2) must contain PSC^ (basically, prevent data from being loaded to anything but a PSC entity). Currently I see the Logical Expression and Override settings on the Data Source column, but they seem limited to simple operators like Ends With or Like. My question is: can this be achieved using a complex expression directly in the Data Source column settings, or do I need to implement it as a Transformation Validate rule or a Data Source business rule? If a business rule is the right approach, could someone share a best-practice snippet for checking both conditions (including the metadata lookup)? Should I create a conditional rule and add it to the logical expression above?
sahil · 3 days ago · New Contributor · 25 Views · 0 Likes · 2 Comments
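Not an authoritative answer, but the two checks themselves are plain string tests, so one way to structure it (whether it ends up in a complex expression or a data source business rule) is to isolate them in a small helper and keep only the metadata lookup as the release-specific part. The helper below is plain VB.NET and runnable on its own; the name ValidatePscEntity and the idea of feeding it Text1 retrieved via a BRApi member lookup are my own assumptions, not an established pattern.

```vb
Imports System

Public Module EntityLoadValidation
    ' Pure validation logic: returns Nothing when the entity passes, otherwise an error message.
    ' How you obtain text1 (e.g. via a member-properties lookup inside the data source
    ' business rule) is release-specific and not shown here.
    Public Function ValidatePscEntity(ByVal entityName As String, ByVal text1 As String) As String
        If String.IsNullOrWhiteSpace(entityName) Then
            Return "Entity name is blank."
        End If
        ' Check 1: entity name must end with _PX.
        If Not entityName.EndsWith("_PX", StringComparison.OrdinalIgnoreCase) Then
            Return String.Format("Entity '{0}' does not end with _PX.", entityName)
        End If
        ' Check 2: the Text field must contain PSC^.
        If String.IsNullOrEmpty(text1) OrElse Not text1.Contains("PSC^") Then
            Return String.Format("Entity '{0}' Text field does not contain PSC^.", entityName)
        End If
        Return Nothing   ' both conditions satisfied
    End Function
End Module
```

Rejecting the row is then a matter of throwing or returning that message from wherever the data source lets you signal an error; which mechanism applies to complex expressions versus parser business rules is exactly the part worth confirming in the documentation.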
Cubeview Drill Down issue: Load Results for Imported Cells is empty
Hello there, we are having an issue when using the Drill Down option from a cube view: after getting to a base data cell within the Drill Down window and then trying to navigate to stage using "Load Results for Imported Cell", the results window is displayed empty. The workflow being used is a simple Import, Validate and Load that retrieves the imported data from a connector, and there is no further data processing once it's in the cube. From the Workflow page we can use Drill Down and Drill Back just fine. Any ideas why this is happening? Thank you, Oriol.
ogonzalez · 3 days ago · New Contributor II · 20 Views · 0 Likes · 3 Comments
Central Import and Account Reconciliations
I'm starting an Account Reconciliations implementation, and the current process loads Actuals for all entities through a central import. They do have individual workflows for each entity, where security is assigned at that level and used for planning and reporting. For RCM they want to look at their recs at that level too (WF entity), but data is not flowing through. I ran the discovery at both the central and individual levels, and only the central inventory is being populated. I guess that's because data is not being loaded to stage in those WFs. Is there a way to work around this, or will data need to be loaded at the entity level for it to work?
mlopez · 7 days ago · New Contributor II · 19 Views · 0 Likes · 0 Comments
Excel form template load error for some users but not all
The Excel form template has the range XFF defined. It works for some users but not for others. The error they get is: "File does not contain valid range tokens, token XFC". How do we make the XFF range work for everyone? Do they have a different setting in their application or on their laptop?
Solved · Hirono · 8 days ago · New Contributor · 23 Views · 0 Likes · 2 Comments
Time/data settings in Data Source/Transformation Rule for multi-period upload (delimited files)
Hi everyone, I'm trying to build a data source and transformation rule for a multi-period manual data load through a single file, and the import step didn't succeed. Can anyone suggest time settings in the data source and transformation rule? I selected the data type "Stored Data Key Text" in the data source. My source file format: Col1 has the GL accounts, and Col2-13 hold Jan to Dec data.
Solved · uvrao33 · 9 days ago · New Contributor III · 54 Views · 0 Likes · 5 Comments
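Since the file has one account column followed by twelve amount columns, the usual shape (assuming a matrix-style data source where each amount column carries its own Time value) is to derive the time key from the column position plus the year of the workflow period. The helper below only shows that string construction and assumes the common YYYYM# time member naming (e.g. 2024M1); where the year comes from (POV, a header line, or a parameter) is left open.

```vb
Imports System

Public Module MultiPeriodTimeKey
    ' Build a time key such as "2024M1" from a year and a 1-based source column index,
    ' where columns 2-13 of the file hold Jan-Dec.
    Public Function TimeKeyForColumn(ByVal year As Integer, ByVal columnIndex As Integer) As String
        Dim monthNumber As Integer = columnIndex - 1          ' Col2 -> month 1 (Jan), Col13 -> month 12 (Dec)
        If monthNumber < 1 OrElse monthNumber > 12 Then
            Throw New ArgumentOutOfRangeException(NameOf(columnIndex), "Expected a data column between 2 and 13.")
        End If
        Return String.Format("{0}M{1}", year, monthNumber)    ' assumes YYYYM# time member naming
    End Function
End Module
```

For example, TimeKeyForColumn(2024, 2) returns "2024M1" and TimeKeyForColumn(2024, 13) returns "2024M12".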
Matrix Data Load Basics
I have a matrix data load question. I get the concept of how the matrix data load works, but I haven't done many of them, so I want to make sure I'm not missing the basics. The file I have has the year on one of the first lines, in column 9. The next row contains the headers, with the periods in 12 separate columns. Is there an easy way to get the year from that first line (without a lot of code), or should I get the year based on the POV? Then I'm guessing a complex expression in the time columns to build the OS time key? I know I can do all of this in code, but I don't want to skip over any out-of-the-box functionality or best practices. Any good guides on the best approaches to the matrix load? Thanks, Scott
Solved · 1.2K Views · 0 Likes · 5 Comments
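Whether the year is parsed from that header line or taken from the POV, the complex expression on the time columns mostly reduces to combining the year with the period header into the time key. A minimal, runnable sketch of just that string handling, assuming three-letter English month headers and YYYYM# time member names (both assumptions about this particular file and application):

```vb
Imports System
Imports System.Globalization

Public Module MatrixTimeKey
    ' Combine a year (from the header line or the workflow POV) with a period header
    ' such as "Jan" into an OS-style time key such as "2024M1".
    Public Function ToTimeKey(ByVal year As String, ByVal periodHeader As String) As String
        Dim parsed As DateTime = DateTime.ParseExact(periodHeader.Trim(), "MMM", CultureInfo.InvariantCulture)
        Return String.Format("{0}M{1}", year.Trim(), parsed.Month)
    End Function
End Module
```

For example, ToTimeKey("2024", "Mar") returns "2024M3".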
Export to cube for specific departments
Hi everyone, I want to run the export to the cube for a specific department only, because my current process deletes the information in the other departments. Is there a way to apply a filter to this process?
Marco · 17 days ago · Contributor II · 27 Views · 0 Likes · 1 Comment