Clear data using level 1 data unit (rather than level 2) on data load
Hi all, I was wondering if it is possible to override the default 'clear and load' setting on a data load step from 'Level 2: Workflow Data Unit' to 'Level 1: Cube Data Unit'. Just to give a bit of context: we perform a data copy from a Budget scenario (using a Data Management sequence) to a Forecast scenario to provide a starting data position for divisions to use when they capture their forecast data. Once the data copy is complete, the process for divisions to update their forecasts is to extract an Excel Fixed File Template, make their changes and additions over the copied Budget data, and then import that file via their Workflow to the cube. However, when wanting to remove an account, for instance, the only way to do so is to load zeros to that account. This is because the load clears by account and not by entity. Is it possible to change the load so that it clears all accounts? Thanks, Mark
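One common workaround, until the clear scope can be widened, is to generate the zero rows automatically rather than by hand. Below is a minimal Python sketch, assuming a hypothetical three-column load file (Entity, Account, Amount) and hypothetical file names: it compares the prior export with the new file and appends a zero-amount row for any entity/account pair that dropped out, so the import overwrites the stale data.

```python
import csv

def read_keys(path):
    """Collect (entity, account) pairs from a load file with columns Entity, Account, Amount."""
    with open(path, newline="", encoding="utf-8") as f:
        return {(row["Entity"], row["Account"]) for row in csv.DictReader(f)}

def append_zero_rows(prev_path, new_path):
    """Append a zero-amount row for every pair present last time but missing now,
    so the import clears those intersections instead of leaving stale values."""
    dropped = read_keys(prev_path) - read_keys(new_path)
    with open(new_path, "a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for entity, account in sorted(dropped):
            writer.writerow([entity, account, 0])

# Hypothetical file names.
append_zero_rows("forecast_prev.csv", "forecast_new.csv")
```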
Excel Template upload - Data Source setup
Hi All, I'm trying to set up some Excel templates for uploading Budget and Forecast data. I'm pretty happy with the Excel template side, with the named range (XFD), specific header formats, etc. However, I'm a bit lost on the Data Source setup. I get that you have to set Allow Dynamic Excel Loads to True, but what about the rest of the setup? Do I choose Delimited or Fixed file? It feels like this Data Source section is really for flat files, as it always wants to know the column number. I've tried importing the Excel file into the Data Source the same way I would a csv file, but it just shows up as XML gibberish in the top box. It definitely feels like I'm missing something.
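On the Delimited vs Fixed question in general terms: the choice describes how each text line is cut into fields, which is why the Data Source keeps asking for column numbers. A quick Python illustration of the two parsing styles (sample layout assumed, not OneStream-specific):

```python
# Delimited: fields are separated by a marker character, so position doesn't matter.
delimited_line = "1000,Sales,2500.00"
print(delimited_line.split(","))        # ['1000', 'Sales', '2500.00']

# Fixed: fields occupy fixed character positions, so you slice by column number.
fixed_line = "1000  Sales     2500.00"
print([fixed_line[0:6].strip(),         # columns 1-6: account code
       fixed_line[6:16].strip(),        # columns 7-16: description
       fixed_line[16:].strip()])        # columns 17+: amount
```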
So you have upgraded to version 7.1.1. Check out this known error
You may get this error during the import step. Reason: the load file doesn't have any data after transformation (everything was bypassed). The underlying code tries to summarize the data, and when it finds nothing, it throws an error.
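A generic illustration of the failure mode and the guard you would want, sketched in Python (not the actual OneStream code): check that any rows survived the bypass before summarizing.

```python
def summarize(rows):
    """Sum the amounts, but fail with a clear message instead of choking on an empty set."""
    data = [r for r in rows if not r.get("bypass")]        # drop bypassed rows
    if not data:
        raise ValueError("No data rows left after transformation; check your bypass rules.")
    return sum(r["amount"] for r in data)

print(summarize([{"amount": 10.0}, {"amount": 5.0, "bypass": True}]))   # 10.0
```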
"Cannot execute step because the prior workflow step is not completed"
We have Workflow A and Workflow B. We get an error on the Load Cube step in Workflow B while Workflow A is in its Validation step. The error we get is "Cannot execute step because the prior workflow step is not completed. (WP#WorkflowA.Import:S#Actual:T#2022M1)". The name of the actual workflow has been changed to protect the innocent. Our question is: what is causing this issue, and how do we fix it? It's not clear in the documentation what exactly triggers this error message. The parameters passed back imply that one cannot load to the same scenario and time if another workflow for that scenario and time is still open. Workflow A and Workflow B are loading to the same scenario and time, but to different accounts.
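For what it's worth, the parenthesized suffix encodes the blocking unit, which narrows the search: workflow profile and step (WP#), scenario (S#), and time (T#). A tiny parse of that token, with the format assumed from the message above:

```python
token = "WP#WorkflowA.Import:S#Actual:T#2022M1"
parts = dict(part.split("#", 1) for part in token.split(":"))
print(parts)    # {'WP': 'WorkflowA.Import', 'S': 'Actual', 'T': '2022M1'}
```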
Errors creating New Task in Data Import Schedule Manager
I am getting several errors when setting up a New Task in the Data Import Schedule Manager. We are on OS version 9.01.17403, and I have installed the latest version of DSM, which is 8.4.0_SV100. I suspect this may be a version compatibility issue, so I am curious whether anyone has been able to get this solution to work in a 9.01 application. I have already uninstalled and reinstalled the solution, which didn't resolve the issues. Below are the two errors I am seeing:
1. When choosing Global Scenario from the Scenario(s) drop-down list, I get an immediate error: "Error processing member. The item was not found. Member, 11111111." The details state it is unable to execute Business Rule 'DSM_Paramhelper', where it appears to be trying to call the Global Scenario via OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.GetParameterDisplayInfosUsingDashboardNameCompressed(SessionInfo si, LoadDashboardInfo loadDashboardInfo, Boolean isForDashboardUIWithInteractiveComponents, Dictionary`2 custSubstVarsAlreadyResolved).
2. If I pick a specific Scenario, I get a different error. It allows me to pick the Scenario and Time, but when I save the Task, I get "The input string '' was not in a correct format." The error details point to Business Rule 'DSM_SolutionHelper', where conversion from string "" to type 'Double' is not valid: OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.StartExecuteSelectionChangedServerTaskCompressed(SessionInfo si, Boolean isSystemLevel, Guid primaryDashboardID, Guid embeddedDashboardID, Guid componentID, PageInstanceInfo pageInstanceInfo, XFSelectionChangedServerTaskInfo serverTaskInfo).
Any advice on how to correct these issues would be greatly appreciated.
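The second error is the classic blank-string-to-number conversion: a parameter somewhere arrives empty and is parsed as a Double. The fix itself would have to happen inside the solution's business rule, but here is the failure and the defensive parse illustrated in Python:

```python
def to_double(text, default=0.0):
    """Parse a numeric string, falling back to a default for blank or invalid input."""
    try:
        return float(text)
    except (TypeError, ValueError):
        return default

# float("") raises ValueError, the same class of failure the task save hits.
print(to_double(""))       # 0.0 instead of an exception
print(to_double("42.5"))   # 42.5
```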
Workflow Import - Clearing cube data in v9
Hello, we recently upgraded to v9 and noticed that we can no longer clear workflow-imported cube data using the "Clear" option on the workflow Import step (screenshot omitted). Prior to the upgrade, we could clear the stage data, then run Import/Validate/Load to clear the data that had been imported to the cube. Now this option only clears the data from stage, and in order to clear the data loaded to the cube, we have to use a Data Management step. In our case this is now difficult, because the data loaded may be specific to workflow channels and use multiple dimension criteria, not just Entity (which the DM step uses). Is anyone else having this issue? Are there any other methods of removing data imported via csv file from the cube, other than a DM step? If we try to run the old process in v9 (Clear, then Retransform, then Load to clear cube data), an error appears (screenshot omitted).
About Foundation Second Edition
Beyond offering a training guide, the focus of this book is on the 'why' of designing and building an application. While the foundational principles of building a solid, scalable OneStream application have remained largely unchanged, updates contained in this second edition reflect implemented software enhancements, along with the ongoing development of the OneStream landscape.
- Manage your Implementation with the OneStream methodology
- Understand Design and Build concepts
- Build solutions for the Consolidation of financial data, and develop Planning models
- Create Data Integration solutions that will feed your models
- Develop Workflows to guide and manage your End-Users
- Advance your solutions with Rules and Security
- Take advantage of detailed Data Reporting using tools such as Analytic Blend and Advanced Excel functionality
- Tune Performance, and optimize your application
- New content on Workspaces, Smart Integration, Dashboard design, and more
- Over 180 updated images
The information contained within this book is relevant to software version 8.4.0. To access the complete publication, you must purchase either the PDF or the physical copy of the book. Purchases can be made at onestreampress.com.
Table of Contents
- Foreword by Tom Shea
- Introduction [Peter Fugere, updated by Chul Smith]
- Methodology and the Project [Greg Bankston, updated by Greg Bankston]
- Design and Build [Peter Fugere, updated by Chul Smith]
- Consolidation [Eric Osmanski, updated by Nick Bolinger]
- Planning [Jonathan Golembiewski, updated by Jonathan Golembiewski]
- Data Integration [John Von Allmen, updated by Joakim Kulan]
- Workflow [Todd Allen, updated by Chul Smith]
- Rules and Calculations [Nick Kroppe and Chul Smith, updated by Nick Kroppe and Chul Smith]
- Security [Jody Di Giovanni, updated by Bobby Doyon]
- Reporting [Jacqui Slone and Chul Smith, updated by Chul Smith]
- Excel and Spreadsheet Reporting [Nick Blazosky, updated by Nick Blazosky]
- Analytic Blend [Andy Moore, Sam Richards, and Terry Shea, updated by Chul Smith]
- Introduction to the Solution Exchange [Shawn Stalker, updated by Shawn Stalker]
- Performance Tuning [Jeff Jones and Tony Dimitrie, updated by Jeff Jones]
About Admin
Whether you are a novice or a seasoned administrator, this book examines key concepts to help you understand and manage the financial and data processes of your OneStream application. Written for administrators, this book is filled with technical and functional context, whether syntax related to business rules or general accounting concepts, and dives into practical examples and use cases that provide guidance and insights into commonly encountered themes. By the end of this book, you will have a deep understanding and appreciation of the capabilities that the OneStream platform offers, and have the tools needed to tackle the wide variety of administrative actions that may surface.
In this book, we will cover:
- Components within OneStream, such as application properties, metadata, and workflow
- Data troubleshooting for missing or off data, whether related to integration setup, workflow setup, calculation adjustments in business rules, or more
- Translations involving cube and metadata settings, plus the loading and viewing of FX rates
- The security framework, and all the nooks and crannies that can be secured within OneStream
- Constraining and locking data through system-level and process-level controls
- Considerations, as companies mature, for updating new or existing business processes
To access the complete publication, you must purchase either the PDF or the physical copy of the book. Purchases can be made at onestreampress.com.
Table of Contents
- Chapter 1: Introduction
- Chapter 2: Testing
- Chapter 3: Application Properties
- Chapter 4: Metadata Management
- Chapter 5: Translation
- Chapter 6: Work the Workflow
- Chapter 7: Data Troubleshooting
- Chapter 8: Import and Validation Errors
- Chapter 9: Constraining and Locking Data
- Chapter 10: Business Rules
- Chapter 11: Cube Views
- Chapter 12: Securing the Pieces
- Chapter 13: Compliance and Audit
- Chapter 14: Business as "Usual"
- Index
Business Rule | One-to-One Transformation - E# Parent to its first E# Base
We need your help to import flat files:
1. Number of rows: > 100k
2. Organized under E# parent members: a few thousand
3. E# is extended
We've been advised to build a BR to generate a one-to-one Transformation Rule file, mapping each E# parent to its first E# base member. Would any of you SMEs be generous enough to share with us and the community:
- a sample BR that looks up the first E# base member of an E# parent member, under an extended E#
- how to call that BR in an Import Workflow
Thank you, OS SMEs.
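Not a OneStream BR, but the lookup logic itself is small. A Python sketch under stated assumptions: a hypothetical hierarchy export with Parent and Child columns, one row per relationship with children listed in hierarchy order, and a member with no children treated as base. A BR would walk the same depth-first path using the metadata API and write the result out as the transformation rule file.

```python
import csv
from collections import defaultdict

def first_base(member, children):
    """Depth-first: keep descending into the first child until a member
    with no children (a base member) is reached."""
    while children.get(member):
        member = children[member][0]
    return member

# Hypothetical export file with columns Parent, Child.
children = defaultdict(list)
with open("entity_hierarchy.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        children[row["Parent"]].append(row["Child"])

# Emit a one-to-one map: each parent to its first base member.
with open("entity_one_to_one.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    for parent in list(children):
        writer.writerow([parent, first_base(parent, children)])
```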
How to convert a raw Excel file into multiple upload-ready OneStream journals (4 quick steps)
Use case: you generated several journal lines in Excel, saved the file as a CSV, and the upload to OneStream failed. Here's how to clean it up and get it posted.
1) Paste your raw lines into Excel
Paste your raw journal lines into column A. Go to Data → Text to Columns → Delimited → Comma → Finish. You should now see the first row starting with H and the next rows starting with D, with each field in its own column.
2) Clean the values (common blockers)
- ConsName must be a valid Consolidation member in your app (e.g., Entity or Local).
- Leave ParentName blank unless you're 100% sure of the direct parent.
- Don't type a literal None unless it's a real member; leave IC/UD fields blank or use your member when required.
- Make sure the Scenario → Cube mapping matches your CubeName.
- If JournalBalanceType = Balanced, total Debits must equal total Credits per journal (a quick offline check is sketched below).
3) Save as CSV (UTF-8)
- Keep only the upload rows on the active sheet (no notes above or beside them).
- File → Save As → CSV UTF-8 (Comma delimited) (.csv). Accept "only the active sheet will be saved."
4) Import and post in OneStream
- OnePlace → Journals → Import (or your Workflow step).
- Choose File Type = CSV, select your file, and Import.
- Open the journal, Validate/Balance, then Post.
- Run a Consolidation and verify in a Cube View.
Hope this helps some people within the OneStream Universe 😊.
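On the Balanced rule in step 2, it can save a failed import to verify the totals offline first. A minimal Python sketch, assuming a hypothetical layout where column 1 holds the H/D flag and the last two columns of each D row are Debit and Credit:

```python
import csv
from decimal import Decimal

def check_balanced(path):
    """Confirm total debits equal total credits across the file's D rows
    (single-journal file; group by the journal name column for multi-journal files)."""
    debits = credits = Decimal("0")
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row and row[0] == "D":              # detail rows only
                debits += Decimal(row[-2] or "0")  # blank cells count as zero
                credits += Decimal(row[-1] or "0")
    return debits == credits, debits, credits

ok, dr, cr = check_balanced("journal.csv")
print("balanced" if ok else f"out of balance: debits {dr} vs credits {cr}")
```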