Recent Discussions
Workflow Profile Hierarchies: Security Model | Application Security Roles v Workflow Security
Please share your practical advice on securing Workflow Profile Hierarchies per the two requirements below. Thank you.

A. Workflow Profile Hierarchies: Admin vs. Builder/Business
0. 'Cube Root Workflow Profiles' | Admin and Business | Top Level Cube_suffix
1. 'Default/Child Workflow Profile Types' | Admin/Business 'Cube Root Workflow Profiles'
2. Admin 'Parent/Child Workflow Profile Types' | Admin/Business 'Cube Root Workflow Profiles'
3. Business 'Parent/Child Workflow Profile Types' | Business 'Cube Root Workflow Profiles'

B. Two Workflow Requirements: Admin vs. Builder/Business
1. Builder not allowed to see/create/edit:
- Admin 'Cube Root Workflow Profiles'
- 'Default/Child' and Admin 'Parent/Child Workflow Profile Types' | Admin 'Cube Root Workflow Profiles'
- 'Default/Child Workflow Profile Types' | Business 'Cube Root Workflow Profiles'
- Admin 'Parent/Child Workflow Profile Types' | Business 'Cube Root Workflow Profiles'
2. Admin/Builder allowed to see/create/edit:
- Business 'Cube Root Workflow Profiles'
-- Business 'Parent Workflow Profile Types': Review, Base Input, Parent Input
-- Business 'Child Workflow Profile Types': Import, Forms, Journals

C. Security Model: Application Security Roles + Workflow Security | Admin vs. Builder/Business
0. Security Groups: 'Admin Roles and Builder Roles' with Child Groups = 'Admin Roles' + 'Builder Roles'
1. Application Security Roles | Manage Workflow Profiles = 'Admin Roles and Builder Roles'
2. Workflow Security = 'Admin Roles' = Admin/Business 'Default Workflow Profile Types'
3.1. Workflow Security = 'Admin Roles' = Admin 'Cube Root Workflow Profiles'
3.2. Workflow Security = 'Admin Roles' = Admin 'Parent Workflow Profile Types'
4.1. Workflow Security = 'Builder Roles' = Business 'Cube Root Workflow Profiles'
4.2. Workflow Security = 'Builder Roles' = Business 'Parent Workflow Profile Types'

D. Security Model Results: Admin vs. Builder/Business
- 1 + 2 + 3 + 4 = Workflow Requirement 1 = Fail
- 1 + 2 + 3 + 4 = Workflow Requirement 2 = Succeed

Thank you, SMEs.

Errors creating New Task in Data Import Schedule Manager
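One of the errors reported in this thread, Conversion from string "" to type 'Double', is a classic blank-input failure: a rule parses a numeric task parameter without guarding against an empty string. A minimal sketch of the defensive pattern in plain Python (the helper name is hypothetical; this is not the actual DSM_SolutionHelper code):

```python
def parse_double(text, default=0.0):
    """Parse a numeric parameter, treating None or blank input as a
    default value instead of raising, which is what an unguarded
    conversion of "" to Double does."""
    if text is None:
        return default
    s = text.strip()
    if not s:
        return default
    return float(s)
```

Until the solution is patched for 9.x, the practical workaround is usually to make sure every numeric task parameter is actually populated before saving the Task.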
I am getting several errors when setting up a New Task in the Data Import Schedule Manager. We are on OS version 9.01.17403, and I have installed the latest version of DSM, which is 8.4.0_SV100. I suspect this may be a version compatibility issue, so I am curious whether anyone has been able to get this solution to work in a 9.01 application. I have already uninstalled and reinstalled the solution, which didn't resolve the issues. Below are the two errors I am seeing:

1. When choosing Global Scenario from the Scenario(s) drop-down list, I get an immediate error: "Error processing member. The item was not found. Member, 11111111." The details state: Unable to execute Business Rule 'DSM_Paramhelper', where it appears to be trying to call the Global Scenario via OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.GetParameterDisplayInfosUsingDashboardNameCompressed(SessionInfo si, LoadDashboardInfo loadDashboardInfo, Boolean isForDashboardUIWithInteractiveComponents, Dictionary`2 custSubstVarsAlreadyResolved).

2. If I pick a specific Scenario, I get a different error. It allows me to pick the Scenario and Time, but when I save the Task, I get "The input string '' was not in a correct format." The error details show it is an issue with the Business Rule 'DSM_SolutionHelper', where Conversion from string "" to type 'Double' is not valid. The input string '' was not in a correct format. OneStream.Client.Api.DashboardsAjaxServiceReference.DashboardsAjaxServiceClient.StartExecuteSelectionChangedServerTaskCompressed(SessionInfo si, Boolean isSystemLevel, Guid primaryDashboardID, Guid embeddedDashboardID, Guid componentID, PageInstanceInfo pageInstanceInfo, XFSelectionChangedServerTaskInfo serverTaskInfo)

Any advice on how to correct these issues would be greatly appreciated.

Workflow Import - Clearing cube data in v9
Hello, we recently upgraded to v9 and noticed that we can no longer clear WF imported cube data using the "Clear" option through the workflow import (snip below). Prior to the upgrade, we could clear the stage data, then run Import/Validate/Load to clear the data that had been imported to the cube. Now, this option only clears the data from stage, and in order to clear the data loaded to the cube we have to use a Data Management step. In our case this is now difficult, because the data loaded may be specific to workflow channels and also use multiple dimension criteria, not just Entity (which is what the DM step uses). Is anyone else having this issue? Are there any other methods of removing the imported data (loaded via csv file) from the cube other than a DM step? If we try to run the old process in v9 (Clear, then Retransform, then Load to clear the cube data), this error appears.

Posted by TyeshaAdams, 8 days ago

Get dependent entities on a review level workflow
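The aggregation this thread asks about can be described independently of the API: a review-level profile's entity list is the union of the entities assigned to the base-level profiles beneath it. A set-union sketch in plain Python (the profile names and tree shape are made up for illustration; in OneStream this data would come from BRApi workflow metadata calls, which are not shown here):

```python
def review_level_entities(profile, children, assigned):
    """Collect entities for a review-level profile as the union of the
    entities assigned to all profiles underneath it."""
    entities = set(assigned.get(profile, []))
    for child in children.get(profile, []):
        entities |= review_level_entities(child, children, assigned)
    return entities

# Hypothetical tree: one review profile aggregating two base profiles.
children = {"Houston Review": ["Houston Base 1", "Houston Base 2"]}
assigned = {"Houston Base 1": ["E100", "E110"],
            "Houston Base 2": ["E120"]}
```

If GetDependentProfileEntities is not returning what the XFBR needs, walking the child profiles yourself and unioning their assigned entities, as above, is one fallback shape for the parameter's member list.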
Hi All, we have a parameter attached to the entity POV of all our cube views, and it works perfectly at the base level, but not in the review-level workflow. Is there a way to include dependent entities in a review-level workflow via an XFBR String? The entities at the review level are dependent, coming from the assigned entities in the base-level workflows underneath it. I have tried using objList = BRApi.Workflow.Metadata.GetDependentProfileEntities(si, profileKey) with no luck; perhaps I am missing something. A bound list has been tested as well, and attaching the parameter to the POV of a cube view likewise only works in the base-level workflows. When the workflow is a review level, it shows the full list of entities, which we do not want. Is there a way through the XFBR string and a parameter in the member dialog? I appreciate all suggestions.

Posted by cons1, 11 days ago

The supplied Workflow Time is more than 3 periods after the last import period
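The check behind the message in this thread appears to compare the selected workflow period with the last period that actually imported data. A sketch of that arithmetic, assuming monthly periods encoded as (year, month) tuples; the 3-period threshold is taken from the error text, and whether it is configurable is exactly what the question below asks:

```python
def periods_between(last_import, workflow_time):
    """Number of monthly periods from last_import to workflow_time,
    with each period given as a (year, month) tuple."""
    (y1, m1), (y2, m2) = last_import, workflow_time
    return (y2 - y1) * 12 + (m2 - m1)

def supplied_time_too_far(last_import, workflow_time, max_gap=3):
    # Mirrors the error: fail when the workflow time is more than
    # max_gap periods after the last period that imported data.
    return periods_between(last_import, workflow_time) > max_gap
```

This also explains why loading a single "dummy" row stops the error: it advances the last import period, so the gap falls back under the threshold.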
We are currently in the process of upgrading to OFC PV840-SV112 and just applied the upgrade to our Customer Acceptance Testing application, where we have our imports and matching running daily. However, the amount of data imported is much less in this application, as it is using a test database to import the data from, and one of the matchsets has not imported any data for many months. The matching step for that matchset errored out a couple of days after the upgrade with the message "The supplied Workflow Time is more than 3 periods after the last import period". We discovered that if we loaded just one "dummy" row for the matchset, the error stopped happening. My questions are: has anyone else noticed this? And, if so, is there a setting somewhere to either turn this check on or off, or to increase the number of months that it looks back? Thanks for any information you can provide.

Posted by Ruth_Sillman, 12 days ago

How to validate metadata in Journal on submit / post
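The validation this thread asks for is, at its core, a filter over journal line items: collect every line whose Flow member is not in the allowed set and, if any exist, block the submit/post and show the messages. A plain-Python sketch of that logic (member names are hypothetical; in OneStream this would live in a journal event-handler business rule, whose wiring is not shown here):

```python
def validate_journal_flow(lines, allowed_flows):
    """Return one error message per journal line whose Flow member is
    not in the allowed set; an empty list means the journal may be
    submitted or posted."""
    errors = []
    for i, line in enumerate(lines, start=1):
        flow = line.get("Flow")
        if flow not in allowed_flows:
            errors.append(
                f"Line {i}: Flow member '{flow}' is not valid for journals."
            )
    return errors
```

The event handler would run this check on submit and on quick-post, cancel the operation when the list is non-empty, and surface the messages to the user.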
I would like to build a business rule that will execute when a user attempts to submit or quick-post a journal. The rule will check that the correct Flow dimension member has been used and display a message to the user if that is not the case. If anyone has something similar in use, can you please share it? Any advice will be greatly appreciated. Thank you! Yan

Posted by YanSavinsky, 12 days ago

Transformation Rules - Possible Bug
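Before concluding that the transformation engine is adding the space described in this thread, it is worth ruling out invisible characters in the source file itself: a non-breaking space (U+00A0) or other Unicode whitespace after a member like "US4" survives a visual check and many simple trims. A quick scan in plain Python (the column index and delimiter are assumptions about the file layout):

```python
import csv

def find_hidden_whitespace(path, column, delimiter=","):
    """Report rows where the given column carries leading or trailing
    whitespace, including non-ASCII spaces such as U+00A0."""
    hits = []
    with open(path, newline="", encoding="utf-8-sig") as f:
        for row_num, row in enumerate(csv.reader(f, delimiter=delimiter), 1):
            value = row[column]
            if value != value.strip():
                # Show the exact code points so an NBSP is visible.
                hits.append((row_num, [hex(ord(c)) for c in value]))
    return hits
```

If the scan comes back clean on the raw file, that strengthens the case that the space is introduced during transformation and supports escalating it as a product defect.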
I have an issue: since we migrated to v9, OS is adding a space on its own after transforming a member, and this prevents us from loading the data to the member (this shows up in the validation step when it fails). The data source has been checked, adjusted, and tested in many ways; the extra space doesn't come from there. The transformation rules have also been tested in many ways, and the space is not there either; we even tried one-to-one rules, masks, and all of them, and this is not the issue. The error only happens with two UD3 members, US4 and CA4, both ending in 4. I created a fake one for testing, "US41", and that one works fine. I have been checking with OS support, but they just want to repeat over and over the same testing we already did with them on a call. To me, after checking all that I could, this seems like a bug in the new v9.1, but if anyone has any ideas please let me know.

Posted by Luigi_Caroli, 14 days ago

Matrix data load with entities in columns
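If the built-in matrix data source can't flex to a changing number of entity columns, one pre-processing option for the layout this thread describes is to unpivot the file before import, so it always lands in a fixed account/entity/amount shape. A sketch in plain Python (the header layout is an assumption: first column is the account, and every remaining column header is an entity name):

```python
def unpivot_matrix(rows):
    """Turn [['Account','E100','E200'], ['Sales','10','20']] into one
    record per (account, entity, amount), however many entity columns
    the file happens to have."""
    header = rows[0]
    entities = header[1:]          # entity names read from the header row
    out = []
    for row in rows[1:]:
        account = row[0]
        for entity, amount in zip(entities, row[1:]):
            if amount != "":       # skip empty cells
                out.append((account, entity, amount))
    return out
```

Because the entity list is read from the header row at runtime, neither the names nor the count of entity columns needs to be fixed in advance; the import then uses an ordinary row-based data source.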
Is there a way to set up a single matrix data source for a file with accounts in rows and entities in columns, where the entities could change, as well as the number of entities? Can you set up the matrix for a maximum number of entities and read the entity from a specific row?

Posted by DEReam23, 15 days ago

Help Connecting DBeaver to OneStream Internal Database
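The application database is SQL Server, so DBeaver connects with its standard SQL Server driver. A connection-string sketch (the server, port, and database names are placeholders; note that cloud-hosted OneStream environments typically do not expose direct database access, so this generally only applies to self-hosted installs or a database copy you control):

```
jdbc:sqlserver://<db-server-host>:1433;databaseName=<OneStreamAppDb>;encrypt=true;trustServerCertificate=true
```

In DBeaver, create a new SQL Server connection, fill in the host, port, database, and credentials, and let it download the Microsoft JDBC driver; the `encrypt` and `trustServerCertificate` properties above are standard Microsoft JDBC driver options you may need to adjust to match your server's TLS setup.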
Hi everyone, I'm trying to connect to the OneStream internal database so I can access the application data using DBeaver as my database manager. Has anyone successfully done this before? If so, could you share the steps or any tips to get it working? Thanks in advance for your help!

Posted by carbon, 21 days ago

Automate execution of non-import workflow
I've set up Process, Confirm, and Certify workflow steps on the Base Input workflow profile, and I would like to set up an automated run of these plus email notifications for the confirmation results. Does anyone have an example of a business rule for this? I expect it is different from the general one used for automating imports.

Posted by GettingThere, 21 days ago
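While I can't speak to the exact OneStream API calls for driving Process, Confirm, and Certify, the orchestration shell around them is generic: run each step in order, collect failures, and mail a summary. A plain-Python sketch of that shell (the step callables and SMTP details are placeholders you would replace with your business-rule calls and mail server):

```python
import smtplib
from email.message import EmailMessage

def run_steps(steps):
    """Run named step callables in order; each returns True on success.
    Continues past failures so the summary lists everything; break out
    of the loop instead if later steps depend on earlier ones."""
    results, failures = {}, []
    for name, step in steps:
        ok = step()
        results[name] = ok
        if not ok:
            failures.append(name)
    return results, failures

def send_summary(failures, host, sender, recipient):
    """Email a pass/fail summary of the run."""
    msg = EmailMessage()
    msg["From"], msg["To"] = sender, recipient
    msg["Subject"] = ("Workflow run succeeded" if not failures
                      else "Workflow run failed: " + ", ".join(failures))
    msg.set_content("All steps succeeded." if not failures
                    else "Failed steps: " + ", ".join(failures))
    with smtplib.SMTP(host) as smtp:   # placeholder mail server
        smtp.send_message(msg)
```

The steps list would be something like [("process", run_process), ("confirm", run_confirm), ("certify", run_certify)], each wrapping the corresponding workflow action, with send_summary called at the end when failures is non-empty.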