Validation Error: No security access to data cell being cleared
Hi Team, We have a client that uploads Excel templates into OneStream using XFDRange. They have different files per Business Unit (UD2). Two of the files run into the intersection error in the title during the validation step. All the other files are correct; however, when they upload these two files, all data intersections (from all the other files) suddenly become invalid intersections as well. For example, the line listed here is not from either of the two files but from a separate file that has no errors when uploaded individually. When you navigate to "Intersection" instead of "Invalid Intersection (1)", all data intersections from all files are there.

Only the local users get this validation error; when I (an administrator) run "Validate" from this screenshot, the error disappears. The strange thing is that they have uploaded data on this intersection before, in previous Forecast scenarios, without errors. Security has not changed for the application, user, workflow profile, entity, etc. These two files both have an account that is only used there, but it is set up in the same way as all the other accounts (and could previously be loaded in other scenarios), so that could just be a coincidence. Application security: ModifyData is set to Everyone.

One thing to note is that this is a new Forecast scenario we recently created for them; however, it is set up the same way as the other Forecast scenarios in terms of security. I am also having a hard time understanding the error and why it refers to "clearing data" in the Validate step. Can anyone help me understand the error a bit better, or does anyone know what could cause this? Thanks in advance!

Validation Errors mapping to placeholder
Hey Folks, I am looking to map all validation errors that might occur during an automated load to a placeholder member in the UD1 dimension, but I am running into issues because the mask rule approach is not working. Does anyone have another way to make this work? Below is what I have tried, but I still get kickouts in the validation step of the load process.
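Not a OneStream-specific answer, but for what it's worth, this is the fallback behaviour a catch-all rule (or a complex expression) would need to implement: look the source value up against the explicit mappings and return a placeholder member whenever nothing matches, so no record is left to kick out. A minimal sketch only; "UD1_Unmapped" and the sample mappings are hypothetical names, and this is plain VB.NET rather than the transformation rule engine itself.

```vb
Imports System.Collections.Generic

Module Ud1FallbackMapping

    ' Explicit UD1 mappings that would normally live in one-to-one rules.
    Private ReadOnly ExplicitMap As New Dictionary(Of String, String) From {
        {"CC100", "UD1_Sales"},
        {"CC200", "UD1_Marketing"}
    }

    ' Return the mapped member if one exists, otherwise the placeholder,
    ' so nothing is left unmapped and kicked out at validation.
    Public Function MapUd1(sourceValue As String) As String
        Dim target As String = Nothing
        If Not String.IsNullOrWhiteSpace(sourceValue) AndAlso ExplicitMap.TryGetValue(sourceValue.Trim(), target) Then
            Return target
        End If
        Return "UD1_Unmapped"
    End Function

    Sub Main()
        Console.WriteLine(MapUd1("CC100"))   ' UD1_Sales
        Console.WriteLine(MapUd1("CC999"))   ' UD1_Unmapped
        Console.WriteLine(MapUd1(""))        ' UD1_Unmapped
    End Sub

End Module
```

If a catch-all rule like this is in place and kickouts still appear, it may be worth checking whether the kickouts are actually coming from a dimension other than UD1.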

Composite mapping
I'm fairly new to OneStream. I have a requirement where, if the ICP is P213, the target entity should be "Chicago" regardless of the source entity, and if the source entity is "E123", the target entity should be "New York". The problem I'm facing is that when "E123" is in the source file, it only applies the "E123" -> "New York" mapping; it never checks whether the ICP is P213.
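If I remember the rule precedence correctly, one-to-one rules are evaluated ahead of composite rules, which would explain why "E123" -> "New York" wins before the ICP is ever looked at; treat that as an assumption to verify rather than a confirmed cause. Either way, the logic you want is "ICP condition first, entity rule second", which a complex expression (or restructured rules) would need to express. A minimal, OneStream-agnostic sketch of that precedence; the member names come from the post and "UnmappedEntity" is a hypothetical fallback.

```vb
Module EntityMappingSketch

    Public Function MapTargetEntity(sourceEntity As String, icp As String) As String
        ' ICP-driven rule wins regardless of the source entity.
        If String.Equals(icp, "P213", StringComparison.OrdinalIgnoreCase) Then
            Return "Chicago"
        End If
        ' Only when the ICP rule does not apply, use the entity rule.
        If String.Equals(sourceEntity, "E123", StringComparison.OrdinalIgnoreCase) Then
            Return "New York"
        End If
        Return "UnmappedEntity"
    End Function

    Sub Main()
        Console.WriteLine(MapTargetEntity("E123", "P213"))  ' Chicago
        Console.WriteLine(MapTargetEntity("E123", "P500"))  ' New York
        Console.WriteLine(MapTargetEntity("E900", "P500"))  ' UnmappedEntity
    End Sub

End Module
```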

S4/HANA - URL drill-back gets truncated in OS
Has anyone come across a bug that causes the S4/HANA URL drill-back to get truncated on execution? We are calling a remote URL that opens the Fiori app. The actual URL gets cut off at the # sign when the OneStream drill-back opens it in the browser. Simple code in use; the drill-back code is: drillBackInfo.DisplayType = ConnectorDrillBackDisplayTypes.WebUrlPopOutDefaultBrowser. OneStream Support reported this as a bug in the solution. Any thoughts?
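Not a confirmed fix, but one thing that may be worth testing while the bug is open: percent-encode the fragment before the URL is handed to the drill-back, so nothing downstream can treat the "#" as a break point. The sketch below is only the string handling in plain VB.NET, with a placeholder Fiori URL; note that if the Fiori app relies on client-side "#" routing, the target side may need to decode the fragment again, for example via a small redirect page.

```vb
Module DrillBackUrlSketch

    ' Percent-encode everything from the "#" onward ("#..." becomes "%23...").
    Public Function EncodeFragment(url As String) As String
        Dim hashPos As Integer = url.IndexOf("#"c)
        If hashPos < 0 Then Return url
        Dim baseUrl As String = url.Substring(0, hashPos)
        Dim fragment As String = url.Substring(hashPos)   ' includes the "#"
        Return baseUrl & Uri.EscapeDataString(fragment)
    End Function

    Sub Main()
        ' Placeholder host and route, not a real Fiori URL.
        Dim fioriUrl As String = "https://myhost.example.com/fiori#Shell-home"
        Console.WriteLine(EncodeFragment(fioriUrl))
        ' https://myhost.example.com/fiori%23Shell-home
    End Sub

End Module
```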

Automation of exchange rates calculations
Dear colleagues, please advise whether you have experience with automating exchange rate calculations. We need to pick up daily rates for different currencies from different source websites, calculate monthly average rates and end-of-reporting-period rates, and upload them to the correct time periods in OneStream. Where can I find guidance on solving this task?
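I don't know of a packaged solution, but the calculation step is straightforward once the daily rates have been collected (fetching from the source websites and loading into OneStream are separate pieces). A minimal sketch, assuming the daily rates for one currency pair are already in memory: the monthly average is the mean of the daily rates, and the period-end rate is the rate on the last available day of the period.

```vb
Imports System.Collections.Generic
Imports System.Linq

Module FxRateSketch

    ' One daily observation for a single currency pair, e.g. EUR to USD.
    Public Class DailyRate
        Public Property RateDate As Date
        Public Property Rate As Decimal
    End Class

    ' Average of all daily rates in the period.
    Public Function MonthlyAverage(rates As IEnumerable(Of DailyRate)) As Decimal
        Return rates.Average(Function(r) r.Rate)
    End Function

    ' Rate on the last available day of the period (the closing rate).
    Public Function ClosingRate(rates As IEnumerable(Of DailyRate)) As Decimal
        Return rates.OrderBy(Function(r) r.RateDate).Last().Rate
    End Function

    Sub Main()
        Dim march As New List(Of DailyRate) From {
            New DailyRate With {.RateDate = #3/1/2024#, .Rate = 1.083D},
            New DailyRate With {.RateDate = #3/15/2024#, .Rate = 1.089D},
            New DailyRate With {.RateDate = #3/29/2024#, .Rate = 1.079D}
        }
        Console.WriteLine("Average: " & MonthlyAverage(march).ToString())
        Console.WriteLine("Closing: " & ClosingRate(march).ToString())
    End Sub

End Module
```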

Need physical path to file
I am implementing a BR that pulls a file from FTP. The file is encrypted, so I need to decrypt it using a third-party library. In order to decrypt it, I need to pass in a physical path to the PGP key provided. I uploaded the key to the Public folder of the file share for now, but I cannot seem to find a way to get to the actual physical file. Does anyone know whether uploading a file to the file share results in an actual file being created on the server, or is it just stored as a BLOB in the database?
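My understanding, so treat it as an assumption, is that files uploaded to the application file share are stored in the database rather than as plain files on the server's disk. The usual workaround is to read the key's bytes through whatever retrieval call you already use, write them to a temp file on the server, pass that path to the PGP library, and delete the file afterwards. A minimal sketch of just the temp-file handling in plain .NET; the commented-out DecryptWithPgp call is a placeholder for your third-party library.

```vb
Imports System.IO

Module PgpKeyTempFileSketch

    ' Write the key bytes to a temp file, use the path, always clean up.
    Public Sub DecryptUsingTempKeyFile(keyBytes As Byte(), encryptedFilePath As String)
        Dim tempKeyPath As String = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N") & ".asc")
        Try
            File.WriteAllBytes(tempKeyPath, keyBytes)
            ' Placeholder: call the third-party PGP library here,
            ' passing the physical path it expects.
            ' DecryptWithPgp(encryptedFilePath, tempKeyPath)
        Finally
            If File.Exists(tempKeyPath) Then File.Delete(tempKeyPath)
        End Try
    End Sub

    Sub Main()
        ' Demo with dummy bytes; in the BR, keyBytes would come from the file share.
        DecryptUsingTempKeyFile(New Byte() {1, 2, 3}, "C:\temp\incoming.pgp")
    End Sub

End Module
```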

Processing Blank Columns | Direct Import/Load v Standard Import
Rolling Quarterly Forecasts - to import 12-month flat files - several times daily during a cycle - 12-month matrix data source.
1. Forecast Q1 '25 flat file - columns Q1-Q4 '25 = amounts
2. Forecast Q2 '25 flat file
   a. Columns Q1 '25 = blanks
   b. Columns Q2-Q4 '25 = amounts
   c. Q1 '25 to be refilled with actuals - using a DM job after the initial import
3. Forecast Q2 '25 flat file - columns Q1 '25 = blanks
   a. Direct Import/Load - wiped out the refilled actuals in Q1 '25 = blanks
   b. Standard Import (Import, Validate, Load) - kept the refilled actuals in Q1 '25
4. Blank columns - Q1 '25, etc. = prior periods
   a. Why do Direct and Standard process blank columns differently?
   b. How can Direct be made to process blank columns like Standard, i.e. keep the actuals in Q1 '25/prior periods in the OS data cubes (see the sketch below)?
TY.
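On 4a I can only offer my understanding, not documentation: the standard Import/Validate/Load path only writes the cells the parser actually produced, so a blank column simply yields no record, whereas a direct load replaces the whole slice it is loading, so blank prior-period columns come through as clears. On 4b, one workaround is to drop blank period columns from each row before the direct load so those cells are never touched. A minimal, OneStream-agnostic sketch of that filtering step; the period column names and layout are assumptions.

```vb
Imports System.Collections.Generic

Module BlankColumnFilterSketch

    ' Keep only the period amounts that actually contain a value, so a
    ' direct-style load never writes (and therefore never clears) the
    ' blank prior-period columns.
    Public Function NonBlankPeriods(periodValues As IDictionary(Of String, String)) As Dictionary(Of String, Decimal)
        Dim result As New Dictionary(Of String, Decimal)
        For Each kvp In periodValues
            Dim parsed As Decimal
            If Not String.IsNullOrWhiteSpace(kvp.Value) AndAlso Decimal.TryParse(kvp.Value, parsed) Then
                result(kvp.Key) = parsed
            End If
        Next
        Return result
    End Function

    Sub Main()
        ' Q2 forecast row: Q1 columns are blank, Q2 columns carry amounts.
        Dim row As New Dictionary(Of String, String) From {
            {"2025M1", ""}, {"2025M2", ""}, {"2025M3", ""},
            {"2025M4", "125.5"}, {"2025M5", "130"}, {"2025M6", "140.25"}
        }
        For Each kvp In NonBlankPeriods(row)
            Console.WriteLine(kvp.Key & " = " & kvp.Value.ToString())
        Next
    End Sub

End Module
```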