How do you know if you are using the Roslyn compiler?
Hotfix 8.4.3 addresses, among other things, security vulnerabilities found in the third-party Roslyn Compiler DLL. How do you know whether your implementation of custom code is using the Roslyn compiler and is therefore at risk? Is the Roslyn Compiler only used by the WinSCP libraries (and therefore its removal is WinSCP-related), or are there instances where Smart Integration Functions would use the Roslyn compiler outside of WinSCP?
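
A minimal sketch of one way to check whether Roslyn assemblies are present or loaded on an app server, assuming you can run a small diagnostic; the install path below is hypothetical, and this only shows that Microsoft.CodeAnalysis DLLs exist or are loaded, not which business rules rely on them.

    Imports System
    Imports System.IO
    Imports System.Linq

    Module RoslynCheck
        Sub Main()
            ' Roslyn assemblies already loaded in the current process.
            Dim loaded = AppDomain.CurrentDomain.GetAssemblies().
                Where(Function(a) a.FullName.StartsWith("Microsoft.CodeAnalysis", StringComparison.OrdinalIgnoreCase))
            For Each loadedAssembly In loaded
                Console.WriteLine("Loaded: " & loadedAssembly.FullName)
            Next

            ' Roslyn DLLs sitting on disk next to the application (hypothetical path).
            Dim binFolder As String = "C:\Program Files\OneStream Software\OneStreamAppRoot\bin"
            If Directory.Exists(binFolder) Then
                For Each dll In Directory.GetFiles(binFolder, "Microsoft.CodeAnalysis*.dll")
                    Console.WriteLine("On disk: " & dll)
                Next
            End If
        End Sub
    End Module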

Workflow channels with scenario copies
Hi All,

In our multi-year planning process, most entities need to upload four sets of fixed costs (all loaded by the same team, but sent to that entity's finance team by different functional teams), in addition to all the driver-based costs we're calculating (mostly from form inputs). For the four sets of fixed costs, we have four import profiles under the same parent workflow. The entities import their files and then run a DM sequence from a dashboard to run the validate/load for a range of years they can select. When they run this sequence, all import profiles (under the same parent) that have been imported are validated and loaded.

We had not set up workflow channels on the different loads, and everything was working as expected. The different import profiles use some of the same accounts, but different sets of cost centers. We did not see any collision of data when the same accounts were loaded, and if a balance that had previously been loaded in one import was removed from the import template and reimported, the data in that account would be cleared from the cube after the next validate/load (even if that account was only used by one of the four imports).

Now that we have started to copy cube data from one completed scenario (CY) to another blank scenario (CY_v2), so that some changes can be made in CY_v2, we are running into issues with our fixed cost loads in CY_v2 (balances not clearing, other values being overwritten). This makes sense, and we had not considered loading the fixed costs to copied scenarios. We would like to be able to load only one of the four imports (if, say, the G&A load is the only one with changes) in the CY_v2 scenario.

We were hoping that workflow channels on the imports would solve the issue in the copied-to scenarios, but after completing that setup (changing the accounts to NoDataLock, setting up a different data source member for each of the four imports, creating the channels, and applying the channels to the data source members), we are still seeing balances "stuck" in CY_v2 when previously loaded balances to a unique account are removed from the latest upload for one of the imports. We had also gone back and reloaded the "source" scenario with the channels applied before copying the data and trying to load only a single template to CY_v2.

Is there a way to use workflow channels so that all data loaded through a channel is cleared and replaced when a new import is loaded to a data source member where the channel is applied, or will accounts no longer present in the latest import be left uncleared? The documentation on the Level 2 and Level 3 data units seems to indicate that those accounts will not be cleared, but that does not match the behavior we see in the original loads of multiple sibling import profiles (described in my second paragraph). It also doesn't seem to match how a typical, iterative planning process would work.

Any feedback would be appreciated.

Thanks,
James

Transformation Rules - Possible Bug
I have an issue: since we migrated to V9, OneStream is adding a space on its own after transforming a member, and this prevents us from loading the data to that member (it fails at the validation step). The data source has been checked, adjusted, and tested in many ways; the extra space doesn't come from there. The transformation rules have also been tested in many ways, and the space isn't coming from them either; we even tried one-to-one rules, masks, everything, and that is not the issue. The error only happens with two UD3 members, US4 and CA4, both ending in 4. I created a fake member, "US41", for testing, and that one works fine. I have been checking with OS support, but they just want to repeat the same testing we already did with them on a call. After checking everything I could, this looks to me like a bug in the new V9.1, but if anyone has any ideas, please let me know.
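
Not a fix for the root cause, but a minimal sketch of why a trailing space blocks the load, and of a trim-based workaround you could try in a complex expression or pre-load step; the member names mirror the post, and where the trim would be applied is an assumption.

    ' Hypothetical illustration: a trailing space makes the transformed name
    ' differ from the UD3 member as it is defined in the dimension.
    Dim transformed As String = "US4 "   ' value observed after transformation in V9.1
    Dim ud3Member As String = "US4"      ' member as defined in the UD3 dimension
    Console.WriteLine(transformed = ud3Member)            ' False - validation fails
    Console.WriteLine(transformed.TrimEnd() = ud3Member)  ' True  - trim workaround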

Matrix data load with entities in columns
Is there a way to set up a single matrix data source for a file with accounts in rows and entities in columns, where the entities, as well as the number of entities, could change? Can you set up the matrix for a maximum number of entities and read the entity names from a specific row?
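
Not an answer for the matrix data source itself, but a minimal sketch of a pre-processing alternative, assuming you can reshape the file before import: read the header row to discover however many entity columns there are, then emit one record per account/entity pair. The file path and comma delimiter are hypothetical.

    Imports System.IO
    Imports System.Linq

    Module MatrixPivot
        Sub Main()
            ' Header row holds the entity names; the first column holds the account.
            Dim lines = File.ReadAllLines("C:\Temp\matrix_load.csv") ' hypothetical path
            Dim entities = lines(0).Split(","c).Skip(1).Select(Function(s) s.Trim()).ToArray()
            For Each row In lines.Skip(1)
                Dim cells = row.Split(","c)
                Dim account = cells(0).Trim()
                ' Emit Entity,Account,Amount triples, skipping empty cells.
                For i As Integer = 1 To cells.Length - 1
                    If i - 1 < entities.Length AndAlso cells(i).Trim() <> "" Then
                        Console.WriteLine(entities(i - 1) & "," & account & "," & cells(i).Trim())
                    End If
                Next
            Next
        End Sub
    End Module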

Identifying API Parameters for OneStream Audit Log Export via Postman
Hello all, I joined a company that has implemented a OneStream solution. I have access to the platform and am still exploring how it was implemented. I am currently working on integrating OneStream with an external platform and need to export audit logs using the product's API. While setting up the request in Postman, I've encountered difficulty identifying the correct values for the following parameters:
ApplicationName
AdapterName
Despite reviewing the available documentation, these parameters remain unclear. Has anyone successfully queried audit logs via the OneStream API who can share insights on where these parameters are defined or how to retrieve them? Any guidance or examples would be greatly appreciated. Thank you in advance!

Connector rule - Drill back on dimension using a business rule logical operator
Hi, in a connector business rule, when a dimension has a business rule logical operator, is there a better way to build the SQL drill-back query than reverse-engineering what the business rule is doing? In the example below, the business rule brings "Zero" to the stage when the source UD1 is null. The code in the drill back reverses that to get the right source data:

    'UD1
    If sourceValues.Item(StageTableFields.StageSourceData.DimUD1).ToString.XFEqualsIgnoreCase("Zero") Then
        whereClause.Append("And (Department IS NULL Or Department = '') ")
    Else
        whereClause.Append("And (Department = '" & SqlStringHelper.EscapeSqlString(sourceValues.Item(StageTableFields.StageSourceData.DimUD1).ToString) & "') ")
    End If

However, this is a simple business rule. I am wondering if there is a way of getting the source data before it is passed through the logical operator business rule, to reduce code complexity in the drill back. Thank you
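
I am not aware of a stage API that exposes the pre-rule source value inside the drill-back, so the sketch below takes a different angle (an assumption, not OneStream's API): keep the forward mapping and its reverse in one shared helper, so the connector and the drill-back cannot drift apart even though the drill-back still inverts the rule.

    ' Hedged sketch: centralise the "null department maps to Zero" logic so the
    ' connector and the drill-back share it instead of duplicating it.
    Public Module DepartmentMapping
        ' Forward mapping used by the connector (source value -> stage UD1).
        Public Function ToStageUd1(sourceDept As String) As String
            If String.IsNullOrWhiteSpace(sourceDept) Then Return "Zero"
            Return sourceDept
        End Function

        ' Reverse mapping used by the drill-back (stage UD1 -> SQL predicate).
        Public Function ToDrillBackPredicate(stageUd1 As String) As String
            If stageUd1.Equals("Zero", StringComparison.OrdinalIgnoreCase) Then
                Return "And (Department IS NULL Or Department = '') "
            End If
            Return "And (Department = '" & stageUd1.Replace("'", "''") & "') "
        End Function
    End Module

The drill-back then reduces to whereClause.Append(DepartmentMapping.ToDrillBackPredicate(sourceValues.Item(StageTableFields.StageSourceData.DimUD1).ToString)).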

FTP File Load Best Practice
When I pull a flat file from an SFTP server, I have been parsing that file in VB.NET into a DataTable and loading it that way. Is this the best way to do this, or is there a way to make the Import read the file from the filesystem once it has been downloaded? I feel like parsing a CSV file using Split() is risky. Thanks, Scott
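
On the Split() concern specifically, a minimal sketch of parsing the downloaded file into a DataTable with TextFieldParser, which handles quoted fields and embedded delimiters that a naive Split() breaks on; the assumption that the first row is a header is hypothetical.

    Imports System.Data
    Imports Microsoft.VisualBasic.FileIO

    Module CsvToDataTable
        Function LoadCsv(path As String) As DataTable
            Dim table As New DataTable()
            Using parser As New TextFieldParser(path)
                parser.TextFieldType = FieldType.Delimited
                parser.SetDelimiters(",")
                parser.HasFieldsEnclosedInQuotes = True
                ' First row is assumed to be the header.
                For Each header In parser.ReadFields()
                    table.Columns.Add(header)
                Next
                ' Remaining rows become DataRows; quoted commas are handled correctly.
                While Not parser.EndOfData
                    table.Rows.Add(parser.ReadFields())
                End While
            End Using
            Return table
        End Function
    End Module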

Loading Amount and Annotations together
Hi, not sure if anyone has come across this one, but I thought it was interesting to share. If you have a file that has the amount and the annotation on the same line, you can set View to YTD (or Periodic), then in the text value complex expression use this API property:

    api.Parser.TextValueRowViewMember = "Annotation"

That will make the parser automatically generate the line(s) with the annotation whenever the text value has a value. HTH
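
A minimal sketch of how that property might sit inside a text value complex expression, assuming a delimited source where the annotation is in, say, the sixth column; the column index and the use of api.Parser.DelimitedParsedValues to read it are assumptions on my part, and the only call taken from the tip above is TextValueRowViewMember.

    ' Hedged sketch of a text value complex expression (column index is hypothetical).
    Dim comment As String = api.Parser.DelimitedParsedValues(5)
    If Not String.IsNullOrWhiteSpace(comment) Then
        ' Ask the parser to emit an extra row on the Annotation view member for this line.
        api.Parser.TextValueRowViewMember = "Annotation"
    End If
    Return comment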