Import TextValue from Matrix-Excel
Hi, I have an Excel template file to import via the stage. The Excel uses a matrix style for the values (time in columns) and also has one column with a text commentary. Since I have to change the View to “Annotation” in order to import a text value, I was thinking about duplicating the rows with a derivative rule and changing the View in that rule. The outcome shows the duplicated rows, but they are missing the text value, so this approach is not working. I also tried duplicating one time column and using a parser rule to change the View. Doing so, I would need to identify which time period I am in during parsing, as otherwise all rows are changed. Any ideas? Regards, Tobias
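One possible workaround, sketched below for illustration only: instead of duplicating rows inside the stage, pre-process the exported file so that every amount gets its own unpivoted row with an explicit View value, plus one extra "Annotation" row per source row carrying the commentary. This is plain .NET file handling, not OneStream API code; the delimiter, the column layout (Entity first, time periods in the middle, commentary last), the file names and the period the annotation is written against are all assumptions to be adjusted.

    ' Pre-processing sketch (assumed layout: Entity;2023M1;2023M2;...;Comment, header in line 1).
    ' Writes one Periodic row per amount and one Annotation row per source row.
    Imports System
    Imports System.Collections.Generic
    Imports System.IO

    Module UnpivotMatrixFile
        Sub Main()
            Dim inputLines As String() = File.ReadAllLines("MatrixExport.csv")
            Dim header As String() = inputLines(0).Split(";"c)
            Dim commentCol As Integer = header.Length - 1
            Dim output As New List(Of String) From {"Entity;Time;View;Value"}

            For i As Integer = 1 To inputLines.Length - 1
                Dim cells As String() = inputLines(i).Split(";"c)

                ' One row per time column, carrying the amount with View = Periodic.
                For c As Integer = 1 To commentCol - 1
                    output.Add($"{cells(0)};{header(c)};Periodic;{cells(c)}")
                Next

                ' One extra row for the commentary, with View = Annotation.
                ' Written against the first time column here - an assumption to adjust.
                If cells(commentCol).Trim().Length > 0 Then
                    output.Add($"{cells(0)};{header(1)};Annotation;{cells(commentCol)}")
                End If
            Next

            File.WriteAllLines("UnpivotedExport.csv", output)
        End Sub
    End Module

The point of the sketch is only that the text value survives because it travels on its own row with its own View; the same unpivot could just as well be done in the Excel itself before export.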

Data Imports

I have set up a new data source and the associated transformation rules (and assigned them to the appropriate workflow). Data importing works correctly for the 2022 time period, but I get the following error in any other time period. I have the Scenario set as 'Current DataKey' and Time as a matrix DataKey in my columns.

Summary: No valid DataKeys (Scenario / Time) found in data source. Review the source data load processing log and check one-to-one transformation rules to ensure that you have created proper Scenario and Time dimension rules for this data source.

Any ideas? I am assuming this might be one simple setting change somewhere.

Importing data for multiple time periods

How can I import data for multiple time periods all in one go? I was told that it's possible, but I'm wondering what settings I need to change in order to do that. I tried using "Replace All Time", but the Time is set to "Current". I'm uploading an Excel-based file. Any guidance on this would be appreciated.

Import files from FTP Connection

Hello, I'm new to OneStream and now have a requirement from a client: they want to automatically import files that will be the data source of a Workflow Profile import step.

1. The files are on an FTP server (FTP connection). Does that mean I need to add a gateway-type connector on the current application server? Also, what do I need to define in the connection string?
2. Then I need to create a business rule for the connector.
3. Since the files are delimited files, I need to create a new delimited-file data source in the Data Sources menu and map the columns to the corresponding dimensions that exist in OneStream, right? Also, in the connector settings, I need to assign the data source to the connector business rule I created previously, right?
4. On the Workflow Profile, I will assign the new delimited-file data source.

Are these the correct steps? Thank you all. Ricardo
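On point 2, the FTP download itself can be done with standard .NET classes from inside a connector business rule; the sketch below covers only that piece. The server address, credentials and paths are placeholders, and the rest of the connector (handing the downloaded delimited file over to the data source) is deliberately left out because it depends on how the import is set up.

    ' FTP download sketch using the standard .NET FtpWebRequest class.
    ' Server, credentials and file paths are placeholders only.
    Imports System
    Imports System.IO
    Imports System.Net

    Module FtpDownloadSketch
        Sub Main()
            Dim request As FtpWebRequest =
                DirectCast(WebRequest.Create("ftp://ftp.example.com/outbound/GLExtract.csv"), FtpWebRequest)
            request.Method = WebRequestMethods.Ftp.DownloadFile
            request.Credentials = New NetworkCredential("ftpUser", "ftpPassword")

            ' Stream the remote file to a local folder that the import can pick up.
            Using response As FtpWebResponse = DirectCast(request.GetResponse(), FtpWebResponse)
                Using remoteStream As Stream = response.GetResponseStream()
                    Using localFile As FileStream = File.Create("C:\FileShare\Incoming\GLExtract.csv")
                        remoteStream.CopyTo(localFile)
                    End Using
                End Using
            End Using
        End Sub
    End Module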

Execute Import > Validate > Process using a button on a dashboard

Hi, I have a dashboard that executes an Import > Validate > Process using a business rule. In my workflow I am currently using Workspace, Import, Validate, Process, Confirm. I would like to use just Workspace, Confirm, but that gives me the following error: "Cannot execute step because the specified step classification doesn't exist for the workflow profile." Is there any way around this? Thanks, Mark

Clear data from Workflow import

Hey everyone - how do I clear data from these manual workflow imports automatically? Right now I've been going through each month one by one, and I'm wondering if there is a business rule that can run them all together and clear the data for those specific imports and time periods. Here are the steps I want to follow: navigate to the import that needs to be cleared > Clear > Re-Transform > Validate > Load Cube.

"Cannot execute step because the prior workflow step is not completed"

We have Workflow A and Workflow B. We get an error on the Load Cube step in Workflow B when Workflow A is in the Validate step. The error is "Cannot execute step because the prior workflow step is not completed. (WP#WorkflowA.Import:S#Actual:T#2022M1)" (the actual workflow name has been changed to protect the innocent). Our question is: what is causing this issue and how do we fix it? It's not clear in the documentation what exactly triggers this error message. The parameters passed back imply that you cannot load to a scenario and time while another workflow for that scenario and time is still open. Workflow A and Workflow B load to the same scenario and time, but to different accounts.

Entry with the same key already exists

Hello, I am receiving the error "Entry with the same key already exists" when trying to load an XML file through a workflow. I have checked all of the dimensions, transformation rules, data sources, etc. and am not finding any duplication. Looking online, I see recommendations on other platforms where the issue relates to a ListDictionary and an Add within the code that tries to add something that may already exist. I am curious whether anyone has run across this issue and how you resolved it.

Summary: An entry with the same key already exists.
----------------------------------------
Description: An entry with the same key already exists.
Error Time: 3/3/2023 10:03:03 PM
Error Level: Error
Tier: AppServer
App Server XF Version: 6.8.1.13230
App Server OS Version: Microsoft Windows NT 10.0.14393.0
Total Memory: 68,719,005,696 (64.00 GB)
Memory In Use: 1,979,629,568 (1.84 GB)
Private Memory In Use: 2,175,569,920 (2.03 GB)
Peak Memory In Use: 2,666,090,496 (2.48 GB)
Maximum Data Records In RAM: 12,884,813
Maximum Data Units In RAM: 100,000
Number Of Threads: 81
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1588, method ParseAndTransform
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 852, method InitializeTransformer
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1105, method InitializeDataCache
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1031, method InitializeDimensionListCache
Stack Trace:
at OneStream.Stage.Engine.Transformer.InitializeDimensionListCache(SessionInfo si, String cubeName, Int32 scenarioTypeID, TransformDataCache dataCache) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1031
at OneStream.Stage.Engine.Transformer.InitializeDataCache(SessionInfo si, WorkflowUnitClusterPk wfClusterPk, Int32 DataPageSize, Int32 PagesInMemoryLimit) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1105
at OneStream.Stage.Engine.Transformer.InitializeTransformer(SessionInfo si, WorkflowUnitPk wfUnitPk, String sourceFilePath, Boolean retransformingOnly, TaskActivityStepWrapperItem parentTaskActivityStep) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 852
at OneStream.Stage.Engine.Transformer.ParseAndTransform(SessionInfo si, WorkflowUnitPk wfUnitPk, String sourceFilePath, TransformLoadMethodTypes loadMethod, Boolean deleteFilesFromOSAfterArchiving, Guid taskActivityID) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1588
at OneStream.Stage.Engine.ParseAndTransformThread.WorkerThreadMethod() in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\ParseAndTransformThread.vb:line 85
----------------------------------------
Exception Type: Unknown
Message: An entry with the same key already exists.
Stack Trace:
at System.ThrowHelper.ThrowArgumentException(ExceptionResource resource)
at System.Collections.Generic.TreeSet`1.AddIfNotPresent(T item)
at System.Collections.Generic.SortedDictionary`2.Add(TKey key, TValue value)
at OneStream.Stage.Engine.Transformer.InitializeDimensionListCache(SessionInfo si, String cubeName, Int32 scenarioTypeID, TransformDataCache dataCache) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1031
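The innermost exception here is standard .NET behaviour rather than anything OneStream-specific: SortedDictionary.Add throws exactly this message when the same key is inserted twice. In this stack the dictionary is being filled inside InitializeDimensionListCache, which suggests that something in the dimension lists the transformer caches for the cube / scenario type resolves to a duplicate key, even if nothing looks duplicated at first glance; that is only an inference from the frame names, but it narrows down where to look. A minimal, plain-.NET repro of the message:

    ' Minimal repro of the inner exception. SortedDictionary.Add throws
    ' ArgumentException ("An entry with the same key already exists.") on a duplicate key.
    Imports System
    Imports System.Collections.Generic

    Module DuplicateKeyRepro
        Sub Main()
            ' Case-insensitive comparer used here only to show how two names that look
            ' different can still collide as the same key.
            Dim cache As New SortedDictionary(Of String, Integer)(StringComparer.OrdinalIgnoreCase)
            cache.Add("Cash", 1)
            cache.Add("CASH", 2)   ' throws: An entry with the same key already exists.

            ' The usual guard when duplicates must be tolerated:
            ' If Not cache.ContainsKey(name) Then cache.Add(name, id)
        End Sub
    End Module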

Data load for specific time period pulling data from other time periods

Hi all, I'm having an issue whereby I'm importing profit and loss actuals data via a CSV file into one of the data upload workflow steps we have set up. However, some of the data is being loaded into months it shouldn't be, even though in the CSV file the data points to the correct time period. For example, in the upload file data is put against time periods such as 2023M1, 2023M2, etc. Having imported the data, validated it and loaded it to our reporting cube, some of the data appears in both 2023M1 and 2023M2, when in the upload file it is only there once, against 2023M1. I've checked the transformation rules and everything is set up correctly in terms of translating the import file, so I was wondering if anyone has come across this type of issue before, or if anyone has any advice as to where I could look next? I'm pretty new to OneStream, so any help would be much appreciated. Thanks, Ged

Actuals data load issue - YTD adjustment being loaded

Hi all, I have an issue whereby I'm loading our company actuals into OneStream, but if a General Ledger code has no data in the current month yet had data in previous months, OneStream seems to assume the YTD value is zero, and so the opposite of the previous months' total is loaded. For example, the YTD total of repairs & maintenance may be £100 at Apr-22, and in May-22 there are no further transactions, so the YTD position is still £100. However, as no transactions occurred in May, when the data is loaded -£100 appears in May. I've checked the Data Source and transformation rules we have set up, and I can't see a reason why this would be happening, as it's all set up as 'Periodic', so only the data from the initial load should be included. Does anyone have any ideas as to where the problem may lie, or where I could look to see why this is happening? Many thanks, Ged
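To make the suspected mechanism concrete (this only illustrates the arithmetic the post describes, not how this particular load is actually configured): if the incoming values behave as YTD and the monthly movement is derived by differencing, a missing May record acts like a YTD value of zero, so the derived May movement becomes the negative of April's cumulative total.

    ' Illustration of the YTD-to-periodic arithmetic described above.
    Imports System

    Module YtdToPeriodicSketch
        Sub Main()
            Dim ytdApr As Decimal = 100D  ' YTD repairs & maintenance at Apr-22
            Dim ytdMay As Decimal = 0D    ' no May record loaded, so YTD is treated as zero

            ' Periodic movement = current YTD - prior YTD
            Dim periodicMay As Decimal = ytdMay - ytdApr
            Console.WriteLine(periodicMay) ' prints -100, the unexpected May amount
        End Sub
    End Module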