Import TextValue from Matrix-Excel
Hi, I have an Excel template file to import via the stage. This Excel uses a matrix-style layout for the values (time in columns) and also one column with a text commentary. Since I have to change the view to “Annotation” in order to import a text value, I thought about duplicating the rows with a derivative rule and changing the view in that rule. The outcome shows the duplicated rows, but they are missing the text value, so this approach is not working. I also tried duplicating one time column and using a parser rule to change the view. Doing so, I would need to identify which time period I am in during parsing, as otherwise all rows are changed. Any ideas about that? Regards, Tobias
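The transformation Tobias describes amounts to unpivoting the matrix (time in columns) into one record per period, plus one extra record per source row that carries the commentary under an Annotation-style view. This is not OneStream code — the field names, the "Periodic"/"Annotation" labels, and the choice to attach the comment to the first period are all illustrative assumptions — but a plain-Python sketch of the row-duplication logic looks like this:

```python
# Hypothetical sketch (not the OneStream derivative/parser rule API):
# unpivot a matrix-style row into one record per time column, then emit
# one extra record carrying the text commentary with the view switched
# to "Annotation" so the text value survives the duplication.

def unpivot_matrix(rows, time_columns, text_column):
    """Turn each matrix row into per-period records plus a commentary record."""
    records = []
    for row in rows:
        for period in time_columns:
            records.append({
                "account": row["account"],
                "time": period,
                "view": "Periodic",       # numeric values keep the normal view
                "value": row[period],
            })
        # The commentary is shared by the whole row; attach it once
        # (here, arbitrarily, to the first period) with the Annotation view.
        records.append({
            "account": row["account"],
            "time": time_columns[0],
            "view": "Annotation",
            "value": row[text_column],
        })
    return records

source = [{"account": "Sales", "2023M1": 100.0, "2023M2": 110.0,
           "comment": "Includes one-off rebate"}]
flat = unpivot_matrix(source, ["2023M1", "2023M2"], "comment")
```

The point of the sketch is the last `append`: the duplicated row must copy the text value explicitly, which is the step that appears to be missing when the derivative rule produces rows without the commentary.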
Importing data for multiple time periods

How can I import data for multiple time periods all in one go? I was told that it's possible, but I'm wondering what settings I need to change in order to do that. I tried using "Replace All Time", but the Time is set to "Current". I'm uploading an Excel-based file. Any guidance on this will be appreciated.
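For a multi-period load, each source row generally needs to carry its own time key rather than inheriting the workflow's "Current" time. A minimal sketch (illustrative data only, not OneStream code) of rows that each carry a period and can therefore be grouped into several target periods in one pass:

```python
from collections import defaultdict

# Hypothetical source layout: every row carries its own time key,
# so one load can address several periods at once.
rows = [
    ("Sales", "2023M1", 100.0),
    ("Sales", "2023M2", 110.0),
    ("COGS",  "2023M1", -40.0),
]

by_period = defaultdict(list)
for account, time_key, value in rows:
    by_period[time_key].append((account, value))
```

If every row instead resolved to the single "Current" period, the grouping above would collapse to one bucket, which is why the Time setting matters for an all-in-one-go load.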
Data Imports

I have set up a new data source and associated transformation rules (and allocated them to the appropriate workflow). Data importing works correctly for the 2022 time period; however, I am getting the following error in any other time period. I have the Scenario as 'Current Datakey' and Time as a matrix datakey in my columns. Summary: No valid DataKeys (Scenario / Time) found in data source. Review the source data load processing log and check one-to-one transformation rules to ensure that you have created proper Scenario and Time dimension rules for this data source. Any ideas? I am assuming this might be one simple setting change somewhere.
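The error means at least one row fails to resolve to a valid Scenario/Time pair — consistent with 2022 periods being mapped by the transformation rules while other periods are not. A hypothetical sketch (plain Python, not the OneStream engine) of the kind of check that produces this message:

```python
# Illustrative only: flag rows whose Scenario/Time keys cannot be resolved.
# "Current" stands in for the workflow's scenario; valid_times stands in
# for the periods covered by the one-to-one Time dimension rules.

def find_invalid_datakeys(rows, valid_times, current_scenario):
    """Return rows that would fail the Scenario/Time DataKey resolution."""
    bad = []
    for row in rows:
        scenario = current_scenario if row["scenario"] == "Current" else row["scenario"]
        if scenario is None or row["time"] not in valid_times:
            bad.append(row)
    return bad

rows = [
    {"scenario": "Current", "time": "2022M1", "value": 10.0},
    {"scenario": "Current", "time": "2023M1", "value": 12.0},  # period not mapped
]
invalid = find_invalid_datakeys(rows, valid_times={"2022M1"},
                                current_scenario="Actual")
```

Under this reading, the fix is to extend the Time rules so every period appearing in the matrix columns resolves, not to change the data itself.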
Import files from FTP Connection

Hello, I'm new to OneStream and I have a requirement from a client: they want to automatically import files that will be the data source of a Workflow Profile = import.

1. Those files are on an FTP server (FTP connection), which means I need to add a connector of type = gateway on the current application server, right? Also, what do I need to define in the connection string?
2. Then I need to create a BR for the connector.
3. Since the files are delimited files, in the Data Sources menu I need to create a new data source (delimited file) and map the columns to the corresponding dimensions that exist in OneStream, right? Also, in the connector settings option, I need to assign the delimited file to the BR connector that I created previously, right?
4. On the Workflow Profile, I will assign the new delimited file.

Are these the correct steps? Thank you all. Ricardo
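For step 1, the file retrieval itself is ordinary FTP work; inside OneStream it would live in the connector business rule, but the mechanics can be sketched with Python's standard `ftplib`. Host, credentials, and directories below are placeholders for whatever the client's connection string provides:

```python
from ftplib import FTP
import os

def select_files(names, suffix=".csv"):
    """Pure helper: pick the delimited files out of a directory listing."""
    return [n for n in names if n.lower().endswith(suffix)]

def download_delimited_files(host, user, password, remote_dir, local_dir,
                             suffix=".csv"):
    """Download all delimited files from an FTP directory.

    Hypothetical helper: host/user/password/remote_dir are placeholders;
    in practice these come from the gateway connection settings.
    """
    os.makedirs(local_dir, exist_ok=True)
    downloaded = []
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in select_files(ftp.nlst(), suffix):
            local_path = os.path.join(local_dir, name)
            with open(local_path, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
            downloaded.append(local_path)
    return downloaded
```

The connection string for an FTP gateway would need to carry at least the pieces this sketch takes as parameters: host, credentials, and the remote directory to poll.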
Execute Import > Validate > Process using a button on a dashboard

Hi, I have a dashboard that executes an Import > Validate > Process using a business rule. In my workflow I am currently using Workspace, Import, Validate, Process, Confirm. I would like to use just Workspace, Confirm, but that gives me the following error: "Cannot execute step because the specified step classification doesn't exist for the workflow profile." Is there any way around this? Thanks, Mark
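The error suggests the business rule is asking the profile to run a step (Import, Validate, Process) that no longer exists once the profile is reduced to Workspace + Confirm. A hypothetical sketch of that precondition — not OneStream code, just the failing logic in miniature:

```python
# Illustrative model: a workflow profile exposes a set of step
# classifications, and executing a step it doesn't define fails with
# the quoted error.

class StepNotFoundError(Exception):
    pass

def execute_step(profile_steps, step_name):
    """Run a named workflow step only if the profile still defines it."""
    if step_name not in profile_steps:
        raise StepNotFoundError(
            "Cannot execute step because the specified step classification "
            "doesn't exist for the workflow profile.")
    return f"executed {step_name}"

full_profile = ["Workspace", "Import", "Validate", "Process", "Confirm"]
reduced_profile = ["Workspace", "Confirm"]
```

Under this model, the BR behind the button and the profile's step list have to agree: either the profile keeps the steps the rule invokes, or the rule is changed to invoke only the steps the reduced profile defines.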
Excel Template upload - Data Source setup

Hi All, I'm trying to set up some Excel templates for uploading Budget and Forecast data. I'm pretty happy with the Excel template side, with the named range (XFD), specific header formats, etc. However, I'm a bit lost on the Data Source setup. I get that you have to set Allow Dynamic Excel Loads to True, but what about the rest of the setup? Do I choose a Delimited or Fixed file? It feels like this Data Source section is really for flat files, as it always wants to know the column number. I've tried importing the Excel into the Data Source the same way I would a CSV file, but it just shows up as XML gibberish in the top box. It definitely feels like I'm missing something.
"Cannot execute step because the prior workflow step is not completed"

We have Workflow A and Workflow B. We get an error at the loadCube step in Workflow B when Workflow A is in the Validation step. The error we get is "Cannot execute step because the prior workflow step is not completed. . (WP#WorkflowA.Import:S#Actual:T#2022M1)". The name of the actual workflow has been changed to protect the innocent. Our question is: what is causing this issue and how do we fix it? It's not clear in the documentation what exactly triggers this error message. The parameters passed back imply that one cannot load to the same scenario and time if another workflow for that scenario and time is still open. Workflow A and Workflow B are loading to the same scenario and time, but to different accounts.
Clear data from Workflow import

Hey everyone - how do I clear data from these workflow manual imports automatically? Right now I've been going through each month one by one. I'm wondering if there is a BR that can run them all together and clear the data from those specific imports and the time dimension. Here are the steps I want to follow: navigate to the import that needs to be cleared > Clear > Re-Transform > Validate > Load Cube.
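The month-by-month routine is just the same step sequence applied per period, so a business rule would be a loop around the platform's workflow calls. This sketch shows only the orchestration shape — the step names mirror the manual clicks and the inner call is a placeholder, not a real OneStream API:

```python
# Hypothetical orchestration sketch: apply the same workflow actions to
# every period. In a real business rule each (period, step) pair would
# invoke the platform API for that action instead of being logged.

def clear_and_reload(periods,
                     steps=("Clear", "Re-Transform", "Validate", "Load Cube")):
    """Run the clear/reload sequence for each period in order."""
    log = []
    for period in periods:
        for step in steps:
            log.append((period, step))  # placeholder for the real API call
    return log

actions = clear_and_reload(["2023M1", "2023M2", "2023M3"])
```

The ordering matters: all steps for one period complete before the next period starts, matching the manual "navigate to the import, then Clear > Re-Transform > Validate > Load Cube" flow.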
Entry with the same key already exists

Hello, I am receiving the error "Entry with the same key already exists" when trying to load an XML file through a workflow. I have checked all of the dimensions, transformation rules, data sources, etc., and am not finding any duplication. Looking online, I see recommendations on other platforms where the issue relates to a ListDictionary and an Add within the code that tries to add something that may already exist. I am curious if anyone has run across this issue and how you resolved it.

Summary: An entry with the same key already exists.
----------------------------------------
Description: An entry with the same key already exists.
Error Time: 3/3/2023 10:03:03 PM
Error Level: Error
Tier: AppServer
App Server XF Version: 6.8.1.13230
App Server OS Version: Microsoft Windows NT 10.0.14393.0
Total Memory: 68,719,005,696 (64.00 GB)
Memory In Use: 1,979,629,568 (1.84 GB)
Private Memory In Use: 2,175,569,920 (2.03 GB)
Peak Memory In Use: 2,666,090,496 (2.48 GB)
Maximum Data Records In RAM: 12,884,813
Maximum Data Units In RAM: 100,000
Number Of Threads: 81
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1588, method ParseAndTransform
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 852, method InitializeTransformer
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1105, method InitializeDataCache
----------------------------------------
Exception Type: XFException
Thread Id: 58
Source code: Transformer.vb, line 1031, method InitializeDimensionListCache
Stack Trace:
at OneStream.Stage.Engine.Transformer.InitializeDimensionListCache(SessionInfo si, String cubeName, Int32 scenarioTypeID, TransformDataCache dataCache) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1031
at OneStream.Stage.Engine.Transformer.InitializeDataCache(SessionInfo si, WorkflowUnitClusterPk wfClusterPk, Int32 DataPageSize, Int32 PagesInMemoryLimit) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1105
at OneStream.Stage.Engine.Transformer.InitializeTransformer(SessionInfo si, WorkflowUnitPk wfUnitPk, String sourceFilePath, Boolean retransformingOnly, TaskActivityStepWrapperItem parentTaskActivityStep) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 852
at OneStream.Stage.Engine.Transformer.ParseAndTransform(SessionInfo si, WorkflowUnitPk wfUnitPk, String sourceFilePath, TransformLoadMethodTypes loadMethod, Boolean deleteFilesFromOSAfterArchiving, Guid taskActivityID) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1588
at OneStream.Stage.Engine.ParseAndTransformThread.WorkerThreadMethod() in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\ParseAndTransformThread.vb:line 85
----------------------------------------
Exception Type: Unknown
Message: An entry with the same key already exists.
Stack Trace:
at System.ThrowHelper.ThrowArgumentException(ExceptionResource resource)
at System.Collections.Generic.TreeSet`1.AddIfNotPresent(T item)
at System.Collections.Generic.SortedDictionary`2.Add(TKey key, TValue value)
at OneStream.Stage.Engine.Transformer.InitializeDimensionListCache(SessionInfo si, String cubeName, Int32 scenarioTypeID, TransformDataCache dataCache) in C:\agent\_work\298\s\Source\Stage\StageEngine\Transformer\TransformerEngine\Transformer.vb:line 1031
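The inner exception originates in .NET's `SortedDictionary`2.Add`, which throws exactly this message when the key being added is already present — so during `InitializeDimensionListCache`, two dimension-list entries are resolving to the same cache key (a member name appearing twice is one plausible cause). A small Python analogue of the failing pattern, useful for hunting duplicates in an exported member list (the sample member names are illustrative):

```python
# Python analogue of SortedDictionary.Add: refuse duplicate keys, and
# collect the offenders instead of crashing, so you can see which
# entries collide.

def add_strict(cache, key, value):
    """Mimic SortedDictionary.Add: raise if the key already exists."""
    if key in cache:
        raise KeyError(f"An entry with the same key already exists: {key!r}")
    cache[key] = value

members = ["Sales", "COGS", "Sales"]  # e.g. the same member name twice
cache = {}
duplicates = []
for name in members:
    try:
        add_strict(cache, name, object())
    except KeyError:
        duplicates.append(name)  # these are the entries to hunt for
```

Running the same duplicate scan over the actual dimension member names (case-insensitively, since collisions may not be visible as exact duplicates) is one concrete way to find what the engine is tripping over.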
Data Cell is read-only because Parent Workflow Profile has no active Input Profiles

Hi all, I have an Import, Validate, Load workflow that hasn't been used for a while. I'm trying to load data and I'm getting invalid intersections at the validation stage with the following error message: "Data Cell is read-only because Parent Workflow Profile 'Workflow Name' has no active Input Profiles for the Workflow Channels that are assigned to the Data Cell's Account and/or UD member. Entity = Entity, Account=Volume, Origin=Import". Note: I've used placeholder names in place of our actual entity and workflow names. We didn't have this issue before, so I'm not sure what has changed. The parent workflow profile stated in the validation message isn't actually the parent workflow profile, so I'm totally confused. There is another post on this but I didn't understand the resolution. Could somebody please tell me what I would need to do to fix this? Thanks!