Dashboard Integration
Hi, I have designed a dashboard using a SQL adapter and have stitched together multiple OS tables. The current ask is to integrate this dashboard data and send it out to an external DB. I can create a table in the external DB reflecting the same column structure. What is an efficient way to move this data from the OS dashboard to the external DB? FYI, I have SIC in place. Thanks for your time.
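This is not OneStream code, but the core of the ask (mirroring a query's result set into an identically structured external table) can be sketched with Python's DB-API. The query, table name, and column list below are placeholder assumptions; `sqlite3` stands in for whatever driver (e.g. pyodbc) the external DB actually needs:

```python
# Minimal sketch: copy a dashboard query's rows into a mirrored external table.
# sqlite3 is a stand-in for the real source/target drivers (e.g. pyodbc).
import sqlite3

def copy_rows(src_conn, dst_conn, src_query, dst_table, columns):
    """Read every row produced by src_query and bulk-insert into dst_table,
    which is assumed to have the same column structure."""
    rows = src_conn.execute(src_query).fetchall()
    placeholders = ", ".join("?" for _ in columns)
    col_list = ", ".join(columns)
    dst_conn.executemany(
        f"INSERT INTO {dst_table} ({col_list}) VALUES ({placeholders})", rows
    )
    dst_conn.commit()
    return len(rows)
```

In a real setup this logic would live wherever the integration runs (a business rule, a scheduled job, or a gateway process); the sketch only shows the read-then-bulk-insert shape.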
Workflow error
I have created a workflow profile, and the data source for the workflow is a CSV file with several columns. The second column contains account numbers. I set 'Text Fill Settings -> Lead Fill Value' to 'A' in the data source, as I need to prepend an 'A' to each account number. I have also defined a transformation rule. However, when I execute the workflow, I get the following error message: 'Validation Messages: Invalid member name. Account: 652901.' Could you please suggest what I might be missing?
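One thing worth checking: the error reports the raw value '652901', not the filled 'A652901', so the transformation rule and the fill may be operating on different values. This tiny Python stand-in illustrates the prepend behavior described above (it is not OneStream's actual Lead Fill implementation):

```python
def lead_fill(raw: str, fill: str = "A") -> str:
    """Prepend the fill value to the raw source field, as described above."""
    return f"{fill}{raw}"
```

Comparing `lead_fill("652901")` with the value your transformation rule actually matches is a quick sanity check: if the rule maps the raw number but validation sees the filled one (or vice versa), you get exactly an invalid-member error.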
Automating BAI uploads in Workflow
Hello OneStream Community, I am currently setting up an automated process to handle BAI file uploads directly into each specific bank workflow in our OneStream application.
1. BAI Parser & Data Source Configuration - We've installed the BAI Parser dashboard to view and manage all imported BAI files. The BAI data source has been configured, and initial tests with sample files have been successful.
2. File Transfer & Business Rules - The plan is to automate the transfer of finalized BAI files from our system via FTP to the OneStream file system. This requires a BR to connect with the FTP site, and I am currently developing a BR to facilitate this connection. Once the files are in OS, they'll be imported into the appropriate bank workflow.
3. SFTP Wrapper and Workflow Integration - After the initial FTP transfer, a separate BR will be implemented to securely retrieve and upload the BAI file into the bank workflows.
Question: For those who have set up similar BAI automation processes, are there specific best practices or additional BR configurations that you would recommend? Any insight would be greatly appreciated.
Data sources zero suppression with complex expression
I have a CSV file with two columns, one for Debits and one for Credits, in columns E and F (5 and 6). I'm using a script to take the Debit column: if there is a number in column E, use it; if it is zero, take the number from column F instead. For the script to work properly, zero suppression in the Data Sources settings must be False, or it won't pick up the Credit column. Note that in the script, integer = 5 refers to column 6. Is there a way to suppress zeros after the script runs, or to write zero suppression into the script? Thanks, Scott
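The debit/credit choice with suppression applied afterwards can be illustrated in plain Python (this is not a OneStream data source script; it just shows the ordering, with zero-based indices 4 and 5 standing for columns E and F as in the note above):

```python
def pick_amount(debit: float, credit: float) -> float:
    """Take the Debit value if it is non-zero, otherwise fall back to Credit."""
    return debit if debit != 0 else credit

def load_rows(csv_rows):
    """Apply the debit/credit choice first, then drop rows whose chosen amount
    is zero, i.e. suppression runs after the script logic, not before it."""
    kept = []
    for row in csv_rows:
        amount = pick_amount(float(row[4] or 0), float(row[5] or 0))
        if amount != 0:
            kept.append(row[:4] + [amount])
    return kept
```

The point of the ordering: a row with Debit = 0 and Credit = 5 survives, because suppression only looks at the result of the choice, never at the raw Debit column alone.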
New connection requiring a date and scenario transformation rule
I cloned an existing connection that pulls the Scenario and Date from the workflow. The new connection requires me to set up a new Scenario transformation rule that is basically Budget --> Budget, and a Time rule 2024M1 --> 2024M1. The connection I cloned doesn't require these additional rules and was pulling Actuals. All my Data Source, Scenario, and Workflow settings seem the same. Any suggestions on what I am missing? I don't want to maintain the time rules, which is why I'm trying to see what I'm doing wrong.
Export Formatted Data Explorer to Excel via code
I have a dashboard, and I would like to automate the export of the formatted data explorer to Excel while keeping the formatting, i.e. the same functionality as the Export to Excel button on the data explorer component itself. Am I missing an obvious function, or does anyone have a way to replicate that functionality? Thanks, Scott
Extract incremental data from cube
I am looking to implement an incremental data extraction feature for our cube data. Specifically, I need to extract all data from inception and then continue extracting on a quarterly basis, including any updates to previous quarters. Any ideas or best practices on how to achieve this effectively would be greatly appreciated.
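One generic pattern for this (not a OneStream-specific API) is to keep a snapshot of the last full extract and emit only cells that are new or changed, which automatically captures restatements of prior quarters:

```python
def incremental_extract(previous, current):
    """Given the prior snapshot and the current full extract (both dicts keyed
    by e.g. (period, intersection) tuples), return only new or changed cells."""
    return {key: val for key, val in current.items() if previous.get(key) != val}
```

After each quarterly run, the merged snapshot (previous updated with the delta) becomes the baseline for the next run; the first run, with an empty baseline, naturally yields the full from-inception extract.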