Forum Discussion

fc
Contributor
20 days ago
Solved

Delimited file Import and normalization

Hi everyone, 

I need to set up a few transformations to normalize a delimited file that will be uploaded by users, before feeding it to the datasource.
I previously worked with FTP folders, and used a Connector BR to retrieve the data and normalize it before the datasource step.

I have never worked with delimited files loaded directly by clicking Import. It's not clear to me whether I can still use a Connector BR or whether I need something else (a Parser BR?).

Can anyone help me understand what the best practice is in this case?

Thank you for the support!

3 Replies

  • MarcusH
    Valued Contributor

    You can use a matrix format data source and add multiple Account columns (search for Matrix Data Load for more info). The Golfstream app has a few examples (HoustonBudget). Alternatively you can use the TransformationEventHandler Business Rule to intercept the selected file and restructure it. I would go with the matrix format.

  • JJones
    OneStream Employee

    You would first need to create a Delimited Data Source that defines which column in your source CSV file maps to each dimension within your OneStream Cube. If you need to parse the file, you will use Parser rules to parse the data in the file as necessary.

    There are some examples you can find in the Design and Reference Guide available on OneStream Documentation that may help you with this.


  • fc
    Contributor

    Hi JJones,

    I already read the OS documentation but couldn't find clear guidance on this.

    My problem is that in the file the user will load, each account is a separate column header, and the values are stored below each account instead of in a dedicated Account column.
    For example:

    UD1,UD2,UD3,Acc1,Acc2,Acc3,...    (headers)
    UD1,UD2,UD3,Amount1,Amount2,Amount3,...

    I believe I need to rearrange the content this way:

    UD1,UD2,UD3,Acc1,Amount1
    UD1,UD2,UD3,Acc2,Amount2
    ...

    but I cannot figure out how. My understanding is that Parser rules can only modify data that has already hit the datasource. Can I use a Connector to do this even though the file is loaded manually by the user?
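
    Outside of OneStream, the restructuring described above is a standard wide-to-long unpivot. As a minimal sketch of the logic (plain Python, not OneStream BR code; the column names `UD1`/`Acc1` and the `unpivot` helper are hypothetical examples, not anything from the product), the same row-expansion could be done inside a TransformationEventHandler or Connector in VB.NET:

    ```python
    # Hypothetical sketch: turn a "wide" CSV (one column per account) into
    # "long" rows of UD1,UD2,UD3,Account,Amount.
    import csv
    import io

    WIDE_CSV = """UD1,UD2,UD3,Acc1,Acc2,Acc3
    North,Retail,Plan,100,200,300
    South,Online,Plan,10,20,30
    """

    def unpivot(text, id_cols=3):
        rows = list(csv.reader(io.StringIO(text)))
        header, data = rows[0], rows[1:]
        account_cols = header[id_cols:]          # Acc1, Acc2, Acc3, ...
        out = [header[:id_cols] + ["Account", "Amount"]]
        for row in data:
            # Emit one long row per (dimension prefix, account) pair.
            for i, acct in enumerate(account_cols):
                out.append(row[:id_cols] + [acct, row[id_cols + i]])
        return out

    for line in unpivot(WIDE_CSV):
        print(",".join(line))
    # First data row printed: North,Retail,Plan,Acc1,100
    ```

    Each source row with N account columns becomes N output rows, which matches the `UD1,UD2,UD3,Acc1,Amount1` layout sketched above.
    
    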