Culture Setting Impact on Dataload

OS_Love
New Contributor III

I have a couple of Connector Data Sources and multiple flat-file Data Sources in OneStream. The client now wants to enable a European culture setting for some of the users, and I have noticed that the data load behaves strangely: the same data load multiplies the values by 100 for a European user. What are the possible places where this should be fixed? Please note that I have multiple Parser and Conditional business rules running on the Data Sources and Transformation Rules as well.

 


aformenti
Contributor II

Hi @OS_Love,

Look out for instances in those business rules where you are converting decimal values to strings for use in formulas.

A good way to handle it is by using Formula Variables:

api.Data.FormulaVariables.SetDecimalVariable(VariableName,DecimalAmount)

To be used in formulas like:

api.Data.Calculate("A#Acc1= $VariableName * A#Acc2")


Henning
Valued Contributor

Hi, this should shed some light on this general topic:

Building International Applications - October 2022 - OneStream Community (onestreamsoftware.com)

kberry
New Contributor III

If the field type of the value column in the data table passed to the parser is numeric, the parser converts the column's numbers to text using the user's culture. The simplest way to avoid this is to set the field type of the value column to String. The system will then pass the value through as-is, keeping the source formatting.
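
As a self-contained illustration of the hundredfold blow-up (plain .NET, outside of OneStream, and only a sketch of the mechanics): formatting a decimal under a European culture and then parsing the text back under US/invariant rules reinterprets the decimal comma as a group separator.

Imports System.Globalization

Module CultureRoundTripDemo
    Sub Main()
        Dim value As Decimal = 1234.56D

        ' A German-culture user formats the value with a decimal comma.
        Dim text As String = value.ToString(New CultureInfo("de-DE"))   ' "1234,56"

        ' Parsing that text with invariant (US-style) rules treats the comma
        ' as a thousands separator, so the value comes back 100 times too large.
        Dim reparsed As Decimal = Decimal.Parse(text, NumberStyles.Number, CultureInfo.InvariantCulture)

        Console.WriteLine(reparsed)   ' 123456
    End Sub
End Module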

A more roundabout approach is to add an XFBR rule to the Thousand Indicator and Decimal Indicator fields of the Data Source so that they are updated dynamically based on the user's culture. The document Henning referenced contains an example should you need one, but it should not be necessary.
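
If you do go the XFBR route, a rough sketch of such a string function is below. This is not the documented example: the rule name and function names are hypothetical, it assumes the standard Dashboard String Function signature, and it assumes the rule's current thread culture reflects the user's culture setting (see the linked document for the supported approach and the exact XFBR(...) call syntax to place in the Data Source fields).

Namespace OneStream.BusinessRule.DashboardStringFunction.CultureSeparators
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As DashboardStringFunctionArgs) As Object
            Try
                ' Return the separators of the culture this rule is running under.
                Dim numberFormat As System.Globalization.NumberFormatInfo = System.Globalization.CultureInfo.CurrentCulture.NumberFormat
                If args.FunctionName.Equals("GetThousandIndicator", StringComparison.InvariantCultureIgnoreCase) Then
                    Return numberFormat.NumberGroupSeparator
                ElseIf args.FunctionName.Equals("GetDecimalIndicator", StringComparison.InvariantCultureIgnoreCase) Then
                    Return numberFormat.NumberDecimalSeparator
                End If
                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace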

JackLacava
Honored Contributor

If you can't fix it with the suggestions above, I would suggest building a very simple test case (ideally on Golfstream) to reproduce the issue and filing a case with Support.