Forum Discussion

Onestreamer92
New Contributor II
11 days ago

Parser rule to compare attribute data between current month load and last year Dec load

Hi Experts,

I have a parser rule that takes the current row's Attribute 9, compares it with Attribute 9 from last year's December load, and returns the result as Attribute 10, which helps me map the differences in Tiering.

My issue is that the file is huge (around 15k lines) and the comparison is taking ages. How can I make it run a bit faster? Any help would be appreciated.

 

PARSER RULE BELOW FOR YOUR INFORMATION:

Public Class MainClass
    Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As ParserDimension, ByVal args As ParserArgs) As Object
        Try
            'Derive the prior workflow year from the workflow time name (e.g. 2024 -> 2023)
            Dim Time As String = BRApi.Finance.Time.GetNameFromId(si, si.WorkflowClusterPk.TimeKey)
            Dim WfYear As String = Time.Substring(0, 4)
            Dim WfYearPrior As Integer = CInt(WfYear) - 1

            'Get the fields for comparison from the current source row
            Dim AttrColPosition As Integer = 97
            Dim CurrString As String = api.Parser.DelimitedParsedValues(AttrColPosition)
            'BRApi.ErrorLog.LogMessage(si, "Current A9 is " & CurrString)

            Dim EntColPosition As Integer = 2
            Dim EntString As String = api.Parser.DelimitedParsedValues(EntColPosition)
            'BRApi.ErrorLog.LogMessage(si, "Current Entity is " & EntString)

            Dim ISINColPosition As Integer = 23
            Dim ISINString As String = api.Parser.DelimitedParsedValues(ISINColPosition)
            'BRApi.ErrorLog.LogMessage(si, "Current ISIN is " & ISINString)

            'Look up the prior-December A9 value for this Entity/ISIN in the stage data
            Dim sql As New Text.StringBuilder
            Dim dt As DataTable

            sql.AppendLine("Select TOP 1 A9 ")
            sql.AppendLine("From vStageSourceAndTargetDataWithAttributes ")
            sql.AppendLine("Where Si = 'AnalystPackLoad' And TmT = '" & WfYearPrior & "M12' And A18 = '" & ISINString & "' And Et = '" & EntString & "' ")

            Using dbConnApp As DbConnInfo = BRApi.Database.CreateApplicationDbConnInfo(si)
                dt = BRApi.Database.ExecuteSql(dbConnApp, sql.ToString, True)
            End Using

            'Return the prior-December value as Attribute 10; otherwise leave it blank
            If dt IsNot Nothing AndAlso dt.Rows.Count > 0 Then
                Dim Att10 As String = dt.Rows(0)("A9").ToString()
                Return Att10
            End If
            Return Nothing

        Catch ex As Exception
            Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
        End Try
    End Function
End Class

 

 

3 Replies

  • rhankey
    Contributor III

    I can bet performance is poor.  Options that come to mind:

    • If this is a direct-connect data load, then join in the prior-December attribute value from the source so you no longer need a one-row-at-a-time parser rule.
    • Use a Transformation Rule Event Handler that performs this particular transformation for all the imported rows in a single SQL statement.
    • If you know the OS API well enough, you could cache a lookup table in the OS Global cache. The first time the parser rule fires, it would see that the lookup table does not exist and build it for all possible values you could need. For every subsequent row, the parser rule would see that the lookup table does exist in Global cache and get the value from cache. A rough sketch of this follows below.
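
    Something along these lines is the shape of the Global cache option. It is an untested sketch only: it assumes the GetObject/SetObject calls on the globals (BRGlobals) object that is already passed into your parser rule, and it reuses the view, filter, and column names from your posted rule, so adjust those to your setup.

        'Untested sketch: assumes globals.GetObject/SetObject and the view/columns from the posted rule
        Dim cacheKey As String = "PriorDecA9_" & WfYearPrior.ToString
        Dim lookup As Dictionary(Of String, String) = TryCast(globals.GetObject(cacheKey), Dictionary(Of String, String))

        If lookup Is Nothing Then
            'First row of the load: pull every prior-December Entity/ISIN -> A9 value in one query
            lookup = New Dictionary(Of String, String)
            Dim sql As New Text.StringBuilder
            sql.AppendLine("Select Et, A18, A9 ")
            sql.AppendLine("From vStageSourceAndTargetDataWithAttributes ")
            sql.AppendLine("Where Si = 'AnalystPackLoad' And TmT = '" & WfYearPrior & "M12' ")

            Using dbConnApp As DbConnInfo = BRApi.Database.CreateApplicationDbConnInfo(si)
                Dim dt As DataTable = BRApi.Database.ExecuteSql(dbConnApp, sql.ToString, True)
                If dt IsNot Nothing Then
                    For Each row As DataRow In dt.Rows
                        'Key the cache on Entity|ISIN; later duplicates simply overwrite
                        lookup(row("Et").ToString & "|" & row("A18").ToString) = row("A9").ToString
                    Next
                End If
            End Using

            globals.SetObject(cacheKey, lookup)
        End If

        'Every later row is an in-memory dictionary lookup instead of a database round trip
        Dim priorA9 As String = Nothing
        If lookup.TryGetValue(EntString & "|" & ISINString, priorA9) Then
            Return priorA9
        End If
        Return Nothing

    With that, the database is hit once per load to build the Entity|ISIN dictionary, and the remaining ~15k rows only do an in-memory lookup.
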
  • Onestreamer92
    New Contributor II

    Hi rhankey,

    Thank you for your suggestions:

    • If this is a direct-connect data load, then join in the prior-December attribute value from the source so you no longer need a one-row-at-a-time parser rule - not a direct connection. It is loaded as an Excel/CSV file.
    • Use a Transformation Rule Event Handler that performs this particular transformation for all the imported rows in a single SQL statement - does a TR Event Handler have the ability to fetch attribute data for another month (PY Dec in this case)?
    • If you know the OS API well enough, you could cache a lookup table in the OS Global cache, building it the first time the parser rule fires and reading from it for every subsequent row - I am not great with APIs, so I will have to explore this option a bit more. Any reference would be useful. Thank you.
    • rhankey
      Contributor III

      Yes, you can build a Transformation Rule Event handler to read whatever prior Stage data you want, just as you are doing one row at a time in your Parser Rule.  The difference is that you want to issue a single SQL statement to update all the imported rows at one time.  Being a Business Rules file, you can do just about anything you want.
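
      For illustration only, the single-statement update would have roughly this shape. Both <CurrentImportRows> and priorDecPeriod are placeholders rather than real OneStream objects: point the first at wherever your event handler reads and writes the rows being transformed, and build the second the same way you build the prior-December period in your parser rule. The view, filter, and attribute columns come from your posted rule.

        'Sketch of the single-statement idea; <CurrentImportRows> and priorDecPeriod are placeholders
        Dim sql As New Text.StringBuilder
        sql.AppendLine("Update cur ")
        sql.AppendLine("Set cur.A10 = prior.A9 ")                                  'prior-December A9 lands in A10
        sql.AppendLine("From <CurrentImportRows> cur ")                            'placeholder for the rows being transformed
        sql.AppendLine("Inner Join vStageSourceAndTargetDataWithAttributes prior ")
        sql.AppendLine("    On prior.Et = cur.Et And prior.A18 = cur.A18 ")        'match on Entity and ISIN
        sql.AppendLine("Where prior.Si = 'AnalystPackLoad' ")
        sql.AppendLine("    And prior.TmT = '" & priorDecPeriod & "' ")            'placeholder, e.g. prior workflow year & "M12"

      One joined UPDATE like this replaces the ~15k single-row SELECTs your parser rule issues today, which is where the time is going.
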

      I think there are articles in the community showing the basics of how to use Global cache. (Edit) Here is a link that provides a high-level view of Globals:

      What is the globals object good for? | OneStream Community