Import parser on a Batch File
Hello, I've successfully set up an automatic upload using the Harvest Folder. However, I want to use the Skipline functionality within the file to reduce the import size. This works perfectly when I manually import the file in the specific Import step, but when the file is picked up automatically via the Harvest Folder, the Skipline functionality doesn't seem to be applied. Does anyone have an idea for a possible workaround?

Bulk deletion of Workflow Channels
Dear Community, is there any way to delete workflow channels in bulk? We are currently in the process of deleting the unused UD7 members in the application, and the workflow channels are set up based on the UD7 members. Since the UD7 members and channels exist in huge volume, I just wanted to understand whether there is any rule or marketplace solution that can be used to delete the WF channels in bulk. Any solution would help. Thank you so much.

Smart Integration Cloud (SIC) error
Hello, I am trying to query a local app which has 200k rows on SQL Server using SIC (Smart Integration Cloud). However, I get the message: "Received an unexpected EOF or 0 bytes from the transport stream." There is an online troubleshooting note, which I have added below. It suggests this is a memory issue, but there's no mention of how to add memory (on the server? on my local machine? how to do it?). Could you please help me sort this out? Regards

EDIT: I get the same error message when querying a table of only 1 row and 1 column.

Memory Issues: If you receive any of the following errors, increase the memory in your Smart Integration Connector Local Gateway Server. For queries returning over 1 million records, 32 GB or more RAM is recommended.
"Error while copying content to a stream. Received an unexpected EOF or 0 bytes from the transport stream."
"An error occurred while sending the request. The response ended prematurely."

BR works when run manually, but not through Task Scheduler
I have the code below in an Extender BR. It runs fine if I run it manually in the BR or manually through a DM job, but when it runs in the Task Scheduler it fails with the "Object reference not set to an instance of an object." error. Any thoughts?

Dim wfUnitClusterPk As WorkflowUnitClusterPk = BRApi.Workflow.General.GetWorkflowUnitClusterPk(si, wfProfileName, wfScenarioName, wfTimeName)
Dim ImportInfo As New LoadTransformProcessInfo
Try
    ImportInfo = BRApi.Import.Process.ExecuteParseAndTransform(si, wfUnitClusterPk, "", Nothing, TransformLoadMethodTypes.Replace, SourceDataOriginTypes.FromDirectConnection, False)
Catch ex As Exception
    log.AppendLine("Import Done - Status: " & ImportInfo.Status & " - RowCount: " & FormatNumber(ImportInfo.RowCount, 0,,,-1))
    log.AppendLine(ImportInfo.ErrorMessage)
    log.AppendLine("------------------------------")
    log.AppendLine(ex.GetBaseException.Message.ToString.Trim)
End Try

(Solved)
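A common cause of this symptom is that one of the workflow context objects resolves to Nothing in the scheduled session (for example, GetWorkflowUnitClusterPk returning Nothing because the profile/scenario/time names don't resolve outside an interactive workflow POV). A minimal diagnostic sketch, assuming the same variable names as the snippet above and assuming the names are populated before this point:

```vb
' Sketch only: guard checks before calling ExecuteParseAndTransform.
' Under the Task Scheduler there is no interactive workflow POV, so any of
' these name variables can end up empty if they rely on the session context.
If String.IsNullOrEmpty(wfProfileName) OrElse String.IsNullOrEmpty(wfScenarioName) OrElse String.IsNullOrEmpty(wfTimeName) Then
    Throw New XFException(si, "Workflow POV names are not set in the scheduled context.")
End If

Dim wfUnitClusterPk As WorkflowUnitClusterPk = BRApi.Workflow.General.GetWorkflowUnitClusterPk(si, wfProfileName, wfScenarioName, wfTimeName)
If wfUnitClusterPk Is Nothing Then
    ' If the names do not resolve to a workflow unit, passing Nothing on to
    ' ExecuteParseAndTransform produces exactly the NullReference error seen.
    Throw New XFException(si, "Could not resolve workflow unit for: " & wfProfileName)
End If
```

Logging these values at the top of the BR when it runs under the scheduler should show quickly whether the failure is in context resolution or in the import call itself.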
Composite Mapping
Can I use the Attribute dimension to write composite mapping with another dimension? Context: I am reading an extra column from a TB load to drive the mapping. I want to store this additional information in the Attribute dimension (rather than using any free UD) and use it for composite mapping. Is that possible?

Get Transformation Errors from a Batch Harvest job
I am running some data loads using Batch Harvest. I want to capture the Validate Transformation and Validate Intersections errors to include in an email. I am able to get the Validate Intersections errors from the application table. Does anyone know how I would get the Validate Transformation errors? If I run the entire job within a BR then I can get the Transformation Errors, but I would rather run it through Batch Harvest. I appreciate any input. Thank you. Scott
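Since the intersections errors are already being read from an application table, one possible shape is to query the transformation errors the same way from the BR that sends the email. This is only a sketch: the table name below is a placeholder, not a confirmed schema object, and the actual table holding Validate Transformation errors should be verified in the application database first.

```vb
' Sketch only: direct query against the application database.
' "StageTransformationErrors" is a PLACEHOLDER table name -- verify the real
' table/view that stores Validate Transformation errors before using this.
Dim sql As String = "SELECT * FROM StageTransformationErrors WHERE WorkflowProfileName = 'MyWorkflow'"
Using dbConnApp As DbConnInfo = BRApi.Database.CreateApplicationDbConnInfo(si)
    Dim dt As DataTable = BRApi.Database.ExecuteSql(dbConnApp, sql, False)
    ' dt can then be formatted into the e-mail body alongside the intersection errors.
End Using
```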
Accessing multiple import file names via DA
Hi guys, is there any way, via a method query or rule in a Data Adapter, that I can specify parameters to give me all file names related to a workflow? I'd preferably want all file names in a specific workflow for all years, in a data table with columns (Workflow, Year, Filename). Presumably this is doable with a dashboard data set rule, but I've never played with the staging engine before! I'm also guessing it's probably a case of looping over years and appending tables. Thanks, Tom
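The loop-and-append shape described above can be sketched as a Dashboard Data Set rule that builds the (Workflow, Year, Filename) table. This is a structure sketch only: GetFileNamesForWorkflowYear is a hypothetical helper standing in for the actual file lookup (which still needs to be written), and wfName / the year list are assumed inputs, e.g. from dashboard parameters.

```vb
' Sketch of the Dashboard Data Set rule shape only.
' GetFileNamesForWorkflowYear is HYPOTHETICAL -- the real stage/file lookup
' must replace it; wfName and the year list are assumed to come from parameters.
If args.FunctionType = DashboardDataSetFunctionType.GetDataSet Then
    Dim dt As New DataTable("WorkflowFiles")
    dt.Columns.Add("Workflow", GetType(String))
    dt.Columns.Add("Year", GetType(String))
    dt.Columns.Add("Filename", GetType(String))
    For Each yr As String In New String() {"2022", "2023", "2024"}
        For Each fileName As String In GetFileNamesForWorkflowYear(si, wfName, yr) ' hypothetical helper
            dt.Rows.Add(wfName, yr, fileName)
        Next
    Next
    Return dt
End If
```

Returning a single DataTable this way avoids having to union separate per-year tables in the adapter itself.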