Pushing data from SAP into OneStream

Michal_CPM
New Contributor II

Hello,

Does anyone maybe know if it is possible to create an endpoint in OneStream that will consume data pushed out to OneStream from SAP?

In this case, SAP or another external solution would be the initiator of the connection. OneStream can consume the data it receives.

Best regards,
Michał Wysocki-Zareba

Consulting Manager
Phone: +48 512 823 106


tschilling
New Contributor II

Hi Michal - If I understand correctly, you want to "push" data into OS via a stream or a POST method of sorts, from SAP or any other external system?

I am sure you have already explored this, but it's typically far easier for OneStream to "pull" the data in: call out to SAP via Direct Connect or the SOAP API (BAPI layer) from a business rule. There are probably good reasons for not supporting this type of "push" method in the application, given the nature of the data OneStream typically holds and its focus on data quality and workflow when loading data.

That being said, there are probably a few workarounds depending on what exactly you're trying to do. One that might work would be to call the FileExplorer API and "push" a stream of data to a file within the OneStream folder structure. Once the file is there, the normal batch harvest process would fire and load the data. I suspect you would have to convert the encoding of the file contents when streaming, but in theory this could work.
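As a rough illustration of that push, here is a Python sketch. Note the route `/api/FileExplorer/UploadFile`, the payload field names, and the bearer-token scheme are all assumptions for illustration, not the documented FileExplorer API; base64-encoding the body is one way to handle the encoding concern mentioned above.

```python
import base64
import json
import urllib.request

def build_upload_payload(file_name: str, raw_bytes: bytes) -> dict:
    """Wrap file contents in a JSON payload, base64-encoding the body
    so encoded/binary content survives the HTTP round trip intact."""
    return {
        "fileName": file_name,
        "contentBase64": base64.b64encode(raw_bytes).decode("ascii"),
    }

def push_file(base_url: str, token: str, file_name: str, raw_bytes: bytes) -> None:
    # NOTE: hypothetical route and payload shape; check the OneStream REST
    # API documentation for the real FileExplorer endpoint.
    payload = build_upload_payload(file_name, raw_bytes)
    req = urllib.request.Request(
        f"{base_url}/api/FileExplorer/UploadFile",  # assumed route
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

Once the file lands in the expected folder, the normal batch harvest would take it from there.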

Similarly, if your calling application could push a file to an SFTP folder, OneStream could pick the file up from there and load it. You could then have a BR/Data Management job that fires every so often to check for that file and load it.

Alternatively, I suppose you could host your own API somewhere to broker the interaction between SAP and your OneStream instance; that's probably a bit more complex and would require custom development work.

I'll post back here if I think of anything else.  Good luck!  Will be interesting to see what you come up with here.

Krishna
Valued Contributor

@Michal_CPM - This can be done in multiple ways: API, file, or Direct Connect (S/4). You can also use the MarketPlace solution for Direct Connect to SAP. If you are on Version 8, no coding is required.

 

[Screenshot attachment: Krishna_0-1715018778845.png]

 

Thanks
Krishna

tschilling
New Contributor II

Sounds like he's trying to "push", not "pull", data into OS. Something similar to a Bulk Load API like the one Salesforce offers.

Michal_CPM
New Contributor II

Dear @tschilling and @Krishna,

Thank you for your responses.

I will provide a more detailed description of my case.

We have our reasons for not being able to use data retrieval initiated by OneStream. It comes down to the position of our risk management team: a token valid for a year is not acceptable, and the maximum token lifetime should be one hour. We know there are options such as Azure AD authorization, but unfortunately we learned from support that this is only available for on-premises installations. Another reason is that SAP already has a push solution in place, so we would only need to add OneStream as an endpoint on their side.

It also appears that there is no direct out-of-the-box solution that could help us achieve this. The most feasible approach seems to involve an intermediary that would 'receive' the data from SAP and then initiate data retrieval through OneStream.

Currently, we are communicating with SAP through files, but this method is inefficient and heavily dependent on manual input, which is why we want to automate the process and use APIs.

Therefore, I created this post in the hope that someone already has such a solution in production and would be willing to share it.

Best regards,
Michał Wysocki-Zareba

Consulting Manager
Phone: +48 512 823 106

kenostrovsky
New Contributor III

One possible approach worth exploring would be for SAP to push data to a cloud database (for example, one of the Amazon Web Services offerings), and then have a OneStream Task Scheduler job that checks this cloud DB every 60 seconds for "fresh" data.
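The polling side of that could look like the sketch below, in Python with sqlite3 standing in for the cloud database; the `sap_staging` table name and the `processed` flag are assumptions. Each run pulls only rows SAP has pushed since the last cycle and flags them so the next cycle skips them.

```python
import sqlite3

def fetch_fresh_rows(conn: sqlite3.Connection) -> list:
    """Pull rows SAP pushed since the last poll, then mark them processed
    so the next 60-second cycle does not load them again."""
    cur = conn.execute(
        "SELECT id, entity, amount FROM sap_staging WHERE processed = 0"
    )
    rows = cur.fetchall()
    conn.executemany(
        "UPDATE sap_staging SET processed = 1 WHERE id = ?",
        [(r[0],) for r in rows],
    )
    conn.commit()
    return rows
```

The same pattern (select unprocessed, load, flag) carries over to whatever cloud database SAP actually pushes to.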