Recent Content
Data Processing and Performance - A comprehensive guide to tables and design
Overview

To maintain a well-performing application, one must understand how the underlying database works and, more importantly, its limitations. Understanding how a system works allows designers and administrators to create reliable, stable, and optimally performing applications. This white paper is intended to guide the design of optimal data processing strategies for the OneStream platform.

First, this document provides a detailed look at the data structures used by the stage engine as well as those used by the in-memory financial analytic engine, giving a deep understanding of how the OneStream stage engine functions in relation to the in-memory financial analytic engine. The relationship between stage engine data structures and finance engine data structures is discussed in detail. Understanding how data is stored and manipulated by these engines will help consultants build OneStream applications that are optimized for high-volume data processing.

Second, the workflow engine configuration is examined in detail throughout the document, since it acts as the controller/orchestrator of most tasks in the system. The workflow engine is the primary tool used to configure data processing sequences and performance characteristics in a OneStream application. There are many different workflow structures and settings that specifically relate to data processing, and these settings are discussed in relation to the processing engine they impact.

Finally, this document defines best practices and logical data processing limits. This includes suggestions on how to create workflow structures and settings for specific data processing workloads. With respect to defining data processing limits, this document helps establish practical/logical data processing limits in relation to hard/physical data processing limits and provides a detailed explanation of the suggested logical limits. This is an important topic because, in many situations, the physical data processing limit will accept the amount of data being processed, yet the same data can often be processed far more efficiently by adhering to logical limits and building the appropriate workflow structures to partition data. These concepts are particularly important because, when properly implemented, they enable efficient storage, potential parallel processing, and high-performance reporting and consumption.

Conclusion

Large Data Units can create problems for loading, calculating, consolidating, and reporting data. This is ultimately a limitation of what the hardware and network can support, and your design needs to account for it. This paper provides some options to relieve the pressure points that can appear.

NOTE: Some tables mentioned in the paper have changed in version 9+. See this note for further details.
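To make the "logical versus physical limit" point concrete, here is a purely illustrative back-of-the-envelope calculation. It assumes, in line with the data unit discussion above and in the excerpt below, that a data unit holds every record for one entity/scenario/time slice and that its record count is bounded by the product of the populated members in the dimensions that vary inside it; all member counts are invented for illustration only.

# Illustrative only: rough upper bound on records in a single data unit.
# All member counts are made up; replace them with your own application's numbers.
$accounts  = 500    # accounts that carry data
$flows     = 12     # flow/movement members
$icMembers = 40     # intercompany partners in use
$udCombos  = 25     # populated UD combinations

$upperBound = $accounts * $flows * $icMembers * $udCombos
"Potential records in one data unit: {0:N0}" -f $upperBound   # 6,000,000

# Splitting the load across more workflow partitions (for example by entity group)
# keeps each data unit smaller and lets the partitions process in parallel.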
How to set up Parcel Services - Batch reporting

3 MIN READ
Do you have many users to whom you want to send information? And is this information based on data that is available in OneStream, for instance a periodic internal financial statement? Use Parcel Services to automatically distribute books and documents from OneStream. Parcel Services enables you to create packages (groups) that can be shipped to email lists or a file share. This can be done via a dashboard button, or fully automated using PowerShell scripting.
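The last sentence mentions full automation with PowerShell. Purely as a sketch of what that automation could look like, the snippet below calls a OneStream Data Management sequence (which would contain the Parcel Services distribution step) over the REST API; the base URL, endpoint path, application name, sequence name, and token handling are all assumptions and should be replaced with whatever your environment and API version actually expose.

# Hypothetical example: trigger a Data Management sequence that runs a
# Parcel Services distribution. URL, names, and auth details are placeholders.
$server  = "https://onestream.mycompany.com/OneStreamApi"   # assumed base URL
$token   = "<bearer token obtained from your identity provider>"
$headers = @{ Authorization = "Bearer $token" }

$body = @{
    ApplicationName = "GolfStream"                # assumed application name
    SequenceName    = "Distribute_Monthly_Books"  # assumed DM sequence name
} | ConvertTo-Json

# Endpoint path is an assumption; check the OneStream REST API documentation.
Invoke-RestMethod -Method Post `
    -Uri "$server/api/DataManagement/ExecuteSequence" `
    -Headers $headers `
    -ContentType "application/json" `
    -Body $body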
How to make coffee with OneStream?

3 MIN READ
Hi there! If you are reading this post, it probably means that you are a coffee addict. Or a OneStream addict. Or both! Before starting the explanation of how to make coffee with OneStream, I would like to say that I had the idea for this while reading the OneStream Foundation Handbook. I recommend this book: it is fun to read and there is lots of good stuff in it! For example, page 21, where Greg Bankston (GregB) said: "I have joked on numerous occasions that OneStream can probably even automate a customer's coffee maker for them if they can find one that accepts the right commands. While it is indeed a joke, it is not too far from the truth either."

To make coffee with OneStream you will need to:
1. Buy a Philips Hue system and connect your coffee machine to it.
2. Connect the API of the Philips Hue to OneStream.
3. Create a PowerShell script.
4. Open some firewall ports.
5. Create a Data Management (DM) job that links the dashboard with the business rule (BR).
6. Create a BR that launches the script.
7. Create a dashboard – just because life is nicer with a dashboard!

For the first two steps, after buying your Philips Hue system you should read this blog: https://developers.meethue.com/develop/get-started-2/#turning-a-light-on-and-off The idea here is to generate a remote username.

Now you should test the API connection using a PowerShell script like the one below. When it is working, save the script on your OneStream server. Note that the power plug connected to the Philips Hue is seen as a light, since it only has an On/Off state.

# Hue Bridge
$hueBridge = "http://192.168.109.10:80/api"
# Username
$username = '4HNMsqH9n5NwMH9n5NFVLY9n5NzZrml-45e'
# Command to turn on
$apicontent = '{"on":true}'
# Command to turn off – activate it in another script
# $apicontent = '{"on":false}'
# Invoke command against the plug (seen by the bridge as light 21)
Invoke-WebRequest -Method put -Uri "$($hueBridge)/$($username)/lights/21/state" -Body $apicontent

As the script is sitting on your server, I recommend running it with PowerShell directly on the server. It is a good way to test and check that all firewall ports are open. Once the PowerShell script is running, it should already turn on your coffee machine from the server. Do not forget to add the script to turn it off too.

Now you need to go to your OneStream application and create an Extender business rule. It should call the PowerShell script on your server. It will look like this:

Namespace OneStream.BusinessRule.Extender.PlugOn
    Public Class MainClass
        Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
            Try
                ' Launch the PowerShell script that switches the Hue plug on
                ' Process.Start("powershell", "-File C:\TurnOnHue")
                Shell("powershell -ExecutionPolicy Bypass ""C:\TurnOnPlug"" ")
                Return Nothing
            Catch ex As Exception
                Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
            End Try
        End Function
    End Class
End Namespace

Now you need to create a Data Management job that will kick off your Extender BR. And last but not least, you end up with your dashboard!
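The post also says not to forget a second script that switches the plug off again. A minimal companion sketch, reusing the bridge address, username, and light ID from the example above (the file name, for instance C:\TurnOffPlug, is only a suggestion), could look like this:

# Companion script: switch the plug (and the coffee machine) off again.
# Bridge address, username, and light ID reused from the example above;
# the script path (e.g. C:\TurnOffPlug) is only a suggestion.
$hueBridge = "http://192.168.109.10:80/api"
$username = '4HNMsqH9n5NwMH9n5NFVLY9n5NzZrml-45e'

# Command to turn off
$apicontent = '{"on":false}'

Invoke-WebRequest -Method put -Uri "$($hueBridge)/$($username)/lights/21/state" -Body $apicontent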
What is a Data Unit?

4 MIN READ
As you start to build and design an application, you may keep hearing about the concept of a data unit. It is a critical concept and fundamental to how OneStream works. The following is an excerpt from the book OneStream Foundation Handbook by The Architect Factory. Not only does it cover the data unit, but also many design aspects and fundamental concepts of OneStream.