Recent Content
How to make coffee with OneStream?
3 MIN READ

Hi there! If you are reading this post, it probably means that you are a coffee addict. Or a OneStream addict. Or both!

Before explaining how to make coffee with OneStream, I would like to say that I had the idea while reading the OneStream Foundation Handbook. I recommend this book: it is fun to read and there is lots of good stuff in it. For example, on page 21, Greg Bankston (GregB) says: "I have joked on numerous occasions that OneStream can probably even automate a customer's coffee maker for them if they can find one that accepts the right commands. While it is indeed a joke, it is not too far from the truth either."

To make coffee with OneStream you will need to:

1. Buy a Philips Hue system and connect your coffee machine to it.
2. Connect the Philips Hue API to OneStream.
3. Create a PowerShell script.
4. Open some firewall ports.
5. Create a Data Management job that links the dashboard with the business rule.
6. Create a business rule that launches the script.
7. Create a dashboard - just because life is nicer with a dashboard!

For the first two steps, after buying your Philips Hue system you should read this blog:
https://developers.meethue.com/develop/get-started-2/#turning-a-light-on-and-off
The idea here is to generate a remote username.

Now you should test the API connection using a PowerShell script like the one below. When it is working, save this script on your OneStream server. Notice that the power plug connected to the Philips Hue is seen as a light, as it only has an On/Off state.

    # Hue Bridge
    $hueBridge = "http://192.168.109.10:80/api"

    # Username
    $username = '4HNMsqH9n5NwMH9n5NFVLY9n5NzZrml-45e'

    # Command to turn on
    $apicontent = '{"on":true}'

    # Command to turn off - activate it in another script
    # $apicontent = '{"on":false}'

    # Send the command to light 21 (the power plug)
    Invoke-WebRequest -Method Put -Uri "$($hueBridge)/$($username)/lights/21/state" -Body $apicontent

As the script sits on your server, I recommend running it with PowerShell directly on the server. It is a good way to test and check that all firewall ports are open. Once the PowerShell script runs, it should already turn on your coffee machine from the server. Do not forget to add a script to turn it off too.

Now you need to go to your OneStream application and create an Extender business rule. It should call the PowerShell script on your server. It will look like this:

    Namespace OneStream.BusinessRule.Extender.PlugOn
        Public Class MainClass
            Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
                Try
                    ' Alternative: Process.Start("powershell", "-File C:\TurnOnHue")
                    Shell("powershell -ExecutionPolicy Bypass ""C:\TurnOnPlug"" ")
                    Return Nothing
                Catch ex As Exception
                    Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
                End Try
            End Function
        End Class
    End Namespace

Now you need to create a Data Management job that will kick off your Extender business rule. And last but not least, you end up with your dashboard!
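As a footnote to the business rule above: if you would rather drive both the on and off scripts from a single rule, here is a minimal sketch. It assumes two script files, C:\TurnOnPlug.ps1 and C:\TurnOffPlug.ps1 (hypothetical names, matching the scripts described earlier), and that the Data Management step passes a parameter named Action through args.NameValuePairs; verify that property against your platform version before relying on it.

    Namespace OneStream.BusinessRule.Extender.PlugControl
        Public Class MainClass
            Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As Object, ByVal args As ExtenderArgs) As Object
                Try
                    ' Default to the "on" script; the DM step can pass Action=Off to reverse it.
                    ' (Script paths and the Action parameter are hypothetical; adapt to your setup.)
                    Dim scriptPath As String = "C:\TurnOnPlug.ps1"
                    If (args.NameValuePairs IsNot Nothing) AndAlso args.NameValuePairs.ContainsKey("Action") Then
                        If String.Equals(args.NameValuePairs("Action"), "Off", StringComparison.OrdinalIgnoreCase) Then
                            scriptPath = "C:\TurnOffPlug.ps1"
                        End If
                    End If
                    ' Launch PowerShell the same way as the original rule, bypassing the execution policy.
                    Shell(String.Format("powershell -ExecutionPolicy Bypass -File ""{0}""", scriptPath))
                    Return Nothing
                Catch ex As Exception
                    Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
                End Try
            End Function
        End Class
    End Namespace

With something like this in place, one Data Management job with two steps (Action=On and Action=Off) can drive the whole coffee cycle from a single dashboard.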
Unleashing the Power of OneStream Event Handlers: Enhancing User Experience and Efficiency
5 MIN READ

Unlocking the Potential of OneStream Event Handlers

Hey there! Thanks for dropping by to read about OneStream Event Handlers. If you're looking to supercharge your application's capabilities, you've come to the right place. Let's dive into the world of Event Handlers and discover how they can take your OneStream experience to the next level.
Goodbye, and thanks for all the fish 🙂

From May 1st, my role in OneStream will change significantly (for the better!); unfortunately, this means I will step down as Community Moderator. Over the last 20 months or so it's been a lot of fun, interacting with all of you on the forums while trying to move OneCommunity forward in a few different ways. So much fun, in fact, that I don't know if I will be able to restrain myself from posting anyway 😅 (as an ordinary onestreamer from now on, of course).

The good news is that the community is large enough, nowadays, that on most days I already have nothing to do. There are plans in place to guarantee continuity for the behind-the-scenes tasks (handling abuse alerts, ensuring posts are filed on the right boards, etc.) while a new moderator (or team of moderators) is appointed. akloepfer will always be around anyway, making things work quietly in the shadows.

I will be at Splash next month, happy to talk about OneCommunity as well as anything else - except hockey, I really don't know anything about that.

All the best,
Jack
Data Processing and Performance - A comprehensive guide of tables and design

Overview

To maintain a well-performing application, one must understand how the underlying database works and, more importantly, its limitations. Understanding how a system works allows designers and administrators to create reliable, stable, and optimally performing applications. This white paper is intended to guide the design of optimal data processing strategies for the OneStream platform.

First, this document will provide a detailed look at the data structures used by the stage engine as well as those used by the in-memory financial analytic engine, providing a deep understanding of how the OneStream stage engine functions in relation to the in-memory financial analytic engine. The relationship between stage engine data structures and finance engine data structures will be discussed in detail. Understanding how data is stored and manipulated by these engines will help consultants build OneStream applications that are optimized for high-volume data processing.

Second, the workflow engine configuration will be examined in detail throughout the document, since it acts as the controller/orchestrator of most tasks in the system. The workflow engine is the primary tool used to configure data processing sequences and performance characteristics in a OneStream application. There are many different workflow structures and settings that specifically relate to data processing, and these settings will be discussed in relation to the processing engine that they impact.

Finally, this document will define best practices and logical data processing limits. This will include suggestions on how to create workflow structures and settings for specific data processing workloads. With respect to defining data processing limits, this document will help establish practical/logical limits in relation to hard/physical limits and will provide a detailed explanation of the suggested logical limits. This is an important topic because in many situations the physical data processing limit will tolerate the amount of data being processed, but the same data may be processed much more efficiently by adhering to logical limits and building the appropriate workflow structures to partition data. These concepts are particularly important because they enable efficient storage, potential parallel processing, and high-performance reporting and consumption when properly implemented.

Conclusion

Large data units (the block of records for a given cube, entity, scenario, and time that the finance engine loads and processes as a whole) can create problems for loading, calculating, consolidating, and reporting data. This really is a limitation of what the hardware and networks can support, and your design needs to account for it. From this paper, I hope you can take away some options to relieve some of the pressure points that could appear.
What is the difference between ONECommunity and OneStream Champions?

As many of our members are also members of OneStream Champions, we thought it would be good to explain the difference between the two areas.

OneStream Champions is a place for the community to engage in promoting OneStream while creating stronger relationships with our global network of customers and partners. Earn rewards along the way as you complete tasks such as sharing a LinkedIn post or answering a survey.

ONECommunity is the place the community goes to ask, answer, and search for technical questions and answers. ONECommunity is monitored to answer those questions, while the Champions area is more social in nature. We want content and questions about the product to be contained in ONECommunity so there is one place to search for those answers.

The OneStream Champions mission is to elevate the voice of the OneStream community and promote advocacy of OneStream in the market through fun and rewarding engagement activities. OneStream Champions is a place for customers and partners to connect, network, and share the benefits of the OneStream platform.

The ONECommunity mission is to provide one central source for all customers, employees, and partners to communicate, build relationships, and find answers to their questions, along with documentation on all product features and training materials.
How to train end users in OneStream using Train Me?
4 MIN READ

OneStream has many great functionalities for your end users. And as a OneStream customer or partner, there is much information available to teach you how the system operates. But how do you train your end users? Preferably in a way that is user friendly and not too time consuming, both for you and the end user? The MarketPlace has two solutions which, combined, will provide your end users with a range of training videos. How can you use these apps to train your end users?
Cube Dimension Assignment

Summary

The following details offer a quick snapshot of this article's core content and primary focus to ensure that it is most relevant to your needs.

What: Cube dimension assignment
When: Early build
Why: Enable future flexibility
How: Assign dimensions to specific Scenario Types and change "(Use Default)" to the Root dimension

Overview

To enable future flexibility, it is foundationally critical to properly configure the cube dimension assignments before loading data. Once data has been loaded to a cube, the assignments for the (Default) Scenario Type are locked in. The Root dimension assignments can be updated on the (Default) Scenario Type in the future, but any Scenario Types that have data and are set to "(Use Default)" cannot be changed. This means that if not configured properly, the entire cube must abide by updates to the (Default) Scenario Type. If the cube dimensions are configured properly, additional dimensions can be added to specific Scenario Types in the future. The example use case illustrated in this guide is adding a customer dimension in Budget to expand the annual planning capabilities. This guide provides example configurations to illustrate the recommended approach and common misconfigurations.

Recommendation

When a cube is created, dimension assignments on specific Scenario Types are set to "(Use Default)" on the Cube Dimensions tab. To properly configure an application for extensibility and enable data model flexibility and expansion in the future, these settings should be updated for the active Scenario Types within each cube.

(Default) Scenario Type: Assign the Data Unit dimensions of entity and scenario. Leave all non-Data Unit dimensions as Root.
For all active Scenario Types: Entity and scenario will remain as "(Use Default)." All non-Data Unit dimensions should be assigned a specific dimension. Select Root for all unused dimensions. "(Use Default)" should not remain for any dimension.
Leave inactive Scenario Types as-is until ready to be activated.

Use Case & Examples

Use Case: A client with a live OneStream application wants to enable a customer dimension in Budget to expand their annual revenue planning capabilities and include their top customers. Data has already been loaded to Actual and a prior year Budget.

Configuration #1: Recommended Configuration

The recommended configuration of cube dimension assignments will enable the application to take full advantage of extensibility. This configuration will allow the addition of new dimensions to specific Scenario Types in the future and eliminate the need to "stub out" unused dimensions for future use. To configure properly, the Data Unit dimensions (entity and scenario) will be assigned to the (Default) Scenario Type, and all remaining active dimensions will be assigned to their respective active Scenario Types. Any inactive dimensions should be set to "Root" instead of "(Use Default)". The recommended initial assignment is as follows:

[Screenshots: recommended dimension assignments for the (Default), Actual, and Budget Scenario Types]

After the Actual and Budget Scenario Types both have data in them, we are still able to change our UD4 dimension assignment in the Budget Scenario Type to include our new summary customer dimension. Be aware that once you hit Save, the new UD4 assignment is locked in: changing from a Root dimension is a one-time change that cannot be reverted once there is data in that cube and Scenario Type combination.
After adding the new dimension to the Budget Scenario Type, you will see the history in UD4#None, and the new dimension members active for input in subsequent budget cycles. Since it was assigned to the specific Scenario Type and not (Default), you will notice that this new UD4 dimension is invalid for the Actual Scenario Type. This configuration will also allow the future addition of UD5 and UD6 dimensions by following these same steps.

Configuration #2: Improper Assignment to the (Default) Scenario Type

A common error is to assign all dimensions to the (Default) Scenario Type and only use the Scenario Type-specific tabs for those that differ. This configuration will work and will also allow you to add additional dimensions in the future, but it is much less flexible. Additional dimensions in this setup must be assigned to the (Default) Scenario Type and will therefore apply to all active scenarios. In the example below, all active dimensions have been assigned to the (Default) Scenario Type and a different Account dimension has been assigned to the Budget Scenario Type to enable the use of extensibility. The remaining non-Data Unit dimensions have been left as (Use Default) for both the Actual and Budget Scenario Types:

[Screenshots: dimension assignments for the (Default), Actual, and Budget Scenario Types]

In this setup, attempting to assign our new customer dimension to UD4 on the Budget Scenario Type will display an error. Due to the use of (Use Default) on the active Scenario Types, they are now locked into whatever the (Default) Scenario Type has set for these dimensions, and they cannot be updated. To add our new customer dimension, one is forced to assign it to the (Default) Scenario Type. When assigning to the (Default) Scenario Type, you will notice that it works for Budget as required (same as the recommended configuration), but it is now active for the Actual Scenario Type as well, which was not the desired result.

With this configuration, existing business rules and member formulas will need to be validated throughout the application to ensure the right intersections are specified. This additional dimension contains valid intersections in all Scenario Types; therefore, rules need to be more explicit in their filtering and writing of data (see the sketch at the end of this article). If written improperly or left too open, this new dimension may cause a performance impact, or zeros and other bad data may be calculated in these new intersections.

Configuration #3: Improper Assignment of (Use Default)

Another erroneous configuration is to assign all dimensions to their respective Scenario Types but leave unused dimensions as (Use Default). This configuration will also work and will allow you to add additional dimensions in the future, but it is not as flexible as the recommended setup. Additional dimensions in this setup must be assigned to the (Default) Scenario Type and will therefore apply to all active scenarios. In the example below, all active dimensions have been assigned to their respective Scenario Types. The remaining inactive dimensions have been left as (Use Default) for both the Actual and Budget Scenario Types:

[Screenshots: dimension assignments for the (Default), Actual, and Budget Scenario Types]

In this setup, attempting to assign our new customer dimension to UD4 on the Budget Scenario Type will result in the same error as configuration #2 above, again forcing the assignment to the (Default) Scenario Type, which will apply to all active Scenario Types with the setting of (Use Default) for UD4.
This assignment will also work for Budget, but as with configuration #2 above, you will notice that it is now active for the Actual Scenario Type as well, which was not the desired result. As with configuration #2, existing business rules and member formulas should be validated throughout the application to ensure the right intersections are specified. Additional NoInput rules may be necessary to limit input to these intersections in Scenario Types where they do not apply (see the sketch below).

Considerations

The recommended configuration for cube dimension assignment eliminates the need to "stub out" unused dimensions for future use. If unused dimensions are assigned to "Root" on their respective Scenario Types, they can be changed in the future. One should not create a placeholder dimension for those that are unused (UD4, UD5, and UD6 in our example above), as this will only limit future flexibility. If additional dimensions are not configured properly, you can only update from Root to a specific dimension once. If you accidentally save an incorrect dimension update, you are locked into that change. Plan ahead and make sure these settings are properly updated before saving the changes. Despite adding flexibility for the future, configuring the cube dimensions in this way still does not allow you to change active dimensions with data.

Conclusion

As you can see from the examples above, non-Data Unit dimensions should be assigned to the cube by Scenario Type, with (Use Default) changed to the Root dimension for those that are inactive at setup. Assigning non-Data Unit dimensions to the (Default) Scenario Type and leaving (Use Default) on the specific Scenario Types will limit the benefits and flexibility provided by extensibility. Improper setup will force the entire cube to conform to future updates to the (Default) Scenario Type. Conversely, assigning non-Data Unit cube dimensions to specific Scenario Types and using the Root dimensions instead of (Use Default) will open additional growth opportunities for the application. This recommended configuration is also more flexible than "stubbing out" dimensions for future use, as you do not have to consider the potential pitfalls related to that design.
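To make the earlier point about explicit filtering concrete, here is a minimal finance rule sketch showing both patterns: a calculation pinned to UD4#None so existing math does not fan out across the new customer members, and a NoInput restriction on their base members. This is a sketch under assumptions, not a definitive implementation: all member names (NetRevenue, Units, Price, TopCustomers) are hypothetical, and the api.NoInput.Add call is assumed to follow the standard Finance rules API; verify both against your application before use.

    Namespace OneStream.BusinessRule.Finance.CustomerPlanning
        Public Class MainClass
            Public Function Main(ByVal si As SessionInfo, ByVal globals As BRGlobals, ByVal api As FinanceRulesApi, ByVal args As FinanceRulesArgs) As Object
                Try
                    If api.FunctionType = FinanceFunctionType.Calculate Then
                        ' Pin UD4 on both sides so the calculation only touches UD4#None
                        ' and does not create zeros across every new customer intersection.
                        ' (Account and UD4 member names are hypothetical.)
                        api.Data.Calculate("A#NetRevenue:UD4#None = A#Units:UD4#None * A#Price:UD4#None")
                    ElseIf api.FunctionType = FinanceFunctionType.NoInput Then
                        ' Attach this rule to scenarios where customer detail should stay
                        ' locked (e.g. Actual) to block input to the new base members.
                        api.NoInput.Add("UD4#TopCustomers.Base")
                    End If
                    Return Nothing
                Catch ex As Exception
                    Throw ErrorHandler.LogWrite(si, New XFException(si, ex))
                End Try
            End Function
        End Class
    End Namespace

The design point is simply that once a dimension becomes valid in every Scenario Type, every rule must name the intersections it means to touch; pinning UD4 explicitly keeps legacy calculations and input behavior unchanged.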