Figure 4 depicts the remainder of the RTBI architecture introduced in figure 2 above. Once data has been loaded into the warehouse in formats designed to be useful for business analysis, the next step is to devise means of making these organizational data available to business users. This process is embodied by the next layer of RTBI, known as the analytical layer. Ness Global Industries (2010) identifies two types of analysis in this layer: predictive analysis and performance monitoring. Either type can be performed by any of the three analytical engines depicted in the analytics phase (please see figure 4 above). Predictive analysis is defined as analysis aimed at determining the likely future state of business performance, while performance monitoring aims to determine everything that matters to the business at present (Ness Global Industries, 2010).
In IDC Vendor Spotlight (2009)'s terms, analysis is the instrument through which data are converted into information. That is, what sits in the data warehouse has unknown value until it passes through the analysis phase, much as gold is processed and tagged with a specific value only after it has passed through a furnace. There are three identified analytical engines in this layer, namely: data mining, rule-based and custom engines. Gravic, Inc. (2010) defines data mining as exploratory data processing that seeks to extract useful data relationships for decision support. Gravic, Inc. (2010) further identifies rule-based analysis as processing that uses currently existing rules to determine what traits can be associated with new transactional data. Looking at these proffered definitions, it becomes evident that data mining can be a source of new, useful rules that could be implemented for future data analysis, while the rule-based engine depends on these rules to determine how to handle newly received data. Lastly, the custom engine allows users to define their own rules to extract data that they deem fit to meet specific business information needs; it acts, in effect, as self-service support. Moreover, this engine could also feed new rules to the rule-based engine for future data processing, in the same manner that data mining could yield useful future rules.
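The interplay between the engines described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's implementation: the rule names, transaction fields and thresholds are assumptions made for the example. The key point it shows is that both users (via the custom engine) and data mining can register rules through the same interface, and the rule-based engine then classifies new transactional data against them.

```python
class RuleEngine:
    """Applies named rules (predicates) to incoming transactional data."""

    def __init__(self):
        self.rules = {}  # rule name -> predicate function

    def add_rule(self, name, predicate):
        # Both the custom engine (user-defined rules) and data mining
        # (discovered rules) can feed rules in through this one method.
        self.rules[name] = predicate

    def classify(self, transaction):
        # Return the traits (matching rule names) for a new transaction.
        return [name for name, pred in self.rules.items() if pred(transaction)]


engine = RuleEngine()
# An existing business rule: flag high-value transactions (threshold assumed).
engine.add_rule("high_value", lambda t: t["amount"] > 10_000)
# A rule "mined" from historical data could be fed back the same way.
engine.add_rule("priority_region", lambda t: t["region"] in {"EU", "NA"})

traits = engine.classify({"amount": 25_000, "region": "EU"})
```

Here the call returns both traits, since the sample transaction satisfies both predicates; in a real deployment the predicates would of course be far richer than simple lambdas.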
Azvine, Cui, Nauck and Majeed (n.d.) stress the need for the analytical layer to be endowed with a learning capability. This could be learning driven by the repetition of incidents. If, for instance, one way of drawing useful data is becoming dominant, the analytical layer should learn to simplify that emerging way of analyzing data. For example, if executives in an organization repeatedly draw sales analysis data for specific regions, products or services, the analytical layer should retain the analytical formulae applied as a learned trait, so that similar data processing is accelerated in the future.
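A toy sketch of this learning-by-repetition idea follows. The threshold, query keys and memoization strategy are assumptions made purely for illustration; the point is that once a query recurs often enough, the layer "learns" it and serves it from an accelerated path instead of recomputing.

```python
from collections import Counter

class LearningAnalyticalLayer:
    """Counts repeated analysis requests and memoizes dominant ones."""

    def __init__(self, compute, threshold=3):
        self.compute = compute        # the (expensive) analysis function
        self.counts = Counter()       # how often each query has recurred
        self.cache = {}               # learned (accelerated) queries
        self.threshold = threshold    # repetitions before a query is "learned"

    def analyse(self, query):
        self.counts[query] += 1
        if query in self.cache:
            return self.cache[query]  # accelerated path for a learned query
        result = self.compute(query)
        if self.counts[query] >= self.threshold:
            self.cache[query] = result  # learn this recurring analysis
        return result


calls = []  # record each time the expensive computation actually runs
layer = LearningAnalyticalLayer(lambda q: calls.append(q) or f"report:{q}")
for _ in range(5):
    layer.analyse(("sales", "EU"))
```

With a threshold of three, the expensive computation runs only for the first three requests; the fourth and fifth are served from the learned cache.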
Some authors, such as Hanrahan, Stolte and Mackinlay (2009), go further and recommend visual analytics as a method for accelerated data interpretation. However, whatever presentation method is preferred for the findings of analysis, the basic design rules given above should hold irrespectively. In precis: the rule-based engine harbours the learning that could see the analytical layer develop into an anchor for organizationally defined business rules. If it (the rule-based engine) is configured to learn from day-to-day happenings, it could translate into BI terms all the rules governing business operation and optimum performance. This would place the organization at a vantage point from which to realize maximum benefit from all its generated data.
The BI layer provides a platform where event analyses are combined to derive business-impactful facts (Lehman, Watson, Wixom and Hoffer, n.d.). The BI layer is also the place to plan information dissemination and security across the various business areas. For instance, after event detection, RTBI information can be pushed only to the users relevant to the decision at hand, rather than to users who see no value in the delivered information. Extending the learning capability of the analytical layer, users can be given the flexibility to create custom dashboards driven by their specialized business needs. Finally, the BPM layer benchmarks business performance against the set strategy via measures that inhere in the organizational strategy (Vesset, 2009; Vesset and McDonough, 2009). It (the BPM layer) also enables decision makers to embark only on data analysis exercises that are meaningful to the business. At this very layer, new remedies can be developed for business processes that appear flawed.
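The targeted dissemination described above can be sketched as a simple subscription model. The event types, user names and payloads here are illustrative assumptions: the sketch only shows the principle that detected events are pushed solely to users subscribed to that event type, so users who see no value in the information never receive it.

```python
from collections import defaultdict

class BIDisseminator:
    """Routes detected business events only to subscribed decision makers."""

    def __init__(self):
        self.subscriptions = defaultdict(set)  # event type -> interested users
        self.inboxes = defaultdict(list)       # user -> delivered events

    def subscribe(self, user, event_type):
        self.subscriptions[event_type].add(user)

    def push(self, event_type, payload):
        # Push only to relevant users, never broadcast to everyone.
        for user in self.subscriptions[event_type]:
            self.inboxes[user].append((event_type, payload))


bi = BIDisseminator()
bi.subscribe("sales_exec", "stockout")
bi.subscribe("ops_manager", "stockout")
bi.subscribe("sales_exec", "big_order")

bi.push("stockout", {"sku": "A42"})
bi.push("big_order", {"amount": 50_000})
```

After these two events, the sales executive has both notifications, the operations manager has only the stockout, and any unsubscribed user has received nothing.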
In sum, mini-batches govern swift data movement between the operational store and the data warehouse. However, RTBI cannot rely solely on intermittent mini-batch processing and still supply live events to business users. This is where the RTRR becomes indispensable: it enables new data to go straight to the analytical phase, where real-time analysis can be performed towards tactical business actions. The federation process presents to the analytical layer mini data structures from the operational store, which are replicas of data structures residing in the data warehouse. Thus the analytical layer does not incur the additional burden of distinguishing between data coming from the RTRR and the data warehouse, as all data appear to come from a single source. Lastly, rules are applied to these live processed data so that meaningful business traits can be drawn and studied for tactical decision making.
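The federation idea in this summary can be sketched as a single query facade. The row shapes and field names are assumptions for the example; what the sketch captures is that the analytical layer issues one query against a merged view of warehouse rows (mini-batch loaded history) and RTRR rows (live replicas), with no way to tell the two sources apart.

```python
class FederatedStore:
    """Single logical source merging warehouse history with live RTRR rows."""

    def __init__(self, warehouse_rows, rtrr_rows):
        self.warehouse = warehouse_rows  # historical, mini-batch loaded
        self.rtrr = rtrr_rows            # live replicas from the op. store

    def query(self, predicate):
        # The caller sees one source; federation hides where each row lives.
        return [row for row in self.warehouse + self.rtrr if predicate(row)]


store = FederatedStore(
    warehouse_rows=[{"id": 1, "region": "EU", "amount": 900}],
    rtrr_rows=[{"id": 2, "region": "EU", "amount": 1200}],  # just arrived
)
eu_sales = store.query(lambda row: row["region"] == "EU")
```

The query returns both rows, even though one arrived via the warehouse mini-batch path and the other via the RTRR, matching the single-source behaviour described above.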
References:
1. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise. Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
2. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from: http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).
3. Hanrahan, P., Stolte, C. and Mackinlay, J. (2009). Selecting a visual analytics application. Available from: http://www.tableausoftware.com/whitepapers/selecting-visual-analytics-application (Accessed 15 July 2010).
4. IDC Vendor Spotlight (2009). Ensuring long-term success of BI/Analytics projects: Train and train again. Available from: http://www.information-management.com/white_papers/SAP2.php (Accessed 15 July 2010).
5. Lehman, A.R., Watson, H.J., Wixom, B.H. and Hoffer, J.A. (n.d.). Continental Airlines fly high with real-time business intelligence. Available from: http://www.fuqua-europe.duke.edu/centers/ccrm/datasets/continental/Continental_Airlines_Case_Study.pdf (Accessed 22 May 2010).
6. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work. Available from: http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).
7. Vesset, D. (2009). Gaining competitive differentiation from business intelligence and analytics solution. Available from: http://www.information-management.com/white_papers/-10017911-1.html (Accessed 16 July 2010).
8. Vesset, D. and McDonough, B. (2009). Improving organizational performance management through pervasive business intelligence. Available from: http://www.information-management.com/white_papers/-10017909-1.html (Accessed 16 July 2010).