Friday, July 30, 2010

Hints

RTBI usually fails because of the common approach of supplying business users with pre-configured reports (Karapala, 2010a). The danger lies in the likely quick-fading need for these reports, and consequently for the entire BI service to the organization. This is not immediately evident during needs analysis, and organizations are likely to invest inordinate amounts only to derive no returns on RTBI. Vesset and McDonough (2009) and Pells (2009) recommend a different approach: RTBI implementers should take time to understand the data needs of users and, more importantly, the data that could help them analyze business issues within their different areas of work. The main objective of RTBI should be to provide these users with data to which flexible analytical and reporting tools can be applied for analysis, not rigid reports, which only prescribe the nature of the decisions they support. With this approach, only the processing power of the hardware on which BI is based gets outdated, and not necessarily the RTBI application.

Karapala (2010b) and Ness Global Industries (2010) identify as core the need to ensure that corporate strategy is used holistically as a guide towards defining the requirements on which RTBI should deliver. In line with what is given (in one of the previous sections) above, RTBI should serve to indicate how the currently adopted strategy is performing for the business, given the available resources. The ultimate measure of the robustness of an RTBI solution should thus be the percentage of corporate strategy that is translated into RTBI KPIs.
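The strategy-to-KPI translation percentage suggested above can be computed mechanically. A minimal sketch in Python follows; the objective names, the KPI-to-objective mapping and the function name are all illustrative assumptions, not taken from the cited sources.

```python
def kpi_coverage(objectives, kpis):
    """Percentage of strategic objectives backed by at least one KPI.

    `objectives` is a list of objective names; `kpis` maps a KPI name to the
    objective it measures. Both structures are illustrative assumptions.
    """
    if not objectives:
        return 0.0
    measured = set(kpis.values())
    covered = {obj for obj in objectives if obj in measured}
    return 100.0 * len(covered) / len(objectives)

objectives = ["grow-market-share", "reduce-churn", "cut-unit-cost", "enter-apac"]
kpis = {
    "monthly-active-accounts": "grow-market-share",
    "churn-rate": "reduce-churn",
    "cost-per-order": "cut-unit-cost",
}
print(kpi_coverage(objectives, kpis))  # 75.0: one objective has no KPI yet
```

A low coverage figure would signal, before any investment in hardware or licenses, that the RTBI requirements are not yet anchored in strategy.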

Langenkamp (2010) postulates that current BI initiatives are not exploited in full; only a subset of their capabilities is used. This is backed by Peters (2010)'s views on cloud computing for BI, and Pennock (2010)'s discussion of how social media could be integrated into BI so that more informed decisions can be taken. Cloud computing could resolve the problem of the expensive vendor licenses that pervasive BI could require (Peters, 2010). Thus organizations that span zones and have the capacity to exploit cloud computing should consider doing so, as this could keep RTBI investments at acceptable levels. Pennock (2010) in addition prods firms to find ways to integrate social media into their RTBI solutions. In his explanation of the impact of social media on BI, he notes that a firm stands a better future if it can identify the likely future needs of customers and clients, based on their general perceptions of issues that are relevant to the business.

References:

1. Karapala, K. (2010a). The road to business intelligence success: The information focus.
http://www.information-management.com/infodirect/2009_165/business_intelligence_bi-10017923-1.html?ET=informationmgmt:e1533:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_052010 (Accessed 19-July-2010).

2. Karapala, K. (2010b). Business intelligence strategies: Keys to success.
http://www.information-management.com/infodirect/2009_164/BI_strategy-10017847-1.html?ET=informationmgmt:e1524:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_051310 (Accessed 19-July-2010).

3. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work.
http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).

4. Pells, D.L. (2009). Global business intelligence for managers of programs, projects and project oriented organizations.
http://www.pmforum.org/library/editorials/2009/PDFs/june/Editorial-Pells-BI-for-PM.pdf (Accessed 09 July 2010).

5. Vesset, D. and McDonough, B. (2009). Improving organizational performance management through pervasive business intelligence. http://www.information-management.com/white_papers/-10017909-1.html (Accessed 16 July 2010).

Friday, July 16, 2010

RTBI Implementation strategy (Continued…)

Environmental setup for RTBI – a technical perspective (Continued ...)

Figure 4. RTBI – Analytical and Strategic Layers

Figure 4 depicts the rest of the RTBI architecture as unveiled in figure 2 above. After data is loaded into the warehouse in formats that are preconceived to be useful for business analysis, the next step is to devise means of making these organizational data available to business users from those various formats. This process is embodied by the next layer of RTBI, known as the analytical layer. Ness Global Industries (2010) identifies two types of analysis in this layer: predictive analysis and performance monitoring. Both can be performed from all three analytical engines depicted in the analytics phase (see figure 4 above). Predictive analysis is analysis geared towards determining the likely future status of business performance, while performance monitoring aims to determine all that matters to the business at present (Ness Global Industries, 2010).

In IDC Vendor Spotlight (2009)'s terms, analysis is the instrument through which data can be converted into information. That is, what sits in the data warehouse has unknown value until it passes through the analysis phase, much as gold is processed and tagged with a specific value only after it has passed through a furnace. There are three identified analytical engines in this layer, namely: data mining, rule-based and custom engines. Gravic, Inc. (2010) defines data mining as open-ended data processing that aims to extract useful data relationships for decision support. Gravic, Inc. (2010) further identifies rule-based analysis as processing that uses currently existing rules to determine what traits can be associated with new transactional data. Looking at these definitions, it becomes evident that data mining can be a source of new useful rules that could be implemented for future data analysis, while the rule-based engine depends on these rules to determine how to handle newly received data. Lastly, the custom engine allows users to define their own custom rules to extract the data they deem fit to meet specific business information needs; it is more like a self-service capability. Moreover, this engine could also feed new rules to the rule-based engine for future data processing, in a similar manner to how data mining could yield future useful rules.
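The interplay between the rule-based engine and new transactional data can be sketched in a few lines. This is a minimal illustration, assuming hypothetical rule and transaction structures; none of the names below come from the cited sources.

```python
def make_rule(name, predicate, trait):
    """A rule pairs a predicate over a transaction with a business trait."""
    return {"name": name, "predicate": predicate, "trait": trait}

def classify(transaction, rules):
    """Return the traits that existing rules associate with a new transaction,
    mirroring how the rule-based engine handles newly received data."""
    return [r["trait"] for r in rules if r["predicate"](transaction)]

# Rules such as these could originate from data mining or the custom engine.
rules = [
    make_rule("large_order", lambda t: t["amount"] > 10_000, "high-value"),
    make_rule("new_region", lambda t: t["region"] not in {"EU", "US"}, "emerging-market"),
]

txn = {"amount": 25_000, "region": "APAC"}
print(classify(txn, rules))  # both rules fire for this transaction
```

Appending a freshly mined rule to the same list is all it takes for the engine to apply it to future transactions, which is exactly the feedback loop described above.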

Azvine, Cui, Nauck and Majeed (n.d.) stress the need for the analytical layer to be endowed with a learning capability. This could be learning driven by the repetition of incidents. If, for instance, a particular way of drawing useful data is becoming dominant, the analytical layer should learn to simplify this emerging way of analyzing data further. For example, if executives in an organization repeatedly draw sales analysis data for specific regions, products or services, the analytical layer should retain the analytical formulae applied, so that similar data processing is accelerated in the future.
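The repetition-driven learning described above can be sketched as a simple frequency count over query patterns: once a pattern recurs often enough, its result path is marked for acceleration (for example, by pre-aggregation). The class name, threshold and pattern representation are all assumptions made for illustration.

```python
from collections import Counter

class LearningAnalyticalLayer:
    """Tracks repeated query patterns; once a pattern becomes dominant,
    it is marked for acceleration in future runs."""

    def __init__(self, threshold=3):
        self.pattern_counts = Counter()
        self.accelerated = set()
        self.threshold = threshold

    def run_query(self, pattern):
        """Record the pattern; report whether it is served via the fast path."""
        self.pattern_counts[pattern] += 1
        if self.pattern_counts[pattern] >= self.threshold:
            self.accelerated.add(pattern)  # learned: precompute this in future
        return pattern in self.accelerated

layer = LearningAnalyticalLayer()
for _ in range(3):
    fast = layer.run_query(("sales", "region=EMEA"))
print(fast)  # True: the repeated pattern is now accelerated
```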

Some authors, such as Hanrahan, Stolte and Mackinlay (2009), go on to recommend visual analytics as a method for accelerated data interpretation. However, no matter the preferred method used to present the findings of analysis, the basic design rules given above should hold irrespective. In precis: the rule-based engine harbours the learning that could see the analytical layer develop into an anchor for organizationally defined business rules. If the rule-based engine is configured to learn from the day-to-day happenings, it could translate into BI terms all the rules governing business operation and optimum performance, meaning the organization could be at a vantage point to realize maximum benefit from all its generated data.

The BI layer provides a platform where event analyses are combined to derive facts with business impact (Lehman, Watson, Wixom and Hoffer, n.d.). The BI layer is also the place to plan information dissemination and security across the various business areas. For instance, RTBI information can be pushed only to the users relevant to decision taking after event detection, rather than to users who see no value in the delivered information. Extending the learning capability of the analytical layer, users can be allowed the flexibility to create custom dashboards driven by their specialized business needs. Finally, the BPM layer benchmarks business performance against the set strategy via measures that inhere in the organizational strategy (Vesset, 2009; Vesset and McDonough, 2009). The BPM layer also enables decision makers to embark only on data analysis exercises that are meaningful to the business. At this very layer, new choices can be developed to fix the issues that seem to be flawed in business processing.
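Targeted dissemination of this kind amounts to routing each detected event only to the subscribers of the affected business area. A minimal sketch, with purely illustrative subscription data:

```python
# Who has registered interest in which business area; data is illustrative.
subscriptions = {
    "inventory": ["warehouse.manager@example.com"],
    "sales": ["sales.director@example.com", "regional.lead@example.com"],
}

def route_event(event_area, subscriptions):
    """Return only the recipients who registered interest in this area;
    users who would see no value in the information receive nothing."""
    return subscriptions.get(event_area, [])

print(route_event("sales", subscriptions))
print(route_event("finance", subscriptions))  # no subscribers: nobody is disturbed
```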

In sum, mini batches govern swift data movement between the operational store and the data warehouse. However, the real-time nature of RTBI cannot rely solely on intermittent mini-batch processing and still supply live events to business users. This is the point where the RTRR becomes indispensable, because it enables new data to go straight to the analytical phase, where real-time analysis can be performed towards tactical business actions. The federation process presents to the analytical layer mini data structures from the operational store, which are replicas of data structures residing in the data warehouse. Thus the analytical layer does not incur the additional burden of having to distinguish between data coming from the RTRR and the data warehouse, as data arrive as if from a single source. Lastly, rules are applied to these live processed data so that meaningful business traits can be drawn and studied for tactical decision making.
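The transparency of federation to the analytical layer can be illustrated as a single scan over both sources. The row structures and the rule applied are assumptions made for the sake of the example:

```python
import itertools

def federated_scan(warehouse_rows, rtrr_rows):
    """Present historical and live rows as one stream: the analytical layer
    never learns which source a row came from."""
    for row in itertools.chain(warehouse_rows, rtrr_rows):
        yield row

warehouse = [{"order": 1, "amount": 900}, {"order": 2, "amount": 1200}]
rtrr_live = [{"order": 3, "amount": 4500}]  # arrived since the last mini batch

# The same rule applies uniformly, regardless of origin.
flagged = [r["order"] for r in federated_scan(warehouse, rtrr_live) if r["amount"] > 1000]
print(flagged)  # [2, 3]
```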

References:
1. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
2. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).
3. Hanrahan, P., Stolte, C. and Mackinlay, J. (2009). Selecting a visual analytics application.
http://www.tableausoftware.com/whitepapers/selecting-visual-analytics-application (Accessed 15 July 2010).
4. IDC Vendor Spotlight (2009). Ensuring long term success of BI/Analytics projects: Train and train again.
http://www.information-management.com/white_papers/SAP2.php (Accessed 15 July 2010).
5. Lehman, A.R., Watson, H.J., Wixom, B.H. and Hoffer, J.A. (n.d.). Continental Airlines fly high with real-time business intelligence. Available from:
http://www.fuqua-europe.duke.edu/centers/ccrm/datasets/continental/Continental_Airlines_Case_Study.pdf (Accessed 22 May 2010).
6. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work.
http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).
7. Vesset, D. (2009). Gaining competitive differentiation from business intelligence and analytics solution.
http://www.information-management.com/white_papers/-10017911-1.html (Accessed 16 July 2010).
8. Vesset, D. and McDonough, B. (2009). Improving organizational performance management through pervasive business intelligence.
http://www.information-management.com/white_papers/-10017909-1.html (Accessed 16 July 2010).

Tuesday, July 13, 2010

RTBI Implementation strategy (Continued…)

Environmental setup for RTBI – a technical perspective

Integration has become more like corporate encroachment, and demands that due diligence be the main ingredient in planning for IT systems (Ness Global Industries, 2010). Gathibandhe, Deogirikar and Gupta (2010) describe two integration types in business IT: application integration and data integration. The former is the establishment of communication channels between applications for smoother business process flow, while the latter deals specifically with combining enterprise data at the back-end level (Gathibandhe et al., 2010; Gravic, Inc., 2010). In enterprise application integration (EAI), applications should establish a common language with which to communicate with one another (Gravic, Inc., 2010). That is, all applications should have adaptable interfaces to communicate intelligibly in fulfilling tactical contracts with the entire communication system.


Integration is essential for RTBI, since tactical resolutions detected in the BPM layer may instigate inquiry into more than one application, even more so if an urgent response is ineluctable towards resolving a calling business event. However, Gravic, Inc. (2010) discourages reliance on EAI for RTBI. Stated as reasons are the inefficacy, invasiveness and unreliability that might result from linking an EAI infrastructure with the RTRR layer of RTBI. That is, inter-application communication might be a source of response delay – due to the inter-application data routing that needs to happen during transaction processing – and is therefore not conducive to an RTBI that should boast real-time organizational snapshots at any given moment. Again, intrusion on live applications that are already engaged in other processing activities might weigh on database performance. Lastly, inter-application connectivity might mean that the collapse of one application in the network breaks the entire communication system. However, all is not lost, as there is still enterprise data integration (EDI).

EDI is based on triggers that constantly load data into auditing structures (Gravic, Inc., 2010). Applications carry on as normal, but the generated data is also loaded (in addition to the normal application data structures) into data structures that serve as integration havens for organizational information systems. These structures must be designed such that essential data is constantly loaded as it is generated from all the online applications within an organization (Gathibandhe et al., 2010). There is no need for applications to talk to each other during transaction processing, as the database enables data sharing from all enterprise applications via back-end channels. What EDI means is basically online ETL: push all data changes into auditing structures, which might be direct replicas of the fed data structures residing in a data warehouse. From these audit-like structures, mini batches can be performed to load the data into the data warehouse (Gathibandhe et al., 2010). Also, the RTRR layer can use these online-resident structures for urgent interrogation because of their minuscule nature (which also implies increased response speed) – a process called federation. Figure 3 depicts this discussion in a diagram for clarity. This is also how an RTBI solution gains both its tactical and strategic capabilities: the gradual feeding of data into a data warehouse enables long-term strategic planning, while the agility towards incorporating new data into the data analysis process enables tactical resolutions. Again, although this approach looks enticing, an elixir for corporate decision making, it has its own drawbacks. Olofson (2009) mentions real-time data synchronization as one of the issues impacting efficient decision making from decision systems. Data synchronization is relevant to EDI because it is at this point that cross-application record similarity matters. An enterprise might identify the same customer, service or product differently in its disparate systems. Thus leaving the burden of matching overlapping information to run-time processing can be an unwieldy and unwise exercise, even futile for that matter.
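The trigger-based change capture at the heart of EDI can be mimicked in a few lines: every write to an application structure is also appended to an audit structure that downstream integration reads. A real system would use database triggers; the class below only simulates the idea, and all names are illustrative.

```python
class AuditedStore:
    """Mimics trigger-based change capture: every write to the application
    table is also appended to an audit structure for downstream integration."""

    def __init__(self):
        self.table = {}        # the application's own data structure
        self.audit_log = []    # the integration haven other systems read

    def write(self, key, record):
        self.table[key] = record
        # the "trigger": the change is captured the moment it happens
        self.audit_log.append({"key": key, "record": record})

store = AuditedStore()
store.write("cust-42", {"name": "Acme", "region": "EU"})
store.write("cust-42", {"name": "Acme", "region": "US"})  # an update

print(len(store.table), len(store.audit_log))  # 1 row, 2 captured changes
```

Note that the audit log keeps both versions of the record even though the application table holds only the latest one; this is what lets mini batches and the RTRR see every change.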


To address this, Olofson (2009) proposes that master data management (MDM) be a priority in planning for enterprise information architecture. He further describes it as the creation of a single view of the lowest granule of enterprise records – see relevant material for MDM discussions, as it is not the intention of this study to elaborate on it. It is thus proposed in this discussion that MDM has to precede EDI for an organization to see good returns on its investment in an RTBI solution.
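The single-view idea behind MDM can be illustrated by resolving the same customer, recorded differently by two systems, to one master identifier. The matching rule below (normalised name plus email) is a deliberately naive assumption; production MDM matching is far more involved.

```python
def master_key(record):
    """Normalise the fields that identify a customer across systems.
    The matching rule (name + email, lower-cased) is purely illustrative."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def build_master_index(records):
    """Assign one master id per distinct customer, however many systems know them."""
    index, next_id = {}, 1
    for rec in records:
        key = master_key(rec)
        if key not in index:
            index[key] = f"MDM-{next_id}"
            next_id += 1
    return index

crm = {"name": "Acme Ltd", "email": "ops@acme.example"}
billing = {"name": " acme ltd ", "email": "OPS@ACME.EXAMPLE"}  # same customer
index = build_master_index([crm, billing])
print(len(set(index.values())))  # 1: both records resolve to one master id
```

With such an index in place before EDI, run-time processing never has to guess whether two audit-log entries describe the same customer.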

Figure 3 also depicts a change log engine, whose purpose is to load into data structures fresh changes from online systems, in a format similar to the one adopted by the supersets residing in the data warehouse. Therefore, the only function of the ETL process is pushing the newest changes into the data warehouse, and not heavy loads of transactional records, as is the case with traditional BI solutions.
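The change log engine's contract with the ETL process (push only what is new past a watermark) can be sketched as follows, assuming a hypothetical monotonically increasing change sequence number:

```python
def incremental_load(change_log, last_loaded_seq):
    """Select only entries newer than the watermark for the warehouse;
    `seq` is an assumed monotonically increasing change number."""
    fresh = [c for c in change_log if c["seq"] > last_loaded_seq]
    new_watermark = max((c["seq"] for c in fresh), default=last_loaded_seq)
    return fresh, new_watermark

change_log = [
    {"seq": 1, "row": "order-100"},
    {"seq": 2, "row": "order-101"},
    {"seq": 3, "row": "order-102"},
]

fresh, watermark = incremental_load(change_log, last_loaded_seq=1)
print([c["row"] for c in fresh], watermark)  # only the two newest changes move
```

Each mini batch advances the watermark, so no transactional record is ever shipped to the warehouse twice.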


Figure 3. RTBI - Operational Layer

References:

1. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work. http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).
2. Olofson, C.W. (2009). Maximizing opportunity and minimizing risk through enterprise data integration: strategies for success in uncertain times.
http://vip.informatica.com/?elqPURLPage=5453 (Accessed 10 July 2010).
3. Gathibandhe, H., Deogirikar, S. and Gupta, A.K. (2010). How smart is Real-Time BI? Available from:
http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 14 June 2010).
4. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).

Friday, July 9, 2010

RTBI Implementation strategy

Martin (2010) reveals that being successful has to do with understanding all the elements of success embodied in an organization. That is, a firm should first understand what will determine its success before embarking on means to make itself successful. In this endeavour, Martin (2010) alludes to the fact that a business has to come up with metrics for performance measurement, which would in turn be used to determine its success levels. Vesset and McDonough (2009) give credence to Martin (2010)'s claims by stating that a strategy should allow a business enterprise to come up with key performance indicators (KPIs), which subsequently guide the information needs towards business success. Fast response to business events is achieved by firms that have managed to knit their corporate strategies perfectly to KPIs (Pang, 2009).

Business intelligence (BI) is the very instrument with which an enterprise is enabled to juxtapose corporate strategy and the actual performance of business processes (Quinn, n.d.). Adding agility to the latter, BI morphs into real-time business intelligence (RTBI). RTBI would then be the instrument which indicates the difference between strategy and actual performance as soon as this discrepancy is detected (Gravic, Inc., 2010) – assuming that the firm in question has managed to entirely convert its strategy into KPIs. If an event is found to be non-conforming to a predefined business norm, then one of two things happens: either the RTBI automatically engages in mending the errant transaction, or an instantaneous incident is logged to a relevant person for appropriate measures to be taken.
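The two outcomes just described (automatic mending where a remedy is known, otherwise an instantaneous incident to a relevant person) can be sketched as a small dispatch routine. The event, norm and remedy structures are illustrative assumptions, not taken from the cited sources:

```python
def handle_event(event, norms, remedies, incident_log):
    """Check an event against its norm: apply a known automatic remedy if one
    exists, otherwise log an incident for a responsible person."""
    check = norms.get(event["kind"])
    if check is None or check(event):
        return "conforming"
    if event["kind"] in remedies:
        remedies[event["kind"]](event)
        return "auto-remedied"
    incident_log.append(event)
    return "escalated"

norms = {"payment": lambda e: e["amount"] <= e["limit"]}
remedies = {}  # no automatic fix is known for payments in this sketch
incidents = []

status = handle_event(
    {"kind": "payment", "amount": 9000, "limit": 5000}, norms, remedies, incidents
)
print(status, len(incidents))  # escalated 1
```

Registering a function under `remedies["payment"]` would switch this event from escalation to automatic mending, mirroring the choice the text describes.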

Drawing from conventional BI designs, which treat the operational and analytical environments as purely separate and independent from each other, instantaneity in decision making cannot be achieved at all times (Gravic, Inc., 2010) – solely because event occurrence and event detection are separated by a significant measure of time. This is the one thing that marries RTBI to success, according to Gathibandhe, Deogirikar and Gupta (2010). Gathibandhe et al. (2010) state that the value of data is directly proportional to how fast a business acts upon receiving it. That is, instant action on any supplied data might spawn high business value, while late action might engender reduced or entirely lost value.

In this section the architecture of RTBI is developed, followed by the description of an information architecture that could be suitable for a robust RTBI solution. Since RTBI is viewed as an improvement on traditional BI (Azvine et al., n.d.; Lehman, Watson, Wixom and Hoffer, n.d.), it makes sense for the starting point to be the architecture of the latter (see figure 1 below).


Figure 1. Conventional BI architecture (adapted from Gravic, Inc. (2010) and Azvine, Cui, Nauck and Majeed (n.d.)).

Figure 1 above depicts a common architecture or model that drives the implementation of most BI systems, according to Gravic, Inc. (2010). Companies usually define several applications which are meant to support the various adopted business functions (such as human resource management, customer relationship management etc.) (Watson, Wixom, Hoffer, Lehman and Reynolds, 2000). According to Gathibandhe et al. (2010), a BI would be, in part, a system defined for collating data from the myriad data sources backing these applications into a common data store, called a data warehouse, via the extract, transform and load (ETL) processes (as depicted in figure 1). From this common data store, various techniques can be applied to manipulate the data into forms deemed intelligible to business users. Business intelligence is thus based on how the presented data is converted into business value. This value relies mainly on how the users interpret and combine these data into intelligence, and moreover on how these interpretations are discretionally applied to business activities.

Pang (2009) advises against a BI that presents users with data and information reports that are hardly linked to business goals, desired performance or governance. In such cases, only discretion determines whether the business is performing in a favourable direction or not. One might ask whether a BI configured in such a way (that is, to process data for no clearly defined goal) really adds value to business performance. This would truly remain a mystery for businesses that view BI as a tool endowed only with the capability to refine, process and report on operational data, without concern for the deeper revelations corollary to such data presentation. Data presentation (the analytical layer shown in figure 1) should be done on purpose; that is, reporting should be directly linked to what a business is trying to achieve, which is surely inscribed in the corporate strategy (Quinn, n.d.), and not to what decision makers guess based on intuition or ill-informed structures.

All in all, the model in figure 1 depicts a BI solution that locks business activity monitoring out of the domain of its existence – the very essence which galvanizes the need for a BI solution within an organization (Azvine et al., n.d.). In an attempt to further one of the previously covered topics (namely, The nexus between RTBI and business objectives), the following section explicates a could-be desirable RTBI model. This is performed by expanding figure 1 and discussing the added layers in a piecemeal fashion.

The RTBI architecture

In line with Raden (2008), who says decisions are made to solve problems, BI provides the platform to make and substantiate decisions so that problems can be better solved. But still, a business needs to know the priority and value of solving these identified problems. The added business performance management (BPM) layer (in figure 2 below) serves exactly that purpose. For real-time intelligence generation to occur, prioritizing according to business needs becomes a necessity. That is, a business enterprise should know the costs and benefits proportional to the time taken to react to a business event. For example, if costs of business operation are escalating within a specific business unit, this event could be unveiled by the analytical layer and confirmed in the BI layer as a business-threatening observation. The BPM layer should expose the cost escalation before the waste runs for too long, which could prompt a timely visitation of business activities. In addition, if costs are escalating because of known and expected causes – which might, for instance, be courtesy calls by call centre agents to promote new products or services to clients – the BPM could enable a decision maker to align this detected event with a predefined performance measure, and curb unnecessary panic to fix unbroken business operations.
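The cost-escalation example can be sketched as a check that flags a variance only when it exceeds a tolerance and its cause is not already expected. The thresholds, cause labels and return values below are illustrative assumptions:

```python
def assess_cost_event(actual, budget, expected_causes, observed_cause, tolerance=0.10):
    """Flag a cost escalation only when it exceeds tolerance AND its cause is
    not already expected (e.g. a planned promotion campaign)."""
    variance = (actual - budget) / budget
    if variance <= tolerance:
        return "within-plan"
    if observed_cause in expected_causes:
        return "expected-escalation"  # no panic: aligned with a known measure
    return "investigate"

expected = {"promotion-campaign"}
print(assess_cost_event(130_000, 100_000, expected, "promotion-campaign"))
print(assess_cost_event(130_000, 100_000, expected, "unknown"))
```

The first call models the courtesy-call scenario from the text: the escalation is real but expected, so nothing is "fixed" that is not broken.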

Following the same way of processing, detected anomalies are sent to the real-time response router (RTRR), which serves to decide the destination intended by the message originator (that is, is it a query for more information? A business process fix? etc.). Note should be taken that for traditional BI, reaction to events is purely manual (Azvine et al., n.d.): analysts detect problems and manually engage in attempts to carve out possible resolutions. RTBI is not meant to replace manual processing – human action remains indispensable to effective decision taking – but to supplement manual decision taking. Only in cases where the remedy for a given event is known can automatic processing occur. What RTBI should aim for, then, is to streamline and refine business rules such that emerging events are spontaneously and automatically connected to business value. Therefore, automation of corrective actions should be an adjunct task, and not the main driver of an RTBI solution.
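The RTRR's destination decision can be sketched as a dispatch on the intent declared by the message originator, with anything lacking a known automatic handling falling through to a human analyst. The message kinds and destination names are assumptions for illustration:

```python
def rtrr_route(message):
    """Dispatch a detected anomaly by the intent its originator declared."""
    destinations = {
        "query": "analytical-layer",        # a request for more information
        "process-fix": "operational-apps",  # a known remedy can be applied
    }
    # anything without a known automatic handling goes to a human analyst
    return destinations.get(message["kind"], "analyst-queue")

print(rtrr_route({"kind": "process-fix", "target": "billing"}))  # operational-apps
print(rtrr_route({"kind": "unclassified"}))                      # analyst-queue
```

The fall-through default reflects the point above: automation handles only the known remedies, while everything else is supplemented by manual decision taking.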


Figure 2. RTBI architecture.

The RTRR deserves more attention, since it is the heart of an RTBI artifact. Pells (2009) states that current BI lacks the much-needed interactive response between what is output from BI and what is antecedent to the output. This suggests that there has to be communication between both ends of a BI architecture. Murray (n.d.) ascertains the need for this communication layer by saying that BI solutions should embrace flexibility. The flexibility referred to here is threefold: one, the BI solution has to flexibly supply the information needs of all information consumers according to their varying levels; two, the BI has to be endowed with a learning capability such that it builds up functionality as it is used to solve problems (Murray, n.d.; Azvine et al., n.d.); lastly, it has to be able to send tactical corrective measures back to the operational IT applications for speedy resolution to occur.

From figure 2, the BPM layer gauges overall business performance by contrasting actual business performance, as drawn from BI, with the expected business performance stipulated by trends embedded in the BPM layer. Also from figure 2 above, the interconnectedness of an enterprise's information systems influences the pace at which instructions flow between the BPM and the preceding layers (Gravic, Inc., 2010; Gathibandhe et al., 2010). Thus, for a firm operating on independent systems, the starting point might be to establish robust and reliable communication among the disparate systems, followed by procuring and configuring capable hardware to support efficient live data movement between systems.

References:

1. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
2. Gathibandhe, H., Deogirikar, S. and Gupta, A.K. (2010). How smart is Real-Time BI? Available from:
http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 14 June 2010).
3. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).
4. Langkamp, J. (2010). Business intelligence and performance management not very mature.
http://www.information-management.com/news/BI_performance_management_maturity-10017885-1.html?ET=informationmgmt:e1526:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_Daily_051710 (Accessed 06 Jul 2010).
5. Martin, W. (2009). Agile corporate management.
http://www.wolfgang-martin-team.net/research-notes.php (Accessed 15 June 2010).
6. Murray, D. (n.d.). 7 principles for implementing high value business intelligence on a budget.
http://www.tableausoftware.com/files/TS_Budget_BI.pdf (Accessed 15 June 2010).
7. Pells, D.L. (2009). Global business intelligence for managers of programs, projects and project oriented organizations.
http://www.pmforum.org/library/editorials/2009/PDFs/june/Editorial-Pells-BI-for-PM.pdf (Accessed 09 July 2010).
8. Watson, J.H., Wixom, B.H., Hoffer, J.A., Lehman, R.A. and Reynolds, A. (2000). Real-time business intelligence: Best practices at Continental Airlines. Information Systems Management, 23(1):7-18.