Tuesday, August 10, 2010

Is RTBI worth it?

After all the praise showered upon RTBI, it is still essential to assess whether this strategic solution is worth the challenges that come with its implementation. What are the main issues that lie en route to a successful RTBI implementation? Trawling through just a few RTBI publications reveals an avalanche of challenges identified by different authors. Although some come with workaround solutions, most are simply declared solid impediments that would continuously resist RTBI services within an organization. The following are a few of the identified burning issues.

Gravic, Inc. (2010) mentions one major problem as inherent in the structural configuration of most business enterprises. A business enterprise normally adopts one or several applications for its business operations. These applications are developed, honed and modified over years. However, the adoption of RTBI might impose a reconfiguration need on all or some of these tried and tested applications within a firm (Karapala, 2010c; 2010b). Gravic, Inc. (2010) associates this need with high costs and possibly immeasurable risks. That is, since RTBI is an endeavour to integrate data generated from all organizational business activities, information technology (IT) applications might have to be overhauled to conform to integration rules. Thus, depending on the uniformity and application development standards adhered to when these applications were built, the firm might incur innumerable losses from breaking core business processes (the lifeblood of the business) when it undertakes an initiative to redefine these IT applications.

Raden (2008) points out that RTBI introduces a new form of competition among firms. He mentions analytics as the emerging form of competition in the marketplace, owing to increasing RTBI adoption. A company adopting RTBI is therefore indirectly committing to continuous enhancement of its RTBI analytical capability so that the RTBI function resists obsolescence. As the organization evolves, RTBI should evolve with it to meet the new needs posed by business evolution. Raden (2008) refers to this RTBI advancement in tandem with new business needs as instrument flexibility. This simply means that RTBI as a technology should be elastic enough to vary with business needs. In practice it could also mean regular procurement of hardware to prevent RTBI processing speed from deteriorating to unacceptable levels, or intense human resource management to prevent possible losses of sought-after RTBI skills.

Most prominent and purportedly robust vendor databases are incapable of multitasking while still providing efficiency (Gathibandhe, Deogirikar and Gupta, 2010). RTBI relies on live data updates, meaning there must be constant interrogation of the very database that is capturing new transactions and modifying existing ones. Gathibandhe et al. (2010) and Gravic, Inc. (2010) postulate that very few databases are capable of doing this, and those that are charge exorbitant licence fees. The onus therefore rests upon the organization intending adoption either to carve out extraordinary means of bypassing this hindrance, or to invest in expensive database solutions.

Most pre-packaged BI solutions are only good at offering delusional claims of excellence (Syncsort, 2010), without any actual delivery of value. Turn-key BI solutions are mainly defined to answer the problem the designer envisaged (Gravic, Inc., 2010), and this might not match the problems of the individual business firms that consume such solutions. Thus there is no shortcut: RTBI implementation means custom development, and one whose value might only become evident long after delivery.

Vesset (2009) and Martin (2009) view RTBI (or BI, to be precise) as the most accurate measuring scale any organization could use to gauge its actual performance and to prognosticate its future complexion. This means that any organization devoid of it is simply attempting to run and manage resources that it cannot measure. However, this golden scale comes at a cost, and so a question can be posed: is the cost worth the benefits to be gained? Again, if the statement laid down by The Economist (2009) is indeed true, that information is the currency and knowledge is the coin of this digital age, then where is the mint? The decision lies with all the people who have found this topic interesting enough to follow from start to this end.


References:

1. Gathibandhe, H., Deogirikar, S. and Gupta, A.K. (2010). How smart is real-time BI? http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 17 July 2010).
2. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from: http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).
3. Karapala, K. (2010b). Business intelligence strategies: Keys to success. http://www.information-management.com/infodirect/2009_164/BI_strategy-10017847-1.html?ET=informationmgmt:e1524:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_051310 (Accessed 19 July 2010).
4. Karapala, K. (2010c). The role of strategy in BI. http://www.information-management.com/infodirect/2009_167/business_intelligence_bi_strategy-10018001-1.html?ET=informationmgmt:e1561:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_060310 (Accessed 19 July 2010).

5. Raden, N. (2008). Fast analytics and business intelligence for everyone: Best practices for deploying collaborative BI. http://www.information-management.com/web_seminars/10000819-1.html (Accessed 14 June 2010).
6. Martin, W. (2009). Agile corporate management. http://www.wolfgang-martin-team.net/research-notes.php (Accessed 15 June 2010).
7. Syncsort (2010). Business drivers and enabling technologies for clickstream data warehouse initiatives. http://www.syncsort.com/6de3ffc5-5f7b-435f-880d-f3c02167cf1f/clk2axd-Business-Technologies-Clickstream.htm (Accessed 10 July 2010).
8. The Economist (2009). Organisational agility: How business can survive and thrive in turbulent times. http://www.emc.com/collateral/leadership/organisational-agility-230309.pdf (Accessed 10 August 2010).
9. Vesset, D. (2009). Gaining competitive differentiation from business intelligence and analytics solution. http://www.information-management.com/white_papers/-10017911-1.html (Accessed 16 July 2010).

Friday, July 30, 2010

Hints

RTBI usually fails because of the common approach of wanting to supply business users with pre-configured reports (Karapala, 2010a). The danger lies in the likely quick-fading need for these reports, and consequently for the entire BI service to the organization. This is not immediately evident during needs analysis, and organizations most likely go on to invest inordinate amounts only to derive no returns on RTBI. Vesset and McDonough (2009) and Pells (2009) therefore recommend a different approach: RTBI implementers should take time to understand the data needs of users, and more importantly the data that could help them analyze business issues within their different areas of work. The main objective of RTBI should be to provide these users with data to which flexible analytical and reporting tools can be applied for analysis, not rigid reports, which are only good at prescribing the nature of the decisions they support. With this approach, only the processing power of the hardware on which BI is based gets outdated, and not necessarily the RTBI application.

Karapala (2010b) and Ness Global Industries (2010) identify as core the need to ensure that the corporate strategy is used holistically as a guide towards defining the requirements on which RTBI should deliver. In line with what is given in one of the previous sections, RTBI should serve to indicate how the currently adopted strategy is performing for the business, given the available resources. Thus the ultimate measure of the robustness of RTBI should be what percentage of the corporate strategy is translated into RTBI KPIs.
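As a rough, hypothetical illustration of this measure (all objective and KPI names below are invented), the following sketch computes the share of strategic objectives covered by at least one RTBI KPI:

# Hypothetical illustration: how much of the corporate strategy is covered by RTBI KPIs.
# Objectives and KPI mappings below are invented for the example.
strategy_objectives = {
    "reduce_operating_costs",
    "improve_customer_retention",
    "shorten_order_fulfilment_time",
    "grow_online_sales",
}

# Each KPI declares which strategic objective it measures.
rtbi_kpis = {
    "cost_per_transaction": "reduce_operating_costs",
    "customer_churn_rate": "improve_customer_retention",
    "average_fulfilment_hours": "shorten_order_fulfilment_time",
}

covered = {objective for objective in rtbi_kpis.values() if objective in strategy_objectives}
coverage = len(covered) / len(strategy_objectives)

print(f"Strategy-to-KPI coverage: {coverage:.0%}")            # 75%
print("Uncovered objectives:", strategy_objectives - covered)  # grow_online_sales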

Langenkamp (2010) postulates that current BI initiatives are not exploited in full; only a subset of their capabilities is. This is backed by Peters (2010)'s views on cloud computing for BI, and Pennock (2010)'s discussion of how social media could be integrated into BI so that more informed decisions can be taken. Cloud computing could resolve the problem of expensive vendor licences that pervasive BI may require (Peters, 2010). Thus organizations that span zones and have the capacity to exploit cloud computing should consider doing so, as this could keep RTBI investments at acceptable levels. Pennock (2010) in addition prods firms to carve out means of integrating social media into their RTBI solutions. In his explanation of the impact of social media on BI, he explains that a firm stands a better chance in the future if it is able to identify the likely future needs of customers or clients, based on their general perceptions about issues that are relevant to the business.

References:

1. Karapala, K. (2010a). The road to business intelligence success: The information focus.
http://www.information-management.com/infodirect/2009_165/business_intelligence_bi-10017923-1.html?ET=informationmgmt:e1533:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_052010 (Accessed 19 July 2010).

2. Karapala, K. (2010b). Business intelligence strategies: Keys to success.
http://www.information-management.com/infodirect/2009_164/BI_strategy-10017847-1.html?ET=informationmgmt:e1524:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_IMD_051310 (Accessed 19 July 2010).

3. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work.
http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).

4. Pells, D.L. (2009). Global business intelligence for managers of programs, projects and project oriented organizations.
http://www.pmforum.org/library/editorials/2009/PDFs/june/Editorial-Pells-BI-for-PM.pdf (Accessed 09 July 2010).

5. Vesset, D. and McDonough, B. (2009). Improving organizational performance management through pervasive business intelligence. http://www.information-management.com/white_papers/-10017909-1.html (Accessed 16 July 2010).

Friday, July 16, 2010

RTBI Implementation strategy (Continued…)

Environmental setup for RTBI – a technical perspective (Continued ...)

Figure 4. RTBI – Analytical and Strategic Layers

Figure 4 depicts the rest of the RTBI architecture as unveiled in figure 2 above. After data is loaded into the warehouse in formats that are preconceived to be useful for business analysis, the next task is to devise means of availing these organizational data to business users from those formats. This process is embodied by the next layer of RTBI, known as the analytical layer. Ness Global Industries (2010) identifies two types of analysis in this layer: predictive analysis and performance monitoring. Both types can be performed from all three analytical engines depicted in the analytics phase (see figure 4 above). Predictive analysis is analysis suited to determining the likely future state of business performance, while performance monitoring aims to determine all that matters to the business at present (Ness Global Industries, 2010).

In IDC Vendor Spotlight (2009)'s terms, analysis is the instrument through which data is converted into information. That is, what is sitting in the data warehouse has unknown value until it is passed through the analysis phase, in much the same way gold is tagged with a specific value only after it has passed through a furnace. There are three identified analytical engines in this layer, namely data mining, rule-based and custom engines. Gravic, Inc. (2010) defines data mining as exploratory data processing that hopes to extract useful data relationships for decision support. Gravic, Inc. (2010) further identifies rule-based analysis as processing that uses currently existing rules to determine what traits can be associated with new transactional data. Looking at these proffered definitions, it becomes evident that data mining can be a source of new useful rules that could be implemented for future data analysis, while the rule-based engine depends on these rules to determine how to handle newly received data. Lastly, the custom engine allows users to define their own custom rules to extract the data they deem fit to meet specific business information needs; it is more like self-service support. Moreover, this engine could also feed new rules to the rule-based engine for future data processing, in the same manner that data mining could yield useful rules for the future.
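A minimal sketch of the rule-based engine idea, assuming rules are simple condition/action pairs applied to each newly received transaction (rule names, fields and thresholds are invented for illustration):

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # predicate over a transaction record
    action: Callable[[dict], None]      # what to do when the predicate holds

def flag(txn):
    print(f"Flagged {txn['id']}: review required")

# Invented rules: large amounts and foreign transactions get flagged.
rules = [
    Rule("large_amount", lambda t: t["amount"] > 10_000, flag),
    Rule("foreign", lambda t: t["country"] != "ZA", flag),
]

def rule_based_engine(transaction: dict) -> None:
    """Apply every currently existing rule to a newly received transaction."""
    for rule in rules:
        if rule.condition(transaction):
            rule.action(transaction)

# A rule discovered by the data mining or custom engine is simply appended to the same set.
rules.append(Rule("night_purchase", lambda t: t.get("hour", 12) < 5, flag))

rule_based_engine({"id": "T1", "amount": 15_000, "country": "ZA", "hour": 3})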

Azvine, Cui, Nauck and Majeed (n.d.) stress the need for the analytical layer to be endowed with a learning capability. This could be learning that results from the repetition of incidents. If, for instance, one way of drawing useful data is becoming dominant, the analytical layer should learn to simplify that emerging way of analyzing data further. For example, if executives in an organization repeatedly draw sales analysis data for specific regions, products or services, the analytical layer should pick up the analytical formulae applied as a trait, so that similar data processing is accelerated in the future.
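One hedged way to picture such learning from repetition (the threshold and analysis keys below are invented) is to count how often a particular analysis is requested and promote frequent ones to a precomputed set:

from collections import Counter

# Hypothetical sketch: the analytical layer counts how often a particular analysis
# (here identified by a (region, measure) pair) is requested, and promotes frequent
# ones to a precomputed set so similar requests are answered faster next time.
request_counts: Counter = Counter()
precomputed: set = set()
PROMOTION_THRESHOLD = 3  # invented threshold

def run_analysis(region: str, measure: str) -> str:
    key = (region, measure)
    request_counts[key] += 1
    if key in precomputed:
        return f"{measure} for {region}: served from precomputed results"
    if request_counts[key] >= PROMOTION_THRESHOLD:
        precomputed.add(key)  # "learn" the recurring analysis
    return f"{measure} for {region}: computed on demand"

for _ in range(4):
    print(run_analysis("Gauteng", "sales"))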

Some authors, such as Hanrahan, Stolte and Mackinlay (2009), go on to recommend visual analytics as a method for accelerated data interpretation. However, whatever method is preferred for presenting the findings of analysis, the basic design rules given above should hold irrespective. In precis: the rule-based engine harbours the learning that could see the analytical layer develop into an anchor for organizationally defined business rules. If it (the rule-based engine) is configured to learn from all the day-to-day happenings, it could translate into BI terms all the rules governing business operation and optimum performance, meaning the organization could be at a vantage point to realize maximum benefit from all the data it generates.

The BI layer provides a platform where event analyses are combined to derive facts with business impact (Lehman, Watson, Wixom and Hoffer, n.d.). The BI layer is also the place to plan information dissemination and security for the various business areas. For instance, after event detection RTBI information can be pushed only to the users relevant to the decision, and not to users who see no value in the delivered information. Extending the learning capability of the analytical layer, users can be allowed the flexibility to create custom dashboards driven by their specialized business needs. Finally, the BPM layer benchmarks business performance against the set strategy via measures that inhere in the organizational strategy (Vesset, 2009; Vesset and McDonough, 2009). It (the BPM layer) also enables decision makers to embark only on data analysis exercises that are meaningful to business. At this very layer, new choices can be developed to fix whatever appears to be flawed in business processing.

In sum, mini-batches govern swift data movement between the operational store and the data warehouse. However, the real-time nature of RTBI cannot rely solely on intermittent mini-batch processing and still be able to supply live events to business users. This is the point where the RTRR becomes indispensable, because it enables new data to go straight to the analytical phase so that real-time analysis can be performed towards tactical business actions. The federation process presents to the analytical layer mini data structures from the operational store, which are replicas of data structures residing in the data warehouse. Thus the analytical layer does not incur the additional burden of having to distinguish between data coming from the RTRR and data from the warehouse, as the data appear to come from a single source. Lastly, rules are applied to these live processed data so that meaningful business traits can be drawn and studied for tactical decision making.
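A toy sketch of the federation idea, assuming the warehouse and the operational replica share the same invented schema, so the analytical layer queries one function and never sees two sources:

# Hypothetical federation sketch: historical rows live in the warehouse, fresh rows
# in a small operational replica; the analytical layer queries one function and never
# distinguishes between the two sources. Schemas and data are invented.
warehouse_sales = [
    {"order_id": 1, "region": "East", "amount": 120.0},
    {"order_id": 2, "region": "West", "amount": 80.0},
]
operational_replica = [
    {"order_id": 3, "region": "East", "amount": 45.0},  # arrived seconds ago
]

def federated_sales(region: str):
    """Present warehouse and live replica rows as if they came from a single source."""
    for row in warehouse_sales + operational_replica:
        if row["region"] == region:
            yield row

total_east = sum(row["amount"] for row in federated_sales("East"))
print(f"East region sales, including live data: {total_east}")  # 165.0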

References:
1. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
2. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010)
3. Hanrahan, P., Stolte, C. and Mackinlay, J. (2009). Selecting a visual analytics application.
http://www.tableausoftware.com/whitepapers/selecting-visual-analytics-application (Accessed 15 July 2010).
4. IDC Vendor Spotlight (2009). Ensuring long term success of BI/Analytics projects: Train and train again.
http://www.information-management.com/white_papers/SAP2.php (Accessed 15 July 2010).
5. Lehman, A.R., Watson, H.J., Wixom, B.H. and Hoffer, J.A. (n.d.). Continental Airlines fly high with real-time business intelligence. Available from:
http://www.fuqua-europe.duke.edu/centers/ccrm/datasets/continental/Continental_Airlines_Case_Study.pdf (Accessed 22 May 2010).
6. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work.
http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).
7. Vesset, D. (2009). Gaining competitive differentiation from business intelligence and analytics solution.
http://www.information-management.com/white_papers/-10017911-1.html (Accessed 16 July 2010).
8. Vesset, D. and McDonough, B. (2009). Improving organizational performance management through pervasive business intelligence.
http://www.information-management.com/white_papers/-10017909-1.html (Accessed 16 July 2010).

Tuesday, July 13, 2010

RTBI Implementation strategy (Continued…)

Environmental setup for RTBI – a technical perspective

Integration has become something of a corporate encroachment and demands due diligence as the main ingredient in planning for IT systems (Ness Global Industries, 2010). Gathibandhe, Deogirikar and Gupta (2010) describe two integration types in business IT: application integration and data integration. The former is the establishment of communication channels between applications for smoother business process flow, and the latter deals specifically with combining enterprise data at the back-end level (Gathibandhe et al., 2010; Gravic, Inc., 2010). In enterprise application integration (EAI), applications should establish a common language to communicate with one another (Gravic, Inc., 2010). Meaning, all applications should have adaptable interfaces so they can communicate intelligibly in fulfilling tactical contracts with the entire communication system.


Integration is essential for RTBI since tactical resolutions detected in the BPM layer may instigate inquiry into more than one application, even more so if an urgent response is ineluctable in resolving a calling business event. However, Gravic, Inc. (2010) discourages reliance on EAI for RTBI. The reasons stated are the inefficacy, invasiveness and unreliability that might result from linking an EAI infrastructure to the RTRR layer of RTBI. That is, inter-application communication might be a source of response delay, due to the inter-application data routing that needs to happen during transaction processing, and is therefore not conducive to RTBI, which should boast real-time organizational snapshots at any given moment. Again, intrusion into live applications that are already engaged in other processing activities might weigh on database performance. Lastly, inter-application connectivity might mean that the collapse of one application in the network breaks the entire communication system. However, all is not lost, as there is still enterprise data integration (EDI).

EDI is based on triggers that constantly load data into auditing structures (Gravic, Inc., 2010). Applications carry on as normal, but the generated data is also loaded (in addition to the normal application data structures) into data structures that serve as integration havens for organizational information systems. These structures must be designed such that essential data is constantly loaded as it is generated from all the online applications within an organization (Gathibandhe et al., 2010). There is no need for applications to talk to each other during transaction processing, as the database enables data sharing from all enterprise applications via back-end channels. What EDI means is basically online ETL: push all data changes into auditing structures, which might be direct replicas of the fed data structures residing in a data warehouse. From these audit-like structures, mini-batches can be performed to load the data into the data warehouse (Gathibandhe et al., 2010). Also, the RTRR layer can use these online-resident structures for urgent interrogation because of their minuscule nature (which also implies increased response speed), a process called federation. Figure 3 depicts this discussion in a diagram for clarity.

This is also how an RTBI solution gains both its tactical and strategic capabilities. The gradual feeding of data into a data warehouse enables long-term strategic planning, while the agility with which new data is incorporated into the data analysis process enables tactical resolutions. Again, although this approach looks enticing, an elixir for corporate decision making, it has its own drawbacks. Olofson (2009) mentions real-time data synchronization as one of the issues impacting efficient decision making in decision systems. Data synchronization is relevant to EDI because it is at this point that cross-application record similarity matters. An enterprise might identify the same customer, service or product differently in its disparate systems. Putting the burden of matching overlapping information onto run-time processing can thus be an unwieldy and unwise exercise, even a futile one.
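As a small illustration of the trigger-based capture described above (using SQLite purely for convenience; table and column names are invented), an application writes to its own table as normal while a trigger copies every change into an audit structure that mini-batch ETL and the RTRR can read:

import sqlite3

# Minimal sketch of trigger-based change capture. The application writes to "orders"
# as usual, and the trigger loads each change into the auditing structure
# "orders_changes" without any inter-application communication.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
CREATE TABLE orders_changes (
    order_id INTEGER, customer TEXT, amount REAL,
    change_type TEXT, changed_at TEXT DEFAULT (datetime('now'))
);
CREATE TRIGGER orders_after_insert AFTER INSERT ON orders
BEGIN
    INSERT INTO orders_changes (order_id, customer, amount, change_type)
    VALUES (NEW.id, NEW.customer, NEW.amount, 'INSERT');
END;
""")

# The operational application carries on as normal...
con.execute("INSERT INTO orders (customer, amount) VALUES ('ACME', 250.0)")

# ...while the audit structure already holds the change for integration.
print(con.execute("SELECT order_id, customer, amount, change_type FROM orders_changes").fetchall())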


To address this, Olofson (2009) proposes that master data management (MDM) be a priority in planning the enterprise information architecture. He describes it as the creation of a single view of the lowest granule of enterprise records; see the relevant material for MDM discussions, as it is not the intention of this study to elaborate on it. It is thus proposed in this discussion that MDM has to precede EDI if an organization is to see good returns on its investment in an RTBI solution.
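A hypothetical sketch of the single-view idea behind MDM, in which two source systems know the same customer by different local keys that a cross-reference resolves to one golden record (all identifiers are invented):

from typing import Optional

# Hypothetical MDM sketch: each source system keeps its own key for the customer,
# while a master cross-reference maps every local key to one golden record, so EDI
# loads and federation work on a single identity.
master_customers = {
    "CUST-001": {"name": "Acme Trading", "country": "ZA"},
}
# Cross-reference from (source system, local key) to the master key.
xref = {
    ("billing", "AC-77"): "CUST-001",
    ("crm", "acme_trading"): "CUST-001",
}

def resolve(source: str, local_key: str) -> Optional[dict]:
    """Return the single master view of a record known by a source-specific key."""
    master_key = xref.get((source, local_key))
    return master_customers.get(master_key) if master_key else None

print(resolve("billing", "AC-77"))      # same golden record...
print(resolve("crm", "acme_trading"))   # ...regardless of the source system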

Figure 3 also depicts a change log engine, whose purpose is to load fresh changes from online systems into data structures in a format similar to the one adopted by the supersets residing in the data warehouse. Therefore, the only function of the ETL process is to push the newest changes into the data warehouse, and not heavy loads of transactional records as is the case with traditional BI solutions.
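The incremental nature of this ETL step could look something like the following watermark-based sketch (structures and timestamps are invented), where only change-log rows newer than the last load move into the warehouse:

from datetime import datetime

# Hypothetical incremental-load sketch: each mini-batch pushes only change-log rows
# newer than the last load watermark, rather than re-loading transaction history.
change_log = [
    {"order_id": 1, "amount": 120.0, "changed_at": datetime(2010, 7, 13, 9, 0)},
    {"order_id": 2, "amount": 80.0,  "changed_at": datetime(2010, 7, 13, 9, 5)},
    {"order_id": 3, "amount": 45.0,  "changed_at": datetime(2010, 7, 13, 9, 7)},
]
warehouse_rows = []
last_loaded = datetime(2010, 7, 13, 9, 2)  # watermark from the previous mini-batch

def mini_batch_load():
    global last_loaded
    fresh = [row for row in change_log if row["changed_at"] > last_loaded]
    warehouse_rows.extend(fresh)  # only new changes move into the warehouse
    if fresh:
        last_loaded = max(row["changed_at"] for row in fresh)
    return len(fresh)

print(f"Loaded {mini_batch_load()} new rows; watermark now {last_loaded}")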


Figure 3. RTBI - Operational Layer

References:

1. Ness Global Industries (2010). From expected to achieved: four steps to making business intelligence work. http://www.cio.com/white-paper/595797/From_Expected_to_Achieved_Four_Steps_to_Making_Business_Intelligence_Work (Accessed 10 July 2010).
2. Olofson, C.W. (2009). Maximizing opportunity and minimizing risk through enterprise data integration: strategies for success in uncertain times.
http://vip.informatica.com/?elqPURLPage=5453 (Accessed 10 July 2010).
3. Gathibandhe, H., Deogirikar, S. and Gupta, A.K. (2010). How smart is Real-Time BI? Available from:
http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 14 June 2010).
4. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010)

Friday, July 9, 2010

RTBI Implementation strategy

Martin (2010) reveals that being successful has to do with understanding all the elements of success embodied in your organization. That is, a firm should first understand what will determine its success before embarking on means to make itself successful. In this endeavour, Martin (2010) alludes to the fact that a business has to come up with metrics for performance measurement which would in turn be used to determine its levels of success. Vesset and McDonough (2009) give credence to Martin (2010)'s claims by stating that a strategy should allow a business enterprise to come up with key performance indicators (KPIs), which subsequently guide the information needs towards business success. Fast response to business events is achieved by firms that have managed to knit their corporate strategies perfectly to KPIs (Pang, 2009).

Business intelligence (BI) is the very instrument with which an enterprise is enabled to juxtapose corporate strategy and the actual performance of business processes (Quinn, n.d.). Adding agility to the latter, BI morphs into real-time business intelligence (RTBI). RTBI would then be the instrument that indicates the difference between strategy and actual performance as soon as this discrepancy is detected (Gravic, Inc., 2010), assuming that the firm in question has managed to convert its strategy entirely into KPIs. If an event is found to be non-conforming to predefined business norms, then one of two things happens: either the RTBI automatically engages in mending the transaction that has gone amiss, or an instantaneous incident is logged to a relevant person for appropriate measures to be taken.

Under conventional BI designs, which treat operational and analytical environments as purely separate and independent from each other, instantaneity in decision making cannot be achieved at all times (Gravic, Inc., 2010), solely because event occurrence and event detection are separated by a significant measure of time. This is the one thing that marries RTBI to success, according to Gathibandhe, Deogirikar and Gupta (2010). Gathibandhe et al. (2010) state that the value of data is directly proportional to how fast a business acts upon receiving it. Meaning, instant action on any supplied data might spawn high business value, while late action might engender reduced or entirely lost value.

In this section the architecture of RTBI is developed, followed by the description of an information architecture that could be suitable for a robust RTBI solution. Since RTBI is viewed as an improvement on traditional BI (Azvine et al., n.d.; Lehman, Watson, Wixom and Hoffer, n.d.), it makes sense for the starting point to be the architecture of the latter (see figure 1 below).


Figure 1. Conventional BI architecture (adapted from Gravic, Inc., (2010) & Azvine, Cui, Nauck and Majeed (n.d.)).

Figure 1 above depicts a common architecture or model that drives the implementation of most BI systems, according to Gravic, Inc. (2010). Companies usually define several applications that are meant to support the various adopted business functions (such as human resource management, customer relationship management, etc.) (Watson, Wixom, Hoffer, Lehman and Reynolds, 2000). According to Gathibandhe et al. (2010), a BI would be, in part, a system defined for collating data from the myriad data sources backing these applications into a common data store, called a data warehouse, via the extract, transform and load (ETL) processes (as depicted in figure 1). From this common data store, various techniques can be applied to manipulate the data into forms that are deemed intelligible to business users. Business intelligence is thus based on how the presented data is converted into business value. This value relies mainly on how the users interpret and combine these data into intelligence and, moreover, on how these interpretations are applied to business activities at their discretion.

Pang (2009) advises against a BI that presents users with data and information reports that are hardly linked to business goals, desired performance or governance. In such cases, only discretion determines whether the business is performing in a favourable direction or not. One might ask whether a BI configured in such a way (that is, to process data for no clearly defined goal) really adds value to business performance. This would truly remain a mystery for businesses that view BI as a tool only endowed with the capability to refine, process and report on operational data, without concern for the deeper revelations that follow from such data presentation. Data presentation (the analytical layer shown in figure 1) should be done on purpose; that is, reporting should be directly linked to what the business is trying to achieve, which is surely inscribed in the corporate strategy (Quinn, n.d.), and not to what decision makers guess based on intuition or ill-informed structures.

All in all, the model in figure 1 depicts a BI solution that locks business activity monitoring out of its domain, the very capability that galvanizes the need for a BI solution within an organization (Azvine et al., n.d.). In an attempt to further one of the previously covered topics (namely, The nexus between RTBI and business objectives), the following section explicates a potentially desirable RTBI model. This is done by expanding figure 1 and discussing the added layers in a piecemeal fashion.

The RTBI architecture

In line with Raden (2008), who says decisions are made to solve problems, BI provides the platform to make and substantiate the decisions taken so that problems can be better solved. But a business still needs to know the priority and value of solving these identified problems. The added business performance management (BPM) layer (in figure 2 below) serves exactly that purpose. For real-time intelligence generation to occur, prioritizing according to business needs becomes a necessity. That is, a business enterprise should know the cost and benefits proportional to the time taken to react to a business event. For example, if the costs of business operation are escalating within a specific business unit, this event could be unveiled by the analytical layer and confirmed in the BI layer as a business-threatening observation. The BPM layer should expose the cost escalation before the waste runs for too long, which could prompt a timely review of business activities. In addition, if costs are escalating because of known and expected causes, for instance courtesy calls by call centre agents to promote new products or services to clients, the BPM layer could enable a decision maker to align the detected event with a predefined performance measure and curb unnecessary panic to try and fix unbroken business operations.
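One possible reading of this BPM-layer behaviour, sketched with invented units, thresholds and cause tags, is a check that raises an alert only when a cost escalation cannot be aligned with an expected cause:

# Hypothetical BPM-layer sketch: a unit's running cost is compared with its budgeted
# threshold, but escalation is suppressed when it matches a predefined, expected cause
# (e.g. a planned courtesy-call campaign). All figures are invented.
cost_thresholds = {"call_centre": 50_000.0}
expected_causes = {"call_centre": {"courtesy_call_campaign"}}

def assess_costs(unit: str, actual_cost: float, tagged_cause: str = "") -> str:
    threshold = cost_thresholds[unit]
    if actual_cost <= threshold:
        return f"{unit}: within budget"
    if tagged_cause in expected_causes.get(unit, set()):
        return f"{unit}: over budget but aligned with expected cause '{tagged_cause}'"
    return f"{unit}: ALERT - unexplained cost escalation ({actual_cost} > {threshold})"

print(assess_costs("call_centre", 48_000.0))
print(assess_costs("call_centre", 62_000.0, "courtesy_call_campaign"))
print(assess_costs("call_centre", 62_000.0))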

Following the same way of processing, detected anomalies are sent to the real time response router (RTRR), which serves to decide the destination intended by the message originator (that is, is it a query for more information, a business process fix, etc.?). Note that for traditional BI, reaction to events is purely manual (Azvine et al., n.d.): analysts detect problems and manually engage in attempts to carve out possible resolutions. RTBI is not meant to replace manual processing, as human action remains indispensable to effective decision taking, but to supplement manual decision taking. Only in cases where the remedy to a given event is known can automatic processing occur. What RTBI should aim for, then, is to streamline and refine business rules such that emerging events are spontaneously and automatically connected to business value. Automation of corrective actions should therefore be an adjunct task, and not the main driver of an RTBI solution.
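A minimal, assumed-shape sketch of the RTRR routing decision (event fields, intents and destinations are invented), defaulting to a human whenever the remedy is not known:

# Hypothetical RTRR sketch: a detected anomaly carries an intent, and the router
# decides whether it becomes an automatic fix, a query for more information, or an
# alert to a responsible person.
def auto_fix(event):     print(f"Auto-correcting process for event {event['id']}")
def query_source(event): print(f"Requesting more detail on event {event['id']}")
def notify_human(event): print(f"Alerting {event.get('owner', 'duty manager')} about {event['id']}")

routes = {
    "known_remedy": auto_fix,        # automation only where the remedy is known
    "needs_detail": query_source,
    "unknown": notify_human,
}

def real_time_response_router(event: dict) -> None:
    handler = routes.get(event["intent"], notify_human)  # default to a human
    handler(event)

real_time_response_router({"id": "E-42", "intent": "known_remedy"})
real_time_response_router({"id": "E-43", "intent": "unknown", "owner": "ops lead"})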


Figure 2. RTBI architecture.

The RTRR deserves more attention, since it is the heart of an RTBI artifact. Pells (2009) points out that current BI lacks the much-needed interactive response between what is output from BI and what is antecedent to the output. This suggests that there has to be communication between both ends of a BI architecture. Murray (n.d.) affirms the need for this communication layer by saying that BI solutions should embrace flexibility. The flexibility referred to here is threefold: one, the BI solution has to flexibly supply the information needs of all information consumers according to their varying levels; two, the BI has to be endowed with a learning capability such that it builds up functionality as it is used to solve problems (Murray, n.d.; Azvine et al., n.d.); and lastly, it has to be able to send tactical corrective measures back to the operational IT applications for speedy resolution to occur.

From figure 2, the BPM layer gauges overall business performance by contrasting actual business performance as drawn from BI with the expected business performance stipulated by the trends embedded in the BPM layer. Also from figure 2, the interconnectedness of an enterprise's information systems influences the pace at which instructions flow between the BPM and the preceding layers (Gravic, Inc., 2010; Gathibandhe et al., 2010). Thus for a firm operating on independent systems, the starting point might be to establish robust and reliable communication among the disparate systems, followed by procuring and configuring capable hardware to support efficient live data movement between systems.

References:

1. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
2. Gathibandhe, H., Deogirikar, S. and Gupta, A.K. (2010). How smart is Real-Time BI? Available from:
http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 14 June 2010).
3. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010)
4. Langkamp, J. (2010). Business intelligence and performance management not very mature.
http://www.information-management.com/news/BI_performance_management_maturity-10017885-1.html?ET=informationmgmt:e1526:2230379a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=IM_Daily_051710 (Accessed 06 Jul 2010).
5. Martin, W. (2009). Agile corporate management.
http://www.wolfgang-martin-team.net/research-notes.php (Accessed 15 June 2010).
6. Murray, D. (n.d.). 7 principles for implementing high value business intelligence on a budget.
http://www.tableausoftware.com/files/TS_Budget_BI.pdf (Accessed 15 June 2010).
7. Pells, D.L. (2009). Global business intelligence for managers of programs, projects and project oriented organizations.
http://www.pmforum.org/library/editorials/2009/PDFs/june/Editorial-Pells-BI-for-PM.pdf (Accessed 09 July 2010).
8. Watson, J.H., Wixom, B.H., Hoffer, J.A., Lehman, R.A. and Reynolds, A. (2000). Real-Time business intelligence: Best practices at Continental Airlines. Information Systems Management 23(1):7-18.

Tuesday, June 29, 2010

The nexus between run-time business intelligence (RTBI) and business objectives

Quinn (n.d.) accentuates that RTBI is not only the software that a firm procures and deploys within its structures in the hope of better service, but combines a whole host of organizational actors, both human and non-human. The indivisible nexus between humans and non-humans is explained in detail by actor-network theorists (see Johnson, 1988; Callon, 1986; Law, 1992 for more details). It is the working together of all these actors, driving towards a negotiated direction, that yields the desired results within an organization. Quinn (n.d.) notes that organizations exist, and have existed, without the hyped and modern business intelligence solutions. However, the enrolment of an RTBI artifact into the already existing network of actors involved in decision making could help a firm bridge the gap between the running of its operational activities and its strategic decision-making activities (Quinn, n.d.). Business intelligence helps an enterprise to constantly align its choice selection with the desired business value as engraved in business goals.

Arnot (2004) and Ghemawat and Levinthal (2008) recount that a firm either chooses to define all the choices it wishes to follow, or partially defines these choices but keeps an open eye on what additional choices should be further selected or removed from the initial set. Fixing choices carries with it the aftermath of seeing past ineffective methods of business processing carried forward, leading to more business losses, while flexible choice selection means learning and adapting from emergent situations. In agreement with the latter, choice selection has more to do with an RTBI artifact and the people around it working together towards decision making, while adaptation is the informed enhancement of business activity. All in all, RTBI is not only the analytical software that produces statistical reports and aesthetic graphs, but also the individuals who play the role of making sense of those and using them for advantageous business processing.

This presentation looks at an RTBI model as presented by Quinn (n.d.) (see figure 1 below). The model depicts three mutually inclusive levels of an RTBI system: strategic, analytic and operational. Quinn further asserts that a robust RTBI would incorporate all three layers because they support one another.
Figure 1. RTBI layers – adapted from Quinn (n.d.)

According to Quinn (n.d.), the strategic layer of an RTBI helps a business enterprise define metrics to measure business performance empirically. These measures are visualized in graphical computer applications, which serve to depict for managers (both senior and line) what the metric scales are reading from databases, as guided by predefined business rules. In business intelligence (BI) parlance, these applications are termed dashboards (Gravic, Inc., 2010). Azvine, Cui, Nauck and Majeed (n.d.) and Quinn (n.d.) perceive the strategic layer as an enabler for RTBI to permeate the entire business domain. To illustrate with an example, say one of the strategic goals is to cut costs within a specified period in an organization. However, cutting costs may depend on a lot of antecedents residing in the various departments within a firm. It could, for instance, mean that redundancy in call centres should be minimized, or print costs should be cut, or business processes should be re-engineered, and so on. Thus indicators can be set on the dashboard to signal to managers whether the organizational targets dependent on these antecedents are being realized or not. Over and above serving an indicative role, the strategic layer also enables a business to devolve strategic responsibilities to the various operational areas. Individual departments are able to see the role they are playing within the entire business enterprise by understanding their strategic responsibilities (Quinn, n.d.). In an endeavour to clarify the mutual inclusiveness of the three levels given by the Quinn model, we are going to run a patient analogy in parallel. In this analogy the strategic layer could be likened to the senses that report to a human being when something is not well in his or her body.

Just as an abnormality in a human being still has to be diagnosed by a doctor, the strategic layer may sound a signal to the responsible person at the right time, but the problem still has to be diagnosed for causality. That is, we know what went wrong, but what caused it, when, and how? According to Quinn (n.d.), after a problem is spotted the analytical level should be able to tell more about the problem. This means looking at all the possible causes of an identified problem and coming up with its precise cause. Back to the analogy: the person goes to hospital, and a doctor runs all the tests and checks that have the potential to reveal the cause of the ailment suffered. Only tests and checks pertinent to clarifying the cause of the problem are performed. This is also true for RTBI according to Azvine et al. (n.d.), which streamlines diagnosis to only the area embodying an anomaly.

The last layer is the operational layer. Quinn (n.d.) says that at this level the problem is known and only the correct fix is pending. If it is a business process error, a method to fix it is worked out and applied at this level. In our analogy, that would be the point when a patient walks out of the doctor's room with a prescription note in their hand. However, this does not guarantee that the identified problem or ailment is going to be remedied as envisaged. The operational layer is also where dashboard lights or alarm signals are returned to normal, meaning all the running business activities are in agreement with the set rules, which are derived from the current corporate strategy. Going back to Ghemawat and Levinthal (2008)'s choices, if the defined rules are still not satisfying the envisaged goals, then it is time to revisit the choices that were made. In simple terms, it means changing what is not working for the business.

In sum, an RTBI connects to the various business enterprise units via the operational layer, but reveals itself through the strategic layer. Operational activities are made to adhere to certain processing standards, which are simply guided by metrics derived from the corporate strategy. For instance, call-centre management in firms adopting RTBI would have a live measure of calls per hour or average calls per call-centre agent against set thresholds. Secondly, the analytical layer services both the operational layer and the strategic layer. In servicing the operational layer, analytical processing enables constant business process validation against defined rules. Everything performed during a business transaction is analyzed for correctness, authenticity and value added. On the other hand, analytical processing helps a business project into the future by supporting the strategic layer. That is, based on what is happening currently, new choices can be incorporated into the business strategy which might be of greater value in the future. Lastly, the strategic layer allows management to steer overall business processing with an information-based instrument.

References:

1. Quinn, K. (n.d.). How business intelligence should work.
http://www.informationbuilders.com/products/whitepapers/pdf/How_BI_Should_Work_WP.pdf (Accessed 17 June 2010).
2. Johnson, J. (1988). Mixing humans and non-humans together: The sociology of a door-closer. Social Problems 35(3):298-310.
3. Callon, M. (1986). Some elements of a sociology of translation: Domestication of the scallops and the fishermen of St Brieuc Bay. Sociological Review Monograph 196-233.
4. Law, J. (1992). Notes on the theory of the actor network: Ordering, strategy and heterogeneity. Systems Practice 5(4):379-393.
5. Arnot, D. (2004). Decision support systems evolution: Framework, case study and research agenda. European Journal of Information Systems 13:247-259.
6. Ghemawat, P. and Levinthal, D. (2008). Choice interaction and business strategy. Management Science 54(9):1638-1651.
7. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from:
http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010)
8. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).

Wednesday, June 16, 2010

Possible uses of RTBI

Ghemawat and Levinthal (2008) reveal how organizations struggle to select the most appropriate set of choices for implementing actions. They base their reasoning on the fact that there is an infinite bundling of the set of choices available to any organization. This, coupled with the changing demands of a modern, knowledgeable customer (or client), imposes on a business enterprise a need for agility in decision making if it is to remain competitive in the marketplace (Martin, 2009). The environmental demands for increased speed towards quality product or service delivery call not only for appropriate decisions to be made for better performance, but also for them to be made at an accelerated pace (Schonberg, Cofino, Hoch, Podlaseck and Spraragen, 2000). The previous section (RTBI Defined) explains what RTBI is capable of, and how much better it is purported to fare in comparison with the long-existing and widely tried operational business intelligence. RTBI is hailed by Gathibandhe (2010) and Azvine, Cui, Nauck and Majeed (n.d.) as an organizational implement that encapsulates the capabilities of both best choice selection and agile decision making. In this section the possible uses of RTBI are visited by giving examples of implementations that were successfully carried out by a few institutions.

Revealed Current Applications of RTBI

Watson et al. (2006) give an account of how RTBI was successfully deployed at Continental Airlines, which resulted in an upsurge in its performance and an increased client base. In this account, the RTBI was centred on the needs of the client. That is, the RTBI was designed and implemented such that Continental Airlines served customer needs as and when they arose. The most valued customers were recognized through the system during transaction processing and incentivised in a timely manner for their loyal support. Other events that could deter the enrolment of new customers into the Continental Airlines family were unveiled at opportune times and mitigated with agile responses. Watson et al. (2006) assert that profit margins grew and myriad other benefits were realized by both the clients and the business.

Gravic, Inc. (2010) explains several implementations of RTBI as follows. RTBI is being used by some of the major banks to detect fraudulent events. A credit card transaction normally culminates in no more than the mere recording of an entry into a database. What Gravic, Inc. (2010) explicates is that even before an entry is recorded in the database, that very same record is validated for authenticity against well-known and emerging fraudulent trends. All of this happens in real time, and it enables timely actions to be taken either by supporting decisions that require human intervention or through automated rule-based processes. Another application which Gravic, Inc. (2010) recognizes as of growing interest in RTBI is stock trading. Various analyses are performed by RTBI on behalf of trading customers. Real-time trading data and product recommendation reports are supplied live so that informed investment decisions can be taken, thus possibly enabling risk (or loss) elimination. Finally, Gravic, Inc. (2010) recommends RTBI for inventory control and strategic marketing. In inventory control, a business is supplied with timely information to determine whether there is a need to restock certain products, given the rate at which stock is diminishing and the trends of historical purchases. In this case the data kept as historic data in the data warehouse inform strategies going forward, while the inflow of real-time information tactically briefs the business about anomalies and surfacing opportunities.
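A hedged sketch of the pre-commit validation described for card transactions (rules, thresholds and fields are invented): suspicious entries are diverted for review instead of being written straight to the ledger:

# Hypothetical pre-commit fraud check: a card transaction is validated against known
# fraud traits before it is recorded; suspicious entries are held for timely review.
ledger, review_queue = [], []

def looks_fraudulent(txn: dict) -> bool:
    return txn["amount"] > 5_000 or txn["country"] not in txn["home_countries"]

def record_transaction(txn: dict) -> None:
    if looks_fraudulent(txn):
        review_queue.append(txn)   # timely human or rule-based intervention
    else:
        ledger.append(txn)         # normal recording into the database

record_transaction({"id": "C1", "amount": 120.0, "country": "ZA", "home_countries": {"ZA"}})
record_transaction({"id": "C2", "amount": 9_000.0, "country": "UK", "home_countries": {"ZA"}})
print(len(ledger), "recorded,", len(review_queue), "held for review")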

One last example is recounted by Schonberg et al. (2000), who say that e-commerce is another area where the use of RTBI is proving indispensable. They term such RTBI e-business intelligence. In their discussion of measuring success, these authors highlight the informative nature of click-stream data, which is data gathered as users navigate a website. In this case, RTBI can be used to establish live communication between a business enterprise and a customer accessing its website. That is, an enterprise can read the precise requirements of customers as they emerge through navigational choice trends, and respond to them in time. This can be achieved by analyzing both click-stream data and data extracted from internet user profiles. Customer preferences by region, age, race, etc., and various other discoveries that might help a business enterprise maximize profit can be captured. Thus in this case RTBI responds to e-commerce events by enabling a business to derive useful patterns from the behaviour of the global web user. This could help a firm align its interests with those of users as demarcated by boundaries, race, age and so forth. Moreover, alignment happens swiftly and without any delays.
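As a rough illustration (event fields and segments are invented), click-stream events could be aggregated as they stream in, so that emerging page-to-page trends per customer segment are readable in near real time:

from collections import Counter

# Hypothetical click-stream sketch: navigation events are aggregated on arrival,
# keyed by (segment, from_page, to_page), to expose emerging navigation trends.
path_counts: Counter = Counter()

def ingest_click(event: dict) -> None:
    path_counts[(event["segment"], event["from_page"], event["to_page"])] += 1

for event in [
    {"segment": "age_18_25", "from_page": "home", "to_page": "sale_items"},
    {"segment": "age_18_25", "from_page": "home", "to_page": "sale_items"},
    {"segment": "age_36_50", "from_page": "home", "to_page": "support"},
]:
    ingest_click(event)

# Top navigation trend right now, e.g. to drive a live promotion decision.
(segment, src, dst), hits = path_counts.most_common(1)[0]
print(f"Most common path: {segment} moves {src} -> {dst} ({hits} clicks)")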

Who needs RTBI?

From the given examples it can be inferred that RTBI is a tool calibrated for businesses that experience a huge inflow of data, and at the same time need the generated data in processed form for agile decision making. These are businesses that handle volumes of transactions and deal with volumes of clients and customers across space and time. But any enterprise operating in the modern era is inundated with data of varying degrees of usefulness. These data lie within and around its domain of operation. Martin (2009) asserts that success is inherent in sifting through these data to get facts that can help substantiate decision making, and in doing so timeously. Therefore, the question posed in this sub-section can be answered with a counter question: who does not need indefinite success?

References:

1. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from: http://www.gravic.com/shadowbase/whitepapers.html (Accessed 24 May 2010).

2. Watson, J.H., Wixom, B.H., Hoffer, J.A., Lehman, R.A. and Reynolds, A. (2000). Real-Time business intelligence: Best practices at Continental Airlines. Information Systems Management 23(1):7-18.
3. Schonberg, E., Cofino, T., Hoch, R., Podlaseck, M., and Spraragen, S.L. (2000). Measuring success. Communications of the ACM 43(8):53-57.
4. Martin, W. (2009). Agile corporate management. http://www.wolfgang-martin-team.net/research-notes.php (Accessed 15 June 2010).
5. Gathibandhe, H. (2010). How smart is Real-Time BI? Available from: http://www.information-management.com/infodirect/2009_152/real_time_business_intelligence-10017057-1.html?pg=1 (Accessed 14 June 2010).
6. Azvine, B., Cui, Z., Nauck, D.D. and Majeed, B. (n.d.). Real time business intelligence for the adaptive enterprise. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.194&rep=rep1&type=pdf (Accessed 14 June 2010).
7. Ghemawat, P. and Levinthal, D. (2008). Choice interaction and business strategy. Management Science 54(9):1638-1651.

Monday, May 24, 2010

RTBI Defined

Business intelligence (BI) is viewed as a single view of operational events as captured through historical organizational data, which have been collated from the firm's disparate data sources, transformed, and availed to intended recipients (Gravic, Inc., 2010) through analytical and reporting tools. Such data mainly arm an organization with only a strategic tool for planning (Lehman, Watson, Wixom and Hoffer, n.d.; Gravic, Inc., 2010). Organizations are enabled to determine issues pertinent to the alignment of IT with strategy and future planning, which is indispensable for enhancing performance and maximizing profits (Oh and Pinsonneault, 2007). This has been the rationale behind the adoption of BI applications by modern organizations. BI promises the much-needed analysis of data harboured by the organization, sometimes sitting in silo-like systems that hardly communicate with one another even though they serve the same master: the organization. The product of this data analysis is facts which reflect the status quo of business operations. Business leaders thus base their decision making on such facts, which pose as a 'single version of operational truth' (Gravic, Inc., 2010). Trends delineated by historical data help these leaders check the likely future complexion of the business, and possibly leverage this information to gear the business to better handle what lies ahead.

However, due to the constant flux of doing business in the contemporary world, coupled with the unfettered boundaries made possible by globalization, there is a strong need for tactical decision making by businesses (Lehman et al., n.d.; Gravic, Inc., 2010). These authors are not writing off the benefits inherent in traditional BI, but they emphasize that the solution offered by these applications is not sufficient to cater for all the instant decision making needed for optimal performance. Gravic, Inc. (2010) further gives the example of a fraudulent-looking ATM transaction, which demands faster action by a banking institution. Traditional BI processes stale data, that is, data dating back many hours, weeks, even months; such data, no matter how accurately processed, may not give a timely alert to the bank handling a fraudulent transaction so that it can stem this (and similar unruly behaviour) just when the need arises. Instead of relying on historical (or stale) data, RTBI offers an organization the capability to act as and when threatening or favourable events emerge. It thus shortens the latency that dominates extant BI applications. The consequence is reduced reaction times to the various and capricious business events. Again, since this capability is only essential for correcting situations or seizing emerging opportunities, it is referred to as tactical support. However, RTBI is also a strategic tool as much as it is a tactical tool, in that historical data still form a basis for analysis. It depends on the user role of the decision maker which data (in terms of time) are most relevant for their decision support needs (Lehman et al., n.d.).

In addition, RTBI should not be seen as a substitute for management information systems (MIS), which process predefined reports and present them for analysis to responsible individuals. Like RTBI, such reporting enables business managers to take tactical action as prompted by the business environment. However, it enables only selected sections of the business to do so; that is, departmental (or divisional) managers are enabled to make tactical decisions for the individual units they are in charge of. An RTBI could indicate a problem associated with a specific business process as an MIS would, but it also offers the benefit of contrasting business process execution across time and space. For example, an MIS might provide a report that indicates success in business execution for one among a litany of business processes (belonging to a firm), whereas an RTBI, over and above the latter, could aggregate performance across the entire set of business processes.


In sum, what the various MIS adopted by a business could report on today about business execution is reported tomorrow (or after some measured latency) by traditional BI, and as it happens by RTBI. Note the emphasis on 'could report': it is very unlikely that the different MIS of a business enterprise report on an event by themselves. It takes human intervention to collate information from these systems and report something meaningful to the business. RTBI, however, is endowed with the collate, analyze and report capabilities which supplement human intervention so that speedy response can be achieved (Watson, Wixom, Hoffer, Lehman and Reynolds, 2000). It unlocks an instant event indication feature, allows MIS capability and provides a strategic platform for future planning (Lehman et al., n.d.; Gravic, Inc., 2010; Watson et al., 2000).


References:

1. Lehman, A.R., Watson, H.J., Wixom, B.H. and Hoffer, J.A. (n.d.). Continental Airlines fly high with real-time business intelligence. Available from: http://www.fuqua-europe.duke.edu/centers/ccrm/datasets/continental/Continental_Airlines_Case_Study.pdf (Accessed 22 May 2010).
2. Oh, P. and Pinsonneault, A. (2007). On the assessment of strategic value of information technologies: Conceptual and analytical approaches. MIS Quarterly 31(2):239-265.
3. Gravic, Inc. (2010). The evolution of real-time business intelligence. Available from: http://www.gravic.com/shadowbase/whitepapers.html (Accessed 20 May 2010).
4. Watson, J.H., Wixom, B.H., Hoffer, J.A., Lehman, R.A. and Reynolds, A. (2000). Real-Time business intelligence: Best practices at Continental Airlines. Information Systems Management 23(1):7-18.