Senthil On Data
Lessons learnt working with data models, data warehousing, Business Intelligence and MDM
Tuesday, June 21, 2011
As an architect...
Should guilty seek asylum here,
Like one pardoned, he becomes free from sin.
Should a sinner make his way to this mansion,
All his past sins are to be washed away.
The sight of this mansion creates sorrowing sighs;
And the sun and the moon shed tears from their eyes.
In this world this edifice has been made;
To display thereby the creator's glory.
Read the last line again: it stresses the need to display the creator's glory. An architect should leave his glory behind, long after he is gone from the project. That should be the true job description: the one thing people still talk about after he has resigned. The rest of the responsibilities are just enablers for that ultimate glory.
Personal Analytics - Excel & Qlikview
- I used Excel for feeding and maintaining the data, and for profiling attributes of all kinds.
- I used Qlikview for BI analytics.
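I did the profiling itself by hand in Excel, but the same checks are easy to script. As an illustration only (the function and the sample column are my own, not part of the original setup), here is what basic attribute profiling looks like with nothing but the Python standard library:

```python
from collections import Counter

def profile_attribute(values):
    """Profile a single column: row count, nulls, distinct values, top values."""
    non_null = [v for v in values if v not in (None, "")]
    counts = Counter(non_null)
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),
        "top": counts.most_common(3),
    }

# Example: profiling a 'country' attribute fed from a spreadsheet export.
countries = ["IN", "US", "IN", "", "UK", "IN", None, "US"]
print(profile_attribute(countries))
```

The same summary (counts, nulls, cardinality, top values) is what a pivot table or COUNTIF gives you in Excel; a script just makes it repeatable.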
Tuesday, June 14, 2011
Metadata Management - Scratching your own itch
Tuesday, June 07, 2011
Design Documentation - Batons in a relay race
I had always stayed away from "Documentation", because I never found a use for it except during audits or knowledge transitions. And even in an audit, nobody cares about the quality of the document; the auditors just check that the document exists. But a design document should be like the baton in a relay race. It should be:
1. Light - so that it's easy to carry.
2. Interesting - so that the consumer opens it and uses it frequently.
3. Does its job - so that the transition/handover is easy.
Light
Saturday, December 19, 2009
Can BI Strategy ever become real?
The problem is not with the BI consultant; the problem is with prediction. BI is a game where the variables are too many: money, business benefit, pain points, information maturity, tool consolidation, vendor proliferation, data volumes, system integrators, application support, advanced visualization, data conformance, data quality, stewardship, and so on. The list just goes on. A BI strategy almost turns into a weather-forecasting system. So is there no answer to being real? There is: "Stop doing it. Get real."
BI comes with a cost. It's not something you can purchase during a sale. BI is something every organization needs; it has become ubiquitous. A strategy is just a sales tool for getting your governance board's approval. Do you really need one? Why spend on a sales tool to prove that something ubiquitous is required for your organization? Would you construct a business case for seeking admission for your son or daughter into the IIMs or MITs of the world? Instead, spend the money on building a 60-day data mart. Let the users use it for a month. After a month, pull the plug. The number of calls you receive to get the system back will tell you the ROI of BI.
Let me know what you think.
Sunday, July 05, 2009
Operational BI - Part 2
The four important blocks to be considered while designing an O-BI system are
- Sourcing/Extraction Module
- Transformation & Load Module
- Data Retention Module
- Reporting Module
The Transformation & Load Module discusses the kind of loading tool-set that would suit an O-BI system. Details about the expected load volumes, the loading patterns and the hand-shaking mechanisms with the source will be covered.
The Data Retention Module discusses the parameters required for estimating the size of the sliding data-storage windows.
And finally, the Reporting Module discusses the kind of reports that an operational executive would need for taking tactical decisions on an hour-by-hour basis.
These sections will be discussed in detail in my subsequent posts.
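The sizing the Data Retention Module is concerned with boils down to simple arithmetic. As a minimal sketch, assuming an average event rate, an average row width, and a flat 30% overhead for indexes and aggregates (all parameter names and figures here are illustrative assumptions, not numbers from any real system):

```python
def retention_size_gb(rows_per_hour, bytes_per_row, window_hours, index_overhead=0.3):
    """Estimate online storage for a sliding retention window in an O-BI store.

    rows_per_hour  : average inbound event rate (assumed)
    bytes_per_row  : average stored row width (assumed)
    window_hours   : size of the sliding window kept online
    index_overhead : extra fraction for indexes/aggregates (assumed 30%)
    """
    raw_bytes = rows_per_hour * bytes_per_row * window_hours
    return raw_bytes * (1 + index_overhead) / 1024**3

# Example: 2M rows/hour, 200-byte rows, a 48-hour sliding window.
print(round(retention_size_gb(2_000_000, 200, 48), 1))
```

The point is that once the window slides, old hours are purged, so the estimate is bounded by the window size rather than growing without limit like a conventional warehouse.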
Saturday, April 18, 2009
Operational BI - Part 1
Let us motivate the need for an operational BI solution with an example.
A store manager at a retail outlet manages various aspects of retailing - visual merchandising, customer experience, resource scheduling, loss prevention, product management (ordering, receiving, pricing, inventory). Let me explain each one of these facets of the retailing business briefly.
- Visual Merchandising: Promotion of the sale of goods through visual appeal in the stores (source: Wikipedia).
- Customer Experience: Reduced customer wait-time in the check-out counters.
- Resource Scheduling: Monitoring the efficiency of the employee schedule for improved load balance of employee work-hours.
- Loss prevention: Real-time monitoring of 'shrinkage' because of shoplifting, employee embezzlement, credit card fraud, system errors and many more.
- Product Management: Real-time monitoring of product inventory.
Let us assume that the store manager has access to a reporting solution which refreshes once a day. He notices that the daily sales have dropped compared to the previous day. He drills down to investigate the cause of the decline and finds that the drop can be traced to one particular hour of the day. A deeper look highlights an increased average customer wait-time in that hour, causing a poor conversion rate. The wait time, in turn, is traced to a reduced workforce in that hour, because the employees took an extended lunch break (having turned up very early for work).
This problem could have been easily rectified if the store manager had had access to the data earlier. With real-time access, he would have noticed the dip in sales for that hour immediately and taken corrective action, protecting the sales for that hour. With a decent business case established for a real-time BI system, let's analyse what operational BI is and how it helps solve this class of problem.
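The alert the store manager lacked can be sketched in a few lines. The threshold, the baseline definition and the sample data below are invented for illustration:

```python
def flag_hourly_dips(hourly_sales, baseline, threshold=0.2):
    """Flag hours where sales fall more than `threshold` below the
    same-hour baseline (e.g. a trailing 4-week average for that hour)."""
    alerts = []
    for hour, (actual, expected) in enumerate(zip(hourly_sales, baseline)):
        if expected > 0 and (expected - actual) / expected > threshold:
            alerts.append((hour, actual, expected))
    return alerts

# Example: a dip at hour 12, the extended-lunch-break hour.
today =    [900, 950, 1000, 980, 990, 1010, 1000, 970, 990, 1000, 1010, 990, 600]
baseline = [900, 950, 1000, 980, 990, 1010, 1000, 970, 990, 1000, 1010, 990, 1000]
print(flag_hourly_dips(today, baseline))
```

Run against a real-time feed, such a check would have paged the store manager during the problem hour instead of the next morning.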
The architecture of Operational BI and the challenges associated with it will be posted in the next article.
Tuesday, October 21, 2008
Infobright's column-based data warehousing
Some of the key customers of Infobright are RBC Royal Bank and Xerox. They claim their product would be ideal for data warehouses ranging from 500GB to 30TB. Their compression ratios are close to 40:1 according to their community blogs. The most attractive feature about them was the compatibility with the existing Business Intelligence tools like Business Objects and Pentaho.
I wasn't very convinced by the concurrency offered: it supports 50-100 users with 5-10 concurrent queries. I will watch the progress of this exciting new player in the already crowded BI market.
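Column stores reach compression ratios like these partly because a single column's values are homogeneous and repetitive. A toy run-length encoding shows the effect; this is purely an illustration of the principle, not Infobright's actual compression scheme:

```python
from itertools import groupby

def rle_encode(column):
    """Run-length encode a column: collapse each run of equal values
    into a (value, run_length) pair."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(column)]

# A low-cardinality 'status' column as it might sit in a column store.
status = ["SHIPPED"] * 5000 + ["PENDING"] * 3000 + ["CANCELLED"] * 2000
encoded = rle_encode(status)
print(len(status), len(encoded))  # 10000 raw values collapse to 3 pairs
```

In a row store the same status values are interleaved with every other column of the row, so runs like this never form and the opportunity is lost.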
Thursday, July 10, 2008
Release Early Release Often
User Thrill
Important features of the application were phased across distinct releases: among them Hierarchy & Workflow management, Security and Exception reporting. The duration between releases was as short as 2 weeks, which meant the user saw features getting added every 2 weeks. We captured user feedback on each release and made sure we addressed it in the immediately following ones. This approach had a two-pronged benefit: the users experienced the application very, very early, and we experienced the bugs. By the time the UAT phase reached us, the application was in a near-zero-defect zone. We were a bit skeptical about whether user participation would be high, but since the product was there to be played with, it naturally attracted them.
Incremental Application testing
The application was getting tested from the day the first beta was released, rather than from the "Go Live" day. Although a few unpleasant bugs created some negative impressions, the users knew the application was in beta and that the next release would carry the patches. In fact, our testing team grew from a 3-member team to a 6-member virtual team (3 of them business users).
Support framework
To enable such a dynamic release process, the revision control and code review/release systems should be efficient, since there are multiple releases instead of one. The integration testing should be solid, and the unit testing before each release should be good enough not to frustrate your users completely, which would defeat the purpose. Meticulous planning of the releases is also key to success. The development tools you use should be agile and adaptable enough to accept and implement the users' feedback for the next release.
Conclusion
The experiment turned out to be a success. This strategy would work for most implementations, unless it's a maintenance project with deliverables of less than a week's duration.
Wednesday, June 25, 2008
Which MDM approach is right for you?
- Operational MDM (the tougher of the two)
- Analytical MDM
Operational MDM enables synchronization of master entities and their attributes across transaction-processing systems. Why does one need such an MDM? Let's take an example. ABC Corporation is a manufacturing firm. It conducts roadshows and marketing campaigns to advertise its products. Salespeople collect customer information during those roadshows and feed it into their IT systems for further follow-up. A different set of sales representatives collects feedback from customers on the products sold; they too enter the customer feedback into their IT systems. These are 2 different sets of CRM processes.
Typically, what happens in a mature company is that a set of batch processes picks up the master data from one system and transfers it to the other. This introduces delay, inconsistency, inaccuracy of data and a lot of manual reconciliation (the same customer can be entered by 2 different salespeople, or the latest survey from a salesperson can erase previously collected information about the customer). So IT develops custom programs to clean up the data and writes reconciliation programs, but still cannot manage to do all this in real time.
This mess can be reduced or eliminated by deploying an operational MDM. Operational MDM tools solve the synchronization problem using complex match-merge algorithms. Some of the tools currently in the market are Siperian, IBM, Purisma, Oracle and SAP.
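To give a feel for what a match-merge does, here is a toy sketch using simple string similarity from Python's standard library. Real operational MDM tools use far more sophisticated probabilistic matching and survivorship rules; the records, the threshold and the rules below are all invented for illustration:

```python
from difflib import SequenceMatcher

def match_customers(record_a, record_b, threshold=0.85):
    """Toy match rule: fuzzy-compare names, require an exact city match."""
    name_score = SequenceMatcher(
        None, record_a["name"].lower(), record_b["name"].lower()
    ).ratio()
    return name_score >= threshold and record_a["city"] == record_b["city"]

def merge(record_a, record_b):
    """Toy survivorship rule: prefer non-empty values from the newer record (b)."""
    return {k: record_b.get(k) or record_a.get(k) for k in record_a}

# The same customer captured by two different CRM processes.
crm1 = {"name": "Jon Smith",  "city": "Chennai", "phone": "555-1234", "email": ""}
crm2 = {"name": "John Smith", "city": "Chennai", "phone": "", "email": "js@abc.com"}
if match_customers(crm1, crm2):
    print(merge(crm1, crm2))
```

Even this toy shows the two halves of the problem: deciding that two records are the same customer (match), and deciding which attribute values survive (merge).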
Analytical MDM is the right architectural approach when the problem revolves around inconsistent reporting for business performance management; in simple terms, inconsistent hierarchies are being reported out. This calls for a unified reporting view of the master data. The audience for such a system is the downstream data warehousing and business intelligence applications. Some of the MDM vendors selling their expertise in this area are Kalido, Oracle and IBM.
An organization eventually has to build both these models to address its MDM needs. But which one to choose first depends on which problem is higher on its priority list.