Category Archives: management

Enterprise Performance Clearly Explained With a Collaborative, Intuitive, Reporting Solution

In today’s Digital Age, the ability of management to clearly explain the quality and sustainability of corporate performance has become more important than ever.  Increasingly, the ability to value intangible assets, and to explain that value, is becoming a competitive differentiator.  Regulatory mandates around narrative reporting are also emerging globally, with the EU Directive on Non-Financial Reporting and SEC interest in making financial disclosure more effective.

While there is a clear need for increased commentary and narrative in reporting, most performance reporting processes remain manual and ad hoc.  The effort is time consuming and lacks process rigor and collaboration.  Errors are made in combining ‘data’ (the what) with ‘narrative’ (the who, when, and why), especially when data is re-keyed.  In addition, organizations lack the ability to analyze the data to validate the narrative.  The disconnected nature of the process makes it difficult to bring subject matter experts in for centralized commentary.  Finally, there are auditability concerns and weak security around supporting “need to know” access to content.

In fact, in a recent survey, 90% of respondents agreed that expanding qualitative commentary in management reporting processes was critical to their organization. Yet more than half of respondents were not confident that their tools could provide sufficient collaboration to produce that qualitative commentary.

Oracle Enterprise Performance Reporting Cloud,  the newest offering in Oracle Enterprise Performance Management (EPM) Cloud, helps address these challenges.  It uniquely combines management, narrative and statutory reporting needs in a single, secure, and collaborative solution.  Complete authoring, collaboration, commentary, and report delivery capabilities streamline the process.  You can easily combine system of record data for more accurate reporting.  Secure, role-based auditable access on desktop and mobile devices enables the delivery of faster, meaningful insights to all stakeholders, anytime, anywhere.

Oracle Enterprise Performance Reporting Cloud combines data and narrative, providing a single web interface for report package contributors.  Report package owners define, manage, monitor and interact with content through this interface, while assigned users see only the content applicable to the role they have been assigned. In addition, users can easily take a deeper dive into the data without leaving the application.  Oracle Enterprise Performance Reporting Cloud includes the ability to perform multi-dimensional and other analysis on financial data.

The solution enables business users to participate in the narrative reporting process through the web interface on a variety of devices, including desktops and tablets.  Collaboration throughout the process is key to getting the most accurate picture possible, and helps shrink the time it takes to define, produce and deliver reports.

Increased demand for information, both internal and external, combined with a growing number of data sources, can make it challenging to have confidence in the results reported.  Oracle Enterprise Performance Reporting Cloud enables you to easily combine system-of-record data into your narrative reporting.  Authors can integrate both on-premises and cloud-based EPM and BI data sources directly, as well as data from Oracle and other ERP systems, thereby leveraging existing IT investments.  This helps build trust that the numbers and information are accurate.

We have seen tremendous interest from customers looking to “standardize” on a platform for narrative-based performance reporting.  Reporting needs range from quarterly or annual reports for external stakeholders, to internal management and business performance reviews, as well as periodic reports submitted to industry agencies, sustainability reporting, and more.

“We find Oracle Enterprise Performance Reporting Cloud extremely intuitive and easy to use.  The cloud-based nature of this solution, along with strong collaborative and security features, will help streamline the time it takes our clients to produce and deliver reports.”  – Neil Sellers, Director, Qubix

Stay tuned for more exciting news around customer adoption in the coming months!

To learn more about Oracle Enterprise Performance Reporting Cloud, click here.

How EPM and Six Sigma Intersect

There are so many wonderful business tools and methodologies out there that can help us monitor, analyze, set strategy and improve efficiency, etc., but can they all work together? Where do they connect? In this post I will focus on how EPM and Six Sigma intersect.


Six Sigma is a disciplined, data-driven approach and methodology for eliminating defects (driving toward six standard deviations between the mean and the nearest specification limit) in any process – from manufacturing to transactional and from product to service.  The principles of Six Sigma were originally developed by W. Edwards Deming during the rebuilding of Japan’s manufacturing industry after World War II, applying statistical methods to measure, test, and improve design, quality, and service.  By the 1980s, Six Sigma management techniques had been adopted more broadly for business process improvement by U.S. manufacturers such as Motorola, GE, Honeywell, and Dow competing in the global market.  By the 1990s, Six Sigma transcended manufacturing as Ritz-Carlton Hotels applied total quality management and process improvement techniques to delivering five-star luxury service for their guests, and the company was twice recognized with the Malcolm Baldrige National Quality Award by the U.S. Department of Commerce.

The Six Sigma method, when employed properly, aligns your organization and processes to achieve efficiency and a consistent standard of quality (whatever that standard should be).
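
To make that statistical definition concrete, here is a minimal sketch of the arithmetic behind the name: the sigma level is simply the distance, in standard deviations, from the process mean to the nearest specification limit. The process values below (target, spec limits, standard deviation) are made up for illustration and are not from this post.

```python
# Illustrative only: sigma level and expected defect rate for a hypothetical process.
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def sigma_level(mean: float, std_dev: float, lsl: float, usl: float) -> float:
    """Distance, in standard deviations, from the mean to the nearest spec limit."""
    return min(usl - mean, mean - lsl) / std_dev

def defects_per_million(mean: float, std_dev: float, lsl: float, usl: float) -> float:
    """Expected out-of-spec parts per million, assuming a normally distributed process."""
    p_below = normal_cdf((lsl - mean) / std_dev)
    p_above = 1.0 - normal_cdf((usl - mean) / std_dev)
    return (p_below + p_above) * 1_000_000

# Hypothetical process: target 100.0, spec limits 97.0 and 103.0, observed std dev 0.5.
mean, sd, lsl, usl = 100.0, 0.5, 97.0, 103.0
print(f"Sigma level: {sigma_level(mean, sd, lsl, usl):.1f}")          # 6.0
print(f"Defects per million: {defects_per_million(mean, sd, lsl, usl):.4f}")
```

(The widely quoted Six Sigma figure of 3.4 defects per million assumes an additional long-term shift of 1.5 standard deviations; the calculation above is the short-term, centered case.)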

Enterprise Performance Management is focused on:

* Setting strategy for the company, including
        – Which products/services should be the focus in order to be competitive
        – Who are the desirable customers
        – Which markets to play in
        – What are the short and longer term goals
* Setting budgets, simulating forecasts
* Monitoring strategy execution
* Adjusting the strategy based on outcomes
* Reporting on the financial outcomes
* Repeat

To be very successful, the two methods should be employed together – EPM setting the desired strategy, Six Sigma providing the optimal processes and products/services to achieve the strategy; Six Sigma reporting on the outputs of the company and EPM reporting on strategic and financial outcomes.

Six Sigma is primarily focused on lean operations: eliminating waste and inefficiency, monitoring feedback to know what’s working and what’s not (and when to ask what-if), and making adjustments based on that feedback for continuous improvement. Enterprise Performance Management, by contrast, has both an internal and an external view. It is simply not possible to set your near- or long-term strategy successfully without an understanding of external markets, external customer sentiment, competitors’ movements and, of course, R&D on new products and services.

Without getting too philosophical, EPM typically functions assuming products and services are being made well and focuses on setting strategy and executing the strategy. Six Sigma focuses on making the products and services well and assumes that they are the right products and services to be made and delivered. In my opinion, they need to work hand-in-hand to successfully achieve your strategy.

For more information about Oracle Enterprise Performance Management (EPM), click here

B/E Aerospace Wins Business Analytics Innovation Award!

Todd Renard, Senior Manager of Financial Planning & Analysis for B/E Aerospace, was very excited to receive the Oracle Business Analytics Innovation Award at Oracle OpenWorld 2014 for the company’s impressive results achieved with Oracle Enterprise Performance Management solutions.

B/E Aerospace is the worldwide leading manufacturer of aircraft passenger cabin interior products for commercial and business jet aircraft. The company, which was growing rapidly through a series of acquisitions, decided to adopt Oracle Enterprise Performance Management solutions to drive innovation and organizational change.

They took a three-phase approach:


* PHASE I – Prove the value of the Hyperion solutions to senior management by leveraging the applications to meet company goals
* PHASE II – Build a superior financial end-to-end solution for monthly, quarterly, and annual reporting
* PHASE III – Build scalable daily financial reporting & analysis applications in order to make better decisions faster

In just nine months, the company completed a full-scale implementation that was delivered on time and under budget. As a result, B/E Aerospace has reduced by 80 percent the amount of time it takes to mine data from more than 30 sources. And the business can also acquire new companies and integrate their financials in three to four weeks instead of six months—dramatically speeding assimilation and supporting their acquisition strategy. 

Click here to watch the short video.

Why Not Data Quality?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

Big data, business intelligence, analytics, data governance, data relationship management, the list of data oriented topics goes on. Lots of people are talking about all of these and yet very few people talk about poor data quality, let alone do anything about it. Why is that?

We think it’s because Data Quality suffers from the “Do I have to?” (DIHT) syndrome. Anyone with kids or anyone who was a kid will recognize this syndrome as in, “Clean up your room.” “Do I have to?” Dealing with poor quality data is not glamorous and it doesn’t get the headlines. Installing business intelligence systems or setting up data governance, gets the headlines and makes careers. But what good is better reporting and structure if the underlying data is junk?

Recently we were in a half-day planning session with an existing customer. The customer wanted to know what they could do better using the software they had already purchased as Phase 1 of the project, and what they would need to acquire to do things better in Phase 2. Reviews like this are critically important, as people change on both sides of the customer/vendor relationship, to ensure knowledge transfer and reaffirmation of goals. The customer provided access to numerous departments across their company for interviews and focus groups. All of this information was gathered, reviewed, and summarized, and suggestions were made. Excel spreadsheets and PowerPoints ensued. Even though the Aberdeen Group and others have shown significant performance increases in established ERP and other business systems through the use of Data Quality and Master Data Management, because the customer did not directly say they had a data issue (and very few customers ever admit this, because poor data is just standard operating procedure), no emphasis was put on data quality as a way of improving the customer’s processes and results with their existing software packages.

What is it about data quality that makes it the option of last resort, the go-to when all else fails? It’s got to be the belief that the data in underlying systems and sources is good by default. I mean, really, who would keep bad data around? Well, pretty much everyone. Because if you don’t know that it’s bad, you end up keeping it around.

Let’s admit it, DQ is not glamorous. There are no DQ-er of the year awards. People in DQ don’t typically have their names on a parking spot right up front in the corporate lot. And besides not being glamorous, it’s hard. Very rarely do we see someone ‘own’ data quality – after all, since bad data affects multiple people across multiple functions, no one really has the right incentives to drive data quality improvements where the resulting benefits accrue to multiple constituencies. Nobody really wants to spend their functional budgets fixing enterprise-wide data problems. Some of the very early DQ-adopting companies have teams of people, representing a cross-section of processes and functions, who spend their days manually inspecting data and creating internal systems to meet their specific data quality needs. They are very effective at what they do, but not as efficient as they could be, because the whole is greater than the sum of its parts. Also, most of the data knowledge is in their heads, and that’s really hard to replicate and subject to loss due to job switching, retirement, or the possible run-in with the proverbial bus.

So, given the underwhelming desire to fix poor data, how do you get the powers that be in your company to see the light? In our last article, “Data Quality, Is it worth it? How do you know?”, we examined the value of data quality based on units of measure that were meaningful to the given organization. To paraphrase Field of Dreams, if you build the ROI, they will come. The first step to building the ROI is understanding how poor your data is and what impact that has on your organization. Typically that starts with a Data Quality Health Check.

A DQ Health Check takes a sample of your data and looks at varying aspects to determine its quality level. The aspects examined include Consistency, Completeness, Accuracy, and Validity. These measures attempt to answer the question, “Is your data fit for purpose?” Consistency looks at the validation of the data within a variable. For example, if the variable in question only allows for Ys and Ns, Ms and Ts will lower the consistency rating. Completeness is just that: how complete is the data in your database? Using our previous example, if Ys and Ns are only present 20% of the time, your data for that variable is fairly incomplete. Accuracy looks at a number of things, but mostly whether the data falls within the bounds of expectations. And Validity looks at usefulness. For example, telephone numbers are typically 10 digits; phone numbers without area codes or with letters, while complete and possibly consistent, are not valid.
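
As a minimal sketch of what such a check measures (not Oracle’s health-check tooling; the field names, allowed values, and sample records below are hypothetical), three of the four dimensions can be scored directly against a data sample. Accuracy usually requires comparison against a trusted reference source, so it is omitted here.

```python
# Illustrative only: scoring a data sample on consistency, completeness, and validity.
import re

records = [
    {"opt_in": "Y",  "phone": "2125551234"},
    {"opt_in": "M",  "phone": "555-1234"},      # 'M' is outside the allowed Y/N domain; phone lacks an area code
    {"opt_in": None, "phone": "212555ABCD"},    # missing flag; letters in the phone number
    {"opt_in": "N",  "phone": "3105557890"},
]

def pct(n: int, total: int) -> float:
    return 100.0 * n / total if total else 0.0

# Consistency: populated values conform to the allowed domain (only Y or N).
populated = [r for r in records if r["opt_in"] is not None]
consistency = pct(sum(r["opt_in"] in ("Y", "N") for r in populated), len(populated))

# Completeness: how often the field is populated at all.
completeness = pct(len(populated), len(records))

# Validity: fitness for use (here, a 10-digit phone number).
validity = pct(sum(bool(re.fullmatch(r"\d{10}", r["phone"])) for r in records), len(records))

print(f"opt_in consistency:  {consistency:.0f}%")   # 67% - the 'M' fails the Y/N rule
print(f"opt_in completeness: {completeness:.0f}%")  # 75% - one record is missing the flag
print(f"phone validity:      {validity:.0f}%")      # 50% - the short and lettered numbers fail
```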

In another recent customer engagement, we looked at customer records for data anomalies specifically for consistency, completeness, accuracy, and validity. We found that fixing these records resulted in improvements not only in marketing (campaign effectiveness), but also improved service (customer experience), higher collections in finance (lower receivables), and improved reporting. In today’s data rich, integrated, system-driven processes, improving data quality in one part of the organization (whether it be customer data, supplier data, financial data) benefits multiple organizational functions and processes.

So while data quality will never be glamorous for individuals, with a little insight providing a strong ROI for DQ, we can move this from “Do I have to?” to “Let’s do this.”

Data Quality, Is it worth it? How do you know?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team leading the Integration practice in North America.

You might think that the obvious answer to the title question would be to just fix your data quality problems, but not so fast. As heretical as this might be to write, not all data quality problems are worth fixing. While the data purists will tell you that every data point is worth making sure it is correct, we believe you should only spend money fixing data that has a direct value impact on your business. In other words, what’s the cost of bad data?

What’s the cost of bad data? That’s a question that is not asked often enough. When you don’t understand the value of your data and the costs associated with poor data quality, you tend to ignore the problem, which tends to make matters worse – specifically for initiatives like data consolidation, big data, customer experience, and data mastering. The ensuing negative impact has wider ramifications across the organization, primarily for the processes that rely on good quality data. All of the business operations systems that businesses run on (ERP, CRM, HCM, SCM, EPM) assume that the underlying data is good.

Then what’s the best approach for data quality success? Paraphrasing Orwell’s Animal Farm, “all data is equal, some is just more equal than others”. What data is important and what data is not so important is a critical input to data quality project success. Using the Pareto rule, 20% of your data is most likely worth 80% of your effort. For example, it can be easily argued that financial data have a greater value as they are the numbers that run your business, get reported to investors and government agencies, and can send people to jail if they’re wrong. The CFO, who doesn’t like jail, probably considers this valuable data. Likewise, a CMO understands the importance of capturing and complying with customer contact and information sharing preferences. Negligent marketing practices, due to poor customer data, can result in non-trivial fines and penalties, not to mention bad publicity. Similarly, a COO may deem up-to-date knowledge of expensive assets as invaluable information, along with description, location, and maintenance schedule details. Any lapses here could mean significant revenue loss due to unplanned downtime. Clearly, data value is in the eye of the beholder. But prioritizing which data challenges should be tackled first needs to be a ‘value-based’ discussion.

How do you decide what to focus on? We suggest you focus on understanding the costs of poor data quality and management, and then establishing a metric that is meaningful to your business. For example, colleges might look at the cost of poor data per student, utilities the cost per meter, manufacturers the cost per product, retailers the cost per customer, or oil producers the cost per well. Doing so makes it easy to communicate the value throughout your organization and allows anyone who understands the business to size the cost of bad data. For example, our studies show that on-campus data quality problems can cost anywhere from $70 to $480 per student per year. Let’s say your school has 7,500 students and we take a conservative $100 per student. That’s a $750,000-per-year data quality problem. As another example, our engagement with a utility customer estimated that data quality problems can cost between $5 and $10 per meter. Taking the low value of $5 against 400,000 meters quantifies the data quality problem at $2,000,000 annually. Sizing the problem lets you know just how much attention you should be paying to it. But this is the end result of your cost-of-poor-data-quality analysis. Now that we know the destination, how do we get there?
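
The sizing math itself is deliberately simple; here is the arithmetic above as a small sketch (the per-student and per-meter figures are taken from this article, and the function is purely illustrative):

```python
# Illustrative only: sizing a data quality problem as cost-per-unit times number of units.
def annual_cost_of_bad_data(cost_per_unit: float, unit_count: int) -> float:
    return cost_per_unit * unit_count

# Campus example: a conservative $100 per student across 7,500 students.
print(f"Campus:  ${annual_cost_of_bad_data(100, 7_500):,.0f} per year")    # $750,000
# Utility example: the low end of $5 per meter across 400,000 meters.
print(f"Utility: ${annual_cost_of_bad_data(5, 400_000):,.0f} per year")    # $2,000,000
```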

To achieve these types of metrics you have to assess the impact of bad data on your enterprise by engaging all of the parties that are involved in attempting to get the data right, and all of the parties that are negatively affected when it is wrong. You will need to go beyond the creators, curators and users of the data and also involve IT stakeholders and business owners to estimate: impact on revenues; cost of redundant efforts in either getting the data or cleaning it up; the number of systems that will be impacted by high quality data; cost of non-compliance; and cost of rework. Only through this type of analysis can you gain the insight necessary to cost-justify a data quality and master data management effort.

The scope of this analysis is determined by the focus of your data quality efforts. If you are taking an enterprise-wide approach then you will need to deal with many departments and constituencies. If you are taking a Business Unit, functional or project focus for your data quality efforts, your examination will only need to be done on a departmental basis. For example, if customer data is the domain of analysis, you will need to involve subject matter experts across marketing, sales, and service. Alternatively, if supplier data is your focus, you will need to involve experts from procurement, supply-chain, and reporting functions.

Regardless of data domain, your overall approach may look something like this:

  1. Understanding business goals and priorities
  2. Documenting key data issues and challenges
  3. Assessing current capabilities and identifying gaps in your data
  4. Determining data capabilities and identifying needs
  5. Estimating and applying benefit improvement ranges
  6. Quantifying potential benefits and establishing your “cost per” metric
  7. Developing your data strategy and roadmap
  8. Developing your deployment timeline and recommendations

Going through this process ensures executive buy-in for your data quality efforts, gets the right people participating in the decisions that will need to be made, and provides a plan with an ROI, which will be necessary to gain approval to go ahead with the project.

Be sure to focus on: Master Data Management @ OpenWorld

How Do You Know if You Have a Data Quality Issue?

By: John Siegman – Applications Sales Manager for Master Data Management and Data Quality, & Murad Fatehali – Senior Director with Oracle’s Insight team, leading the Integration practice in North America.

Big Data, Master Data Management, and Analytics are all topics and buzzwords getting big play in the press. And they’re all important, as today’s storage and computing capabilities allow for automated decision making that provides customers with experiences more tailored to them, as well as better information upon which business decisions can be made. The whole idea being: the more you know, the more you know.

Lots of companies think they know that they should be doing Big Data, Master Data Management and Analytics, but don’t really know where to start or what to start with. My two favorite questions to ask any prospective customer while discussing these topics are: 1) Do you have data you care about? And, 2) Does it have issues? If the answers come back “Yes” and “Yes” then you can have the discussion on what it takes to get the data ready for Big Data, Master Data Management and Analytics. If you try any of these with lousy data, you’re simply going to get lousy results.

But, how do I know if I’ve got less than stellar data? All you have to do is listen to the different departments in your company and they will tell you. Here is a guide to the types of things you might hear.

You know you have poor data quality if MARKETING says:

1. We have issues with privacy management and customer preferences
2. We can’t do the types of data matching and enhancement we want to do
3. There’s no way to do data matching with internal or external files
4. We have missing data but we don’t know how much or which variables
5. There’s no standardization or data governance
6. We don’t know who our customer is
7. We’ve got compliance issues

You know you have poor data quality if SALES says:

1. The data in the CRM is wrong, outdated, and needs to be re-entered
2. I have to go through too many applications to find the right customer answers
3. Average call times are too long due to poor data and manual data entry
4. I’m spending too much time fixing data instead of selling

You know you have poor data quality if BUSINESS INTELLIGENCE says:

1. No one trusts the data, so we have lots of Excel spreadsheets and none of the numbers match
2. It’s difficult to find data and there are too many sources
3. We have no data variables with consistent definitions
4. There’s nothing to clean the data with
5. Even data we can agree on, like telephone number, has multiple formats

You know you have poor data quality if OPERATIONS or FINANCE says:

1. The Billing report does not match the BI report
2. Payment information and address information do not match the information in the Account Profile
3. Accounts closed in Financial Systems show up as still open in the CRM system, or vice versa, so customers get billed for services already terminated
4. Billing inaccuracies are caught during checks because there are no up-front governance rules
5. Agents enter multiple orders for the same service or product on an account
6. Service technicians show up on site with the wrong parts and equipment, which then requires costly repeat visits and negatively impacts customer satisfaction
7. Inventory systems show items sales deemed OK to sell while suppliers may have marked them obsolete or recalled
8. We have multiple GLs and not one single version of financial truth

You know you have poor data quality if IT says:

1. It’s difficult to keep data in sync across many sources and systems
2. Data survivorship rules don’t exist
3. Customer data types (B2B, end user in B2B, customer in B2C, account owner in B2C) and statuses (active, trial, cancelled, etc.) change for the same customer over time, and it’s difficult to keep track without exerting herculean manual effort

You know you have poor data quality if HUMAN RESOURCES says:

1. We first have to wait for data; then, when it is gathered and delivered, we need to work to fix it
2. Ten percent of our time is wasted waiting on things or on re-work cycles
3. Employee frustration with searching, finding, and validating data results in churn, and will definitely delay the re-hire of employees
4. Incorrect competency data results in: a) productivity loss from looking at the wrong skilled person; b) possible revenue loss due to a lack of needed skills; and c) additional hires when none are needed

You know you have poor data quality if PROCUREMENT says:

1. Not knowing our suppliers impacts efficiencies and costs
2. FTEs in centralized sourcing spend up to 20% of their time fixing bad data and related process issues
3. Data in our vendor master, material master, and pricing information records is currently synced manually because the data is not accurate across systems. We end up sending orders to the wrong suppliers
4. Supplier management takes too much time
5. The new product creation form contains wrong inputs, rendering many fields unusable
6. Multiple entities (Logistics, Plants, Engineering, Product Management) enter or create Material Master information. We cannot get spend analytics
7. We have no good way of managing all of the products we buy and use

You know you have poor data quality if PRODUCT MANAGEMENT says:

1. Product development and life-cycle management efforts take longer and cost more
2. We have limited standards and rules for product dimensions. We need to manually search for missing information available elsewhere
3. Our product data clean-up occurs in pockets across different groups; the end result of these redundant efforts is duplication of standards
4. We make status changes to the product lifecycle that don’t get communicated to Marketing and Engineering in a timely manner. Our customers don’t know what the product now does

All of these areas suffer, either individually or together, due to poor data quality. All of these issues impact corporate performance, which impacts stakeholders, which impacts corporate management. If you’re hearing any of these statements from any of these departments, you have a data quality issue that needs to be addressed. And that is especially true if you’re considering any type of Big Data, Master Data Management or Analytics initiative.

MDM CHALLENGES BY HIGHER EDUCATION DOMAIN

Author: John Siegman 

How do you know if you have a Master Data Management (MDM) or Data Quality (DQ) issue on your campus? One of the ways is to listen to the concerns of your campus constituents. While none of them are going to come out and tell you directly that they have a master data issue, by knowing what to listen for you can determine where the issues are and the best way to address them.

What follows are some of the key on-campus domains and what to listen for to determine if there is an MDM or DQ issue that needs to be resolved.

Student: Disconnected processes lacking coordination

· Fragmented data across disparate systems, disconnected across groups for:
– data collection efforts (duplicate/inconsistent student/faculty surveys)
– data definitions, rules, governance
– data access, security, and analysis
· Lack of training around security/access, further complicated by the number of sources
· No information owner / no information strategy
· Student attributes maintained across many systems

Learning: Does not capture interactions

· Cannot identify students at risk. Do not capture interactions with students and faculty, or faculty interactions for research support, etc.
· No way to track how many undergraduates are interested in research
· Don’t do any consistent analytics for course evaluations
· Difficult and time consuming to gather information because of the federated nature of the data – for example, job descriptions in HR are different from what is really being used
· There is no view of the student experience

HR: Process inconsistencies and a lack of data standards complicate execution

· Faculty not paid by the university are not in the HCM system, while students receiving payments from the university are in the HCM system
· Disconnected process to issue IDs and keys; duplicate issues
· Given the multiplicity of data sources, accessing the data is a challenge
· Data analytics capabilities and available reports are not properly advertised, so people do not know what is available. As a consequence, an inordinate amount of time is spent generating reports
· Faculty/staff information collection is inconsistent, sometimes paper-based. Implication: applicants are lost because it is too difficult to complete the application process

Research: Getting from data to insight is a challenge

· Very time consuming to determine: Which proposals were successful? What types of awards are we best at winning?
· Difficult to understand: number of proposals, dollar value, by school, by department, by agency, by time period
· Data challenges in extracting data out of the system for grants and faculty, and making it centrally available

Deans & Officers: Reporting is a challenge

· Significant use of Excel; reporting is becoming unstable because of the amount of data in the files
· No information charter; a common retention policy does not exist
· A lot of paper is generated for the domains we are covering. Converting paper to digital is a challenge
· Collecting information on faculty activity (publications) is a challenge. Data in documents requires validation
· Data requests result in garbage. Donors receive the wrong information.

Finance: Has little trust in data

· Do not have workflow governance processes. Implication: information goes into the system without being reviewed, so errors can make it into the records
· Systems connected to ERP systems do not always give relevant or requested info
· Closing the month or quarter takes too long, as each school and each department has its own set of GLs.

Facilities: Efficiencies are hampered due to data disconnects

· Do not have accurate space metrics due to an outdated system; schools are not willing to share their info with Research Administrators and Proposal Investigators
· Do not have utility consumption, building by building
· No clear classroom assignment policy (a large room may be assigned to a small number of students)
· Not all classes are under the registrar’s control
· No tool showing actual space for planning purposes
· Difficult to determine research costs without accurate access to floor plans and utilization
· Cannot effectively schedule and monitor classrooms

If your campus has data, you have data issues. As the push for students becomes more competitive, being able to understand your current data, mine your social data, target your alumni, make better use of your facilities, improve your supplier relationships, and increase your student success will depend on better data. The tools exist to take data from a problem-filled liability to a distinct competitive advantage. The sooner campuses adopt these tools, the sooner they will receive the benefits of doing so.

Master Data Management and Service-Oriented Architecture: Better Together

By Neela Chaudhari

Many companies are struggling to keep up with constant shifts in technology and at the same time address rapid changes in the business. As organizations strive to create greater efficiency and agility with the aid of new technologies, each new business-led project may further fragment IT systems and result in information inconsistencies across the organization. Because data is an essential input for all processes and business objects, these irregularities can undermine the original business objectives of the technology initiatives.

Combining the use of master data management (MDM) on the business side and service-oriented architecture (SOA) on the IT side can counteract the problem of information inconsistency. SOA is a practice that uses technology to decouple services, transactions, events, and processes to enhance data availability for business applications across a range of use cases. But the underlying data is often overlooked or treated as an afterthought when it comes to business processes, leading to poor data quality for your business applications. Without MDM, the data made available to business applications by an SOA approach might be less than accurate, and that inaccuracy becomes more widespread throughout the organization. That can lead to a situation where lower-quality data is consumed by more business users, ultimately thwarting the objectives of efficiency and agility.

MDM can add value to SOA efforts because it improves the quality and trustworthiness of the data that is being integrated and consumed. MDM aids the tricky issue of upstream and downstream systems integration by ensuring the systems access a data hub containing accurate, consistent master data. It also assists SOA by providing consistent visibility and a technical foundation for master data use. MDM delivers the necessary data services to ensure the quality and timeliness of the enterprise objects the SOA will consume.
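
To make the hub-and-service pattern concrete, here is a minimal sketch of the idea (the class names, keys, and records are hypothetical and not an Oracle API): an SOA-facing data service resolves each consumer’s source-system key against the MDM hub, so billing and marketing both receive the same golden record instead of their own divergent copies.

```python
# Illustrative only: an SOA data service backed by a master data hub.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomerRecord:
    customer_id: str
    name: str
    email: str

class MasterDataHub:
    """Stands in for the MDM hub holding deduplicated, survivorship-resolved golden records."""
    def __init__(self) -> None:
        self._golden = {"C-1001": CustomerRecord("C-1001", "Acme Industries", "ap@acme.example")}
        # Cross-references map source-system keys back to the golden record.
        self._xref = {("CRM", "crm-778"): "C-1001", ("ERP", "40022"): "C-1001"}

    def resolve(self, source: str, source_key: str) -> Optional[CustomerRecord]:
        golden_id = self._xref.get((source, source_key))
        return self._golden.get(golden_id) if golden_id else None

class CustomerDataService:
    """The SOA-facing service: consumers call this instead of reading source systems directly."""
    def __init__(self, hub: MasterDataHub) -> None:
        self._hub = hub

    def get_customer(self, source: str, source_key: str) -> CustomerRecord:
        record = self._hub.resolve(source, source_key)
        if record is None:
            raise LookupError(f"No mastered record for {source}/{source_key}")
        return record

service = CustomerDataService(MasterDataHub())
# Billing (keyed by ERP) and marketing (keyed by CRM) both get the same golden record.
print(service.get_customer("ERP", "40022"))
print(service.get_customer("CRM", "crm-778"))
```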

To learn more about the importance of MDM to SOA investments, read the in-depth technical article “MDM and SOA: Be Warned!” (http://www.oracle.com/technetwork/articles/soa/ind-soa-mdm-2090170.html).

And don’t miss the new Oracle MDM resource center (http://www.oracle.com/webapps/dialogue/ns/dlgwelcome.jsp?p_ext=Y&p_dlg_id=11125359&src=7319909&Act=42). Visit today to download white papers, read customer stories, view videos, and learn more about the full range of features for ensuring data quality and mastering data in the key domains of customer, product, supplier, site, and financial data.

Disclaimer:
1) Oracle, Oracle Hyperion, Hyperion, and Java are registered trademarks of Oracle and/or its affiliates.
2) Microsoft is a registered trademark of Microsoft and/or its affiliates.
3) Any other trademarks, names, logos, images, etc. are the copyright and trademarks of their respective owners, which also includes Innov8 Infinite Technology Pvt. Ltd.