Mark Madsen (2013) writes, “Big data isn’t hype, but it is being hyped”. With this one statement, Madsen advises us that, simply because big data sits near the “Peak of Inflated Expectations” in Gartner’s hype cycle, organizations should be careful not to dismiss its promises. Big data and big analytics will drive astonishing societal changes – both commercial and non-commercial. However, to be fruitful with big data, organizations have several challenges to overcome. These challenges range from critical investments in enabling technologies to overcoming internal political fights – from locating the right talent to enhancing data quality. This critical analysis paper draws on numerous articles to clarify the significance of Big Data and how to succeed deliberately by picking the right software, tools and team. 

Business leaders and managers today realize more clearly than ever that information leads to insight and insight leads to business success. This drive for information and desire for business breakthroughs have shaped the technology of what we know as Big Data. In the past, Big Data was treated as a technology issue, but organizations have reframed it as an issue of business value creation. The convergence of big data, low-cost hardware and information management means that organizations now have the capabilities required to analyze astonishing data sets quickly and cost effectively. It represents a clear opportunity for enormous gains in efficiency, productivity, revenue and profitability. The burst of big data started in the first decade of the 21st century with online firms and startup companies. Firms like Google, eBay, Amazon and Facebook were built around big data from the start and had little work to do to integrate it with traditional systems; established companies, by contrast, had to integrate everything going on in the company with big data. The evolving definition centers Big Data on the three V’s of data: Volume, Velocity and Variety. According to the book “Big Data Big Analytics”, analytics on Big Data has to coexist with analytics on other types of data. Big Data Analytics plays an important role in sorting out the winners in this competitive global economy by using a wide variety of advanced analytics methods like Data Mining, Predictive Analytics, Simulation and Optimization. It also uses Prescriptive Analytics, which takes past information and uses it to direct future activities to achieve optimal or near-optimal results. Even small benefits provide a large payoff when adopted on a large scale. The book also says that Big Data is neither a replacement for data warehousing nor something to be treated separately. 
The reality is that it is all about new models for data processing and applying new technologies to meet unfulfilled needs. Hadoop is one such technology that has been most in demand when it comes to handling Big Data. It is an open-source platform which stores and processes divergent data types, and it empowers an organization to work with the most modern data architectures. A Big Data foundation is composed of two major systems: first, data has to be stored on infinitely scalable hardware; second, it has to be processed and converted into usable business intelligence. Of the many computing techniques available today, parallel computing is the one that can handle the speed and volume of data being produced. MapReduce is one of the available parallel computing techniques.
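As an illustration (not taken from any of the cited sources), the MapReduce model can be sketched in a few lines of plain Python: a map phase emits key/value pairs, a shuffle phase groups values by key, and a reduce phase aggregates each group. The documents and word counts here are invented; a real Hadoop job would distribute these phases across many machines.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle: group all emitted values by key, so each key's values land together."""
    groups = defaultdict(list)
    for word, count in pairs:
        groups[word].append(count)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each group - here, sum the counts per word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Hypothetical input documents
docs = ["big data is not hype", "big data is being hyped"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
```

The same three-phase structure underlies Hadoop’s word-count example; parallelism comes from running many mappers and reducers concurrently on partitions of the data.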

Visualization is another technique, which facilitates the identification of patterns in data and makes it more presentable. By mapping data attributes to visual properties such as position, size, shape and color, designers leverage perceptual skills to help users discern and interpret patterns within data. Tableau is one such software, growing at an unprecedented rate, that helps in data discovery by enabling users to explore data, make discoveries and uncover insights in a dynamic and intuitive way. It provides a powerful means of making sense of data. Apart from technology, it is the “People” who lead the company’s success. “Big Data Big Analytics” says that the trend of Big Data Analytics has given rise to the Data Scientist, who uses business, math and technology along with the behavioral sciences to create the appropriate incentives to drive behaviors that align with business goals. Optimization techniques are mathematically intensive, and with big data the computation becomes massive. The experienced data scientist is knowledgeable in a variety of scientific approaches and takes into account uncertainty, complex interrelationships and conflicting objectives. Data scientists create models that are representations of the data and its patterns, which are used to predict future outcomes based on patterns in past data. Marketing evangelist Avinash Kaushik believes that, according to the 90/10 rule, investing 10% in systems and 90% in people is the key to success, because we need smarter people to handle the smart systems of this digital era. In order to generate more such data scientists, Mu Sigma, the world’s largest decision sciences company, took a new approach of “creating” talent at its own university, developing people with traits like agility, scale and convergence, multidisciplinary talent, innovation and cost effectiveness. Other than technology and people, Big Data has a high focus on Data Privacy and Ethics. 
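The core idea of mapping data attributes to visual properties (position, size, color) can be sketched without any charting library. The sales records, attribute names and color palette below are hypothetical, and the scaling factor is arbitrary; a tool like Tableau performs this kind of encoding automatically when a user drags fields onto shelves.

```python
def encode(records, x_attr, y_attr, size_attr, color_attr, palette):
    """Map each record's data attributes to visual mark properties."""
    marks = []
    for r in records:
        marks.append({
            "x": r[x_attr],                   # position encodes one measure
            "y": r[y_attr],                   # position encodes a second measure
            "size": r[size_attr] * 0.5,       # mark size encodes magnitude
            "color": palette[r[color_attr]],  # hue encodes a category
        })
    return marks

# Hypothetical sales data and categorical color palette
sales = [
    {"revenue": 120, "profit": 30, "units": 40, "region": "East"},
    {"revenue": 90,  "profit": 10, "units": 25, "region": "West"},
]
palette = {"East": "#1f77b4", "West": "#ff7f0e"}
marks = encode(sales, "revenue", "profit", "units", "region", palette)
```

Each resulting mark could then be drawn by any rendering layer; the perceptual leverage comes entirely from the attribute-to-property mapping.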
According to “Big Data Big Analytics”, Data Privacy is what a company does to keep from getting sued, whereas Data Ethics is what makes the relationship with a customer positive. Several companies have embraced self-regulation for data privacy built on seven global privacy principles:

§  Notice: informing individuals about the purpose of data collection.

§  Choice: allowing individuals to choose how their personal data can be used.

§  Consent: disclosing personal data to a third party only with the user’s consent.

§  Security: protecting personal data.

§  Integrity: assuring the reliability of personal information.

§  Access: giving individuals access to their own personal data.

§  Accountability: including mechanisms for assuring compliance.

In fact, data privacy is neither a compliance problem nor a cost-avoidance problem; it is rather an opportunity to do the right thing and build a strong relationship with the customer. 

Executives and shareholders have to realize that data is a strategic asset, and hence the need for proper data management rises. Top management has to make risk-taking decisions and needs to ask: “What information concepts do we need for our tasks?” According to “The Data Asset”, the organization has to emphasize risk mitigation, revenue optimization and cost control. In order to be successful with Big Data, an organization has to maintain transparency with strong data governance. It has to proactively avoid risks, fight fraudulent data and treat data quality as an ongoing process. Discovering bad data is more than just recognizing that there is a problem or recognizing the particular data that is at fault. As companies grow and acquire new businesses, the potential for risk multiplies. True risk mitigation is tied to the quality of data. Risks can be avoided by understanding that it is not just about staying clear of the law; proactively avoiding risk goes well beyond government compliance. The combination of data quality with robust team management reduces the risk in an organization. Another critical aspect of any company’s success is cost control. The solutions here are supply chain optimization, automated business processes and spend analysis. Having accurate and trusted data can enhance cost control efforts, and an incremental improvement in data management adds up to big savings without big expenses. In order to increase revenue and profits, companies have to study their relationships with customers. CRM (customer relationship management) helps in using data to recreate a one-on-one relationship with customers. It allows us to know who buys what products and services, as well as when they buy them and ultimately why they buy them. With the increase in solutions, the potential for failure also increases. Bad data can lead a company to assume it has more customers than it actually does. 
The impact of poor-quality data was first felt in customer information applications. Analyzing the data, improving the data and setting data controls are some of the steps to address product data quality. “The Data Asset” recommends having a 360-degree view of the customer with whom the organization interacts. Data quality, along with data integrity, helps companies aggregate information to build a master record of the customer. These kinds of measures – maintaining accurate data by building a 360-degree view of the customer and bringing in outside data to optimize revenue – help in achieving incremental ROI. Apart from cost control and revenue optimization, the other common objective of Big Data technologies and solutions is time reduction. Organizations need to interact with customers in real time, using analytics and data derived from the customer experience.
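A minimal sketch of how such a master record might be aggregated from per-system customer rows. The matching rule (a normalized email address as the key) and the merge rule (the newest non-empty value wins) are assumptions chosen for illustration; real master data management tools use far richer matching and survivorship logic.

```python
def normalize_email(email):
    """Normalize the matching key: trim whitespace, lowercase."""
    return email.strip().lower()

def build_master_records(records):
    """Fold per-system customer rows into one master record per customer.
    Rows are applied oldest-first, so newer non-empty values overwrite older ones."""
    masters = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        key = normalize_email(rec["email"])
        master = masters.setdefault(key, {"email": key})
        for field, value in rec.items():
            if field not in ("email", "updated") and value:
                master[field] = value
    return masters

# Hypothetical rows about the same customer from two source systems
crm_rows = [
    {"email": "Ann@Example.com", "name": "Ann Lee", "phone": "", "updated": 1},
    {"email": "ann@example.com ", "name": "", "phone": "555-0100", "updated": 2},
]
masters = build_master_records(crm_rows)
```

Without the normalization step, the two rows above would be counted as two customers – exactly the kind of inflated customer count the text warns about.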

It may seem as if “governance” shouldn’t be concerned with recruiting and talent development, but when we revisit the definitions that Eckerson and Fisher propose, it’s easy to see how this is a governance issue. Effective data governance demands that companies develop comprehensive recruiting and development strategies. For example, Mu Sigma, one of the world’s largest decision sciences companies, takes a different approach to creating talent. Its front-end teams draw on people with years of experience in applied math and business to find and create analytical talent, and the company started a university program to prepare people according to the needs of its own path to success. The article “Academic Analytics: Business Intelligence for Higher Education” likewise argues that creating plans for the business environment is a key attribute of success. This is exactly what the book explains about the decision-science talent needed in every organization: set up the team with data scientists having professional traits like “learning over knowing”, “agility” (the ability to cope with continuous transformation), “scale and convergence”, “multidisciplinary talent”, “innovation” and “cost effectiveness”. It shows how important the components of data are, and that they must be useful in order for the industry to be productive. Companies might consider sponsoring promising college students who agree to higher-level training in analytics. Alternatively (or additionally), companies could create comprehensive training programs, such as the Mu Sigma University at the analytics company Mu Sigma. This internal “university” has helped Mu Sigma train the talent it needs instead of relying on the competitive market for it (Minelli, Chambers, & Dhiraj, 2013, pp. 139-140). 
Implementing a successful MDM (master data management) strategy involves identifying the critical data, defining the workflows that will use the unstructured data, and using data to redefine business processes, following a highly iterative approach.

Top management has to identify the data governance stage into which its organization falls and determine how and when to move to the next stage. The four distinct stages of the data governance maturity model according to “The Data Asset” are Undisciplined, Reactive, Proactive and Governed. Determining the actions required to move from one stage to the next is a critical consideration for companies moving from the Reactive stage to the Proactive stage, because it involves a culture change and the need to identify people with both business and IT experience. In the Proactive stage, data is treated as a corporate asset. The Data Steward, who is responsible for building consensus across business units, is a new concept for many organizations. Data Stewards help the company plan the actions required to move from one stage to another. The people selected as Data Stewards should be able to work with other parts of the organization to ensure that the full scope of data serves the needs of the entire organization. Data stewardship takes a commitment from many people across the organization. Governed organizations have a unified data governance strategy with comfortable incorporation of external data and executive sponsorship, but not many organizations have yet reached the Governed stage. The ability to become a governed organization is built on the success of the proactive organization. Governed organizations can make difficult decisions like layoffs, closing down a unit or discontinuing a product line much more effectively. It is all about moving beyond data to business process automation. According to “The Data Asset”, harnessing Big Data’s potential through data governance is very important. Big data, when used wisely, can deliver enormous value to organizations, and the significance of data governance in this equation is gaining visibility. 
A recent report from the Institute for Health Technology Transformation, for instance, showed that a standardized framework for data governance is vital for healthcare organizations to leverage the power of big data. The first and most basic need is to develop a deliberately structured system for enterprise data governance. Whether we are developing a policy from scratch or improving an existing one, here are four approaches to strengthen our data governance model: 

1. Develop a data governance strategy. This should be consistent with the overall business strategy and should include guiding principles for how big data will be governed. This means deciding who owns different types of data, who can access it, and how data is used. Key issues to consider include data quality, regulatory requirements, security and privacy, and data lifecycle governance. 

2. Use a cross-functional approach. This is especially important for compliance purposes. Data and data systems often touch multiple departments, and no one individual has a complete view. A cross-functional team is best positioned to develop a comprehensive view of the organization’s big data, including controls, documentation and auditable evidence of compliance. 

3. Make decisions about data-related end-of-life issues. All parts of the data lifecycle are relevant when addressing data governance, but end-of-life issues shouldn’t be overlooked. One standard retention schedule won’t fit all needs: different types of data will have different requirements for retention periods. Organizations may also choose to archive data in order to improve application performance. 

4. Consider how technology can support data governance efforts. With big data, organizations must estimate how rapidly data volumes will grow, as well as how expensive the data will be to store. Data governance policies should define when data is moved to archiving systems, which offer less costly forms of storage, while maintaining easy access for end users and taking performance loads off of other applications. 
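The retention and archiving decisions described in items 3 and 4 can be made mechanical once a policy is written down. This is a toy sketch: the data types and day thresholds are invented for illustration, and a real policy engine would also handle legal holds and audit logging.

```python
# Hypothetical per-data-type retention policy (thresholds in days).
RETENTION_RULES = {
    "clickstream":  {"archive_after_days": 90,  "delete_after_days": 365},
    "transactions": {"archive_after_days": 365, "delete_after_days": 2555},
}

def disposition(data_type, age_days):
    """Decide a data set's lifecycle stage: keep it in primary storage,
    move it to cheaper archive storage, or delete it at end of life."""
    rule = RETENTION_RULES[data_type]
    if age_days >= rule["delete_after_days"]:
        return "delete"
    if age_days >= rule["archive_after_days"]:
        return "archive"
    return "primary"
```

Encoding the schedule per data type captures the point that one standard retention schedule won’t fit all needs, while the “archive” tier keeps performance loads off primary applications.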

Big data has extraordinary potential for helping organizations improve; however, simply having the data is insufficient. To derive the best value and minimize risks, a data governance framework is key. It is every executive’s responsibility to constructively and economically address data issues for the success of their organization. Finally, and more importantly, data quality and governance are critical to making the right decisions; they should not be considered a one-time project, and no organization can tackle both at once. All they need to do is take small, measurable steps along the way to ensure the success of their business!


The astronomical amount of data, and its ongoing expansion, is as broad and complex as the applications for it. Big Data was originally viewed as a technical challenge of mitigating high data volumes: it defies processing using traditional data warehousing and information management systems. But today organizations are asking much weightier questions, such as how to put Big Data to work to create new business value. Hence it is a challenge, but critically important, for any organization to prioritize the types of analytics it wants to do while simultaneously integrating data, technologies and other resources. To meet these challenges, organizations should invest in “strategic data management” practices – also referred to as “data governance”. By embracing the data governance mindset and viewing data as a corporate asset, organizations will have the tools necessary to overcome these challenges. Where Larry English asks companies whether they would stay in business with product quality that matched their data quality, Mehmet Orun offers an even more inspiring formulation: “If the state of your company’s data was the same level of quality as your company’s products and services, how much more profitable would your company be?” (as cited in McGilvray, 2008, p. 2). Similarly, we should ask organizations: “If you were as deliberate and strategic with big data and big analytics as you are with assembling your products or delivering your services, what sorts of returns could these initiatives deliver?” 


§  “Making Sense of Your Organization’s Big Data” – Interesting Articles

§  “Linking MDM to Big Data” – Interesting Articles

§  “7 Tips to Succeed with Big Data” – Interesting Articles

§  “Succeeding in Enterprise Data Governance” – Interesting Articles

§  “When Big Data Goes Bad” – Interesting Articles

§  “Academic Analytics: Business Intelligence for Higher Education” – Interesting Articles

§  “Big Data Big Analytics” – Michael Minelli, Michele Chambers, Ambiga Dhiraj

§  “The Data Asset” – Tony Fisher