Characteristics of Big Data Analytics

Big Data Analytics refers to the solutions, processes, and procedures that allow an organization to create, manipulate, store, and manage relatively large amounts of data in order to extract information. This book uses the term Big Data Analytics because a large amount of data (Big Data) in itself is not really useful. It is the combination of a large amount of data (Big Data) and the capability to analyze it (analytics) that can bring substantial benefits. Big Data Analytics means:

  • storing a large amount of data;
  • examining (or mining) the data;
  • extracting appropriate information; and
  • identifying hidden patterns, unknown correlations, and similar insights in support of decision-making (a minimal sketch follows this list).
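As a minimal illustration of these four steps, the following sketch stores a handful of hypothetical card transactions, mines them with pandas, extracts a simple piece of information (the share of online spending), and looks for a hidden pattern (the correlation between online and branch spending per customer). All records, column names, and figures are invented for the example.

```python
# Minimal sketch of the four steps: store, mine, extract information,
# and surface hidden patterns. Data and column names are hypothetical.
import pandas as pd

# 1. Store a (tiny) set of transactions; in practice this would be a data lake.
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "channel":     ["online", "branch", "online", "online",
                    "branch", "branch", "online", "branch"],
    "amount":      [120.0, 40.0, 300.0, 250.0, 15.0, 22.0, 180.0, 60.0],
})

# 2. Examine (mine) the data: aggregate spending per customer and channel.
per_channel = (transactions
               .groupby(["customer_id", "channel"])["amount"]
               .sum()
               .unstack(fill_value=0.0))

# 3. Extract information: overall share of online spending.
online_share = per_channel["online"].sum() / transactions["amount"].sum()
print(f"Online share of spending: {online_share:.0%}")

# 4. Look for hidden patterns: correlation between online and branch spending.
print("Correlation online vs. branch spending:",
      round(per_channel["online"].corr(per_channel["branch"]), 2))
```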

Around 15-20% of available data is in structured form, while the remaining information is available in an unstructured format (Feldman and Sanger 2007). While managing the overwhelming data flow can be challenging, financial institutions that can capture, store, search, aggregate, and possibly analyze the data can obtain real benefits such as increased productivity, improved competitive advantage, and enhanced customer experience. This value, however, does not necessarily come from simply managing Big Data Analytics; it comes from harnessing the actionable insights derived from the data. Financial institutions that can obtain objective-driven business value by applying data science to effectively mine data for customer insights, support customers, and offer new products/services will have clear competitive advantages and stay ahead of the curve in this information age.

Big Data Analytics develops from analytical technologies that have existed for years. Organizations can now use them faster, at a greater scale, and with easier access. Analytics is the discovery and communication of meaningful patterns in data. It is especially valuable in areas rich in recorded information. Analytics relies on the simultaneous application of statistics, computer programming, and operations research to quantify performance. Data visualization is particularly important for getting value from the harvested data.

These challenges are the current inspiration for much of the innovation in modern analytics information systems. They give rise to relatively new automatic analysis concepts, such as complex event processing and full-text search and analysis, and even to new ideas for presenting information in support of successful decisions.
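As a hedged illustration of what complex event processing can look like in practice, the sketch below scans a stream of hypothetical withdrawal events and raises an alert when several large withdrawals hit the same account within a short window. The event structure, thresholds, and window size are all assumptions made for the example.

```python
# Minimal complex-event-processing sketch: flag bursts of large withdrawals
# within a short time window. Event structure and thresholds are hypothetical.
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)   # look-back window
THRESHOLD = 500.0                # a "large" withdrawal
BURST_SIZE = 3                   # how many large withdrawals trigger an alert

def detect_bursts(events):
    """Yield alerts for accounts with BURST_SIZE large withdrawals in WINDOW."""
    recent = {}                  # account_id -> deque of recent timestamps
    for ts, account_id, amount in sorted(events):
        if amount < THRESHOLD:
            continue
        window = recent.setdefault(account_id, deque())
        window.append(ts)
        while window and ts - window[0] > WINDOW:
            window.popleft()
        if len(window) >= BURST_SIZE:
            yield (account_id, ts, len(window))

# Hypothetical event stream: (timestamp, account_id, withdrawal amount).
t0 = datetime(2016, 1, 4, 9, 0)
stream = [
    (t0, "ACC-1", 600.0),
    (t0 + timedelta(minutes=2), "ACC-1", 800.0),
    (t0 + timedelta(minutes=5), "ACC-1", 700.0),
    (t0 + timedelta(minutes=7), "ACC-2", 100.0),
]

for account, when, count in detect_bursts(stream):
    print(f"ALERT: {count} large withdrawals on {account} by {when:%H:%M}")
```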

Big Data Analytics operations can be processed on site. As organizations migrate to the cloud, however, so will their corporate data. Cloud-based architectures will become more important as individual entities (both devices and resources) generate continuous data streams. With cloud computing, organizations can collect, store, process, analyze, use, and report these data.

The volume, speed, and power of technology have transformed the economic environment into a sophisticated data economy. It allows for the execution of complex global transactions at the push of a button. From high-frequency trading to e-commerce to mobile telephony, computers all over the world are generating huge amounts of data. Like individuals, institutions might be facing an information overload that is limiting the promise and opportunity of technology. All of these data provide a large amount of information from more sources than ever before—from social media to e-commerce transaction records to cell phone and global positioning system (GPS) signals to an increasing number of sensors.

Because the majority of data is unstructured and requires unique expertise to understand, organize, and analyze, most of this information sits idle. The good news is that there is a growing set of Big Data Analytics solutions. They can help organizations use and monetize this valuable commodity by finding important insights into their activity, for instance by analyzing their customers’ transaction flows. In this way, organizations can support their customers in a more effective, efficient, economical, and ethical way with their offerings.
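One simple way to start extracting value from unstructured information, sketched below under the assumption that transaction descriptions arrive as free text, is keyword-based categorization. Real solutions would use proper text-mining techniques, but the underlying idea of turning raw text into analyzable categories is the same; the keywords, categories, and descriptions here are invented.

```python
# Minimal sketch: turn unstructured transaction descriptions into categories
# that can be aggregated and analyzed. Keywords and categories are invented.
from collections import Counter

CATEGORIES = {
    "travel":    ("airline", "hotel", "rail"),
    "groceries": ("supermarket", "grocery"),
    "utilities": ("electricity", "water", "telecom"),
}

def categorize(description: str) -> str:
    """Return the first category whose keywords appear in the description."""
    text = description.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"

# Hypothetical free-text descriptions as they might appear on statements.
descriptions = [
    "ACME SUPERMARKET 0441 LONDON",
    "BLUE SKY AIRLINE E-TICKET",
    "CITY WATER AND ELECTRICITY CO",
    "UNKNOWN MERCHANT 12345",
]

print(Counter(categorize(d) for d in descriptions))
```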

Research has found that Big Data Analytics holds the capability to generate profits by improving the margins around transaction flows.[1] The organization and analysis of the data can highlight flows and offer unique insights into trends, destinations, values, volumes, and fees, which can ultimately drive opportunities for organizations (Albright et al. 2010).

In today’s ever-changing economic environment, all sectors need to rethink traditional value propositions. Big Data Analytics is emerging as a cutting-edge option: an innovative way to access and visualize key information in order to be more effective, efficient, economical, and indeed ethical. By unlocking the data available in the organization, people are able to better understand opportunities for growth and cost savings and, therefore, to be better prepared for success on all fronts.

The value of Big Data Analytics is twofold: it not only provides key information on the business and the market, but also offers visibility into internal processes, supporting their improvement as the economic landscape changes. This visibility gives organizations the option to fill gaps, improve efficiencies, and ultimately make better decisions. It also helps to create customer-centric strategies and improve the overall customer experience.

As technology continues to push for faster, more interconnected organizations, Big Data Analytics will become an increasingly valuable tool. Through this untapped information, organizations will be able to understand their businesses and customers in new and insightful ways. For many organizations, using Big Data Analytics to identify trends is a very new approach. Only now are some financial institutions beginning to understand the importance of information as an asset and what this information can offer, and they are continuing to gain new insights. Organizations are talking more and more about “data monetization” (Woerner and Wixom 2015). Diving into Big Data Analytics may seem like diving into uncharted waters for some, but Big Data Analytics is also the future for financial institutions. They only need to take advantage of it in order to remain relevant and use the increasing amount of data available.

It is important to follow a correct process in storing data for Big Data Analytics:

  • Selecting data sources for analysis
  • Defining data models: key-value, graph, document
  • Analyzing the characteristics of the data
  • Improving the data quality, for instance, by eliminating redundant or duplicated data (see the sketch after this list)
  • Overseeing data storing, storage, and retrieval
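As a small example of the data-quality step, the sketch below normalizes an inconsistent field and removes exact duplicates with pandas. The records, column names, and normalization rule are hypothetical.

```python
# Minimal data-quality sketch: normalize a field and drop duplicated records.
# The records, column names, and rules are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3],
    "country":     ["IT", "IT", "it ", "UK", "UK"],
    "balance":     [1000.0, 1000.0, 250.0, 75.0, 75.0],
})

clean = (raw
         .assign(country=raw["country"].str.strip().str.upper())  # normalize values
         .drop_duplicates()                                       # remove exact duplicates
         .reset_index(drop=True))

print(clean)
```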

Several actions are important in storing large sets of data:

  • Choosing the correct data stores based on the characteristics of the data
  • Moving code to data
  • Implementing polyglot data store solutions (a toy sketch follows this list)
  • Aligning business goals to the appropriate data store
  • Integrating disparate data stores
  • Mapping data to the programming framework
  • Connecting to and extracting data from storage
  • Transforming data for processing
  • Monitoring the progress of job flows
  • Using advanced tools, such as D3.js (data-driven documents) (Zhu 2013)
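The following toy sketch illustrates the idea of a polyglot data store: records are routed to a key-value store or a document store depending on their shape. Both stores are plain in-memory dictionaries here; a real solution would sit on top of actual databases chosen for each workload and access pattern.

```python
# Toy polyglot-persistence sketch: route data to the store that fits its shape.
# Both "stores" are in-memory dictionaries; real systems would use databases
# selected for each access pattern (key-value, document, graph, ...).
from typing import Any, Dict

key_value_store: Dict[str, Any] = {}      # flat values looked up by key
document_store: Dict[str, Dict] = {}      # nested, schema-free documents

def save(key: str, value: Any) -> str:
    """Store flat values as key-value pairs and nested records as documents."""
    if isinstance(value, dict):
        document_store[key] = value
        return "document"
    key_value_store[key] = value
    return "key-value"

# Hypothetical records of different shapes.
print(save("session:42", "2016-01-04T09:00:00Z"))                 # -> key-value
print(save("customer:7", {"name": "Anna", "accounts": [1, 2]}))   # -> document
print(document_store["customer:7"]["accounts"])
```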

A certain number of questions need to be answered when dealing with Big Data Analytics:

  • Which types of solutions should be used in Big Data Analytics?
  • Where are data stored: centralized, distributed, or cloud storage?
  • Where is processing done: mainframe or distributed servers/cloud?
  • How are data stored and indexed: in high-performance schema-free databases (see the sketch after this list)?
  • What operations are performed on the data: sequential, analytic, or semantic processing?
  • What are the risks?
  • Are the right talents available, capable of choosing the right data to solve the right problem?
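To make the question about schema-free databases more concrete, the sketch below keeps records with different fields in the same collection, something a fixed relational schema would not allow without migrations. The documents themselves are invented.

```python
# Minimal sketch of schema-free storage: documents in the same collection
# may carry different fields. Records are hypothetical.
import json

collection = [
    {"customer_id": 1, "name": "Anna", "segment": "retail"},
    {"customer_id": 2, "name": "Bruno", "iban": "IT00 0000 0000 0000"},
    {"customer_id": 3, "name": "Carla", "devices": ["mobile", "web"]},
]

# Query without assuming a fixed schema: which documents mention devices?
with_devices = [doc for doc in collection if "devices" in doc]
print(json.dumps(with_devices, indent=2))
```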

Analytics 3.0

In an article in the Harvard Business Review, Tom Davenport presented a model of the development of analytics (2013):

  • Analytics 1.0 is the business intelligence that existed before Big Data Analytics. It was mainly devoted to analyzing small internal problems, since the amount of data available was limited.
  • Analytics 2.0 was a step forward thanks to the rise of Big Data Analytics. It can be used for predictive analytics in addition to historical analysis (a minimal contrast between the two is sketched after this list).
  • Analytics 3.0 is a new wave: a new resolve to apply powerful data-gathering and analysis methods to a company’s operations and to its offerings, embedding data smartness into the products and services customers buy.
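As a minimal, hedged contrast between the descriptive analytics typical of Analytics 1.0 and the predictive flavor of Analytics 2.0 and 3.0, the sketch below first summarizes a short series of monthly transaction volumes and then fits a naive linear trend to project the next month. The figures are invented and the model is deliberately simplistic.

```python
# Minimal contrast: descriptive summary (Analytics 1.0 style) versus a naive
# predictive projection (2.0/3.0 style). The monthly volumes are invented.
# Requires Python 3.10+ for statistics.linear_regression.
import statistics

monthly_volume = [120, 132, 128, 141, 150, 158]   # hypothetical transactions (thousands)

# Descriptive: what happened.
print("mean:", round(statistics.mean(monthly_volume), 1),
      "stdev:", round(statistics.stdev(monthly_volume), 1))

# Predictive (naive linear trend): what is likely to happen next month.
months = list(range(len(monthly_volume)))
slope, intercept = statistics.linear_regression(months, monthly_volume)
next_month = slope * len(monthly_volume) + intercept
print("projected next month:", round(next_month, 1))
```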

A quotation from Tom Davenport (2013) is interesting:

“The most important trait of Analytics 3.0 is that not only online companies but virtually any type of company in any industry can participate in the data economy.”

Table 4.4 reports a synthesis according to Davenport of the characteristics of each generation of analytics.

With Analytics 3.0, a new architecture was born. The technologies already existing in many large organizations are not abandoned; it is also possible to use Big Data Analytics solutions (such as Hadoop) in the cloud and as open source.

An example of the use of Analytics 3.0 in financial services is mass private financial services: a low-cost, customer-centric version of financial services that is

  • low cost, since it can exploit the lower costs of processing a large amount of data made possible by Big Data Analytics solutions; and
  • personalized to each customer, thanks to powerful Big Data Analytics.

This would require

  • recording the behavior of customers, with their consent, through their accesses, transactions, and, if available, social networks;
  • processing all these data against a “model” that might provide useful information for marketing, investment, or risk-mitigation actions;
  • suggesting or taking actions with the customers that would add value for them; and
  • reporting and getting the feedback of the customers to improve the services.
Table 4.4 Characteristics of the three generations of analytics (adapted from T. Davenport 2013)

1.0 Traditional analytics
  • Timeframe: Mid-1970s to 2000
  • Culture: Competition not on analytics
  • Type of analytics: 95% reporting, descriptive; 5% predictive, prescriptive
  • Cycle time: Months
  • Data: Internal, structured
  • Technology: Rudimentary business intelligence (BI), reporting tools; dashboards; data stored in enterprise data warehouses or marts
  • Organization: Analytical people segregated from business and ICT; back-room statisticians

2.0 Big Data
  • Timeframe: Early 2001 to 2020
  • Culture: New focus on data-based products and services
  • Type of analytics: 85% reporting, descriptive; 15% predictive, prescriptive (visual)
  • Cycle time: An insight a week
  • Data: Very large, unstructured, multisource
  • Technology: New technologies: Hadoop, commodity servers, in-memory, open source; master data management; standards appear for data quality
  • Organization: Some chief data officers appear in some advanced companies; data scientists are on the rise; talent shortage; educational programs starting

3.0 Data economy
  • Timeframe: 2021 and in the future
  • Culture: Agile method where all decisions are driven (or at least influenced) by data
  • Type of analytics: 90%+ predictive, prescriptive, automated reporting
  • Cycle time: Millions of insights per second
  • Data: Explosion of sensor data; seamless combination of internal and external data; analytics embedded in operational and decision processes; tools available at the point of decision
  • Technology: New data architectures, beyond the data warehouse; new application architectures; specific apps, mobile; data dictionaries; full data governance
  • Organization: Centralized teams, specialized functions among team members; dedicated funding; chief analytics officers; training and educational programs

Such a sequence especially suits services based on financial technologies. Mobility would also add information on the location of the customer. If the customer is in a mall, for instance, it would be possible to provide him/her with proximity information about the nearest branch.
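A minimal sketch of such a proximity feature is given below, assuming the app can read the phone’s GPS position and the institution publishes its branch coordinates; all names and coordinates are invented.

```python
# Minimal sketch: pick the nearest branch to the customer's current position.
# Coordinates and branch names are invented; the haversine formula gives a
# rough great-circle distance, which is enough for "which branch is closest".
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

branches = [
    ("Central branch", 45.4640, 9.1900),
    ("Mall branch",    45.4780, 9.2300),
]

customer = (45.4770, 9.2280)   # hypothetical GPS position from the mobile app
nearest = min(branches, key=lambda b: distance_km(*customer, b[1], b[2]))
print("Nearest branch:", nearest[0])
```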

From a privacy point of view, it would be necessary to obtain the customer’s acceptance of such tracking of his/her activity. In some cases, for instance if the customer has some funds available, he/she might really appreciate suggestions on how to invest those funds through the financial services. Accepting a suggestion sent by the financial services company to the mobile might not even require pushing a key on the smartphone, but simply a “double shake” of the device.

A similar type of function would be particularly useful in the case of mobile corporate/institutional financial services. The financial services company could send (on request) messages to the corporate treasury to alert it about the need to take an action, such as renewing a policy or adjusting the value of a line of credit. That would help the treasury properly cover corporate risks. Small and medium-sized companies would especially appreciate such services, since their managers often do not have the time to follow liquidity or the skills to optimize the management of financial services.
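A hedged sketch of what such an alerting function could look like is given below: a few rule checks over hypothetical treasury data produce “action needed” messages, such as a policy renewal reminder or a warning on a heavily drawn credit line. The data, thresholds, and rules are all invented for illustration.

```python
# Toy sketch of rule-based treasury alerts: policy renewals coming due and
# credit lines close to their limit. Data, thresholds, and rules are invented.
from datetime import date, timedelta

today = date(2016, 1, 4)

policies = [
    {"id": "POL-1", "expires": date(2016, 1, 20)},
    {"id": "POL-2", "expires": date(2016, 6, 30)},
]
credit_lines = [
    {"id": "CL-1", "limit": 100_000.0, "drawn": 92_000.0},
    {"id": "CL-2", "limit": 50_000.0, "drawn": 10_000.0},
]

def treasury_alerts():
    """Yield human-readable alerts based on simple, hypothetical rules."""
    for policy in policies:
        if policy["expires"] - today <= timedelta(days=30):
            yield f"Renew {policy['id']} (expires {policy['expires']})"
    for line in credit_lines:
        if line["drawn"] / line["limit"] > 0.9:
            yield f"{line['id']} is {line['drawn'] / line['limit']:.0%} drawn"

for alert in treasury_alerts():
    print("ALERT:", alert)
```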

[1] Banking on Big Data, Banking Technology, 3 December 2014.