
http://www.analytics-magazine.org

DRIVING BETTER BUSINESS DECISIONS

MAY/JUNE 2014
BROUGHT TO YOU BY:

SPORTS ANALYTICS
WHAT’S A NICE DEFENSE CONSULTANCY DOING IN THE SPORTS SPACE?

ALSO INSIDE:
• How to measure anything
• The big ‘V’ of big data
• Powerful decision-making

Executive Edge: Verisk Digital Services President Henna Karna on ‘going digital’

INSIDE STORY

Quantified warriors
What’s a nice defense consultancy
company such as the Perduco Group doing in the sports analytics space?
That’s the question I asked Stephen
Chambal, co-founder of Perduco, after
attending his session on “opportunities,
barriers and lessons learned” in sports
analytics at the recent INFORMS Conference on Business Analytics & Operations
Research in Boston. As Chambal notes
in his article on the same topic in this issue of Analytics, there’s a “simple” answer (sports are fun) and a “real” answer
(his company’s core capabilities and the
business opportunities the sports industry presents, combined with a couple of
chance encounters, triggered Perduco’s
strategic push into the sports arena).
As it turns out, the defense community and the sports community are not that
far apart in terms of their ultimate goals.
They’re both interested in prevailing on
the “battlefield,” whether it’s a desert in
the Middle East or a basketball court in
Madison Square Garden, and they’re
both interested in the so-called “quantified warrior” – the ability to monitor and
assess a soldier’s/professional athlete’s
condition and to understand how to
optimize their performance on their
respective battlefields. It all makes for a
fascinating story, but before I give away
too much of it, click here.
Chambal’s session was just one of
many sessions I attended at the Boston conference. From Tom Davenport’s
opening keynote talk on “Analytics 3.0”
(look for an article on that topic in a future
issue of Analytics magazine, but in the
meantime, click here to see his thoughts
on predictive analytics in this issue) to
the Oscar-esque Edelman Gala, from the
four dozen poster presentations to the
non-stop series of networking events, the
conference was first rate.
From my perspective as the editor of
Analytics as well as OR/MS Today (the
membership magazine of INFORMS),
there’s nothing more energizing than attending a conference such as the Boston
event, and I suspect that holds true for
anyone involved in the analytics community. If you couldn’t make it to Boston or
even if you did and are craving another
analytics fix, fear not. INFORMS will present its inaugural Big Data Conference on
June 22-24 in San Jose, Calif.

– PETER HORNER, EDITOR
peter.horner@mail.informs.org

OPTIMIZE YOUR BUSINESS WITH UNPRECEDENTED SPEED
Idea in a few hours. Proof of concept in a few days. Optimization app in a few weeks. Mission-critical enterprise app in a few months. Published instantly to your enterprise optimization app store.

To learn more about AIMMS Optimization Apps, visit aimms.com.
[email protected] | +1 425 458 4024

CONTENTS

DRIVING BETTER BUSINESS DECISIONS

MAY/JUNE 2014
Brought to you by

FEATURES
28

‘HOW TO MEASURE ANYTHING’
By Douglas W. Hubbard and Douglas A. Samuelson
Latest edition of book takes another look at seven arguments,
new and old, in search of the value of business ‘intangibles.’

34

POWERFUL BUSINESS DECISION-MAKING
By Alex Romanenko and Alex Artamonov
Big data is a hot topic, but harnessing its full potential can be
elusive. The case for an analytics-driven business transformation.

46

THE BIG ‘V’ OF BIG DATA
By Pramod Singh, Ritin Mathur, Arindam Mondal and
Shinjini Bhattacharya
The three keys – information infrastructure, information management
and insights – to unlocking the hidden “value” of big data.

54

ADVENTURES IN CONSULTING
By Stephen Chambal
What’s a defense consulting company doing in sports?
Core capabilities, chance encounter score a business opportunity.


New: ASP V2014 with Dimensional Modeling in Excel

Analytic Solver Platform
Easy to Use Predictive and Prescriptive Analytics

How can you get results quickly for business decisions,
without a huge budget for “enterprise analytics”
software, and months of learning time? Here’s how:
Analytic Solver Platform does it all in Microsoft Excel,
accessing data from PowerPivot and SQL databases.
Sophisticated Data Mining and Predictive Analytics
Go far beyond other statistics and forecasting add-ins
for Excel. Use classical multiple regression, exponential
smoothing, and ARIMA models, then go further with
regression trees, k-nearest neighbors, and neural
networks for prediction, discriminant analysis, logistic
regression, k-nearest neighbors, classification trees,
naïve Bayes and neural nets for classification, and
association rules for affinity (“market basket”) analysis.
Use principal components, k-means clustering, and
hierarchical clustering to simplify and cluster your data.

Help and Support to Get You Started
Analytic Solver Platform can help you learn while
getting results in business analytics, with its Guided
Mode and Constraint Wizard for optimization, and
Distribution Wizard for simulation. You’ll benefit from
User Guides, Help, 30 datasets, 90 sample models, and
new textbooks supporting Analytic Solver Platform.
Analytic Solver Platform goes further than any other
software with Active Support that alerts us when you’re
having a problem, and brings live assistance to you right
where you need it – inside Microsoft Excel.
Find Out More, Download Your Free Trial Now
Visit www.solver.com to learn more, register and
download a free trial – or email or call us today.

Simulation, Optimization and Prescriptive Analytics
Analytic Solver Platform also includes decision trees,
Monte Carlo simulation, and powerful conventional and
stochastic optimization for prescriptive analytics.

Tel 775 831 0300 • Fax 775 831 0314 • [email protected]

DRIVING BETTER BUSINESS DECISIONS

REGISTER FOR A FREE SUBSCRIPTION:
http://analytics.informs.org


DEPARTMENTS
2 Inside Story
8 Executive Edge
12 Analyze This!
16 Healthcare Analytics
20 Forum
24 INFORMS Honors
62 Predictive Analytics
66 Conference Preview
72 Five-Minute Analyst
78 Thinking Analytically
Analytics (ISSN 1938-1697) is published six times a year by the
Institute for Operations Research and the Management Sciences
(INFORMS), the largest membership society in the world dedicated
to the analytics profession. For a free subscription, register at
http://analytics.informs.org. Address other correspondence to
the editor, Peter Horner, [email protected]. The
opinions expressed in Analytics are those of the authors, and
do not necessarily reflect the opinions of INFORMS, its officers,
Lionheart Publishing Inc. or the editorial staff of Analytics.
Analytics copyright ©2014 by the Institute for Operations
Research and the Management Sciences. All rights reserved.


INFORMS BOARD OF DIRECTORS
President Stephen M. Robinson, University of Wisconsin-Madison
President-Elect L. Robin Keller, University of California, Irvine
Past President Anne G. Robinson, Verizon Wireless
Secretary Brian Denton, University of Michigan
Treasurer Nicholas G. Hall, Ohio State University
Vice President-Meetings William “Bill” Klimack, Chevron
Vice President-Publications Eric Johnson, Dartmouth College
Vice President-Sections and Societies Paul Messinger, CAP, University of Alberta
Vice President-Information Technology Bjarni Kristjansson, Maximal Software
Vice President-Practice Activities Jonathan Owen, General Motors
Vice President-International Activities Grace Lin, Institute for Information Industry
Vice President-Membership and Professional Recognition Ozlem Ergun, Georgia Tech
Vice President-Education Joel Sokol, Georgia Tech
Vice President-Marketing, Communications and Outreach E. Andrew “Andy” Boyd, University of Houston
Vice President-Chapters/Fora David Hunt, Oliver Wyman


INFORMS OFFICES
www.informs.org • Tel: 1-800-4INFORMS


Executive Director Melissa Moore

Meetings Director Laura Payne

Marketing Director Gary Bennett

Communications Director Barry List


Headquarters INFORMS (Maryland)

5521 Research Park Drive, Suite 200
Catonsville, MD 21228
Tel.: 443.757.3500
E-mail: [email protected]

ANALYTICS EDITORIAL AND ADVERTISING
Lionheart Publishing Inc., 506 Roswell Street, Suite 220, Marietta, GA 30060 USA
Tel.: 770.431.0867 • Fax: 770.432.6969

President & Advertising Sales John Llewellyn

[email protected]
Tel.: 770.431.0867, ext. 209

Editor Peter R. Horner

[email protected]
Tel.: 770.587.3172

Assistant Editor Donna Brooks
[email protected]

Art Director Jim McDonald

[email protected]
Tel.: 770.431.0867, ext. 223

Advertising Sales Sharon Baker

[email protected]
Tel.: 813.852.9942

New: ASP V2014 with Dimensional Modeling in Excel

Analytic Solver Platform
From Solver to Full-Power Business Analytics in Excel

The Excel Solver’s Big Brother Has Everything You
Need for Predictive and Prescriptive Analytics
From the developers of the Excel Solver, Analytic Solver
Platform makes the world’s best optimization software
accessible in Excel. Solve your existing models faster,
scale up to large size, and solve new kinds of problems.
From Linear Programming to Stochastic Optimization
Fast linear, quadratic and mixed-integer programming is
just the starting point in Analytic Solver Platform. Conic,
nonlinear, non-smooth and global optimization are just
the next step. Easily incorporate uncertainty and solve
with simulation optimization, stochastic programming,
and robust optimization – all at your fingertips.

Comprehensive Forecasting and Data Mining
Analytic Solver Platform samples data from Excel,
PowerPivot, and SQL databases for forecasting and data
mining, from time series methods to classification and
regression trees, neural networks and association rules.
And you can use visual data exploration, cluster analysis
and mining on your Monte Carlo simulation results.
Find Out More, Download Your Free Trial Now
Analytic Solver Platform comes with Wizards, Help, User
Guides, 90 examples, and unique Active Support that
brings live assistance to you right inside Microsoft Excel.
Visit www.solver.com to learn more, register and
download a free trial – or email or call us today.

Ultra-Fast Monte Carlo Simulation and Decision Trees
Analytic Solver Platform is also a full-power tool for
Monte Carlo simulation and decision analysis, with a
Distribution Wizard, 50 distributions, 30 statistics and
risk measures, and a wide array of charts and graphs.

Tel 775 831 0300 • Fax 775 831 0314 • [email protected]

EXECUTIVE EDGE

Digital value: Greater than
the sum of its parts
Going digital, in
simple terms, means
transitioning from
an “inside-out” to an
“outside-in” approach
to business.

BY HENNA A. KARNA


Over the past decade, corporations across industry
verticals have invested significant capital to upgrade
their infrastructure to analyze customer data collected
within a multichannel environment. And so the obvious
next question is – what now?
The answer can be derived from the combination of
capabilities, created by the investment in these newer
technologies – specifically, the digital engagement of
the customer, where customer represents both external
consumers and internal stakeholders. Whether to create
a network of external customers to drive the creation of
a dynamically optimized yet evolving set of products or
to drive efficiency in internal operations, digital presents
corporations with both challenges and opportunities.
Going digital, in simple terms, means transitioning
from an “inside-out” to an “outside-in” approach to business. With the exception of a handful of super-enterprises, most corporations are focused on “What can I offer
to customers?” and “What can I deliver to customers?”
That focus is important but traditional “inside-out” thinking. On the other hand, corporations with an “outside-in”
orientation ask questions such as: “What do customers
consciously – and subconsciously – want?” “How can I
affirm, if not expand, my control over the value chain with
new ideas and products?” “How do I redefine business
processes with an operating model that informs key answers to the aforementioned questions?”


New Solver for Office 365 Excel: Free for Your Tablet or Phone
New

With our Solver App for Office 365,
SharePoint 2013, and Excel 2013,
you have all the capabilities of the
Excel Solver on your desktop, laptop,
tablet or phone.
It works in the Excel Web App with
all popular Web browsers, and solves
your model “in the cloud” on
Windows Azure.
And it’s free and available now! Just
visit solver.com/app or the Office
Store online, or use Insert Apps for
Office in Excel 2013.

Analytic Solver Platform: Multidimensional Models with PivotTables
New

Analytic Solver Platform 2014 brings
multi-dimensional optimization
modeling to Excel. Easily create
dimensions or index sets and cubes
of computed values, using regular
Excel formulas – extended to operate
over multiple dimensions.
Use PivotTables, created in Excel or
from databases with PowerPivot, to
populate your model with data.
Easily create new PivotTables of
optimization results.

Find Out More, Download Your Free Trial Now.
Visit www.solver.com to learn more, register and
download a free trial – or email or call us today.
Frontline Solvers – The Leader in Spreadsheet Analytics

Tel 775 831 0300 • Fax 775 831 0314 • [email protected]

THE DIGITAL CUSTOMER
Technological advances have created the digital customer – one who is empowered by technology, distracted with technology and social through technology. As we move forward, communication with customers will be continual and bidirectional, capturing customers’ sentiments and built upon self-service as well as co-creation of products and services. Supported by an efficient operating model, digitally enhanced technology effectively enables value creation for the business.

At the highest level, a company would have to engage its audiences more frequently, more personally and across multiple channels. Digital engagement should mimic the personalization of face-to-face interactions, offering a collaborative environment in which customers can participate in idea generation and product development, and the organization is better able to understand and predict customers’ wants and needs.

But keep in mind, a more holistic view of digital encompasses both external expansion that is customer-centric and internal optimization that improves efficiency within the organization. Linking the two is important for a seamless, effective operating model.

Figure 1: The four quadrants of digital.

THE FOUR QUADRANTS OF DIGITAL
At the highest level, the impact of digital can be segregated into four quadrants, defined by interaction vectors (unidirectional or bidirectional) and the strategic input vectors (outside-in or inside-out). Bidirectional interaction refers to a duplex interaction

model between the customer and the corporation, effectively providing a means by
which true information exchange, not just
information transfer, can occur. Outside-in refers to the ability to extract input from
outside the corporation to drive decision-making and broader strategic direction.
Inside-out, on the other hand, refers to strategic decision-making based on a model
driven by perspectives within the corporation (primarily a supply-constrained view).
As described in the four-quadrant
view shown in Figure 1, both outside-in
and inside-out input models are relevant
consideration factors depending on the
objectives (external expansion or internal
optimization) of the corporation pursuing
a digital transformation. The highlighted
examples (customer touch points, captive
conversion, internal collaboration, performance management) provide high-level
examples of key opportunities for corporations looking to capitalize on digital.
WHERE TO BEGIN
On a practical level, executing a
digital engagement strategy should be
gradual but still involve all key facets of
the company’s infrastructure. Although
a digital platform is heavily supported
by technology and is rooted in IT, it
should be a priority for the entire business. Paramount for digital success is
a broad top-down mandate that spans
marketing, sales, service, operations,
finance and IT.
In terms of implementation, inertial
elements can slow the internal adoption of digital with business-as-usual
perspectives within the company, and
externally through an untrained customer base. To manage those constraints,
companies should adopt and institute
a gradual but focused program, concentrating initially on the interactions
most intuitively handled through a digital interface. Such early interactions
are best exploited in a service and support (e-service) model that provides a
natural problem-resolution incentive for
customers to engage. As a result of a
digital-service-first model, corporations
can obtain insightful data from a large
set of interaction types to help identify
those easiest to migrate to digital. Payment of bills is one of the most common
interaction types for this purpose.
The challenges of the present, however, don’t change the fact that digital
engagement is the blueprint for future
corporate operating models. To that end,
organizations that embrace, implement,
and refine their thinking around this imminent paradigm will be the likely standouts
in the capital markets in the years ahead.
Dr. Henna A. Karna is president of Verisk Digital
Services, the digital business unit of Verisk Analytics.


ANALYZE THIS!

Grad school desires vs.
real-world demands
The business world is
increasingly data-rich, but
one must have the ability
to sort that data out
before any of this analysis
can take place. This means
getting comfortable with
programming and data
preparation.

BY VIJAY MEHROTRA

I was recently promoted from “associate professor” to “professor.” There were two notable things
about this promotion:
• This is the first time in my life that I had ever
received a formal promotion from an existing
employer (all other job title changes have been
the result of changing employers).
• I am now officially no longer a “junior” faculty
member, a fact confirmed by the impending
arrival of my 50th birthday later this year.
Anyway, a few weeks after being notified of my promotion, I flew off to Minnesota to visit St. Olaf College,
my undergraduate alma mater. While my primary purpose in going was to attend a meeting of the school’s
Alumni Board, the real highlight was the chance to
meet with current St. Olaf students, to have a chance
to see the world through their eyes, and to offer up any
relevant wisdom gleaned from my own experience.
The first St. Olaf student I met with was a senior
math major, a bright young man who had until recently
been planning to pursue a Ph.D. in economics. He was
now set on further studies in operations research, and
he was trying to decide where to go to grad school.
I winced.
The second student was a young woman in her
junior year who was majoring in mathematics and biology with a concentration in statistics. She had seized
the opportunity to visit with me, in large part because
she had just begun to explore graduate
programs in O.R.
I tried to talk her out of it.
Bear in mind that some years ago, I
had been in the same place that these current St. Olaf students now were. Blessed
with a lot of good choices, I had chosen
to go forth to study operations research.
In fact, so too did my college classmates
Hai Chu and Karen Donohue, and I am
grateful to have had the chance to bask
in their (reflected) professional success
in operations research.
So why would I advise these youngsters not to follow in our glorious footsteps? Let’s start with some important
specifics. First of all, for both of these
students, the decision to go to graduate
school seemed to serve many purposes:
an opportunity to challenge themselves;
a chance to improve prospects for both
financially and intellectually rewarding
careers; and a socially acceptable path
with parents, peers and professors.
Also, from our conversations, it appeared that both had been initially attracted to operations research by my friend
Steve McKelvey, a professor who has
been inspiring Olaf math majors since my
own student days. Finally, it was quickly
apparent to me that the primary motivation was the chance to meaningfully apply their (current and future) skills, rather
than any particular passion for O.R. itself.

Graduate programs in operations research certainly have many virtues, and I
will always be deeply indebted to the one
that took me in. There will always be some
for whom this is a clear and obvious right
next step, students who are passionate
about the methods and hungry to learn
more about them. Yet for the generally
quantitatively strong undergraduate who
is interested in applying her technical
skills within the business world, my postcollegiate recommendations are based
on a few simple premises:
1. The business world is increasingly
data-rich, but one must have the ability
to sort that data out before any of this
analysis can take place. This means getting comfortable with programming and
data preparation, which we know typically takes up more than 50 percent of the
time on most “real-world” projects.
2. Optimization is great, but really
good answers quickly are actually better,
especially if the environment is rapidly
changing or the objective itself lacks a
well-defined functional form.
3. You often can’t optimize a system
without first predicting future demand,
and that forecasting is itself a significant
challenge.
4. You are unlikely to do any of this great
work totally on your own, so developing the
skills needed to work effectively with others
is too important to be left to chance.

With all this as background, I suggested
that the students pursue one of the following:
• An M.S. degree in analytics that focuses on
preparing students for effectively working
across the analytics project cycle, which
in addition to content found in traditional
O.R. programs also includes training in
problem discovery and framing, data capture,
preparation and analysis, predictive modeling,
business communication, teamwork and
project skills.
• A Ph.D. program in an academic area of interest
(economics, biology, physics and psychology
were all discussed), which would include
additional rigorous technical training, require
them to combine the experience of acquiring
deep domain knowledge with challenges in data
acquisition and analysis, and help them to more
deeply develop their ability to learn independently
while working in collaboration with an advisor
(and ideally as part of a research group).
The second St. Olaf student I met, who is still
more than a year from graduating, agreed to give
our conversation some serious thought. I was
pleased that she had at least listened to what I had
to say. However, for the first student, just weeks
from commencement, the die is cast. He’s going to
Cornell to study O.R.
Kids today! What can you do?
Vijay Mehrotra ([email protected]) is a professor in the
Department of Business Analytics and Information Systems at
the University of San Francisco’s School of Management. He is
also a long-time member of INFORMS.


Make optimized decisions.
Even if your data is incomplete.
Robust optimization at both the solver and modeling level.
Now part of FICO® Xpress Optimization Suite.
Missing data and the challenges of harnessing Big Data have introduced a lot of uncertainty into the process
of solving complex optimization problems.
FICO has solved that problem.
We’ve enhanced FICO® Xpress Optimization Suite with features to handle the difficulty of uncertainty introduced
by predictive analytics data. This robust optimization guarantees feasible solutions in the face of unknowns.
Still not feeling certain? Visit FICO at the 2014 INFORMS Conference: The Business of Big Data, June 22-24 in
San Jose, CA and learn how robust optimization handles your missing data challenges.

Learn more about the Xpress Optimization Suite: fico.com/xpress
© 2014 Fair Isaac Corporation. All rights reserved.

HEALTHCARE ANALYTICS

Rise of the empowered
patient consumer –
courtesy of analytics
About 44 million people in
the United States have no
health insurance and 38
million have inadequate
insurance. For a lot of
people there is no “shared
responsibility” for health
per se. Most people don’t
“own” the care of their
health; it was provided
and mostly paid for by
someone else.

BY RAJIB GHOSH

As of this writing, 7.5 million people have signed
up for their own health insurance policies via Healthcare.gov or 14 state-run health insurance exchanges.
In addition about three million people have enrolled
in state Medicaid programs. Private enrollment outside of those insurance marketplaces is also growing
and could be substantial. In other words – all signs
indicate that more and more people are apparently
taking control of their own health – the holy grail of
consumer-driven healthcare. Clearly, if we seek control over our cost of insurance, we have to be careful
about our personal health choices. Shouldn’t that be
the case anyway?
Well, that’s not how we do things in the United
States. Based on a report by the Congressional Budget Office, more than 50 percent of Americans or 156
million people were covered by employer-sponsored
insurance plans in 2013 [1]. Government-run programs such as Medicare and Medicaid cover 31 percent of the population who are poor, disabled or over
65 years old [2]. The government safety net is one of
the most prized albeit expensive possessions of the
American public.


Despite the fact that more people are taking
money out of that safety net than are putting in
and a threat that the Medicare fund will be depleted by 2037 – the American public is unwilling to do anything drastic about the safety net.
Still, that leaves quite a large number of people
who either have to buy insurance on their own
or remain uninsured. One estimate shows that
about 44 million people in the United States
have no health insurance and 38 million have
inadequate insurance. While those numbers are
huge for a developed nation, for a lot of people
there is no “shared responsibility” for health per
se. Most people don’t “own” the care of their
health; it was provided and mostly paid for by
someone else.


CHANGING WORKFORCE, CHANGING
INSURANCE COVERAGE
All of that is changing. Some of it started to
change when the availability of employer-sponsored healthcare coverage started to decline
a decade ago. According to a 2010 report, the
number of people with employer-sponsored
health insurance was down 10.6 percent from
what it was in 2000. By 2013, the decline was
even greater; the recession, job losses and
rising costs that forced some small employers
to ditch employee group insurance altogether
were all contributing factors.
Meanwhile, the American workforce has
been changing, too. For example, 20 percent
to 30 percent of workers in Fortune 100 organizations today are freelancers or “contingent
workers.” By 2020, the number is expected to
rise to 50 percent [3], and the number of people
covered under employer-sponsored health insurance will become a smaller percentage of the
overall population. More people will have to pay
for insurance on their own – from the exchange
marketplaces or otherwise. For many who will
not have to pay their total insurance bill, the cost
sharing will be higher or the coverage will be less
or even inadequate. Those that can afford it might
have to supplement their insurance with personal
policies.
WHERE IS THE ANALYTICS?
As part of the Affordable Care Act widely
known as Obamacare, the U.S. government is
trying to drive performance efficiencies and improved quality of care through providers and
in the delivery system using programs such as
value-based purchasing, readmission penalties
and meaningful use of electronic health record
systems. In response to these initiatives, providers are adopting information digitization and
healthcare analytics, mostly in the form of descriptive business intelligence tools that make
fancy post-mortem charts. Predictive analytics is
still far-fetched.
Health Leaders Media recently identified the
top three strategic drivers for providers in 2014,
which include clinical decision support and clinical performance tracking. Both require heavy use
of analytics. Payers are taking on more risks to
increase their medical-loss-ratio using analytics
that can identify patient cohorts with higher risk
exposures. But where is the consumer
in this change? Are we not supposed to
“drive” better, more efficient healthcare
for us?
Today, that drive is limited to asking
for provider cost transparencies or insurance plan shopping. Apart from the
quantified selfers, most of us are happy
with our annual physical check ups that
cost our health system $8 billion a year
according to a 2012 study analysis [4],
but that does nothing to address serious and expensive illnesses or premature mortality. Few, if any, individuals
use analytic insights to proactively know
what our current or future risk exposure
is or what behavior we should abandon
to prevent higher downstream medical
costs. Needless to say, we are not sophisticated enough to analyze our now
forbidden (by FDA) 23andMe genetic
test report to know our genetic predisposition toward future medical expenses.
A FUTURE OF EMPOWERED
CONSUMER PATIENTS
For the latter, we are yet to have
predictive analytics, which can take our
individual physiological measures and
a myriad of other factors and inform
us what we need to do to avoid out-ofpocket medical costs three to five years
downstream that won’t be covered by
our insurance plan. This brings up an
interesting idea of fusing our health insurance information with our physiological data and genetics – not under the
watchful eyes of insurance providers but
for us and only for us. Think of it as our
personal financial risk dashboard using
the powers of predictive analytics! It will
be even better if we are able to tweak
our behavior data and then see the impact in our risk dashboard. That will be
real empowerment for us as consumers.
Rajib Ghosh ([email protected]) is an
independent consultant and business advisor
with 20 years of technology experience in various
industry verticals where he had senior level
management roles in software engineering,
program management, product management
and business and strategy development. Ghosh
spent a decade in the U.S. healthcare industry
as part of a global ecosystem of medical device
manufacturers, medical software companies and
telehealth and telemedicine solution providers.
He’s held senior positions at Hill-Rom, Solta
Medical and Bosch Healthcare. His recent work
interest includes public health and the field of
IT-enabled sustainable healthcare delivery in the
United States as well as emerging nations. Follow
Ghosh on twitter @ghosh_r.
NOTES & REFERENCES
1. Avik Roy, 2010, “Obama Officials In 2010: 93
Million Americans Will Be Unable To Keep Their
Health Plans Under Obamacare,” Forbes.
2. “Income, Poverty and Health Insurance
Coverage in the United States,” 2011,
government publication.
3. Thomas Fisher, 2012, “The Contingent
Workforce and Public Decision Making.”
4. Sharon Begley, 2013, “Think preventive
medicine will save money? Think again,”
Reuters.


FORUM

Analyzing analysts:
dreamer vs. pragmatist
Analysts and EPM project
managers in all industries
face a common struggle:
acceptance of their ideas,
methods and findings
by often suspicious
work colleagues and
managers, some of
whom exhibit substantial
resistance to change.

BY GARY COKINS

Do you have two imaginary voices that are on
each of your shoulders telling you opposite messages? I do. And the voices are conflicting. One is a message of positive hope and possibilities, and the other
one is of negative discouragement.
The topic and context for each message involves
the frustratingly slow adoption rate for applying analytics and progressive enterprise performance management (EPM) methods. Examples of EPM methods
are the balanced scorecard with key performance
indicators (KPIs), channel and customer profitability
analysis, driver-based rolling financial forecasts and
lean management techniques.
I much prefer the inspiring messenger to the naysayer. Who wouldn’t? But I have
two ears, so I must listen to both voices.
The negative voice is the clear-eyed pragmatist.
The positive voice is the creative wild-eyed dreamer. Using a jail prisoner analogy, the pragmatist sees
the prison window bars as barriers while the dreamer
sees the stars in the night sky.
What I am writing about is the struggle that analysts and EPM project managers in all industries
have. It is with the acceptance of their ideas, methods
and findings by often suspicious work colleagues and
managers, some of whom exhibit substantial resistance to change. (You know the type. Their motto is,
“We don’t do it that way here.”)


BEHAVIORAL CHANGE
MANAGEMENT
Organizations seem hesitant to
adopt analytics and EPM methods. Is
it evaluation paralysis or brain freeze?
Most organizations make the mistake of believing that applying analytics and
EPM methods is 90 percent math and 10 percent organizational change management with employee behavior alteration. In reality it is the other way around
– it is more likely 5 percent math and 95
percent about people.

A problem with removing behavioral
barriers to deploy analytics and EPM methods is that almost none of us have training
or experience as organizational change
management specialists. We are not sociologists or psychologists. However, we
are learning to become like them. Our focus should be on the “why to implement”
and its motivating effects on organizations
rather than the “how to.” The challenge is
how to alter people’s attitudes.
One way to remove cultural barriers is to acknowledge a problem that all
organizations suffer from an imbalance
with how much emphasis they should
place on being smart rather than being healthy. Most organizations overemphasize trying to be smart by hiring
MBAs and management consultants
with a quest to achieve a run-it-by-the-numbers management style. These
types of organizations miss the relevance of how important it is to also be
healthy – assuring that employee morale is high and employee turnover is
low. To be healthy they also need to assure that managers and employees are
deeply involved in understanding the
leadership team’s strategic intent and
direction setting. Healthy behavior improves the likelihood of employee buyin and commitment. Analytics and EPM
methods are much more than numbers,
dials, pulleys and levers. People matter
. . . a lot.
When organizations embark upon
applying or expanding their use of analytics and EPM methods, I believe they
need two plans: (1) an implementation
plan, and (2) a communication plan. The
second plan is arguably much more important than the first.
WHAT DEFINES SUCCESS?
Overcoming barriers is no small
task. Cultural and behavioral obstacles not only include that resistance to
change but also include some co-workers who fear the consequences of knowing the truth. But driving social change in
others is achievable. It requires motivation to want to make a difference in an
organization’s performance and to provide others with insights and foresight for
them to make better decisions.
Overcoming barriers also requires influencing others to be more open-minded.
After all, one can never choose an alternative that has not even been considered.
Although I am an optimist by nature,
I am also a perpetual worrier. I cannot
shut out the pessimistic voice on my one
shoulder constantly warning me about
what can go wrong. Maybe that is OK because that voice forces me to think about
contingency plans to cope with unplanned
and unexpected events and outcomes.
I believe we all need both voices. Just
do not shut down the positive voice. It is a
corny line, but where there is a will there
is a way. Try not to only see the prison
window bars but also the stars.
Gary Cokins ([email protected]) is
the founder of Analytics-Based Performance
Management LLC, an advisory firm. A member
of INFORMS, he is an internationally recognized
expert, speaker and author in advanced cost
management and performance improvement
systems. He was previously a principal consultant
with SAS. For more of Cokins’ unique look at the
world, visit his website at www.garycokins.com.
A version of this article appeared in Information
Management.

Register Now & Save! Early registration deadline is May 23rd.

INFORMS CONFERENCE: THE BUSINESS OF BIG DATA
June 22-24, 2014 | San Jose, California

Best Practices from Experienced Companies
Analytics Media Group | Aster Data | Bell Labs | Booz Allen Hamilton | Chevron | Dell | IBM | Intel | JP Morgan Chase | Kaiser Permanente | Mayo Clinic | Merck | Opower | SAIC | SAS | UPS

Keynote Speaker
Bill Franks, Chief Analytics Officer, Teradata Corporation: “Putting Big Data to Work”

Tutorials, Case Studies Spanning These Topics
• Big data 101: how to navigate the big data ecosystem
• Lessons learned on real-world implementations
• Building and managing data science teams
• From scoping the problem to advanced analytics, visualization and supporting the decision process
• Identifying, storing, searching, cleaning the data you have
• Gaining insight from new data sources
• Selecting the right big data technology
• Ethics and privacy requirements
• Emerging technologies and trends

meetings.informs.org/bigdata2014

Conference Co-Chairs: Margery H. Connor, Chevron Corporation; Diego Klabjan, Northwestern University

INFORMS HONORS

CDC wins
INFORMS
Edelman Award
The U.S. Centers for Disease
Control and Prevention (CDC),
which collaborated with Kid
Risk, Inc. to use analytics
and operations research to combat the remaining pockets of polio around the world,
won the 2014 Franz Edelman Award for
Achievement in Operations Research and
the Management Sciences.
Following a series of judged presentations, the award was presented at the
Edelman Awards Gala held in conjunction with the INFORMS Conference on
Analytics & O.R. in Boston. INFORMS is
the premier organization for advanced analytics professionals.
Dr. Bruce Aylward, World Health Organization, assistant director-general of Polio,
Emergencies and Country Collaboration,
said, “This work has been fundamental to
so much of what’s happened in the polio
eradication program over the last few years,
and it has helped to support many of our
decisions over the last decade and to bring
the world much, much closer to one where
future generations will never know the terror of this disease.”
“Through collaborations with Kid Risk,
Inc. and other partners, CDC is helping to
identify the best strategies to further polio
eradication and achieve the endgame,”
added Dr. Mark Pallansch, director of the
Division of Viral Diseases in the National
Center for Immunization and Respiratory
Diseases at CDC.
The Franz Edelman competition attests to the contributions of analytics and
operations research in the profit and nonprofit sectors. Since its inception in 1972,
cumulative dollar benefits from Edelman
finalist projects have reached over
$213 billion.
As a spearheading partner of the
Global Polio Eradication Initiative (GPEI),
the CDC annually contributes over $100
million of its budget and significant human resources to polio eradication activities for which it maintains high standards
for developing evidence-based policies
and expectations of cost-effective use of
its resources. In 2001, the CDC launched
a collaboration with Kid Risk, Inc. to use
a range of operations research and management science tools combined with
the best available scientific evidence and
field knowledge to develop integrated analytical models for the evaluation of the
global risks, benefits, and costs of polio
eradication policy choices.
The analytical results from the collaboration significantly furthered polio eradication in many ways, including more rapid
response to outbreaks and reaffirmation
that pursuing eradication instead of control
is the “best buy” to prevent cases of paralysis and to save lives and money.
Recognition of polio eradication as a
major program in need of stable financing helped support a fundraising effort
in 2013 that raised over $4 billion from
donors to finish the job. The team foresees increased integration of operations
research and management science tools
to perform simultaneous probabilistic
and dynamic modeling for other complex
global health challenges, including other
vaccine-preventable diseases like measles and rubella.

Members of the Edelman Award-winning team from the
CDC and Kid Risk, Inc.

Along with the CDC, the other finalists competing in the 2014 Franz Edelman
Award Competition included teams from
Alliance for Paired Donation, The Energy
Authority, Grady Health System, Australia’s
NBN and Twitter.
MAYO CLINIC EARNS INFORMS PRIZE
Mayo Clinic, the innovative healthcare organization that has used analytics throughout its organization to provide
economical, quality services in an era of
ballooning medical costs, was named the
2014 winner of the INFORMS Prize. The
prize was presented at an awards gala held
in conjunction with the 2014 INFORMS
Conference on Business Analytics and
Operations Research in Boston.
“Operations research is deeply rooted
in Mayo Clinic’s culture,” says Mayo Clinic
President & CEO John Noseworthy, M.D.

Representatives from the Mayo Clinic accept the INFORMS Prize.

“These disciplines help us improve patient outcomes
and experience while controlling rising health care
costs, one of the biggest financial challenges facing
our country today.”
The INFORMS Prize recognizes effective integration of operations research into organizational
decision-making. The award is given to an organization that has repeatedly applied the principles of
O.R. in pioneering, varied, novel and lasting ways.
Mayo Clinic’s century-long history of using systems thinking, analytics and operations research
(O.R.) traces its roots back to Dr. Henry Plummer,
who developed the first integrated, paper medical
record as a platform to organize and share patient
information within a group practice of medicine. This
served as the foundation of Mayo Clinic’s culture of
applying engineering and O.R. principles.
Mayo Clinic continues to make significant investments to ensure a sophisticated advanced
analytics and O.R. infrastructure. With more than
500 practitioners of O.R. and analytics, Mayo
Clinic is able to continually leverage analytical
methods to enhance strategic planning, care process redesign, patient experience, inventory management and project management – leading to
important patient benefits as well as
financial savings.
Examples include optimization models
for patient scheduling, queuing theory for
effectively transporting patients, systems
dynamics for strategic capital allocation
planning, simulation modeling to redesign
pharmacies that reduce patient waiting and
discrete event simulation models for blood
management.
In addition to applying advanced analytics and O.R. to its business, Mayo Clinic
has made significant strides to educate
staff and disseminate what they’ve learned.
Mayo Clinic’s leadership recognizes
analytics and engineering as key contributors to the organization’s sustained
excellence, market differentiation, and
superior customer experience. Looking to
the future, senior leaders consider these
disciplines to be vital in addressing the
formidable challenges in healthcare today and tomorrow.
Past recipients of the award include
Intel, UPS, HP, IBM, Ford, Procter
& Gamble and GE Research.

Stand Out.
Put yourself in a lucrative new career.
Apply now for a master’s degree in business
analytics or supply chain management.
• Intensive nine month programs
• World-renowned faculty
• Experiential projects with industry clients
• Personalized professional development

www.leeds.colorado.edu/ms
303-492-8397
[email protected]


VALUE OF BUSINESS INTANGIBLES

Explaining ‘How to
Measure Anything’
Latest edition of book takes another look at seven
arguments, new and old.

BY DOUGLAS W. HUBBARD
AND DOUGLAS A. SAMUELSON

Analytics professionals and
decision-makers are often
stymied by the lack of good
metrics on which to base decisions. But everything that matters has
observable consequences and, with a bit
of (often trivial) math, these observations
provide the grounds for reducing uncertainty. Even imperfect information has a
computable value for decisions.
These ideas were summarized in the
book “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business”
[Hubbard, 2007, 2010, 2014], written by
one of the authors of this article. With
65,000 copies of the book sold in five languages, the message seems to strike a
chord. The client list of the author’s firm,
Hubbard Decision Research (HDR), and
the thousands of individuals who have
registered on the book’s website indicate
a diverse audience. They include engineers, human resources, software developers, information-security specialists,
scientists from many fields, managers in
many industries, actuaries and teachers. It
appears that the challenge
of measuring what – at
first – appear to be “intangible” is common for many
analysts and managers in
organizations of all types.
A third edition has just
been released, with an accompanying workbook to
facilitate classroom teaching and self-study. The
third edition also allowed
the author to include cases from new clients and to
respond to the most common challenges sent in by readers in the seven years
since the first edition. As in the two earlier
editions, readers learn how to frame the
measurement problem and how to avoid
measuring the wrong things, and they see
the value of relying on their quantitative
models over pure intuition.
However, even for the most fervent
advocates of quantitative methods among
our clients and readers, we find that they
can easily be bogged down by some of
the same obstacles as the skeptics of
quantitative methods. Even though we
make what seems to us to be a strong
argument for the correct way to approach
these issues and even though clients say
they “conceptually” agree with the argument, they sometimes still seem to repeat,
unknowingly, some of the same errors.

What follows are seven
areas where we build on
the message of the previous edition by adding
new cases, new research
and new responses to the
challenges we continue
to observe among our
readers and clients.
1. It’s still true, anything
can be measured.
We haven’t found a
real “immeasurable” yet,
although many things initially appear to
be. In the past several years, HDR has
developed measures of the risk of a mine
flooding, drought resilience in the Horn of
Africa, the market for new laboratory devices, the risks of cyberattacks and the
value of industry standards, to name a
few. The other author of this article (Samuelson) measured the asset value of information technology [Samuelson, 2001]
and the value of deterrence in security
situations [unpublished]. In each of these
cases something was perceived to be virtually impossible to measure and, yet, the
authors were able to show that we can
use informative observations and simple,
established mathematical approaches to
reduce uncertainty enough to make decisions. As in earlier editions, the book explains the three reasons anything is ever
perceived to be immeasurable, and why all three
are mistaken.
2. Do the math.
A key point in every edition of the book was
that we measure to feed quantitative decision
models, and that even naïve quantitative models
easily outperform human experts in a variety of estimation and decision problems. In a meta-study of
150 studies comparing expert judgment to statistical models, the models clearly outperformed the
experts in 144 of the cases [Meehl, 1986]. More
and more research confirms this. The third edition
adds the findings of Philip Tetlock’s giant undertaking to track more than 82,000 forecasts of 284
experts over a 20-year period. From this, Tetlock
could confidently state, “It is impossible to find any
domain in which humans clearly outperformed
crude extrapolation algorithms, less still sophisticated statistical ones” [Tetlock, 2006]. The book
reviews additional research to show that, unless
we do the math, most people, even statistically
trained experts, are susceptible to common inference errors.
3. Just about everyone can be trained to assess odds like a pro.
Almost everyone can be trained to be an expert “probability estimator.” Building on the work of
others in decision psychology [Lichtenstein and B.
Fischhoff, 1980], HDR started providing “calibrated
probability assessment” training in the mid-1990s.
The third edition included data from more than 900
people calibrated by HDR. The data consistently
shows that virtually everyone is extremely
overconfident before the training (e.g.,
over a large number of trials, when they
say they are 90 percent certain, they may
have less than a 60 percent chance of
being correct). But HDR also found that
about 80 percent of individuals can be
trained to be nearly perfectly calibrated
(they are right just as often as they expect to be). In other words, they can be
trained in about half a day to be as good
as a bookie at putting odds on uncertain
events. This skill becomes critical in the
process of quantifying someone’s current
uncertainty about a decision.
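To make the idea concrete, here is a minimal sketch in Python (with invented questions and answers, not HDR’s training materials) of how a batch of 90 percent confidence intervals could be scored; a calibrated estimator’s intervals should contain the true value roughly 90 percent of the time.

# Sketch: scoring a calibration exercise for 90% confidence intervals.
# The (low, high, true_value) triples below are illustrative placeholders.
def interval_hit_rate(responses):
    hits = sum(1 for low, high, truth in responses if low <= truth <= high)
    return hits / len(responses)

responses = [
    (1900, 1950, 1903),  # e.g., year of the Wright brothers' first flight
    (300, 500, 383), (20, 60, 42), (1000, 3000, 1609), (5, 15, 8),
    (100, 200, 120), (10, 30, 29), (1800, 1900, 1869), (50, 90, 71), (2, 10, 11),
]
rate = interval_hit_rate(responses)
print(f"Intervals contained the true value {rate:.0%} of the time (target: 90%).")
# A rate well below 90% over many such trials indicates overconfidence;
# calibration training aims to close that gap.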
4. Calculating information values
avoids “the measurement inversion.”
A defined decision should always be
the objective of measurement. Uncertain variables in such a decision have a
computable expected value of information
(EVI); that is, what is it worth if we had
less uncertainty about this? When HDR
compared the EVI to clients’ past measurement habits, virtually always what got
measured and what needed to be measured were very different things. With the
third edition, HDR has conducted more
than 80 major decision analyses, and the
results are consistent with earlier findings:
This phenomenon appears to pervade
every industry and profession from software development to pharmaceuticals,
real estate to military logistics, and environmental policy to technology startups. It
appears that the intuition managers follow
to determine what to measure routinely
leads them astray; they tend not to measure the very things for which they have
the poorest information and would therefore benefit most from more data. Hubbard calls this practice “the measurement
inversion,” and it appears that the best
guarantee to avoid this problem is simply
to know the information values of uncertainties relevant to a decision.
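As an illustration of the computation (a toy go/no-go decision with invented figures, not one of HDR’s client analyses), the expected value of perfect information can be estimated by simple Monte Carlo: compare the value of deciding under current uncertainty with the value of deciding after the uncertainty is resolved.

# Sketch: expected value of perfect information (EVPI) for a go/no-go decision.
# All figures are invented for illustration; the method is standard Monte Carlo.
import random

random.seed(0)
N = 100_000
INVESTMENT = 1_000_000  # cost of launching the project (illustrative)

def simulate_payoff():
    # Uncertain annual benefit, normally distributed (illustrative parameters).
    annual_benefit = random.gauss(mu=400_000, sigma=250_000)
    return 3 * annual_benefit - INVESTMENT  # three-year horizon, no discounting

samples = [simulate_payoff() for _ in range(N)]

# Decision under current uncertainty: launch only if the expected payoff is positive.
expected_payoff = sum(samples) / N
value_now = max(expected_payoff, 0.0)

# With perfect information we would launch only in the favorable scenarios.
value_with_perfect_info = sum(max(s, 0.0) for s in samples) / N

evpi = value_with_perfect_info - value_now
print(f"Expected payoff if launched: {expected_payoff:,.0f}")
print(f"EVPI, an upper bound on what any measurement of the benefit is worth: {evpi:,.0f}")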
5. A philosophical dilemma: Does
probability describe the object of
observation or the observer?
When someone says, “but how do I
know what the exact probability is?” they
are implicitly adopting a particular definition
of the word “probability.” Since the author
observed the challenges some readers
were having with this issue, the newest
edition of “How to Measure Anything” expands more on it. We generally take a
Bayesian position on the interpretation of
probability – that is, probability is used to
quantify the uncertainty of an observer, not
a state of the thing being observed. This
stands in contrast to the “frequentist” point
of view, which treats a probability as a kind
of idealized frequency of occurrence in
some objective system. Somewhat ironically, the validity of applying subjective
probabilities to uncertain outcomes has
been tested with frequentist methods.
That is, extremely large trials have been
conducted where individuals’ probabilities
were compared to observed outcomes. As
mentioned earlier, the authors and other
researchers have verified that people who
are trained as “calibrated probability assessors” can repeatedly assign probabilities that, after sufficient trials, align with the
observed frequencies. Since probability is
your state of uncertainty, and since you can
be calibrated, you can always state a probability – in the Bayesian sense.
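A minimal sketch of that Bayesian reading (an illustrative beta-binomial update with made-up counts, not an example from the book): the probability belongs to the observer and moves as evidence accumulates.

# Sketch: probability as an observer's uncertainty, updated by evidence.
# Beta-binomial update for the chance that a process yields a "success."
def update_beta(alpha, beta, successes, failures):
    # Posterior Beta(alpha, beta) parameters after new observations.
    return alpha + successes, beta + failures

alpha, beta = 1.0, 1.0  # uniform prior: the observer starts maximally uncertain
print(f"Prior mean estimate: {alpha / (alpha + beta):.2f}")

alpha, beta = update_beta(alpha, beta, successes=4, failures=1)
print(f"After 5 observations (4 successes): {alpha / (alpha + beta):.2f}")

alpha, beta = update_beta(alpha, beta, successes=38, failures=57)
print(f"After 100 observations in total: {alpha / (alpha + beta):.2f}")
# Two observers with different priors converge as shared evidence accumulates;
# the probability describes their uncertainty, not a property of the thing observed.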
6. Statistical significance doesn’t mean
what you think, and what it does mean
you probably don’t need.
Another issue the author was observing that was getting in the way of useful
measurements was that there was a widely
held, but very vague, understanding of the
concept of “statistical significance.” The
new edition addresses pervasive misunderstandings of this idea and then makes
the case that, even when it is understood,
it isn’t directly relevant to most real decision-making problems. The book contains
examples in which just five sample observations, or in some cases even just one observation, substantially reduce uncertainty.
We have had clients who looked at a small
sample and – without attempting any math
– questioned the statistical significance of
the sample and the results. But sample
size alone is not sufficient to determine
statistical significance. Nor does statistical
significance mean the chance that a claim
is true, nor that if we fall short of statistical significance we have learned nothing.
The newest edition argues that the entire
concept is not necessary when even small
reductions in uncertainty can have significant economic value.
7. You need less data than you think,
and you have more data than you think.
A client or reader who says, “I would
like to measure this but we just don’t have
enough data” is very likely making a series
of erroneous assumptions. As in the previous point on statistical significance, managers may seriously underestimate how much
uncertainty reduction they get from a small
amount of data. In fact, we have never seen anyone who made this claim actually calculate the uncertainty reduction from a given set of data and compute its value to the decision, so as to establish that the reduction was worthless.
Managers also underestimate how much
data they really have. One example of this,
discussed in the third edition of the book,
is the “uniqueness fallacy.” This is the tendency to believe that only highly similar if
not identical examples are informative.
The latest edition includes cases where
experts insisted that since each situation is
unique, they cannot extrapolate from historical data. Then – and without a hint of
irony – they will claim that therefore they
must rely on their experience. Of course,
as the book argues, expertise and science are both based on past observations
– one of these with much more selective
recall and tendency for flawed inferences
than the other. Managers make just such
a mistake whenever they say that they
can’t make estimates about implementing
a new technology because it is so unique
– even though they have a long history of
implementing new technologies. Using
that same logic, your insurance company
couldn’t possibly compute a life insurance
premium for you because you’re unique
and because you haven’t died yet. In fact,
insurance actuaries know how to extrapolate from larger, more heterogeneous
populations.
The third edition also expands on developments in how big data, social media,
mobile phones and personal measurement devices are making the “we don’t
have enough data” excuse much harder
to justify.
SUMMARY
You can, in fact, measure anything,
in our view, but doing so is sometimes a
challenge even for those who are convinced the claim is true. We simply need
to recognize that the perceived challenge
results from some of the same old, entrenched misconceptions. Your problem is
most likely not as unusual as you think;
there are sources of information you can
use, if you think creatively about how to
apply them; calibrated experts can make
good estimates of their uncertainty about
the data points they provide; and calculating the expected value of information
can focus you on collecting the most useful additional data, not wasting effort and
resources on data that won’t help much.
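The sketch below shows, in miniature, what calculating the expected value of information looks like for a simple go/no-go decision (all monetary figures are invented for illustration): the expected value of perfect information is the average amount by which knowing the uncertain benefit in advance would improve the decision, and it puts an upper bound on what any measurement is worth.

# Toy expected-value-of-perfect-information (EVPI) calculation for a
# go/no-go decision. All figures are invented for illustration.
import random

random.seed(7)
cost = 1_000_000                       # cost of doing the project
draws = [random.normalvariate(1_200_000, 600_000) for _ in range(100_000)]  # uncertain benefit

ev_go = sum(b - cost for b in draws) / len(draws)   # expected payoff if we commit now
ev_no_go = 0.0
best_now = max(ev_go, ev_no_go)

# With perfect information we would only proceed when the benefit exceeds the cost.
ev_perfect = sum(max(b - cost, 0.0) for b in draws) / len(draws)

print(f"expected payoff acting now:     {best_now:12,.0f}")
print(f"expected payoff with foresight: {ev_perfect:12,.0f}")
print(f"EVPI (upper bound on what any measurement is worth): {ev_perfect - best_now:,.0f}")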
Douglas W. Hubbard (dwhubbard@
hubbardresearch.com) is president of Hubbard
Decision Research in Glen Ellyn, Ill., and an
internationally recognized expert in measurement
and decision analysis. Douglas A. Samuelson
([email protected]), D.Sc., is president
and chief scientist of InfoLogix, Inc., a consulting
and R&D company in Annandale, Va., and a
contributing editor of OR/MS Today and Analytics
magazines. He is a longtime member of INFORMS.

NOTES & REFERENCES
1. Douglas W. Hubbard, 2007, “How to Measure
Anything: Finding the Value of ‘Intangibles’ in
Business,” Wiley; third edition, 2014.
2. S. Lichtenstein and B. Fischhoff, 1980, “Training
for Calibration,” Organizational Behavior and Human
Performance, Vol. 26, No. 2, pp.149-171.
3. Paul Meehl, 1986, “Causes and Effects of My
Disturbing Little Book,” Journal of Personality
Assessment, Vol. 50, pp. 370-375.
4. Douglas A. Samuelson, 2001, “Information
Technology Benefits Assessment,” Encyclopedia of
Operations Research and the Management Sciences,
Second Edition, Springer. (A revised version also
appears in the third edition, 2013.)
5. Philip E. Tetlock, 2006, “Expert Political Judgment:
How Good Is It? How Can We Know?” Princeton,
N.J.: Princeton University Press.

BIG DATA

Using analytics to make powerful business decisions
Big data is a hot topic, but harnessing its full
potential can be elusive.

BY ALEX ROMANENKO AND ALEX ARTAMONOV
Big data is generating a powerful buzz. For firms that
know how to harness it, big
data can offer a significant
competitive advantage. However, much
of the ongoing hype has been focused
on gaining insights from these vast
amounts of accumulated information
while a more intriguing question lingers
outside of the spotlight: How can these
insights be translated into powerful business decisions?

So far, big data alone has not delivered anything near its touted
value, primarily because there is a disconnect between the vast volume of data and
the managers who make and implement
business decisions. Analytics can bridge
this gap by applying algorithms, generating and presenting recommendations for
optimal, practical and achievable business decisions in a user-friendly format.
Operations research – the scientific
discipline of using analytical methods to
make better decisions – encapsulates
these analytical techniques, applicable
at all stages of a company’s operations,
to help improve overall profitability given
particular business objectives and constraints. In turn, a variety of user-friendly
visualization tools and techniques can
help management focus on their most
important key performance indicators
(KPIs).
This article explores some of the
most common applications for analytical
and visualization techniques and highlights the benefits that can be achieved.
THE POWER OF OPTIMAL DECISIONS
Analytical techniques allow us to understand and stimulate demand, develop
an efficient production plan, effectively
source and allocate production resources, and lower distribution costs. Across
all industries, many companies are excelling at applying these techniques,
recognizing them as necessary to maintain a competitive advantage. Analytics
can have a sizable impact across all areas of operations (see Figure 1).
SALES AND MARKETING
Demand forecasting. Being customer-oriented and demand-driven
are modern business prerequisites.
Although demand sensing and predicting future behavior are crucial activities
that directly influence sales, required
inventory levels and customer service,
many companies still use the wrong tools, including spreadsheets and black-box enterprise
resource planning (ERP) algorithms, which
are not necessarily fine-tuned for individual
SKUs and may be especially ill-suited for slow-moving items with no sales in some periods.
Forecasting is also often ignored at the point-of-sale level, which is harder to do but can be
used to improve distribution-center forecasts
and cross-department collaboration.
Choosing the right time-series models, tightening modeling parameters using mathematical
optimization and adjusting processes to become
demand-driven can result in substantial operational improvements. More accurate forecasting, supported by collaborative silo-penetrating
processes, can reduce working capital up to 20
percent and reduce out-of-stock events by up to
6 percent.
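As one example of a model better suited to slow-moving items than a generic spreadsheet forecast, the sketch below implements Croston's method for intermittent demand in plain Python (a standard textbook technique, not any firm's proprietary model; the demand history is invented).

# Croston's method for intermittent (slow-moving) demand, alpha-smoothed.
# Demand history is invented for illustration.
def croston(demand, alpha=0.1):
    size_hat = None      # smoothed size of nonzero demands
    interval_hat = None  # smoothed interval between nonzero demands
    periods_since = 0
    for d in demand:
        periods_since += 1
        if d > 0:
            if size_hat is None:                 # initialize on the first nonzero demand
                size_hat, interval_hat = d, periods_since
            else:
                size_hat += alpha * (d - size_hat)
                interval_hat += alpha * (periods_since - interval_hat)
            periods_since = 0
    if size_hat is None:
        return 0.0
    return size_hat / interval_hat               # expected demand per period

history = [0, 0, 5, 0, 0, 0, 7, 0, 4, 0, 0, 6, 0, 0, 0, 5]
print(f"forecast demand per period: {croston(history):.2f}")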
Marketing optimization. Demand can be
stimulated by driving up sales with brand-recognition campaigns or by promoting individual
goods and services. Sometimes, these promotional campaigns are either too broad or poorly timed and very often offer higher discounts
than necessary to achieve extra sales volumes.
Marketing optimization approaches maximize
the effectiveness of these campaigns within
marketing budget constraints. Alternatively,
they can inform decision-makers about the right
budget level to achieve a certain sales volume.
These techniques routinely improve the marketing budget by 10 percent while still achieving the business objectives.

Optimal pricing. Once the desired
level of demand is attracted, it can be
further managed through optimal pricing.
Optimal pricing balances both margin and
volume so that transaction profitability is
maximized amid business constraints
such as production and distribution limitations. Determining the optimal pricing
level requires a good understanding and
quantification of the underlying demand
for goods and services and of the profitability of any given transaction. At the
portfolio level, optimal pricing, implemented based on multivariate constraint
optimization techniques, allows driving
some segments for volume and others for
margin growth, aligned with the business
strategy. Profitability improvements as a
result of applying these techniques can
reach 2 percent to 5 percent of revenue.
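A deliberately simplified sketch of the underlying calculation follows (not a production pricing model): it searches for the profit-maximizing price under an assumed constant-elasticity demand curve and a capacity constraint.

# Toy price optimization: maximize (price - cost) * demand(price) subject to
# a supply/capacity cap. Demand curve and all numbers are assumptions.
unit_cost = 40.0
capacity = 900.0                      # maximum units we can produce and ship
base_demand, base_price, elasticity = 1000.0, 50.0, -2.2

def demand(price):
    return base_demand * (price / base_price) ** elasticity

best_price, best_profit = None, float("-inf")
for cents in range(4100, 12000):      # search prices from $41.00 to $119.99
    p = cents / 100.0
    q = min(demand(p), capacity)      # cannot sell more than we can supply
    profit = (p - unit_cost) * q
    if profit > best_profit:
        best_price, best_profit = p, profit

print(f"optimal price ~ ${best_price:.2f}, expected profit ~ ${best_profit:,.0f}")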
Once demand is understood and the
right level to maximize the profitability of
sales transactions is attracted, it is time
to analyze how demand triggers other
activities in the company. In traditional
systems, demand only affects decisions
at the nearest stock locations and their
replenishment through outbound logistics. However, demand signals from the
point of sale can also be taken into account at upstream stages of the supply
chain to more accurately decide how to
allocate stock across the system, which
needs to be modeled holistically.
SUPPLY CHAIN
Strategic network. Whether it is after
inorganic growth, mergers and acquisitions, moving sourcing to low-cost countries or re-sourcing transport providers,
footprint and flow path restructuring is a
crucial activity for supply chain managers. The most common optimization applications are establishing where to get
raw materials, what to produce where,
how much and where to store it, who to
deliver to, and what assets are required
across the whole network. However,
real life presents interesting modeling
and implementation challenges.
Examples of these include convincing stakeholders of the need to holistically optimize the end-to-end supply chain,
considerations for production scheduling, demand sensing across all network
tiers, non-linear costs for warehousing
and transport, non-linear relationships
between the quality of raw materials
and the quality of finished goods, the integration of less quantifiable and less well-known elements such as competition into models, and the dual objectives of costs
and CO2 emissions. Typical network optimization projects save 5 percent to 10
percent, but higher benefits – up to 20
percent – are not unprecedented.
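At its core, such a network model is a cost-minimizing flow problem. The sketch below solves a tiny two-plant, three-warehouse version as a linear program with scipy (capacities, demands and lane costs are invented; real studies add the nonlinearities and additional tiers described above).

# Tiny strategic-network example: ship from 2 plants to 3 warehouses at
# minimum cost. All capacities, demands and lane costs are invented.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],      # plant A -> warehouses 1..3
                 [5.0, 4.0, 7.0]])     # plant B -> warehouses 1..3
supply = [500, 600]                    # plant capacities
demand = [300, 400, 350]               # warehouse requirements

n_p, n_w = cost.shape
c = cost.flatten()                     # decision variables x[p, w], row-major

# Capacity: shipments out of each plant must not exceed its supply.
A_ub = np.zeros((n_p, n_p * n_w))
for p in range(n_p):
    A_ub[p, p * n_w:(p + 1) * n_w] = 1.0

# Demand: shipments into each warehouse must equal its requirement.
A_eq = np.zeros((n_w, n_p * n_w))
for w in range(n_w):
    A_eq[w, w::n_w] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=[(0, None)] * (n_p * n_w), method="highs")
print("minimum transport cost:", res.fun)
print("shipment plan:\n", res.x.reshape(n_p, n_w))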
Supply chain operations. Supply
chains are inherently dynamic because
of the uncertainties of customer demand,
lead times and other unforeseen events.
Simulation is the best way to model dynamic operations to improve business
policies that balance customer service
levels, multi-echelon inventories, and
the use of transport and production assets while keeping costs under control.
Because company operations is an area
where the devil is in the details and multiple trade-offs exist, much effort goes
into modeling to capture this complexity
at the required level of detail. Benefits include overall service-level improvements
and 20 percent to 30 percent inventory
reductions.
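A minimal version of such a simulation is sketched below for a single item at a single location under a reorder-point policy (demand, lead time and policy parameters are invented): it estimates the fill rate and average inventory that a given policy would deliver.

# Toy inventory simulation under uncertain demand: reorder-point / order-up-to
# policy for one item at one location. All parameters are invented.
import random

random.seed(3)
reorder_point, order_up_to = 60, 140
lead_time = 3                                  # periods from order to receipt
on_hand, pipeline = 100, []                    # pipeline holds (arrival_period, qty)

served = demanded = 0
inventory_trace = []
for t in range(2000):
    on_hand += sum(q for (arr, q) in pipeline if arr == t)   # receive arriving orders
    pipeline = [(arr, q) for (arr, q) in pipeline if arr != t]
    d = max(0, int(random.normalvariate(20, 8)))             # this period's demand
    sold = min(d, on_hand)
    served += sold
    demanded += d
    on_hand -= sold
    position = on_hand + sum(q for (_, q) in pipeline)
    if position <= reorder_point:                            # place a replenishment order
        pipeline.append((t + lead_time, order_up_to - position))
    inventory_trace.append(on_hand)

print(f"fill rate: {served / demanded:.1%}, average on-hand: {sum(inventory_trace) / len(inventory_trace):.0f}")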
Holistic supply chain. When overhauling the supply chain, strategic and
lower-detail operations designs are both
beneficial because they leave no stone
unturned. Operational design also helps
to show the customer how a redesigned
network will work in real life. Although
there are many ways to make the optimization and simulation methods work together, a model with two feedback loops
is ideal (see Figure 2). Optimized flow
path design is fed into a dynamic simulation model, which is then tightened with a
simulation-optimization loop. If an adjustment is needed to the high-level network
W W W. I N F O R M S . O R G

design, the outcome from operational
testing is passed back into strategic
optimization.
Once the structural backbone of the
supply chain and product flow is modeled, the next step is to improve operational complexity and distribution.
OPERATIONS
Production scheduling. In many
production processes, setup times are
conditional on the sequence in which
operations are performed on a single
machine. Batch size is therefore a crucial decision because it determines how
often changeovers need to be done
and determines product availability in
the warehouse, which affects dispatch
to customers. Scheduling problems
become even more complex when the
same job can be performed on different machines with varying degrees of
efficiency. Using complex combinatorial
math optimization with column generation to optimize production schedules,
product throughput and asset utilization
can be improved by up to 5 percent.
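On a toy scale, the sequencing problem looks like the sketch below, which enumerates job orders against an invented sequence-dependent setup-time matrix; industrial schedulers replace the enumeration with methods such as the column generation mentioned above.

# Sequence-dependent setup times: find the job order that minimizes total
# changeover time on one machine. The setup matrix is invented; brute force
# is only sensible for a handful of jobs.
from itertools import permutations

jobs = ["A", "B", "C", "D", "E"]
setup = {                       # setup[i][j] = changeover time from product i to j
    "A": {"A": 0, "B": 4, "C": 9, "D": 3, "E": 7},
    "B": {"A": 5, "B": 0, "C": 2, "D": 8, "E": 6},
    "C": {"A": 9, "B": 3, "C": 0, "D": 4, "E": 2},
    "D": {"A": 2, "B": 7, "C": 5, "D": 0, "E": 3},
    "E": {"A": 6, "B": 4, "C": 3, "D": 5, "E": 0},
}

def total_setup(order):
    return sum(setup[a][b] for a, b in zip(order, order[1:]))

best = min(permutations(jobs), key=total_setup)
print("best sequence:", " -> ".join(best), "| total setup time:", total_setup(best))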
Site design. Simulation is again an
ideal tool when production, transport or
other capabilities need to be designed
or redesigned to support operational
changes, including increased throughput, alternative job-shop configurations
and production scheduling. Flexible and
detailed models can be built to simulate many scenarios to determine the
most suitable configurations, especially
if optimization add-ons are used. For
example, if production capacity will increase significantly, a simulation can
quickly find the most feasible logistics
and warehouse layouts to cope with this
expansion.
Asset management and failure
prediction. Modern operations are characterized by a vast and complex asset
base. When an asset fails, the operating
routine can be significantly disrupted,
which can deal a big blow to revenue.
Methods that can predict an asset failure and then plan and schedule the corresponding maintenance can lower the
risk of failure and improve the return on
assets. Approaches that use predictive
analytics can lower maintenance and repair costs by 10 percent.
Because production and operations
decisions lead directly to dispatching
goods and services to points of sale and
end customers, distribution is the next
area to consider.
DISTRIBUTION
Transport routing. The traveling
salesman problem is one of the most
notorious optimization tasks when a vehicle needs to visit numerous geographically distributed locations. However,
optimization can help plan this task.
Several constraints need to be taken
into account to reflect operational reality – including terminal handling capacities, demand availabilities and sequence
priorities – to represent operations with
terminal networks that use less-than-full-truckload shipments. Within a few hours,
a huge number of alternatives can be
evaluated to identify the best schedule,
resulting in asset utilization improvements of 2 percent to 5 percent.
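The sketch below shows the skeleton of such a routing calculation using a plain nearest-neighbor heuristic on invented coordinates; production engines layer on the capacity, sequence and terminal constraints described above.

# Nearest-neighbor heuristic for a single-vehicle routing (TSP-style) problem.
# Stop coordinates are invented; real engines handle many more constraints.
import math

stops = {"depot": (0, 0), "s1": (2, 9), "s2": (7, 3), "s3": (5, 8), "s4": (1, 4), "s5": (8, 7)}

def dist(a, b):
    (x1, y1), (x2, y2) = stops[a], stops[b]
    return math.hypot(x1 - x2, y1 - y2)

route, remaining = ["depot"], set(stops) - {"depot"}
while remaining:
    nxt = min(remaining, key=lambda s: dist(route[-1], s))   # go to the closest unvisited stop
    route.append(nxt)
    remaining.remove(nxt)
route.append("depot")                                        # return to the depot

length = sum(dist(a, b) for a, b in zip(route, route[1:]))
print(" -> ".join(route), f"| total distance: {length:.1f}")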
Transport loading. Transport loading is another interesting application of
math optimization that belongs to the
family of knapsack problems, where
different loading setups are possible.
This gets even trickier if every load
is unique; for example, if the problem
needs to be solved on a daily basis
because customer orders change or
if loading sequence matters because
some items can be transported on top of
others but some cannot. Typical project
benefits are 2 percent to 4 percent of increased vehicle space usage.
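In its simplest form, the loading decision is the textbook 0-1 knapsack problem sketched below (weights, values and the capacity are invented; real models add dimensions, stacking rules and daily re-solves).

# 0-1 knapsack by dynamic programming: choose shipments to load so that
# value is maximized without exceeding the vehicle's weight capacity.
def knapsack(weights, values, capacity):
    best = [0] * (capacity + 1)               # best[c] = max value within capacity c
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):  # iterate downward so each item is used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

weights = [23, 31, 29, 44, 53, 38, 63, 85, 89, 82]   # kg, invented
values = [92, 57, 49, 68, 60, 43, 67, 84, 87, 72]    # shipment priorities, invented
print("best load value:", knapsack(weights, values, capacity=165))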
Many companies’ operations are
executed by external service providers,
and this is where efficient procurement
becomes essential. Procurement cuts
across all business functions and, if executed well, can significantly improve the
bottom line.
PROCUREMENT
Sourcing. For many sourcing events,
optimization can help match expressive
or non-standard offers with the business
requirements. Potential suppliers often
come up with volume or package discounts, step-change pricing, alternative
offers, capacity constraints or other ways
to showcase their strengths. However,
these complex offers cannot be taken at
face value, and optimization is required to
assemble the puzzle pieces into a coherent picture that covers business requirements and minimizes purchasing costs.
Another benefit of this approach is that
a sensitivity analysis can be used to estimate the costs of business constraints
and challenge business stakeholders on
the ones that are less crucial to business
(see Figure 3). For example, typical sourcing events for transport services result in
an 8 percent to 12 percent cost reduction.
Sourcing with strategic network
design. Sourcing optimization conducted jointly with strategic network
design can be especially beneficial. In
M A Y / J U N E 2 014

|

41

B U S IN E S S DE C I S I O N -M A K I NG

this case, network optimization uses
true market quotes rather than approximated lane and warehousing costs. In
many cases, it uncovers hidden market
potential (see Figure 4). The benefits for
sourcing transport and warehousing together with simultaneous supply network
optimization are often in the range of 10
percent to 15 percent.
Component choice. When the production process is flexible, such as if raw
materials vary or different formulations
can be used to arrive at the same result,
optimization can help determine the most
cost-efficient way to make products. This
is especially useful if costs for raw components are volatile, and different vendors
can supply materials of varying quality at different costs. Optimization projects that
explore production flexibility can minimize
total costs of goods sold by 1 percent to 3
percent on raw material purchasing.
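A stripped-down version of such a formulation problem is the blending linear program below, again solved with scipy (component costs, quality scores, the batch size and the quality requirement are all invented).

# Toy blending problem: choose a mix of three raw components that meets a
# minimum quality score at lowest cost. All figures are invented.
from scipy.optimize import linprog

cost = [2.4, 1.8, 3.1]            # $/kg of components A, B, C
quality = [0.62, 0.45, 0.80]      # quality score per kg of each component
batch = 1000.0                    # kg of finished product required
min_quality = 0.55                # required average quality of the blend

# Variables: kilograms of each component in the batch.
A_eq = [[1.0, 1.0, 1.0]]          # total mass must equal the batch size
b_eq = [batch]
A_ub = [[min_quality - q for q in quality]]   # average quality must be >= min_quality
b_ub = [0.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3, method="highs")
print("cheapest blend (kg of A, B, C):", [round(x, 1) for x in res.x])
print("cost per batch:", round(res.fun, 2))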
WEAVING ANALYTICS INTO THE
FABRIC OF BUSINESS
Developing sophisticated models
is impractical if business stakeholders
don’t use them. Gaining their buy-in is
vital. To capture incremental business

benefits on a regular basis, analytical
solutions must be institutionalized and
incorporated into daily decision-making.
Visualization is one way to help stakeholders focus on their KPIs by presenting information in a user-friendly format.
Every analytics project needs a visualization component that reflects insights,
complexities and interdependencies so
that the advanced analytical algorithms
are not perceived as black boxes (see the accompanying sidebar story).
This, in turn, increases trust in the results.
Regardless of market conditions,
forward-thinking players that use analytics perform better than their competitors.
They know that capturing a competitive
advantage requires going beyond ERP upgrades. By bridging the gap between decision-makers and the vast volume of data,
an analytics-driven business transformation can ensure that optimal decisions are
an integral part of every business unit.
Alex Romanenko ([email protected])
leads A.T. Kearney’s Analytics Practice in London,
which develops and delivers analytics-based solutions
in the United Kingdom and around the world.
Alex Artamonov ([email protected]),
a manager with A.T. Kearney, leads supply chain
transformation projects that use analytics to support
strategic and operational decision-making.

Transforming data into graphics to gain buy-in
The most sophisticated analytical models are meaningless if business stakeholders avoid using them when making strategic decisions. Turning data into a more aesthetic, user-friendly format builds trust in complex analytical results.

An array of visualization tools can transform advanced algorithms into easily understandable graphics. For example, Tableau Software’s (tableausoftware.com) interactive dashboard can present footprint optimization results and compare scenarios to help stakeholders make informed decisions (see Figure 5-1).

Figure 5-1: Info graphic view of analytics jobs.

Gephi (gephi.org), an open-source graph visualization and manipulation software program, can be used to highlight complexity and interlinks between supply chain tiers to derive a cost-efficient and sustainable setup (see Figure 5-2).

Circos (circos.ca), which takes data from a table and converts it into a circular layout, can show the links between three types of blending products and their components to come up with the most economical component purchases (see Figure 5-3).

VOLUME, VELOCITY, VARIETY AND ...

The big ‘V’ of big data
Keys to tapping the hidden “value” of big data.

BY PRAMOD SINGH, RITIN MATHUR, ARINDAM MONDAL AND SHINJINI BHATTACHARYA
“There is a big data revolution. However, what is revolutionary
is not the quantity of data alone. The big data revolution is that
now we can do something with the data.”
– Professor Gary King, Harvard University

While storage and computational capacity are essential and a given, it is important to
note that improved statistical
and computational methods are creating
opportunities like never before when dealing with big data. Today, large amounts of
data are available that individuals, businesses and governments can manage
and leverage for information that can lead
to insights. However, it is essential to analyze information in a cost-effective manner. High volume, variety and velocity of
data cannot translate into value unless
management makes a concerted effort.
Unlocking this value that businesses
can get from big data involves three key
elements:
Figure 1: Three key elements to unlocking the value of data.
1. Information infrastructure (ingest
and store efficiently): This element is about
creating infrastructure that can capture,
store, replicate and scale information at
speed.
2. Information management: An
information ecosystem to manage,
secure, govern and leverage information
seamlessly across an organization’s
information assets.
3. Insights: Correlate and use this
data in conjunction with existing business data (usually structured data) and
analyze using descriptive and prescriptive analytics to aid decision-making.
The last few years have seen a lot
of focus and attention on infrastructure
and information management. Exciting new technologies, frameworks and
methodologies have evolved to address
the needs of these elements. For example, infrastructure technologies have
greatly improved, as have the innovations that benefit data centers. They
range from fast and efficient servers to
data center solutions that can capture,
store, replicate and scale information at
high speeds.
Information management has seen
the most rapid evolution and change.
Managing information through distributed technologies (file systems such as
Hadoop) has changed information storage to provide low-latency, high-speed,
highly available systems. Innovations in
areas such as traditional enterprise data
warehousing environments through in-database, in-memory capabilities and
compression techniques have resulted in organizations being able to get insights from data quicker.
As organizations come to terms with managing big data, harnessing it through systems and converging its use with traditional data sources, the business analytics that guide decision-making are themselves evolving. This article focuses on how analytics has been influenced by big data and, drawing on observations within Hewlett-Packard, what practices will emerge in the years to come.
UNITED WE ARE “BIG,” DIVIDED WE MAY
BE SMALL
For business analytics to be successful in meeting an organization’s needs for decision support,
it fundamentally needs to be able to consider all
data sets relevant to solving a particular business
question.
Traditional business intelligence (BI) and enterprise data warehouse (EDW) environments focus
on the usual data generated from business operations. This is data generated through point of sale
transactions, customer data, financial, business
planning data, inventory management systems, etc.
Businesses today, however, also have access
to two other key forms of data. The first of these can
be loosely categorized as “human information”; this
form of data comes from having increased knowledge of customers through e-mail, social media
and other marketing channels, but also from an organization’s institutional data in the form of documents and customer support call records, as well
as video, audio and image sources. This data tends
to be unstructured in format.

Figure 2: Bringing data together.
The second new type of data comes
from machine data. This is data generated from an increasingly interconnected
world of devices and systems. Examples
range from data generated by sensors,
smart meters, RFID tags, security and
intelligence systems, IT logs (application and Web servers), etc. This data
tends to be largely semi-structured or
unstructured.
Business systems in BI and EDW environments are not architected to handle
the volume and variety of “human information” nor the volume and velocity of
machine generated data.
Today, organizations need to bring all
their data together for advanced analytics. For example, at HP, structured data
from a customer’s purchase history, demographics and warranty data can be

combined with unstructured data coming
from customer support records and social media for a more focused customer
engagement strategy.
BIG DATA AND ANALYTICS:
PROCESS, PURPOSE, PRACTICE
As information and data assets of an
organization come together and combine
with external data, analytical techniques
and analysis will have a larger role to
play. In general, the characteristics of
big data that most influence the analytics process are related to the variety and
volume of data. However, velocity, which
is handled through business intelligence
practices, is considered distinct from
core analytics practices for the purposes
of this article. The analytics process is
usually represented as a set of activities

– preparing data, developing analytical
models based on analytical techniques
to solve for the business question, validating the model and deploying it. These
are essentials of analytics and represent
the elements of big data that organizations today need to pay attention to. Following is a closer, more technical look at
each of these activities.
DATA PREPARATION:
SAMPLING. Sampling has been the
backbone of analytical processes with the
premise of using information of a sample
to infer on a larger population. Historically, sampling has been a core part of
analytical processes due to limitations on
collecting data on populations and then
analyzing it in aggregate. Sample accuracy, of course, depends on several
factors and is predicated on the minimization of various biases in the sampling
methodology.
While there are arguments both for
and against sampling in big data environments, from a data scientist’s perspective, a few aspects of the data used
in an analytics process need to be well
understood.
First, one can’t always use large storage and computing power to analyze
a population unless the marginal business returns are higher due to the addition of more data sets. Second, some
specialized application areas do need the population rather than a sample. For example, in the case of analyzing cyber security
threats out of a large data set, our interest lies in finding the outliers and anomalies in the data. Where millions of rows of
data may not give any value, a particular
row (one individual) may be very useful and could avert huge losses for the
business. Another example is when one
has to identify the top five social media influencers in cloud computing; here we might consider the population an important element.
And last, irrespective of whether we
choose to use a sample or a population,
it is still vitally important for a data scientist to understand and question the data
source and collection methods so that a
selection bias in data may be averted.
As such, the need and relevance of
sampling in big data applications is contextual and depends on the question being solved and the source of the data.
ANALYTICS TECHNIQUE:
REGRESSION. Among the most common analytical techniques used by analysts today are those related to regression. Regression is commonly understood as a statistical process for estimating the relationships between variables. These techniques help
predict the value of a dependent variable,
given values of independent variables
and are widely used for prediction and
forecasting.
Most common methods of regression
such as “ordinary least squares” and
“maximum likelihood estimation” require
that the number of variables be less than
the number of observations. In a big data
environment, where increasing newer
data sets are being incorporated, the
number of independent variables available often greatly exceeds the number
of observations. A case in point is the
study of genes, where the different types
of genes are the independent variables
and the number of patients in a study is
the observations. Another good example
is texture classification of images where
the variables are the pixels and observations are the number of images available
for observation.
In addition to this, the analyst also
has to address some very important issues. For example, do the new variables
really help improve the accuracy of the
prediction? In general, not all variables
contribute to an improved accuracy of the
model. Typically, only a few of the large
number of potentially influential factors
account for most of the variation.
To handle this complexity of variable selection brought about by the increasing number of data sets available for
analysis through big data techniques, a
few methods have gained attention and

adoption, such as subset selection for
regression, penalized regression, Biglm,
Revolution R and Distributed-R Vertica,
and the split-and-conquer approach. For
a more technical discussion of each of
these methods, click here.
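As a small illustration of the penalized-regression idea, the scikit-learn snippet below fits a lasso model to a synthetic problem with more predictors than observations and reports how few coefficients survive (synthetic data; any of the methods listed above could be substituted).

# Penalized (lasso) regression on a synthetic problem with more predictors
# than observations: the L1 penalty drives most coefficients to zero.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_obs, n_vars = 50, 200
X = rng.normal(size=(n_obs, n_vars))
true_coef = np.zeros(n_vars)
true_coef[:5] = [3.0, -2.0, 1.5, 4.0, -1.0]          # only five variables really matter
y = X @ true_coef + rng.normal(scale=0.5, size=n_obs)

model = Lasso(alpha=0.1).fit(X, y)
kept = np.flatnonzero(model.coef_)
print(f"variables kept: {len(kept)} of {n_vars}; first few indices: {kept[:10]}")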
ANALYTICS TECHNIQUE:
CLUSTERING. Segmentation, using
clustering techniques, is a common method used to reveal the natural structure of data.
Cluster analysis involves dividing the data
into useful as well as meaningful groups
where objects in one group (called a cluster) are more similar to each other than to
those in other groups.
In general, a clustering technique
should have the following characteristics
to be suitable for use in a big data environment: it should be able to capture clusters of various shapes and sizes, treat outliers effectively and execute its algorithms efficiently on large data sets.
Most partitional and hierarchical
methods that rely on centroid-based approaches to clustering do not work very
well in large data sets where the underlying data supports clusters of different
sizes and geometry.
Techniques such as DBSCAN (Density-Based Spatial Clustering of Applications
with Noise) [1] can help find clusters with
arbitrary shapes. It works by determining
the density associated with a point by
counting the number of points in a region of
a specified radius around a point. Points
with a density above a threshold are classified as core points, while noise points
are defined as non-core points that don’t
have core points within the specified radius. Noise points are discarded and clusters are formed around core points. This
very idea of density-based identification
of a cluster helps in creating clusters of
various shapes.
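The snippet below shows the idea using scikit-learn's DBSCAN implementation on a synthetic "two moons" data set, a non-convex shape that centroid-based methods handle poorly (the parameters and data are purely illustrative).

# Density-based clustering (DBSCAN) on a non-convex synthetic data set;
# points labelled -1 are the noise points discarded by the algorithm.
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons

X, _ = make_moons(n_samples=400, noise=0.06, random_state=0)
labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

n_clusters = len(set(labels) - {-1})
n_noise = list(labels).count(-1)
print(f"clusters found: {n_clusters}, noise points: {n_noise}")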
CURE (Clustering Using Representatives) [2] also does well at capturing clusters of various shapes and sizes, since
only the representative points of a cluster
are used to compute its distance from other clusters. The clustering algorithm starts
with each input point as a separate cluster, and at each successive step merges
the closest pair of clusters. The representative points help in capturing the different
physical shape and size of the clusters.
A clustering technique called BIRCH
(Balanced Iterative Reducing and Clustering Using Hierarchies) [3] is effective
in managing outliers. It works by first
performing a “pre-clustering phase” in
which dense regions of points are represented by compact summaries, and then
a centroid-based hierarchical clustering
algorithm is used to cluster the set of
summaries (which is much smaller than
the original data sets).
SUMMARY
Big data has influenced the entire
spectrum of analytics – from data ingestion, storage, preparation, modeling and
deployment. Organizations are exhibiting rigor in trying to harness big data
through innovations in information infrastructure and information management.
New core analytics techniques and practices are also changing to accommodate
the challenges of volume and variety of data associated with big data. A
new era of volume, velocity and variety
is leading the way for value creation in
organizations like never before.
Dr. Pramod Singh is director of Digital and Big
Data Analytics at Hewlett-Packard and a member
of INFORMS. Ritin Mathur is senior manager of
Big Data Analytics at HP. Arindam Mondal and
Shinjini Bhattacharya are data scientists at HP.
All four are based in Bangalore, India.
REFERENCES
1. Levent Ertoz, Michael Steinbach and Vipin
Kumar, “DBSCAN: Finding Clusters of Different
Sizes, Shapes and Densities in Noisy, High
Dimensional Data,” paper, Department of
Computer Science, University of Minnesota.
2. Sudipto Guha, Rajeev Rastogi and Kyuseok
Shim, “CURE: An Efficient Clustering Algorithm for Large
Databases,” Proceedings of the ACM SIGMOD
Conference, 1998.
3. Tian Zhang, Raghu Ramakrishnan and Miron
Livny, BIRCH presentation, 2009.


ADVENTURES IN CONSULTING

What’s a defense consulting company doing in sports?
BY STEPHEN CHAMBAL
The Perduco Group is a small
defense consulting company in Dayton, Ohio, near
Wright-Patterson Air Force
Base (WPAFB). The company specializes
in high-end data analytics with core competencies in data architecting, business
intelligence and business analytics. Perduco was formed in 2011 and has grown
rapidly to 21 employees. Twenty of those
employees are working in the defense
space, and one, Dr. Jacob Loeffelholz,
is the lead for a strategic push into the
sports domain. Now, what is a defense
consulting company doing with a director
for sports analytics? The easy answer is
simple: sports are fun. However, the real

answer is more tied to core capabilities,
business opportunity, two chance encounters and a willingness to believe in
what you’re doing!
BIG, UGLY DATA
Perduco’s data competency is tied
to integrating and aggregating big, ugly
data. This includes the use of enterprise
data architecting to build the data infrastructure required when solving client
problems. On the analytics side, the company specializes in advanced or predictive
analytics and has broad-based expertise
and experience in the field of operations
research (O.R.). This core capability in
O.R., coupled with an ability to visualize and communicate results, gives Perduco a competitive advantage in the defense space. This same advantage can be applied to other business areas such as energy, healthcare and finance. However, a chance encounter and 30-minute discussion brought a new opportunity – sports – to the forefront.

Football heat map: visualizing player performance in any situation (Peyton Manning heat map).
I retired from the United States Air
Force in 2011 and partnered with Toyzanne
Mason to form The Perduco Group. My
final assignment in the Air Force was
spent serving on faculty at the Air Force
Institute of Technology (AFIT) at WPAFB.
On a return visit to AFIT in 2012, I stopped in
to visit with Dr. Ken Bauer, professor in the
Department of Operational Sciences at
AFIT. We traded updates, and the topic
turned to sports. Ken mentioned an article
he had recently published with Jacob on
predicting NBA outcomes using artificial
neural networks. I knew Jacob from his time
at AFIT and had even helped Jacob find a
job with a defense consulting company (not
Perduco) after graduation. What he did not
know, however, was that Jacob’s article had
just topped 1,000 full downloads from the
Journal of Quantitative Analysis in Sports.
For those not familiar with academic literature, this is an uncommonly high number.
I left AFIT that day knowing two things:
Perduco was going into the sports domain,
and Jacob was the right person to lead
this push.
I first called my business mentor and
vice president for Perduco, Chris Mason, and brought up the idea of expanding Perduco into the sports sector. As a
M A Y / J U N E 2 014

|

55

ADVEN T U RE S I N C O NS U LT I NG

VFT player evaluation: converting subjective assessments into quantifiable rankings.
sports zealot, Chris was very interested
but needed to understand the business
case before committing to this decision.
Jacob and I met to discuss the potential
opportunities and the growing application
of analytics in sports. Jacob highlighted a
number of areas where analytics could
be leveraged in the sports domain. More
importantly, we both recognized the
limited use of advanced analytics from
the O.R. discipline, which could quickly
be adapted to solve many problems in
sports, as many of the challenges are
similar to those being faced in defense.
In fact, Perduco was already working in a
number of these crossover areas.
‘QUANTIFIED WARRIOR’
The defense community is very interested in what is called the “quantified warrior,”
which is an ability to monitor and assess a
soldier’s condition and to understand how
to optimize their performance on the battlefield. As one can imagine, athletic teams are
very interested in maximizing their players’
performance “on the battlefield.” A second
example is related to overhead monitoring and the collection of surveillance data
to understand and predict activities on the
ground. The intelligence community is continuously engaged in this type of effort in
order to provide better assessments of enemy activity to commanders on the ground.
Scout scheduling: maximizing the value of a scout on the road.
STATS Inc. recently installed SportVU, an
overhead surveillance system, in all professional basketball arenas to track player
movements on the court at all times. The
collection systems allow coaches or team
“commanders” to better analyze team and
opponent behavior and evaluate performance on the court. There are countless
other examples of defense crossover areas, but in every case, O.R. is a critical requirement in solving these problems.
The business opportunities were
there, and in the summer of 2012, Perduco made the corporate decision to hire
Jacob as the lead for all things sports.
Jacob’s first task was to determine where
the company should focus in this very
broad business area called sports analytics. This led to a one-year path of exploration, investigating a number of ways
to generate revenue through the application of advanced analytics and O.R. This
also led to a second chance encounter
which would result in Perduco finding
their sports focus in the summer of 2013.
Before jumping ahead, it is useful to cover two major explorations, which are, in some ways, still being
pursued within the company.
The first push into the sports sector
came in the form of team consultation.
Anyone passionate about sports analytics
has read the book or seen the movie “Moneyball.” The idea of helping teams pick
players, make trades, analyze strategy and
win more games is what sports geeks get
excited about – the sexy side of sports analytics. Perduco aggressively pursued direct
team consultation and met with a large
number of organizations at both the professional and collegiate levels and across
multiple sports including football, basketball, baseball and hockey. Professional
basketball quickly became the target of interest, and Perduco developed a number
of prototype applications to demonstrate
the benefits of O.R. capabilities to team
management. Scout scheduling optimization, aggregate player evaluation and prediction of player performance are just a few
of the solutions presented to professional
teams. Although these solutions were well
received, Perduco gained little traction with
respect to establishing formal consulting
agreements with organizations.
TEAM CONSULTATION
Perduco discovered a number of challenges when pursuing direct team consultation. Teams are very protective of
their data and even more protective of the
questions related to team and player challenges. Even with a willingness to sign
non-disclosures and protect data sources,
teams are hesitant to fully share information
with respect to organizational decisions.
Additionally, most teams were interested in
how the capabilities presented were being
used by other teams. The idea being, if no
one else was using the tools, there was no
forcing function for them to adopt new analytical methods. Put another way, teams
are very interested in being “first to be second.” The challenge then became getting
the first team on board, which highlighted
the most critical issue limiting the expansion into sports – relationships. Relationships are the foundational building blocks
required to make any business successful.
Perduco had many strategic relationships
in defense, but building a similar network
of connections in professional sports would
take time to develop. The company continues to pursue team consultation, and in
January of 2013, an NBA Western Conference team began using Perduco’s scout
scheduling optimization tool and is now
seeing great benefits with their use of this
capability.
The second major push into sports
came in the area of business-to-business consulting. The connections are
easier to find, and the relationships are
easier to build. Companies supporting
the sports industry are very interested in
new capabilities and are willing to discuss challenge areas where they see
a need for analytical solutions. New or
advanced capabilities give companies
a competitive advantage in a rapidly
changing and dynamic market space.
Perduco met with a number of sports
companies, including STATS Inc., and
continued to gain exposure for their predictive analytic expertise. The business
case is working, but it takes considerable time and resources to develop solutions at the level required for these
companies to market to their client base.
Business-to-business was expected to
be the major push in sports until a second chance encounter presented Perduco with a new option and a major shift
in company focus.

Prior to my 20-year reunion, I reconnected with an old friend and classmate
from the Air Force Academy, Troy Neihaus. We talked over the phone, and during this call Troy mentioned he had seen Perduco’s website and noticed the push into
sports analytics. He had a friend who
was part-owner of a company in the fantasy football space and offered to make
the connection. The next day, Perduco
was introduced to Full Time Fantasy, a
company that runs a very popular fantasy football website, FFToolbox.com. The
first phone call lasted only 30 minutes,

but the decision was made to shift focus, resources and energy into web-based fan support for fantasy sports.

FANTASY SPORTS
Jacob, working with FFToolbox, quickly researched the fantasy sports landscape and identified numerous opportunities for advanced analytics and data visualization. The two companies began developing a strategy to move forward and outlined both the execution and business plan to make this happen. The integration of FFToolbox’s expertise in fantasy sports and Perduco’s depth in O.R. provides enormous opportunity to bring advanced and unique capabilities to the market space. Furthermore, FFToolbox is already recognized as a leader in the space with nearly 10 million users visiting their website on an annual basis. This footprint provides immediate exposure for Perduco analytics at a level never imagined this early in the company’s expansion into sports. In fact, the building footprint in sports has led to the spin off of Perduco Sports from the parent company, The Perduco Group, which happened in the spring of 2014.

Full Time Fantasy and Perduco are on schedule to release their first integrated capabilities in the summer of 2014. A number of capabilities are being developed, tested and implemented into the FFToolbox infrastructure. These capabilities will provide users access to advanced analytics for analyzing players, simulating drafts and visualizing statistical information in an easy to use and understandable format. The goal is to put the power of O.R. in the hands of active fantasy sports players who may not otherwise have access to these types of mathematical algorithms. FFToolbox gives their customers a competitive advantage of their own when competing in daily, weekly and season-long fantasy sports games. The fantasy sports market is an explosive industry with millions of potential customers, each looking for an edge and bragging rights when competing against family, friends, co-workers and unknown opponents across the world.

Perduco has spent nearly two years committed to expanding into the sports domain. Perduco’s core capabilities in operations research and its passion to bring advanced analytics to the sports industry has kept the company moving forward at all times. The expected business areas have shifted here and there, but with each shift came increased opportunities for success. The chance
encounter with Bauer in summer of 2012
led to the hiring of Loeffelholz and the
launch of sports analytics within Perduco. The chance encounter with Neihaus
in summer of 2013 led to the introduction
of Perduco to Full Time Fantasy and the
launch into the fantasy sports market. Believing in chance encounters and being
ready and willing to react when they happen has been a driving force behind this
adventure. If the pattern continues, the
next chance encounter should be coming soon in summer of 2014 and could
ignite the viral power of the Internet

and launch Perduco into an overnight
success, 24 months in the making!
Dr. Stephen Chambal (Stephen.chambal@
theperducogroup.com) is vice president of The
Perduco Group and is responsible for strategic
business development and high-end recruiting.
Chambal retired from the United States Air Force
in 2011 after more than 24 years of service. In
his final military assignment, he served as the
director of Operational Analysis for the Air Force
Institute of Technology. Chambal enlisted in the Air
Force in 1986 and obtained his commission from
the Air Force Academy in 1993. He held various
assignments within the scientific analysis career
field, including test, space and special programs.
Chambal holds a Ph.D. in operations research and
has authored and co-authored numerous articles,
white papers and conference presentations.



PREDICTIVE ANALYTICS

Most swans are white: Living in a predictive society

BY THOMAS H. DAVENPORT
Eric Siegel’s book – “Predictive Analytics: The
Power to Predict Who Will Click, Buy, Lie, or Die” –
deals with quantitative efforts to predict human behavior. One of the earliest efforts to do that was in
World War II. Norbert Wiener, the father of “cybernetics,” began trying to predict the behavior of German
airplane pilots in 1940 – with the goal of shooting
them from the sky. His method was to take as input
the trajectory of the plane from its observed motion,
consider the pilot’s most likely evasive maneuvers,
and predict where the plane would be in the near future so that a fired shell could hit it. Unfortunately,
Wiener could predict only one second ahead of a
plane’s motion, but 20 seconds of future trajectory
were necessary to shoot down a plane.
In Siegel’s book, however, you will learn about
a large number of prediction efforts that are much
more successful. Computers have gotten a lot faster
since Wiener’s day, and we have a lot more data. As
a result, banks, retailers, political campaigns, doctors
and hospitals, and many more organizations have
been quite successful of late at predicting the behavior of particular humans. Their efforts have been
helpful at winning customers, elections, and battles
with disease.
My view – and Siegel’s, I would guess – is that
this predictive activity has generally been good for humankind. In the context of healthcare,
crime and terrorism, it can save lives. In
the context of advertising, using predictions is more efficient, and could conceivably save both trees (for direct mail and
catalogs) and the time and attention of
the recipient. In politics, it seems to reward those candidates who respect the
scientific method (some might disagree,
but I see that as a positive).
However, as Siegel points out – early
in the book, which is admirable – these approaches can also be used in somewhat harmful ways. “With great power comes
great responsibility,” he notes in quoting
Spider-Man. The implication is that we
must be careful as a society about how
we use predictive models, or we may be
restricted from using and benefiting from
them. Like other powerful technologies
or disruptive human innovations, predictive analytics is essentially amoral, and
can be used for good or evil. To avoid the
evil applications, however, it is certainly
important to understand what is possible
with predictive analytics, and you will

How will you stand out from the crowd?

A membership in INFORMS will help!
• Certification for Analytics Professionals
• Online access to the latest in operations research and advanced analytics techniques
• Networking Opportunities available at INFORMS Meetings and Communities
• New Members receive one free Subdivision membership in 2014

visit http://join.informs.org

certainly learn that if you keep reading.
This book is focused on predictive analytics, which is not the only type of analytics, but the most interesting and important
type. I don’t think we need more books
anyway on purely descriptive analytics,
which only describe the past, and don’t
provide any insight as to why it happened.
I also often refer in my own writing to a
third type of analytics—“prescriptive”—
that tells its users what to do through controlled experiments or optimization. Those
quantitative methods are much less popular, however, than predictive analytics.
This book and the ideas behind it are
a good counterpoint to the work of Nassim Nicholas Taleb. His books, including
“The Black Swan,” suggest that many efforts at prediction are doomed to fail because of randomness and the inherent
unpredictability of complex events. Taleb
is no doubt correct that some events are
black swans that are beyond prediction,
but the fact is that most human behavior is quite regular and predictable. The
many examples that Siegel provides of
successful prediction remind us that most
swans are white.
Siegel also resists the blandishments
of the “big data” movement. Certainly
some of the examples he mentions fall
into this category – data that is too large
or unstructured to be easily managed by
conventional relational databases. But the
point of predictive analytics is not the relative size or unruliness of your data, but
what you do with it. I have found that “big
data often equals small math,” and many
big data practitioners are content just to
use their data to create some appealing
visual analytics. That’s not nearly as valuable as creating a predictive model.
Siegel has fashioned a book that
is both sophisticated and fully accessible to the non-quantitative reader. It’s
got great stories, great illustrations and
an entertaining tone. Such non-quants
should definitely read this book, because
there is little doubt that their behavior
will be analyzed and predicted throughout their lives. It’s also quite likely that
most non-quants will increasingly have
to consider, evaluate and act on predictive models at work.
In short, we live in a predictive society.
The best way to prosper in it is to understand the objectives, techniques and limits of predictive models. And the best way
to do that is simply to read Siegel’s book.
Thomas H. Davenport (www.tomdavenport.com)
is a visiting professor at Harvard Business School,
the President’s Distinguished Professor at Babson
College, co-founder of the International Institute
for Analytics and the co-author of “Competing on
Analytics” and several other books on analytics.
He is a member of INFORMS. This foreword by
Professor Davenport is excerpted with permission of
the publisher, Wiley, from “Predictive Analytics: The
Power to Predict Who Will Click, Buy, Lie, or Die”
(February 2013) by Eric Siegel and is reprinted for
promotional considerations.


Is your solver provider

getting in the way
of your success?

At Gurobi,
we make it easier to achieve
your optimization goals.
Faster solutions with superior
algorithms and powerful tuning
Maximum productivity with flexible
interfaces and modeling language support
Great technical support from
easy-to-reach optimization experts
No surprises with our transparent
pricing and flexible licensing

Gurobi Optimizer 5.6

State-of-the-Art Mathematical Programming Solver
Experience state-of-the-art optimization algorithms
backed by industry-leading customer support.

Gurobi.com +1 713-871-9341 [email protected]

Get started with a FREE TRIAL today!

CONFERENCE PREVIEW

INFORMS Big Data
Conference
June 22-24 event in San Jose, Calif., to focus on the business of big data.
INFORMS is launching a new topical conference that will put the focus
squarely on the business of big data –
how organizations can transition from
being data-rich to decision-smart. The
INFORMS Big Data Conference will be
held June 22-24 at the San Jose Convention Center in San Jose, Calif. The
conference is organized around tracks
on “Big Data Case Studies,” “Big Data
101” and “Emerging Trends in Big Data.”
Case studies of big data projects
that illustrate the complete journey from
business problem to analytics solutions
will be a major component of the conference. Sessions on big data 101 will offer
tutorials on how to navigate the big data
ecosystem, how to select and use the
right technologies, as well as the challenges of building data science teams.
Critical topics such as ethics and privacy requirements will also be addressed.
Other sessions on the program will look
into the future, exploring emerging technologies and business trends.



Bill Franks

Michael Svilar

The committee has hand-picked
speakers to address defined topics in
an intensive program geared to the interests of business decision-makers,
IT managers and analytics professionals. Poster sessions, technology workshops, tutorials and panel discussions
will also be part of the program, as well
as facilitated networking opportunities.
Bill Franks, chief analytics officer
at Teradata Corporation, and Michael
Svilar, managing director, delivery lead
and capability lead at Accenture, will
deliver keynote presentations. At Teradata and throughout his career, Franks has focused on translating complex
analytics into terms that business users can understand and then helping
organizations implement the results effectively within their processes. He is
the author of the book “Taming the Big
Data Tidal Wave” and holds a patent in
forecasting analytics. For more than 30
years, Svilar has run analytics projects
across multiple industries including
retail, communications, financial services, automotive, consumer packaged
goods and electronics.

Early registration rates will be available until May 23. Rooms are being held
at the Marriott San Jose at a discounted
rate until May 26. Conference organizers anticipate that rooms will sell quickly
and advise attendees to make reservations early, well before the cut-off date.
After that date, reservations will be accepted at prevailing rates on a room
available basis.
Visit http://meetings.informs.org/bigdata2014/ for more information on the
INFORMS Big Data Conference.

Would like to congratulate the
2014 Edelman Award winner
U.S. Centers for Disease
Control and Prevention
with KidRisk, Inc.

Click here to relive the excitement
of the winner announcement on YouTube.


Program tracks and speakers
Track: Big Data Case Studies
• Google – Behdad Masih, quantitative analyst
• HERE, a Nokia Business – Toby Tennent, product development manager
• Exponent®, Inc. – Juergen Klenk, principal scientist
• Kabbage, Inc. – Pinar Donmez, chief data scientist
• U.S. Census Bureau – Cavan Capps, big data lead
• Merck & Co., Inc. – John E. Koch, director of informatics
• Intel Corp. – Link C. Jaw, Internet-of-things solutions group
• JP Morgan Intelligent Solutions – Govind Nagubandi, data scientist
• Analytics Media Group (AMG) – Alan Papir, software engineer
• Kaiser Permanente – Anton J. Mobley, MS, security data scientist
• Booz Allen Hamilton – Peter Guerra, BS, principal
• IBM – Kevin Foster, MS, big data solutions architect
• Koverse, Inc. – Paul Brown, MS, CEO and founder
• Verizon Wireless – Anne Robinson, Ph.D., director, supply chain strategy & analytics
• Booz Allen Hamilton – Brian Keller, Ph.D., data scientist

Track: Emerging Trends in Big Data
• IBM – Rob High, BA, chief technology officer, Watson solutions, IBM fellow and vice president
• LinkedIn Corp. – Simon Zhang, MD & MBA, senior director, business analytics
• Teradata Aster – Lee Paries, vice president, Central & Western U.S.
• SAS – Paul Kent, B Commerce (WITS), vice president, big data
• Bell Laboratories – Marina Thottan, director
• Alpine Data Labs – William C. Ford, lead data scientist
• Univ. of California Berkeley – Ion Stoica, Ph.D., professor, Univ. of California Berkeley; CEO, Databricks; CTO, Conviva

Track: Big Data 101
• Amazon – Vikram Garlapati, manager, solutions architect
• Northwestern University – Diego Klabjan, professor, industrial engineering & management sciences; director, master of science in analytics program
• Kaggle – Anthony Goldbloom, B Commerce, founder and CEO
• SAP – Chris Hallenbeck, global vice president
• Pivotal – Kaushik Das, senior principal data scientist
• GraphLab – Shawn Scully, data scientist


Organizing Committee
General Chair
Candace A. Yano
University of California-Berkeley
Program Chair
Philip Kaminsky
University of California-Berkeley
Plenary/Keynotes Chair
Shmuel S. Oren
University of California-Berkeley
Invited Sessions Co-Chairs
Hyun-Soo Ahn
Damien Beil
University of Michigan
Sponsored Sessions Co-Chairs
Alper Atamturk
Zuo-Jun Max Shen
University of California-Berkeley
Contributed Sessions Co-Chairs
Rachel Chen
University of California-Davis
Steven Nahmias
Santa Clara University
Practice Program Co-Chairs
Vijay Mehrotra
San Francisco State University
Warren Lieberman
Veritec Solutions
Thomas Dag Olavson
Google, Inc.

Interactive Sessions Co-Chairs
Hari Balasubramanian
Ana Muriel
Univ. of Massachusetts-Amherst
Tutorials Co-Chairs
Alexandra M. Newman
Colorado School of Mines
Janny Leung
Chinese University of Hong Kong
Arrangements Co-Chairs
Julia Miyaoka
Theresa M. Roeder
San Francisco State University

November 9-12, 2014
Hilton San Francisco & Parc 55 Wyndham
San Francisco, California
Submission Deadline: May 15, 2014
Submit Early, Capacity Limited!
Registration opens June 1, 2014

meetings2.informs.org/sanfrancisco2014

INFORMS INITIATIVE

Continuing ed courses set for San Jose
INFORMS’ popular continuing education
courses will be held June 20-21 (“Essential
Practice Skills for Analytics Professionals”)
and June 25-26 (“Data Exploration &
Visualization”) before and after, respectively,
the 2014 INFORMS Conference on the
Business of Big Data in San Jose, Calif.
Register early to save $100.

Essential Practice Skills for Analytics Professionals
Taught by Dr. E. Andrew Boyd, Texas A&M, University of Houston, and Houston Public Media
Participants will learn practical tools for integrating their analytical skills into real-world problem solving for businesses and other organizations. The course provides approaches that can be applied immediately to a wide variety of settings, whether within a participant’s own organization or for an external client. By the end of the course, participants will:
• learn to link their subject-matter expertise to the challenges of messy, unstructured problems, organizational noise, and non-technical decision makers; and
• understand best-practice techniques, including: problem statement summaries, issue trees, interview guides, work plans, sensitivity analysis, stress-testing recommendations, story-boarding, slide-craft, delivering presentations and fielding Q&A sessions.
Course dates: June 20-21, San Jose, Calif.

Data Exploration & Visualization
Taught by Stephen McDaniel and Eileen McDaniel, Freakalytics, LLC
Attendees will experience first-hand the power of visualization in exploring data as an adjunct to tried and trusted analytics methods. Using a hands-on lecture and case study approach, attendees will walk away with a proven framework for data exploration that is directly relevant to real-world problems in many fields. Incorporating popular free and leading software tools to facilitate hands-on learning, attendees will be able to directly apply the methods and techniques from this course in their work, regardless of their visualization tool of choice.
At the end of the course, participants are expected to:
• have confidence to explore new data using the exploration/visualization approach;
• have the ability to approach and deploy interactive visualization;
• understand how to identify practically meaningful discoveries;
• experience the use of state-of-the-art visualization software; and
• think creatively about data and insights.
Course dates: June 25-26, San Jose, Calif.

For more information on the course and available discounts, visit www.informs.org/continuinged.

Connect with the earned expertise
of business forecasters and
practical research from top
academics from around the globe.

Each issue of Foresight contains articles that you’ll use in your day-to-day work, whatever types of forecasting you do.


Here’s what our readers say:

Issue 26
Summer 2012
THE INTERNATIONAL JOURNAL OF APPLIED FORECASTING

“The information is relevant to practitioners and is presented in a
way that is not overly academic but with significant credibility.”
Thomas Ross, Financial Analyst, Brooks Sports

5 Setting Internal Benchmarks Based on a Product’s Forecastability DNA

“Foresight make(s) important research findings available to the
practitioner.”

18 Regrouping to Improve Seasonal Product Forecasting
32 Forecasting Software that Works For – Not against – Its Users

38 Book Review Abundance: The Future Is Better Than You Think
41 Reliably Predicting Presidential Elections

Anirvan Banerji, Economic Cycle Research Institute

“...an important forum for practitioners to share their experiences....”

Issue 27
Fall 2012
THE INTERNATIONAL JOURNAL OF APPLIED FORECASTING

Dan Kennedy, Senior Economist, Connecticut Department of Labor

“I find Foresight very useful! I use it as a teaching resource to
bring theoretical forecasting techniques to life for the students.”
Dr. Ilsé Botha, Senior Lecturer, University of Johannesburg

5 Special Feature: Why Should I Trust Your Forecasts?
23 Tutorial: The Essentials of Exponential Smoothing
29 S&OP: Foundation Principles and Recommendations for Doing It Right

40 New Texts for Forecasting Modelers

Put Foresight to work to improve your forecasts and rally
support for the people, processes and tools that accurate
forecasting requires.

Subscribe today!
forecasters.org/foresight/subscribe/

Issue 29
Spring 2013
THE INTERNATIONAL JOURNAL OF APPLIED FORECASTING

5 Forecasting Revenue in Professional Service Companies
14 Forecast Value Added: A Reality Check on Forecasting Practices
19 S&OP and Financial Planning
26 CPFR: Collaboration Beyond S&OP
39 Progress in Forecasting Rare Events
50 Review of “Global Trends 2030: Alternative Worlds”

Foresight | IIF Business Office | 53 Tesla Ave. | Medford, MA 02155 USA | [email protected] | +1 781 234 4077

FIVE-MINUTE ANALYST

Buffett’s billion-dollar basketball bracket bet

A collection of Tribbles, which, despite their harmless appearance, quickly grow to fill the space. Much like factorials.

BY HARRISON
SCHRAMM, CAP
Warning: Factorials and powers are
ubiquitous in this article. Like Tribbles
from Star Trek, expressions like “64!” look
cute and innocent, but they are some of
the most deadly mathematical beasts
known to man.


PHOTO BY STEPHEN SLADE. COURTESY OF UNIVERSITY OF CONNECTICUT

Shabazz Napier (13) leads UConn to
victory over Kentucky in the 2014 NCAA
Championship game.
I was bemused to read this article [1]
in Slate magazine detailing the odds of
Warren Buffett’s basketball challenge,
which may be found here [2]. [Buffett
offered a billion dollars to anyone who
submitted a perfect bracket (i.e., correctly predicting the winner of all 63
games) of “March Madness,” otherwise
known as the NCAA Men’s Basketball
Championship Tournament.] A billion
dollars – even with taxes – is a lot of

money. How hard is it to come up with
a perfect bracket? There is only one
perfect bracket in a world with many
potential brackets, so we first need to
find out how many possible brackets
there are.
The NCAA is a single elimination
tournament, which means that each
team plays until they lose. In a single
elimination tournament, each round is
made up of n teams, with n / 2 games
n /2
played. Therefore, there are 2
possible outcomes in the first round. Knowing that the tournament starts with 64
teams, there are
possible
outcomes for the first round. Using similar calculations at each round, there are
possible outcomes, only one of which
is correct.
For comparison’s sake, 1 billion is
a thousand million, or 109 so the odds
of winning the basketball challenge
9 2
are around 1: (10 ) or one in a billion
billion 3 . So, it appears your odds of
winning are not very good at all.
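If you would like to check the count yourself, here is a minimal sketch in Python (the author ran his own figures in R); it simply adds up the games per round and raises 2 to that total.

```python
# Count the possible brackets in a 64-team single-elimination tournament.
games_per_round = [32, 16, 8, 4, 2, 1]        # 63 games in all
total_brackets = 2 ** sum(games_per_round)    # 2**63

print(f"{total_brackets:,}")                               # 9,223,372,036,854,775,808
print(f"roughly {total_brackets:.1e} possible brackets")   # ~9.2e+18
```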
Frequently, one can get a feel for the
value of a gambling game by the “fair
price” that one would be willing to pay
to play the game; specifically, the value
that would make one indifferent between
playing the game and just keeping their
money. For this game, a “fair” price would
be nine million attempts per penny!

SOME EXCURSIONS
Suppose I could have one piece of information. A likely choice would be: “How much better off would we be if we knew the eventual winner?” In this case, we would reduce the number of possibilities in each round by 1; and the number of combinations would “only” be 2^31 · 2^15 · 2^7 · 2^3 · 2^1 · 2^0 = 2^57, which is 64 times better than the original estimate. Sixty-four is generally reckoned to be a small number when compared to 10^18. If you knew the Final Four, you would be considerably better off, at roughly 2^44 possible brackets, or 520,000 times better than the original bet, which is to say that in the scheme of things, you are no better off at all.
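A quick back-of-the-envelope check of those two reductions; note that the Final Four figure below is back-solved from the 520,000-times factor quoted above rather than derived independently.

```python
# How much do the two "excursions" help?  (2**63 brackets in total)
total = 2 ** 63
known_champion = 2 ** 57      # one game per round is already decided
known_final_four = 2 ** 44    # back-solved from the ~520,000x figure

print(total // known_champion)     # 64
print(total // known_final_four)   # 524288, i.e. roughly 520,000
```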
Now suppose that you had to pay
$1 to play this game instead of it being
free, but you think you are pretty good at
predicting basketball games. You would
need to have 72 percent accuracy in
your ability to pick basketball games to
be risk neutral for a dollar (i.e., 72 percent accuracy increases your odds of
winning to 1 in 1 billion).
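The 72 percent figure follows from solving p^63 = 10^-9 for p; a tiny illustrative check:

```python
# Per-game accuracy p with p**63 = 1e-9, i.e. a 1-in-a-billion shot at a
# perfect bracket (risk neutral for a $1 entry against a $1B prize).
p = (1e-9) ** (1 / 63)
print(f"required per-game accuracy: {p:.1%}")                 # ~72.0%
print(f"chance of perfection at that accuracy: {p ** 63:.1e}")  # about one in a billion
```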
As bad as these odds are, here’s a
game that is even worse, which I will
call the Georgetown Wager (after my
colleague who challenged me to come
up with a tougher game). Suppose that
you are given the 64 teams that will play,
but the games are randomized and you have no information about their pairings;
you know that there are 64 teams in the
first round, of which 32 will win, and so
on, but you don’t know who will play
who. In this version, you have to figure out the number of possibilities,

C(64,32) · C(32,16) · C(16,8) · C(8,4) · C(4,2) · C(2,1),   (1)

where the “choose” function is C(n,k) = n!/(k!(n−k)!). If you expand (1) out by hand and cancel terms [4], you will find that it is 64!/(32! · 16! · 8! · 4! · 2!).
If the odds of Buffett’s Billion Bracket are bad, the Georgetown Wager is patently absurd; the odds of winning are about 1 in 10^34, which are roughly on the order of winning Buffett’s game twice in a row!
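Expression (1) can be evaluated exactly with integer arithmetic. The sketch below is one way to do it (Python 3.8+ for math.comb and math.prod) and lands near 10^34, consistent with the comparison above.

```python
from math import comb, prod

# Georgetown Wager: choose the 32 first-round survivors from 64, then 16
# of those 32, and so on down to a single champion, with pairings unknown.
rounds = [(64, 32), (32, 16), (16, 8), (8, 4), (4, 2), (2, 1)]
possibilities = prod(comb(n, k) for n, k in rounds)

print(f"about {possibilities:.2e} possible entries")   # ~1.19e+34
```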
You may be asking: “So, if this is
such a good bet for the house, why
don’t I run a similar lottery?” Because
I don’t have a billion dollars, and I’m
not willing to lose. Remember that the
“house” has to be willing to pay out the prize in the extraordinarily rare event that someone won. Events that are “statistically impossible” are still “possible,” and while it is extraordinarily unlikely
that someone will win, there is no law
of physics that prevents someone from
winning.

The perfect first round buyout:
Suppose you had a perfect first round,
in that you guessed the first 32 games
correct. Congratulations. Strictly speaking, the risk-neutral buyout price from
the bank’s perspective (i.e., Mr. Buffett) is approximately $2. Now, this figure presumes that you got to this point
by dumb luck, and you will certainly
claim – and the house may believe –
that you got to this point because you
are very good at predicting basketball
games. After all, to have a 50 percent probability of predicting the first round
correctly, you would need to have ~98
percent per-game prediction accuracy.
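That ~98 percent figure comes from solving p^32 = 0.5; here is a two-line check:

```python
# Per-game accuracy needed for a 50 percent chance of a perfect 32-game
# first round: solve p**32 = 0.5.
p = 0.5 ** (1 / 32)
print(f"{p:.1%}")        # ~97.9%
print(f"{p ** 32:.2f}")  # 0.50 (sanity check)
```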
So, should you play? Sure, go
ahead. Expected value calculations
presume that you are going to do
something else with the money; this is
true for large amounts but typically not
for small. So it depends on what else
you would do with the money. In this
particular example, you won’t pay anything, except some advertising e-mails,
so if that works for you, go ahead.

video learning center
Your one-stop shop to view top presentations from key INFORMS meetings
NOW ONLINE! 2014 Edelman Presentations
2013 Analytics Conference and Annual Meeting
2012 Analytics Conference and Annual Meeting
2011 Analytics Conference and Annual Meeting
2010 Practice Conference and Annual Meeting
2009 Annual Meeting

Your latest member benefit lets you learn from the best on your schedule.

http://livewebcast.net/INFORMS_Video_Learning_Center

If a similar wager cost $1 to play, it
would depend on what else you would
do with the money. Foregoing a late
afternoon soda to buy a ticket for this
game, if you enjoy talking about it,
would be OK. Dumping out your life
savings in order to play games is a terrible idea (we’ve written about this before, see July 2013 [5]). The point is
that we all do lots of things where the
odds of winning are practically zero.
This is not necessarily a bad thing. If
you derive “pleasure” out of daydreaming about winning a billion dollars or
have fun arguing basketball scores with
your friends, go for it! Just do so with
eyes open, knowing that it is incredibly
unlikely that you will win.
And don’t forget, there are also
20 first-prize winners, regardless of
whether the grand prize is given or
not, valued at $100,000. While no billion, this is no small amount of money,
and most importantly, does not require
you to be perfect, simply better than
the other players who enter. If you think
you are good at filling out your bracket,
then perhaps you should enter with the
hopes that you win the first prize. Here,
the odds are no worse than 1:750,000,
which is a number that you can start to
comprehend!
A note on calculation. I used R to
do the large calculations in this article.
Professionals always need to be concerned about numerical stability and
floating point precision, which may be
the subject of a subsequent article. If I
did not have a good computational platform or was doing this 50 years ago,
I would resort to Stirling’s Approximation, n! ≈ √(2πn) (n/e)^n.
It’s amazing to think about all of the
computation that we simply take for
granted.
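One common way to stay numerically safe, shown here only as an illustration and not as the author’s original R code, is to work with logarithms of factorials via lgamma and compare against Stirling’s formula:

```python
from math import e, lgamma, log, log10, pi

def log10_factorial(n: int) -> float:
    """log10(n!) computed via lgamma, so nothing overflows."""
    return lgamma(n + 1) / log(10)

# 64! has about 89 digits; in log space it is just an ordinary float.
print(f"log10(64!)          = {log10_factorial(64):.3f}")   # ~89.103

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)**n
n = 64
stirling = 0.5 * log10(2 * pi * n) + n * log10(n / e)
print(f"Stirling's estimate = {stirling:.3f}")              # ~89.102
```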
Finally, knowing that 1 + 2 + 4 + … + 2^(n−1) = 2^n − 1 is very handy. (If you need a proof, start writing out numbers in binary.)
Harrison Schramm (harrison.schramm@gmail.com) is an operations research professional in the
Washington, D.C., area. He is a member of INFORMS
and a Certified Analytics Professional (CAP).

NOTES & REFERENCES
1. http://www.slate.com/articles/sports/sports_
nut/2014/03/billion_dollar_bracket_challenge_why_
it_s_a_bad_idea_to_enter_warren_buffett.html
2. https://tournament.fantasysports.yahoo.com/
quickenloansbracket/challenge/?qls=BDB_B14qlb03.
qlredirect
3. When your exponents have exponents, the
numbers are really huge!
4. Comment: If you want real understanding in
mathematics, there is no substitute for expanding by
hand. This is how the mathematicians of 50 years ago
did things, and there is goodness in it, even today.
5. http://www.analytics-magazine.org/july-august2013/838-five-minute-analyst-carnival-game


SAS and Hadoop take on
the Big Data challenge.
And win.

Why collect massive amounts of Big Data if you can’t analyze
it all? Or if you have to wait days and weeks to get results?
Combining the analytical power of SAS with the crunching
capabilities of Hadoop takes you from data to decisions in a
single, interactive environment – for the fastest results at the
greatest value.

Read the TDWI report

sas.com/tdwi

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. © 2014 SAS Institute Inc. All rights reserved. S120598US.0214

THINKING ANALYTICALLY

Spy catcher

BY JOHN TOCZEK
John Toczek is the senior director
of Decision Support and Analytics for
ARAMARK Corporation in the Global
Operational Excellence group. He
earned a bachelor of science degree
in chemical engineering at Drexel
University (1996) and a master’s
degree in operations research from
Virginia Commonwealth University
(2005). He is a member of INFORMS.

Your government has lost track
of a high profile foreign spy and they
have requested your help to track
him down. As part of his attempts to
evade capture, the spy has employed a
simple strategy. Each day the spy moves
from the country that he is currently in to a
neighboring country.
The spy cannot skip over a country (for example,
he cannot go from Chile to Ecuador in one day). The
movement probabilities are equally distributed among
the neighboring countries. For example, if the spy is currently in Ecuador, there is a 50 percent chance he will move
to Colombia and a 50 percent chance he will move to Peru.
The spy was last seen in Chile and will only move about
countries that are in South America. He has been moving
about the countries for several weeks.
Question: Which country is the spy most likely hiding in
and how likely is it that he is there?
Send your answer to [email protected] by June 15.
The winner, chosen randomly from correct answers, will
receive a $25 Amazon Gift Card. Past questions can be
found at puzzlor.com.
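If you want to experiment before sending in an answer, one natural model is a random walk whose daily transition probabilities split evenly among bordering countries. The sketch below, in Python, shows only the mechanics on the three countries named above; the border table is an illustrative fragment, and assembling the full South American adjacency (and starting the walk in Chile) is left to the solver.

```python
# A sketch of the spy's movement as a random walk: each day his probability
# mass spreads evenly over the neighbors of wherever he might be.
# NOTE: this border list is only an illustrative fragment built from the
# three countries named above; it is NOT the full South American map.
borders = {
    "Ecuador": ["Colombia", "Peru"],
    "Colombia": ["Ecuador", "Peru"],   # fragment: the real Colombia has more neighbors
    "Peru": ["Ecuador", "Colombia"],   # fragment: the real Peru has more neighbors
}

def one_day(dist):
    """Advance the probability distribution by one day of movement."""
    new = {c: 0.0 for c in borders}
    for country, p in dist.items():
        for nbr in borders[country]:
            new[nbr] += p / len(borders[country])
    return new

dist = {c: 0.0 for c in borders}
dist["Ecuador"] = 1.0        # toy start; the actual puzzle starts in Chile
for _ in range(50):          # "several weeks" of daily moves
    dist = one_day(dist)
print(dist)
```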


OPTIMIZATION

www.gams.com

High-Level Modeling
The General Algebraic Modeling System (GAMS)
is a high-level modeling system for mathematical programming problems. GAMS is tailored for
complex, large-scale modeling applications, and
allows you to build large maintainable models that
can be adapted quickly to new situations. Models
are fully portable from one computer platform to
another.

State-of-the-Art Solvers
GAMS incorporates all major commercial and
academic state-of-the-art solution technologies for
a broad range of problem types.

GAMS Integrated Developer Environment for editing,
debugging, solving models, and viewing data.

A Water Management Decision Support System (DSS)
for the Indus Basin
Large non-linear optimization models developed in GAMS are a centerpiece of the water management
DSS for Pakistan's Indus Basin. The system is used for agricultural investment planning and water management, but also to investigate the impacts of climate risks on water and agriculture. An international
consortium led by National Engineering Services Pakistan (NESPAK) recently extended this system.
Major features include:
• The GAMS models seamlessly interact with an Oracle database, which feeds both model data and
results into a Geographic Information System (GIS).
• Users from government, industry, and consulting groups use the web-based application to calculate
water requirements and cropping patterns.
• The results are available in various formats: maps, graphs, and tables.

Europe
GAMS Software GmbH
[email protected]
USA
GAMS Development
Corporation
[email protected]
http://www.gams.com

For further information please contact Khalid Mahmood - [email protected]
