Description
Assignment-I: Topic Selection & Research Questions (15 points)
Choose a topic of research: either a research paper or a startup concept. Once the topic is finalized, develop the research questions (no more than three) on which the objectives of the study will be based. Discuss each selection in detail, with logic and rationale: first the reasons for choosing the topic, then the reasons for selecting each research question.
Regulations:
- This assignment is an individual assignment.
- Support your submission with course material concepts, principles, and theories from the textbook, along with a few scholarly, peer-reviewed journal articles.
- Use Saudi Electronic University academic writing standards and APA style guidelines, citing references as appropriate.
- Submit your findings in a 3–4-page document, excluding the title page, abstract, and required reference page (these never count toward the minimum content requirements), in the Assignment Dropbox.
- It is strongly encouraged that you submit all assignments to the Turnitin Originality Check (available under the Information folder on your Blackboard) prior to submitting them to your instructor for grading. If you are unsure how to submit an assignment to the Originality Check tool, review the Turnitin Originality Check – Student Guide for step-by-step instructions.
The McGraw-Hill/Irwin Series in Operations and Decision Sciences
SUPPLY CHAIN MANAGEMENT
Benton
Purchasing and Supply Chain Management
Second Edition
Burt, Petcavage, and Pinkerton
Supply Management
Eighth Edition
Bowersox, Closs, Cooper, and Bowersox
Supply Chain Logistics Management
Fourth Edition
Johnson, Leenders, and Flynn
Purchasing and Supply Management
Fifteenth Edition
Simchi-Levi, Kaminsky, and Simchi-Levi
Designing and Managing the Supply Chain:
Concepts, Strategies, Case Studies
Third Edition
BUSINESS RESEARCH METHODS
Schindler
Business Research Methods
Thirteenth Edition
PROJECT MANAGEMENT
Brown and Hyer
Managing Projects: A Team-Based Approach
First Edition
Larson and Gray
Project Management: The Managerial Process
Seventh Edition
SERVICE OPERATIONS MANAGEMENT
Fitzsimmons and Fitzsimmons
Service Management: Operations, Strategy,
Information Technology
Ninth Edition
MANAGEMENT SCIENCE
Hillier and Hillier
Introduction to Management Science:
A Modeling and Case Studies Approach
with Spreadsheets
Sixth Edition
Stevenson and Ozgur
Introduction to Management Science with
Spreadsheets
First Edition
BUSINESS FORECASTING
Keating, Wilson, and John Galt Solutions, Inc.
Business Forecasting and Predictive Analytics
with ForecastX™
Seventh Edition
LINEAR STATISTICS AND REGRESSION
Kutner, Nachtsheim, and Neter
Applied Linear Regression Models
Fourth Edition
BUSINESS SYSTEMS DYNAMICS
Sterman
Business Dynamics: Systems Thinking and
Modeling for a Complex World
First Edition
OPERATIONS MANAGEMENT
Cachon and Terwiesch
Operations Management
First Edition
Cachon and Terwiesch
Matching Supply with Demand: An Introduction
to Operations Management
Fourth Edition
Finch
Interactive Models for Operations and Supply
Chain Management
First Edition
Jacobs and Chase
Operations and Supply Chain Management:
The Core
Fourth Edition
Jacobs and Chase
Operations and Supply Chain Management
Fifteenth Edition
MANUFACTURING CONTROL SYSTEMS
Jacobs and Whybark
Why ERP? A Primer on SAP Implementation
First Edition
Jacobs, Berry, Whybark, and Vollmann
Manufacturing Planning & Control for Supply
Chain Management
Sixth Edition
Schroeder, Goldstein, and Rungtusanatham
Operations Management in the Supply Chain:
Decisions and Cases
Seventh Edition
Stevenson
Operations Management
Twelfth Edition
Swink, Melnyk, Cooper, and Hartley
Managing Operations across the Supply Chain
Third Edition
PRODUCT DESIGN
Ulrich and Eppinger
Product Design and Development
Sixth Edition
BUSINESS MATH
Slater and Wittry
Practical Business Math Procedures
Twelfth Edition
Slater and Wittry
Math for Business and Finance: An Algebraic
Approach
Second Edition
BUSINESS STATISTICS
Bowerman, O’Connell, Murphree, and Orris
Essentials of Business Statistics
Fifth Edition
Bowerman, O’Connell, and Murphree
Business Statistics in Practice
Ninth Edition
Doane and Seward
Applied Statistics in Business and Economics
Sixth Edition
Doane and Seward
Essential Statistics in Business and Economics
Second Edition
Lind, Marchal, and Wathen
Basic Statistics for Business and Economics
Ninth Edition
Lind, Marchal, and Wathen
Statistical Techniques in Business and Economics
Seventeenth Edition
Jaggia and Kelly
Business Statistics: Communicating with Numbers
Third Edition
Jaggia and Kelly
Essentials of Business Statistics: Communicating
with Numbers
First Edition
McGuckian
Connect Master: Business Statistics
>businessresearchmethods
Pamela S. Schindler
Wittenberg University
thirteenthedition
BUSINESS RESEARCH METHODS, THIRTEENTH EDITION
Published by McGraw-Hill/Irwin, a business unit of The McGraw-Hill Companies, Inc., 1221 Avenue of the
Americas, New York, NY, 10020. Copyright © 2019 by The McGraw-Hill Companies, Inc. All rights reserved.
Printed in the United States of America. Previous editions © 2014, 2011, 2008, and 2006. No part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system,
without the prior written consent of The McGraw-Hill Companies, Inc., including, but not limited to, in any network or other electronic storage or transmission, or broadcast for distance learning.
Some ancillaries, including electronic and print components, may not be available to customers outside the
United States.
This book is printed on acid-free paper.
1 2 3 4 5 6 7 8 9 0 LWI/LWI 1 0 9 8 7 6 5 4 3
Bound:
ISBN 978-1-259-91893-3
MHID 1-259-91893-9
Looseleaf:
ISBN 978-1-260-21009-5
MHID 1-260-21009-X
Portfolio Manager: Noelle Bathurst
Product Developer: Ryan McAndrews
Executive Marketing Manager: Harper Christopher
Content Project Managers: Erika Jordan and Angela Norris
Buyer: Susan K. Culbertson
Design: David Hash
Content Licensing Specialist: Melissa Homer
Cover Image: ©scanrail/Getty Images
Compositor: MPS Limited
All credits appearing on page or at the end of the book are considered to be an extension of the copyright page.
Library of Congress Cataloging-in-Publication Data
Schindler, Pamela S., author.
Business research methods / Pamela S. Schindler, Wittenberg University.
Thirteenth edition. | New York, NY : McGraw-Hill/Irwin, [2019] |
Earlier editions were co-authored with Donald R. Cooper.
LCCN 2017055982 | ISBN 9781259918933 (alk. paper)
LCSH: Industrial management–Research.
LCC HD30.4 .E47 2018 | DDC 658.0072/1—dc23
LC record available at
The Internet addresses listed in the text were accurate at the time of publication. The inclusion of a website does
not indicate an endorsement by the authors or McGraw-Hill, and McGraw-Hill does not guarantee the accuracy
of the information presented at these sites.
www.mhhe.com
To my soulmate and husband, Bill, for his sound counsel and unwavering support.
Pamela S. Schindler
Walkthrough
Preface
Addressing a Revolution in Student Learning
A transformation is taking place in many of our classrooms. During the last decade, more and more of our students have shifted from verbal to visual learners. Visual learners need pictures, diagrams, and graphs to clarify and reinforce what the text relates.
Integrated research process exhibits reveal a rich and complex process in a visual way. 31 fully integrated research process exhibits link concepts within stand-alone chapters. Each exhibit in this series shares symbols, shapes, and colors with others in the series. Exhibit 1-3 is the overview exhibit of the research process.
>Exhibit 1-3 The Research Process
Stage 1: Clarify the Research Question (Exploration). Stage 2: Design the Research (Data Collection Design; Sampling Design; Exploration). Stage 3: Collect and Prepare Data. Stage 4: Analyze and Interpret the Data. Stage 5: Report Insights and Recommendations.
Subsequent exhibits (like this one for measurement instrument development) show more detail in a part of this process.
>Exhibit 12-1 Instrument Design within the Research Process
Investigative Questions lead to Measurement Questions. Phase 1: Pretest individual questions; revise. Phase 2: Develop non-question elements and pretest instrument elements; revise. Phase 3: Develop the instrument, pretest the instrument, and revise until the instrument is ready for data collection.
Responsive to industry changes.
>chapter 17 An Integrated Example
Understand what is happening behind the scenes during a project.
"I approach each project with a new insecurity, almost like the first project I ever did. And I get the sweats. I go in and start working; I'm not sure where I'm going. If I knew where I was going I wouldn't do it."
Frank Gehry
Award-winning architect
>learningobjectives
After reading this chapter, you should understand . . .
1 How the various stages of and steps within the research process work together to complete a research project.
2 What decisions are made behind the scenes of a research project.
>chapter 16 Stage 5: Research Reports: Supported Insights and Recommendations
Research reports are increasingly oral and all about storytelling.
"If you're navigating a dense information jungle, coming across a beautiful graphic or lovely data visualization is a relief. It's like coming across a clearing in the jungle."
David McCandless
British data journalist, information designer, and author
David McCandless, "The Beauty of Data Visualization," TED, November 23, 2012, downloaded June 25, 2017 (/talks/david_mccandless_the_beauty_of_data_visualization).
"A well-crafted, thoughtful visualization makes the light bulb go off. You just don't get that with a spreadsheet."
Dana Zuber
associate director of analytics, Butler, Shine, Stern & Partners
downloaded June 22, 2017 (-dashboards-persuade-inform-and-engage.html?cid=70132000001HCNA&ls=Advertisement&lsd=DBM%20-%20Retarget%20-%20How%20to%20Build%20Dashboards&adgroup=Retarget%20-%20Dashboards&distribution=DBM&creative=building&dclid=CMHOwqmnnswCFUYagQodwwwKWQ#preview).
>learningobjectives
After reading this chapter, you should understand . . .
1 How changes in business and technology are changing research reporting.
2 How to plan an audience-centric report using audience analysis.
3 Different report structures and why each is used.
4 The types and specifications of various types of reports.
5 The organizational frameworks for reports.
6 Visualization and how to effectively use different support materials.
7 The role of compilation, practice, and delivery in achieving audience effect.
8 The ethical considerations in reporting research results.
>chapter 13 Stage 3: Collect, Prepare, and Examine Data
Clean data is critical to effective analysis.
>learningobjectives
After reading this chapter, you should understand . . .
LO13-1 The tasks of data collection.
LO13-2 The use of content analysis to postcode textual and verbal data.
LO13-3 The importance of editing raw data to assure it is complete, accurate, and correctly coded.
LO13-4 The exploratory data analysis techniques that provide visual representations of the data.
LO13-5 How cross-tabulation examines relationships between variables.
>chapter 6 Stage 2: Data Collection Design: Qualitative Research
All researchers need qualitative skills.
"The only way to capture a deeply personal insight, which will help you evoke that emotion in consumers, is through qualitative."
Gia Calhoun
global insights manager, Burt's Bees
>learningobjectives
After reading this chapter, you should understand . . .
1 The nature of qualitative research and its distinctions from quantitative research.
2 The types of business decisions that use qualitative methods.
3 The variety of qualitative research methods.
4 The importance and responsibility of the interviewer.
5 Ethical issues related to qualitative research.
>chapter 3 Stage 1: Clarify the Research Question
The research question is the basis of effective research.
"A beautiful question is an ambitious yet actionable question that can begin to shift the way we perceive or think about something—and that might serve as a catalyst to bring about change."
Warren Berger
consultant and author, A More Beautiful Question
"As big data increases, we see a parallel growth in the need for 'small data' to answer the questions it raises."
William C. Pink
senior partner, Creative Analytics
>learningobjectives
After reading this chapter, you should understand . . .
LO3-1 The question hierarchy.
LO3-2 The purposes and process of exploration.
LO3-3 How internal and external exploration differ.
LO3-4 The process and goal of research valuation.
LO3-5 The process and justification needed to budget for research.
LO3-6 Ethical issues at this stage of the process.
>chapter 1 Research Foundations and Fundamentals
You can't learn research without understanding the fundamentals.
>learningobjectives
After reading this chapter, you should understand . . .
LO1-1 How business research and data analytics complement each other.
LO1-2 The language of professional researchers.
>chapter 9 Stage 2: Data Collection Design: Survey Research
>snapshot
Internet Brings Prediction Research into 21st Century
Managers often must make decisions about the future. These
decisions offer high uncertainty. Research is designed to reduce
the risk, but simply asking people to predict their own behavior,
attitude, or reaction hasn’t worked well; we are notoriously poor
at the task. For example, in 1985 when individuals were asked to
predict their acceptance of Coke’s planned reformulation, they
predicted incredibly wrong and it cost Coca-Cola millions.
Historically, researchers have used the consensus prediction of experts (Delphi technique) to correct for the individual’s
poor predictive capabilities. However, not all situations offer a
logical panel of experts. James Surowiecki, in his The Wisdom
of Crowds, describes how a group of diverse individuals is able
to make decisions and predictions better than isolated individuals or experts. MIT researchers explain that people’s heightened
connectivity due to the Internet has brought about the “emergence of surprising new forms of collective intelligence.” As
social animals, people are getting good at noticing what others
are doing, sensing why they might be doing it, and predicting
what they will do. In a Pew Research study, Americans collectively predicted that 37 percent of Americans were obese, a fairly good predictor of the actual 31 percent who were so diagnosed.
Marcus Thomas (MT) needed a research method that was
fast and that would overcome client skepticism about the inaccuracy of self-reported anticipated versus actual behaviors for
its financial client. It chose to use a prediction market. “A prediction market is like an online stock investing game,” explained
Jennifer Hirt-Marchand, associate partner and strategic insights
executive for MT. “Traders ‘invest’ virtual dollars in ideas, products, assets, etc. to be tested. Based on the investments they
make, traders can win greater incentives if they invest in the
winning idea than an incentive they might earn by completing
a survey alone. This ‘skin in the game’ is a critical component
of the methodology, as it fosters engagement and thoughtfulness on the part of the traders. Its strength is that it doesn't
rely on asking individuals to make predictions about what they
would do in the future but rather what they think other people
would do.”
Using the services of a sample provider, an online survey
was sent to general population panelists. Likely participants
self-identified based on having an understanding of finance with
regard to estate planning, personal finances and investing, vacation planning, health care, etc. A thousand participants, known
as traders, were recruited from this group. While panel participants are compensated by the sample company for their regular
participation in research projects, panelists selected for this project could earn additional compensation based on the accuracy
Rich with examples, this
edition is a collaboration
with dozens of researchers.
Snapshots are research
examples from the
researcher’s perspective.
©Ridofranz/Getty Images
of their predictions. Those payouts would be determined by the
number of traders who invested in the “winning” group, as well
as the amount each trader invested in that group.
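The payout rule just described (traders who backed the winning vignette split the incentive pool, weighted by how much each invested) can be sketched in a few lines. This is a hypothetical illustration only: the trader names, amounts, pool size, and proportional-split rule are assumptions, not Marcus Thomas's actual formula.

```python
# Hypothetical sketch of the payout rule described above: traders who
# invested virtual dollars in the "winning" vignette split a bonus pool
# in proportion to their stakes. All names and numbers are invented.

def prediction_payouts(investments, winning_vignette, bonus_pool):
    """investments: {trader: {vignette: virtual_dollars}}.
    Returns {trader: payout} for traders who backed the winner."""
    winning_stakes = {
        trader: stakes.get(winning_vignette, 0)
        for trader, stakes in investments.items()
        if stakes.get(winning_vignette, 0) > 0
    }
    total = sum(winning_stakes.values())
    return {t: bonus_pool * stake / total
            for t, stake in winning_stakes.items()}

investments = {
    "trader_a": {"vignette_1": 600, "vignette_2": 400},
    "trader_b": {"vignette_2": 1000},
    "trader_c": {"vignette_1": 250, "vignette_3": 750},
}
payouts = prediction_payouts(investments, "vignette_1", bonus_pool=100.0)
```

Here trader_b, who invested nothing in the winner, earns no bonus, while trader_a and trader_c split the pool 600:250, which is the "skin in the game" incentive the quote describes.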
Through a continuation of the online survey, the selected
traders were first presented a written description of the new
financial service (each had previously agreed to a nondisclosure
agreement, as the product was in development). Then each
was provided six consumer profiles (called vignettes) one at a
time. Each vignette—developed based on consumer segmentation from prior research and extensive secondary research—
represented a possible purchaser group. It included a narrative,
describing the group as people, along with photographs bringing each group to life. Traders were each given $1,000 in virtual money to invest in one or more vignette groups—the ones
they thought would be most likely to purchase the new financial service. In addition, through open-ended questions, each
trader was asked to explain the reasons why they believed each
vignette group would or would not purchase.
Using this methodology, Marcus Thomas identified three
segments based on the best-choice vignettes—including one
that seemed unlikely at the outset—for its financial client. The
vignette that represented the client’s core segment for its main
product line failed to rank high in the prediction research for
the new financial service. Media/engagement touchpoints and messaging insights to reach the three best-choice groups were developed based on this research.
Some of the world's leading consumer and industrial companies rely on Cleveland-based Marcus Thomas LLC to create or refine their brands and drive customers to seek out and buy their products and services.
www.marcusthomasllc.com
>chapter 12 Stage 3: Measurement Instruments
If a topic deals with a sensitive subject, researchers may start the topic with buffer questions, designed to build rapport and put the participant at ease. These are broad, neutral questions on the topic that don't require the participant to take a stand on the sensitive issue. For example, "In the last 30 days, have you personally used a streaming service to watch a movie?" before asking "Should anyone be able to access movies with graphic sexual content with streaming services?" In tests, sensitive questions that followed buffer questions have been shown to extract markedly different responses compared with when participants are directly asked a sensitive question without buffers.5
Facilitate Topic and Measurement Question Sequencing
The design of measurement instruments is influenced by the need to relate each question to the others in the instrument. Often, the content of one question (called a branched question) assumes other questions have been asked and answered in a certain way. In computer-based instruments or computer-assisted instruments, such branching is handled by internal coding of the initial question. The PicProfile indicates a typical branch question; it reveals the elimination of alternatives not chosen in one question when asking the second question, thus shortening the participant's time.
Instructions also are a primary tool to facilitate sequencing. Three types of sequencing result in skip directions. These instructions indicate where the participant or interviewer should go within the instrument—a question, topic, or section—given one or a series of responses. These instructions can be embedded in the instrument (paper or computer-based) or provided to the interviewer. Computer-based instruments and computer-assisted interviewing make skipping fairly easy; once a pre-programmed response is entered, the computer automatically skips the participant ahead. The first type is a question-to-question skip: a question screens for experience or knowledge, and the participant is judged unable to answer the next question without it:
Example: In the last two weeks, have you used (product)?
Yes
No (If No, skip to Q3)
PicProfiles use a visual cue to enhance the concept or key term.
2. Which of the following attributes do you like about the automobile you just saw? (Select all that apply.)
Overall appeal
Design
Headroom
Color
Height from the ground
Other
None of the above
[Next Question]
3. For those items that you selected, how important is each? (Provide one answer for each attribute.)
(Scale: Extremely important | Neither important nor unimportant | Not at all important | Don't know)
a) Overall appeal
b) Height from the ground
c) Headroom
>picprofile
One of the attractions of using a web survey is the ease with which participants follow branching questions immediately customized
to their response patterns. In this survey, participants were shown several pictures of a prototype vehicle. Those who responded
to question 2 by selecting one or more of the attributes in the checklist question were sequenced to a version of question 3 that
related only to their particular responses to question 2. Note also that in question 3 the researcher chose not to force an answer,
allowing the participant to indicate he or she had no opinion (“Don’t know”) on the issue of level of importance.
A Closeup offers a more in-depth example.
>part II The Design of Business Research
>chapter 5 Stage 2: Sampling Design
>closeup
Who's Really Taking Your Surveys?
Early in panel development, panelists were offered $100 to $150 to join and participate in qualitative studies. Historically, these participants participated in longer engagements (online qualitative, ethnography studies, or face-to-face in-depth interviews). As random digit dialing became less productive for recruiting respondents and the Internet became more widely used, researchers started using panels—also called communities—to recruit participants for quantitative research. Quantitative respondents, who are engaged for much shorter periods, are more likely to be paid $1 to $10 per survey completed.
The professional respondent—one who takes repeated surveys—was once considered a deterrent to quality research. "As researchers," explained Jessica Broome, PhD, principal of Jessica Broome Research, "we wanted to keep the 'cheaters' and 'repeaters' out of our studies, believing they biased results."
As decision cycles shorten, the demand for better and more timely information means attracting and retaining qualified participants. During the last three decades, increasingly this means researchers turn to panels, and by design, these participants are asked to participate in numerous studies.
"We wanted to know 'Who are these people willing to take repeated surveys?'" explained Broome. "And given that some surveys are overly long and others poorly designed, 'Why do they do this?'" Broome teamed with Kerry Hecht, Director of Research Services, Recollective, a division of Ramius Corporation, to find out what motivates panel respondents and if their motivations are likely to reduce the quality of the information they provide.
The Qualitative Study
Broome and Hecht designed a multistage study that drew participants from multiple panel providers, including Critical Mix, Schlesinger Associates, and Swagbucks. "We started with a 5-day online qualitative community study with 20 people to explore what got them started as a survey panelist and what kept them going," explained Hecht. "While money is a motivator in keeping panelists engaged, they also shared the influences of intrinsic motivators like fun, feeling useful, contributing to important decisions, and participation being more interesting than time spent on social media."
For any particular study, panelists are often screened exclusively on demographics. "Because participants derive intrinsic benefits, panelists will sometimes fudge on screening information in an attempt to be included in a study," shared Broome. When a panelist doesn't meet the desired demographic parameters, they are told "You don't qualify" but are rarely told why. "But once they are included," explained Broome, "participants claim honesty drives their responses; they think lying on survey questions would undermine the study."
"Often panelists expressed feeling abused, misled, and disrespected. For example, when they are told about survey length, they often felt deceived when a promised 15-minute survey took 45 minutes, or when the survey was not only long, but boring," explained Broome. Increasingly, panelists in quantitative studies are tech-savvy. "They understand what current technology should permit a survey company to do—like eliminate the need to ask demographic questions repeatedly in the same survey process or use earlier answers to filter later questions," claimed Hecht. "They basically think researchers can make the experience so much better." "The research industry needs their insights, but can treat panelists with disdain," claimed Broome.
The Quantitative Study
Broome and Hecht followed their qualitative exploration study with Phase 2, a mobile-optimized quantitative study of 1,499 participants, also drawn from various panel providers and fielded by Propeller Insights. Each panelist took a topical survey that included a creativity assessment. Additionally, half the group (750) took a VARK assessment. VARK assesses visual, aural/audio, read/write, and kinesthetic learning preferences through a series of learning scenario questions. Creativity was assessed through a battery of 38 statements requiring agreement or disagreement, as well as a checklist of 54 descriptors. "We discovered that participants didn't favor any one of the learning approaches, nor were they outliers on the creativity assessment," shared Broome.
Broome and Hecht's Phase 2 research revealed factors essential for reducing respondent dropouts: Don't ask screening questions within a survey once a participant has answered a detailed screener to qualify; merge the data. Participants don't care that different companies do different aspects of the research; they want researchers to avoid duplication. Don't ask essentially the same question in different ways; once is enough. Use previous question responses to determine later questions asked; panelists are willing to share their thoughts, behaviors, and lives but don't want their time wasted. Respondents have different preferences and styles; let them respond as they wish—with text or video, exclusively on a mobile device or a computer. "They shared lots of ideas for making surveys more engaging and interesting," claimed Hecht, "including adding music or video, getting rid of grid questions, and reducing survey time to 15 minutes or less."
Using information extracted from an open question and analyzed with OdinText, Phase 2 research also revealed that "most participants were a member of one or two panels but struggled to remain engaged with their panels," shared Hecht. "Participants appreciate feeling like a valued part of a full process, not just an interchangeable cog in a wheel. They like to participate in studies that are interesting to them and about products or services that are relevant to their daily lives—knowledge is a powerful motivator." Many also cited meeting interesting people and hearing different viewpoints as motivation. "They also love learning the results of studies they participate in and understanding why we as researchers do what we do or ask what we ask," shared Broome.
Recently, a SurveyMonkey study also found panelists gave thoughtful, consistent answers over time. It released results of a 1,000-person international panel assessment study, using three surveys with the same respondents, one in each of three sequential months, which checked for quality-reducing behaviors like straight lining (repeatedly choosing the same answer choice in matrix questions), poor open response validity (responding with nonhelpful, gibberish answers), and whether they were unfocused or not paying attention. Of the panelists, 97 percent, 97 percent, and 94 percent or more passed each of these tests, respectively, with no difference between men and women. And in terms of response reliability, over the three waves of surveys, of 23 indicators SurveyMonkey tracked, only three items showed statistically significant change among U.S. participants: time-to-complete, choice of the "other" response, and attitude about "moral acceptability of alcohol use."
Sources: jessicabroomeresearch.com; recollective.com; surveymonkey.com
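A data-quality screen like the straight-lining check mentioned in the closeup can be implemented very simply. The function name and the 0.9 threshold below are assumptions for illustration, not SurveyMonkey's actual rule.

```python
# Illustrative check for "straight lining": a respondent choosing the same
# answer on (nearly) every row of a matrix (grid) question. The 0.9
# threshold is an arbitrary choice for this sketch, not an industry rule.

def is_straight_lining(matrix_answers, threshold=0.9):
    """matrix_answers: list of answer choices for one grid question.
    Flags the respondent if one choice dominates the rows."""
    if len(matrix_answers) < 3:        # too few rows to judge
        return False
    top_share = max(matrix_answers.count(a) for a in set(matrix_answers))
    return top_share / len(matrix_answers) >= threshold

assert is_straight_lining(["Agree"] * 8)                     # flagged
assert not is_straight_lining(["Agree", "Neutral", "Disagree", "Agree"])
```

A real panel-quality pipeline would combine several such screens (speeding, gibberish open responses, inattention traps) before excluding anyone, since any single indicator can misfire.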
a particular software training strategy, we infer that others will also. The basic idea of taking a sample is
that by selecting some cases in a population, we may draw conclusions about the entire target population.
There are several compelling reasons for using a sample (a subset of the target population) rather than
a census (all cases within a population), including (1) lower cost, (2) greater speed of data collection, (3)
availability of population cases, and (4) greater accuracy of results. The advantages of taking a sample
over census are less compelling when two conditions exist: (1) a census is feasible due to a small target
population and (2) a census is necessary when the cases are quite different from each other.5 When the
population is small and variable, any sample we draw may not be representative of the population from
which it is drawn. The resulting values we calculate from the sample are incorrect as estimates of the
population values.
Lower Cost
The economic advantages of taking a sample rather than a census are massive. Consider the cost of
2 The Research Process: An Overview
taking a census. The 2020 U.S. Census of the Population is expected to cost as much as>chapter
$30 billion,
barring any natural disasters and using 2010 costs as a barometer; the Census Bureau’s own estimate is
$22billion.6 By any reasonable standard, continuing to take a census of the population every 10 years is
promotions,
training
experiences,
projectinassignments,
project leadership,
newshould
customers
captured, and
unsustainable.
Is it any
wonder
that researchers
all types of organizations
ask, “Why
we spend
so on.
businesses
track these
KPIs through
maintained
thousands
of Sophisticated
dollars interviewing
thousands
of employees
in ourdigitized,
companyperpetually
if we can find
out whatdashboards—
we
data visualization
thathundred?”
shows current and a period of prior status on each metric, usually on one
need toa know
by asking lytool
a few
screen. Even small businesses can dashboard KPIs create with the software tools available.
Identifying opportunity-based management dilemmas is more time consuming and difficult. It requires monitoring obscure developments in a variety of industries, as well as emerging trends in your own industry. Those companies looking for opportunities are trying to be first movers; they want to capitalize on an environmental trend that others haven't recognized. Jeff Bezos, chief executive officer (CEO) of Amazon, is always seeking these kinds of dilemmas: customers getting frustrated at not finding the solutions they sought at stores, companies needing a more reliable delivery option, readers wanting to take more books with them when they travel than luggage space permits, etc. The search for opportunity-based management dilemmas is full of risk, but when the approach pays off, it often pays big.

However, choosing one dilemma on which to focus may be difficult. Solving different dilemmas offers different rewards. One approach is to estimate the payoff of solving a dilemma, using this value to prioritize. This is more easily done for problems (e.g., estimating the cost of reducing customer returns) than for opportunities. Ultimately, to choose incorrectly puts a company or business on an unproductive path. As a manager, only practice makes you proficient at this task. For new managers, or for established managers facing new responsibilities, developing several management dilemma-to-research question hierarchies, each starting with a different dilemma, will assist in the choice process. To develop these, much exploratory research is used, tapping into published secondary sources and mining company data.

Greater Speed of Data Collection
Due to the smaller number of cases in a sample, using a sample drawn from a target population will always take less time than conducting a census.
Exploration
Seeking existing information is often used not only to identify dilemmas (e.g., identify industry standards to compare to company performance), but also to ask the right questions and better understand decision options. Historic company data are increasingly being used in exploration as better tools to tap into digital data warehouses have become available.1 Much of this information may not be in searchable databases; it may be in written reports, where accessibility is an issue. At this stage, it always pays the
We are attracted to experiments at an early age due to our unlimited curiosity. A great researcher fosters curiosity as an important skill. ©Blend/Image Source

Images are worth more than 1,000 words; they serve as visual cues to anchor concepts in memory.
>part II The Design of Business Research
interview 129
  semistructured 130
  structured 130
  unstructured 130
interview guide (discussion guide) 131
moderator 129
nonprobability sampling
pretasking 127
projective techniques 133
  ambiguities and paradoxes 133
  authority figure 133
  brand mapping
  cartoons or empty balloons 133
  collage 133
  completion/fill in the blank 133
  component sorts 133
  creative innovation role play 133
  imagination exercises 133
  imaginary universe 133
  laddering or benefit chain 133
  metaphor elicitation technique
  personification 133
  projective drawing 133
  role playing 133
  semantic mapping
  sensory sorts 133
  storytelling 133
  Thematic Apperception Test 133
  third-party projection 133
  visitor from another planet 133
  word or picture association 133
  write a letter 133
qualitative research 124
quantitative research
triangulation 143

Using learning aids to cement concepts.
>discussion questions
Discussion questions tie to
learning objectives and
come in four types.
Terms in Review
1 How does qualitative research differ from quantitative
research?
2 What is data saturation, and how does it influence qualitative research?
3 What is the nature of data collected in qualitative research?
>part VI Stage 5: Report the Research
style (visual, auditory, or kinesthetic) and the level of pathos, logos, and ethos needed to persuade the audience. Visualization includes selection and preparation of the best possible support materials to encourage the audience to embrace the research insights and recommendations. There are several types of support a researcher might use: facts, statistics, stories, demonstration, testimony/expert opinion, analogy, and metaphor. Facts and statistics are the core of most research reports, but these should not be the only support provided. Presenting statistics presents special challenges for researchers, given the possibility that the audience might misunderstand or misconstrue findings if incorrectly presented.

LO16-7 The last three activities within the content and style phase are compilation, practice, and delivery. Compilation includes not only the preparation of the written report to include all the support materials, but also the development of the oral presentation and its aids. It is the selection of the oral report within the report structure that determines the need for practice. During practice, the researcher determines the time it takes to explain various insights, marks a script or notecards to indicate pauses, experiments with various audience-engagement exercises, cues audio and video clips, prepares responses to anticipated questions, stages the presentation environment (seating arrangement, lighting, sound, screen placement, etc.), and rehearses contingency plans for things that might go wrong. Practice is the difference between a stellar result in achieving audience effect and disappointment.

LO16-8 Whether the researcher is an employee of the research sponsor or an external supplier, to achieve and maintain an effective researcher–sponsor relationship, the report must fulfill several ethical responsibilities, including (1) the sponsor's right to quality research, (2) the absence of researcher coercion, and (3) the sponsor's right to findings nondisclosure. The first requires choosing visualization techniques and data reporting tools appropriate for the data collected and maximizing the sponsor's value. The second requires explaining and maintaining the researcher's role, living within the scope of the data collected, and not violating participant confidentiality. The third requires controlling the safety and distribution of any data reports, support materials, and presentation materials printed or maintained on the researcher's or other cloud servers.

Relating questions to newsworthy businesses makes them more relevant to students.

4 Why do senior executives feel more comfortable relying on quantitative data than qualitative data? How might a qualitative research company lessen the senior-level executive's skepticism?
5 Distinguish among structured, semistructured, and unstructured interviews.
Making Research Decisions
6 Assume you are a manufacturer of small kitchen electrics, like Hamilton Beach/Proctor Silex, and you want to determine if some innovative designs with unusual shapes and colors developed for the European market could be successfully marketed in the U.S. market. What qualitative research would you recommend, and why?
7 NCR Corporation, known as a world leader in ATMs, point-of-sale (POS) retail checkout scanners, and check-in kiosks at airports, announced in June 2009 that it would move its world headquarters from Dayton (OH) to Duluth (GA), a suburb of Atlanta, after more than 125 years. An employer of 1,200 mostly high-salaried, professional workers in Dayton, NCR was enticed to move by Georgia's offer of more than $56.9 million in tax credits; its fast-growing, educated 25- to 34-year-old population cohort; international offices for 10 European state governments; and the busiest international airport (Atlanta) in the world.
a. What qualitative research might NCR have done to reach
this decision?
b. NCR will use its move to Georgia to downsize its world
headquarters workforce. What qualitative research could
help NCR determine which of its 1,200 employees will be
offered positions in Duluth?
From Concept to Practice
8 Use Exhibit 6-6 to develop the recruitment screener for the
research you described in your answer to question 5.
9 Conduct a focus group among students in your class on one
of the following topics:
a. The department’s problems offering requirements
and electives essential for meeting your graduation
expectations.
b. Entertainment sponsored by your university to bring the
community on campus.
From the Headlines
10 Lately, airlines have been having a rough time, in terms
of legal actions and PR issues, with consumers openly
expressing outrage at being bumped from—or forcibly removed from—flights. Design a qualitative study to reveal the
suppressed (as opposed to surface) issues that are contributing to this rage.
a. What are some of the surface issues?
b. Who will you want to participate and how will you recruit
them?
c. What qualitative method(s) will you choose? Be specific about any exercises you will incorporate.
>keyterms
3-D graph 456
actionable insights 434
analogy 447
anchoring bias 438
area graph 451
audience analysis 437
audience-centric planning 436
auditory learners 445
bar graph 454
confirmation bias 438
conformity bias 438
data 434
data-centric planning 436
data clarity 451
demonstration 447
desired audience effect 436
ethos 446
executive summary 441
facts 446
findings nondisclosure 461
geograph 456
graph 450
infographic 459
information 434
insight 434
jargon 458
kinesthetic learners 445
language level 458
limitations 442
line graph 451
logos 446
loss-aversion bias 438
management report 440
metaphor 447
pathos 446
performance anxiety 459
pictograph 456
pie graph 451
predispositions 438
report framework 443
report structure 439
right to quality 460
scope 441
statistics 446
story 447
survivorship bias 438
table 449
technical report 440
testimony/expert opinion 447
tone 458
visualize 444
visual learners 445
whitespace 457

Key terms are a valuable refresher, in each chapter and in the glossary.

>glossary
3-D graphic a presentation technique that permits a graphical comparison of three or more variables; types include column, ribbon, wireframe, and surface line.
a priori contrasts a special class of tests used in conjunction with the F test that is specifically designed to test the hypotheses of the experiment or study (in comparison to post hoc or unplanned tests).
acquiescence bias a tendency for participants to agree with an item or statement within a measurement question that asks for levels of agreement/disagreement; occurs when they have less knowledge on a topic; more a problem for less educated or less informed participants.
action research a methodology with brainstorming followed by sequential trial-and-error to discover the most effective solution to a problem; succeeding solutions are tried until the desired results are achieved; used with complex problems about which little is known.
actionable insights insights aligned with key business goals and strategic initiatives that are novel, unusual, or unexpected and that lead to recommendations for specific decisions.
administrative question a measurement question that identifies the participant, interviewer, interview location, and conditions; generates nominal data.
after-only design preexperimental design that takes one measurement of the DV after manipulation of the IV.
alternative hypothesis (HA) an assumption that a difference exists between the sample parameter and the population statistic to which it is compared; the logical opposite of the null hypothesis used in significance testing.
ambiguities and paradoxes a projective technique (imagination exercise) in which participants imagine a brand applied to a different product (e.g., a Tide dog food or Marlboro cereal), and then describe its attributes and position.
analogy a rhetorical device that compares two different things to highlight a point of similarity.
analysis of variance (ANOVA) tests the null hypothesis that the means of several independent populations are equal; test statistic is the F ratio; used when you need k-independent-samples tests.
attitude a learned, stable predisposition to respond to oneself, other persons, objects, or issues in a consistently favorable or unfavorable way.
attitude scaling process of assessing a person's disposition (from an extremely favorable disposition to an extremely unfavorable one) toward an object or its properties using a number that represents a person's score on an attitudinal continuum.
audience analysis an analysis of the expected audience for a
research report.
audience-centric planning a research report orientation whose
focus is on gaining the audience's embrace of data insights
and recommendations; the resulting presentation is persuasive
and tells a story employing statistics.
auditory learners audience members who learn through listening;
represent about 20 to 30 percent of the audience; implies
the need to include stories and examples in research
presentations.
authority figure a projective technique (imagination exercise)
in which participants are asked to imagine that the brand or
product is an authority figure and to describe the attributes of
the figure.
automatic interaction detection (AID) a data partitioning procedure that searches up to 300 variables for the single best
predictor of a dependent variable.
balanced rating scale has an equal number of categories above
and below the midpoint or an equal number of favorable/
unfavorable response choices.
bar graph a graphical presentation technique that represents
frequency data as horizontal or vertical bars; vertical bars are
most often used for time series and quantitative classifications
(histograms, stacked bar, and multiple-variable charts are
specialized bar charts).
Bayesian statistics uses subjective probability estimates based
on general experience rather than on data collected. (See
“Decision Theory Problem” at the Online Learning Center.)
behavior cycle how much time is required for a behavior and
between behavior events; used with behavior frequency in
Glossary reinforces the
language of research.
>part III Measurement
Summated Rating Questions
The Likert scale, developed by Rensis Likert (pronounced Lick-ert), is the most frequently used variation
of the summated rating question. Questions based on summated rating scales consist of statements that
express either a favorable or an unfavorable attitude toward the object of interest. The participant is asked to
agree or disagree with each statement. Each response is given a numerical score to reflect its degree of attitudinal favorableness, and the scores may be summed to measure the participant’s overall attitude. Summation is not necessary and in some instances may actually be misleading, as our caution below clearly shows.
In Exhibit 11-8, the participant chooses one of five levels of agreement. This is the traditional Likert scale because it meets Likert's rules for construction and testing. The numbers indicate the value to be assigned to each possible answer, with 1 the least favorable impression of Internet superiority and 5 the most favorable. Likert scales may also use 7 and 9 scale points. Technically, such a question is a Likert-type question, as its construction is less rigorous than the process Likert created. However, the advantages of the 7- and 9-point scales are a better approximation of a normal response curve and extraction of more variability among respondents.
Conscientious researchers are careful that each item meets an empirical test for discriminating ability between favorable and unfavorable attitudes. Originally, creating a Likert scale involved a procedure
known as item analysis. Exhibit 11-9 provides the steps for selecting Likert statements (items) for the
scale using item analysis. The values for each choice are normally not part of the measurement instrument, but they are shown in Exhibit 11-10 to illustrate the scoring system.
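The summated scoring just described, together with the rule from Exhibit 11-9 that negatively worded statements are reverse-coded, can be sketched as a small helper. This is an illustrative sketch, not the textbook's code; the 5-point default and the item identifiers are assumptions.

```python
def likert_total(answers, reversed_items=frozenset(), points=5):
    """Score one participant on a summated (Likert) rating scale.

    answers: dict mapping item id -> chosen agreement level (1..points).
    reversed_items: ids of negatively worded statements, whose values are
    flipped so a higher number always means a more favorable attitude.
    """
    return sum(
        (points + 1 - score) if item in reversed_items else score
        for item, score in answers.items()
    )

# Hypothetical three-item scale where "q2" is negatively worded:
# 4 + (6 - 2) + 5 = 13
total = likert_total({"q1": 4, "q2": 2, "q3": 5}, reversed_items={"q2"})
```

As the caution above notes, such a total is meaningful only after the items have passed a discrimination test; summing untested statements can mislead.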
“How-to” exhibits
and Appendices help
students DO research.
>Exhibit 11-9 How to Perform a Likert Item Analysis
Item analysis assesses each item (statement) in a Likert scale based on how well it discriminates between those
people whose total score is high and those whose total score is low.
Step 1
Collect a large number of statements that meet the following criteria:
●
Each statement is relevant to the attitude being studied.
●
Each statement reflects a favorable or unfavorable position on that attitude.
Step 2
Select people similar to study participants (participant stand-ins) to read each statement.
Step 3
Participant stand-ins indicate their level of agreement with each statement, using a 5-point scale.
A scale value of 1 indicates a strongly unfavorable attitude (strongly disagree). A value of 5 indicates
a strongly favorable attitude (strongly agree). The other intensities (2 (disagree), 3 (neither agree nor
disagree), 4 (agree)) are mid-range attitudes (see Exhibit 11-3).
●
To ensure consistent results, the assigned numerical values are reversed if the statement is worded
negatively. The number 1 is always strongly unfavorable and 5 is always strongly favorable.
Step 4
Add each participant stand-in's responses to secure a total score.
Step 5
Array these total scores from highest to lowest; then select some portion—generally defined as
the top and bottom 10 to 25 percent of the distribution—to represent the highest and lowest total
scores.
●
The two extreme groups represent people with the most favorable and least favorable attitudes
toward the attitude being studied. These extremes are the two criterion groups by which individual
Likert statements (items) are evaluated.
●
Discard the middle group's scores (50 to 80 percent of participant stand-ins), as they are not highly
discriminatory on the attitude.
Step 6
Calculate the mean scores for each scale item among the low scorers and high scorers.
Step 7
Test the mean scores for statistical significance by computing a t value for each statement.
Step 8
Rank order the statements by their t values from highest to lowest.
Step 9
Select 20–25 statements (items) with the highest t values (statistically significant difference between
mean scores) to include in the final Likert scale.
Researchers have found that a larger number of items for each attitude object improves the reliability of the Likert
scale. As an approximate indicator of a statement’s discrimination power, one authority suggests using only those
statements whose t value is 1.75 or greater, provided there are 25 or more participant stand-ins in each group. See
Exhibit 11-5 for an example.
Source: Allen L. Edwards, Techniques of Attitude Scale Construction (New York: Appleton-Century-Crofts, 1957), pp. 152–54.
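The criterion-group procedure in Exhibit 11-9 can be sketched in a few lines. This is a minimal illustration, not code from the text: it assumes responses are already reverse-coded, takes a fraction of the highest and lowest total scorers as the criterion groups, and computes an unequal-variance t value for each statement.

```python
import math

def item_t_values(responses, extreme_frac=0.25):
    """Likert item analysis (criterion-group method, per Exhibit 11-9).

    responses: one list of item scores (1-5, reverse-coded where needed)
    per participant stand-in. Returns a t value per statement, comparing
    the high-total and low-total criterion groups (Steps 4-7).
    """
    k = max(2, int(len(responses) * extreme_frac))  # extreme-group size
    ranked = sorted(responses, key=sum)             # array by total score
    low, high = ranked[:k], ranked[-k:]             # middle group discarded

    def t_for(item):
        lo = [person[item] for person in low]
        hi = [person[item] for person in high]
        m_lo, m_hi = sum(lo) / k, sum(hi) / k
        v_lo = sum((x - m_lo) ** 2 for x in lo) / (k - 1)  # sample variances
        v_hi = sum((x - m_hi) ** 2 for x in hi) / (k - 1)
        se = math.sqrt(v_hi / k + v_lo / k)
        return (m_hi - m_lo) / se if se else float("inf")

    return [t_for(i) for i in range(len(responses[0]))]
```

Statements would then be ranked by t (Step 8) and, following the rule of thumb quoted above, retained only when t is 1.75 or greater with 25 or more stand-ins in each group.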
Observation Environment
Observation studies can be designed for the field or laboratory. In business research, field studies may
take place at a customer’s home, the shopping environment, an employee work area (plant, office, distribution center), a supplier’s location, and more. Field studies, offering a natural setting, are most likely to
obtain unaltered behavior, especially when the observer isn’t directly involved.
Laboratory studies are most likely to provide data protection. When specialized equipment is needed for observation (e.g., eye-tracking cameras, heart rate monitors, galvanic skin response machines, etc.), laboratory
settings are often the choice. We’ve had some success with employing eye-tracking via a subject’s laptop and
tablet cameras. Laboratory settings are obviously more expensive, usually involve smaller sample sizes, and
pose more difficulties in recruiting subjects. Laboratory observations can be part of an experimental design.
>snapshot
Observation and Police Cameras
If you read or watch the news, you’ll know that urban areas have
had a sharp increase in questioned—if not questionable—police
actions with regard to excessive use of force, and bystander videos have played an increasing role in judging police actions. In Graham v. Connor (1989), the U.S. Supreme Court held that an officer's actions "must be judged from the perspective of a reasonable officer, rather than with the 20/20 vision of hindsight."
In an article in The Atlantic, Seth W. Stoughton, a law professor and
former police officer, and Geoffrey Alpert, a professor of criminology, both at the University of South Carolina; along with Jeff Noble,
a police consultant based in Orange County, California, write, “The
aversion to what officers derisively refer to as “‘second-guessing’
. . . [makes] officers less receptive to a critique of their actions . .
. [and] makes them reluctant to provide their own complete and
honest critiques.” Yet nationwide, we’ve seen a demand for police
to change and for police decisions to be more transparent, resulting in a clamoring for use of police body and cruiser cameras.
Do you believe you get a true picture of an incident when you
see a body-mounted or dash-mounted video? Stoughton, who
also consults with law enforcement agencies, has choreographed
a series of videos to demonstrate the answer to this question. His
use parallels an observation study based on respondents watching video footage of mock police incidents. Using a series of
chest-mounted or dash-mounted cameras and bystander videos,
he shows just how difficult it is to arrive at an accurate conclusion
when using only a police-cruiser or body-cam video.
Chest-mounted cameras during interactions or pursuits often create jerky movements and wildly distorted images. Stoughton calls this "deceptive intensity"; it creates the impression that the officer is under attack when he might not be. In an interception incident video, using a dash-mounted camera involving a fleeing suspect and Taser use by an officer, accuracy is related to vantage point. The body camera doesn't reveal the use of a Taser or the absence of a gun, while video shot by a bystander does. "When video allows us to look through someone's eyes, we tend to adopt an interpretation that favors that person," explains Stoughton, a psychological phenomenon known as "camera perspective bias." Stoughton's research also reveals that the degree to which the viewer trusts or distrusts the police influences his or her video interpretation. So while the bystander might not know any facts leading up to the incident, his or her camera has its own bias. He concludes that video evidence interpretation depends, therefore, on perspective as well as bias.

So in observation research, should we consider the camera as an accurate, unbiased observer? See for yourself. Check out the videos on Connect.

©Aaron Roeth Photography

Connect resources enrich and engage. Cases, video, sample projects, templates, appendices, and more.
>preface
It’s serious business to revise a book that has been a worldwide leading text for more than three decades, one that’s
published in eight international editions and is published
in eight languages. The process of text writing and the
speed with which revisions are developed don’t often permit a complete overhaul. But for this edition, given major
changes in how students and professors use text material
and how professors are teaching, as well as major changes
in the research industry, a fresh approach was necessary.
To address major industry changes and both professor and
student needs, my McGraw-Hill team and I came up with a
process and plan to deliver what students are seeking while
giving professors what they need. The approach involved
the following:
• Reflect changes in student learning and teacher
pedagogy:
• Separate
• Streamline and simplify
• Clarify
• Reflect the current state of the research industry:
• Rethink everything
• Collaborate
Reflect Educational Changes
A deep dive into educational articles, as well as comments from our reviewers and numerous teaching colleagues and students, presents a picture of some major changes in student learning and professors' use of textbooks.
• An increasing percentage of students don’t buy
books; some professors don’t require this of them
based on the way text material is or is not used in
the course.
• Students often enter classrooms without a foundation for that day’s discussion or activities.
• Professors see students disengaging from important
tasks of the learning process: self-preparation and
self-learning.
• Some professors—even educational institutions—are
choosing to craft their own books, drawing only
those chapters from one or multiple textbooks that
are critical for their instructional approach.
• Customized books omit topics and tools deemed
unnecessary for a given course based on its number
of credit hours or the chosen pedagogy. Students
revealed that omitted material often leaves them with
voids affecting their understanding.
• Professors want chapters that stand alone, so they
may order chapters however they choose to teach the
material.
• Many professors want chapters to focus on the essential material of the topic of each chapter, not have that
material spread intermittently throughout the book.
• Professors want any text-embedded examples to
enhance clarity of a concept.
• Students want unnecessary material eliminated; to
them, “unnecessary” means background and history
of a concept, not practical examples.
• Students believe that in a course like business research,
they should be able to DO research after they learn,
not just be able to describe it.
• Students are expected to apply what is learned in
their research methods course in more advanced
classes; their research methods text needs to serve as
a reference manual.
This revision accomplishes the above in the following ways:
• Separate. This edition:
• Eliminated elements that artificially tie chapters
together.
• Reinvented the use of elements that share crosschapter features.
• Streamline and simplify.
• At the book level, this edition:
◦◦Changed the number and order of chapters.
◦◦Addressed writing and reading level by choosing more widely accepted vocabulary, rather
than jargon, to describe and explain; shortening sentences; and employing a more approachable journalist style.
• At the chapter level, this edition:
◦◦Moved some material from one chapter to
another where it had a more logical fit.
◦◦Reorganized material in each chapter to make it flow more logically (always putting A before B).
◦◦Removed from every chapter material that might
be “nice to know” but wasn’t “critical to know.”
◦◦Assessed the value of every exhibit and example and made appropriate changes.
◦◦Removed multiple terms for the same concept.
◦◦Held the list of Key Terms to research-specific
terms.
• Clarify.
• Students are often frustrated with textbooks;
books define concepts using other concepts students don’t fully understand. This edition redefines dozens of key terms to remedy this problem.
• Students want examples to be relevant to them.
This edition chose business research projects to
profile by choosing behaviors, issues, or brands
students might know or embrace.
• Students want to understand; a definition alone
doesn’t achieve this. In this edition, any concept
important enough to include has more than a
definition.
• Students prefer one term for a single concept;
they find multiple terms aggravating and confusing. This edition uses one term for one concept,
with alternatives relegated to the glossary.
• Students want the key terms list for each chapter
to be comprehensive but to exclude terms that
aren’t critical. After conferring with researchers
and professors, several concepts were removed
from the key terms lists and the glossary, while
others—having achieved more common use in the
industry—were added.
• Students want, and need, to be able to DO
research. This edition delivers:
◦◦One chapter for basic foundational concepts.
◦◦One chapter with everything needed to craft
measurement questions.
◦◦One chapter with everything needed to develop
a measurement instrument.
◦◦One chapter each for each of the data collection methods.
◦◦One chapter for preparing data for subsequent
detailed analysis.
◦◦One chapter for reporting research.
◦◦“How-to” guides for difficult tasks and helpful
tips for other tasks in exhibits and in several
new appendices (Better Tables, Better Reports,
Sample Computer-Based Questions and Guidelines for Mobile Q.).
◦◦A new chapter, An Integrated Example, that
provides an insider’s view of a research project
from management dilemma to research report.
• Students need visual cues to process and retain
information:
◦◦This edition provides 66 new photo visual cues
to help them remember material.
◦◦This text uses a series of exhibits linked to the
research process; these use common shapes
and colors. Every exhibit in the 31-exhibit
process series of exhibits has been subtly or
substantially redesigned; the series includes
two new exhibits.
◦◦Concepts are now strongly linked to the five
stages in the primary research process model.
• Students and faculty alike want ways to assess
student understanding of material. This edition
offers a new resource: Connect. Connect provides
students computer-based assessment exercises,
which encourage practice and provide instant
feedback on mastery of material. Use of Connect
improves understanding and recall. Connect provides instructors student- and class-level analytics
to improve subject, class, and course decisions.
Reflect the Research Industry
As in prior editions, the use of various interim GreenBook
Research Industry Trends (GRIT) reports guided the
research for this revision. The 2016 GRIT Report is based
on the largest study of research suppliers and users ever
conducted.
• Rethink. This edition:
• Makes a clear distinction between research and
data analytics.
• Removed topics that are no longer relevant.
• Reduced coverage of topics showing waning
importance.
• Enhanced coverage of topics the industry has
embraced.
• Redesigned exhibits to reflect industry changes; of
the more than 200 exhibits in this edition, 31 are
completely new, and an additional 55 have been
updated or redesigned.
• Collaborate. In an industry that is changing very
quickly, any revision depends on people on the front
lines of research. Hundreds of emails and conversations, numerous books and articles, and almost 100
webinars and presentations have influenced this
edition.
• Of the 56 Snapshots, PicProfiles, and Closeups
featured in this edition, 79 percent are new (32) or
updated (12). Topics in these rich research stories
cover cyber security, prediction markets, sentiment
analysis, why data analytics isn’t delivering results,
programmatic ad buying, millennials and housing, the art of asking questions, using interviews
to define the management question, learning from
Pixar to tell research stories, automated secondary data searchers, agile research, performance
management research, who’s taking surveys, digital
transformation, use of smartphones, eye tracking,
observation with body cameras, experiments in
employee health, use of robots, experimental labs,
gaming, packaging redesign, question banks, survey engagement, infographic reports, coding word
data, data insights, finding best practices, presentation venues, and much more.
• Discussion questions, especially those labeled
From the Headlines, cover Chipotle’s reputation,
BMW and electric cars, Uber software that
excludes neighborhoods and buildings, shifting
jobs to robots, airline safety, Delta’s reorganization of LAX, Dolby’s experiments with theater
light and sound, Mercedes-Benz and self-driving
cars, Walmart and Nabisco’s Oreo O’s cereal,
Kohl’s department store and Apple Pay, performance-enhancing drugs in the workplace, Toyota
and public confidence, and more.
Keep the Features Adopters Love
• Critical Core Content. The materials adopters have
loved for decades are still the core of this edition. In
an attempt to make the book more flexible to current
instructional methodologies, we haven’t abandoned
what made the book an industry leader.
• Strong Learning Objectives and Summaries. Every
chapter has new learning objectives. The summaries
are comprehensive, knowing sometimes these are the
only material a student has time to read before class.
• Multipurpose Discussion Questions. These can serve
as review for students, as testing exercises, or as
options for vibrant class discussions because many
reflect real-business situations.
• Versatile Appendices. End-of-chapter and end-of-text
appendices for information that, given the differing
skills and knowledge of their students, professors
may want to emphasize or exclude. New appendices
relate to building Better Tables and offering tips on
Better Reports; to address mobile and other types of
computer-delivered measurement instruments, there
is the appendix Sample Computer-Based Questions
and Guidelines for Mobile Q. We retained end-of-chapter appendices related to More Effective Measurement
Questions and Calculate Sample Size.
Professors sometimes use writing a proposal as
an end-of-term project or testing exercise. As a result,
Appendix A has been rewritten with three exercises in
mind: writing a formal proposal, creating an RFP, and
assessing a proposal submitted in response to an RFP.
Other end-of book appendices offer a professional
focus group discussion guide (B), cover nonparametric statistics (C), and provide statistical tables (D).
Use the Cloud
We offer a comprehensive set of teaching and learning
resources for Business Research Methods for faculty in
Instructor Resources within Connect and for students at
www.mhhe.com/Schindler13e. You’ll find the following:
• Written Cases. Cases offer an opportunity to tell
research stories in more depth and detail. You’ll also
find cases about hospital services, lotteries, data mining, fundraising, new promotions, and website design,
among other topics, featuring organizations like
Akron Children’s Hospital, Kelley Blue Book, Starbucks, Yahoo!, the American Red Cross, and more.
• Video Cases and Supplements. New to this edition is
a video supplement about an experiment in observation using body cameras; it should be ideal for discussing error in observation research. Additionally,
several short segments drawn from a two-hour metaphor elicitation technique (MET) interview should
be invaluable in teaching students to conduct almost
any type of individual depth interview and to explain
the concept of researcher–participant rapport.
Four of our video cases were written and produced
especially to match the research process model and
feature noted companies: Lexus, Starbucks, Wirthlin
Worldwide (now Harris Interactive), Robert Wood
Johnson Foundation, GMMB, Visa, Bank One,
Team One Advertising, U.S. Tennis Association,
Vigilante New York, and the Taylor Group.
• Data Files. If your course doesn’t involve a project
where students collect their own data, use one of the
cases here that contain data.
• Sample Student Project. Visualization of the finished
deliverable is crucial to creating a strong research
report; use this sample project as a model or have
students critique it.
• Appendices. You’ll find helpful appendices within
Connect: Bibliographic Database Searches,
Advanced Bibliographic Searches, Complex Experimental Designs, Test Markets, and Pretesting Options and
Discoveries.
• Articles, Samples, and Templates. Students often
need to see how professionals do research to really
understand the research process. You'll find a sample EyeTrackShop report, a Nielsen report on using
U.S. Census data, an Excel template for generating
sample data displays, and more.
• Multivariate Analysis: An Overview is a chapter for
the benefit of graduate students who use Business
Research Methods.
• Instructor’s Manual (instructors only).
• Web Exercises. Due to the ever-changing nature of
web URLs, you’ll find these exercises here.
• Written and Video Case Discussion Guides.
• Additional Business Research Examples for
discussion.
• Test Bank (instructors only).
Collaborators
Research industry collaborators are the lifeblood of this
textbook writer. The following people collaborated directly
on this edition or connected me with those who did: Andy
Peytchev, Research Triangle Institute; Bella Tumini, Suja;
Betty Adamou, Research Through Gaming Ltd.; Cassandra McNeill, GutCheck; Colin McHattie, iTracks; Dan
Weber, iTracks; Daniel Enson, Toluna; David Harris,
Insight and Measurement; Denise D’Andrea, Focus Vision;
Edwige Winans, Marcus Thomas LLC; Elaine Arkin,
research consultant; Eric Lipp, Open Doors Organization;
Ilan Hertz, SiSense; Jane Boutelle, Digsite; Jennifer Hirt-Marchand, Marcus Thomas LLC; Jessica Broome, Jessica
Broome Research; Justin Ohanessian, Sticky; Kerry Hecht,
Ramius; Lance Jones, Keynote Systems; Lenard Murphy, GreenBook; Lisa Whestone, GutCheck; Malgorzata
Kolling, OdinText; Mark Bunger, Forrester Research; Matt
Marta, GutCheck; Monika Wingate, Digsite; Nicola Petty,
Statistics Learning Centre; Patricio Pagani, InfoTools; Pete
Cape, SSI; Rob Ramirez, Schlesinger Associates; Robert
W. Kahle, author; Tom H.C. Anderson, Anderson Analytics and OdinText; Sean Case, Research for Good; Seth
Stoughton, University of South Carolina; Stuart Schear,
Robert Wood Johnson Foundation; and Zoe Downing,
Focus Vision.
The following are just a few of the people who offered
me ideas for new concepts, Snapshots, PicProfiles, and CloseUps for this edition: Andrew McAfee, MIT; Carlo Ratti,
Senseable City Lab, MIT; David Kiron, MIT-Sloan; Didier
Bonnet, Capgemini Consulting; George Westerman, MIT
Sloan; Glenn Kelman, author; John Cendroski, TIAA;
Julia Smith, AIG; Kevin Lonnie, KL Communications;
Lenard Murphy, GreenBook; Martin Lindstrom, author;
Michael Benisch, Rocket Fuel Inc.; Michelle Shail, TIAA;
Nick Drew, Fresh Intelligence; Pamela Kirk Prentice, MIT-Sloan; Ray Poynter, NewMR; Richard Cassidy, AlertLogic;
Sam Ransbotham, MIT-Sloan; Warren Berger, researcher
and author; and William Pink, Millward Brown.
And to all those research collaborators who have suggested ideas, collaborated on cases or past Snapshots,
Closeups, or PicProfiles, and continue to discuss the
research industry with me, I’m grateful. These individuals
include: Rachel Sockut, Innerscope; Erica Cenci, Brady
PR for OpinionLab; Olescia Hanson, The Container
Store; Cynthia Clark, 1to1 Magazine; Betty Adamou,
Research Through Gaming Ltd.; Debra Semans, Polaris
Marketing Research; Keith Chrzan, Maritz Research Inc.;
Michael Kemery, Maritz Research Inc.; Christian Bauer,
Daimler AG; Kai Blask, TNS Infratest; Melinda Gardner,
Novation; Keith Phillips, SSI; Nels Wroe, SHL; Ephraim
(Jeff) Bander, EyeTrackShop; Ron Sellers, Grey Matter
Research & Consulting; Guadalupe Pagalday, Qualvu.com;
Sandra Klaunzler, TNS Infratest; Steve August, Revelation;
Kathy Miller, GMI (Global Market Insite Inc.); Takayuki
Nozoe, NTT Communications Corporation; Janeen Hazel,
Luth Research; Christine Stricker, RealtyTrac; Stephanie Blakely, The Prosper Foundation; Jennifer Frighetto,
Nielsen; Andy Peytchev, Research Triangle Institute (RTI
International); Jeffrey C. Adler, Centric DC Marketing
Research; Josh Mendelssohn, Chadwick Martin Bailey
Inc.; Ruth Stanat, SIS International Research; Sharon Starr,
IPC Inc.; Keith Crosley, Proofpoint; Christopher Schultheiss, SuperLetter.com; Hy Mariampolski, QualiData
Research Inc.; Julie Grabarkewitz and Paul Herrera, American
Heart Association; Holly Ripans, American Red Cross;
Mike Bordner and Ajay Gupta, Bank One; Laurie Laurant
Smith, Arielle Burgess, Jill Grech, David Lockwood, and
Arthur Miller, Campbell-Ewald; Francie Turk, Consumer
Connections; Tom Krouse, Donatos Pizza; Annie Burns
and Aimee Seagal, GMMB; Laura Light and Steve Struhl,
Harris Interactive; Emil Vicale, Herobuilders.com; Adrian
Chiu, NetConversions; Colette Courtion, Starbucks; Mark
Miller, Team One Advertising; Rebecca Conway, The Taylor Research Group; Scott Staniar, United States Tennis
Association; Danny Robinson, Vigilante; Maury Giles,
Wirthlin Worldwide; and Ken Mallon, Yahoo!.
To our faculty reviewers, your insights, aggravations,
challenges, frustrations, suggestions, and disagreements
were very helpful. These encouraged me to examine every
word, every sentence, and every concept and see better,
clearer ways to engage students in the subject we all love.
Reviewers for this edition’s revision are: Ahmed Al-Asfour,
Oglala Lakota College; Zara Ambadar, Carlow University; Don Ashley, Wayland Baptist University; Kristopher
Blanchard, Upper Iowa University; Cristanna Cook, Husson University; Charlene Dunfee, Capella University;
Ernesto Gonzalez, Florida National University; Wendy
Gradwohl, Wittenberg University; Pam Houston, Oglala
Lakota College; Yan Jin, Elizabeth City State University;
Abdullah Khan, Claflin University; Tracy Kramer, North
Greenville University; Rex Moody, Angelo State University; Jason Patalinghug, University of New Haven; Glen
Philbrick, United Tribes Technical College; Denel Pierre,
Shorter University; Pushkala Raman, Texas Woman’s
University; Charles Richardson, Claflin University; Marcel
Robles, Eastern Kentucky University; Angela Sandberg,
Shorter University; Brian Satterlee, Liberty University;
Jonathan Schultz, Amberton University; Stefano Tijerina,
Husson University; Greg Turner, Claflin University; Sam
VanHoose, Wayland Baptist University; Greg Warren,
Wilmington University; Beyonka Wider, Claflin University;
and Ron Zargarian, University of Indianapolis.
Prior edition reviewers included: Scott Bailey, Troy
University; Scott Baker, Champlain College; Robert Balik,
Western Michigan University–Kalamazoo; John A. Ballard, College of Mount St. Joseph; Jayanta Bandyopadhyay,
Central Michigan University; Larry Banks, University of
Phoenix; Caroll M. Belew, New Mexico Highlands University; Kay Braguglia, Hampton University; Jim Brodzinski,
College of Mount St. Joseph; Taggert Brooks, University of
Wisconsin–La Crosse; Cheryl O’Meara Brown, University
of West Georgia; L. Jay Burks, Lincoln University; Marcia Carter, University of Southern New Hampshire; Raul
Chavez, Eastern Mennonite University; Darrell Cousert,
University of Indianapolis; David Dorsett, Florida Institute of Technology; Michael P. Dumler, Illinois State University; Kathy Dye, Thomas More College; Don English,
Texas A&M University–Commerce; Antonnia Espiritu,
Hawaii Pacific University; Hamid Falatoon, University of
Redlands; Judson Faurer, Metropolitan State College of
Denver; Eve Fogarty, New Hampshire College; Bob Folden,
Texas A&M University–Commerce; Gary Grudintski, San
Diego State University; John Hanke, Eastern Washington
University; Alan G. Heffner, Silver Lake College; Ron E.
Holm, Cardinal Stritch University (Director of Distance
Learning); Lee H. Igel, New York University; Burt Kaliski,
New Hampshire College; Jane Legacy, Southern New
Hampshire University; Andrew Luna, State University of
West Georgia; Andrew Lynch, Southern New Hampshire
University; Iraj Mahdavi, National University; Warren Matthews, LeTourneau University; Erika Matulich, University
of Tampa; Judith McKnew, Clemson University; Rosemarie Reynolds, Embry-Riddle Aeronautical University–Daytona;
Randi L. Sims, Nova Southeastern University; Gary Stark,
Northern Michigan University; Bruce Strom, University
of Indianapolis; Cecelia Tempomi, Southwest Texas State
University; Gary Tucker, Northwestern Oklahoma State
University; Marjolijn Vandervelde, Davenport University;
Charles Warren, Salem State College; Dennis G. Weis,
Alliant International University; Robert Wheatley, Troy
University; Bill Wresch, University of Wisconsin–Oshkosh;
Robert Wright, University of Illinois at Springfield;
and Ken Zula, Keystone College.
This revision incorporates the feedback of dozens of students who identified areas of confusion so that this edition
could make concepts more understandable, who participated in search tests, who worked on numerous research
projects demonstrating where the book needed to include
more information, and who provided reminders with their
questions and actions that some aspects of the research
process operate below their learning radar.
Through this 13th edition, I hope you and your students
discover, or rediscover, how stimulating, challenging, fascinating, and sometimes frustrating this world of research-supported decision making can be.
Pamela Schindler
Many thanks to my McGraw-Hill team; without your
assistance this revision wouldn’t have happened so smoothly:
Chuck Synovec, Director; Noelle Bathurst, Portfolio Manager;
Ryan McAndrews, Product Developer; Erika Jordan, Core
Project Manager; Harper Christopher, Executive Marketing Manager; David W. Hash, Designer; Daryl Horrocks,
Program Manager; Sue Nodine, copyeditor; Elizabeth
Kelly, proofreader; and Angela Norris, Senior Assessment
Project Manager.
>detailed changes to this edition
In its 13th edition, all chapters within Business Research
Methods have been evaluated for currency and accuracy.
Revisions were made to accommodate new information
and trends in the industry, changing teaching pedagogy,
and information about what teachers and students are
looking for in their textbooks.
• The book’s chapter structure is slimmer and has been
changed to reflect how teachers are teaching research
and using the book; the book now has 17 chapters.
• A foundations chapter replaces the first three
chapters.
• The chapters on data preparation and examination have merged.
• A Research Reports chapter now merges information on oral and written reports, with emphasis
on oral reports to better reflect industry practice.
• A new Chapter 17, An Integrated Example, now
provides an insider’s perspective of a research
project. This example applies text practices and
theory to one example from management
dilemma to research report.
• Material has been reorganized to tie better to the
modified research process model; there are now five
parts, each part a match to a stage in the model.
Part I contains three chapters and establishes the
foundations for what follows. Part II contains five
chapters, all focused on research design and its various methodologies. Part III contains four chapters,
all related to data collection and preparation. Part IV
contains two chapters related to data analysis. Part V
contains one chapter on research reporting. The part
structure was designed to better reflect the research
process as it is currently managed.
• Ethical issues are discussed, with their possible solutions, in every chapter, rather than in a stand-alone
chapter, to reflect how teachers are using this material.
• Every section and every word has been examined for
concept clarity and better student understanding;
whole sections and whole chapters and appendices
have been reconceived and rewritten.
• An emphasis has been placed on indicating solutions
for problems or possible error sources, not just indicating or describing the errors/problems.
• Based on student feedback, an emphasis has been
placed on providing sufficient information to “do”
research, not just learn about research. Exhibits have
been added to reflect how to execute a particular
practice, facilitating the experiential approach to
teaching and learning business research methods.
• For clarity and to match a chapter’s new structure,
numerous Exhibits are new (38), have been revised
significantly (34), or have been slightly modified (8).
• Continuing examples no longer weave throughout
the text; chapters can now be assigned in a different
order to fit any teaching pedagogy.
• Images (58) have been added or replaced, giving a
visual cue for new Snapshots, PicProfiles, or new
embedded examples.
• To reflect industry practices, the series of exhibits that
reflect the research process and that are used as conceptual “thought flowcharts”—especially valuable for
visual learners—have been reenvisioned and redesigned;
new exhibits have been added to this process series.
• The Cases section contains an updated case-by-chapter suggested-use chart.
• Continuing to provide rich examples from the
research industry, 30 new Snapshots, five new
PicProfiles, and two new Closeups have been added;
two Closeups have been updated.
• Several new chapter-level appendices have been
added to this edition: Better Reports (Chapter 16),
Better Tables (Chapter 13), Sample Computer-Based
Questions and Guidelines for Mobile Q (Chapter 11),
and Sources of Measurement Questions (Chapter 11).
• The Glossary has been updated; 77 new terms reflect
changes in industry practices, and 27 additional
terms were upgraded to key term status.
• The Instructor’s Manual contains additional research
examples for discussion or testing.
• McGraw-Hill Connect® has been added to the book’s
resources; Connect provides opportunities for both
formative and summative assessment by providing
students regular and consistent feedback, encouraging practice, and enabling them to move closer to
mastery by improving understanding and recall. Instructors are provided student and class analytics to
improve teaching decision making. Assignable material within Connect for this edition includes multiple
choice questions for homework for each chapter and
test bank questions for online testing.
• SmartBook®, also assignable in Connect, is a digital version of our textbook that actively tailors content to
an individual student's needs. It helps students focus on
what they don't know, helps them retain key concepts, is accessible on the go, and tracks student progress.
• Student Resources/Faculty Resources within Connect contain new materials (sources, videos, examples) and a video showcasing an observation
experiment using body cameras.
• Test Bank has been updated to reflect changes in
content and organization.
• PowerPoint slide decks have been updated to reflect
changes in content and organization.
For Each of the Chapters A detailed listing of
chapter-by-chapter changes is provided here for your
convenience.
• Chapter 1 This chapter was completely rewritten
and has a new focus: the fundamentals or critical
concepts students need to understand the remainder
of the book. It combines material from 12e Chapters
1 and 3, with elements from Chapter 2. The following elements are new to this edition: chapter-opening
quote (William Pink), the learning objectives and
summary, a PicProfile on emerging trends based on
the latest GRIT report, three new Snapshots (Big vs.
Small Data, Research on Cyber Security, and Identifying and Defining Constructs), two revised exhibits,
multiple images as visual cues, new embedded examples related to Hobby Lobby and Siemens AG,
one new key term (data blending), four new photos
serving as visual cues, and new discussion questions.
Several sections have been pulled and others moved
to chapters with a better fit. Six snapshots and three
exhibits have moved to the IM.
• Chapter 2 Previously Chapter 4, this chapter features a restricted and simplified research process
exhibit and a new structure based on five stages
of the research process, with material on proposing research moving to Chapter 3. The following
elements are new to this edition: chapter-opening
quote (Brad Smith, Microsoft), learning objectives
and summary, three new sections (identifying and
prioritizing dilemmas, research project timeframe,
and ethical issues and responsibilities) and five
restructured sections, a PicProfile on emerging
trends in research design, a Snapshot (Research and
Programmatic Algorithms), a revised snapshot on outsourcing research, a new exhibit on Gantt chart of
research project, four new images, five new key terms
[key performance indicators (KPIs), dashboards,
findings, insights, recommendations], and a new
From-the-Headlines discussion question. The What
is Good Research and Ethical Issues sections moved
to this chapter from Chapters 1 and 3, respectively.
The detailed management-research question hierarchy
section was moved to Chapter 3. The CPM chart was
moved from Chapter 6.
• Chapter 3 Previously Chapter 5, this chapter is
restructured and focuses on stage 1 of the research
process: the management-research question hierarchy and exploration to include valuing and budgeting
research. The following elements are new to this edition: chapter-opening quote (Warren Berger, author),
learning objectives and summary, eight new sections,
one modified and four new exhibits, three new Snapshots (Housing and Millennials, The Art of Asking the
Right Question, Using Interviews to Refine the Management Question), eight key terms relocated from
other chapters, and six new images as visual cues.
The section on data mining was dropped to reinforce
Chapter 1’s emphasis on research and data mining
as different courses. Several key terms have moved to
other chapters to reflect relocation of certain material. Several Snapshots have moved to the IM.
• Chapter 4 Previously Chapter 6, the emphasis of
this chapter has changed to research design once
the research question(s) and investigative questions
have been determined and the decisions involved in
research design, including those involved in sampling
design, have been made. The following elements are
new to this edition: chapter-opening quote (Nick
Drew, Fresh Intelligence), learning objectives and
summary, a new section (sampling design), two new
Snapshots (TIAA Performance Management Overhaul,
AIG and Research Design), a new CloseUp (How
Agile Research Helped Prove the Value in a Packaging
Redesign), two modified exhibits, embedded example
on productivity and morale, five new images as visual
cues, one new key term (single-methodology design),
and modified discussion questions—including a new
From-the-Headlines discussion question. Detailed
sections on causation moved to Chapter 8, and focus
groups moved to Chapter 6. Two Snapshots moved
to the IM.
• Chapter 5 Previously Chapter 14, this chapter has
a different structure based on the six steps of sampling design. The following elements are new to this
edition: chapter-opening quote (Gerald Earl Gillum,
American rapper and producer), learning objectives
and summary, three new sections (sampling design,
selection and recruiting protocols, and ethical issues
and their solutions), one new snapshot (Who’s Taking Your Surveys), one revised PicProfile on mixed
access sampling, four new exhibits, four revised or
modified exhibits, four new images as visual cues,
and a new From-the-Headlines discussion question. Two key terms were moved here (case, target
population).
• Chapter 6 Previously Chapter 7, the following elements are new to this edition: chapter-opening quote
(Gia Calhoun, Burt’s Bees), learning objectives and
summary, two new sections (qualitative sampling
design, including incentivizing participants and
interviewers as consultants, and ethical issues and
their solutions) and one enhanced section (creative
exercises), three new Snapshots (Digital Transformation Revealed Using IDIs, IDIs Help Restructure
Maritz Travel, Qualitative Research in the Era of
Smartphones), two revised exhibits, three new images as visual cues, and seven new key terms (data
saturation, collage, completion/fill in the blank,
role playing, creative innovation roleplay, story
telling, write a letter). Four Snapshots and PicProfiles
moved to the IM.
• Chapter 7 Previously Chapter 8, this chapter has
been reorganized to follow the observation research
design steps. The following elements are new to
this edition: chapter-opening quote (Katie Hafner,
author), learning objectives and summary, two new
sections (sampling design, ethical issues and their
solutions), two new Snapshots [Visual Content Gets
Sticky, Observation and Police Cameras (with video
on the website)], three revised or modified exhibits,
six new images as visual cues, two new key terms
(memory decay, selective filtering), and a new From-the-Headlines discussion question. Several Snapshots
and a PicProfile have moved to the IM.
• Chapter 8 Previously Chapter 9, this chapter has
been reorganized, with evaluation of experiments
moving toward the end of the chapter. The following
elements are new to this edition: chapter-opening
quote (Jeff Bezos, CEO, Amazon), learning objectives and summary, a new section (ethical issues and
their solutions), a relocated section on causation
(including two exhibits), four snapshots (Experiments in Improving Employee Health, Robotic Experiments, Zeotap Experiments with Mercedes-Benz, MIT
SENSEable City Lab), two revised exhibits, four new
images as visual cues, 14 new key terms (debriefing,
after-only design, group time series design, history,
instrumentation, maturation, nonequivalent control
group design, one group pretest-posttest design,
posttest-only control group design, selection, separate sample pretest-posttest design, static group comparison design, regression toward the mean, testing),
and a From-the-Headlines discussion question. Four
Snapshots moved to the IM.
• Chapter 9 Previously Chapter 10, this chapter has
been reorganized. The following elements are new to
this edition: chapter-opening quote (David Goldberg,
CEO, SurveyMonkey), learning objectives and summary, four new sections (classification of data collection designs, telephone survey trends, evaluation
of survey design, and ethical issues and their solutions), two new Snapshots (Internet Brings Prediction
Research into 21st Century, Research Embraces the
Smartphone), two new PicProfiles on emerging
trends in survey research and declining response
rates, two new and three revised exhibits, four new
images as visual cues, updated statistics, three new
key terms (acquiescence bias, probe, social desirability bias), and a From-the-Headlines discussion question. Two additional exhibits moved to this chapter
(informed consent and IRB process).
• Chapter 10 Previously Chapter 11, The following
elements are new to this edition: chapter-opening
quote (David McCandless, author), one new exhibit,
two revised exhibits, one new Snapshot (The Emotional Face of Research), four images as visual cues,
and a From-the-Headlines discussion question.
• Chapter 11 Previously Chapter 12, this chapter is
reorganized and rewritten to focus on measurement
questions, rather than the scales on which they are
based, in order to work better with the chapter on
measurement instruments. The following elements
are new to this edition: chapter-opening quote
(David F. Harris, president, Insight and Measurement), learning objectives and summary, two new
sections (instrument design, prepare the preliminary
analysis plan) and one revised section (data entry),
a new Snapshot (Toluna and Voss Measure Water)
and a new PicProfile about Urban Dictionary, four
new and three revised or modified exhibits, two additional exhibits moved from other chapters, some
material on coding moved from another chapter,
five new images as visual cues, five new key terms
(attitude scaling, checklist, error of strictness, interview guide, scaling) and 17 key terms moved here
from other chapters, modified discussion questions
(including a From-the-Headlines question), and two
new appendices (sample computer-based questions
by scale type, sources of measurement questions).
• Chapter 12 Previously Chapter 13, this chapter
has a new structure to work better with the chapter
on Measurement Questions, with a stronger link
to the preliminary analysis plan. The following elements are new to this edition: chapter-opening quote
(Kristin Luck, research consultant), learning objectives and summary, three new sections (instrument
design, physical design, nonquestion elements), one
new Snapshot (New Vehicle Survey), five new and
three revised exhibits, four new images as visual
cues, 15 new key terms (assimilation effect, behavior cycle, behavior frequency, behavior time frame,
completion estimate, contrast effect, filter question, instrument coverage, instrument scope, interview guide,
measurement instrument, rapport, skip directions,
skip logic diagram, social desirability bias), revised
discussion questions, and a new From-the-Headlines
question. The Invoke PicProfile was moved to
the IM.
• Chapter 13 As a merger of 12e Chapters 15 and
16, this chapter has a new structure and new content. The following elements are new to this edition:
chapter-opening quote (Dana Zuber, director of analytics for Butler, Shine, Stern & Partners), learning
objectives and summary, one new section (collect the
data) and a revised section (coding), six revised and
three new exhibits, two new Snapshots (How Might
You Code Word Data, The Difference Between Data
and Insight), updated statistics, 17 new key terms
(coding scheme, context units, cross-tabulation, data
collection, data validation, inter-rater reliability, intrarater reliability, listwise deletion, data missing at
random (MAR), data missing but not missing at random (NMAR), data missing completely at random
(MCAR), pairwise deletion, predictive replacement,
recoding, recording units, sampling units, survey
activation), revised discussion questions, and new
chapter appendix (Better Tables). Four Snapshots, a
CloseUp, and a PicProfile were moved to the IM.
• Chapter 14 Previously Chapter 17, the following
elements are new to this edition: one revised and one
new exhibit, and three new images as visual cues.
• Chapter 15 Previously Chapter 18, the following
elements are new to this edition: chapter-opening
quote (Jeff Bezos, CEO, Amazon), one new Snapshot, and three new images as visual cues. One
exhibit (grammar and style proofreader results) was
moved to the IM.
• Chapter 16 As a merger of Chapters 19 and 20,
this chapter has a new structure emphasizing the
oral presentation. The following elements are new to
this edition: chapter-opening quote (David McCandless, British data journalist, information designer,
and author), learning objectives and summary, four
new sections (audience-centric planning, visualization specifically for the oral report, infographics,
ethical considerations in reporting), a new Snapshot
(Hitting the Wall is a Good Thing), a new CloseUp
(Storytelling from Pixar Applied to Research), five new
and 10 revised exhibits, an infographic image, 21
new key terms (predispositions, confirmation bias,
anchoring bias, conformity bias, survivorship bias,
loss-aversion bias, visualize, data clarity, actionable
insights, audience-centric planning, data-centric
planning, desired audience effect, graph, information, insights, limitations, report framework, report
structure, table, tone, geography), revised discussion questions, and a new chapter appendix—Better
Reports—with five new exhibits and five existing exhibits from prior chapters. Two items (constructing
a story and overcoming the jitters) were moved to
the IM.
• Chapter 17 This new chapter, An Integrated Example,
provides an insider’s perspective of a research project. This example applies text practices and theory
to one example from management dilemma to
research report. The companies, Visionary Insights
and BrainSavvy, might be fictional, but the research
profiled in the example is very real. This chapter
can be used throughout the course to review (or test)
various concepts, or at the end of the course as the
basis for a lively discussion or final exam.
>additional resources
There is a wealth of information, samples, templates, and
more within Connect for instructors, and at www.mhhe
.com/Schindler13e for students.
Written Cases. Cases offer an opportunity to tell
research stories in more depth and detail. You’ll find
a new case, Marcus Thomas LLC Tests Hypothesis for
Troy-Bilt Creative Development, complete with its online
questionnaire, at the Online Learning Center. You’ll
also find cases about hospital services, lotteries, data
mining, fundraising, new promotions, and website
design, among other topics, featuring organizations
like Akron Children’s Hospital, Kelley Blue Book, Starbucks, Yahoo!, the American Red Cross, and more.
Video Cases. We are pleased to continue to make
available a first in video supplements: several short
segments drawn from a two-hour metaphor elicitation
technique (MET) interview. These segments should
be invaluable in teaching students to conduct almost
any type of individual depth interview and to explain
the concept of researcher–participant rapport. Four of
our video cases were written and produced especially
to match the research process model in this text and
feature noted companies: Lexus, Starbucks, Wirthlin
Worldwide (now Harris Interactive), Robert Wood
Johnson Foundation, GMMB, Visa, Bank One, Team
One Advertising, U.S. Tennis Association, Vigilante
New York, and the Taylor Group.
Web Exercises. It is appropriate to do web searches
as part of a research methods course, so each chapter
offers one or more exercises to stimulate your students
to hone their searching skills. Due to the ever-changing
nature of web URLs, however, we offer these exercises
in the Instructor’s Manual.
Articles, Samples, and Templates. Students often need
to see how professionals do things to really understand,
so you’ll find a sample EyeTrackShop report, a Nielsen
report on using U.S. Census data, an Excel template for
generating sample data displays, and more.
Sample Student Project. Visualization of the finished
deliverable is crucial to creating a strong research report.
McGraw-Hill Connect® is a highly reliable, easy-to-use homework and learning management solution
that utilizes learning science and award-winning
adaptive tools to improve student results.
Homework and Adaptive Learning
▪ Connect’s assignments help students
contextualize what they’ve learned through
application, so they can better understand the
material and think critically.
▪ Connect will create a personalized study path
customized to individual student needs through
SmartBook®.
▪ SmartBook helps students study more efficiently
by delivering an interactive reading experience
through adaptive highlighting and review.
Over 7 billion questions have been
answered, making McGraw-Hill
Education products more intelligent,
reliable, and precise.
Using Connect improves retention rates
by 19.8 percentage points, passing rates
by 12.7 percentage points, and exam
scores by 9.1 percentage points.
Quality Content and Learning Resources
▪ Connect content is authored by the world’s best subject
matter experts, and is available to your class through a
simple and intuitive interface.
73% of instructors who use Connect require it; instructor satisfaction increases by 28% when Connect is required.
▪ The Connect eBook makes it easy for students to access their reading material on smartphones and tablets. They can study on the go and don’t need internet access to use the eBook as a reference, with full functionality.
▪ Multimedia content such as videos, simulations, and games drives student engagement and critical thinking skills.
Robust Analytics and Reporting
▪ Connect Insight® generates easy-to-read reports on individual students, the class as a whole, and on specific assignments.
▪ The Connect Insight dashboard delivers data on performance, study behavior, and effort. Instructors can quickly identify students who struggle and focus on material that the class has yet to master.
▪ Connect automatically grades assignments and quizzes, providing easy-to-read reports on individual and class performance.
More students earn As and Bs when they use Connect.
Trusted Service and Support
▪ Connect integrates with your LMS to provide single sign-on and automatic syncing of grades. Integration with Blackboard®, D2L®, and Canvas also provides automatic syncing of the course calendar and assignment-level linking.
▪ Connect offers comprehensive service, support, and training throughout every phase of your implementation.
▪ If you’re looking for some guidance on how to use Connect, or want to learn tips and tricks from super users, you can find tutorials as you work. Our Digital Faculty Consultants and Student Ambassadors offer insight into how to achieve the results you want with Connect.
www.mheducation.com/connect
>briefcontents
Preface xii

>part I
Building the Foundation for Research 1
1 Research Foundations and Fundamentals 2
2 The Research Process: An Overview 26
3 Stage 1: Clarify the Research Question 45

>part II
The Design of Business Research 69
4 Stage 2: Research Design, An Overview 70
5 Stage 2: Sampling Design 85
Appendix: Calculate the Sample Size 114
6 Stage 2: Data Collection Design: Qualitative Research 122
7 Stage 2: Data Collection Design: Observation Research 148
8 Stage 2: Data Collection Design: Experiments 170
9 Stage 2: Data Collection Design: Survey Research 194

>part III
Appendix: Sources of Measurement Questions 286
Appendix: More on Effective Measurement Questions 287
12

>part IV
Collect, Prepare and Examine the Data 319
13
Appendix: Describing Data Statistically 352

>part V
Analyze and Interpret Data 357
14 Stage 4: Hypothesis Testing 358
15 Stage 4: Measures of Association 395

>part VI
Stage 5: Report the Research 431
16 Stage 5: Research Reports: Supported …