PART I: FROM BIG QUAL TO THE BREADTH-AND-DEPTH METHOD
1. The place and value of large-scale qualitative data analysis
1.1 Introduction
1.2 What is Big Data?
1.3 Big Data and Qualitative Research
1.4 Breaking Methodological Borderlines
1.5 Big Opportunities for Qualitative Research
1.6 In Search of Breadth and Depth
1.7 Big Qualitative Data Analysis
1.8 Computers, Computing and Qualitative Data
1.9 Capabilities and Skills
1.10 Conclusion
1.11 Resources
References
2. Introducing the breadth-and-depth method
2.1 Introduction
2.2 The Metaphorical Foundations of the Method
2.3 An Overview of the Four Steps
2.3.1 Step One: ‘Aerial Surveying’ - Overviewing the Qualitative Data and Constructing a Corpus
2.3.2 Step Two: ‘Geophysical Surveying’ - Approaches to Breadth Analysis using ‘Data Mining’ Tools
2.3.3 Step Three: ‘Test Pit Sampling’ - Preliminary Analysis
2.3.4 Step Four: ‘Deep Excavations’ - In-depth Interpretive Analysis
2.4 The Relationship between Theory and Method
2.4.1 Deduction
2.4.2 Induction
2.4.3 Abduction
2.4.4 Retroduction
2.5 Benefits of the Method for Qualitative and Quantitative Researchers
2.6 Key Considerations
2.6.1 Epistemological Issues
2.6.2 The Nature of Data in the Breadth-and-Depth Method
2.6.3 Quality
2.6.4 Generalisability
2.7 Getting Started
2.8 Resources
References
PART II: AN ENQUIRY-LED OVERVIEW OF QUALITATIVE RESEARCH DATA SETS
3. Sourcing and searching for suitable data sets
3.1 Starting your Breadth-and-Depth Project
3.2 What do we mean by Qualitative Data?
3.3 Where can I find Data?
3.3.1 Introducing the Archive
3.3.2 Qualitative Archives - a Growing Infrastructure
3.3.3 Where can I find Archived Qualitative Data Sets?
3.4 Community Archiving
3.5 Problematising Archiving and Reuse
3.6 How to Search an Archive
3.6.1 Accessing the Archive – Issues to Consider
3.6.2 Data Quality and the Politics of the Archive
3.7 Outside and Alongside the Archive
3.8 Moving your Search Strategy Forward
3.9 Resources
References
4. ‘Aerial surveying’: Overviewing the data and constructing a corpus
4.1 Introduction
4.2 The Aerial Survey
4.3 Understanding Metadata and Meta-narratives
4.3.1 Using Metadata to Audit your Data Sets
4.4 Working with Metadata – Challenges
4.4.1 Closeness and Distance
4.4.2 Data Harmonization
4.4.3 Constructing your Corpus
4.5 Assembling your new Corpus
4.5.1 Using Metadata to Create a New Corpus
4.5.2 Data Management
4.6 Key Areas for Consideration
4.7 Resources
References
PART III: MOVING BETWEEN BREADTH-AND-DEPTH IN QUALITATIVE ANALYSIS
5. ‘Geophysical surveying’: Recursive surface ‘thematic’ mapping using data mining tools
5.1 Introduction
5.2 CAQDAS Software not the only Tool in the Box
5.3 Surface Sifting not ‘Mining’ as Depth
5.4 Capacity, Knowledge and Skill Required
5.5 Basics of Computer Text Analysis and ‘Text Mining’
5.5.1 Throwing Away ‘Stop Words’
5.5.2 Word Counting: Comparing Frequencies and ‘Keyness’ of Words
5.5.3 The Framing of Words
5.5.4 RAKE: Rapid Automatic Keyword Extraction
5.5.5 Other Approaches to Keyness
5.5.6 Clusters of Words as ‘Topics’
5.5.7 The Document-term Matrix and ‘bag of words’ Approach
5.5.8 Latent Dirichlet Allocation (LDA)
5.5.9 Visualisation
5.6 Conclusion
5.7 Resources
References
6. ‘Test pit sampling’: Preliminary analysis
6.1 Introduction
6.2 Moving from Breadth to Depth
6.3 The Metaphorical Foundations of Test Pit Sampling
6.4 The Logic of Identifying Samples and Choosing Cases
6.5 The Transition from Step Three to Step Four
6.6 Conducting Cursory readings
6.7 Key Areas for Consideration
6.8 Resources
References
7. ‘Deep excavations’: In-depth interpretive analysis
7.1 Introduction
7.2 Metaphorical Foundations
7.3 Cases for Deep Excavation
7.4 Qualitative Analysis
7.5 Forms of In-depth Interpretive Analysis
7.6 Mixing and Matching Interpretive Analyses
7.7 Bringing Depth Back into Conversation with Breadth
References
PART IV: REFLECTING ON THE IMPLICATIONS OF LARGE-SCALE QUALITATIVE ANALYSIS
8. Ethics and practice
8.1 Introduction
8.2 Ethical Practice and Constructing a Corpus
8.2.1 Issues of Data Sovereignty
8.2.2 Overcoming Exclusion in the Archives
8.3 Working with Integrity at Scale
8.3.1 Research Integrity
8.3.2 Establishing the Nature of Consent
8.3.3 Risks of Data Linkage
8.3.4 Centrality of Context
8.3.5 Reliance on Computational Tools and Algorithms
8.4 Working with Care
8.4.1 Caring for and about Researchers’ Investments
8.4.2 Shifting Connectedness to Data
8.4.3 Researcher Well-being
8.4.4 Caring about Environmental Impact
8.4.5 Ethic of Care Framework
8.5 Resources
References
9. Big qual and the future of qualitative analysis
9.1 Introduction
9.2 The Emergence of Big Qual
9.3 Overview of the Breadth-and-Depth Method
9.4 Stepping Out of the Methodological Borderlines
9.5 Challenges and Limitations
9.6 Moving the Integrative Field Forward
9.7 Resources
References
“…The ultimate guide for learning how to do big qualitative research. The rapidly expanding world of big data calls for new solutions to manage and analyse large volumes of qualitative data now becoming available to social scientists everywhere…Big Qual will lead you on a rewarding quest to navigate qualitative data landscapes.”
— Kathy A. Mills, Australian Catholic University, Australia
“The volume takes a unique approach to analysis … and can be used as a methodological treatise grounded in empirical research, or the individual chapters as discrete modules in specific aspects of analysis.”
— Malcolm Williams, Cardiff University, UK
“This approach offers readers a way to think about how to analyse big qualitative data in meaningful and theoretically grounded ways. Weller et al.’s book will serve as an important guidepost as we increasingly encounter and work with big qualitative datasets.”
— Jessica Nina Lester, Professor of Qualitative Methodology, Indiana University Bloomington, USA
This upper-level textbook presents a new approach to large-scale qualitative analysis – the pioneering breadth-and-depth method. It covers the strengths and deployment of “big qual” as a distinct research methodology. The book will appeal to students and researchers across disciplines and methodological backgrounds.
The growing availability of large qualitative data sets presents exciting opportunities. Pooling multiple qualitative data sets enhances the possibility of theoretical generalisability and strengthens claims from qualitative research about understanding how social processes work.
Given the evolving possibilities that big data offers the humanities and social sciences, this book will be a must-have resource, building capacity and provoking new ways of thinking about qualitative research and its analysis.
Susie Weller is Senior Research Fellow in the Clinical Ethics, Law and Society (CELS) research group and a Fellow of the Centre for Personalised Medicine, University of Oxford, UK.
Emma Davidson is Lecturer in Social Policy and Qualitative Research Methods at the University of Edinburgh, UK and Co-Director at the Centre for Research on Families and Relationships, Scotland.
Rosalind Edwards is Professor of Sociology and a Fellow of the National Centre for Research Methods at the University of Southampton, UK.
Lynn Jamieson is Professor of the Sociology of Families and Relationships at the University of Edinburgh, UK and Co-Director at the Centre for Research on Families and Relationships, Scotland.