COVID and ‘BIG QUAL’
ABOUT THE AUTHORS
Professor Lynn Jamieson and Dr Emma Davidson, both from the Centre for Research on Families and Relationships, University of Edinburgh, and Professor Rosalind Edwards and Dr Susie Weller, both from the University of Southampton.
ABOUT THE CRFR BLOG
This is the official blog of the Centre for Research on Families and Relationships.
This Blog is moderated by a CRFR representative who reserves the right to exercise editorial control over posted content.
Please note: the views expressed in a post are those of the author(s) and do not necessarily reflect those of CRFR.
This blog was first posted on 20th April 2020 on the Working across qualitative longitudinal studies: a feasibility study looking at care and intimacy project website.
The blog discusses the potential of qualitative secondary analysis, and in particular ‘big qual’ analysis, for helping to overcome the restrictions placed on qualitative work during the global pandemic. In so doing, Lynn and colleagues draw on a recent ESRC National Centre for Research Methods study – Working across qualitative longitudinal studies: A feasibility study looking at care and intimacy.
It seems appropriate to review the possibilities of secondary analysis of data already gathered by face-to-face techniques, as the current pandemic closes down many such forms of research. Substituting virtual means of data collection for face-to-face ones, such as interviewing by internet telephony, is not the only possible response to barriers against tried-and-tested methods; researchers at, or able to return to, the design stage might consider the creative possibilities of drawing together existing archived qualitative data for new research.

Secondary analysis of qualitative data remains a relatively under-used research strategy, despite the accumulation in official archives of anonymised, quality-assured and well-documented data, carefully curated and generated by peer-reviewed, funded and published studies. Researchers seem less able to see secondary analysis as ground-breaking and, in the case of qualitative research, heightened sensitivity to the creative connection between researcher and researched raises concerns about ethics and intellectual property. However, in our published work (Davidson, Edwards, Jamieson and Weller, 2019) we counter these claims and point to the ground-breaking opportunities of merging data from several studies into a new data assemblage, using a set of steps that iteratively combine breadth and depth. The way of proceeding that we advocate helps the analyst to ask new questions, to make theoretical use of comparison and, in the process, to extend the generalisability of qualitative research.
Our method is the outcome of a project under the umbrella of the National Centre for Research Methods. We set out to develop materials that would assist other researchers to remain true to the principles of qualitative research while working with what could be called ‘big qualitative data’ or ‘big qual’ for short – a data assemblage that is much larger than the typical volume of a single project and too large to tackle readily by conventional qualitative analysis techniques alone. We have called our method of ‘big qual’ secondary analysis the breadth-and-depth method. In addition to publishing our outline of the breadth-and-depth method, we have produced a set of resources to help others use it, teach it and teach with it.
The four steps in the method are described using an analogy with different stages in an imagined archaeological project. At each step, it may be necessary to return to the starting point or a previous step.
- The researcher’s research questions shape the direction of an enquiry-led overview of archived qualitative research using metadata about the archived data sets. This is equivalent to an archaeologist using photographs taken in an aerial survey to select ground for further scrutiny.
- Computer-aided scrutiny of the selected data collections using text-searching techniques sometimes called ‘data mining’ – although what is involved is surface sifting and mapping the breadth of the data rather than mining its depth. This is like an archaeologist’s ground-based geophysical survey of the surface of an area to assess what merits closer investigation by digging.
- Analysis of multiple small samples of likely data, equivalent to digging shallow ‘test pits’ to find an area worthy of deeper excavation.
- In-depth analysis of the selected sample, using techniques and processes drawn from the repertoire familiar to qualitative researchers. This is the equivalent of archaeological deep excavation.
Our own demonstration project worked with the Timescapes archive. Because we were interested in possible convergence by gender in the language and practice of care and intimacy over time, we re-assembled data from across four projects into age cohorts of men and women. This new data set is now available for teaching purposes. As a follow-on to our original project, we worked with experts in pedagogy, Melanie Nind and Sarah Lewthwaite, to develop train-the-trainer materials to assist research methods teachers.
Our publications and links to teaching resources can be found at: http://bigqlr.ncrm.ac.uk/resources-3/
Davidson, E., Edwards, R., Jamieson, L. and Weller, S. (2019) Big data, qualitative style: a breadth-and-depth method for working with large amounts of secondary qualitative data, Quality & Quantity, 53(1), 363-376.