Felstead, Alan (ORCID: https://orcid.org/0000-0002-8851-4289) 2021. Are online job quality quizzes of any value? Selecting questions, maximising quiz completions and estimating biases. Employee Relations 43(3), pp. 724–741. doi:10.1108/ER-07-2020-0349
PDF – Accepted Post-Print Version, available under a Creative Commons Attribution Non-Commercial licence (744kB).
Abstract
Purpose – The purpose of this paper is to compare two ways of collecting job quality data in Britain using a common set of questions. One is a short quiz taken by a self-selected sample and completed by clicking on a web link (www.howgoodismyjob.com). The other is an invitation to take part in a long-running survey of working life – the Skills and Employment Survey. The survey takes much longer to complete, is carried out face-to-face and is based on random probability principles.

Design/methodology/approach – To be content-comparable, the quiz uses tried-and-tested questions from recent waves of the Skills and Employment Survey. Each survey comprises a nationally representative sample of workers in Britain aged 20–65 years. However, the quiz is based on uncontrolled convenience sampling, prompted in large part by a Facebook advertising campaign, whereas survey participants are randomly selected. The authors compare the profile of respondents and their responses across these two modes of data collection, thereby shining a light on any biases in the samples and differences in the results respondents report.

Findings – The paper shows that while the number taking the quiz is impressive, participation in the quiz – unlike the survey – is heavily skewed. Weighting can correct some of these sample selection biases. But even then, the picture painted by the quiz and survey data varies, with the quiz under-reporting the intrinsic quality of jobs while over-reporting extrinsic rewards. This suggests that how job quality data are collected can strongly influence the results produced.

Research limitations/implications – The findings suggest that a number of biases are in operation, both in terms of who takes part and the answers they give. This makes comparison between data collected using radically different methods at best inadvisable and at worst misleading. Nevertheless, quizzes are a good way of engaging large numbers of people in public debates, gathering additional data, extending the reach of academic work and prompting action to improve working life. A limitation of this study is that it does not offer a true experiment on different ways of collecting the same data: the quiz and survey were not carried out at the same time, but some 14 months apart.

Practical implications – Over 50,000 people took part in the quizzes reported in the paper, and almost 1,300 investigated joining a trade union as a result. The reach of the quiz far exceeds the 3,306 people who took part in the Skills and Employment Survey 2017.

Originality/value – This paper focusses on how job quality data are collected and the consequences this has for the validity of the data gathered. This is a unique contribution to international debates about the measurement and monitoring of trends in job quality.
| Item Type: | Article |
| --- | --- |
| Date Type: | Publication |
| Status: | Published |
| Schools: | Social Sciences (Includes Criminology and Education) |
| Publisher: | Emerald |
| ISSN: | 0142-5455 |
| Funders: | ESRC |
| Date of First Compliant Deposit: | 6 November 2020 |
| Date of Acceptance: | 5 November 2020 |
| Last Modified: | 27 November 2024 13:30 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/136139 |
Citation Data
Cited 2 times in Scopus.