Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

A test for a difference in the associability of blocked and uninformative cues in human predictive learning

Uengoer, Metin, Dwyer, Dominic M ORCID: https://orcid.org/0000-0001-8069-5508, Koenig, Stephan and Pearce, John M ORCID: https://orcid.org/0000-0001-6121-8650 2019. A test for a difference in the associability of blocked and uninformative cues in human predictive learning. Quarterly Journal of Experimental Psychology 72(2), pp. 222-237. doi: 10.1080/17470218.2017.1345957

PDF - Accepted Post-Print Version (752kB)

Abstract

In human predictive learning, blocking, A+ AB+, and a simple discrimination, UX+ VX–, result in a stronger response to the blocked cue, B, than to the uninformative cue, X (where letters represent cues and + and – represent different outcomes). To assess whether these different treatments result in more attention being paid to blocked than uninformative cues, Stage 1 in each of three experiments generated two blocked cues, B and E, and two uninformative cues, X and Y. In Stage 2, participants received two simple discriminations: either BX+ EX– and BY+ EY–, or BX+ BY– and EX+ EY–. If more attention is paid to blocked than to uninformative cues, then the first pair of discriminations should be solved more readily than the second pair. In contrast to this prediction, both discriminations were acquired at the same rate. These results are explained by the theory of Mackintosh, by virtue of the assumption that learning is governed by an individual rather than a common error term.
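The distinction the abstract draws between a common and an individual error term can be made concrete with a small illustration. The sketch below is a hypothetical Python toy model, not the authors' materials or analysis code: the Rescorla-Wagner-style rule updates every present cue with a shared error term (λ minus the summed strength of all present cues), whereas a Mackintosh-style rule updates each cue with its own error term (λ minus that cue's strength), weighted by a cue-specific associability α. Cue labels, learning-rate values, and trial counts are illustrative assumptions only, and the α-change process of the full Mackintosh theory is omitted.

```python
# Hypothetical sketch: common vs. individual error terms in simple
# associative learning rules. All parameter values are illustrative.

def rescorla_wagner_update(V, present_cues, lam, beta=0.2):
    """Common error term: every present cue shares (lam - sum of V)."""
    common_error = lam - sum(V[c] for c in present_cues)
    for c in present_cues:
        V[c] += beta * common_error
    return V

def mackintosh_style_update(V, alpha, present_cues, lam, theta=0.2):
    """Individual error term: each cue is updated by (lam - V[cue]),
    scaled by that cue's own associability alpha[cue] (held fixed here)."""
    for c in present_cues:
        V[c] += theta * alpha[c] * (lam - V[c])
    return V

# Toy blocking design A+ then AB+ (letters are arbitrary cue labels).
V = {"A": 0.0, "B": 0.0}
alpha = {"A": 0.5, "B": 0.5}
for _ in range(20):
    mackintosh_style_update(V, alpha, ["A"], lam=1.0)       # Stage 1: A+
for _ in range(20):
    mackintosh_style_update(V, alpha, ["A", "B"], lam=1.0)  # Stage 2: AB+
print(V)  # with an individual error term, B gains strength despite A+
```

With the common error term, A's prior training leaves little error on AB+ trials, so B learns little; with the individual error term, B's own error remains large and B continues to gain strength, any blocking being carried instead by changes in associability.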

Item Type: Article
Date Type: Publication
Status: Published
Schools: Psychology
Publisher: SAGE Publications
ISSN: 1747-0218
Date of First Compliant Deposit: 15 April 2019
Date of Acceptance: 19 June 2017
Last Modified: 06 Nov 2023 20:57
URI: https://orca.cardiff.ac.uk/id/eprint/121640

Citation Data

Cited 8 times in Scopus.
