Cutting-edge reports on the latest psychology research
Even naps as short as ten minutes have been shown to provide psychological benefits in terms of reduced fatigue and improved concentration. But would-be nappers face some awkward decisions, most obviously: does it matter whether I nap in my chair, or ought I to try to find somewhere to lie down? And then, if remaining seated, is it okay to lean forwards and rest my head on a desk?

When it comes to napping while leaning back in a chair or car seat, past research has shown that the further you can lean back, the better, at least in terms of subjective fatigue and reaction times. Now Dayong Zhao and colleagues have addressed the leaning-forward issue, comparing lying-down napping with leaning-forward napping, and they've found that the former is the more effective, but that the leaning-forward variety still has clear benefits compared with no nap at all.

Thirty undergrads, all regular nappers, had electrodes attached to their heads before lunch. Then they performed an 'oddball' auditory task in which they had to listen to a string of tones and listen out for the occasional one of a different pitch. Next they had lunch before splitting into three groups: one group enjoyed a twenty-minute nap lying down; another enjoyed a twenty-minute nap leaning forwards onto a desk (plus pillow for comfort); the final group just spent the same time sitting quietly. After this, all the participants performed a repeat of the oddball task whilst having their brainwaves recorded via electroencephalography. Zhao's team were particularly interested in the size and delay of the P300 - a brainwave measure of cognitive alertness.

Participants in both of the napping conditions showed benefits compared with their peers who'd been denied a nap. The nappers, leaning and lying, reported being in a better mood and feeling less sleepy, and they performed better at the oddball task.
When it came to the brainwave recordings, however, the leaning-forward nappers, unlike the lying-down nappers, showed no difference from the control group. Uniquely, the lying-down nappers showed an increased P300 amplitude, perhaps indicating increased cortical arousal on their part.

The message, it seems, is clear. A post-luncheon nap is beneficial to your mental functioning even if you're forced to rest your head on your desk. However, if you can find somewhere to lie down properly, then do, because the benefits of the nap will be that much greater.

_________________________________
Zhao, D., Zhang, Q., Fu, M., Tang, Y., & Zhao, Y. (2010). Effects of physical positions on sleep architectures and post-nap functions among habitual nappers. Biological Psychology, 83(3), 207-213. DOI: 10.1016/j.biopsycho.2009.12.008
Many neuroimaging studies claim to have investigated what happens in the brain when people interact socially. To overcome the awkward fact that participants have to lie entombed in the bore of a large magnet, these studies have used various means to simulate a social interaction, including: having participants watch videos of social interactions; interact with an animated character; or play a game with a human opponent (usually computer-controlled) supposedly located in another room. Such methods score marks for improvisation, but arguably none of them fully captures the dynamic cut and thrust of a real face-to-face social interaction between two people. That's why Elizabeth Redcay and her colleagues have devised the first experimental set-up that allows for live face-to-face interaction (via video link) whilst participants lie inside a brain-imaging magnet.

Participants in this study watched a live video feed of the experimenter. The experimenter in turn had a display showing them a live feed of where the participant was looking. Experimenter and participant then engaged in a series of 'games' that required social interaction. For example, in one, the experimenter picked up various toys and the participant had to look in the direction of the appropriately coloured bucket to which each toy belonged. Compared with watching a recording of this same interaction, the live interaction triggered increased activation in a swathe of social-cognitive, attention-related and reward-processing brain regions.

The second experiment involved the participant identifying which screen quadrant a mouse was hidden in. In the live 'joint attention' condition, the experimenter's gaze direction cued the mouse's location and only when both experimenter and participant looked at the correct quadrant did the mouse appear.
Compared with a solo condition in which a house symbol cued the mouse's location, the interactive joint attention condition triggered increased activation in the right superior temporal sulcus and the right temporo-parietal junction. The former brain region has previously been associated with processing socially relevant stimuli such as eye gaze and reaching, whereas the latter temporo-parietal region is associated with thinking about other people's thoughts.

Past research using simulations of social interaction has identified the dorso-medial prefrontal cortex as a key area involved in social engagement. The quietness of this region in the current study suggests it may have been the competitive or social-judgement elements of previous paradigms, rather than social interaction per se, that led to its activation.

'Social interaction in the presence of a live person (compared to a visually identical recording) resulted in activation of multiple neural systems which may be critical to real-world social interactions but are missed in more constrained, offline experiments,' the researchers said. In particular, Redcay's group said that their new set-up would be ideal for studying the social difficulties associated with autistic spectrum disorders (ASD). Attempts to identify the neural bases of these difficulties have previously met with mixed success. 'A neuroimaging task that includes the complexity of dynamic, multi-modal social interactions may provide a more sensitive measure of the neural basis of social and communicative impairments in ASD,' the researchers said.

_________________________________
Redcay, E., Dodell-Feder, D., Pearrow, M.J., Mavros, P.L., Kleiner, M., Gabrieli, J.D., & Saxe, R. (2010). Live face-to-face interaction during fMRI: a new tool for social cognitive neuroscience. NeuroImage, 50(4), 1639-47. PMID: 20096792

Image courtesy of Elizabeth Redcay.
Six weeks of computer brain training has little benefit beyond boosting performance on the specific tasks included in the training. That's according to an online study involving more than 11,000 participants, conducted as part of the BBC's Bang Goes The Theory science programme.

Adrian Owen of the MRC Cognition and Brain Sciences Unit and his colleagues first measured participants' baseline performance on a battery of freely available 'benchmark' tests. Included were measures of reasoning, verbal short-term memory, spatial working memory and paired-associates learning (a test of longer-term verbal memory).

The participants, who had an average age of 39, then formed three groups. The first group spent six weeks, for a minimum of ten minutes a day, three times a week, performing computerised training tasks in reasoning, planning and problem solving. The second group spent the same time training on a broader range of tests of short-term memory, attention, visuospatial processing and mathematics, similar to those found in commercial brain-training products. The final, control group spent the same time using the internet to find answers to obscure quiz questions.

Participants in all groups showed improvements on the specific tasks included in their training regimens, but a repeat of the benchmark tests used at the study outset showed that these benefits had not generalised, not even when the training tests and benchmark tests involved similar cognitive processes. The vanishingly modest transferable benefits of brain training that were observed were no greater than those found in the control group after the same amount of time spent Googling the answers to obscure general knowledge questions. To take one example, following the brain training, participants in the second group were on average able to remember three-hundredths of a digit more than before training.
Participants in the Googling control group, who had no formal brain training, were able to remember two-tenths of a digit more than before the study. 'These results provide no evidence for any generalised improvements in cognitive function following brain training in a large sample of healthy adults,' the researchers said.

What about the possibility that the training regimens in the current study weren't long enough to generate transferable benefits? This seems unlikely because there was a negligible link between the number of training sessions completed and the amount of observed transferable benefit. 'That said,' the researchers admitted, 'the possibility that an even more extensive training regime may have eventually produced an effect cannot be excluded.'

The results of this study will be shared and discussed on Bang Goes The Theory on BBC One at 9pm on 21 April and on the BBC's Lab UK website. The new findings are just the latest to cast doubt on the value of commercial brain-training products. A 2008 investigation by the consumer charity Which? concluded that 'none of the claims [of commercial brain training products] are supported by peer-reviewed research published in a recognised scientific journal and involving the specific product'. The Which? investigators, Adrian Owen among them, recommended a healthy diet, physical exercise and challenging mental activities, including learning a new instrument or language, or completing crosswords, as the most effective ways to maintain a healthy mind.

_________________________________
Owen, A.M., Hampshire, A., Grahn, J.A., Stenton, R., Dajani, S., Burns, A.S., Howard, R.J., & Ballard, C.G. (2010). Putting brain training to the test. Nature [In Press].

Link to interactive website featuring the benchmark cognitive tests used in the current study, including useful background information.
Link to Which? investigation of brain-training products.
Link to BBC Bang Goes The Theory programme.
Link to recent feature article in The Independent on brain training.
You want to impress but you realise that bold-faced bragging can backfire. So instead you highlight the achievements of those close to you - perhaps your son or daughter's success, or even a colleague's - in the hope of basking in the reflected glory. 'I'm a lecturer at Neverland University,' you say, 'our head of department just won a Nobel Prize.' Bad move. According to Nurit Tal-Or's latest research on the psychology of boasting, this form of indirect self-promotion, known as 'burnishing', carries all the costs of bragging but none of the gains.

Sixty participants aged between 60 and 90 read one of three versions of a fictional scenario in which a 68-year-old called Joseph attended a university reunion. In one version, Joseph tells his former classmates that he's a professor of bio-medicine at a well-respected university. In another, he says he used to hold that position. In the final version he says his son holds that post. The participants were asked to rate Joseph's character. The Joseph who bragged about his son was rated as no more sociable than the Joseph who boasted about his own past or present career, but was rated as less capable. The indirect bragging was just as costly as direct bragging, it seems, but carried none of the gains.

A second study found the same pattern of results in a different context and with the benefit of a control condition. In this case 83 students read one of four versions of a conversation between two undergrads. In one version, one student tells the other that last month he came second in a marathon. In another, he says he came second in a marathon when at school. In the indirect boasting condition, he says that his brother came second in a marathon last month. And in the control condition, he says that their (shared) stats tutor came second in a marathon. The student who boasted, whether directly or indirectly, was rated by participants as more manipulative than the student in the control version.
And yet only the student who boasted about himself was rated as more able than the control student. 'When people boast about the success of other people, this need to bask in the reflected glory of the success of others may be perceived as pathetic and unworthy of respect,' Tal-Or said. Another possibility is that 'when people brag about their associates' success, their audience may suspect that they themselves do not have any successes of their own to be proud of.'

Previous research by Tal-Or has shown that bragging is perceived as more socially acceptable when it occurs in the context of a topic raised by someone else. Meanwhile, research published in 2008 showed that name-droppers are perceived as manipulative and incompetent.

_________________________________
Tal-Or, N. (2010). Direct and indirect self-promotion in the eyes of the perceivers. Social Influence, 5(2), 87-100. DOI: 10.1080/15534510903306489
When groups of people get together to make decisions, they often struggle to fulfil their potential. Part of the reason is that they tend to spend more time talking about information that everyone shares rather than learning fresh insights from each other. In a forthcoming paper, Andreas Mojzisch and Stefan Schulz-Hardt have uncovered a new reason groups so often make sub-optimal decisions. The researchers show that when a group begins a discussion by sharing members' initial preferences, the group subsequently devotes less attention to the information brought to the table by each member, and so fails to reach the optimal decision. The practical implication is clear - if you can, avoid beginning group decision-making sessions with an exchange of members' initial preferences.

Mojzisch and Schulz-Hardt began their investigation with a carefully controlled simulation of a real group discussion. Rather than exchanging ideas face-to-face, dozens of participants were presented with some selective written information about various job candidates and either told or not told about the initial preferences of other group members who'd received different information. Each participant then received the information that had been given to all the other group members. Participants needed to consider the information available to the entire group if they were to identify the optimum candidate. Crucially, participants who began the session by hearing about other group members' initial candidate preferences were subsequently less successful at using the group's shared information to pick the optimum candidate. A memory test suggested this was because they'd paid less attention to the relevant information than had the participants who'd been kept in the dark about other members' initial preferences.

A final study tested these effects in a real, face-to-face group decision-making situation.
One hundred and eighty students participated in sixty three-person groups tasked with selecting the best of three job candidates. Each group member started off with a unique set of information about the three candidates, and the optimum candidate could only be identified if group members shared their unique information with each other. Once again, groups were far less successful at sharing the necessary information, and therefore at reaching an optimal decision, if they began their session by sharing their initial preferences. As before, the reason was that sharing initial preferences led group members to pay less attention to the relevant information during the group discussion.

'The take-home message of our study is simple,' Mojzisch told the Digest. 'Ninety per cent of group discussions start with the members exchanging their pre-discussion preferences. Our research shows that learning the other group members' preferences at the beginning of a group discussion has a negative effect on the quality of group decision-making.'

_________________________________
Mojzisch, A., & Schulz-Hardt, S. (2010). Knowing others' preferences degrades the quality of group decisions. Journal of Personality and Social Psychology.

PS. This study is due to be published in the Journal of Personality and Social Psychology in May. I will add a link to the abstract as soon as it's available.

PPS. The authors of the current study tipped off the Digest editor about their research findings. If you have some exciting peer-reviewed research in press, you too could tip off the Digest editor, for the chance to have your findings popularised on one of the world's leading psychology blogs. Email: christianjarrett[@]gmail.com Thanks!
Research conducted in the aftermath of a devastating Chinese earthquake has uncovered a paradoxical psychological phenomenon - survivors living in the most devastated regions appear to be the least concerned by the ongoing risks. Shu Li and colleagues dubbed this the 'Psychological Typhoon Eye' in a paper published last year, and now they've reported follow-up investigations suggesting the effect was still in evidence a year after the disaster.

The 2008 Wenchuan Earthquake registered 8 on the Richter scale and killed over 68,000 people. More than four million people were also injured. In their initial paper, Shu Li's team observed that survivors living in the most devastated regions were the least concerned, as measured by their estimates of: how many relief workers were needed, the likelihood of an epidemic outbreak, the need to take safety measures against aftershocks, and the dose that would be needed if a fictitious psychological medication were made available to an earthquake victim.

The new study of over 5,000 residents finds that this association held after four and eleven months, and it also replicates the finding using a 'relational distance' measure of involvement in the quake. That is, people who reported having closer rather than more distant relations who'd been affected by the quake tended to report less ongoing concern about the threat.

One of the explanations for the Psychological Typhoon Eye mooted in Li's 2009 paper was psychological immunity - the idea being that exposure to danger builds psychological resilience. However, the new study undermined this explanation - people living in the most devastated regions showed the same level of Psychological Typhoon Eye regardless of whether they themselves had suffered physical or economic harm from the quake. Another possible explanation is cognitive dissonance.
The idea here is that continuing to live in a dangerous area is psychologically uncomfortable - to justify this decision, people have to downplay the risks in their own minds. Li's team said more research was needed to test this explanation.

These studies are not the first to find paradoxical psychological responses to danger. Research published in the 1970s found that people living nearer to French nuclear power stations perceived the risk to be lower than people living further away.

_________________________________
Li, S., Rao, L., Bai, X., Zheng, R., Ren, X., Li, J., Wang, Z., Liu, H., & Zhang, K. (2010). Progression of the "Psychological Typhoon Eye" and variations since the Wenchuan Earthquake. PLoS ONE, 5(3). DOI: 10.1371/journal.pone.0009727

Image courtesy of Wikimedia Commons.
Mirror neurons are one of the most hyped concepts in psychology and neuroscience. V.S. Ramachandran famously wrote that they will 'do for psychology what DNA did for biology'. Although recordings from single cells in the brains of monkeys have identified 'mirror' neurons that respond both to the execution of a movement and to the observation of another agent performing that same movement, the existence of such cells in humans had, until now, been inferred only from indirect evidence, particularly brain imaging. Now, for the first time, Roy Mukamel and colleagues have provided direct evidence, using implanted-electrode recordings of single cells, for the existence of mirror neurons in humans.

Mukamel's team seized the opportunity for single-cell recording provided by the clinical investigations being carried out on patients with intractable epilepsy. These patients had electrodes implanted in their brains to identify the loci of their seizures. Mukamel and his colleagues recruited 21 of these patients and had them watch videos of hand gestures or facial expressions on a laptop in one condition, and perform those same gestures and expressions in another.

Most of the 1,177 cells that were recorded showed a response either to the execution of an action or to the sight of that action, not both. However, there was a significant subset of 'mirror' neurons in the front of the brain, including the supplementary motor area, and in the temporal lobe, including the hippocampus, that responded to both the sight and the execution of the very same actions.

Critics could argue that rather than having mirror properties, these cells were responding to a concept. For example, according to this argument, a cell that responded to the sight of a smile and the execution of a smile was actually being activated by the smile concept. Mukamel's group reject that argument.
They had a control condition in which words for actions appeared on a screen, rather than those actions being seen or performed. The postulated mirror neurons responded to the sight and execution of an action, but not to the word.

Another potential criticism is that the execution-related activity of a postulated mirror neuron is triggered by the sight of one's own action, rather than by motor output per se. However, this can't explain the mirror neurons that responded both to the sight of a given facial expression and to one's own execution of that expression (although proprioceptive feedback could still be a potential confound).

Mirror neurons make functional sense in relation to empathy and imitative learning, but a drawback could be unwanted imitation and confusion regarding ownership of actions. The researchers uncovered another subset of cells that could help reduce these risks - cells that were activated by the execution of a given movement but inhibited by the sight of someone else performing that same movement (or vice versa). 'Taken together,' the researchers concluded, 'these findings suggest the existence of multiple systems in the brain endowed with neural mirroring mechanisms for flexible integration and differentiation of the perceptual and motor aspects of actions performed by self and others.'

_________________________________
Mukamel, R., Ekstrom, A.D., Kaplan, J., Iacoboni, M., & Fried, I. (2010). Single-neuron responses in humans during execution and observation of actions. Current Biology [In Press].
Emails feel so transient, so disembodied, that we're more tempted to lie when sending them than when writing with pen and paper. That's according to Charles Naquin and colleagues, who tested the honesty of students and managers as they played financial games.

Forty-eight graduate business students were presented with an imaginary $89 kitty and had to choose how much they'd tell their partner was in the kitty, and how much of the kitty to share with their partner. Crucially, some participants shared this information by email, others by pen and paper. You guessed it - those who shared the info by email were more likely to lie about the kitty size (92 per cent of them did vs. 63 per cent of the pen-and-paper group), and they were also more unfair in how they shared the money. Participants in the email group also said they felt more justified in misrepresenting the amount of money to their partner.

A follow-up study ramped up the ecological validity. One hundred and seventy-seven full-time managers took part in a group financial game. Participants formed teams of three, with each member playing the manager of a science project negotiating for grant money. This game was played with real money, the players all knew each other, and any lies would be revealed afterwards. Once again, players who shared information by email were more likely to lie and cheat than were players who shared information by pen and paper.

Naquin's team said their results chime with previous research showing, for example, that peer performance reviews are more negative when conducted online rather than on paper. 'Moving paper tasks online either within or across organisational boundaries should be undertaken with caution,' they said. For example: 'Taxes using the increasingly popular e-filing system could be even more fraught with deception than the traditional paper forms.'

_________________________________
Naquin, C., Kurtzberg, T., & Belkin, L. (2010). The finer points of lying online: E-mail versus pen and paper. Journal of Applied Psychology, 95(2), 387-394. DOI: 10.1037/a0018627
The Research Digest blog was five years old in February. As part of an ongoing celebratory series, I've asked Dr Gavin Nobes of the University of East Anglia to look back on his research on children's naive models of the Earth, which I covered in March 2005, and to reflect on that study and the field more generally. Here's what he had to say:

"Almost 15 years ago the late George Butterworth visited UEL and inspired a group of us to follow up some work he and Michael Siegal had started in Australia. Using a novel, forced-choice question task, they were testing the claim, based on children's drawings, that children have theory-like 'naive mental models' of the Earth; that is, that children believe it to be (for example) flat, or a hollow sphere in which we live. This area of research has important implications for our understanding of the acquisition of knowledge, and for science education. For example, if children are influenced primarily by their intuitions and observations (as proponents of the naive mental model approach claim), they would be expected to think the Earth is flat; but if cultural communication is the principal source of information, children's first concept of the Earth should be a rudimentary version of the scientific, spherical model.

In the study featured in the Digest five years ago, Georgia Panagiotaki, Alan Martin and I asked children not to draw but to choose, from a set of pictures, those that they thought best represented the Earth. As in the Australian study, we found that children knew much more about the Earth than previous researchers had claimed, and we found no evidence of naive mental models.

Despite this apparently strong evidence from two different methods, the debate continued. Our recognition methods (forced-choice questions and picture selection) were criticised on the grounds that, unlike the earlier studies based on children's drawings, they failed properly to elicit children's understanding.
We responded to these criticisms by giving the same open-ended, drawing-based questions (used in the earlier studies) to university students. We were amazed to find that many of them drew exactly the same pictures, and gave identical non-scientific answers, as had the children who were supposed to have naive mental models. Subsequent interviews revealed that the students had drawn and answered in these ways because they didn't understand the questions - despite them being designed for 5-year-olds! Further experiments with a new version of the task, in which we rephrased the original open questions to reduce their ambiguity, led both adults and children to give substantially fewer non-scientific answers. We concluded that naive mental models are methodological artifacts: children and adults give these responses to the original instrument because the questions are poorly worded.

One recommendation that arises from this work is that, wherever possible, different methods should be used to test the same hypotheses. Another is that, however simple your children's task might be, try it out first on adults: this is quick, easy, and can be remarkably revealing. And third, don't be too dispirited by negative reviews: especially early on, editors sent our submissions to proponents of the naive mental model view, whose disparaging reviews resulted in rejections. Had it not been for Michael's and George's generous support and encouragement, we would probably have given up and turned to less controversial areas of research."

_________________________________
Nobes, G., Martin, A., & Panagiotaki, G. (2005). The development of scientific knowledge of the Earth. British Journal of Developmental Psychology, 23(1), 47-64. DOI: 10.1348/026151004x20649

Look out for more of these 'looking back' guest posts in the coming months.
Psychologists have devoted entire careers to finding out how people can be persuaded, but far less time to investigating what people know intuitively about persuasion. Now Karen Douglas and colleagues at Kent University have bucked this trend with a paper which they say shows people have an intuitive understanding of how a person's thinking style affects their vulnerability to persuasion - an understanding that mirrors what's known formally as the 'elaboration likelihood model'. This is the idea, supported by research findings, that people who have a greater inclination for thinking things through tend to be less swayed by adverts that use superficial tricks like beautiful models and slick graphics, but are more persuaded by adverts that make an intelligent argument. The jargon for the character trait in question is 'need for cognition'.

Douglas's team asked 132 non-psychology undergrad students to rate either themselves or 'other students in their class' on their weak-mindedness, their strong-mindedness and their 'need for cognition'. Next the students looked at six colour advertisements that used style rather than intelligent argument to promote things like food and mobile phones, and their task was to say how much either they or typical undergrads in their class would be persuaded by those adverts.

The key finding was that participants' judgments about their own or other people's vulnerability to the adverts were strongly related to the scores they gave on 'need for cognition', above and beyond the relation to strong- and weak-mindedness. In other words, if they saw themselves or other students as low on this measure, then they also tended to say that they or other students would be swayed by the ads. It's as if they were applying the rules of psychology's elaboration likelihood model even though it's highly unlikely they'd ever heard of such a thing.
Another finding to come out of the research was that the students tended to think other people would be swayed by the adverts far more than they would be themselves - a well-established phenomenon in persuasion research. Past studies have suggested that this tendency to think other people will be more prone to persuasion is just another expression of our egotistical tendency to see ourselves as better than average. However, the current study suggested instead that we think other people will be more prone to persuasion (by superficial ads) because we think they have less 'need for cognition'. We probably make this assumption, the researchers said, not for self-serving, egotistical reasons but simply because we 'have greater access to our own thoughts, and therefore to occasions in which we were personally motivated to think.'

Concluding their paper, Douglas's team said: 'This research provides the first evidence that people do indeed use their intuitive understanding of persuasion, and the personal characteristics associated with persuasion, to judge the extent to which persuasive attempts will be successful.'

In as-yet-unpublished research Douglas has further shown how people are able to make effective use of their lay theories of persuasion. In one study participants tailored mobile phone adverts appropriately according to whether their audience was described as high or low in need for cognition. For example, for consumers who think more, the participants chose an ad with more detail on technical specifications. 'We and Tobias Vogel at the University of Heidelberg have a lot of data on this topic ... it's very interesting because no research (to our knowledge) to date investigates people's lay theories of persuasion and certainly not how people use these theories to persuade people,' Douglas told the Digest. 'Our findings suggest that people do have some kind of awareness of how persuasion works and can use their knowledge to attempt to persuade people.
It's just the beginning really - while people seem to have an intuitive understanding of how thinking style relates to persuadability, it could plausibly extend to other aspects of persuasion and persuasive techniques such as social norms and the foot-in-the-door technique.'

_________________________________
Douglas, K., Sutton, R., & Stathi, S. (2010). Why I am less persuaded than you: People's intuitive understanding of the psychology of persuasion. Social Influence, 5 (2), 133-148. DOI: 10.1080/15534511003597423
Stanley Milgram's 1960s obedience to authority experiments, in which a majority of participants applied an apparently fatal electric shock to an innocent 'learner', are probably the most famous in psychology, and their findings still appal and intrigue to this day. Now, in a hunt for fresh clues as to why ordinary people were so ready to harm another, Nestar Russell, at Victoria University of Wellington, has reviewed Milgram's personal notes and project applications, which are housed at Yale University's Sterling Memorial Library.

Milgram trained under Solomon Asch, author of the famous conformity experiments, and the obedience project was originally conceived as an extension of Asch's work. Milgram was going to see how the behaviour of a group of cooperating participants (actually confederates working for the researcher) influenced naive participants' willingness to harm another. A condition in which single participants followed the experimenter's orders on their own was planned as a mere control condition.

It was during Milgram's extensive pilot work that he discovered the remarkable willingness of participants to obey instructions, without the need for group coercion, thus changing the direction of his project. The focus shifted to lone participants and Milgram began a process of trial-and-error pilot work to identify the perfect conditions for inducing obedience - what he described as 'the strongest obedience situation'. Early on, Milgram recognised the need for an acceptable rationale for harming another and so he invented the cover story that the experiment was about using punishment to improve learning. To counter participants' reluctance to harm an innocent person, Milgram also devised several other 'strain resolving mechanisms'.
These included replacing the final shock level label 'LETHAL' with the more ambiguous 'XXX'; removing a Nazi-sounding 'pledge to obey' from the experiment instructions; and creating physical distance between the participants and the innocent, to-be-electrocuted learner. In fact, this latter factor worked too well. When Milgram removed any sight or sound of the learner, 'virtually all' participants showed a willingness to inflict lethal harm. Milgram realised this near-total obedience was counter-productive and would prevent his paradigm from 'scaling obedient tendencies'. For his first official experiment he therefore settled on auditory feedback only, in the form of the learner banging on the wall in distress.

Another 'strain resolving mechanism' that Milgram devised was increasing the number of levels on the shock generator. This allowed for exploitation of the 'foot in the door' persuasion effect whereby people are more likely to cooperate once they have already agreed to a less significant request - a kind of piecemeal compliance.

Milgram was also careful about the actors he chose to play the parts of experimenter and learner. Though both were non-professionals, the man acting as learner was chosen because he was 'mild and submissive; not at all academic' and a 'perfect victim', whilst the man playing the experimenter was 'stern' and 'intellectual looking'. Finally, Milgram was careful to plan things so that the 'experimenter', whenever challenged, replied that he was responsible for anything that happened to the learner.

Taken altogether, Russell's new analysis shows how Milgram used ad hoc trial-and-error pilot testing to hone his methodology and ensure his first official obedience experiment achieved such a high obedience rate (of 65 per cent). 'Knowing step-by-step how Milgram developed this result may better arm theorists interested in untangling this still enigmatic question of why so many participants inflicted every shock,' Russell said.
_________________________________
Russell, N. (2010). Milgram's obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology. DOI: 10.1348/014466610X492205 [Open Access]
The widespread misconception that psychology is easy and mere common sense has its roots in the biased way that children work out whether a topic is challenging or not. Frank Keil and colleagues asked children aged between five and thirteen, and adults, to rate the difficulty of questions from physics (e.g. How does a spinning top stay upright?), chemistry, biology, psychology (e.g. Why is it hard to understand two people talking at once?) and economics. The questions had been carefully chosen from earlier pilot work in which they'd all been rated as equally difficult by adults. Consistent with the pilot work, the adults in the study proper rated the questions from the different disciplines as equally difficult. However, children from age 7 to 13 rated psychology as easier than the natural sciences - physics, chemistry and biology - which they rated as equally difficult.

Young children can't possibly have the depth of understanding to know which scientific questions are more difficult. Instead they must resort to some kind of mental short-cut to reach their verdict. Keil's team think that children's feelings of control over their own psychological faculties - memories, emotions and so forth - and the superficial familiarity of those kinds of concepts, likely lead them to believe psychological concepts are easier to understand.

A second study provided some support for this account. This time children and adults rated the difficulty of questions from within the various branches of psychology. Similar to the first study, the children, but not the adults, rated questions related to social psychology, personality and emotions as progressively easier, compared with questions related to cognition, perception and biological psychology, which they rated as progressively more difficult.

So, when do these childish misconceptions leak through into adult judgments?
For a third study, another batch of children and adults were again presented with the same questions from the different scientific disciplines, but this time they were asked to say whether they would be able to solve each question on their own (or require expert help) and to estimate what proportion of the adult population would know the answers.

This time the adults as well as the children tended to say they could solve more psychology questions on their own, compared with questions in the other sciences, and kids and adults alike estimated that more people knew the answers to the psychology questions. Remember, these were psychology questions that adults had already rated as just as difficult and complex as questions in the other sciences. 'Such biases [towards seeing psychology as easy] may be observed when tasks do not so directly ask about difficulty of understanding and instead use measures such as ease of learning on one's own,' the researchers said.

Keil's team said their findings have real-life implications, for example in the court-room. 'If psychological phenomena are seen as usually quite easy to understand and largely self-evident and if such judgments are inaccurate and underestimate the need for experts,' they warned, 'cases might well be decided in ways that unfairly exclude valuable expert insights.' In fact, the researchers pointed out that such situations have already occurred. In the US trial of former Presidential Assistant I. Lewis 'Scooter' Libby, for example, the judge disallowed the use of psychology experts on memory, on the basis that the jury could rely on their common-sense understanding of memory. This is particularly ironic given that prior psychology research has shown that jurors and judges have a woefully poor understanding of how memory actually works.

_________________________________
Keil, F.C., Lockhart, K.L., & Schlegel, E. (2010). A bump on a bump? Emerging intuitions concerning the relative difficulty of the sciences. Journal of Experimental Psychology: General, 139 (1), 1-15. PMID: 20121309
A short while ago there was a shocking advert on British TV that used slow motion to illustrate the bloody, crunching effects of a car crash. The driver had been drinking. Using this kind of scare tactic for anti drink-driving and other health campaigns makes intuitive sense. The campaigners want to grab your attention and demonstrate the seriousness of the consequences if their message is not heeded. However, a new study makes the surprising finding that for a portion of the population, scare tactics can back-fire, actually undermining a message's efficacy.

Steffen Nestler and Boris Egloff had 297 participants, 229 of them female, average age 35, read one of two versions of a fictional news report from a professional medical journal. The report referred to a study showing links between caffeine consumption and a fictional gastro-intestinal disease, 'Xyelinenteritis'. One version was extra scary, highlighting a link between Xyelinenteritis and cancer and saying that the participant's age group was particularly vulnerable. The other version was lower-key and lacked these two details. Both versions of the article concluded by recommending that readers reduce their caffeine consumption. Before gauging the participants' reactions to the article and its advice, the researchers tested them on a measure of 'cognitive avoidance'. People who score highly on this personality dimension respond to threats with avoidance tactics such as distracting themselves, denying the threat or persuading themselves that they aren't vulnerable.

The key finding is that participants who scored high on cognitive avoidance actually rated the threat from Xyelinenteritis as less severe after reading the scary version of the report compared with the low-key version. Moreover, after reading the scary version, they were less impressed by the advice to reduce caffeine consumption and less likely to say that they planned to reduce their caffeine intake.
On the other hand, highly cognitively avoidant participants were more responsive to the low-key report than were the low cognitively avoidant participants. In other words, for people who are cognitively avoidant, scary health messages can actually back-fire. 'Practically, our results suggest that instead of giving all individuals the same threat communications, messages should be given that are concordant with their individual characteristics,' Nestler and Egloff said. 'Thus, the present findings are in line with the growing literature on tailoring interventions to individual characteristics, and they highlight the role of individual differences when scary messages are used.'

_________________________________
Nestler, S., & Egloff, B. (2010). When scary messages backfire: Influence of dispositional cognitive avoidance on the effectiveness of threat communications. Journal of Research in Personality, 44 (1), 137-141. DOI: 10.1016/j.jrp.2009.10.007

Also on the Digest:
- Morbid warnings on cigarette packs could encourage some people to smoke.
- How to promote the MMR vaccine.
- Public health leaflets ignore findings from health psychology.
Internet use is growing at a phenomenal rate and much ink has been spilled by commentators forecasting the psychological consequences of all this extra web-time. A lot of that comment is mere conjecture, whilst many of the studies in the area are cross-sectional, with small samples, producing conflicting results. The latest research contribution comes from Irena Stepanikova and her colleagues and involves a massive sample, some of whom were followed over time. The results suggest that more time on the internet is associated with increased loneliness and reduced life satisfaction. However, it's a complicated picture because the researchers' different outcome measures produced mixed results.

Over thirteen thousand people answered questions about their internet use, loneliness and life satisfaction in 2004 and in 2005. They'd been chosen at random from a list of US land-line numbers. The majority of the people quizzed in 2004 were different from those quizzed in 2005, but 754 people participated in both phases, thus providing some crucial longitudinal data. An important detail is that the researchers used two measures of internet use. The first, 'time-diary', method required participants to consider six specific hours spread out over the previous day and to estimate how they'd spent their time during those hours. The other, 'global recall', measure was more open-ended and required participants to consider the whole previous twenty-four hours and detail as best they could how they'd used that time.

The cross-sectional data showed that participants who reported spending more time browsing the web also tended to report being lonelier and less satisfied with life. This association was larger for the time-diary measure. The strength of the association was modest, but to put it in perspective, it was five times greater than the (inverse) link between loneliness and amount of time spent with friends and family.
Turning to web-communication, the global recall measure showed that time spent instant messaging and in chat rooms and news groups (but not email) was associated with higher loneliness scores. For the time-diary measure, it was increased email use that was linked with more loneliness. The longitudinal data showed that as a person's web browsing increased from 2004 to 2005, their loneliness also tended to increase (based on the global recall measure only). Both measures showed that increased non-email forms of web communication, including chat rooms, also went hand in hand with increased loneliness. Finally, more web browsing over time was linked with reduced life satisfaction by the time-diary measure, whilst more non-email web communication over time was linked with reduced life satisfaction by the global recall measure.

Perhaps the most important message to come out of this research is that the results varied with the measure of internet use that was used - future researchers should note this. The other message is that more time browsing and communicating online appears to be linked with more loneliness, the two even increasing together over time. However, it is important to appreciate that we don't know the direction of causation. Increased loneliness may well encourage people to spend more time online, rather than web time causing loneliness. Or some other factor could be causing both to rise in tandem. It's worth adding too that the web/loneliness link held even after controlling for time spent with friends and family. So if more web use were causing loneliness, it wasn't doing it by reducing time spent socialising face-to-face. 'We are hopeful that our study will stimulate future research ... ,' the researchers said, 'but at this point any claims suggesting that as Internet use continues to grow in the future, more people will experience loneliness and low life-satisfaction would be premature.'

_________________________________
Stepanikova, I., Nie, N., & He, X. (2010). Time on the Internet at home, loneliness, and life satisfaction: Evidence from panel time-diary data. Computers in Human Behavior, 26 (3), 329-338. DOI: 10.1016/j.chb.2009.11.002
We're slower to direct our attention to the same location twice in succession, a well-established phenomenon that cognitive psychologists call 'inhibition of return' (IoR). It's thought the mechanism may make our search of the visual scene more efficient by deterring us from looking at the same spot twice. Now Paul Skarratt and his colleagues have documented a new 'social' form of inhibition of return, in which people are slower to attend to a location that social cues, such as gaze direction, suggest another person has already attended to.

Twelve participants sat at a table with an animated character projected opposite. Each participant and their animated partner had two lights and two buttons in front of them, near the middle of the table (see figure above). One light/button pair was to the left, the other to the right. The basic task was to press the corresponding button as fast as possible when its light came on. Participants were slower to respond to a light when the animated partner had just responded to the adjacent light on their side of the table - this is what you might call a weak version of social inhibition of return. However, when two large vertical barriers were put up with a gap in the middle, so that the participants could only see their partner's eyes and initial reaching action, and not their actual button presses, this social IoR disappeared.

In a second experiment, the animated partner was replaced with a human. This time, the social IoR effect occurred even when the barriers were erected and only the partner's eye gaze and initial hand movement could be seen. In other words, inferences about where the partner was going to attend, based on their eyes or early hand movement, seemed to be enough to inhibit a participant's own attention to the same location. For some reason, this strong version of social IoR only occurred with a real, human partner, not the animated, computer-controlled partner of the first experiment.
The final experiment added yet another visual barrier, which left only the partner's eyes or only their early hand movement visible. This was to try to establish which cue was the more important for provoking social IoR. The answer was that both cues were equally effective.

It's only supposition at this stage, but Skarratt and his team think social IoR could be supported by the postulated mirror neuron system. Monkey research has shown, for example, that there are mirror neurons in the premotor cortex that fire whether a monkey sees another individual grasp an object or only the initial part of that grasping movement.

'Although the critical mechanisms underlying social IoR remain to be discovered,' the researchers said, 'the current study indicates that it can be generated independently of direct sensory stimulation normally associated with IoR, and can occur instead on the basis of an inference of another person's behaviour.'

_________________________________
Skarratt, P., Cole, G., & Kingstone, A. (2010). Social inhibition of return. Acta Psychologica, 134 (1), 48-54. DOI: 10.1016/j.actpsy.2009.12.003

Figure courtesy of Paul Skarratt.
The sight of their own blood plays a key role in the comfort that some non-suicidal people find in deliberately cutting themselves. That's according to a new study by Catherine Glenn and David Klonsky, which suggests it is those self-harmers with more serious psychological problems who are more likely to say the sight of blood is important.

There are plenty of anecdotal reports hinting at the importance of the sight and taste of blood to self-harmers, as well as references in popular music. 'Yeah you bleed just to know you're alive,' sing the Goo Goo Dolls in Iris. 'I think it's time to bleed / I'm gonna cut myself and / Watch the blood hit the ground,' sing Korn on Right Now. However, this is the first systematic investigation of the topic. Glenn and Klonsky recruited 64 self-harmers from a mass screening of 1,100 new psychology students. With an average age of 19, and 82 per cent of them female, the students answered questions about their self-harming and other psychological problems, and specifically reported on the importance of the sight of blood.

Just over half the participants said that it was important to see blood when they self-harmed, with the most common explanation being that it helps relieve tension and induces calmness. Other explanations were that it 'makes me feel real' and shows that 'I did it right/deep enough'. The participants who said blood was important didn't differ in age or gender from those who said it wasn't. However, the blood-important group reported cutting themselves far more often (a median of 30 times compared with 4) and they were more likely to say they self-harmed as a way of regulating their own emotions. The blood-important group also reported more symptoms consistent with bulimia nervosa and borderline personality disorder.

'Overall, these results suggest that self-injurers who report it is important to see blood are a more clinically severe group of skin-cutters,' the researchers said.
'Therefore, a desire to see blood during non-suicidal self-injury may represent a marker for increased psychopathology.'

Glenn and Klonsky said more research was needed to find out why the sight of blood has the significance it does for some people who self-harm. However, they surmised that the sight of one's own blood could, after an initial rise in heart-rate, lead to a rebound effect characterised by reduced heart-rate and feelings of calmness.

_________________________________
Glenn, C., & Klonsky, E. (2010). The role of seeing blood in non-suicidal self-injury. Journal of Clinical Psychology. DOI: 10.1002/jclp.20661
Information, information, information. That's the message from one of the first studies to look at people's preferences for different forms of advice. Reeshad Dalal and Silvia Bonaccio presented hundreds of students with fictional decision-making scenarios, such as choosing which job to apply for. The students were offered various permutations of advice and asked to say how satisfied they'd be if a friend had given them that advice. The different kinds of advice were: which option to go for; which option not to go for; info on how to make the decision (e.g. use a points allocation system); information on one or more of the options; and sympathy about the difficulty of making the decision. Whilst all forms of advice were positively received, the students' consistent preference was for information about one or more of the options.

A second study spiced things up by introducing more varied decision-making scenarios: where to locate a new store; how to lay off excess staff; and how to invest an inheritance. A fresh batch of students were presented with the new scenarios, and this time they were to imagine they'd solicited the advice from an expert, rather than a friend, to see if this made any difference to their responses. Information again came out as the most preferred form of advice. However, this time round, specific advice on which option to go for was also particularly well received, especially in the investment scenario.

The researchers said past research on advice giving has tended to focus purely on advice in the form of 'I recommend option X', so this study makes a novel contribution. 'Across the situational and dispositional variables we examined, decision-makers appeared to want their advisors to provide information about the alternatives,' the researchers said.
Advice that says 'go for option X' can also be well received, but only in specific circumstances, such as when it has been explicitly solicited from an expert.

When it comes to lessons for real life, Dalal and Bonaccio said more research was needed to see how their results generalise, but in the meantime they advised: 'Individuals who are advising decision-makers should at the very least be careful to provide information along with their recommendations.'

_________________________________
Dalal, R., & Bonaccio, S. (2010). What types of advice do decision-makers prefer? Organizational Behavior and Human Decision Processes. DOI: 10.1016/j.obhdp.2009.11.007

Related Digest item: 'We're more likely to listen to expensive advice'.
People don't need to be treated as a stereotype for harm to occur; their mere belief that they could be viewed in a stereotyped fashion is enough - a phenomenon known as 'stereotype threat'. For example, women reminded of the stereotype that men are better at maths tend to perform more poorly in a subsequent maths task, even if they are actually treated fairly. Now Julie Henry and colleagues have extended this line of research to the domain of mental health. They've found that patients with a schizophrenia diagnosis function less well socially when they think that the person they're chatting with knows their diagnosis.

Thirty people diagnosed with schizophrenia or schizoaffective disorder spent a few minutes chatting on their own to one research assistant and then did the same with another assistant an hour later. There were a few points of deception: first, the participants were led to believe that the assistants were participants from another study. Also, most importantly, before one of the conversations began, they were told that the assistant knew about their diagnosis of schizophrenia; before the other, they were told the assistant did not know. They were also told, truthfully, that neither of the people they were to chat with had a diagnosis of schizophrenia themselves.

In reality, the research assistants didn't know whether each participant had a diagnosis of schizophrenia or not. This was achieved by having them chat to the participants diagnosed with schizophrenia plus a number of control participants. Crucially, they weren't told in advance who was who. After each conversation, the research assistants rated the social behaviour of the person they'd just chatted with.
The participants in turn rated the behaviour of the assistant they'd just chatted with and said how they felt the conversation had gone.

The key finding is that the social functioning of the participants with schizophrenia seemed to deteriorate when they thought their conversational partner knew their diagnosis (even though, in fact, the partner didn't). Specifically, when they thought their diagnosis had been disclosed, the participants were rated by the research assistants as more impaired at initiating conversations and at switching topics appropriately, and the assistants also found these conversations less comfortable.

Henry's team can't be sure, but they think these apparent deficits emerged because the participants' concern about how they would be judged, in light of their diagnosis having been disclosed, interfered with their ability to converse effectively.

A further twist was that the participants with schizophrenia seemed unaware of these effects - they reported finding the conversations in which their diagnosis was supposedly known just as comfortable and successful as those in which they thought it had been kept hidden. This contrasts with non-clinical research on stereotype threat, in which people seem to be aware of the effects on their performance.

The results provide food for thought regarding when and how mental health diagnoses should be disclosed. The researchers said their findings suggest 'that one of the defining qualities of [schizophrenia] - social skill impairment - is not caused solely by the disorder per se, but rather, also derives from feelings of being stereotyped.'

_________________________________
Henry, J., von Hippel, C., & Shapiro, L. (2010). Stereotype threat contributes to social difficulties in people with schizophrenia. British Journal of Clinical Psychology, 49 (1), 31-41. DOI: 10.1348/014466509X421963
If a mother has a negative perception of her baby when it's just one month old, there's a strong possibility that same baby will have attachment problems as an adult, thirty or forty years later. That's the claim of a longitudinal study that recommends screening new mothers to see if they have a negative perception of their child, so that any necessary action can be taken to stop the transmission of attachment problems from mother to child.

Elsie Broussard and Jude Cassidy recruited twenty-six adults in the Pittsburgh area whose mothers had signed up to a longitudinal study up to forty years earlier. Back then, in the 60s and 70s, the mothers had been asked to rate their one-month-old babies on factors like crying, spitting, sleeping, feeding and predictability, and then do the same for the 'average baby'. Twelve of the babies were judged to be at risk because their mothers had rated them more negatively than an average baby. Back in the present, the researchers interviewed the adults using the Adult Attachment Interview, which includes questions about their memories of childhood, of separation and loss, and about whether they felt affected by their parents' behaviour. Based on these kinds of questions, the participants were classified as securely or insecurely attached, the latter classification suggesting ongoing problems forming healthy emotional attachments to other people.

The key finding is that 9 of the 12 adults who, so many years earlier, had been perceived negatively by their mothers were today classified as insecurely attached, compared with just 2 of the 14 adults who'd been positively perceived by their mothers. '...These findings reflect transmission from one individual's representational world to that of another,' the researchers said.
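The size of that contrast is easy to sanity-check from the reported counts alone. A minimal Python sketch (the variable names and the odds-ratio summary are ours, not the researchers'):

```python
# Counts reported by Broussard & Cassidy (2010): 9 of 12 negatively
# perceived babies vs 2 of 14 positively perceived babies were later
# classified as insecurely attached adults.
negatively_perceived = {"insecure": 9, "total": 12}
positively_perceived = {"insecure": 2, "total": 14}

def insecure_rate(group):
    """Proportion of the group later classified as insecurely attached."""
    return group["insecure"] / group["total"]

rate_neg = insecure_rate(negatively_perceived)  # 9/12 = 0.75
rate_pos = insecure_rate(positively_perceived)  # 2/14 ~ 0.14

# Odds ratio: odds of insecure attachment after a negative early
# perception versus after a positive one.
odds_neg = 9 / (12 - 9)    # 3.0
odds_pos = 2 / (14 - 2)    # ~0.17
odds_ratio = odds_neg / odds_pos  # 18.0
```

So roughly three-quarters of the negatively perceived group versus about one in seven of the positively perceived group were later insecurely attached; bear in mind, though, that with only 26 participants these estimates carry wide uncertainty.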
In other words, the researchers believe that a mother who views her baby negatively has attachment problems herself, and these problems tend to be passed on to that baby, affecting his or her attachment style even thirty or forty years later. How could a negative attachment style be transmitted in this way? Apparently, earlier work in Broussard's lab showed that 'mothers with a negative perception of their infants had limited awareness of their infant's states, had difficulties recognising their infant's signals, and lacked a flexible and effective range of responses.' Moreover, the researchers surmised, babies whose mothers perceive them negatively may fail to come to see their mother as a secure base and may come to feel 'rejected and unloved, feelings that may contribute to an insecure state of mind [in adulthood] with respect to attachment.' Given their results, Broussard and Cassidy suggested more professional support be given to new mothers, especially during the critical early period between hospital discharge and the next contact with medical staff.

As with so many studies that look for effects of parenting on children, this one contains a serious confound that's barely touched upon by the researchers. The effects that Broussard and Cassidy attribute to parenting and attachment style could well be genetic. We're not surprised when the children of tall parents grow up to be tall. Perhaps we shouldn't be surprised when the children of insecurely attached parents grow up to be insecurely attached themselves.

_________________________________
Broussard, E., & Cassidy, J. (2010). Maternal perception of newborns predicts attachment organization in middle adulthood. Attachment & Human Development, 12 (1), 159-172. DOI: 10.1080/14616730903282464
When it comes to avoiding infection, a growing body of evidence suggests we don't just have a physiological immune system, we also have a behavioural immune system - one that alerts us to people likely to be carrying disease, and that puts us off interacting with them. Indeed, there's research showing that people who are more fearful of disease tend to hold more xenophobic attitudes and to display greater prejudice towards people with outwardly visible disabilities. Now Chad Mortensen and his co-workers have extended this line of research by showing that a disease-themed slide show makes people feel less sociable and extravert, and primes their motor system for repelling other people.

In the first study, half of 59 participants watched a disease- and infection-themed slide show before completing a measure of their own personality. The other participants watched a slide show about architecture before doing the same. The researchers took pains to conceal the true purpose of the study: they asked participants to rate the slide shows' usefulness for another project and had them answer irrelevant questions. The key finding was that participants who watched the disease slide show subsequently rated themselves as less extravert than did the control participants. Also, among participants who scored highly on a measure of fear of disease, those who watched the infection slide show went on to rate themselves as less open to experience and less agreeable. Taken together, this suggests that reminders of disease make us view ourselves as less outgoing and gregarious, especially if we're the kind of person who's already fairly neurotic about infection.

If these effects are real, you'd expect them to have some effect on actual behaviour. The second study tested this by having participants watch one of the slide shows before completing a computer task.
The task involved faces and shapes flashing on a screen, with participants responding via a button press that required either an extension or a contraction of the arm. The take-home finding here was that participants who watched the disease slide show were quicker at the button presses that required them to extend their arm - the same muscle action that would be needed to push someone away. This effect was particularly strong among those participants who were more scared of infection. Again, cover stories were used to conceal the true purpose of the study. '...It appears that humans have evolved a mechanism that responds to environmental cues of disease and modulates attitudes and behaviours in functionally appropriate ways,' the researchers said.

Looking to the future, Mortensen and his colleagues added that it would be interesting to see whether there could be a reverse effect in conditions in which the risk of infection appeared to be absent. In this case, people normally afraid of infection might become particularly extravert and sociable.

_________________________________
Mortensen, C., Becker, D., Ackerman, J., Neuberg, S., & Kenrick, D. (2010). Infection Breeds Reticence: The Effects of Disease Salience on Self-Perceptions of Personality and Behavioral Avoidance Tendencies. Psychological Science, 21 (3), 440-447. DOI: 10.1177/0956797610361706

Related open-access article in The Psychologist magazine: 'Parasites, minds and cultures'.