Predicting individual differences to cyber attacks: Knowledge, arousal, emotional and trust responses

Abstract

Cyber attacks are increasingly commonplace and cause significant disruption, and have therefore become a focus of much research. The objective of this research was to understand the factors that might lead users to fail to recognize red flags and succumb to cyber events. We investigated users’ knowledge of cyber attacks, their propensity to trust technology, and their arousal, emotional valence, and situational trust in response to cyber attacks of different types and severity. High-risk attacks elicited greater arousal and more negative emotional valence than low-risk attacks. The attack-type manipulation revealed distinctive patterns for phishing scenarios, including weaker affective responses than for ransomware and other malware. We further examined arousal, emotional valence, and situational trust among the subset of high-knowledge participants who successfully identified all the attacks and compared their responses with those of less knowledgeable peers. Our findings suggest that the more knowledgeable the user, the higher their general propensity to trust technology, the more sensitive their emotional responses to the risk manipulation, and the lower their situational trust when faced with cyber attack scenarios.

Bibliographic citation

Pyke, A., Rovira, E., Murray, S., Pritts, J., Carp, C. L., & Thomson, R. (2021). Predicting individual differences to cyber attacks: Knowledge, arousal, emotional and trust responses. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 15(4), Article 9. https://doi.org/10.5817/CP2021-4-9

Keywords

Cyber attacks; arousal; trust; phishing; malware; ransomware; emotion; individual differences






Copyright (c) 2021 Cyberpsychology: Journal of Psychosocial Research on Cyberspace

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.