Predicting individual differences to cyber attacks: Knowledge, arousal, emotional and trust responses



Cyber attacks are increasingly commonplace and cause significant disruption, and therefore have been a focus of much research. The objective of this research was to understand the factors that might lead users to fail to recognize red flags and succumb to cyber events. We investigated users’ knowledge of cyber attacks, their propensity to trust technology, arousal, emotional valence, and situational trust in response to different types and severities of cyber attacks. Our findings suggest that high-risk attacks elicited more arousal and more negative emotional valence than low-risk attacks. The attack-type manipulation revealed that phishing scenarios yielded distinctive patterns, including weaker affective responses than ransomware and other malware. We further examined arousal, emotional valence, and situational trust patterns among the subset of high-knowledge participants who successfully identified all the attacks and compared these responses with those of less knowledgeable peers. Our findings suggest that the more knowledgeable the user, the higher their general propensity to trust technology, the more sensitive their emotional responses to the manipulation of risk, and the lower their situational trust when faced with cyber attack scenarios.

Keywords: cyber attacks; arousal; trust; phishing; malware; ransomware; emotion; individual differences
Author biographies

Aryn Pyke

Army Cyber Institute, U.S. Military Academy

Ericka Rovira

U.S. Military Academy

Savannah Murray

U.S. Military Academy

Joseph Pritts

U.S. Military Academy

Charlotte L. Carp

University of Houston

Robert Thomson

U.S. Military Academy

