
Phishing for Justice: When Cybercriminals Hack the Human Mind

By Sophia Sonkin '27 & Shreya Avadhuta '24


Welcome to “The Cyber Courtroom,” a new series exploring how law and technology collide in the digital age. This collaborative project, written by FULR student writer Sophia Sonkin and Shreya Avadhuta, a cybersecurity researcher at the Florida Institute for Cybersecurity Research (FICS), will journey through the legal dimensions of cybercrime and tech law. 


Together, we’ll unravel cutting-edge cases where the courtroom meets tech, and follow how ever-evolving technologies are testing the law as it works to keep pace.


The Cost of a Click


In November 2024, a federal indictment brought against five individuals revealed a sophisticated phishing campaign so stealthy it went undetected until the damage was done. (1) Unlike traditional cyberattacks that rely on malware or hacking tools, this scheme weaponized language itself, using deceptive text messages as its Trojan horse. The attackers impersonated trusted internal security teams, IT departments, and service providers, sending urgent SMS alerts about account deactivation or security threats. Embedded within these texts were links to counterfeit login portals indistinguishable from legitimate access points. Once employees unwittingly submitted credentials, in some cases even completing two-factor authentication prompts, the hackers gained entry to critical corporate systems. (2)


The intruders then exfiltrated sensitive information: intellectual property, proprietary work products, corporate credentials, and private contact lists. They also siphoned millions from cryptocurrency accounts before law enforcement intervened. This case reflects a broader trend of cybercriminals exploiting psychological vulnerabilities through social engineering, targeting the human element rather than solely relying on technical exploits. Similar cases have involved diverse schemes, including business email compromise scams, where deceptive email addresses prompt fraudulent payments reaching millions of dollars, emphasizing that linguistic manipulation is a favored vector of modern cyberattacks. (3)


The $16.6 Billion Problem: Why Our Brains Are the Weakest Link


In 2024, FBI reports documented $16.6 billion lost to cybercrime, a 33% increase from the previous year, with phishing attacks surging nearly 60% in that timeframe. (4) Global cybercrime costs are projected to rise further, potentially reaching $10.5 trillion annually by 2025 and soaring to $15.6 trillion by 2029. As Singh and Zheng argue in “The Psychology of Cybersecurity: Hacking and the Human Mind,” the most vulnerable operating system is not Windows or iOS; it is the human brain. “Hacking is not just about exploiting vulnerabilities in software or hardware, but also exploiting vulnerabilities in people.” (5)


This forces us to view cybercrime not as mere technical trespass but as mindcrime: a fusion of persuasion, bias, and deception that the law has struggled to define. The Carbanak gang siphoned an estimated $1 billion from banks by persuading employees to act on fraudulent messages. In 2024, a deepfake CFO duped an Arup employee in Hong Kong into wiring $25 million. The attack was not a technical exploit but a confidence trick executed on a global scale. (6)


Why are these attacks so successful? Human cognition still defaults to rapid threat responses, bypassing analysis under pressure. While technology continuously advances defense and detection tools, hackers have adapted by targeting a far more vulnerable frontier: our Stone Age brains. 


Cybercriminals expertly exploit these innate cognitive patterns, sending carefully crafted messages impersonating trusted entities like bosses, banks, or government agencies. These messages tap into deep-rooted psychological triggers such as urgency, fear, and authority, triggering reflexive actions before victims can engage their rational minds. This is exacerbated by stress and time pressure, which empirical studies show degrade decision-making faculties and increase vulnerability to deception. (7) Cognitive biases, such as overconfidence and confirmation bias, further impair detection, as victims may underestimate their susceptibility or accept messages aligning with preconceived expectations. (8)


Mind Games: How Hackers Manipulate Your Cognitive Biases


Authority Bias: “Because the CEO Said So.”


Authority bias is the psychological tendency to trust and comply with directives from perceived authority figures without question. Phishers capitalize on this deep-rooted behavioral pattern by impersonating executives, IT staff, or official departments in emails or messages. Such messages use logos, formal language, and urgency to appear legitimate.

Research consistently shows that authority bias is one of the most potent levers in phishing success. For example, experimental studies reveal that individuals are significantly more likely to follow instructions if they believe the message comes from a high-ranking figure, such as a CEO or CFO. (9) Even trained employees comply when they fear disobeying orders. (10)


Ultimately, authority bias is not just a psychological vulnerability but a legal gray area: courts are increasingly challenged to determine where corporate policies, due diligence, and liability intersect when employees are tricked by sophisticated impersonation attacks. As successful phishing campaigns escalate in complexity, legal frameworks must keep evolving to address the interplay between human factors and organizational responsibility. (11)


Scarcity Bias: “Act Now!”


Scarcity bias exploits the instinctive human reaction to perceive limited availability or time constraints as signals of high value. Phishing attackers harness this bias by constructing messages that emphasize urgency, warnings such as “limited-time security update” or threats of permanent account suspension, which compel recipients to make hasty decisions without deliberate scrutiny.


Scarcity increases perceived value and prompts impulsive action. Robert Cialdini identified scarcity as one of six key principles of persuasion, highlighting how urgency or limited availability triggers a fear of missing out (FOMO) and accelerates decision-making. (12) A 2024 Harvard Business Review article found scarcity-based phishing yields higher click rates. (13) Furthermore, organizations’ phishing simulations reveal that urgency-based lures consistently outperform more neutral messages in eliciting responses. (14) This psychological manipulation creates vulnerabilities as users focus on avoiding negative consequences rather than verifying message legitimacy. 


In essence, scarcity bias weaponizes the natural panic reaction to potential loss, fueling cybercriminals’ success in undermining cautious reflection and prompting clicks on fraudulent links. Legally, scarcity bias increases employer risk if urgent phishing messages succeed due to inadequate training or controls. Failure to address this vulnerability can lead to liability under data protection laws and negligence claims. (15)


Reciprocity Bias: “You Scratch My Back…”


Reciprocity bias refers to the deep-seated human tendency to feel compelled to return a favor when someone offers something, even if the initial goodwill is artificial. Phishers exploit this by offering “free” services or gifts, such as a complimentary security check, industry report, or bonus content, in exchange for sensitive information or login credentials.


Studies show reciprocity-based lures succeed by creating a sense of obligation. (16) This mirrors Cialdini’s classic persuasion principles, which identify reciprocity as a powerful influence tool. (12)


Further cybersecurity research finds that phishing attacks leveraging reciprocity often outperform other strategies because they hijack social norms of politeness and obligation. When a recipient perceives they have received something of value, however minimal, they feel psychologically pressured to reciprocate, often by clicking malicious links or providing credentials. (17) Legally, this complicates proving intent since victims often respond in good faith, highlighting the need for user education and robust verification protocols.


Overconfidence and Self-Efficacy: The Expert’s Blind Spot


Even cybersecurity experts can fall prey to phishing, largely due to overconfidence. Confident in their own skills, IT professionals may skip verification steps, increasing their vulnerability.


Research shows that while many employees believe they can identify phishing emails confidently, a significant portion still falls victim, illustrating the dangerous gap between confidence and competence. (18) Studies also find that higher self-efficacy can paradoxically increase phishing risk because individuals might underestimate threats and reduce vigilance. (19) This cognitive bias can lead to complacency, even among experts, making it a key target for cybersecurity training.


Legally, this raises the stakes as experts may be held to higher negligence standards given their expected competence. Thus, expertise intended to protect can paradoxically increase risk when it fuels complacency. Each of these biases underscores why training must go beyond technical drills to include cognitive and psychological readiness. 


Legal Frameworks and Blind Spots


The law still struggles to keep pace with deception. The EU’s General Data Protection Regulation (GDPR) has imposed billion-euro fines on giants like Amazon and Meta, making data protection a global standard. (20) The California Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA) give citizens rights to access, correct, and erase their data, with enforcement by a dedicated state agency. (21) The U.S. also enforces sectoral laws like the Health Insurance Portability and Accountability Act (HIPAA) for health, (22) the Children's Online Privacy Protection Act (COPPA) for children, (23) and the Gramm-Leach-Bliley Act (GLBA) for finance. (24)


Yet these frameworks target data governance, not manipulation. Singh and Zheng note that transparency alone does not neutralize persuasion. (25) A major step came in Federal Trade Commission v. LendingClub (2019), in which the FTC charged that misleading interface design amounted to unlawful deception. (26) The EU’s Digital Services Act, fully applicable since 2024, went further, banning many “dark patterns” outright. (27) Regulators now treat psychological exploitation as systemic, but criminal law lags.


Evidentiary hurdles remain daunting. It is easy to prove that a server was breached; it is far harder to prove that a victim clicked under the sway of fear or authority. Jurisdictional mismatches let attackers exploit the weakest legal environment, while employer liability is inconsistently applied. Courts differ on whether companies or employees bear blame. (28)


What the law can do is clear. Legislatures could explicitly recognize deception as an intrusion, harmonize cybercrime definitions internationally, and mandate cognitive readiness training in critical sectors, just as physical safety is legally required in workplaces. Regulators can build on GDPR and CPRA precedents, extending enforcement from how data is handled to how people are manipulated.


Persuasion as the New Intrusion


These cases show that intrusion now targets judgment, not code. These weren’t brute-force hacks; they were confidence tricks played on a global stage. Singh and Zheng describe these techniques as “all forms of lies and deception.” Hackers now deploy authority, fear, and urgency like malware. “The perfect social engineering hacker,” they warn, “would indeed be a sociopath with a heightened sensibility for a target’s context … persuading without remorse.” Yet laws continue to define intrusion as something purely technical, such as a flaw in the code, not in cognition. The true break-in happens when the victim’s judgment is compromised.


Toward a Jurisprudence of Cyberpsychology


What is needed is a jurisprudence of cyberpsychology: a framework that treats persuasion and bias as vulnerabilities as real as zero-day exploits. Singh & Zheng recommend behavioral nudges, structured cognitive training, and cultures of readiness that integrate psychology with technology.

Their warning is blunt: “Any exposure to acts of violence, including cyberattacks, generates a human experience precipitating psychological harm, heightens threat perception and generates enduring shifts in political attitudes and behaviours.” Cyberattacks don’t just cost money. They corrode trust, weaken institutions, and reshape society. Cybercrime today targets minds as well as machines. Until the law thinks psychologically, it will stay a step behind.


Endnotes

  1. U.S. Department of Justice. “5 defendants charged federally for running scheme targeted at victim companies by phishing texts.” U.S. Department of Justice Press Release, November 20, 2024.

  2. Reuters. "US Charges Five in ‘Scattered Spider’ Hacking Scheme." Reuters, November 19, 2024.

  3. U.S. Immigration and Customs Enforcement (ICE), "HSI Investigation Leads to Seizure of $3.5 Million Dollars Stolen in Business Email Compromise Scam," U.S. Immigration and Customs Enforcement, February 26, 2025.

  4. Federal Bureau of Investigation, “FBI Releases Annual Internet Crime Report,” FBI News Release, April 22, 2025.

  5. Tarnveer Singh and Sarah Y. Zheng, The Psychology of Cybersecurity: Hacking and the Human Mind. Abingdon: Routledge, 2025.

  6. Zak Doffman. “Deepfake Audio Used to Scam CEO Out of $243,000.” Forbes, September 1, 2019. https://www.forbes.com/sites/zakdoffman/2019/09/01/deepfake-audio-cybercriminals-trick-ceo-into-transferring-243000/.

  7. Scientific Reports. “Susceptibility to Phishing on Social Network Sites: A Personality Perspective.” National Library of Medicine, April 30, 2020.

  8. Trellix. “The Psychology of Phishing: Unraveling the Success Behind Social Engineering Attacks.” Trellix Research, January 31, 2024. https://www.trellix.com/blogs/research/understanding-phishing-psychology-effective-strategies-and-tips/.

  9. Megha Sharma, Manoj Kumar, Cleotilde Gonzalez, and Varun Dutt. "How the Presence of Cognitive Biases in Phishing Emails Affects Human Decision-Making." ICONIP 2022.

  10. Trellix Research. 2024. The Psychology of Phishing: Effective Strategies and Tips.

  11. Ferner & Alsdorf, "Liability of Companies in Phishing and CEO Fraud Incidents," accessed October 1, 2025, https://www.ferner-alsdorf.com/liability-of-companies-in-phishing-and-ceo-fraud-incidents/.

  12. Robert B. Cialdini. Influence: The Psychology of Persuasion. New York: Harper Business, 2007.

  13. Harvard Business Review. "Phishing Attacks Are Evolving. Here's How to Resist Them." Harvard Business Review, October 2024.

  14. Proofpoint. 2024 State of Phish Report: Impact of Human Behavior. 2024.

  15. Keeper Security, "Employer Liability for Data Breaches: What Companies Should Know," May 4, 2023, https://www.keepersecurity.com/blog/2023/05/04/employer-liability-for-data-breaches-what-companies-should-know/.

  16. Amber van der Heijden and Luca Allodi. "Cognitive Triaging of Phishing Attacks." Paper presented at the USENIX Security Symposium, 2019.

  17. Ridge Security. "How Phishing Uses Your Cognitive Biases Against You." Ridge Security Blog, 2025.

  18. KnowBe4. Security Confidence Gap Research. 2025.

  19. Abdulrahman Alnifie and Jin Kim. "Overconfidence Bias Measures and Herd Behavior in Cybersecurity." Information & Management, 2025.

  20. Adam Satariano, The New York Times, 2021 (reporting Amazon’s $887 million GDPR fine); Adam Satariano, The New York Times, 2023 (reporting Meta’s $1.3 billion GDPR fine).

  21. California Consumer Privacy Act (CCPA). (Cal. Civ. Code §§ 1798.100–1798.199)

  22. Health Insurance Portability and Accountability Act (Pub. L. 104–191).

  23. Children’s Online Privacy Protection Act (COPPA). (15 U.S.C. §§ 6501–6506)

  24. Gramm-Leach-Bliley Act (GLBA) (Pub. L. 106–102).

  25. Singh and Zheng, The Psychology of Cybersecurity: Hacking and the Human Mind

  26. Federal Trade Commission v. LendingClub Corp., No. 18-cv-02454 (N.D. Cal. 2019)

  27. Digital Services Act, Regulation (EU) 2022/2065

  28. Experi-Metal, Inc. v. Comerica Bank, 2011 WL 2433383 (E.D. Mich. 2011); Reilly v. JPMorgan Chase & Co., No. 15-cv-1119 (S.D.N.Y. 2017)


 
 
 

Florida Undergraduate Law Review 2024 | University of Florida

All opinions expressed herein are those of individual authors and are not endorsed by the Florida Undergraduate Law Review. The Florida Undergraduate Law Review is a student-run organization and does not reflect the views of the University of Florida.
