
Is State Farm Guilty of Racial Discrimination? The Presence of Algorithmic Bias in Claims Systems

  • Writer: FULR Management
  • 6 days ago
  • 8 min read

By Hannah Karvosky '26


In March 2022, a devastating storm hit Evanston, Illinois, causing severe damage to several residents’ homes. Ms. Wynn, an Evanston resident, suffered serious damage to the roof of her townhome, including the membrane being blown off all three of its units. (1) It is common knowledge among homeowners that any roof damage must be addressed quickly, as it can lead to leaks and water damage if left unrepaired. Well aware of these risks, Ms. Wynn and her neighbor both filed claims with their insurance company, State Farm, on March 6, 2022. (2) Despite filing claims through the same insurance company, for the same issue, and on the exact same date, Ms. Wynn’s claim faced far more scrutiny than her neighbor’s. State Farm demanded additional documentation, estimates, and inspections from Ms. Wynn. (3) Ultimately, her claim took approximately three months longer to process than her neighbor’s, by which point the damage was far too severe for Ms. Wynn to remain living there safely, forcing her to move out. (4)


Ms. Huskey faced a strikingly similar situation less than a year earlier. On June 12, 2021, hail battered Ms. Huskey’s home in Matteson, Illinois. (5) The hail broke some of the shingles on her roof, causing leaks in two of her bathrooms and her kitchen. (6) Ms. Huskey was also a State Farm home insurance policyholder, so she filed a claim for assistance with the repairs. (7) Over the following months, several adjusters came and went, but no progress was made on her claim. (8) Ms. Huskey states that she contacted State Farm 20 to 30 times after filing her initial claim in June. (9) Nearly four months after she filed, State Farm granted the claim, but only partially, paying about $4,700 for interior repairs when the damage totaled about $7,000. (10) The delay in State Farm’s claims processing left Ms. Huskey’s kitchen and two bathrooms with significant water damage, which reduced her home’s value. (11) It is apparent that Ms. Wynn and Ms. Huskey were not treated fairly during the processing of their State Farm insurance claims. The real question is why. The answer may be related to a rising issue that has generated much legal discourse in recent years: artificial intelligence.


In recent years, artificial intelligence has advanced rapidly, and many companies have adopted it to save time, money, and labor; this becomes a problem, however, when companies rely on it to make decisions for them. While AI can be useful as a decision-making tool, it cannot process or interpret human emotions or social norms. As a result, biases can arise even when the company using the AI never intended them. In these cases specifically, there is serious concern that Ms. Wynn’s and Ms. Huskey’s claims faced intense scrutiny simply because the two women are Black. (12) Furthermore, Ms. Wynn and Ms. Huskey are not the only people claiming unfair treatment by State Farm. In a 2021 survey conducted by YouGov, 648 white and 151 Black homeowners with State Farm policies in the Midwest were asked about their experiences with claims filed through State Farm. (13) The survey showed that, among other disparities, white homeowners were almost a third more likely than Black homeowners to have their claims processed expeditiously. (14) Additionally, “Black policyholders were 39% more likely to have to submit extra paperwork to justify their claims, causing months of delay in receiving coverage for urgent repairs.” (15) Such significant disparities strongly suggest that bias may be the cause.


Ms. Wynn and Ms. Huskey are only two of thousands of individuals who claim to have been treated unfairly because of State Farm’s use of algorithmic decision-making tools in its claims review process. (16) The plaintiffs allege that the algorithmic tools State Farm employs are trained to predict the likelihood that each claim is fraudulent. (17) Claims deemed “low touch” are paid out immediately, while claims deemed “high touch” face additional scrutiny, resulting in lengthy delays. (18) The bias is believed to have been caused by the system’s inputs “that correspond with race or learn from historic housing or claims data that is biased.” (19) If a racial bias exists within State Farm’s system, that does not necessarily mean it was placed there intentionally. Rather, an algorithmic bias may be present, which occurs when an algorithm’s inputs, usually shaped by the data science team that collects and codes the training data, produce unfair or discriminatory outcomes. (20) Algorithmic bias has been a growing concern in recent years as more companies and institutions use algorithms to aid in decision-making. To reference a recent example, algorithmic bias was also found in the COMPAS algorithm (Correctional Offender Management Profiling for Alternative Sanctions), which was used to assess the risk of recidivism and produced a higher false positive rate for Black defendants than for white defendants. (21)
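
To make the COMPAS comparison concrete, the disparity at issue can be expressed as a difference in false positive rates: the share of legitimate claims (or, in the COMPAS setting, non-reoffending defendants) that the model nonetheless flags. The sketch below is purely illustrative, using invented records rather than any figures from the litigation, and shows how such an audit might be computed.

```python
# Illustrative sketch only -- the claim records below are invented, not data from
# the State Farm litigation or the COMPAS study. A "false positive" here is a
# legitimate (non-fraudulent) claim that the model nonetheless flagged "high touch."

def false_positive_rate(records):
    """records: list of (flagged_high_touch, actually_fraudulent) boolean pairs."""
    false_positives = sum(1 for flagged, fraud in records if flagged and not fraud)
    legitimate = sum(1 for _, fraud in records if not fraud)
    return false_positives / legitimate if legitimate else 0.0

# Hypothetical audit samples for two groups of policyholders.
claims_group_a = [(True, False), (True, False), (False, False), (True, True), (False, False)]
claims_group_b = [(False, False), (False, False), (True, False), (True, True), (False, False)]

fpr_a = false_positive_rate(claims_group_a)  # 2 of 4 legitimate claims flagged -> 0.50
fpr_b = false_positive_rate(claims_group_b)  # 1 of 4 legitimate claims flagged -> 0.25
print(f"Group A FPR: {fpr_a:.2f}  Group B FPR: {fpr_b:.2f}  ratio: {fpr_a / fpr_b:.1f}x")
```

Equal overall accuracy can mask exactly this kind of asymmetry, which is why audits of this sort report error rates separately for each group.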


Further evidence of technological bias in State Farm’s claims processing system can be found in its relationship with Duck Creek Technologies, an insurance-specialized technology company that assists in processing claims. (22) Given the number of insurance claims State Farm receives each day, it does not have the capacity to handle all of them in-house; for this reason, State Farm turns to Duck Creek Technologies to assist in processing its claims. (23) Duck Creek utilizes software from the artificial intelligence firm FRISS, which flags claims it believes could be fraudulent. (24) FRISS evaluates claims by giving “each insurance policyholder a ‘risk score’ by running the customer’s information through its computer programs”; the score is based on demographic data about the neighborhood, crime statistics, and even data harvested from social media. (25) Because inputs such as neighborhood demographics and local crime statistics correlate strongly with race, FRISS’s method of assigning risk scores may be racially discriminatory toward Black policyholders.
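
FRISS’s actual model is not public, so the following is only a hedged sketch of how a risk score of the kind described above could be assembled: a weighted combination of neighborhood-level and behavioral signals, thresholded into a “high touch” review queue. Every feature name, weight, and cutoff below is an assumption made for illustration, not a description of FRISS’s or State Farm’s software.

```python
# Hypothetical illustration only -- NOT FRISS's or State Farm's actual model.
# All weights, feature names, and the cutoff are invented for this sketch.

def risk_score(neighborhood_claim_rate, local_crime_rate, social_media_flags):
    """Weighted sum of neighborhood-level and behavioral signals (weights invented)."""
    return (0.5 * neighborhood_claim_rate              # historical claims in the area
            + 0.3 * local_crime_rate                   # local crime statistics
            + 0.2 * min(social_media_flags / 5, 1.0))  # signals harvested from social media

HIGH_TOUCH_THRESHOLD = 0.5  # hypothetical cutoff for routing a claim to extra review

score = risk_score(neighborhood_claim_rate=0.7, local_crime_rate=0.6, social_media_flags=1)
queue = "high touch" if score >= HIGH_TOUCH_THRESHOLD else "low touch"
print(f"score = {score:.2f} -> {queue} review")  # 0.35 + 0.18 + 0.04 = 0.57 -> high touch
```

Notice that none of these inputs describes the policyholder’s own conduct on the claim: two identical losses can land in different review queues solely because of where the claimants live.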


With the multitude of Black State Farm policyholders coming forward about unfair treatment in the claims process, along with the evidence of State Farm’s use of Duck Creek and FRISS, there is solid ground for a class action against State Farm. Sanford Heisler Sharp McKnight, along with co-lead counsel Fairmark Partners, LLP, filed a class action complaint with the U.S. District Court for the Northern District of Illinois on behalf of Ms. Huskey, Ms. Wynn, and other Black homeowners in the Midwest. (26) The plaintiffs allege under the federal Fair Housing Act that the claims filed by Ms. Huskey and Ms. Wynn were treated unfairly by the automated claims processing system because of their race. (27) The complaint alleges that the bias in State Farm’s automated claims processing system stems from its reliance on “(1) biometric data that function as proxies for race, such as physical appearance, genetics, and voice; (2) intrusive behavioral data that function as proxies for race, such as geolocation, social media presence, and browser search history; and (3) historical housing and claims data that are themselves infected with racial bias.” (28) Given the algorithm’s reliance on data that functions as a proxy for race, the complaint contends, State Farm’s claims processing system identifies a disproportionate number of claims submitted by Black homeowners as ‘high touch’ and, in turn, subjects them to greater scrutiny than those of their white counterparts. (29) On this basis, the plaintiffs allege that State Farm is engaging in discriminatory practices with a widespread impact on Black homeowners, in direct violation of the Fair Housing Act. (30)


Although the Fair Housing Act has prohibited racial discrimination in housing since 1968, such discrimination continues today in more covert forms. One way that institutions and companies have continued to discriminate on the basis of race is through automation and data mining, including the use of machine-learning algorithms. (31) Unfortunately, these algorithms can easily develop biases by identifying patterns in their input data. Even if demographic data is excluded, algorithms can still use other variables as proxies for race to identify these patterns. (32) State Farm, as noted, uses Duck Creek Technologies, which incorporates FRISS software, a system whose risk-scoring methods rely on exactly the kinds of inputs that can serve as proxies for race. Furthermore, insurance companies’ claims processing systems are often opaque and lack sufficient regulatory oversight, meaning there is little external assurance that racial discrimination is not occurring within these algorithms. (33)
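
The proxy problem can be illustrated with a small, fully hypothetical example: a scoring rule that never receives race as an input, but that keys on ZIP code, can still produce a racial disparity wherever ZIP code and race are correlated in the underlying population. All of the ZIP codes, risk values, and group compositions below are invented for illustration.

```python
# Fully hypothetical example -- all ZIP codes, risk values, and group compositions
# are invented. The point: a "race-blind" score can still track race via proxies.
from collections import defaultdict

# Hypothetical policyholders as (zip_code, race); race is NEVER given to the model.
policyholders = [
    ("60201", "Black"), ("60201", "Black"), ("60201", "White"),
    ("60443", "Black"), ("60443", "Black"), ("60443", "Black"),
    ("60093", "White"), ("60093", "White"), ("60093", "White"),
]

# Hypothetical model: penalizes ZIP codes with higher historical claim/crime rates,
# a pattern that can itself reflect decades of segregation and disinvestment.
ZIP_RISK = {"60201": 0.6, "60443": 0.8, "60093": 0.2}

def risk_score(zip_code):
    return ZIP_RISK[zip_code]  # note: race does not appear anywhere in the model

# Audit step: group the scores by race (race is used only for the audit, not the model).
scores_by_race = defaultdict(list)
for zip_code, race in policyholders:
    scores_by_race[race].append(risk_score(zip_code))

for race, scores in scores_by_race.items():
    print(race, round(sum(scores) / len(scores), 2))
# Prints roughly: Black 0.72, White 0.3 -- a disparity produced without the
# model ever "seeing" race, because ZIP code acted as a proxy.
```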


If State Farm’s claims processing system uses machine-learning algorithms with known discriminatory effects, then the company is perpetuating and worsening existing patterns of racial discrimination. (34) Because the Fair Housing Act prohibits racial discrimination in housing, and because disparate-impact claims under the Act do not require proof of intent, State Farm may be found in violation regardless of whether the discrimination in its claims processing is intentional, meaning something must be done to remedy it. Furthermore, addressing the algorithmic bias within State Farm’s system does not have to be complex or costly. Testing for bias using census-tract data or inferred demographics is a standard approach. (35) State Farm itself has acknowledged that undesired factors can be identified within an algorithm and that the algorithm can be taught not to consider those factors when setting a premium amount. (36) If State Farm not only has the means but has also shown that it is willing to remedy racial bias within its machine-learning algorithms, then it has the full capacity to find a way to stop the racial discrimination within its claims processing system. Additionally, Gina Morss Fischer, a State Farm spokeswoman, stated, “This suit does not reflect the values we hold at State Farm. State Farm is committed to a diverse and inclusive environment, where all customers and associates are treated with fairness, respect, and dignity.” (37) If these are truly the values State Farm upholds as a company, it should have no objection to taking steps to eliminate racial bias within its claims processing system going forward.
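
As a concrete picture of what such a routine test might look like, the sketch below compares “high touch” referral rates between groups whose membership is inferred from census-tract demographics. The counts are hypothetical and chosen only to echo the scale of disparity reported in the YouGov survey; none of them comes from State Farm’s data.

```python
# Hypothetical audit counts only -- not State Farm data. Group membership is
# assumed to have been inferred from census-tract demographics, as described
# above, and the claims are assumed to be otherwise comparable.

black_flagged, black_total = 390, 1000   # claims routed to "high touch" review
white_flagged, white_total = 280, 1000

rate_black = black_flagged / black_total   # 0.39
rate_white = white_flagged / white_total   # 0.28

relative_rate = rate_black / rate_white    # ~1.39x
print(f"Black high-touch rate: {rate_black:.0%}")
print(f"White high-touch rate: {rate_white:.0%}")
print(f"Black claims flagged {relative_rate:.2f}x as often as white claims")
```

A persistent gap of that size is exactly the kind of signal a routine test is designed to surface so that the inputs driving it can be identified and removed.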


Endnotes

  1. “State Farm Algorithm Bias Lawsuit.” Sanford Heisler Sharp McKnight, LLP, September 13, 2024. https://sanfordheisler.com/case/discrimination-harassment/state-farm-algorithm-bias-lawsuit/

  2. Ibid.

  3. Ibid.

  4. Ibid.

  5. Flitter, Emily. “Where State Farm Sees ‘a Lot of Fraud,’ Black Customers See Discrimination.” The New York Times, March 18, 2022. https://www.nytimes.com/2022/03/18/business/state-farm-fraud-black-customers.html

  6. Ibid. 

  7. Ibid. 

  8. Ibid. 

  9. Ibid.

  10. Ibid.

  11. Jacqueline Huskey, et al. v. State Farm Fire & Casualty Co., No. 22-cv-7014 (N.D. Ill. Dec. 13, 2022) (Class Action Complaint).

  12. Alafriz, Olivia, and Kaustuv Basu. “AI’s Racial Bias Claims Tested in Court as US Regulations Lag.” Bloomberg Law, February 7, 2025. https://news.bloomberglaw.com/artificial-intelligence/ais-racial-bias-claims-tested-in-court-as-us-regulations-lag

  13. “State Farm Algorithm Bias Lawsuit.” Sanford Heisler Sharp McKnight, LLP, September 13, 2024. https://sanfordheisler.com/case/discrimination-harassment/state-farm-algorithm-bias-lawsuit/

  14. Ibid.

  15. Ibid.

  16. Alafriz, Olivia, and Kaustuv Basu. “AI’s Racial Bias Claims Tested in Court as US Regulations Lag.” Bloomberg Law, February 7, 2025. https://news.bloomberglaw.com/artificial-intelligence/ais-racial-bias-claims-tested-in-court-as-us-regulations-lag

  17. Ibid.

  18. Jacqueline Huskey, et al. v. State Farm Fire & Casualty Co., No. 22-cv-7014 (N.D. Ill. Dec. 13, 2022) (Class Action Complaint).

  19. Ibid.

  20. Alafriz, Olivia, and Kaustuv Basu. “AI’s Racial Bias Claims Tested in Court as US Regulations Lag.” Bloomberg Law, February 7, 2025. https://news.bloomberglaw.com/artificial-intelligence/ais-racial-bias-claims-tested-in-court-as-us-regulations-lag

  21. Jonker, Alexandra, and Julie Rogers. “What Is Algorithmic Bias?” IBM, September 20, 2024. https://www.ibm.com/think/topics/algorithmic-bias

  22. Yong, Ed. “A Popular Algorithm Is No Better At Predicting Crimes Than Random People.” The Atlantic, January 17, 2018. https://www.theatlantic.com/technology/archive/2018/01/equivant-compas-algorithm/550646/

  23. Merlin, Chip. “State Farm Accused of Systemic Discrimination in Class Action Lawsuit.” Merlin Law Group, January 3, 2023. https://www.propertyinsurancecoveragelaw.com/blog/state-farm-accused-of-systemic-discrimination-in-class-action-lawsuit/

  24. Flitter, Emily. “Where State Farm Sees ‘a Lot of Fraud,’ Black Customers See Discrimination.” The New York Times, March 18, 2022. https://www.nytimes.com/2022/03/18/business/state-farm-fraud-black-customers.html

  25. Ibid. 

  26. Ibid. 

  27. “State Farm Algorithm Bias Lawsuit.” Sanford Heisler Sharp McKnight, LLP, September 13, 2024. https://sanfordheisler.com/case/discrimination-harassment/state-farm-algorithm-bias-lawsuit/

  28. Alafriz, Olivia, and Kaustuv Basu. “AI’s Racial Bias Claims Tested in Court as US Regulations Lag.” Bloomberg Law, February 7, 2025. https://news.bloomberglaw.com/artificial-intelligence/ais-racial-bias-claims-tested-in-court-as-us-regulations-lag

  29. Jacqueline Huskey, et al. v. State Farm Fire & Casualty Co., No. 22-cv-7014 (N.D. Ill. Dec. 13, 2022) (Class Action Complaint).

  30. Ibid.

  31. Ibid.

  32. Ibid.

  33. Ibid.

  34. Ibid.

  35. Ibid.

  36. Ibid.

  37. Flitter, Emily. “Where State Farm Sees ‘a Lot of Fraud,’ Black Customers See Discrimination.” The New York Times, March 18, 2022. https://www.nytimes.com/2022/03/18/business/state-farm-fraud-black-customers.html


 
 
 


Florida Undergraduate Law Review 2024 | University of Florida

All opinions expressed herein are those of individual authors and are not endorsed by the Florida Undergraduate Law Review. The Florida Undergraduate Law Review is a student-run organization and does not reflect the views of the University of Florida.
