Flagged by the Algorithm

Why Klarna Thought I Was a Fraudster

  • Author: Theresa Adamietz
  • Category: Deep Dive

    Payment provider Klarna flagged me for allegedly failing to pay on time for an online order. Their own support had told me that a glitch in their platform was preventing me from paying, yet their algorithms sent my open bill straight to a collection agency, demanding nearly double the original amount. What followed was an administrative ordeal and a personal fight against flawed and opaque automation. Here is my story.

    From Pay Later to Pay Twice

    I loved Klarna’s pay-later option. Especially in the weeks before Christmas, when expenses for gifts, decorations, and endless cookie dough add up, it was a huge relief to order those cute pants and the boots to match for New Year’s without having to pay for them until the holiday madness was over. Little did I know I was treating myself to a rage fit over a collection letter instead of just a new outfit.

    On the due date in January, I initiate the payment in the Klarna app as usual. The next day, an email arrives: payment failed. I don’t think much of it and try again. Payment failed again. Confused, I hop on Klarna’s customer support chat. To my relief, they tell me it’s a technical glitch on their end blocking payments and instruct me not to retry but to wait until they get back to me once it’s fixed. I do as I’m told.

    A week later, a collection letter hits my inbox, demanding nearly double the original amount. Surely this is a bad joke? Furious, I call Klarna. After an eternity on hold, an unfriendly agent tells me there is nothing they can do. I should have just paid on time. Bewildered, I call the collection agency. Another unfriendly agent, but at least he explains how to file an objection.

    What follows are weeks of back and forth with the collection agency, trying to convince them this was all a mistake. As a last resort, I cite a clause in the German Civil Code (BGB) stating that if you can’t pay on time for reasons beyond your control, you’re not at fault. Bingo. Still insisting they acted correctly, they offer to settle if I pay what I owe to Klarna plus a €10 service fee. Relieved to put an end to it, I agree and pay immediately.

    Although things are settled now, I feel defeated. This was never just about the ridiculous amount of money they were demanding. My credit score went downhill, I’m blocked on other platforms like PayPal, and I still can’t wrap my head around why Klarna forwarded my bill to a collection agency in the first place. The more I think about it, the more I suspect deeper issues.

    A Guide to GDPR for Algorithmic Accountability

    Eager to find out, I decide to exercise my GDPR rights. Article 15 gives individuals the right to access all personal data Klarna has collected about them. I also ask for information on any automated decisions related to my account, specifically the forwarding of my bill to the collection agency. Article 13 requires companies to provide such information, including meaningful information about the logic behind any automated decisions and whether a human was involved in reviewing or approving them. Finally, Article 22 grants me the right not to be subjected to decisions based solely on automated processing that produce legal or similarly significant effects. And I assume mine did: my credit score was damaged, presumably because an algorithm decided to escalate my bill.

    Joana, the first customer agent working my case, confirms receipt of my data request but fails to provide answers to my other questions. Instead, she sends a generic response explaining that “orders with overdue payments are forwarded to a debt collection agency in line with standard procedures” and that “although automated systems are used, human oversight is always in place to ensure fairness and accuracy in the decision-making processes.”

    I’m not letting them off the hook so easily. Now that they’ve admitted to using automated systems that escalate bills, I feel like I’m getting closer to an explanation. Once again, with the GDPR on my side, I request details about the process behind forwarding my bill to the collection agency.

    This time, I have a response within the hour. Now it’s Amarilda, confirming what I already suspected: “At Klarna, we use automated systems to make decisions about forwarding claims to debt collection agencies. These decisions are based on various factors, including your payment history and outstanding balances. [They] are made automatically and cannot be overridden.”

    I read it again, hardly believing what I’m seeing. I reply by requesting a manual review of the forwarding of my bill (Article 22). Immediately, I get a response from Nevila. But my initial sense of progress evaporates instantly. She simply tells me to contact the collection agency with my questions. No mention of any review.

    My rage returns. But now I have proof: my bill was forwarded to the collection agency based solely on automated decisions, with zero human oversight. On top of that, they violated numerous other GDPR provisions, most notably my right to an explanation of the logic behind those automated decisions and to a manual review.

    Finally, I file an official complaint with the responsible Data Protection Authority on May 23rd. It is my final attempt to bring clarity and accountability to the situation. Their investigations are still ongoing.

    When I inform Klarna about the complaint, Marsid replies. She simply copies and pastes the part of my email where I mention the complaint, then wishes me a good day. It feels like mockery.

    Understanding the Bigger Picture

    Knowing how this happened doesn’t undo the damage, but it helps me understand how my case fits into a broader pattern of opaque automation.

    I spoke to Dr. Tim Kraft, a lawyer specializing in data protection and media law at Lausen Rechtsanwälte. He points out that while the GDPR generally prohibits solely automated decision-making with significant effects for the individual, there are exceptions. One is consent: if you accept Klarna’s privacy policy, and it includes information about automated decision-making, you’re effectively agreeing to it, along with any consequences. Klarna’s privacy policy (the one I blindly agreed to upon creating an account) does mention profiling and automated decisions for things like fraud prevention or credit checks, but not for debt collection. So, while I may have consented to automation in other areas, I did not agree to having my bill forwarded to a collection agency by an algorithm.

    On that basis, Kraft concludes Klarna likely violated the GDPR, specifically Article 22, which prohibits decisions based solely on automated processing that significantly affect individuals: “Forwarding a bill to a collection agency is a decision that significantly affects the customer. If, according to Klarna’s own statements, this decision is based solely on automated processing and it is even impossible for a human to intervene or override that decision, this applies all the more.”
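    To make concrete what is at stake, here is a minimal sketch in Python. Everything in it is invented for illustration (the rule, the thresholds, the field names; Klarna has not published its logic). It simply contrasts a decision flow with no human branch, as Klarna described its own, with one where a reviewer can intervene, the safeguard Article 22(3) envisages.

        from dataclasses import dataclass

        @dataclass
        class Claim:
            days_overdue: int
            outstanding_balance: float
            risk_score: float  # e.g. a bureau score normalised to 0..1

        def escalate_solely_automated(claim: Claim) -> bool:
            # No human branch anywhere: if the rule fires, the claim goes
            # to a collection agency and the outcome "cannot be overridden".
            return claim.days_overdue > 14 and (
                claim.outstanding_balance > 50 or claim.risk_score > 0.6
            )

        def escalate_with_review(claim: Claim, reviewer_approves: bool) -> bool:
            # The same rule, but a human can stop the escalation before it
            # happens, which is what Article 22(3) points towards.
            return escalate_solely_automated(claim) and reviewer_approves

        claim = Claim(days_overdue=21, outstanding_balance=79.90, risk_score=0.7)
        print(escalate_solely_automated(claim))                      # True
        print(escalate_with_review(claim, reviewer_approves=False))  # False

    Note what the first function never asks: why the payment failed. To a rule like this, a glitch on Klarna’s own side is indistinguishable from genuine non-payment.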

    But the problem with such algorithms is not just the lack of human oversight. As the saying goes: garbage in, garbage out. If an algorithm is fed biased or flawed data, its output will inevitably reflect those same flaws. Klarna confirmed that its decision was based on payment history data and outstanding balances. And according to its own privacy policy, Klarna also draws on external data from credit agencies, like SCHUFA in Germany.

    Kraft notes that “SCHUFA relies on an individual’s home address to create a credit score, in the absence of any other data relating to an individual like their payment history or recent credits.” In other words: the mere fact that I may live in a socially disadvantaged neighbourhood (however that is classified) could have lowered my score on the basis of arbitrary data, despite my flawless payment record.
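    To see how that plays out, consider a deliberately simplified Python sketch. The postcodes, weights, and numbers are all made up (SCHUFA’s actual model is proprietary); the point is only that when an address-derived proxy is weighted heavily, a flawless payment history barely moves the result.

        # Hypothetical mapping of postcodes to "neighbourhood risk".
        NEIGHBOURHOOD_RISK = {"10115": 0.1, "12043": 0.8}

        def combined_risk(postcode: str, missed: int, on_time: int) -> float:
            address_risk = NEIGHBOURHOOD_RISK.get(postcode, 0.5)
            history_risk = missed / max(missed + on_time, 1)
            # With the address proxy weighted heavily, payment behaviour
            # barely matters: garbage in, garbage out.
            return 0.7 * address_risk + 0.3 * history_risk

        # A flawless payer in a postcode labelled "risky" scores far worse
        # than a patchy payer in a postcode labelled "safe":
        print(combined_risk("12043", missed=0, on_time=24))  # 0.56
        print(combined_risk("10115", missed=6, on_time=18))  # 0.145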

    In this context, the European Court of Justice ruled in 2023 (in the so-called SCHUFA case, C-634/21) that Article 22 also applies when a third party relies heavily on an automatically generated credit score to provide or deny services. So Klarna’s algorithm, drawing on my SCHUFA score for several decisions, may have been predisposed to escalate the issue, regardless of my actual payment behavior. Note that this is a speculative argument: Klarna’s privacy policy notes that it relies on SCHUFA data to decide, for instance, whether to grant a user credit. However, it does not explicitly disclose that SCHUFA data is also used for bill escalation.

    Unfortunately, my case is not an outlier. Similar opaque systems have caused harm elsewhere. Take the French welfare system, which has faced heavy criticism for using an algorithm that assigns “suspicion scores” to beneficiaries, flagging them as suspicious the closer the score is to one. Or the UK Department for Work and Pensions, which uses algorithms to flag universal credit claims for potential fraud. An analysis revealed that the system disproportionately targeted people based on age, disability, marital status, and nationality.

    Taking Back Control From the Algorithm

    With my background in AI Governance and a solid understanding of the GDPR, I had the tools to fight back. But what about those who don’t? “Data protection rights are personal rights. Thus, only the person affected by a breach of data protection can enforce their rights,” Kraft points out. This sounds sobering at first, but he continues: “Exercising your rights is easy, and anyone can do it. A request does not have to be put in legalese and can be issued in common language. In fact, we see a lot of our clients facing an increasing amount of such requests, so it appears that people become more aware of their rights.”

    Still, if you’re unsure how to proceed, you can turn to consumer protection organizations like the “Verbraucherzentrale” in Germany or “noyb”, which works across Europe. They can help by taking on your case and filing complaints on your behalf, and they have secured numerous victories in recent years.

    Author

    Theresa Adamietz

    Theresa joined [at] in early 2024 as Content & Communication Manager. She oversees all content-related activities, from identifying relevant topics for the target audience to writing blog posts and whitepapers and supporting our experts in developing their own content. She is also a freelance journalist specializing in AI Governance, technology ethics, and data protection. Certified in both AI governance and GDPR auditing, Theresa additionally supports [at]’s projects on AI strategy, compliance, and risk management.

