Calmara suggests it can detect sexually transmitted diseases from photos of genitals, and that's a dangerous idea

You come home with a Tinder date, and things are escalating. You don’t really know or trust this guy, and you don’t want to get an STD, so… now what?

A startup called Calmara wants you to take a photo of the man's penis, then use its AI to tell you whether your partner is "clear" or not.

Let’s make something clear right away: You shouldn’t take a photo of someone’s genitals and scan it with an AI tool to decide whether or not you should have sex.

Calmara's premise has more red flags than a bad first date, but it gets worse the more you think about it: the majority of STDs are asymptomatic. So your partner could have an STD while Calmara tells you they're in the clear. That's why actual STD tests use blood and urine samples to detect infection, rather than a visual examination.

Other startups are addressing the need for STD testing in a more responsible way.

"With in vitro diagnostics, sensitivity and specificity are two key metrics that help us understand a test's tendency to miss infections and to give false positives," Daphne Chen, founder of TBD Health, told TechCrunch. "There's always some level of fallibility, even with very stringent tests, but test manufacturers like Roche publish their validation rates for a reason — so doctors can put the results in context."

In its fine print, Calmara cautions that its findings should not be a substitute for medical advice. But its marketing suggested otherwise. Before TechCrunch reached out to Calmara, the title of its website was "Calmara: Your BFF for Unprotected Sex" (it has since been updated to say "Safer Sex" instead). A promotional video described it as "the perfect website for hooking up!"

Co-founder and CEO Mei-Ling Lu told TechCrunch that Calmara was never intended to be a serious medical tool. "Calmara is a lifestyle product, not a medical app. It does not include any medical conditions or discussions within it, and no doctors are involved in the current Calmara experience. It is a free information service."

“We are updating communications to better reflect our intentions at this time,” Lu added. “The obvious idea is to start a conversation regarding STD status and testing.”

Calmara is an offshoot of HeHealth, which was founded in 2019. Calmara and HeHealth use the same AI, which the company says is 65-90% accurate. HeHealth positions itself as a first step in assessing sexual health; from there, the platform helps users connect with partner clinics in their area to schedule an appointment for comprehensive, physical testing.

HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar — and even then, there’s a giant red flag waving: data privacy.

“It’s nice to see that they’re offering an anonymous mode, where you don’t have to associate your photos with personally identifiable information,” Valentina Milanova, founder of tampon-based STD screening startup Daye, told TechCrunch. “However, this does not mean that their service has been de-identified or anonymised, as your images may still be traceable to your email or IP address.”

HeHealth and Calmara also claim to be compliant with HIPAA, a regulation that protects patient privacy, because they use Amazon Web Services. That sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with "service providers and partners who help operate the service, including data hosting, analytics, marketing, payment processing, and security." It also does not specify whether these AI scans take place on your device or in the cloud, and if the latter, how long that data stays in the cloud and what it is used for. That's a bit too vague to reassure users that their intimate photos are safe.

These security questions don't just concern users; they pose a risk to the company itself. What happens if a minor uses the website to check for STDs? Calmara would then end up in possession of child sexual abuse material. Calmara's answer to this moral and legal liability is to state in its terms of service that it prohibits use by minors, but that defense would carry little legal weight.

Calmara represents the danger of overhyped technology: it looks like a HeHealth publicity stunt meant to capitalize on excitement around AI, but in its actual implementation, it gives users a false sense of security about their sexual health. Those consequences are serious.

“Sexual health is a difficult area to innovate in, and I can see where their intentions are noble,” Chen said. “I think they may be too quick to market an incomplete solution.”
