The Social Network Where Doctors Swap Gross Pics of Patients

Health care has its own social network, where users share and discuss everything from compound fractures to gigantic tumors.

Time was, seeing a huge gash in somebody's foot would have sent me clicking away in a hurry. Especially if that gash contained a massive, gooey cyst. But after several weeks regularly browsing Figure 1—a photo sharing app for health care professionals—my skin is thicker. I linger for a second, check out the comments, then keep scrolling. Past compound fractures, degloved limbs, lesions, rashes, warts, and tumors of all kinds.

This app is not meant for people like me, those who like to play chicken with their gag reflex. You can tell because the captions for these images do not say "Check out this crazy s%&*!," "Gross," or even "NSFL." A typical entry reads like this:

large intraosseous cyst encountered during a chevron bunionectomy. It was packed with DBM and cortical-cancellous allograft.

Figure 1 is for doctors, nurses, EMTs, and the rest of the professionally unsqueamish to share the latest morbidities from their shifts. Sure, some of the pictures are straight up medical oddities. But just as often, users post because they are stumped and looking for a 2nd, 3rd, 4th, nth opinion. Or because they want to stump members of their community with little diagnostic quizzes. Compared to other social media, the app's user base is small; it just surpassed 500,000 users last week. But as Figure 1 grows, it has the potential to become global health's central nervous system—improving diagnostics, care, and treatment for non-users everywhere.

"I know this is something you hear founders say all the time, but I really feel like our app is doing something important," says Josh Landy, Figure 1 co-founder and practicing physician in Toronto. He's definitely right on the first count. And actually, Landy does make a pretty good case for Figure 1's role in improving health care.

Exhibit A: Emily Nayar, a physician assistant in rural Oklahoma. "I float between 10 different facilities, and in many I'm the only trained health care provider in the building," she says. She's also a Figure 1 addict, using it as a way to connect with colleagues, learn new things, and get help with her diagnoses.

A few months ago, she was working in one of those rural ERs, and in came an older gentleman whose doctor had diagnosed him with shingles. "He had been put on treatment and was taking it, but here it was four days later and he was worse," she says. In addition to his rash, he had a horrible headache and a fever of 103. Headaches and fevers are more like flu symptoms.

In the previous month, Nayar had seen several cases of some more serious shingles variants on Figure 1. So rather than giving the man flu meds and prescribing bed rest, she figured that he might have a more serious case of shingles. A lit search led her to shingles meningitis—a version of the disease that gets into a patient's blood and brain.

"I was able to get a lumbar puncture on this gentleman, and he was admitted to the ICU for a couple days," she says. Without the knowledge she got from Figure 1, Nayar says it's likely she would have sent the man home with more medication, and he probably would have died.

Exhibit B: The health care community has been struggling for years to design digital tools for learning and diagnostics that people will actually use. Look through the app store, and you'll see dozens of reference guides, diagnostic aids, and so on.

Mostly, these either use a smart search algorithm, like Google's, or require users to fill out troubleshooting decision trees: Where is the rash? What color is the rash? How big is the rash? "But for something like that, you kind of have to know what you are looking for," says Nayar. That's about as useful as thumbing through a stack of medical encyclopedias: The answer is probably in there, but you might not find it in time.

True, health care professionals should always be reading and learning. But a lot of that learning is social. Figure 1 is like having 500,000 colleagues at elbow-nudging distance: "Hey y'all, waddaya think of this?"

Those examples indicate that Figure 1 could be much more than just the "Instagram for Doctors," a label media types slapped onto the app in its early days. Please. Leave the tortured tech metaphors to the professionals.

Instagram is what most people use when they need a distraction from work. Figure 1 lets health professionals—not just doctors—immerse themselves in the stuff. Figure 1 is educational, engaging, and privacy-obsessed. I'd probably describe Figure 1 as a Pinterest-inspired version of Epernicus mixed with Doximity and a dash of Diaspora. But if you're at a party, I guess "Instagram for Doctors" works fine.

In all seriousness, Figure 1 is pretty impressive in its ability to get people to engage with medical information in new, educational ways. Most of the app's posts fall into one of three categories. One is people asking for help with a diagnosis. The next is people playing stump the chump, with little quizzes. And last are people showing off something crazy. Landy says the breakdown is pretty even, about 33 percent in each category.

How do you keep those posts, especially the ones in the last category, from deteriorating into eBaum's World for mobile? Anybody can make a profile and browse on Figure 1—even normals like me. But only health care professionals can comment, and that usually keeps the discourse focused and professional. (Plus, Landy says only about 10 percent of Figure 1's users are not medical professionals.)

The app is also heavily moderated. Nine of Figure 1's 25 employees (including Landy) make sure every image has some educational content. "Early on, we had what we would call scene of the accident images: Bodies poking out of vehicles or other things you might find upon arriving at some disaster," he says. So the app made a rule that an image would be blocked if it didn't pose some kind of medical question. "You'd have to try pretty hard to come up with a question for some accident picture that wasn't just, 'Wow how do you get that person out of there!?!'"

And finally, if you're worried about your own gnarly ER visit showing up on Figure 1, the app is very careful about patient privacy. Every time anyone uploads an image, the first thing they do is fill out a consent form. Figure 1 has an algorithm that automatically obscures faces, and tools that let the user erase any pixels containing names, dates, or any other identifying details. "This makes the images less attractive, but I'm OK with that because it's for protecting privacy," says Landy.

Figure 1 also strips away all the metadata before the picture gets uploaded. "The best way to keep a secret is not to know it," he says. And if anything remains after all that, the mods are there to catch it.

No data collection, and so far, no ads. So how does Figure 1 make money? "Right now we don't have a business plan," says Landy. He and his co-founders have secured several million dollars in funding, and are focused on building their community and making the app work as well as it can. In other words, first they'll become the central nervous system—sustenance can wait.

When pressed, Landy listed a few ways the company might become profitable. Some companies have expressed interest in using Figure 1 as a way to scout for talented, intelligent new employees. The company could also license out its medical imagery for more formal educational purposes. Or it could develop its own version of sponsored content, allowing companies to suggest devices or treatments for various conditions.

But the most natural route forward might be to turn Figure 1 into a paid version of its current self: an educational and diagnostic help platform that people will actually pay for. In April, the app introduced Paging, which lets users with a need for urgent diagnostic help send notifications to relevant specialists.

And Landy says they have plans to introduce other features in the coming months. One is cleaning up how their search tool parses requests. "For instance, if a person searches for 'lupus,' they should probably get images with comments like 'Does this look like lupus?' But should they also get ones that say 'That does not look like lupus'?"

If Figure 1 were designed for normals like me, I'd suggest they come up with a filter. I can take seeping pustules, rotting diabetic toes, and cracked tongue lesions by the terabyte. But a single anal tumor and I tap out.