A photo- and knowledge-sharing social networking app has become popular among medical professionals eager to discuss medical cases and broaden the scope of their collaboration. Although it might sound like a platform rife with privacy landmines, the Toronto-based company Figure 1 has taken steps to ensure the app doesn’t allow activity that could compromise protected health information (PHI). The platform may even support privacy by ensuring that medical collaboration and knowledge sharing take place out in the open and in a controlled setting.
Figure 1 claims more than 2.5 million users in 190 countries who rely on the platform to expand and share their knowledge in a variety of medical specialties, including dentistry, gastroenterology, obstetrics, and even psychiatry.
Here’s how the app works. Anyone can sign up for an account. However, to post cases (with or without photos) you must be “Verified.” Figure 1 gives this credential to licensed medical professionals once its team “has verified your status as a licensed healthcare professional or healthcare student.”
Once verified, you can post medical cases with photos from which all identifiers, such as faces, tattoos, or PHI visible on scans, have been removed. Figure 1 provides in-app tools, including an automatic face block, to remove these identifiers.
The Figure 1 team examines each case to ensure it contains no identifying data and offers educational value before approving it for posting.
Once a case is posted, verified healthcare professionals can discuss it by contributing comments; unverified users cannot. Comments often start coming in immediately after an image appears, but practitioners can also use Figure 1’s paging feature to quickly receive feedback about a case from other verified users.
A Figure 1 video demonstrating the paging feature shows a physician in Boston talking with a patient. He takes a picture of a skin condition with his phone, posts it to the Figure 1 app, types in a question, and selects the “page specialists” option. The video then shows a physician in London receiving an alert on her phone. She examines the image and sends back a suggested diagnosis. A physician in Mumbai also contributes his opinion.
Through the paging feature and the comments section, medical practitioners from all over the world can weigh in on a medical case.
The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule is the US federal law that gives patients rights over their health information and sets the rules that determine who can look at or receive PHI, whether written, electronic, or verbal. According to its FAQs, Figure 1’s main strategy for avoiding HIPAA and other privacy violations is to make them a non-issue: identifying information is stripped before a case ever appears on the platform.
But is it possible for Figure 1 to monitor postings closely enough to prevent accidental leakage of PHI?
Figure 1’s initial review process is only the first step. Once a user posts a case, anyone who sees a possible privacy breach or activity that’s inconsiderate to the patient can quickly report it within the app. When a case is reported, it’s immediately pulled and reviewed. Cases found to violate Figure 1’s rules are securely destroyed.
Even if an image can’t be linked to a particular individual, we’re still talking about an image of a patient in a vulnerable situation being displayed to thousands, even millions, of users worldwide. For this reason, Figure 1 encourages practitioners to use in-app patient consent forms that are tailored by jurisdiction and currently offered in more than 20 languages.
Figure 1 is well aware that even with patient consent forms, there might still be concerns about patient privacy and comfort. Imagine, for example, that you give your physician permission to use your photo and later discover insensitive comments about it in the discussion.
Figure 1’s Community Guidelines emphasize the importance of sensitivity and request that users “be kind” and “if you wouldn’t say something in front of a patient, please don’t say it here.” The guidelines also lay out expectations for professionalism: for example, only posting cases that come from direct clinical experience (i.e., no posting personal conditions), refraining from sharing promotional content, and supporting posts with scientific data.
At a time when there is growing unease about the vulnerability of personal data, Figure 1 has launched a conversation about how sensitive information can be handled and shared responsibly so it can contribute to the collective knowledge of the global medical community.
As a steadfast proponent of greater security and privacy, and as a business with many customers in the healthcare professions, Hushmail will watch Figure 1 with great interest as it continues to develop this still relatively new “social” method of large-scale medical collaboration.