Finding a Balance: How to Use Sensitive Data Ethically

Esther has worked in violence prevention and response in humanitarian settings for a decade. She’s navigated ways to provide services to women and girls in active conflict zones and even during a pandemic. Even with her vast experience and deep subject-matter expertise in gender-based violence (GBV) response, there’s one conversation she dreads. It usually starts with a question: can you share your data?

Esther knows interventions should be evidence-based and that by measuring our work, we can learn invaluable lessons and identify gaps and opportunities that will improve services. She also knows that data on gender-based violence is highly sensitive[1] and open to misinterpretation and misuse.

There is a way we can help Esther, though.

As any good practitioner would, we should first understand and validate Esther’s concern. Why does the request to share sensitive data provoke such a passionate response from GBV practitioners like her?

Consider first that GBV case management grew out of the social work field and is intended to help survivors after they have experienced a traumatic incident or act of violence. The subsequent conversion of “life experience into data entails a reduction of that experience”[2] and can be perceived as a cold devaluation of a harrowing event. When others draw conclusions solely from this data, it can be jarring, because it never feels like they have the whole “story.”

What can make this worse is that Esther gets this request frequently, and it is sometimes presented as a demand to justify that GBV work is needed at all, even though patriarchal systems are everywhere. Study after study confirms this, and the risks are typically heightened in humanitarian settings, where the usual support structures have broken down.[3] Women and girls “should not have to repeatedly prove that their experiences of oppression are real.”[4]

Crucially, Esther also knows that harm can follow when data is shared.

For example, if the data she shares shows violence was perpetrated by a government actor or a government-related party to the conflict, that can put survivors in real danger of retaliation.

There are also actors who – even with good intentions – can make situations worse. Consider a humanitarian actor who hears about a serious disclosure of violence and decides to “check up” on the survivor. Their heart might be in the right place, but their actions can be incredibly dangerous. Showing up in a white, organization-branded car in front of a survivor’s house could increase the risk of harm or stigma, especially if the survivor is experiencing domestic violence.

Beyond that, Esther knows survivors have rights: rights to determine how their data is used, rights to data deletion, rights to choose what to disclose, and rights to determine who can share this information. Yet these rights are not meaningfully extended to displaced populations. Displaced individuals rarely make demands about how their data is used, so these rights often go ignored by the humanitarian community.[5]

So, how can Esther – knowing the risks – be responsible with the data entrusted to her? She can look for ways to produce safe (aggregate, anonymised) information and then use that de-identified data to improve and inform her programming, eliminating the need to rely on personally identifiable information.
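
As a minimal sketch of what producing “safe” information could look like in practice (the field names and threshold below are assumptions for illustration, not Primero’s actual data model), the idea is to drop identifiers and share only aggregate counts, suppressing groups small enough to risk re-identifying someone:

    from collections import Counter

    # Hypothetical, simplified case records; field names are illustrative only
    # and are not taken from any real information system.
    cases = [
        {"case_id": "A-001", "age": 24, "incident_type": "physical_assault", "district": "North"},
        {"case_id": "A-002", "age": 31, "incident_type": "denial_of_resources", "district": "North"},
        {"case_id": "A-003", "age": 17, "incident_type": "physical_assault", "district": "South"},
    ]

    MINIMUM_GROUP_SIZE = 5  # suppress small groups that could re-identify a survivor

    def safe_summary(records):
        """Drop identifiers and return aggregate counts, suppressing small cells."""
        counts = Counter(r["incident_type"] for r in records)
        return {
            incident: count if count >= MINIMUM_GROUP_SIZE else f"fewer than {MINIMUM_GROUP_SIZE}"
            for incident, count in counts.items()
        }

    # Only the aggregate summary ever leaves the system; case IDs, ages, and
    # locations are never shared.
    print(safe_summary(cases))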

However, this requires more standardization in the sector. Standardized forms alone would lay the foundation for ready-made, well-understood indicators on outcomes and outputs that could be scaled across multiple organizations.

This could be further aided by documentation and guidance that dispel common points of misinterpretation and misuse, for example by building out non-exhaustive indices of possible interpretations or contextual clues for what each indicator could mean.

If this were realized and Esther were asked to share her caseload numbers, she could do so easily, with the indicators generated automatically in her information system, while simultaneously supplying potential meanings behind the numbers. Actors using the same platform (but storing data separately) would have a shared understanding of what results might mean and which other data points to triangulate for further interpretation.

One indicator, such as the number of recorded cases, could be paired with other indicators, such as the average number of follow-up meetings per case, to help practitioners better understand the breadth and depth of GBV casework, or the time and effort invested in supporting disclosure and providing GBV case management.
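
To make that pairing concrete, here is a hedged sketch (a hypothetical record structure and made-up interpretation notes, not the actual GBVIMS+ indicator set) of computing the two indicators side by side and attaching the kind of non-exhaustive interpretation guidance described above:

    # Hypothetical de-identified records: one entry per case, holding the number
    # of follow-up meetings recorded for that case. Structure is illustrative only.
    follow_ups_per_case = [3, 1, 6, 2, 4]

    number_of_recorded_cases = len(follow_ups_per_case)
    average_follow_ups = sum(follow_ups_per_case) / number_of_recorded_cases

    indicators = {
        "recorded_cases": {
            "value": number_of_recorded_cases,
            "interpretation_notes": [
                "Reflects disclosures to this service point, not prevalence of violence.",
                "A rise may indicate growing trust in services rather than more incidents.",
            ],
        },
        "avg_follow_up_meetings_per_case": {
            "value": round(average_follow_ups, 1),
            "interpretation_notes": [
                "Gives a sense of the depth of case management support provided.",
                "Read alongside caseload size and staffing before drawing conclusions.",
            ],
        },
    }

    for name, indicator in indicators.items():
        print(name, indicator["value"], indicator["interpretation_notes"])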

Esther could also look at outcome information, using validated questionnaires to determine the impact of GBV case management on a survivor’s psychosocial well-being and felt stigma. With this information, she would be able to better understand if her program is achieving the ultimate aim of GBV services: to help survivors heal and recover.
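
As a purely illustrative sketch (an entirely hypothetical questionnaire and scoring scheme, not any particular validated instrument), outcome measurement could come down to comparing scores recorded at intake and at case closure:

    # Hypothetical psychosocial well-being scores (higher = better well-being),
    # recorded at case intake and again at case closure. Not a real validated scale.
    intake_scores = [14, 18, 11, 20, 16]
    closure_scores = [19, 22, 15, 21, 20]

    average_change = sum(c - i for i, c in zip(intake_scores, closure_scores)) / len(intake_scores)
    print(f"Average change in well-being score: {average_change:+.1f}")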

We are trying to resolve Esther’s dilemma. With funding from Elrha’s Humanitarian Innovation Fund (HIF), the International Rescue Committee is contributing to the open-source, community-led work on Primero/GBVIMS+ (the GBV module of the Protection Related Information Management System). Building on top of the standardized GBV forms, the project adds a way to collect outcome and output information that is automated, protected by role-based access, and able to be shared as safe, actionable information. Attaching potential interpretations to it all helps the sector move forward by unveiling the many different ways this information can be interpreted.
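
To illustrate the role-based access idea in the abstract (the role names and sensitivity levels below are assumptions for this sketch, not how Primero/GBVIMS+ actually implements its permissions):

    # Generic sketch of role-based access to data of differing sensitivity.
    # Role and sensitivity names are assumptions for illustration only.
    ROLE_PERMISSIONS = {
        "caseworker": {"identifiable", "aggregate"},   # manages individual cases
        "program_manager": {"aggregate"},              # sees safe, de-identified indicators only
        "external_partner": {"aggregate"},             # receives shared summary information
    }

    def can_access(role: str, sensitivity: str) -> bool:
        """Return True if the given role may view data at the given sensitivity level."""
        return sensitivity in ROLE_PERMISSIONS.get(role, set())

    assert can_access("caseworker", "identifiable")
    assert not can_access("program_manager", "identifiable")
    assert can_access("external_partner", "aggregate")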

Having all of these features in a system governed by an inter-agency group that includes front-line service providers means it is scalable and usable by the entire sector and, importantly, built by the practitioners who will use it most.

To stand behind the use of data in this sector, we have to commit to due diligence regarding the experiences that survivors have shared with us in their efforts to get help. Our job is to do better with data, responsibly.

Take a look at the project profile to learn more about IRC’s work on this.

REFERENCES

  1. According to one well-used definition from the ICRC, data is considered sensitive if “unauthorized access to or disclosure of which is likely to cause harm, such as discrimination, to persons such as the source of the information or other identifiable persons or groups, or adversely affect an organization’s capacity to carry out its activities or public perceptions of its character or activities.” ICRC, Professional Standards for Protection Work Carried Out by Humanitarian and Human Rights Actors in Armed Conflict and Other Situations of Violence (2018), p. 9.
  2. D’Ignazio, Catherine, and Klein, Lauren F. 2020. Data Feminism, p. 10.
  3. Inter-Agency Standing Committee. 2015. Guidelines for Integrating Gender-Based Violence Interventions in Humanitarian Action: Reducing risk, promoting resilience and aiding recovery, p. 7.
  4. D’Ignazio, Catherine, and Klein, Lauren F. 2020. Data Feminism, p. 72.
  5. The tide on this may be turning, as more refugee populations (particularly in Lebanon) are raising this issue and forgoing access to assistance in order to protect how their data is used. Ozkul, Derya. 2020. “Refugee recognition: not always sought.” Forced Migration Review, November 2020. https://www.fmreview.org/recognising-refugees/ozkul
