How Data Science Is Helping to Detect Child Abuse


There is no good way to begin a conversation about child abuse or neglect. It is a sad and oftentimes sickening topic. But the fact of the matter is that it exists in our world today and frequently goes unnoticed or unreported, leaving many children and young adults to suffer. And for the nearly 3.6 million cases that do get reported, there are rarely enough resources to go around for thorough follow-up investigations.

This means professionals in the field must make hard triage decisions. These individuals review and assess reports and ultimately decide which cases to prioritize for higher-level investigation, which ones are probably nothing, and which ones are worrisome but don’t quite meet the definition of abuse or neglect.

Overall, it is a challenging job that wears on a person, and one that many believe relies heavily on subjective judgment and is prone to bias. Because of this, numerous data researchers have worked to develop risk assessment models that can help these professionals uncover hidden patterns and biases and make more informed decisions.

Defining the Work Space

Defining exactly what falls under the category of child abuse or neglect can be a surprisingly sticky topic. Broadly, it covers anything that causes lasting physical or mental harm to children and young adults, as well as negligence that could harm or threaten a child’s wellbeing. What exactly constitutes child abuse also depends on the state you live in.

Ultimately, a lot of the defining aspects are gray. Does spanking count as physical abuse or is the line drawn when it becomes hitting with a closed fist? Likewise, are parents negligent if they must leave their kids home alone to go to work? Does living in poverty automatically make people bad parents because there may not always be enough food in the house?

Sadly, many of these questions have become more pressing during the COVID-19 pandemic. With many families cooped up at home together, not going to work or school, kids who live in violent households are more likely to be abused, and fewer people are seeing those children regularly to observe and report signs of abuse. Limited statistical data is available at this point, but with so many people having lost jobs, especially among families already teetering on the edge of poverty, situations that could be defined as neglectful are thought to be rising sharply in prevalence.

Identifying Patterns

Many see the use of data science to assess the risk of abuse and neglect that children face as a powerful means of tackling a difficult issue. As in so many other areas of modern life, data has become a highly valued commodity, one that can help us understand deeper or hidden patterns.

That is exactly the approach taken in Allegheny County, Pennsylvania. The algorithm developed there assigns a risk score to each maltreatment allegation made in the county. The system takes into account several factors, including records of mental health and drug treatment services, criminal histories, past calls, and more. Together, these factors help screeners gauge how at-risk a child may actually be and whether the case should be prioritized for further investigation. Reports indicate that the program works well overall, but that even it can ‘learn’ to make decisions based on bias.
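To make the scoring idea concrete, here is a minimal sketch of what a weighted risk score over factors like those listed above might look like. It is purely illustrative: the feature names, weights, and normalization are assumptions made for this example, not the internals of the county's actual model.

```python
# Illustrative only: a toy weighted risk score, not the county's actual model.
# Feature names and weights are invented for the example.

FEATURE_WEIGHTS = {
    "prior_calls": 0.30,             # history of previous maltreatment calls
    "criminal_history": 0.25,        # relevant criminal records
    "mental_health_services": 0.20,  # use of public mental health services
    "drug_treatment_services": 0.25, # use of public drug treatment services
}

def risk_score(allegation: dict) -> float:
    """Combine features (assumed pre-normalized to [0, 1]) into one 0-1 score."""
    score = 0.0
    for feature, weight in FEATURE_WEIGHTS.items():
        score += weight * allegation.get(feature, 0.0)
    return min(score, 1.0)

# Example allegation record (hypothetical values)
allegation = {
    "prior_calls": 0.6,
    "criminal_history": 1.0,
    "mental_health_services": 0.0,
    "drug_treatment_services": 0.5,
}
print(f"Risk score: {risk_score(allegation):.2f}")
```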

In this situation, the goal of the program isn’t to take decision-making power away from employees but rather to serve as a tool that helps them make sounder decisions. Cases scoring above a certain risk level are automatically referred to a case handler for further investigation, but most leave the case assessor to weigh the algorithm’s score against research and other information that may not be well accounted for in the model. If there are significant differences between the case assessor’s conclusion and the model’s conclusion, a supervisor reviews the information and makes the final decision.
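That routing might be sketched roughly as follows. The thresholds and the definition of a ‘significant difference’ are assumptions chosen for illustration, not values from the real system.

```python
# Illustrative triage logic; thresholds are assumptions made for the example.

AUTO_REFER_THRESHOLD = 0.9   # scores this high skip straight to investigation
DISAGREEMENT_MARGIN = 0.4    # gap treated as a "significant difference"

def triage(model_score: float, assessor_score: float) -> str:
    """Decide how an allegation is routed after screening."""
    if model_score >= AUTO_REFER_THRESHOLD:
        return "auto-referred for investigation"
    if abs(model_score - assessor_score) >= DISAGREEMENT_MARGIN:
        return "escalated to supervisor for final decision"
    # Otherwise the assessor's judgment, informed by the score, stands.
    return "assessor decides (investigate or close)"

print(triage(model_score=0.95, assessor_score=0.20))  # auto-referred
print(triage(model_score=0.70, assessor_score=0.20))  # escalated to supervisor
print(triage(model_score=0.50, assessor_score=0.45))  # assessor decides
```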

Dealing with Bias

Of course, using a model such as this one can be a double-edged sword. Certain things are difficult to account for. For instance, if parents take financial advantage of their children by using their clean Social Security numbers to open credit cards and other credit accounts, the children can be saddled with poor credit they did not create. Because this type of financial abuse is difficult to prove, it can be hard for young adults to repair their credit later in life or to hold their parents or guardians responsible, and the algorithm may struggle to recognize it as abuse. On the other hand, the model can strip out some of the bias that call takers may inadvertently bring to assessing the risk level of a case: when the model’s conclusion differs from their own, it forces them to take a second look at the hard facts.

But there is a flip side. Every model is built on a set of design decisions made by its creators, and those decisions can introduce biases that carry through into the model’s outputs. For example, the Allegheny County model has come under significant scrutiny for being biased against people who are living in poverty and relying on government programs to get by.

Because some of the major components the model uses to assess risk come from public data, people who rely on government programs are statistically more likely to be flagged, regardless of how serious an incoming call actually appears. Upper- and middle-class families with private health insurance, private drug treatment, and food security, by contrast, may be less likely to be picked up.
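One simple way to surface this kind of disparity is to compare flag rates across groups, as in the hypothetical sketch below. The records and the grouping variable are invented for the example; a real audit would use far more data and proper statistical tests.

```python
from collections import defaultdict

# Hypothetical screening records: (uses_public_programs, was_flagged)
records = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

def flag_rates(records):
    """Flag rate per group, to check whether one group is flagged more often."""
    totals = defaultdict(int)
    flagged = defaultdict(int)
    for uses_public_programs, was_flagged in records:
        totals[uses_public_programs] += 1
        flagged[uses_public_programs] += was_flagged
    return {group: flagged[group] / totals[group] for group in totals}

rates = flag_rates(records)
print(f"Public-program families flagged: {rates[True]:.0%}")
print(f"Other families flagged:          {rates[False]:.0%}")
```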

***

Big data can play a profound role in our lives and has the potential to be a powerful tool for identifying and addressing child abuse cases. The available models can help caseworkers prioritize risk assessments for further investigation and make a difference in the lives of children who are facing unacceptable situations. Recognizing and addressing bias in models is an ongoing challenge, and those that aid in child abuse detection are no exception. Ultimately, if used correctly, this can be a powerful tool.
