WiDSTORY | February 17, 2026

Datathon Spotlight: Applying Judicial Scrutiny to Code with Valentina Torres da Silva

WiDS technical curriculum developer and datathon architect whose instinct for catching hidden bias was shaped by two judges.

Justice at the Dinner Table

In Brazil, Valentina Torres da Silva’s family maintained a daily ritual of eating together. Her parents, both judges, would recount the cases they ruled on — the complex human factors and the impact of evidence in their decision-making.

One case stayed with her, not because it was isolated, but because it is all too familiar.

A Black man was stopped at a barbershop with his son on his lap. The officers, without basis for the stop, took him to his home, claimed to find drugs there, and charged him with trafficking. Her father, the presiding judge, exonerated him, ruling that the officers lacked cause for the stop and had no consent to enter his property.

A Brazilian study recorded 90 wrongful arrests between 2012 and 2020; 83% involved Black individuals. A 2016 U.S. study of 1,136 cities found comparable patterns.

These statistics expose how technology can mirror social inequities.
Bias is not merely human error; it is a structural feature of how systems are designed, one that reinforces existing injustices.

Those dinner-table conversations shaped how she reads and thinks about data, and how systems can produce results that look neutral on the surface when the underlying reality is rarely that simple.

That instinct is why, when the WiDS Global Datathon pipeline produced near-perfect scores during early testing, Torres dug deeper.

Rethinking the Question

Datathon development is challenging because it serves a global audience. Level setting becomes difficult when participants range from master Kagglers to new entrants seeking their first industry experience.

The original challenge was framed around a simple premise:

Given a stream of perimeter updates, will a wildfire “hit” a given asset?

Early models scored well — some impressively so. But Torres noticed the data told a different story.

Most fires either lacked enough consecutive perimeter observations to create usable lead time, or they registered hits almost immediately. There was no meaningful window for prediction. Instead, distance-based features were acting as proxies for future information, creating a leakage problem.

“Even seemingly harmless features became problematic once we examined how they correlated with later outcomes,” she says.
“I traced the failure back to how the target was defined.”
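
Leakage checks of the kind she describes can be surprisingly small. A minimal sketch, assuming a hypothetical `events` table whose label records whether the asset was hit later, might look like this (all column names are illustrative):

```python
import pandas as pd

def leakage_audit(df: pd.DataFrame, feature_cols, target_col="hit_within_horizon"):
    """Rank features by absolute correlation with the *future* label.

    A high correlation alone does not prove leakage, but it flags which
    features to trace back to the moment they were actually measured.
    """
    return (
        df[feature_cols]
        .corrwith(df[target_col].astype(float))
        .abs()
        .sort_values(ascending=False)
    )

# Hypothetical usage: a distance feature computed from a late perimeter
# update quietly encodes the fire's future progression.
# print(leakage_audit(events, ["distance_to_perimeter", "wind_speed", "slope"]))
```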

Rather than accept strong leaderboard results, she proposed reframing the task entirely — shifting from binary hit prediction to a time-to-event formulation that required models to reason about progression rather than proximity.

The redesign required:

  • Redefining prediction targets
  • Setting explicit time horizons
  • Building evaluation logic that handled censoring and uncertainty

It also meant working with a smaller, validated dataset. “That moment changed how I think about service,” she says. “It showed me that service often means redesigning the question before anyone ever writes a model.”
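
The three bullets above translate fairly directly into code. A minimal sketch of the reframed target, assuming a per-fire observation table with hypothetical column names, might look like this; fires that never register a hit within the horizon are right-censored rather than discarded:

```python
import pandas as pd

def build_time_to_event_targets(obs: pd.DataFrame, horizon_hours: float = 48.0):
    """Turn perimeter observations into (duration, event) pairs per fire/asset.

    If a hit occurs within the horizon, duration is the time of first hit
    and event=1. Otherwise the pair is right-censored at the last
    observation (capped at the horizon) and event=0, so models must reason
    about progression instead of guessing a binary label.
    """
    rows = []
    for (fire_id, asset_id), g in obs.groupby(["fire_id", "asset_id"]):
        g = g.sort_values("hours_since_ignition")
        hits = g.loc[g["hit"] == 1, "hours_since_ignition"]
        if not hits.empty and hits.iloc[0] <= horizon_hours:
            rows.append((fire_id, asset_id, hits.iloc[0], 1))
        else:
            censored_at = min(g["hours_since_ignition"].iloc[-1], horizon_hours)
            rows.append((fire_id, asset_id, censored_at, 0))
    return pd.DataFrame(rows, columns=["fire_id", "asset_id", "duration", "event"])
```

Evaluation can then lean on survival-style metrics such as the concordance index, which handle censored pairs natively.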

Building Tools That Measure Real Harm

Torres’ contributions extend beyond Datathon architecture.

Through WiDS Worldwide, she built two open-source notebooks that combine real-time economic APIs with spatial-join models to measure wildfire disruption. The tools track impacts on small businesses, tourism, and evacuation zones across U.S. regions — mapping the social and financial ripple effects of disasters beyond burned acreage.
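
The article does not publish the notebooks' internals, but the core spatial-join step can be sketched in a few lines with geopandas; the file names, columns, and revenue field below are assumptions for illustration:

```python
import geopandas as gpd

# A sketch of the spatial-join step. Layer names, columns, and the revenue
# field are illustrative assumptions, not the notebooks' actual schema.
businesses = gpd.read_file("businesses.geojson").to_crs(epsg=5070)
perimeters = gpd.read_file("fire_perimeters.geojson").to_crs(epsg=5070)

# Tag each business point with the fire perimeter (if any) containing it.
affected = gpd.sjoin(businesses, perimeters, how="inner", predicate="within")

# Aggregate exposure: estimated annual revenue at risk, per fire.
exposure = affected.groupby("fire_name")["est_annual_revenue"].sum()
print(exposure.sort_values(ascending=False).head())
```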

She is also part of an ongoing NOAA-funded research project developing fine-tuned river masks and ice-detection models using VIIRS and Landsat satellite data. The work involves deriving sub-pixel water fractions and training classifiers for narrow-river ice conditions to support communities facing ice-jam flooding and climate-driven hydrologic instability.
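
As one hedged illustration of the sub-pixel idea (not the project's actual method), a per-pixel water fraction can be approximated by linearly unmixing a water index such as NDWI between assumed land and water endmembers:

```python
import numpy as np

def subpixel_water_fraction(green: np.ndarray, nir: np.ndarray,
                            ndwi_land: float = -0.1, ndwi_water: float = 0.6):
    """Estimate per-pixel water fraction by linearly unmixing NDWI.

    NDWI = (green - nir) / (green + nir). A pixel at the land endmember
    maps to fraction 0, at the water endmember to 1; mixed pixels on
    narrow rivers fall in between. Endmember values are illustrative.
    """
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
    frac = (ndwi - ndwi_land) / (ndwi_water - ndwi_land)
    return np.clip(frac, 0.0, 1.0)
```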

Torres follows a simple principle:

Data science should quantify harm so it can be mitigated — not obscured.

Across domains — wildfires, river ice, legal systems — she asks how we can build systems with integrity.

When facing ambiguity, she anchors her process by auditing the data and establishing a baseline early.

“Baselines act as a stress test for the problem and allow me to separate signal from noise.”
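
In practice, that stress test can be as small as a classifier that ignores the features entirely. A minimal sketch using scikit-learn, with X and y as placeholders for your data:

```python
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

def baseline_accuracy(X, y):
    """Cross-validated score of a predictor that always guesses the majority class.

    If a "real" model barely beats this number, the leaderboard is
    measuring class imbalance or leakage, not signal.
    """
    dummy = DummyClassifier(strategy="most_frequent")
    return cross_val_score(dummy, X, y, cv=5, scoring="accuracy").mean()
```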

She credits the team environment for enabling that rigor.

“Bryan Muñoz set a high standard for technical integrity and decision-making,” she says.

“He consistently prioritized operational constraints and the real lives affected by the work, which kept the challenge grounded.”

What’s Next: Engineering Accountability

Torres is currently targeting roles at the intersection of applied engineering and model governance: Applied Data Scientist, Machine Learning Engineer, or Research Engineer.

Her independent research focuses on the technical architecture required to detect racial bias in felony-murder prosecutions. Between 2022 and 2024, there were over 10,000 felony-murder cases in the United States, many involving defendants who neither killed nor intended to kill. In St. Louis alone, the Innocence Project notes that 100% of felony-murder convictions over the last decade involved Black defendants.

Torres envisions an NLP pipeline that goes beyond surface-level statistics. She aims to cluster cases by factual similarity rather than charge, isolating high-risk zones for disparity. By creating a quantitative jurisprudence-bias index, she seeks to demonstrate when defendants with identical fact patterns receive drastically different outcomes based on race.
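
That pipeline is still on the drawing board, but the clustering step she describes might be sketched like this, assuming a hypothetical cases table with a factual summary, a race field, and a sentencing outcome; the embedding model and the index definition are illustrative choices, not hers:

```python
import pandas as pd
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

def disparity_by_fact_cluster(cases: pd.DataFrame, n_clusters: int = 20):
    """Cluster cases by factual similarity, then compare outcomes within clusters.

    `cases` is assumed to have: 'facts' (text summary), 'race', and
    'sentence_years'. Within a cluster, fact patterns are similar, so
    large outcome gaps across racial groups are a disparity signal.
    """
    model = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = model.encode(cases["facts"].tolist())
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embeddings)
    within = (cases.assign(cluster=labels)
                   .groupby(["cluster", "race"])["sentence_years"]
                   .mean()
                   .unstack())
    # A crude per-cluster index: the spread of mean outcomes across groups.
    return (within.max(axis=1) - within.min(axis=1)).sort_values(ascending=False)
```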

It is a direct evolution of those family dinner-table conversations — using code to ensure the systems we build tell the truth.

Get in Touch with Valentina

  • Current roles: Computer Science student & Teaching Assistant at Fairleigh Dickinson University | Technical Curriculum Developer, Women in Data Science (WiDS) Worldwide 
  • Background: Division I athlete · Battle of the Brains national finalist (3rd place) · eBay Community Impact Scholar
  • Focus areas: AI Governance | NLP in legal settings | Bias detection in judicial systems
  • LinkedIn: Valentina Torres da Silva | Portfolio: valentinasporfolio.com

— Bryan Muñoz, Director of AI Learning and Datathon, WiDS Worldwide