Using and Interpreting AONES Data

During the development of AONES, preliminary evaluation activities were conducted in the form of tabletop exercises with groups of potential users. The goal of these sessions was to gather feedback on the tool's acceptability, usability, and perceived validity.

Two tabletop exercises were conducted, with different groups of participants:

  • Public health professionals from the AONES project team representing partner health agencies,
  • Individuals with lived and living experience.

Both sessions were facilitated by an evaluator independent of the AONES project team and followed a consistent format. Each session began with an overview of the tool, including a brief demonstration of the AONES dashboard, followed by a facilitated group discussion structured around the following questions:

  • Who do you think will use the tool? How might they use it?
  • This tool vs. reality: how does the data provided compare to what’s happening in real life?
  • Benefits of the tool: what does the tool add?
  • Biases or limitations of the tool: what does this tool miss? What are the limitations? What biases impact what's included in the tool?

The transcripts of both tabletop sessions were then analyzed thematically.

Key Themes Identified

Benefits of the Tool

  • Real-Time Surveillance: Public health professionals lack comprehensive real-time surveillance systems for opioid-related data. The tool's ability to provide immediate insights was seen as valuable, with the potential to enable quicker response times and more informed decision-making.
  • Complementary Data Source: AONES data can augment or support other sources of information, offering public health professionals a more holistic view of opioid-related incidents and the geographic movement of the drug supply.
  • Useful Features: The "Recent Alerts" and "Emerging Themes" features were highlighted as particularly valuable, with the potential to identify emerging issues or trends.

Tool vs. Reality

  • Accurate and Timely: Participants generally found the data reflective of their experiences and knowledge. The tool appeared effective at identifying incidents quickly, sometimes ahead of other public health alert systems. However, media coverage can be sporadic and influenced by external factors.
  • Consider Media Influence: Participants expressed concern about the potential for misleading headlines and recommended encouraging users to explore full articles. The accuracy of the data may be vulnerable to fluctuations in media cycles.

Biases and Limitations

  • AI Learning: Concerns were raised about the AI's ability to detect new or emerging drug contamination events. Ongoing data validation and adjustments to AI methods are necessary to maintain and improve accuracy.
  • Bias from Media Sources: The tool's reliance on media sources introduces bias, potentially underrepresenting opioid-related incidents that may be underreported or misrepresented in media articles. Additionally, cultural and law enforcement biases should be considered when interpreting the data.
  • Potential Misinterpretation: Sensationalized headlines may lead to misrepresentation and misinterpretation. Users should be encouraged to access full articles to reduce this risk.
  • Unintended Consequences: There is a risk that misinformation could contribute to the stigmatization of individuals, populations, or geographical areas. Managing public perception and providing accessible training resources are essential.

Usability

  • Ease of Use: The tool was considered user-friendly overall, but participants emphasized the need for clear guidance on data interpretation and understanding the tool's limitations.
  • Mobile Accessibility: Many participants expressed interest in a mobile-compatible version of the tool to improve accessibility.

Implementation Considerations

  • Hosting and Maintenance: A long-term hosting location for the tool needs to be determined; the immediate option is the KFLAPHI platform, alongside other public health tools. Resources for ongoing maintenance also need to be identified.
  • Public Access and Communication: Clear communication strategies are necessary to manage user expectations and support appropriate data interpretation when AONES is launched. Transparency about the tool's limitations is essential to prevent misinterpretation.