APA Labs launches resource to guide clinicians, health systems and the public to evidence-based digital mental health tools

APA Labs’ Digital Badge Solutions Library supports clinicians, health systems, and users in search of these tools

Washington — To help health care providers and users searching for digital mental health products they can trust, APA Labs has launched the Digital Badge Solutions Library, a resource of digital mental and behavioral health technologies that have been awarded an APA Labs Digital Badge.


APA Labs, a unit of American Psychological Association Services, Inc., developed its Digital Badge program in partnership with ORCHA, a leader in digital health product evaluations, to offer independent, evidence-based evaluation of digital mental and behavioral health technologies. The badge is awarded to technologies that submit for evaluation and meet proprietary criteria addressing regulation and safety, data protection and privacy, and usability and accessibility. Evaluations against these criteria are grounded in scientific principles and regulatory alignment, based on a proprietary framework developed by APA Labs and intended to highlight best practices in a largely unregulated market.


“Navigating the vast landscape of digital health tools can be overwhelming,” said Chris Mosunic, PhD, Chief Clinical Officer at Calm, one of the first technologies featured in the Digital Badge Solutions Library. “The Digital Badge program brings incredible value to the space by providing a trusted framework for safety, ethics and evidence-based practice. As an original supporter, we’re proud to display a badge that helps people make informed, confident choices about digital mental health.”


“Digital mental health apps are proliferating to meet rising demand, yet care providers are often left to figure out on their own which tools to recommend,” said Raj Amin, CEO and Co-Founder at Arcade Therapeutics, the developer of the StarStarter for Anxiety app. “Earning the APA Labs Digital Badge helps provide that clarity, giving care providers confidence that our app has been evaluated for safety, ethics and evidence-based use and can serve as a meaningful complement to care.”


The Digital Badge Solutions Library provides a searchable collection of technologies that have earned the APA Labs Digital Badge. As the library grows, it will support clinicians, health care systems, payers, funders and the general public in exploring, recommending and integrating digital mental and behavioral health tools. 


“Technology can significantly expand access to mental health support, but only when it’s grounded in scientific research and responsible design,” said Tanya Carlson, managing director, APA Labs. “The Digital Badge Solutions Library is designed to cut through a crowded and often confusing marketplace, offering trusted, independent guidance rooted in evidence, ethics and real-world use.”

To meet demand for independent evaluation, the Digital Badge Solutions Library launched with an initial cohort of early adopters. As more companies complete evaluation, the library will expand, helping shape shared expectations and best practices in a continuously evolving space.


More information about the APA Labs Digital Badge Program and the Digital Badge Solutions Library is available on the APA Labs website.


About APA Labs 

APA Labs is a unit of APA Services, Inc. (APASI), the companion professional association of the American Psychological Association (APA). APA Labs supports the work of APASI through creative, collaborative, psychology-centered projects, programs, and solutions. 

