The vendor is required to provide an AI-supported system to assist with categorizing cases into no oral argument (NORA) and oral argument (ORA) tracks.
- Functional requirements
1. Case categorization
• Automated classification of appellate cases into NORA vs ORA categories, based on historical case data and judicially relevant factors.
• Ability to refine and validate outputs against court-provided benchmarks.
2. Memo generation
• Automated generation of case summaries to assist in judicial panel review.
• Customizable formats for summaries (length, organization, issue-spotting, case citations).
3. Workflow support
• Role-based user interface for justices, general counsel, interns, and staff.
• Secure, isolated sandbox environment for review and testing of new models, version selection, prompts, configurations, and workflows without impacting or influencing the deployed environment.
• Tracking and audit trail of AI model source and version, AI input prompt(s), AI generated response and recommendations, human reviews, and final decisions.
• Tracking and audit trail of token usage and related costs, with a dashboard capable of exporting detailed reports.
4. Human oversight
• Features enabling human review, edits, overrides, and final decision authority.
• Logging of all user interactions for accountability.
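To illustrate the tracking and audit-trail requirements above, a minimal sketch of what a single audit record might capture is shown below. All field names, values, and the `AuditRecord` structure are hypothetical; vendors would define their own schema covering model source and version, prompts, AI output, human review, final decision, and token usage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditRecord:
    # Hypothetical audit-trail record; all field names are illustrative only.
    case_id: str
    model_source: str          # where the model came from (provider/registry)
    model_version: str
    input_prompts: list        # AI input prompt(s) sent for this case
    ai_response: str           # AI-generated recommendation (NORA vs ORA)
    human_review: str          # reviewer notes, edits, or override rationale
    final_decision: str        # decision of record after human review
    tokens_used: int           # supports token usage and cost tracking
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Example record with fabricated sample values:
record = AuditRecord(
    case_id="2025-AP-0001",
    model_source="internal-registry",
    model_version="1.4.2",
    input_prompts=["Classify this appeal as NORA or ORA ..."],
    ai_response="ORA",
    human_review="Reviewer concurred with recommendation.",
    final_decision="ORA",
    tokens_used=1842,
)
```

Persisting records like this one per decision would let the court reconstruct, for any case, which model produced which recommendation and how a human finalized it.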
- AI-specific requirements
• Transparency and explainability: ability to document and explain model outputs, including training data sources, model design, and decision factors.
• Bias and accuracy: processes for validating accuracy, performing bias audits, and supporting contestability of outputs.
• Model drift and updates: how updates or retraining are managed, disclosed, and tested. Additionally, solution providers should address:
o Redundancy and resilience: strategies to ensure continuity of service and consistent performance, including fallback mechanisms or redundant model configurations that can be activated in case of primary model failure or degradation.
o Model portability: ability to transition to alternative models or platforms in the event that the current model is deprecated or discontinued, including data portability, compatibility with other foundational models, and mitigation plans to avoid service disruption.
• Risk mitigation: safeguards against automation bias, confirmation bias, or undue reliance.
• Compliance with the Act: confirm alignment with state prohibitions (e.g., no “social scoring,” no AI outputs that infringe constitutional rights, oversight readiness).
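The accuracy-validation and bias-audit requirements above could be demonstrated with a simple comparison of model outputs against court-provided benchmarks. The sketch below is illustrative only: the case data, labels, and the `accuracy` helper are fabricated assumptions, not part of any court dataset.

```python
# Hypothetical benchmark validation: compare model labels against
# court-provided benchmark labels and report overall agreement.
# All rows are fabricated sample data.
benchmark = [
    # (case_id, benchmark_label, model_label)
    ("A1", "NORA", "NORA"),
    ("A2", "ORA",  "ORA"),
    ("A3", "ORA",  "NORA"),  # disagreement flagged for human review
    ("A4", "NORA", "NORA"),
]

def accuracy(rows):
    """Fraction of cases where the model label matches the benchmark."""
    matches = sum(1 for _, truth, pred in rows if truth == pred)
    return matches / len(rows)

print(f"Benchmark accuracy: {accuracy(benchmark):.0%}")  # 3 of 4 agree: 75%
```

A fuller bias audit would extend this by breaking agreement rates down across relevant case subgroups and flagging disagreements, like case A3 above, for contestability review.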
- Virtual Information Session Date: September 29, 2025
- Questions/Inquiries Deadline: October 3, 2025