Recommended Starting Stack
Chapter 4 of Responsible AI for the Small Law Firm introduces a four-factor framework for evaluating any AI tool a firm is considering. This page lists tools that JDAI Consultants has evaluated against that framework, with the date of evaluation and a record of which factors the tool met as of that date.
This page is not real-time, and nothing on it is a recommendation.
Evaluations are repeated periodically, not continuously. Pricing, terms of service, security postures, data-handling practices, sub-processor lists, and feature sets can all change without notice between scheduled re-evaluations.
Listing on this page is not a recommendation that any firm adopt the tool. JDAI Consultants does not warrant that any tool listed here remains compliant with the four factors after the evaluation date shown, and does not warrant that the tool is appropriate for any specific firm, practice area, jurisdiction, matter, or client.
Before you test, purchase, or rely on any tool referenced on this page, you must independently verify that the tool still meets all four factors as applied to your firm's specific practice areas, jurisdictions, data, matters, and client commitments. That verification is your firm's responsibility. The information on this page does not replace it, and does not establish an attorney-client or consulting relationship between JDAI Consultants and any reader.
The four factors
From Chapter 4. Every entry on this page is evaluated against all four.
Data Protection
Does the vendor offer an executable Data Processing Agreement, enterprise-tier confidentiality terms, no-training-on-inputs commitments, defined retention, and a sub-processor list the firm can review? Without these, the tool fails this factor regardless of how capable it is.
Practice Area Fit
Does the tool's actual function match the kind of legal work the firm does? A platform optimized for transactional drafting is not a fit for a litigation-only practice. A tool built for large-firm volume is not a fit for a two-attorney shop. Generic capability is not the same as practice fit.
Supervision Compatibility
Does the tool produce outputs that an attorney can actually verify before relying on them? If the workflow makes meaningful supervision impossible or impractical, the tool fails this factor even if everything else looks good. Outputs the firm cannot inspect cannot be governed.
Cost Relative to Firm Size
Does the tool's total cost, including training time, integration work, DPA negotiation, and ongoing governance effort, make sense at the firm's scale and matter volume? A tool that costs more in governance effort than it produces in efficiency gains fails this factor, even if the seat price looks reasonable.
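The pass/fail logic described above is strict: a tool must clear all four factors, and missing any one fails the whole evaluation. As a hypothetical sketch (the record structure and field names below are illustrative, not taken from the book), an evaluation snapshot could be modeled like this:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: class and field names are illustrative assumptions,
# not an official schema from Responsible AI for the Small Law Firm.
@dataclass
class FourFactorEvaluation:
    tool: str
    vendor: str
    evaluated_on: date        # snapshot date; the result is only valid as of this date
    data_protection: bool     # DPA, no-training commitment, retention, sub-processor list
    practice_area_fit: bool   # function matches the firm's actual legal work
    supervision_compat: bool  # an attorney can verify outputs before relying on them
    cost_vs_firm_size: bool   # total cost makes sense at the firm's scale

    def passes(self) -> bool:
        # A tool passes only if every factor passes; one miss fails the evaluation,
        # no matter how strong the other three factors are.
        return all([
            self.data_protection,
            self.practice_area_fit,
            self.supervision_compat,
            self.cost_vs_firm_size,
        ])
```

Note that the result is tied to `evaluated_on`: re-running the evaluation after any vendor change produces a new snapshot rather than updating the old one, which matches the page's treatment of each listing as a dated record.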
Evaluated tools
Each entry below carries the date of evaluation and the four-factor result as of that date.
No tools are listed yet.
JDAI is in the process of running candidate tools through the four-factor framework. Tools will be added here only after a complete evaluation is documented. There is no schedule for additions; this page is updated as evaluations finish, not on a calendar.
If a tool you are considering is not listed here, that does not mean it has failed an evaluation. It means JDAI has not yet completed one.
When tools are added, each entry will include:
- Tool name and vendor
- Function category (legal research, practice management, drafting, intake, and so on)
- Tier evaluated (enterprise, business, consumer)
- Date of evaluation
- Four-factor result as of that date
- Notes on conditional passes, scope limits, or known caveats
Entries will not include:
- Pricing (changes too often to publish)
- A recommendation to adopt
- A warranty that the tool still passes
- Comparative ranking against other tools
- Any representation that the tool is appropriate for a specific firm or matter
The framework is in the book. The page is the running record.
If you need to evaluate a tool before this page lists it, work the four factors yourself, document the evaluation, and treat the result as a snapshot that requires re-verification before any meaningful firm-wide rollout. If you would like JDAI to run the evaluation as part of an engagement, get in touch.