AI Is Changing Legal Work. Here's What It Can't Replace.
The legal profession sits at an inflection point where AI investment is surging but core litigation skills remain irreplaceable. Legal tech startups raised $5.99 billion in 2025 alone — with Harvey AI reaching an $8 billion valuation — yet paralegal job growth has flatlined at 0% (BLS, 2024), digital evidence volumes are exploding, and courts are scrambling to draft new rules for AI-fabricated evidence. This data brief compiles current, citable figures across eight topic areas for solo and small firm litigators.
AI legal tech funding has reached escape velocity
Harvey AI, the sector's flagship, has raised over $1.2 billion across six rounds in under four years. Founded in 2022 by former O'Melveny litigator Winston Weinberg and ex-DeepMind researcher Gabriel Pereyra, Harvey's trajectory is staggering: a $100M Series C at $1.5B (July 2024), $300M Series D at $3B (February 2025), $300M Series E at $5B (June 2025), and a $160M Series F at $8B valuation led by Andreessen Horowitz (December 2025). As of February 2026, Harvey was reportedly in talks to raise $200M at an $11B valuation. The company surpassed $100M ARR in August 2025 and reached $190M ARR by year-end, serving 50 of the Am Law 100 firms and roughly 100,000 lawyers across 60+ countries.
Supio has raised $91 million total — confirmed across multiple sources including TechCrunch and BusinessWire. The Seattle-based platform, purpose-built for personal injury and mass tort plaintiff firms, closed a $60M Series B (April 2025) led by Sapphire Ventures with Thomson Reuters Ventures participating, following a $25M Series A (August 2024). Supio reports 4× ARR growth since its Series A and helped TorHoerman Law secure a $495 million verdict against Abbott Labs. In September 2025, Supio launched CaseAware AI™, a suite of 10 expert models covering litigation, case intake, and deposition analysis.
Clearbrief, founded by former Paul Weiss associate Jacqueline Schafer, has raised approximately $8.19 million across three rounds (most recently a ~$4M Series A in June 2024 led by Authentic Ventures). The Microsoft Word add-in automatically finds and verifies citations, flags unsupported statements, and detects AI-hallucinated fake cases. Clearbrief scored #1 with 40.5/50 in the State Bar of Nevada's AI evaluation, outperforming vLex/Fastcase, Luminance, and ChatGPT. It won Litigation Product of the Year at Legalweek 2023 and holds four issued patents. Its integrations span LexisNexis, Relativity, Clio, and iManage, and it is used by Microsoft Legal and the AAA-ICDR's 5,500 arbitrators.
Litmas AI provides an AI-native litigation management platform featuring motion drafting, evidence mapping, and a 3D “Litiverse Graph” that visualizes parties, witnesses, facts, and document relationships. The company claims 100% citation verification and is trusted by 100+ firms. Litmas was a semifinalist at ABA TECHSHOW 2026 Startup Alley. No public funding data is available, suggesting it remains early-stage or bootstrapped.
Other notable players in the space include EvenUp ($385M raised, $2B+ valuation, processing 10,000 personal injury cases per week), Luminance (~$165M raised, $75M Series C in February 2025, 700+ customers), Legora ($1.8B unicorn valuation after a $150M Series C in October 2025), Clio ($850M raised in 2025, acquired vLex for $1B), and Eudia ($105M Series A). Thomson Reuters acquired Casetext for $650M cash in 2023. Roughly 79% of all legal startup investment since 2024 has gone to AI-focused companies.
The AI legal tech market is valued at approximately $1.45 billion (2024) and projected to reach $3.9 billion by 2030 at a 17.3% CAGR (Grand View Research). MarketsandMarkets estimates a higher figure of $10.82 billion by 2030 at 28.3% CAGR. The broader legal tech market is projected at $38.1 billion in 2026, growing to $78.1 billion by 2036. AI adoption in law firms surged from 19% in 2023 to 79% in 2024 (Clio, 2024 Legal Trends Report).
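For readers comparing these projections, a CAGR simply compounds a starting value to an ending value over a number of years. A quick sanity check of the Grand View figures (our own arithmetic, not the report's) can be sketched as:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, an end value,
    and the number of years between them."""
    return (end / start) ** (1 / years) - 1

# Grand View's figures: $1.45B (2024) growing to $3.9B (2030), i.e. 6 years.
implied = cagr(1.45, 3.9, 6)
print(f"implied CAGR: {implied:.1%}")  # ~17.9%, roughly in line with the cited 17.3%
```

The small gap between the implied ~17.9% and the cited 17.3% likely reflects rounding in the published endpoint figures.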
Relativity and Everlaw are embedding AI into every stage of e-discovery
Relativity's aiR suite, built on Microsoft Azure OpenAI (GPT-4 Omni), now encompasses four products: aiR for Review, aiR for Privilege, aiR for Case Strategy, and the upcoming aiR Assist. At Relativity Fest 2025, the company announced aiR for Review and aiR for Privilege would be included in standard RelativityOne pricing at no additional cost — a signal that generative AI is becoming a baseline expectation, not a premium add-on.
The platform's adoption numbers are substantial. Over 200 customers have used aiR solutions, processing 25+ million documents across thousands of matters and generating 100+ million review decisions in 20 months. aiR for Review can analyze up to 3 million documents per day. Published case studies show dramatic results: Purpose Legal achieved an 85% reduction in review time and 95% recall on a 300,000-document review; KordaMentha cut costs by 85%; Teneo processed 1 million documents in 18 days with 70% cost savings. Users of aiR for Case Strategy, which auto-extracts key facts and generates deposition outlines, report assembling case facts up to 70% faster than with manual processes.
Everlaw's AI suite centers on Coding Suggestions (LLM-powered document classification), Deep Dive (RAG-based corpus-wide Q&A launched at Everlaw Summit in October 2025), and several features now included at no extra cost: Review Assistant, Writing Assistant, and Deposition Analyzer. Over 250 customers use Everlaw's GenAI features, and Coding Suggestions pricing was reduced by 40%+ in October 2025. In aggregated testing across four real-world datasets, Coding Suggestions achieved precision of 0.67 and recall of 0.89, with AI surpassing first-level human reviewer recall by 36% in one case. An Am Law 100 firm coded 126,000 documents in under 24 hours with 90%+ accuracy and a 50–67% reduction in review time using one-quarter of the personnel. Everlaw serves 91 of the Am Law 200 and all 50 state attorneys general.
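For readers unfamiliar with the metrics, precision and recall figures like those above are computed by scoring the AI's coding decisions against a human-validated ground truth: precision is the share of AI-flagged documents that are truly responsive, and recall is the share of truly responsive documents the AI caught. A minimal sketch (toy data, not Everlaw's):

```python
def precision_recall(decisions):
    """Compute precision and recall from (ai_flagged, truly_responsive) pairs."""
    tp = sum(1 for pred, actual in decisions if pred and actual)      # true positives
    fp = sum(1 for pred, actual in decisions if pred and not actual)  # false positives
    fn = sum(1 for pred, actual in decisions if not pred and actual)  # missed documents
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy review: AI flags 6 of 10 documents; 4 of those are truly responsive,
# and 1 responsive document is missed entirely.
sample = [(True, True)] * 4 + [(True, False)] * 2 + [(False, True)] + [(False, False)] * 3
p, r = precision_recall(sample)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.80
```

A profile like Everlaw's (precision 0.67, recall 0.89) reflects a deliberate trade-off: in document review, missing a responsive document (low recall) is usually costlier than over-flagging one for human confirmation.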
Industry surveys reinforce the momentum. The Lighthouse 2025 AI in eDiscovery Report found AI adoption grew by nearly 100% year-over-year, though only ~12% use generative AI in all or most cases. Fear of job displacement has dropped from the #2 concern to last place. Document review accounts for more than 80% of total litigation spend — approximately $42 billion per year — making it the primary target for AI automation.
Small firms are adopting AI but lag on e-discovery tools
The 2024 ABA Legal Technology Survey Report (released March 2025, based on 512 attorney responses) reveals AI adoption nearly tripled year-over-year to 30% overall, up from 11% in 2023. Adoption breaks down sharply by firm size: 18% of solo practitioners now use AI tools (up from 10% in 2023 and 0% in 2022), while 46% of firms with 100+ attorneys have adopted AI. An additional 15% are “seriously considering” AI purchases. ChatGPT remains the most popular tool at 52% overall — and 64% among firms with 2–9 attorneys (ABA TechReport 2024). Efficiency is the primary driver, with 54% citing “saving time/increasing efficiency,” while 75% express concerns about AI accuracy.
Cloud computing adoption has reached approximately 75% overall (up from 60% in 2021), though solos lag at roughly 65% compared to 94%+ for firms with 50–99 attorneys. Alarmingly, 23.8% of solo attorneys report using no security precautions for their cloud computing — a significant risk for client data (ABA TechReport 2024).
The e-discovery gap between small and large firms remains pronounced. Only 27% of solo practitioners have access to litigation support software compared to 73% of firms with 100+ attorneys. Relativity dominates the market at 40% share among those using litigation support tools. E-discovery involvement shows a similar divide: 35% of solos handle cases requiring ESI processing versus 66% of large firms. Yet 78% of small firm practitioners regularly appear in court — more than double the rate of large-firm attorneys — meaning small firms face digital evidence challenges with fewer tools (ABA TechReport 2024).
Technology spending reflects this gap. Solo attorneys typically spend less than $3,000 per year on technology, and only 41% have a dedicated tech budget, compared to 90% of firms with 100+ attorneys. The recommended allocation is 4–7% of total budget for technology.
Paralegal employment is flat, but the role is evolving
The Bureau of Labor Statistics reports 376,200 paralegals and legal assistants employed in the United States (2024 data). The projected growth rate for 2024–2034 is 0% — “little or no change” — a stark downgrade from the previous cycle's 4% projection. The BLS explicitly attributes this to AI: “employment growth for these workers may be limited by advances in technology, including artificial intelligence. These technologies are expected to make paralegals and legal assistants more efficient at tasks such as conducting research and preparing documents, which may reduce demand” (BLS, 2024). Despite flat growth, approximately 39,300 annual openings are projected through 2034, primarily from replacement needs. The median annual wage is $61,010 (May 2024), well above the $49,500 median for all occupations.
Multiple studies quantify paralegal work's exposure to AI. The Goldman Sachs report (March 2023) estimated that 44% of legal work activities could be automated by generative AI, ranking legal as the second-most-exposed profession after administrative work. Clio's 2024 Legal Trends Report found that 69% of hourly billable work performed by paralegals could be automated with current AI. The Oxford study by Frey and Osborne assigned paralegals a 94% probability of computerization in the coming decades. Deloitte projects roughly 100,000 legal roles will be automated by 2036.
Yet the profession is adapting rather than disappearing. NALA's 2024 Utilization & Compensation Report shows paralegal compensation increased 15% from 2022 to 2024 — the highest jump since 2002. Over 70% of paralegals now have flexible work arrangements, and 20% more paralegals report handling complex responsibilities compared to 2020. An estimated 73% of paralegal educational programs now include AI training in their curricula. The consensus among professional associations is captured in a widely cited formulation: “AI will not replace paralegals — but paralegals who use AI will replace those who don't.”
Body camera adoption is near-universal and evidence volumes are surging
According to the U.S. Department of Justice, over 18,000 law enforcement agencies had adopted body-worn cameras by 2023 — a figure approaching the nation's total of roughly 18,000 general-purpose agencies. Historical BJS data shows adoption climbing from 47% of agencies in 2016 to near-universal levels today. Over 80% of departments with 1,000+ officers have deployed BWCs. The NYPD alone expanded its program to cover over 36,000 officers in 2024. Federal grants for body camera programs increased 45% from 2019 to 2023.
Seven states now mandate statewide BWC use: Colorado, Connecticut, Illinois, Maryland, New Jersey, New Mexico, and South Carolina. More than half of U.S. states have enacted body camera legislation of some form. Retention requirements vary — California requires critical incident footage kept for at least 2 years; Minnesota mandates at least 90 days (1 year for use-of-force incidents); Colorado requires public release within 21 days of a complaint.
The data volumes are staggering. Individual officers generate 3 to 8 gigabytes of footage per shift. Axon, the dominant BWC vendor, reported its cloud database grew from ~6 terabytes in 2016 to over 100 petabytes by early 2024 — equivalent to 5,000+ years of HD video. The NYPD's five-year storage projection exceeds 5 petabytes at an estimated cost of $51 million. Jefferson County, Colorado's DA office saw BWC video submissions nearly double from 36,000 videos (2022) to 67,700 videos (2025). Storage costs for a midsized department can reach $2 million per year; Baltimore estimated video storage alone at $2.6 million annually.
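The per-shift figures above compound quickly. A back-of-envelope sketch shows how a single midsized department reaches hundreds of terabytes per year (officer count and shifts per year are illustrative assumptions; only the per-shift range comes from the figures cited above):

```python
# Back-of-envelope annual body-camera storage for one department.
# The 3-8 GB/shift range is from the text; the rest are illustrative assumptions.
GB_PER_SHIFT_LOW, GB_PER_SHIFT_HIGH = 3, 8
OFFICERS = 200            # hypothetical midsized department
SHIFTS_PER_YEAR = 220     # roughly 4-5 shifts per week per officer

low_tb = OFFICERS * SHIFTS_PER_YEAR * GB_PER_SHIFT_LOW / 1000
high_tb = OFFICERS * SHIFTS_PER_YEAR * GB_PER_SHIFT_HIGH / 1000
print(f"{low_tb:.0f}-{high_tb:.0f} TB of raw footage per year")  # 132-352 TB
```

At those volumes, retention mandates measured in years (not days) are what drive the multimillion-dollar storage budgets cited above.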
More than 80% of criminal cases now involve video evidence, and digital evidence appears in approximately 90% of all criminal cases. Smartphones have become the single most important evidence source — 97% of investigators now cite them as the top source of digital evidence (Cellebrite, 2026 Industry Trends Report). An estimated 500,000+ litigation cases per year involve social media evidence. IoT devices — Fitbits, Amazon Echo, smart home systems, connected vehicles — are emerging as the “third wave” of e-discovery. The digital evidence management market reached $7.55 billion in 2023 and is projected to hit $23.25 billion by 2033 at an 11.9% CAGR.
Yet evidence sharing infrastructure lags: two-thirds of agencies still share evidence via physical media (USB drives, portable hard drives), creating chain-of-custody risks. Only 42% of public safety respondents accept cloud-based evidence sharing, up from 38% the prior year. Agencies average 3.1 different storage solutions for digital evidence, creating fragmentation challenges.
FRE 902(13) and 902(14) streamline digital evidence authentication
Federal Rules of Evidence 902(13) and 902(14), effective December 1, 2017, allow electronic evidence to be self-authenticated via written certification rather than live witness testimony. Rule 902(13) covers records generated by an electronic process or system (website contents, GPS metadata, text message logs, server logs). Rule 902(14) covers data copied from an electronic device, storage medium, or file — typically forensic copies authenticated through hash value comparison.
Both rules require a certification by a “qualified person” that complies with the certification requirements of Rule 902(11) and satisfies the notice requirements therein. A declaration under 28 U.S.C. § 1746 — an unsworn declaration under penalty of perjury — satisfies the certification requirement. The Advisory Committee Notes explicitly confirm this: “A declaration that satisfies 28 U.S.C. § 1746 would satisfy the declaration requirement of Rule 902(11).” Section 1746, enacted in 1976, requires the declaration to be in writing, signed, dated, and state it is “true under penalty of perjury.” No notarization is required. Such declarations are prosecutable as perjury under 18 U.S.C. §§ 1621 and 1623(a), as confirmed in United States v. Gomez-Vigil, 929 F.2d 254 (6th Cir. 1991).
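The hash-value comparison the Advisory Committee Notes endorse for 902(14) is conceptually simple: compute a cryptographic digest of the original and of the copy, and identical digests establish the copy is bit-for-bit identical. A minimal Python sketch (file paths and helper names are illustrative, not drawn from any rule or forensic tool):

```python
import hashlib

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 in chunks so large evidence files
    never need to fit in memory; returns the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def hashes_match(original_path: str, copy_path: str) -> bool:
    """Identical digests mean the copy is bit-for-bit identical to the
    original -- the comparison a 902(14) certification would describe."""
    return sha256_file(original_path) == sha256_file(copy_path)
```

In practice the digests themselves (not just the boolean result) would be recorded in the qualified person's § 1746 declaration, so opposing counsel can independently re-verify them.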
Early case law includes United States v. Driscoll (D.D.C. 2018), one of the first federal cases applying Rule 902(13) to authenticate military internet archive records, and Kremerman v. Open Source Steel, LLC (W.D. Wash. 2018), where the plaintiff criticized the defendant for failing to use 902(14) for Facebook evidence. The Advisory Committee Notes specifically endorse hash value comparison as the standard authentication method for 902(14), while leaving room for future technology. The foundational pre-amendment case United States v. Lizarraga-Tirado, 789 F.3d 1107 (9th Cir. 2015), established that machine-generated evidence is not hearsay because it lacks a human declarant — a principle that underpins Rule 902(13). Multiple states have adopted identical or nearly identical versions, including Arizona and Pennsylvania (effective January 1, 2020).
The key scholarly works include Grimm, Capra & Joseph, “Authenticating Digital Evidence,” 69 Baylor L. Rev. 1 (2017), written by three authors directly involved in drafting the rules, and Levy & Haried, “Practical Considerations When Using New Evidence Rule 902(13) to Self-Authenticate Electronically Generated Evidence in Criminal Cases,” 67 DOJ J. Fed. L. & Prac. 81 (2019).
Courts are preparing for deepfakes but haven't pulled the trigger on new rules
The Advisory Committee on Evidence Rules has drafted but not adopted a proposed Rule 901(c) addressing AI-fabricated evidence. The draft uses a two-step burden-shifting framework: the party challenging authenticity must first present evidence sufficient to support a finding of AI fabrication, after which the proponent must prove authenticity under a heightened standard. The Committee considered the proposal in April 2024, November 2024, and May 2025, each time deciding the rule was “not necessary at this time” but keeping draft language ready for “rapid implementation.” Reporter Daniel J. Capra expressed a preference for monitoring case law “for at least a year.”
Separately, proposed Rule 707 — requiring machine-generated evidence offered without an expert witness to satisfy Daubert reliability standards — was approved by the Committee on Rules of Practice and Procedure on June 10, 2025, and published for public comment until February 16, 2026. If adopted through the normal rulemaking process, it could take effect December 1, 2026.
Deepfake concerns have already surfaced in courtrooms. In Wisconsin v. Rittenhouse (2021), the defense successfully challenged the prosecution's attempt to enlarge iPad video, arguing Apple's pinch-to-zoom used AI that could manipulate footage. In United States v. Reffitt (January 6 prosecution), defense counsel questioned whether video evidence was AI-manipulated. In Sz Huang v. Tesla, Tesla's lawyers argued video of Elon Musk's statements could be deepfakes. In Kohls v. Ellison (D. Minn. Jan. 2025), a court struck a Stanford professor's expert declaration because it contained AI-hallucinated citations — which the court called ironic in a case about deepfake dangers.
At the state level, 46 states have enacted deepfake legislation, with 174 total laws and 64 new laws in 2025 alone. Louisiana's HB 178 (effective August 1, 2025) is the first state law specifically addressing AI-generated evidence in litigation, requiring attorneys to “exercise reasonable diligence to verify the authenticity of evidence” and imposing contempt and disciplinary sanctions for offering AI-manipulated evidence without disclosure. California's SB 970 directs the Judicial Council to develop rules for assessing AI-manipulated evidence claims by January 1, 2026. California leads nationally with 18 separate deepfake laws. The federal TAKE IT DOWN Act, signed May 19, 2025, became America's first federal law directly regulating deepfake abuse.
Technological solutions are emerging. The C2PA standard (Coalition for Content Provenance and Authenticity), developed by Adobe, Microsoft, Intel, and 200+ members, creates cryptographically bound “Content Credentials” that function as a provenance chain for digital content. Version 2.2 was published May 2025 and fast-tracked as an ISO standard; the NSA endorsed C2PA in a January 2025 cybersecurity information sheet. Blockchain-based chain-of-custody systems are also being piloted — Vermont and Arizona have already enacted statutes recognizing blockchain-recorded data as business records.
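The provenance-chain idea behind both C2PA Content Credentials and blockchain custody statutes can be sketched as a hash-linked log: each record embeds the digest of the record before it, so altering any entry invalidates every later link. This minimal illustration uses our own field names and structure, not the C2PA manifest format or any statutory scheme:

```python
import hashlib
import json

def chain_entry(prev_hash: str, event: dict) -> dict:
    """Append a custody event whose hash covers both the event and the
    previous record's hash, linking the log into a tamper-evident chain."""
    payload = json.dumps({"prev": prev_hash, "event": event}, sort_keys=True)
    return {"prev": prev_hash, "event": event,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify_chain(entries: list) -> bool:
    """Recompute every link; any edited event or broken linkage fails."""
    prev = "0" * 64  # genesis value for the first record
    for e in entries:
        payload = json.dumps({"prev": prev, "event": e["event"]}, sort_keys=True)
        if e["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True
```

Changing one event after the fact forces recomputation of its hash and every hash downstream, which is why append-only provenance logs are attractive for chain-of-custody disputes.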
The ABA has responded through Formal Opinion 512 (July 2024) on generative AI ethics and its Task Force on Law and AI, whose Year 2 Report (late 2025) highlights deepfakes as a direct threat to evidence integrity. Chief Justice Roberts has identified deepfakes as a danger to judicial independence. Europol projects 90% of online content could be synthetically generated by 2026, and Relativity has documented 130% year-over-year growth in video files entering e-discovery — underscoring why authentication frameworks must evolve.
Conclusion
The data paints a picture of an accelerating collision between AI capability and legal system readiness. Nearly $6 billion in legal tech funding in 2025 signals massive market confidence, yet solo practitioners spend under $3,000 per year on technology and only 18% use any AI tools (ABA TechReport 2024). E-discovery AI can process 3 million documents daily with 85–95% accuracy, yet only 27% of solos have litigation support software even though 35% handle ESI matters, and two-thirds of agencies still share evidence on USB drives. The BLS projects zero paralegal job growth — not because the work disappears, but because AI makes fewer people more productive. Body camera footage alone has created a 100-petabyte evidence ecosystem that didn't exist a decade ago, while courts have drafted but deliberately shelved a deepfake evidence rule, waiting to see if existing frameworks hold. The legal professionals who thrive will be those who master AI tools while maintaining the judgment, credibility, and evidentiary rigor that algorithms cannot replicate. For platforms serving litigation teams, the opportunity lies not in replacing human work but in bridging the gap between exponentially growing digital evidence and the small-firm practitioners who must present it in court.
If your firm is using AI for document review but still sharing evidence over email, there's a gap in your workflow.
Attested provides SHA-256 integrity verification, viewer identity watermarking, automated access logging, and FRE 902(13) certificate generation. Built for attorneys who need evidence sharing that holds up in court.