

Changes Proposed to the Federal Rules of Evidence to Address AI Usage

By William Carlucci, Product Liability Law, Barnes & Thornburg

Highlights

Proposed amendments to the Federal Rules of Evidence address the authentication of evidence generated by artificial intelligence (AI)

If an AI output is offered as evidence, and that type of evidence would traditionally have required an expert witness's testimony, it would be subject to a Rule 702 analysis under the new rule

To authenticate evidence generated by AI, the proponent of the evidence would need to provide sufficient detail on the training data, the AI program used, and the reliability of the AI outputs

If the opponent of a piece of evidence can "reasonably" demonstrate that it has been altered by AI, the evidence would be admissible under the new rule only if it is "more likely than not authentic"

The U.S. Courts Advisory Committee on the Federal Rules of Evidence has proposed amendments to address the use of artificial intelligence (AI) in litigation. The proposed amendments would expand Rule 901 (Authenticating or Identifying Evidence) and would create a new rule, Rule 707 ("Machine-generated Evidence").

The proposed amendments are included in the Agenda Book for the Committee's November meeting at pages 269-271.

Changes to Rule 901

Rule 901(a) provides that “to satisfy the requirement of authenticating or identifying an item of evidence, the proponent must produce evidence sufficient to support a finding that the item is what the proponent claims it is.” Subsection (b) then provides specific examples of the types of evidence that satisfy the requirements of section (a).

The proposed amendments would add language to the list of examples describing what is needed to demonstrate the authenticity of evidence that is "generated by artificial intelligence." Under the amended rule, the proponent of such evidence would need to produce evidence that, among other things, "(i) describes the training data and software or program that was used; and (ii) shows that they produced reliable results in this instance."

Additionally, the proposed amendment adds a new section – subsection (c) – to directly address “deepfakes” and the burden for advancing or opposing evidence that is suspected of being “altered or fabricated, […]
