This guide compiles information on generative AI developments and tools pertinent to legal research, legal education, and legal practice. It is a work in progress and will be periodically updated with additional resources and new information on this rapidly changing area.
For more information on generative AI, please visit the Information and Educational Technology (IET) site, AI Technology Use at UC Davis. This resource includes details on upcoming AI-related events and the latest developments in AI, both within the university and beyond. Additionally, the UC Davis Library offers a guide titled Generative Artificial Intelligence for Teaching, Research, and Learning. This guide provides insights into key frameworks, best practices, and current research to help users deepen their understanding of this emerging technology.
Generative artificial intelligence, or generative AI, is a type of AI technology that creates content – including text, images, video, and computer code – by identifying patterns in large quantities of training data.
Large Language Models (LLMs) are a type of generative AI technology trained on massive amounts of text to predict and generate language. A few popular examples include OpenAI's ChatGPT (e.g., GPT-4), Google's Gemini (formerly Bard), and Microsoft's Copilot (formerly Bing Chat), all discussed further in this guide under the section Other Generative AI Tools.
The rollout and continued development of generative AI tools are expected to affect a range of industries, including legal education and legal practice. For example, according to Lexis, generative AI tools under development will help streamline tasks such as composing legal briefs and client memos, conducting due diligence, and producing complex analyses from large volumes of documents. Legal generative AI tools currently available to law schools include Lexis+ AI, Practical Law, and Westlaw's CoCounsel (discussed further under Legal Databases with Generative AI).
As the UC Davis Center for Educational Effectiveness has noted, generative AI tools "do not understand content and can make mistakes.... [E]ducators may need to cultivate AI literacy and train students to use these tools strategically and thoughtfully."