LLM chat helper (llm.py)

This module provides a minimal, uniform interface for creating chat objects backed by large language models (LLMs).

The goal is to give students a single, provider-agnostic entry point: the same code works no matter which backend is selected.

Currently exposed models:

- anthropic/claude-sonnet-4-5
- openai/gpt-5.1-codex

These are defined in the exported constant LLM_MODELS. The ordering is intentional: index 0 is the default model used in examples.
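Concretely, LLM_MODELS is presumably just an ordered list of provider-qualified identifier strings, along these lines (a sketch; the real module's definition may differ):

```python
# Hypothetical definition of the exported constant.
LLM_MODELS = [
    "anthropic/claude-sonnet-4-5",  # index 0: the default used in examples
    "openai/gpt-5.1-codex",
]
```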

To create a chat instance:

```python
from data401_nlp.helpers.llm import make_chat, LLM_MODELS

chat = make_chat(LLM_MODELS[0])
```

This returns a lisette.Chat object configured with the selected model and a default temperature.

Note: lisette is imported lazily — only when make_chat is actually called. This means importing this module is safe even in environments where no LLM API key is configured. An error will only occur if a student calls make_chat without the appropriate key set.
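The lazy-import pattern described above can be sketched roughly as follows (this is an illustration of the technique, not the module's actual source, and the lisette.Chat constructor arguments are assumptions):

```python
DEFAULT_TEMPERATURE = 1

def make_chat(model, temp=DEFAULT_TEMPERATURE):
    # Importing inside the function means merely importing this module
    # never touches lisette, and therefore never requires an API key.
    import lisette
    return lisette.Chat(model, temp=temp)
```

Because the import happens in the function body, any missing-dependency or missing-key error surfaces only at the first make_chat call.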

The default temperature is defined as DEFAULT_TEMPERATURE = 1. You may override it:

```python
chat = make_chat(LLM_MODELS[0], temp=0.3)
```

This helper assumes the appropriate API key is already in the environment at the time make_chat is called (not at import time). See 00_env.ipynb for how keys are loaded across platforms.
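Because the key check happens only when make_chat runs, a quick preflight is a plain os.environ lookup. The variable names below are the usual provider conventions, not something this module defines:

```python
import os

# Conventional key names for the two providers; adjust if your setup differs.
KEYS = {"anthropic": "ANTHROPIC_API_KEY", "openai": "OPENAI_API_KEY"}

missing = [name for name in KEYS.values() if not os.environ.get(name)]
if missing:
    print("Missing keys:", ", ".join(missing))
```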


make_chat


```python
def make_chat(
    model,             # Provider-qualified model identifier, e.g. "anthropic/claude-sonnet-4-5".
    temp: float = 1,   # Sampling temperature.
):                     # A configured chat object.
```

(The temperature is a float, not an int — fractional values such as 0.3 are valid.)

The object make_chat returns is itself callable: call it with a prompt string and it returns the model's reply as plain text (str).
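That call shape can be illustrated with a stand-in object (this EchoChat stub is purely illustrative and is not part of the module):

```python
class EchoChat:
    """Stand-in mimicking the interface of the object make_chat returns:
    construct with a model and temperature, call with a prompt, get a str back."""

    def __init__(self, model, temp=1.0):
        self.model = model
        self.temp = temp

    def __call__(self, prompt):
        # A real chat object would call the model API here; the stub just echoes.
        return f"[{self.model}] {prompt}"

chat = EchoChat("anthropic/claude-sonnet-4-5", temp=0.3)
print(chat("Summarize attention in one sentence."))
# → [anthropic/claude-sonnet-4-5] Summarize attention in one sentence.
```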