Past Event: Using LLMs on Yale's computing clusters - hands-on workshop

Thu Oct 9, 2025, 9:00–11:00 a.m.

This event has passed.

This workshop will provide researchers interested in LLMs with the skills to launch and run inference-based workflows with open-source models on YCRC HPC systems using Ollama.

Specifically, attendees will learn: 

  • Advantages of using YCRC systems for LLMs
  • How to identify which GPU is needed for different LLMs
  • How to launch an LLM on YCRC systems via the terminal and perform basic inference
  • How to adjust an LLM's generation parameters to control reproducibility, consistency, and creativity in its responses
  • How to run an LLM from Python using Jupyter and Ollama (see the sketch after this list)
  • Initial setup for Hugging Face environments
  • Additional considerations for retrieval-augmented generation (RAG) and fine-tuning
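
As a rough illustration of the Jupyter/Ollama workflow the workshop covers, the sketch below shows how basic inference and parameter adjustment might look from Python. It is a minimal sketch, assuming the `ollama` Python package is installed, an Ollama server is reachable (for example, started with `ollama serve` inside a GPU job), and a model such as `llama3.1` has already been pulled with `ollama pull llama3.1`; the model name and prompts are illustrative, not taken from the event listing.

```python
# Minimal sketch of inference with Ollama from Python (Jupyter-friendly).
# Assumptions: the `ollama` package is installed, an Ollama server is running,
# and the model "llama3.1" (illustrative name) has already been pulled.
import ollama

# Basic inference: send a single chat message and print the reply.
response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Summarize what an HPC cluster is."}],
)
print(response["message"]["content"])

# Generation parameters can be tuned for reproducibility vs. creativity:
# temperature=0 plus a fixed seed makes output more deterministic,
# while a higher temperature yields more varied responses.
deterministic = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Name three uses of GPUs in research."}],
    options={"temperature": 0, "seed": 42},
)
print(deterministic["message"]["content"])
```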

Location:

This is an in-person-only event held at the YCRC Auditorium, located at 160 St Ronan Street. Remote access will not be available.