Structured Contexts For Large Language Models
- Lin, Kevin
- Advisor(s): Klein, Dan; Gonzalez, Joseph E.
Abstract
Large language models (LLMs) are only as capable as the context they are given. This dissertation studies how to structure context, improving LLMs by orchestrating and engineering the contexts they receive. First, we present context decomposition, a technique for breaking complex contexts into simpler contexts that specialized models are more capable of handling. Second, we show in a comparative study that context rewriting, a method for re-representing conversational utterances as simpler contexts, improves data-labeling efficiency and modularity. Finally, we present context tuning, a technique for finetuning LLMs to better handle input contexts.
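To make the "context rewriting" idea concrete, here is a minimal, hypothetical sketch: a conversational utterance is re-represented as a self-contained query so that a downstream model or annotator can handle it without the full dialogue history. The function name, the toy dialogue, and the explicit referent map are all illustrative assumptions, not the dissertation's actual method; a real system would infer the referents from the history with a model.

```python
# Hypothetical sketch of context rewriting: resolve context-dependent
# words in an utterance into explicit referents so the rewritten query
# stands alone. Names here are illustrative, not from the dissertation.

def rewrite_utterance(utterance: str, referents: dict[str, str]) -> str:
    """Replace anaphoric words (e.g. "he", "it") with explicit referents."""
    words = utterance.split()
    resolved = [referents.get(w.lower().strip("?.,"), w) for w in words]
    return " ".join(resolved)

history = ["Who directed Alien?", "Ridley Scott."]
# Toy referent map; in practice this mapping would be predicted from `history`.
referents = {"he": "Ridley Scott", "it": "Alien"}

print(rewrite_utterance("What else did he direct?", referents))
# → What else did Ridley Scott direct?
```

The rewritten query can then be labeled or answered in isolation, which is the source of the modularity benefit the abstract describes: annotators and models see one simple context instead of an entire conversation.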