Automating Workforce Scheduling with Large Language Models and Constraints
Halmstad University, School of Information Technology.
Halmstad University, School of Information Technology.
2025 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

This thesis explores the use of large language models (LLMs) to automate workforce scheduling through natural language interaction. The primary objective is to fine-tune a general-purpose LLM to generate structured scheduling data in JSON format from natural language prompts. Using a parameter-efficient fine-tuning method (LoRA), we trained Microsoft's Phi-4 models on a domain-specific dataset of Swedish scheduling requests. The model's performance was evaluated across validation, test, and generalization datasets using structured accuracy and field-level metrics such as F1 score. The fine-tuned model achieved 84% structured accuracy on the validation set and 81.74% on a generalization test set featuring diverse scheduling scenarios. In contrast to previous work that relied on few-shot prompting, our approach emphasizes reliable structure generation followed by constraint checking through external Python functions. Comparative results show that the fine-tuned Phi-4 model outperforms OpenAI's GPT models in accuracy, though at the cost of generation time. These findings demonstrate the feasibility and effectiveness of a fine-tuned, locally deployable LLM for reliable and interpretable schedule generation.
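The pipeline the abstract describes — an LLM emits a JSON schedule, which external Python functions then check against scheduling constraints — can be sketched as follows. This is a minimal illustration only: the thesis's actual JSON schema and constraint set are not reproduced in this record, so the field names (`employee`, `start`, `end`) and the two example rules (a daily-hours cap and no overlapping shifts per employee) are assumptions.

```python
import json
from datetime import datetime

# Assumed rule for the sketch; the thesis's real constraints are not shown here.
MAX_HOURS_PER_DAY = 8

def shift_hours(shift):
    """Length of a shift in hours, parsed from 'YYYY-MM-DD HH:MM' strings."""
    fmt = "%Y-%m-%d %H:%M"
    start = datetime.strptime(shift["start"], fmt)
    end = datetime.strptime(shift["end"], fmt)
    return (end - start).total_seconds() / 3600

def validate(schedule):
    """Return a list of human-readable constraint violations (empty = valid)."""
    violations = []
    by_employee = {}
    for shift in schedule["shifts"]:
        by_employee.setdefault(shift["employee"], []).append(shift)
    for name, shifts in by_employee.items():
        # Constraint 1: total hours per employee per day must not exceed the cap.
        per_day = {}
        for s in shifts:
            day = s["start"].split()[0]
            per_day[day] = per_day.get(day, 0) + shift_hours(s)
        for day, total in per_day.items():
            if total > MAX_HOURS_PER_DAY:
                violations.append(
                    f"{name}: {total:.1f} h on {day} exceeds {MAX_HOURS_PER_DAY} h"
                )
        # Constraint 2: an employee's shifts must not overlap in time.
        ordered = sorted(shifts, key=lambda s: s["start"])
        for a, b in zip(ordered, ordered[1:]):
            if b["start"] < a["end"]:  # lexicographic compare works for this format
                violations.append(f"{name}: shifts overlap on {a['start'].split()[0]}")
    return violations

# Hypothetical LLM output, parsed from its JSON text.
llm_output = json.loads("""{
  "shifts": [
    {"employee": "Anna", "start": "2025-05-05 08:00", "end": "2025-05-05 17:00"},
    {"employee": "Erik", "start": "2025-05-05 08:00", "end": "2025-05-05 12:00"}
  ]
}""")
print(validate(llm_output))  # Anna's 9-hour shift violates the daily cap
```

Separating generation from validation in this way means the LLM only has to produce well-formed structure; correctness against hard constraints is enforced deterministically afterwards, which is the design choice the abstract contrasts with pure few-shot prompting.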

Place, publisher, year, edition, pages
2025, p. 67
Keywords [en]
Workforce scheduling, Large Language Models, Phi-4, Fine-tuning, LoRA, Constraint validation, Artificial intelligence, Parameter efficient fine-tuning
National Category
Computer Sciences; Artificial Intelligence
Identifiers
URN: urn:nbn:se:hh:diva-56516
OAI: oai:DiVA.org:hh-56516
DiVA, id: diva2:1971770
External cooperation
ClearQ AB
Subject / course
Computer science and engineering
Educational program
Computer Science and Engineering, 300 credits
Presentation
2025-05-20, R4341, Kristian IV:s väg 3, Halmstad, 23:22 (English)
Supervisors
Examiners
Available from: 2025-06-18. Created: 2025-06-17. Last updated: 2025-10-01. Bibliographically approved.

Open Access in DiVA

fulltext (3509 kB), 205 downloads
File information
File name: FULLTEXT02.pdf
File size: 3509 kB
Checksum: SHA-512
6f41ddfd59ca3f91cf7912a638f80c382325caed4bbab56e0fe67ff8f9cfca99a66b813b4371787c143824d27e0395108bcf701db072066f2dd31a52b7fcf004
Type: fulltext
Mimetype: application/pdf

By organisation
School of Information Technology
Computer Sciences; Artificial Intelligence

Search outside of DiVA

Google, Google Scholar
Total: 206 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.
