A new language model called Alpaca was released by Stanford.
- Alpaca behaves similarly to OpenAI’s text-davinci-003 but is much smaller and far cheaper to build (under $600 in training cost)
- The cost reduction is significant and unexpected, happening much faster than predictions
- Alpaca was created by fine-tuning LLaMA, a smaller openly released model from Meta, on instruction-following data generated with GPT-3.5
- The fine-tuning process used a technique called self-instruct
- Alpaca performs comparably to text-davinci-003 overall, though not necessarily better on every task
- The model’s low cost raises questions about the future of expensive language models
- Tech giants such as Apple, Amazon, Baidu, and Google are also developing language models
- The rapid cost reduction could upend the economics of large language models
- Questions arise about incentives for companies to invest in cutting-edge models if they can be cheaply reproduced
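The self-instruct recipe behind Alpaca can be pictured as a loop: start from a small pool of seed tasks, prompt a stronger "teacher" model with a few of them, collect the new instruction/output pairs it produces, filter duplicates, and repeat until you have a fine-tuning dataset. A minimal sketch of that loop follows; `query_teacher` is a hypothetical stand-in for the real GPT-3.5 API call, stubbed out here so the sketch runs offline, and the filtering is deliberately simplified.

```python
import random

# Hypothetical seed pool; the real Alpaca pipeline started from 175
# human-written seed tasks and grew the pool to ~52K examples.
SEED_TASKS = [
    {"instruction": "Translate 'hello' to French.", "output": "bonjour"},
    {"instruction": "List three primary colors.", "output": "red, blue, yellow"},
]

def query_teacher(prompt):
    # Stub: a real pipeline would send `prompt` to the teacher model
    # (e.g. GPT-3.5) and parse its completion into a new task.
    return {"instruction": f"Paraphrase: {prompt[:30]}", "output": "..."}

def self_instruct(seed_tasks, n_rounds=3, seed=0):
    rng = random.Random(seed)
    pool = list(seed_tasks)
    for _ in range(n_rounds):
        # 1. Sample a few in-context examples from the current pool.
        examples = rng.sample(pool, k=min(2, len(pool)))
        prompt = "\n".join(t["instruction"] for t in examples)
        # 2. Ask the teacher model for a fresh instruction/output pair.
        candidate = query_teacher(prompt)
        # 3. Filter near-duplicates before adding to the pool.
        if candidate["instruction"] not in {t["instruction"] for t in pool}:
            pool.append(candidate)
    return pool

data = self_instruct(SEED_TASKS)
print(len(data))
```

The resulting pool would then be used as the supervised fine-tuning set for the smaller student model, which is where most of the cost savings come from: generating data with an API call is far cheaper than human annotation.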
Why it’s important: Large language models seem to be getting cheaper at a much faster rate than previously forecast. Massive implications for the future.

