v2.2.0
What's changed
Added features:
- Extended the APILLM interface to allow passing kwargs through to the API
- Improved asynchronous parallelization of LLM calls, shortening inference times
- Introduced a Prompt class to encapsulate instructions and few-shot examples
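The new features above might be combined as in the following sketch. The class and method names (`Prompt`, `APILLM.generate`, the `temperature` kwarg) and all signatures are illustrative assumptions, not the library's actual API; the client here is a self-contained mock standing in for a real API call.

```python
import asyncio

class Prompt:
    """Illustrative sketch of a Prompt class bundling an instruction
    with few-shot examples. Names and fields are assumptions."""
    def __init__(self, instruction, few_shot_examples=None):
        self.instruction = instruction
        self.few_shot_examples = few_shot_examples or []

    def render(self):
        # Join instruction and Q/A-style few-shot examples into one prompt string.
        shots = "\n".join(f"Q: {q}\nA: {a}" for q, a in self.few_shot_examples)
        return f"{self.instruction}\n{shots}".strip()

class APILLM:
    """Mock client: extra kwargs are forwarded toward the underlying API call,
    mirroring the extended-interface feature described in the release notes."""
    async def generate(self, prompt, **api_kwargs):
        await asyncio.sleep(0)  # stand-in for the network round trip
        return f"response to {prompt.render()!r} with {api_kwargs}"

async def main():
    llm = APILLM()
    prompt = Prompt("Classify the sentiment.", [("Great movie!", "positive")])
    # Asynchronous parallelization: issue several calls concurrently
    # instead of awaiting them one by one.
    return await asyncio.gather(
        *[llm.generate(prompt, temperature=0.7) for _ in range(3)]
    )

results = asyncio.run(main())
```

`asyncio.gather` lets the three mock calls overlap in flight, which is the mechanism behind the shorter inference times a real async client would see.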
Further changes:
- Improved error handling
- Improved task-description infusion mechanism for meta-prompts
Full Changelog: here