There's no preferences dialog, so you can't really adjust the prompt
or the model it uses. The default settings work well for me, but you
may want to tweak them depending on your model preferences and compute
budget. (Not many can afford to run Llama3-8B at a high-precision
quantization; conversely, you might have a better GPU than I do and
wish to run a 27B model or bigger.)
|
Empty for now, but I am planning to include setting up the API
endpoints and prompts for Smart Summary, among other things.
|
This took a while and had me scratching my head often, but I managed
to combine the best parts of Crane and Meson, giving me blazing-fast
Nix builds. This also adds initial scaffolding for gettext and other
cool things.
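
For reference, the overall shape is sketched below: Crane pre-builds and
caches the crate dependencies as their own Nix derivation, and Meson drives
the actual build on top of them. This is a minimal, hypothetical sketch
rather than the actual flake in this repository; the package name, version,
inputs, and Meson commands are placeholders, and a real project will likely
need additional hooks and build inputs.

```nix
# Hypothetical sketch: crane caches the Rust dependencies, Meson
# drives the real build. Names and versions are placeholders.
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    crane.url = "github:ipetkov/crane";
  };

  outputs = { self, nixpkgs, crane }:
    let
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
      craneLib = crane.mkLib pkgs;
      src = ./.;

      # Build only the Cargo.lock dependencies; Nix caches this
      # derivation and only rebuilds it when the lock file changes.
      cargoArtifacts = craneLib.buildDepsOnly {
        inherit src;
        pname = "my-app";   # placeholder
        version = "0.1.0";  # placeholder
      };
    in {
      packages.${system}.default = craneLib.mkCargoDerivation {
        inherit src cargoArtifacts;
        pname = "my-app";
        version = "0.1.0";

        # gettext here matches the i18n scaffolding mentioned above.
        nativeBuildInputs = with pkgs; [ meson ninja pkg-config gettext ];

        # Meson drives the build (invoking cargo internally, as in a
        # typical Meson + Rust project); crane injects the pre-built
        # dependency artifacts so cargo doesn't start from scratch.
        buildPhaseCargoCommand = ''
          meson setup _build --prefix=$out
          meson compile -C _build
        '';
        installPhaseCommand = ''
          meson install -C _build
        '';
      };
    };
}
```

The speed-up comes from buildDepsOnly: the dependency build lives in its own
cached derivation, so day-to-day rebuilds only recompile the application
itself through Meson. In practice the Meson-side cargo invocation also has to
point at the same target directory for that cache to actually be reused.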