File python3-shell_gpt.changes of Package python3-shell-gpt

-------------------------------------------------------------------
Wed Apr  9 11:29:18 UTC 2025 - Dmitry Markov <werwolf131313@gmail.com>

- Update to 1.4.5
- Changes:
  * Support all LiteLLM supported models by @abhishekbhakat in #604
  * Fix --show-chat and --repl to respect --no-md by @florian-lnx in #513
  * Update Python version requirements and dependencies by @TheR1D in #671
  * Fix: Preserve system message in the chat cache by @EmVee381 @TheR1D in #669 #683

-------------------------------------------------------------------
Wed Aug 14 08:53:20 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

- Update to 1.4.4
- Fixed:
  * A bug which caused startup problems on Windows and FreeBSD
  * Some minor link state and listener management bugs during shutdown

-------------------------------------------------------------------
Sat Apr  6 20:27:04 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

- Update to 1.4.3

-------------------------------------------------------------------
Thu Feb 22 09:17:02 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

- Added new options --md and --no-md to enable or disable markdown output.
- Added new config variable PRETTIFY_MARKDOWN to enable or disable markdown output by default.
- Added new config variable USE_LITELLM to enforce the LiteLLM library.

-------------------------------------------------------------------
Sun Feb 18 06:37:55 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

- Fix #422: Markdown formatting for chat history by @jeanlucthumm in #444
- New config variable API_BASE_URL #473 and fix for REQUEST_TIMEOUT by @TheR1D in #477
- Minor code optimisations.

-------------------------------------------------------------------
Sat Feb 10 07:49:14 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

What's Changed
- Ollama and other LLM backends.
- Markdown formatting now depends on role description.
- Code refactoring and optimisation.

Multiple LLM backends
- ShellGPT can now work with multiple backends using LiteLLM, including locally hosted open source models which are available for free. To use local models, you will need to run your own LLM backend server such as Ollama. To set up ShellGPT with Ollama, follow the guide in the project documentation, which also lists the supported models and providers. Note that ShellGPT is not optimized for local models and may not work as expected.

Markdown formatting
- Markdown formatting now depends on the role description. For instance, if the role includes "APPLY MARKDOWN" in its description, the output for this role will be Markdown-formatted. This applies to both default and custom roles. If you would like to disable Markdown formatting, edit the default role description in ~/.config/shell_gpt/roles.

-------------------------------------------------------------------
Sun Jan 28 06:37:49 UTC 2024 - Dmitry Markov <werwolf131313@gmail.com>

- Added --interaction / --no-interaction flags that work with the --shell option, e.g. sgpt --shell --no-interaction will output the suggested command to stdout. This is useful when you want to redirect the output elsewhere, for instance sgpt -s "say hi" | pbcopy.
- Fixed issue with stdin and --shell not switching to interactive input mode.
- Changed shell integrations to use new --no-interaction to generate shell commands.
- Moved shell integrations into dedicated file integration.py.
- Changed --install-integration logic; it will no longer download the sh script.
- Removed validation for the PROMPT argument; it now defaults to an empty string.
- Fixed an issue when sgpt is called from non-interactive shell environments such as crontab.
- Fixed and optimised Dockerfile.
- GitHub codespaces setup.
- Improved tests.
- README.md improvements.
- Shell integration logic has been updated and will not work with the previous version of the integration function in ~/.bashrc or ~/.zshrc. Run sgpt --install-integration to apply the new changes, and remove the old integration function from your shell profile if you were using it before.