/doc/apidocs/
/doc/graphs/
/doc/index.html
+/doc/setup.html
mail_directory: "/home/user/AI/mail"
models_directory: "/home/user/AI/models"
default_temperature: 0.7
-llama_cpp_dir_path: "/home/user/AI/llama.cpp"
+llama_cpp_dir_path: "/home/user/AI/llama.cpp/"
batch_thread_count: 10
thread_count: 6
+prompts_directory: "/home/user/.config/alyverkko-cli/prompts"
models:
- alias: "default"
filesystem_path: "WizardLM-2-8x22B.Q5_K_M-00001-of-00005.gguf"
context_size_tokens: 64000
end_of_text_marker: null
- - alias: "maid"
- filesystem_path: "daringmaid-20b.Q4_K_M.gguf"
- context_size_tokens: 4096
+ - alias: "mistral"
+ filesystem_path: "Mistral-Large-Instruct-2407.Q8_0.gguf"
+ context_size_tokens: 32768
end_of_text_marker: null
-prompts:
- - alias: "default"
- prompt: |
- This conversation involves a user and AI assistant where the AI
- is expected to provide not only immediate responses but also detailed and
- well-reasoned analysis. The AI should consider all aspects of the query
- and deliver insights based on logical deductions and comprehensive understanding.
- AI assistant should reply using emacs org-mode syntax.
- Quick recap: *this is bold* [[http://domain.org][This is link]]
- * Heading level 1
- ** Heading level 2
- | Col 1 Row 1 | Col 2 Row 1 |
- | Col 1 Row 2 | Col 2 Row 2 |
- #+BEGIN_SRC python
- print ('Hello, world!')
- #+END_SRC
-
- - alias: "writer"
- prompt: |
- You are best-selling book writer.
3. The AI-generated outline is appended to the original brief,
formatted using org-mode syntax.
-* Setup
-** Requirements
-*Operating System:*
-
-Älyverkko CLI is developed and tested on Debian 12 "Bookworm". It
-should work on any modern Linux distribution with minimal adjustments
-to the installation process.
-
-*Dependencies:*
-- Java Development Kit (JDK) 17 or higher
-- Apache Maven for building the project
-
-*Hardware Requirements:*
-- Modern multi-core CPU.
-- The more RAM you have, the smarter AI model you can use. For
- example, at least 64 GB of RAM is needed to run pretty decent
- [[https://huggingface.co/MaziyarPanahi/WizardLM-2-8x22B-GGUF/tree/main][WizardLM-2-8x22B AI model]].
-- Sufficient disk space to store large language models and
- input/output data.
-
-** Installation
-:PROPERTIES:
-:ID: 0b705a37-9b84-4cd5-878a-fedc9ab09b12
-:END:
-At the moment, to use Älyverkko CLI, you need to:
-- Download sources and build [[https://github.com/ggerganov/llama.cpp][llama.cpp]] project.
-- Download [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][sources]] and build Älyverkko CLI project.
-- Download one or more pre-trained large language models in GGUF
- format. Hugging Face repository [[https://huggingface.co/models?search=GGUF][has lot of them]]. My favorite is
- [[https://huggingface.co/MaziyarPanahi/WizardLM-2-8x22B-GGUF/tree/main][WizardLM-2-8x22B]] for strong problem solving skills.
-
-Follow instructions for obtaining and building Älyverkko CLI on your
-computer that runs Debian 12 operating system:
-
-1. Ensure that you have Java Development Kit (JDK) installed on your
- system.
- : sudo apt-get install openjdk-17-jdk
-
-2. Ensure that you have Apache Maven installed:
- : sudo apt-get install maven
-
-3. Clone the [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][code repository]] or download the [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][source code]] for the
- `alyverkko-cli` application to your local machine.
-
-4. Navigate to the root directory of the cloned/downloaded project in
- your terminal.
+* Installation
-5. Execute the installation script by running
- : ./install
+For information about installation and configuration, see [[id:e469ec1e-402a-476d-a849-662a48eb4f90][Älyverkko
+CLI application setup tutorial]].
- This script will compile the application and install it to
- directory
- : /opt/alyverkko-cli
-
- To facilitate usage from command-line, it will also define
- system-wide command *alyverkko-cli* as well as "Älyverkko CLI"
- launcher in desktop applications menu.
-
-6. Prepare Älyverkko CLI [[id:0fcdae48-81c5-4ae1-bdb9-64ae74e87c45][configuration]] file.
-
-7. Verify that the application has been installed correctly by running
- *alyverkko-cli* in your terminal.
-
-** Configuration
+* Task preparation
:PROPERTIES:
-:ID: 0fcdae48-81c5-4ae1-bdb9-64ae74e87c45
+:ID: 4b7900e4-77c1-45e7-9c54-772d0d3892ea
:END:
-Älyverkko CLI application configuration is done by editing YAML
-formatted configuration file.
-
-Configuration file should be placed under current user home directory:
-: ~/.config/alyverkko-cli.yaml
-
-*** Configuration file example
-
-The application is configured using a YAML-formatted configuration
-file. Below is an example of how the configuration file might look:
-
-#+begin_src yaml
- mail_directory: "/home/user/AI/mail"
- models_directory: "/home/user/AI/models"
- default_temperature: 0.7
- llama_cpp_dir_path: "/home/user/AI/llama.cpp/"
- batch_thread_count: 10
- thread_count: 6
- prompts_directory: "/home/user/.config/alyverkko-cli/prompts"
- models:
- - alias: "default"
- filesystem_path: "WizardLM-2-8x22B.Q5_K_M-00001-of-00005.gguf"
- context_size_tokens: 64000
- end_of_text_marker: null
- - alias: "maid"
- filesystem_path: "daringmaid-20b.Q4_K_M.gguf"
- context_size_tokens: 4096
- end_of_text_marker: null
-#+end_src
-
-*** Configuration file syntax
-
-Here are available parameters:
-
-- mail_directory :: Directory where AI will look for files that
- contain problems to solve.
-
-- models_directory :: Directory where AI models are stored.
- - This option is mandatory.
-
-- prompts_directory :: Directory where prompts are stored.
-
- Example prompts directory content:
- #+begin_verse
- default.txt
- writer.txt
- #+end_verse
-
- Prompt name is file name without extension. File extension should be
- *txt*.
-
- Example content for *writer.txt*:
- : You are best-selling book writer.
-
-- default_temperature :: Defines the default temperature for AI
- responses, affecting randomness in the generation process. Lower
- values make the AI more deterministic and higher values make it more
- creative or random.
- - Default value: 0.7
-
-- llama_cpp_dir_path :: Specifies the filesystem path to the cloned
- and compiled *llama.cpp* directory.
- - Example Value: /home/user/AI/llama.cpp/
- - This option is mandatory.
-
-- batch_thread_count :: Specifies the number of threads to use for
- input prompt processing. CPU computing power is usually the
- bottleneck here.
- - Default value: 10
-
-- thread_count :: Sets the number of threads to be used by the AI
- during response generation. RAM data transfer speed is usually
- bottleneck here. When RAM bandwidth is saturated, increasing thread
- count will no longer increase processing speed, but it will still
- keep CPU cores unnecessarily busy.
- - Default value: 6
-
-- models :: List of available large language models.
- - alias :: Short model alias. Model with alias "default" would be used by default.
- - filesystem_path :: File name of the model as located within
- *models_directory*
- - context_size_tokens :: Context size in tokens that model was
- trained on.
- - end_of_text_marker :: Some models produce certain markers to
- indicate end of their output. If specified here, Älyverkko CLI can
- identify and remove them so that they don't leak into
- conversation. Default value is: *null*.
-
-*** Enlisting available models
-Once Älyverkko CLI is installed and properly configured, you can run
-following command at commandline to see what models are available to
-it:
-
-: alyverkko-cli listmodels
-
-*** Self test
-The *selftest* command performs a series of checks to ensure the
-system is configured correctly:
-
-: alyverkko-cli selftest
-
-It verifies:
-- Configuration file integrity.
-- Model directory existence.
-- The presence of the *llama.cpp* executable.
-
-** Starting daemon
-
-Älyverkko CLI keeps continuously listening for and processing tasks
-from a specified mail directory.
-
-There are multiple alternative ways to start Älyverkko CLI in mail
-processing mode:
-
-**** Start via command line interface
-
-1. Open your terminal.
-
-2. Run the command:
- : alyverkko-cli mail
-
-3. The application will start monitoring the configured mail directory
- for incoming messages and process them accordingly in endless loop.
-
-4. To terminate Älyverkko CLI, just hit *CTRL+c* on the keyboard, or
- close terminal window.
-
-**** Start using your desktop environment application launcher
-
-1. Access the application launcher or application menu on your desktop
- environment.
-
-2. Search for "Älyverkko CLI".
-
-3. Click on the icon to start the application. It will open its own
- terminal.
-
-4. If you want to stop Älyverkko CLI, just close terminal window.
-
-**** Start in the background as systemd system service
-
-During Älyverkko CLI [[id:0b705a37-9b84-4cd5-878a-fedc9ab09b12][installation]], installation script will prompt you
-if you want to install *systemd* service. If you chose *Y*, Alyverkko
-CLI would be immediately started in the background as a system
-service. Also it will be automatically started on every system reboot.
-
-To view service status, use:
-: systemctl -l status alyverkko-cli
-
-If you want to stop or disable service, you can do so using systemd
-facilities:
-
-: sudo systemctl stop alyverkko-cli
-: sudo systemctl disable alyverkko-cli
-
-* Task preparation
The Älyverkko CLI application expects input files for processing in
the form of plain text files within the specified mail directory
--- /dev/null
+:PROPERTIES:
+:ID: e469ec1e-402a-476d-a849-662a48eb4f90
+:END:
+#+SETUPFILE: ~/.emacs.d/org-styles/html/darksun.theme
+#+TITLE: Älyverkko CLI application setup
+#+LANGUAGE: en
+#+LATEX_HEADER: \usepackage[margin=1.0in]{geometry}
+#+LATEX_HEADER: \usepackage{parskip}
+#+LATEX_HEADER: \usepackage[none]{hyphenat}
+
+#+OPTIONS: H:20 num:20
+#+OPTIONS: author:nil
+
+* Requirements
+*Operating System:*
+
+Älyverkko CLI is developed and tested on Debian 12 "Bookworm". It
+should work on any modern Linux distribution with minimal adjustments
+to the installation process.
+
+*Dependencies:*
+- Java Development Kit (JDK) 17 or higher
+- Apache Maven for building the project
+
+*Hardware Requirements:*
+- Modern multi-core CPU.
+- The more RAM you have, the more capable the AI model you can use. For
+  example, at least 64 GB of RAM is needed to run the fairly capable
+  [[https://huggingface.co/MaziyarPanahi/WizardLM-2-8x22B-GGUF/tree/main][WizardLM-2-8x22B AI model]].
+- Sufficient disk space to store large language models and
+ input/output data.
+
+* Installation
+:PROPERTIES:
+:ID: 0b705a37-9b84-4cd5-878a-fedc9ab09b12
+:END:
+At the moment, to use Älyverkko CLI, you need to:
+- Download sources and build [[https://github.com/ggerganov/llama.cpp][llama.cpp]] project.
+- Download [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][sources]] and build Älyverkko CLI project.
+- Download one or more pre-trained large language models in GGUF
+  format. The Hugging Face repository [[https://huggingface.co/models?search=GGUF][has plenty of them]]. My favorite is
+  [[https://huggingface.co/MaziyarPanahi/WizardLM-2-8x22B-GGUF/tree/main][WizardLM-2-8x22B]] for its strong problem-solving skills.
+
+Follow these instructions to obtain and build Älyverkko CLI on a
+computer running the Debian 12 operating system:
+
+1. Ensure that you have Java Development Kit (JDK) installed on your
+ system.
+ : sudo apt-get install openjdk-17-jdk
+
+2. Ensure that you have Apache Maven installed:
+ : sudo apt-get install maven
+
+3. Clone the [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][code repository]] or download the [[id:f5740953-079b-40f4-87d8-b6d1635a8d39][source code]] for the
+   *alyverkko-cli* application to your local machine.
+
+4. Navigate to the root directory of the cloned/downloaded project in
+ your terminal.
+
+5. Execute the installation script by running:
+   : ./install
+
+   This script will compile the application and install it into the
+   following directory:
+   : /opt/alyverkko-cli
+
+   To facilitate usage from the command line, it will also define the
+   system-wide command *alyverkko-cli* as well as an "Älyverkko CLI"
+   launcher in the desktop applications menu.
+
+6. Prepare Älyverkko CLI [[id:0fcdae48-81c5-4ae1-bdb9-64ae74e87c45][configuration]] file.
+
+7. Verify that the application has been installed correctly by running
+ *alyverkko-cli* in your terminal.
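+
+For orientation, the same steps are condensed into one shell sketch
+below. The repository location is a placeholder, not the real URL;
+use the source code link referenced above instead.
+
+#+begin_src bash
+  # Build prerequisites on Debian 12
+  sudo apt-get install openjdk-17-jdk maven
+
+  # Obtain the sources; the URL below is a placeholder
+  git clone <alyverkko-cli-repository-url> alyverkko-cli
+  cd alyverkko-cli
+
+  # Compile and install into /opt/alyverkko-cli; this also defines the
+  # system-wide alyverkko-cli command and the desktop launcher
+  ./install
+
+  # Verify that the command is now available
+  alyverkko-cli
+#+end_src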
+
+* Configuration
+:PROPERTIES:
+:ID: 0fcdae48-81c5-4ae1-bdb9-64ae74e87c45
+:END:
+Älyverkko CLI is configured by editing a YAML-formatted configuration
+file.
+
+The configuration file should be placed under the current user's home directory:
+: ~/.config/alyverkko-cli.yaml
+
+** Configuration file example
+
+The application is configured using a YAML-formatted configuration
+file. Below is an example of how the configuration file might look:
+
+#+begin_src yaml
+ mail_directory: "/home/user/AI/mail"
+ models_directory: "/home/user/AI/models"
+ default_temperature: 0.7
+ llama_cpp_dir_path: "/home/user/AI/llama.cpp/"
+ batch_thread_count: 10
+ thread_count: 6
+ prompts_directory: "/home/user/.config/alyverkko-cli/prompts"
+ models:
+ - alias: "default"
+ filesystem_path: "WizardLM-2-8x22B.Q5_K_M-00001-of-00005.gguf"
+ context_size_tokens: 64000
+ end_of_text_marker: null
+ - alias: "mistral"
+ filesystem_path: "Mistral-Large-Instruct-2407.Q8_0.gguf"
+ context_size_tokens: 32768
+ end_of_text_marker: null
+#+end_src
+
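+Assuming the example paths above, the referenced directories can be
+created up front with a short shell sketch (adjust the paths to match
+your own configuration):
+
+#+begin_src bash
+  # Directories referenced by the example configuration above
+  mkdir -p /home/user/AI/mail \
+           /home/user/AI/models \
+           /home/user/.config/alyverkko-cli/prompts
+#+end_src
+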
+** Configuration file syntax
+
+Here are available parameters:
+
+- mail_directory :: Directory where the AI will look for files that
+  contain problems to solve.
+
+- models_directory :: Directory where AI models are stored.
+ - This option is mandatory.
+
+- prompts_directory :: Directory where prompts are stored.
+
+ Example prompts directory content:
+ #+begin_verse
+ default.txt
+ writer.txt
+ #+end_verse
+
+  The prompt name is the file name without the extension. The file
+  extension should be *txt*. A short shell sketch for creating a
+  prompt file is shown after this parameter list.
+
+ Example content for *writer.txt*:
+  : You are a best-selling book writer.
+
+- default_temperature :: Defines the default temperature for AI
+ responses, affecting randomness in the generation process. Lower
+ values make the AI more deterministic and higher values make it more
+ creative or random.
+ - Default value: 0.7
+
+- llama_cpp_dir_path :: Specifies the filesystem path to the cloned
+ and compiled *llama.cpp* directory.
+ - Example Value: /home/user/AI/llama.cpp/
+ - This option is mandatory.
+
+- batch_thread_count :: Specifies the number of threads to use for
+ input prompt processing. CPU computing power is usually the
+ bottleneck here.
+ - Default value: 10
+
+- thread_count :: Sets the number of threads to be used by the AI
+  during response generation. RAM data transfer speed is usually the
+  bottleneck here. Once RAM bandwidth is saturated, increasing the
+  thread count no longer increases processing speed, but it still
+  keeps CPU cores unnecessarily busy.
+ - Default value: 6
+
+- models :: List of available large language models.
+  - alias :: Short model alias. The model with the alias "default" is
+    used by default.
+  - filesystem_path :: File name of the model as located within
+    *models_directory*.
+  - context_size_tokens :: Context size in tokens that the model was
+    trained on.
+  - end_of_text_marker :: Some models produce certain markers to
+    indicate the end of their output. If specified here, Älyverkko CLI
+    can identify and remove them so that they do not leak into the
+    conversation. Default value is: *null*.
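+
+As mentioned above, prompts are plain *txt* files inside
+*prompts_directory*. The sketch below creates the directory used in
+the example configuration and populates it with the *writer* prompt
+(adjust the path to your own *prompts_directory* value):
+
+#+begin_src bash
+  # Create the prompts directory from the example configuration
+  mkdir -p /home/user/.config/alyverkko-cli/prompts
+
+  # The prompt alias is the file name without the .txt extension
+  echo "You are a best-selling book writer." \
+    > /home/user/.config/alyverkko-cli/prompts/writer.txt
+#+end_src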
+
+** Listing available models
+Once Älyverkko CLI is installed and properly configured, you can run
+the following command at the command line to see which models are
+available to it:
+
+: alyverkko-cli listmodels
+
+** Self test
+The *selftest* command performs a series of checks to ensure the
+system is configured correctly:
+
+: alyverkko-cli selftest
+
+It verifies:
+- Configuration file integrity.
+- Model directory existence.
+- The presence of the *llama.cpp* executable.
+
+* Starting daemon
+
+Älyverkko CLI continuously listens for and processes tasks from the
+configured mail directory.
+
+There are multiple alternative ways to start Älyverkko CLI in mail
+processing mode:
+
+** Start via command line interface
+
+1. Open your terminal.
+
+2. Run the command:
+ : alyverkko-cli mail
+
+3. The application will start monitoring the configured mail directory
+   for incoming messages and process them in an endless loop.
+
+4. To terminate Älyverkko CLI, just hit *CTRL+c* on the keyboard, or
+   close the terminal window.
+
+** Start using your desktop environment application launcher
+
+1. Access the application launcher or application menu on your desktop
+ environment.
+
+2. Search for "Älyverkko CLI".
+
+3. Click on the icon to start the application. It will open its own
+ terminal.
+
+4. If you want to stop Älyverkko CLI, just close its terminal window.
+
+** Start in the background as systemd system service
+
+During Älyverkko CLI [[id:0b705a37-9b84-4cd5-878a-fedc9ab09b12][installation]], the installation script will ask
+whether you want to install a *systemd* service. If you choose *Y*,
+Älyverkko CLI is immediately started in the background as a system
+service and will also be started automatically on every system reboot.
+
+To view service status, use:
+: systemctl -l status alyverkko-cli
+
+If you want to stop or disable the service, you can do so using
+systemd facilities:
+
+: sudo systemctl stop alyverkko-cli
+: sudo systemctl disable alyverkko-cli
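+
+Conversely, to start the service again later or to follow its log
+output, the standard systemd tools apply (a sketch; the unit name is
+the same one used in the status command above):
+
+: sudo systemctl start alyverkko-cli
+: sudo journalctl -u alyverkko-cli -f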
# Export org to html using emacs in batch mode
(
cd doc/
+
rm -f index.html
emacs --batch -l ~/.emacs --visit=index.org --funcall=org-html-export-to-html --kill
+
+  rm -f setup.html
+ emacs --batch -l ~/.emacs --visit=setup.org --funcall=org-html-export-to-html --kill
)
# Generate class diagrams. See: https://www3.svjatoslav.eu/projects/javainspect/
echo ""
echo "Press ENTER to close this window."
-read
\ No newline at end of file
+read