diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..70b8d78 --- /dev/null +++ b/.gitignore @@ -0,0 +1,4 @@ +.idea/ +.vscode/ +venv*/ +**/.env \ No newline at end of file diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..261eeb9 --- /dev/null +++ b/LICENSE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). 
+ + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. 
Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative 
Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. 
Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. 
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!)  The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..157c368
--- /dev/null
+++ b/README.md
@@ -0,0 +1,5 @@
+# Gemma Fine-Tuning Demo
+
+This project was forked from https://github.com/google-gemini/gemma-cookbook/tree/main/Gemma/spoken-language-tasks
+
+It aims to showcase the fine-tuning capability of the Gemma 2B model on specific language tasks in Chavacano.
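The root `.gitignore` above excludes `**/.env`, which keeps the Kaggle credentials the tuning script reads out of version control. As a rough, stdlib-only sketch of what a dotenv loader does with such a file (the repo's `main.py` uses the `python-dotenv` package instead; the variable names below match the ones it expects):

```python
import os
import tempfile

def parse_env_file(path):
    """Parse KEY=VALUE lines from a dotenv-style file.

    A stdlib-only stand-in for python-dotenv's load_dotenv(); it skips
    blank lines and comments and strips surrounding quotes from values.
    """
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"')
    return values

# Demo with a throwaway file holding the two variables the tuner expects.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write('# Kaggle credentials\nKAGGLE_USERNAME="demo_user"\nKAGGLE_KEY="demo_key"\n')
    env_path = f.name

creds = parse_env_file(env_path)
# Only set variables that are not already defined, mirroring load_dotenv's default.
for k, v in creds.items():
    os.environ.setdefault(k, v)
print(creds["KAGGLE_USERNAME"])  # demo_user
```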
\ No newline at end of file
diff --git a/installation.sh b/installation.sh
new file mode 100755
index 0000000..775ae30
--- /dev/null
+++ b/installation.sh
@@ -0,0 +1,18 @@
+#!/bin/bash
+
+# update apt repository
+sudo apt update
+
+# install software
+sudo apt install git python3-venv
+
+# create a virtual environment for the project
+python3 -m venv venv
+
+# to activate virtual environment
+source venv/bin/activate
+# to deactivate:
+# deactivate
+
+# check version of CUDA drivers
+nvcc --version
\ No newline at end of file
diff --git a/k-gemma-it/.gitignore b/k-gemma-it/.gitignore
new file mode 100644
index 0000000..4f0cc45
--- /dev/null
+++ b/k-gemma-it/.gitignore
@@ -0,0 +1 @@
+weights/*.h5
\ No newline at end of file
diff --git a/k-gemma-it/README.md b/k-gemma-it/README.md
new file mode 100644
index 0000000..72bd1c4
--- /dev/null
+++ b/k-gemma-it/README.md
@@ -0,0 +1,3 @@
+# K-Mail replier model tuner
+
+This project lets you generate tuned versions of a Gemma model and test them.
\ No newline at end of file
diff --git a/k-gemma-it/deploy_weights.sh b/k-gemma-it/deploy_weights.sh
new file mode 100755
index 0000000..008e80b
--- /dev/null
+++ b/k-gemma-it/deploy_weights.sh
@@ -0,0 +1,10 @@
+#!/bin/bash
+
+# set date and time
+date_time=$(date +"%Y_%m_%d_%H%M%S")
+
+# backup previous weights
+mv ../k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5 ../k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5.$date_time.backup
+
+# deploy new weights
+cp weights/gemma2-2b_k-tuned_4_epoch17.lora.h5 ../k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5
\ No newline at end of file
diff --git a/k-gemma-it/main.py b/k-gemma-it/main.py
new file mode 100644
index 0000000..db3e955
--- /dev/null
+++ b/k-gemma-it/main.py
@@ -0,0 +1,152 @@
+#
+# Copyright 2024 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import os
+from dotenv import load_dotenv
+
+# Set the backend before importing Keras
+os.environ["KERAS_BACKEND"] = "jax"
+# Avoid memory fragmentation on JAX backend.
+os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"] = "1.00"
+
+import keras
+import keras_nlp
+
+# Model and tuning configuration
+model_id = "gemma2_instruct_2b_en"
+token_limit = 256
+batch_size_value = 1
+num_data_limit = 100
+lora_rank = 4
+learning_rate_value = 1e-4
+train_epochs = 20
+lora_name = "gemma2-2b_k-tuned"
+
+def set_environment():
+    """Loads environment variables needed for execution."""
+    load_dotenv()
+    # load Kaggle account info for downloading Gemma
+    kaggle_username = os.getenv('KAGGLE_USERNAME')
+    if not kaggle_username:
+        raise ValueError("KAGGLE_USERNAME environment variable not found. Did you set it in your .env file?")
+    kaggle_key = os.getenv('KAGGLE_KEY')
+    if not kaggle_key:
+        raise ValueError("KAGGLE_KEY environment variable not found. Did you set it in your .env file?")
+
+
+def generate_from_base_model(prompt_text):
+    """Generates and prints a reply from the base model for the given prompt."""
+    print("Starting generation run with base model...")
+    set_environment()
+
+    # create instance
+    gemma = keras_nlp.models.GemmaCausalLM.from_preset(model_id)
+    gemma.summary()
+
+    prompt = f"<start_of_turn>user\n{prompt_text}<end_of_turn>\n<start_of_turn>model\n"
+    output = gemma.generate(prompt, max_length=token_limit)
+    print("\nGemma output:")
+    print(output)
+
+
+def prepare_tuning_dataset():
+    tokenizer = keras_nlp.models.GemmaTokenizer.from_preset(model_id)
+
+    # prompt structure:
+    # <start_of_turn>user
+    # 다음에 대한 이메일 답장을 작성해줘.
+    # "{EMAIL CONTENT FROM THE CUSTOMER}"
+    # <end_of_turn>
+    # <start_of_turn>model
+    # {MODEL ANSWER}<end_of_turn>
+
+    # load data from repository (or local directory)
+    from datasets import load_dataset
+    ds = load_dataset(
+        # Dataset : https://huggingface.co/datasets/bebechien/korean_cake_boss
+        "bebechien/korean_cake_boss",
+        split="train",
+    )
+    print(ds)
+    data = ds.with_format("np", columns=["input", "output"], output_all_columns=False)
+    tuning_dataset = []
+
+    for x in data:
+        item = f"<start_of_turn>user\n다음에 대한 이메일 답장을 작성해줘.\n\"{x['input']}\"<end_of_turn>\n<start_of_turn>model\n{x['output']}<end_of_turn>"
+        length = len(tokenizer(item))
+        # skip data if the token length is longer than our limit
+        if length < token_limit:
+            tuning_dataset.append(item)
+        if len(tuning_dataset) >= num_data_limit:
+            break
+
+    # FOR TESTING ONLY:
+    print(len(tuning_dataset))
+    print(tuning_dataset[0])
+    print(tuning_dataset[1])
+    print(tuning_dataset[2])
+
+    return tuning_dataset
+
+
+def tune_model_with_lora():
+    set_environment()
+
+    # Prepare the dataset
+    tuning_dataset = prepare_tuning_dataset()
+
+    # initialize model
+    gemma = keras_nlp.models.GemmaCausalLM.from_preset(model_id)
+
+    # Enable LoRA for the model and set the LoRA rank to 4.
+    gemma.backbone.enable_lora(rank=lora_rank)
+    gemma.summary()
+
+    # Limit the input sequence length (to control memory usage).
+    gemma.preprocessor.sequence_length = token_limit
+
+    # Use AdamW (a common optimizer for transformer models).
+ optimizer = keras.optimizers.AdamW( + learning_rate=learning_rate_value, + weight_decay=0.01, + ) + + # Exclude layernorm and bias terms from decay. + optimizer.exclude_from_weight_decay(var_names=["bias", "scale"]) + + gemma.compile( + loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True), + optimizer=optimizer, + weighted_metrics=[keras.metrics.SparseCategoricalAccuracy()], + ) + + class CustomCallback(keras.callbacks.Callback): + def on_epoch_end(self, epoch, logs=None): + model_name = f"weights/{lora_name}_{lora_rank}_epoch{epoch+1}.lora.h5" + gemma.backbone.save_lora_weights(model_name) + + print("Starting tuning run...") + history = gemma.fit(tuning_dataset, epochs=train_epochs, batch_size=batch_size_value, callbacks=[CustomCallback()]) + + +# default method ----------------------------- +if __name__ == "__main__": + print("Starting the default method") + # test generation with base model: + #generate_from_base_model("roses are red") + + # conduct a model tuning run + tune_model_with_lora() \ No newline at end of file diff --git a/k-gemma-it/spoken_language_tasks_with_gemma.ipynb b/k-gemma-it/spoken_language_tasks_with_gemma.ipynb new file mode 100644 index 0000000..0fc1a74 --- /dev/null +++ b/k-gemma-it/spoken_language_tasks_with_gemma.ipynb @@ -0,0 +1,1450 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "cSrJYrFrY2aj" + }, + "source": [ + "##### Copyright 2024 Google LLC." 
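A side note on the data preparation in `main.py` above: its core is (1) formatting each record into Gemma's `<start_of_turn>`/`<end_of_turn>` chat template and (2) dropping examples that exceed the token budget. A minimal, self-contained sketch of that logic follows — a whitespace split stands in for the real `GemmaTokenizer`, so the counts are illustrative only:

```python
# Sketch of the filtering step in prepare_tuning_dataset(), with a stub
# tokenizer (str.split) in place of keras_nlp's GemmaTokenizer.
token_limit = 16  # tiny budget, just for the demo

def format_example(user_text, model_text):
    # Gemma's instruction-tuned chat template for one user/model exchange.
    return (f"<start_of_turn>user\n{user_text}<end_of_turn>\n"
            f"<start_of_turn>model\n{model_text}<end_of_turn>")

def fits_budget(item, limit, tokenize=str.split):
    # keep only examples shorter than the token limit
    return len(tokenize(item)) < limit

records = [
    ("Can I order a cake?", "Yes, we can take your order."),
    ("word " * 50, "too long to keep"),
]
dataset = [format_example(u, m) for u, m in records]
dataset = [d for d in dataset if fits_budget(d, token_limit)]
print(len(dataset))  # 1
```

With the real tokenizer the limit is 256 tokens, but the structure of the loop is the same.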
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {
+    "cellView": "form",
+    "id": "i1PHqD-ZY4-c"
+   },
+   "outputs": [],
+   "source": [
+    "# @title Licensed under the Apache License, Version 2.0 (the \"License\");\n",
+    "# you may not use this file except in compliance with the License.\n",
+    "# You may obtain a copy of the License at\n",
+    "#\n",
+    "# https://www.apache.org/licenses/LICENSE-2.0\n",
+    "#\n",
+    "# Unless required by applicable law or agreed to in writing, software\n",
+    "# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
+    "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
+    "# See the License for the specific language governing permissions and\n",
+    "# limitations under the License."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "id": "YNDq8NbCY7oh"
+   },
+   "source": [
+    "# How to Fine-tune Gemma for Spoken Language Tasks\n",
+    "\n",
+    "This notebook demonstrates how to fine-tune Gemma for the specific task of replying to email requests that a Korean bakery business might get.\n",
+    "\n",
+    "\n",
+    " \n",
+    "
\n", + " Run in Google Colab\n", + "
" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "3rzH5Ugf5RlJ" + }, + "source": [ + "## Setup\n", + "\n", + "### Select the Colab runtime\n", + "To complete this tutorial, you'll need to have a Colab runtime with sufficient resources to run the Gemma model:\n", + "\n", + "1. In the upper-right of the Colab window, select **▾ (Additional connection options)**.\n", + "2. Select **Change runtime type**.\n", + "3. Under **Hardware accelerator**, select **L4** or **A100 GPU**.\n", + "\n", + "\n", + "### Gemma setup on Kaggle\n", + "To complete this tutorial, you'll first need to complete the setup instructions at [Gemma setup](https://ai.google.dev/gemma/docs/setup). The Gemma setup instructions show you how to do the following:\n", + "\n", + "* Get access to Gemma on kaggle.com.\n", + "* Select a Colab runtime with sufficient resources to run the Gemma 2B model.\n", + "* Generate and configure a Kaggle username and API key.\n", + "\n", + "After you've completed the Gemma setup, move on to the next section, where you'll set environment variables for your Colab environment." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "URMuBzkMVxpU" + }, + "source": [ + "### Set environemnt variables\n", + "\n", + "Set environement variables for ```KAGGLE_USERNAME``` and ```KAGGLE_KEY```." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "IUOX2hqjV7Ku" + }, + "outputs": [], + "source": [ + "import os\n", + "from google.colab import userdata, drive\n", + "\n", + "# Note: `userdata.get` is a Colab API. 
If you're not using Colab, set the env\n",
+    "# vars as appropriate for your system.\n",
+    "os.environ[\"KAGGLE_USERNAME\"] = userdata.get(\"KAGGLE_USERNAME\")\n",
+    "os.environ[\"KAGGLE_KEY\"] = userdata.get(\"KAGGLE_KEY\")\n",
+    "\n",
+    "# Mount Google Drive to store artifacts\n",
+    "drive.mount(\"/content/drive\")"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "id": "LXfDwRTQVns2"
+   },
+   "source": [
+    "### Install dependencies\n",
+    "\n",
+    "Install Keras and KerasNLP"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "id": "zHs7wpZusEML"
+   },
+   "outputs": [],
+   "source": [
+    "!pip install -q -U keras-nlp datasets\n",
+    "!pip install -q -U keras\n",
+    "\n",
+    "# Set the backend before importing Keras\n",
+    "os.environ[\"KERAS_BACKEND\"] = \"jax\"\n",
+    "# Avoid memory fragmentation on JAX backend.\n",
+    "os.environ[\"XLA_PYTHON_CLIENT_MEM_FRACTION\"] = \"1.00\"\n",
+    "\n",
+    "import keras_nlp\n",
+    "import keras\n",
+    "\n",
+    "# Run at half precision.\n",
+    "#keras.config.set_floatx(\"bfloat16\")\n",
+    "\n",
+    "# Training Configurations\n",
+    "token_limit = 512\n",
+    "num_data_limit = 100\n",
+    "lora_name = \"cakeboss\"\n",
+    "lora_rank = 4\n",
+    "lr_value = 1e-4\n",
+    "train_epoch = 20\n",
+    "model_id = \"gemma2_instruct_2b_en\""
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "id": "kUl0t469YfQY"
+   },
+   "source": [
+    "## Load Model"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {
+    "id": "Gm4jIEqmYfQY"
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/html": [
+       "
Preprocessor: \"gemma_causal_lm_preprocessor\"\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1mPreprocessor: \"gemma_causal_lm_preprocessor\"\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
+              "┃ Layer (type)                                                                                     Config ┃\n",
+              "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
+              "│ gemma_tokenizer (GemmaTokenizer)                              │                      Vocab size: 256,000 │\n",
+              "└───────────────────────────────────────────────────────────────┴──────────────────────────────────────────┘\n",
+              "
\n" + ], + "text/plain": [ + "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n", + "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Config\u001b[0m\u001b[1m \u001b[0m┃\n", + "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n", + "│ gemma_tokenizer (\u001b[38;5;33mGemmaTokenizer\u001b[0m) │ Vocab size: \u001b[38;5;34m256,000\u001b[0m │\n", + "└───────────────────────────────────────────────────────────────┴──────────────────────────────────────────┘\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
Model: \"gemma_causal_lm\"\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1mModel: \"gemma_causal_lm\"\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
+              "┃ Layer (type)                   Output Shape                       Param #  Connected to               ┃\n",
+              "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
+              "│ padding_mask (InputLayer)     │ (None, None)              │               0 │ -                          │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ token_ids (InputLayer)        │ (None, None)              │               0 │ -                          │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ gemma_backbone                │ (None, None, 2304)        │   2,614,341,888 │ padding_mask[0][0],        │\n",
+              "│ (GemmaBackbone)               │                           │                 │ token_ids[0][0]            │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ token_embedding               │ (None, None, 256000)      │     589,824,000 │ gemma_backbone[0][0]       │\n",
+              "│ (ReversibleEmbedding)         │                           │                 │                            │\n",
+              "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n",
+              "
\n" + ], + "text/plain": [ + "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n", + "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mConnected to \u001b[0m\u001b[1m \u001b[0m┃\n", + "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n", + "│ padding_mask (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ token_ids (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ gemma_backbone │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2304\u001b[0m) │ \u001b[38;5;34m2,614,341,888\u001b[0m │ padding_mask[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m], │\n", + "│ (\u001b[38;5;33mGemmaBackbone\u001b[0m) │ │ │ token_ids[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ token_embedding │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m256000\u001b[0m) │ \u001b[38;5;34m589,824,000\u001b[0m │ gemma_backbone[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n", + "│ (\u001b[38;5;33mReversibleEmbedding\u001b[0m) │ │ │ │\n", + "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n" + ] + }, + "metadata": {}, + 
"output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Total params: 2,614,341,888 (9.74 GB)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m2,614,341,888\u001b[0m (9.74 GB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Trainable params: 2,614,341,888 (9.74 GB)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m2,614,341,888\u001b[0m (9.74 GB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Non-trainable params: 0 (0.00 B)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m0\u001b[0m (0.00 B)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "## 답장 예시\n", + "\n", + "**제목: Re: 결혼기념일 케이크 주문 문의**\n", + "\n", + "안녕하세요, [주문자 이름]님,\n", + "\n", + "안녕하세요. 결혼기념일 케이크 주문을 받으셨네요! 😊 \n", + "\n", + "3호 케이크 1개를 주문하시는군요. \n", + "\n", + "[주문자 이름]님의 결혼기념일을 축하드립니다! \n", + "\n", + "[케이크 종류, 옵션, 주문 날짜 등]에 대한 자세한 정보를 알려주시면 최대한 멋진 케이크를 만들어 드리겠습니다. \n", + "\n", + "[주문자 이름]님께서 원하는 케이크의 디자인, 옵션, 주문 날짜 등을 알려주시면 됩니다. \n", + "\n", + "[주문자 이름]님께서 원하는 케이크를 만들어 드리겠습니다. \n", + "\n", + "감사합니다.\n", + "\n", + "[이름]\n", + "[회사명/전화번호/이메일 주소] \n", + "\n", + "\n", + "**참고:**\n", + "\n", + "* 위 답장은 예시이며, 실제 답장에 필요한 내용을 추가하거나 수정해야 할 수 있습니다. \n", + "* 케이크 종류, 옵션, 주문 날짜 등을 명확하게 알려주는 것이 중요합니다. \n", + "* 답장에 친절하고 긍정적인 분위기를 유지하는 것이 좋습니다. \n", + "* 케이크 주문에 대한 추가 질문이 있으면 언제든지 문의해주세요. 
\n", + "\n", + "\n", + "\n", + "\n", + "TOTAL TIME ELAPSED: 41.09s\n" + ] + } + ], + "source": [ + "import keras\n", + "import keras_nlp\n", + "\n", + "import time\n", + "\n", + "gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset(model_id)\n", + "gemma_lm.summary()\n", + "\n", + "tick_start = 0\n", + "\n", + "def tick():\n", + " global tick_start\n", + " tick_start = time.time()\n", + "\n", + "def tock():\n", + " print(f\"TOTAL TIME ELAPSED: {time.time() - tick_start:.2f}s\")\n", + "\n", + "def text_gen(prompt):\n", + " tick()\n", + " input = f\"user\\n{prompt}\\nmodel\\n\"\n", + " output = gemma_lm.generate(input, max_length=token_limit)\n", + " print(\"\\nGemma output:\")\n", + " print(output)\n", + " tock()\n", + "\n", + "# inference before fine-tuning\n", + "text_gen(\"다음에 대한 이메일 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "9T7xe_jzslv4" + }, + "source": [ + "## Load Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "id": "ZiS-KU9osh_N" + }, + "outputs": [ + { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "1697225eea94485bad28b735a23036fe", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "README.md: 0%| | 0.00/61.0 [00:00user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요,\n", + "2주 뒤에 있을 아이 생일을 위해 3호 케이크 3개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "2주 뒤 아이 생일을 위한 3호 케이크 2개 주문 문의 감사합니다.\n", + "네, 3호 케이크 2개 주문 가능합니다.\n", + "\n", + "아이 생일 케이크인 만큼 더욱 신경 써서 정성껏 준비하겠습니다. 
혹시 원하시는 디자인이나 특별한 요청 사항이 있으시면 편하게 말씀해주세요.\n", + "\n", + "픽업 날짜와 시간을 알려주시면 더욱 자세한 안내를 도와드리겠습니다.\n", + "\n", + "다시 한번 문의 감사드리며, 아이 생일 진심으로 축하합니다!\n", + "\n", + "[가게 이름] 드림\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요,\n", + "\n", + "9월 15일에 있을 아들의 돌잔치를 위해 케이크를 주문하고 싶습니다.\n", + "- 케이크 종류: 생크림 케이크\n", + "- 크기: 2호\n", + "- 디자인: 아기자기한 동물 디자인\n", + "- 문구: \"첫 생일 축하해, 사랑하는 아들!\"\n", + "- 픽업 날짜 및 시간: 9월 14일 오후 3시\n", + "\n", + "가격 및 주문 가능 여부를 알려주시면 감사하겠습니다.\n", + "\n", + "감사합니다.\n", + "김민지 드림\"\n", + "model\n", + "안녕하세요, 김민지 님,\n", + "\n", + "9월 15일 아드님의 돌잔치를 위한 케이크 주문 문의 감사합니다.\n", + "\n", + "- 생크림 케이크 2호, 아기자기한 동물 디자인, \"첫 생일 축하해, 사랑하는 아들!\" 문구, 9월 14일 오후 3시 픽업 모두 가능합니다.\n", + "- 가격은 5만원입니다.\n", + "\n", + "주문을 원하시면 연락 주세요.\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 박지혜라고 합니다.\n", + "\n", + "10월 5일에 있을 결혼 10주년 기념일을 위해 특별한 디자인의 케이크를 주문하고 싶습니다.\n", + "\n", + "케이크 종류: 레드벨벳 케이크\n", + "크기: 3호\n", + "디자인: 첨부된 사진처럼 웨딩드레스와 턱시도 모양으로 장식된 케이크\n", + "맞춤 디자인 제작이 가능한지, 그리고 예상 가격과 제작 기간을 알려주시면 감사하겠습니다.\n", + "\n", + "감사합니다.\"\n", + "model\n", + "안녕하세요, 박지혜 님,\n", + "\n", + "결혼 10주년 기념일을 위한 특별한 케이크 주문 문의 감사합니다.\n", + "\n", + "첨부해주신 사진과 같은 웨딩드레스 & 턱시도 디자인의 레드벨벳 케이크 3호 제작 가능합니다.\n", + "추가로 원하시는 특별한 문구가 있다면 넣어드릴 수 있습니다.\n", + "맞춤 디자인 제작으로, 가격은 12만원이며 제작 기간은 3일 정도 소요됩니다.\n", + "픽업 날짜와 시간을 알려주시면 그에 맞춰 제작하겠습니다.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n" + ] + } + ], + "source": [ + "import keras\n", + "import keras_nlp\n", + "import datasets\n", + "\n", + "tokenizer = keras_nlp.models.GemmaTokenizer.from_preset(model_id)\n", + "\n", + "# prompt structure\n", + "# user\n", + "# 다음에 대한 이메일 답장을 작성해줘.\n", + "# \"{EMAIL CONTENT FROM THE CUSTOMER}\"\n", + "# \n", + "# model\n", + "# {MODEL ANSWER}\n", + "\n", + "# input, output\n", + "from datasets import load_dataset\n", + "ds = load_dataset(\n", + " \"bebechien/korean_cake_boss\",\n", + " split=\"train\",\n", + ")\n", + "print(ds)\n", + "data = ds.with_format(\"np\", 
columns=[\"input\", \"output\"], output_all_columns=False)\n", + "train = []\n", + "\n", + "for x in data:\n", + " item = f\"<start_of_turn>user\\n다음에 대한 이메일 답장을 작성해줘.\\n\\\"{x['input']}\\\"<end_of_turn>\\n<start_of_turn>model\\n{x['output']}<end_of_turn>\"\n", + " length = len(tokenizer(item))\n", + " # skip data if the token length is longer than our limit\n", + " if length < token_limit:\n", + " train.append(item)\n", + " if len(train) >= num_data_limit:\n", + " break\n", + "\n", + "print(len(train))\n", + "print(train[0])\n", + "print(train[1])\n", + "print(train[2])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Pt7Nr6a7tItO" + }, + "source": [ + "## LoRA Fine-tuning" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "id": "RCucu6oHz53G" + }, + "outputs": [ + { + "data": { + "text/html": [ + "
Preprocessor: \"gemma_causal_lm_preprocessor\"\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1mPreprocessor: \"gemma_causal_lm_preprocessor\"\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
+              "┃ Layer (type)                                                                                     Config ┃\n",
+              "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
+              "│ gemma_tokenizer (GemmaTokenizer)                              │                      Vocab size: 256,000 │\n",
+              "└───────────────────────────────────────────────────────────────┴──────────────────────────────────────────┘\n",
+              "
\n" + ], + "text/plain": [ + "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n", + "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Config\u001b[0m\u001b[1m \u001b[0m┃\n", + "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n", + "│ gemma_tokenizer (\u001b[38;5;33mGemmaTokenizer\u001b[0m) │ Vocab size: \u001b[38;5;34m256,000\u001b[0m │\n", + "└───────────────────────────────────────────────────────────────┴──────────────────────────────────────────┘\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
Model: \"gemma_causal_lm\"\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1mModel: \"gemma_causal_lm\"\u001b[0m\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n",
+              "┃ Layer (type)                   Output Shape                       Param #  Connected to               ┃\n",
+              "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n",
+              "│ padding_mask (InputLayer)     │ (None, None)              │               0 │ -                          │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ token_ids (InputLayer)        │ (None, None)              │               0 │ -                          │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ gemma_backbone                │ (None, None, 2304)        │   2,617,270,528 │ padding_mask[0][0],        │\n",
+              "│ (GemmaBackbone)               │                           │                 │ token_ids[0][0]            │\n",
+              "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n",
+              "│ token_embedding               │ (None, None, 256000)      │     589,824,000 │ gemma_backbone[0][0]       │\n",
+              "│ (ReversibleEmbedding)         │                           │                 │                            │\n",
+              "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n",
+              "
\n" + ], + "text/plain": [ + "┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\n", + "┃\u001b[1m \u001b[0m\u001b[1mLayer (type) \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mOutput Shape \u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1m Param #\u001b[0m\u001b[1m \u001b[0m┃\u001b[1m \u001b[0m\u001b[1mConnected to \u001b[0m\u001b[1m \u001b[0m┃\n", + "┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\n", + "│ padding_mask (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ token_ids (\u001b[38;5;33mInputLayer\u001b[0m) │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m) │ \u001b[38;5;34m0\u001b[0m │ - │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ gemma_backbone │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m2304\u001b[0m) │ \u001b[38;5;34m2,617,270,528\u001b[0m │ padding_mask[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m], │\n", + "│ (\u001b[38;5;33mGemmaBackbone\u001b[0m) │ │ │ token_ids[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n", + "├───────────────────────────────┼───────────────────────────┼─────────────────┼────────────────────────────┤\n", + "│ token_embedding │ (\u001b[38;5;45mNone\u001b[0m, \u001b[38;5;45mNone\u001b[0m, \u001b[38;5;34m256000\u001b[0m) │ \u001b[38;5;34m589,824,000\u001b[0m │ gemma_backbone[\u001b[38;5;34m0\u001b[0m][\u001b[38;5;34m0\u001b[0m] │\n", + "│ (\u001b[38;5;33mReversibleEmbedding\u001b[0m) │ │ │ │\n", + "└───────────────────────────────┴───────────────────────────┴─────────────────┴────────────────────────────┘\n" + ] + }, + "metadata": {}, + 
"output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Total params: 2,617,270,528 (9.75 GB)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Total params: \u001b[0m\u001b[38;5;34m2,617,270,528\u001b[0m (9.75 GB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Trainable params: 2,928,640 (11.17 MB)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Trainable params: \u001b[0m\u001b[38;5;34m2,928,640\u001b[0m (11.17 MB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "
 Non-trainable params: 2,614,341,888 (9.74 GB)\n",
+              "
\n" + ], + "text/plain": [ + "\u001b[1m Non-trainable params: \u001b[0m\u001b[38;5;34m2,614,341,888\u001b[0m (9.74 GB)\n" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Enable LoRA for the model and set the LoRA rank to 4.\n", + "gemma_lm.backbone.enable_lora(rank=lora_rank)\n", + "gemma_lm.summary()\n", + "\n", + "# Limit the input sequence length (to control memory usage).\n", + "gemma_lm.preprocessor.sequence_length = token_limit\n", + "# Use AdamW (a common optimizer for transformer models).\n", + "optimizer = keras.optimizers.AdamW(\n", + " learning_rate=lr_value,\n", + " weight_decay=0.01,\n", + ")\n", + "# Exclude layernorm and bias terms from decay.\n", + "optimizer.exclude_from_weight_decay(var_names=[\"bias\", \"scale\"])\n", + "\n", + "gemma_lm.compile(\n", + " loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),\n", + " optimizer=optimizer,\n", + " weighted_metrics=[keras.metrics.SparseCategoricalAccuracy()],\n", + ")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "hQQ47kcdpbZ9" + }, + "source": [ + "Note that enabling LoRA reduces the number of trainable parameters significantly." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "id": "26d9npFhAOSp" + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 4s/step - loss: 1.1019 - sparse_categorical_accuracy: 0.5969\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "## 답장 예시\n", + "\n", + "**제목: Re: 결혼기념일 케이크 주문 문의**\n", + "\n", + "안녕하세요, [주문자 이름]님,\n", + "\n", + "안녕하세요. 결혼기념일 케이크 주문을 받으셨네요! 😊 \n", + "\n", + "3호 케이크 1개를 주문하시면, 멋진 결혼기념일을 축하드립니다! \n", + "\n", + "[주문 내용에 대한 추가 정보를 적어주세요. 
예시: 케이크 종류, 크기, 디자인, 주문 날짜, 배송 지역 등]\n", + "\n", + "[주문 가능 여부에 대한 답변]\n", + "\n", + "* 케이크 주문이 가능합니다. \n", + "* [케이크 주문 가능 여부에 대한 추가 설명]\n", + "\n", + "[케이크 주문 시 추가 정보를 요청할 수 있는 문구]\n", + "\n", + "* [예시: 케이크 디자인에 대한 추가 설명, 배송 날짜, 배송 지역 등]\n", + "\n", + "[주문에 대한 답변을 위한 추가 문구]\n", + "\n", + "* [예시: 케이크 주문이 완료되었는지 확인해주세요.]\n", + "\n", + "감사합니다. \n", + "\n", + "[이름]\n", + "\n", + "[회사/개인 정보]\n", + "\n", + "\n", + "**참고:**\n", + "\n", + "* 위 답변은 예시이며, 실제 답변에 따라 내용을 수정해야 합니다. \n", + "* 케이크 주문에 대한 추가 정보를 적어주세요. \n", + "* 답변에 대한 질문이 있으면 언제든지 문의해주세요. \n", + "\n", + "\n", + "\n", + "\n", + "TOTAL TIME ELAPSED: 39.27s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m128s\u001b[0m 8s/step - loss: 1.1012 - sparse_categorical_accuracy: 0.5964\n", + "Epoch 2/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 643ms/step - loss: 1.0636 - sparse_categorical_accuracy: 0.6019\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. \n", + "\n", + "결혼기념일 케이크 주문을 환영합니다! 3호 케이크 1개를 주문하시는군요. 
😊\n", + "\n", + "[주문 가능 여부에 따라 답변]\n", + "\n", + "* **가능합니다.** [케이크 종류, 크기, 디자인 등에 대한 자세한 정보를 추가]\n", + "* **죄송합니다.** [케이크 주문 가능 여부에 대한 자세한 설명]\n", + "\n", + "[주문 가능 여부에 대한 추가 정보]\n", + "\n", + "* 주문 가능한 날짜와 시간은 [날짜와 시간]입니다.\n", + "* 케이크 가격은 [가격]입니다.\n", + "* 케이크에 대한 추가 정보는 [주문자에게 추가 정보를 제공]\n", + "\n", + "[케이크 주문을 위한 추가 정보]\n", + "\n", + "* 케이크 주문을 위해 [주문 방법]을 이용해주세요.\n", + "* [주문자에게 추가 정보를 제공]\n", + "\n", + "[케이크 주문에 대한 질문]\n", + "\n", + "* [주문자에게 질문]\n", + "\n", + "\n", + " \n", + "\n", + "TOTAL TIME ELAPSED: 11.96s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m45s\u001b[0m 2s/step - loss: 1.0623 - sparse_categorical_accuracy: 0.6016 \n", + "Epoch 3/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 651ms/step - loss: 1.0042 - sparse_categorical_accuracy: 0.6117\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. \n", + "\n", + "결혼기념일 케이크 주문을 환영합니다! 3호 케이크 1개를 주문하시면 됩니다. \n", + "\n", + "[주문 내용에 대한 추가 정보를 입력하세요. 예시: 케이크 종류, 크기, 옵션 등]\n", + "\n", + "[주문 가능한 날짜와 시간을 입력하세요. 예시: 2023년 10월 28일 오후 2시부터]\n", + "\n", + "[주문 가능한 옵션을 입력하세요. 예시: 배송, 픽업 등]\n", + "\n", + "[주문 완료 후에 연락처를 입력하세요. 예시: 010-1234-5678]\n", + "\n", + "감사합니다. \n", + "\n", + "\n", + "TOTAL TIME ELAPSED: 8.84s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m16s\u001b[0m 2s/step - loss: 1.0023 - sparse_categorical_accuracy: 0.6116 \n", + "Epoch 4/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 650ms/step - loss: 0.9292 - sparse_categorical_accuracy: 0.6305\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. \n", + "\n", + "결혼기념일 케이크 주문을 환영합니다! 3호 케이크 1개를 주문하시면 됩니다. 
\n", + "\n", + "[주문 가능한 케이크 종류 및 옵션]\n", + "* [케이크 종류]\n", + "* [케이크 크기]\n", + "* [케이크 디자인]\n", + "* [케이크 옵션]\n", + "\n", + "[주문 가능한 날짜 및 시간]\n", + "* [주문 가능한 날짜]\n", + "* [주문 가능한 시간]\n", + "\n", + "[주문 방법]\n", + "* [주문 방법]\n", + "\n", + "[주문 확인]\n", + "* [주문 확인]\n", + "\n", + "감사합니다. \n", + "\n", + "TOTAL TIME ELAPSED: 7.77s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m15s\u001b[0m 2s/step - loss: 0.9271 - sparse_categorical_accuracy: 0.6304 \n", + "Epoch 5/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 649ms/step - loss: 0.8567 - sparse_categorical_accuracy: 0.6446\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. 3호 케이크 1개 주문 가능합니다. \n", + "결혼기념일 축하드립니다! \n", + "주문하시면 곧 연락드리겠습니다. \n", + "\n", + "TOTAL TIME ELAPSED: 2.67s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m9s\u001b[0m 961ms/step - loss: 0.8548 - sparse_categorical_accuracy: 0.6447\n", + "Epoch 6/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.7944 - sparse_categorical_accuracy: 0.6753\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. 3호 케이크 1개 주문 가능합니다. \n", + "결혼기념일 축하드립니다. \n", + "주문하시면 곧 연락드리겠습니다. \n", + "감사합니다. 
\n", + "\n", + "TOTAL TIME ELAPSED: 2.98s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 992ms/step - loss: 0.7927 - sparse_categorical_accuracy: 0.6753\n", + "Epoch 7/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 644ms/step - loss: 0.7377 - sparse_categorical_accuracy: 0.6891\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. 3호 케이크 1개 주문 가능합니다. \n", + "결혼기념일 축하드립니다. \n", + "주문하시면 곧 연락드리겠습니다. \n", + "감사합니다. \n", + "\n", + "TOTAL TIME ELAPSED: 2.97s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 989ms/step - loss: 0.7360 - sparse_categorical_accuracy: 0.6889\n", + "Epoch 8/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.6835 - sparse_categorical_accuracy: 0.7020\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. 3호 케이크 1개 주문 가능합니다. \n", + "결혼기념일 축하드립니다. \n", + "주문하시면 맛있게 드실 수 있도록 최선을 다하겠습니다.\n", + "주문하시면 몇 가지 추가 옵션을 알려드릴 수 있습니다.\n", + "\n", + "TOTAL TIME ELAPSED: 4.10s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m11s\u001b[0m 1s/step - loss: 0.6818 - sparse_categorical_accuracy: 0.7019 \n", + "Epoch 9/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 644ms/step - loss: 0.6318 - sparse_categorical_accuracy: 0.7216\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요.\n", + "\n", + "3호 케이크 1개 주문 가능합니다. 
\n", + "결혼기념일을 축하드립니다.\n", + "주문하시면 맛있게 드실 수 있도록 최선을 다하겠습니다.\n", + "주문하시면 답장 드리겠습니다.\n", + "\n", + "감사합니다.\n", + "\n", + "TOTAL TIME ELAPSED: 3.99s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m11s\u001b[0m 1s/step - loss: 0.6301 - sparse_categorical_accuracy: 0.7216 \n", + "Epoch 10/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 644ms/step - loss: 0.5821 - sparse_categorical_accuracy: 0.7428\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요,\n", + "\n", + "3호 케이크 1개 주문 가능합니다. \n", + "결혼기념일을 축하드립니다!\n", + "주문하시면 \n", + "* 케이크 종류: \n", + "* 맛: \n", + "* 글자: \n", + "* 디자인: \n", + "* 배송 날짜: \n", + "등을 알려주시면 제작 가능합니다.\n", + "\n", + "감사합니다.\n", + "\n", + "TOTAL TIME ELAPSED: 4.81s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m11s\u001b[0m 1s/step - loss: 0.5804 - sparse_categorical_accuracy: 0.7427 \n", + "Epoch 11/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 644ms/step - loss: 0.5329 - sparse_categorical_accuracy: 0.7624\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일을 위해 3호 케이크 1개 주문 가능합니다. 
\n", + "다음과 같은 케이크를 제작할 수 있습니다.\n", + "\n", + "* 케이크 종류: 3호 케이크\n", + "* 디자인: 결혼 기념일을 위한 디자인 (예: 결혼 날짜, 사진 등)\n", + "* 옵션: \n", + " * 맛: 딸기, 초콜릿, 바닐라 등\n", + " * 장식: 꽃, 글자, 그림 등\n", + "* 주문 가능 날짜: 2023년 10월 28일\n", + "* 주문 가능 시간: 10:00 - 18:00\n", + "* 주문 가능 금액: 15,000원\n", + "\n", + "다음과 같은 디자인을 원하시면, \n", + "[디자인 요청]을 작성해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 11.07s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m18s\u001b[0m 2s/step - loss: 0.5311 - sparse_categorical_accuracy: 0.7624 \n", + "Epoch 12/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 649ms/step - loss: 0.4876 - sparse_categorical_accuracy: 0.7834\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일을 위한 3호 케이크 주문 가능합니다. \n", + "다음과 같은 케이크를 제작할 수 있습니다.\n", + "\n", + "* 케이크 종류: 3호 케이크\n", + "* 디자인: 결혼 기념일을 위한 디자인 (예: 결혼 날짜, 사진 등)\n", + "* 옵션: \n", + " * 글자 적용 (예: \"사랑하는 당신과 함께 10년을 맞이합니다.\")\n", + " * 장식 (예: 꽃, 깃털 등)\n", + "* 주문 가능 날짜: 2023년 10월 28일\n", + "* 주문 가능 시간: 10:00 - 18:00\n", + "* 가격: 15,000원\n", + "\n", + "다음과 같은 디자인을 원하시나요?\n", + "[가격, 디자인, 옵션 등을 포함한 이미지]\n", + "\n", + "감사합니다.\n", + "TOTAL TIME ELAPSED: 11.14s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m18s\u001b[0m 2s/step - loss: 0.4859 - sparse_categorical_accuracy: 0.7835 \n", + "Epoch 13/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 650ms/step - loss: 0.4461 - sparse_categorical_accuracy: 0.8033\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 
\n", + "다음과 같은 케이크를 제작할 수 있습니다.\n", + "\n", + "* 케이크 종류: 3호 케이크\n", + "* 디자인: 결혼 기념일을 위한 디자인 (예: 손님 이름, 결혼 날짜 등)\n", + "* 픽업 날짜 및 시간: 2023년 10월 28일 오전 10시\n", + "* 주문 가능한 옵션: \n", + " * 케이크 픽업\n", + " * 케이크 배달\n", + "* 가격: 15,000원\n", + "* 추가 옵션: \n", + " * 케이크 장식 (예: 꽃, 글자, 사진 등)\n", + " * 케이크 포장 (예: 핸드메이드 포장, 특별한 포장 등)\n", + "\n", + "고객님의 요청에 맞춰 케이크를 제작해 드리겠습니다.\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 11.90s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m19s\u001b[0m 2s/step - loss: 0.4444 - sparse_categorical_accuracy: 0.8038 \n", + "Epoch 14/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 648ms/step - loss: 0.4074 - sparse_categorical_accuracy: 0.8195\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. \n", + "다음과 같은 종류가 있습니다.\n", + "\n", + "* 3호 케이크: 12인분\n", + "* 디자인: [케이크 디자인 예시: 사랑스러운 로맨틱한 디자인]\n", + "* 픽업 날짜 및 시간: [픽업 날짜 및 시간 예시: 2023년 12월 25일 오전 10시]\n", + "* 가격: 15,000원\n", + "\n", + "주문하시면 상세한 정보와 함께 케이크 디자인을 함께 보여드리겠습니다.\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 8.32s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m15s\u001b[0m 2s/step - loss: 0.4058 - sparse_categorical_accuracy: 0.8199 \n", + "Epoch 15/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.3711 - sparse_categorical_accuracy: 0.8331\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 
\n", + "3호 케이크는 12월 25일 주문 가능합니다.\n", + "다음과 같은 디자인을 원하시나요?\n", + "\n", + "* 흰색 케이크, 꽃 장식\n", + "* 빨간색 케이크, 딸기 장식\n", + "* 초록색 케이크, 초콜릿 장식\n", + "\n", + "혹시 다른 디자인이 있으시면 말씀해주세요.\n", + "TOTAL TIME ELAPSED: 5.83s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m13s\u001b[0m 1s/step - loss: 0.3696 - sparse_categorical_accuracy: 0.8336 \n", + "Epoch 16/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.3365 - sparse_categorical_accuracy: 0.8525\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 원하시는 디자인이나 특별한 요청 사항이 있으시면 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 3.28s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 1s/step - loss: 0.3353 - sparse_categorical_accuracy: 0.8529 \n", + "Epoch 17/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 647ms/step - loss: 0.3060 - sparse_categorical_accuracy: 0.8617\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 
원하시는 디자인이나 특별한 요청 사항이 있으시면 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 3.28s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 1s/step - loss: 0.3049 - sparse_categorical_accuracy: 0.8621 \n", + "Epoch 18/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 647ms/step - loss: 0.2785 - sparse_categorical_accuracy: 0.8732\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 원하시는 디자인이나 특별한 요청 사항이 있으시면 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 3.28s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 1s/step - loss: 0.2775 - sparse_categorical_accuracy: 0.8739 \n", + "Epoch 19/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.2531 - sparse_categorical_accuracy: 0.8840\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 
원하시는 디자인이나 특별한 요청 사항이 있으시면 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 3.28s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m10s\u001b[0m 1s/step - loss: 0.2522 - sparse_categorical_accuracy: 0.8847 \n", + "Epoch 20/20\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m0s\u001b[0m 646ms/step - loss: 0.2287 - sparse_categorical_accuracy: 0.8962\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다. 3호 케이크는 픽업 또는 배송 가능한지, 각 케이크에 대한 디자인이나 문구가 필요한지 등 추가 문의를 해주시면 감사하겠습니다.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 4.35s\n", + "\u001b[1m10/10\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m11s\u001b[0m 1s/step - loss: 0.2278 - sparse_categorical_accuracy: 0.8969 \n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiMAAAGdCAYAAADAAnMpAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABAUElEQVR4nO3dd3wUdeL/8fduOiEJJZBGIBTpECBACMWaA1FRbBSVLihi5fyd8lXBO+9Ez3IWEBBpigqcCqIgiDlRSiCQANKlh5aElkJC2u78/kCjUQIsJJns7uv5eOzjIZOZyXsc1n07+5nPWAzDMAQAAGASq9kBAACAe6OMAAAAU1FGAACAqSgjAADAVJQRAABgKsoIAAAwFWUEAACYijICAABM5Wl2gMtht9t17NgxBQQEyGKxmB0HAABcBsMwlJOTo/DwcFmtZV//cIoycuzYMUVGRpodAwAAXIHDhw+rXr16Zf7cKcpIQECApPMHExgYaHIaAABwObKzsxUZGVnyOV4Wpygjv341ExgYSBkBAMDJXGqIBQNYAQCAqSgjAADAVJQRAABgKsoIAAAwFWUEAACYijICAABMRRkBAACmoowAAABTUUYAAICpKCMAAMBUlBEAAGAqyggAADCVW5eRZdvS9OS8TcrJLzI7CgAAbsspntpbEc4V2vTcwq06lVuolNRMvTOwvdpF1jA7FgAAbsdtr4z4eXto2qAYRdTwU+rpPN0zZa2mrNwnu90wOxoAAG7FbcuIJHWMqqWlT/TQrW3CVGw39OqyXRo8M0kZ2flmRwMAwG24dRmRpCA/L026r71evbuN/Lw8tHrvSd389ip9vyvD7GgAALgFty8jkmSxWNS/U3199Vh3tQgL1OncQg2bvUF//2q7CoptZscDAMClUUZ+p0nd6lr4SFcN6xYlSZq15qDunLxWezPOmhsMAAAXRhn5A18vD03o00ozhnRULX9v7TierT7vrtb8DakyDAa3AgBQ3igjZbipRYi+eaKHujWprXNFNj3z+VY9+ukmZZ1jThIAAMoTZeQiQgJ99dHwWD1zc3N5Wi1a8tNx3fL2KiUfOm12NAAAXAZl5BKsVotGX99Y/304TpG1/HQ085z6TVundxP2yMacJAAAXDXKyGVqX7+mlj7eQ3e0C5fNbuiNFT/rvunrdDzrnNnRAABwapQRBwT4eumt/u30xr3RqubtofUHTqv326v07fY0s6MBAOC0KCMOslgsujumnpY83kOtIwKVmVekUR8l64VF25RfxJwkAAA4ijJyhRoG++uL0d00skdDSdJH6w7pjklr9HN6jsnJAABwLpSRq+DtadVzt7bUnOGdFVzdW7vTc9Tn3dWau+4Qc5IAAHCZKCPl4LqmdfTNE9fq2qZ1VFBs1/OLtunhucnKzCs0OxoAAFUeZaSc1Anw0eyhnfT8rS3k5WHR8u3p6v32KiUdYE4SAAAuhjJSjqxWix7s0UhfjO6mhsH+Op6Vrwc+WK9Ve06YHQ0AgCqLMlIB2tQL0tePdVfPliEqtNk16sNkJR86Y3YsAACqJMpIBfH38dS797VXj2uCda7IpmGzkrTzeLbZsQAAqHIoIxXIx9ND0wbFKKZBTWXnF2vQjCQdPJlrdiwAAKoUykgFq+btqZlDO6lFWKBOni3Q/R+sZwp5AAB+hzJSCYL8vPTh8M5qGOyvo5nn9MAH63XqbIHZsQAAqBIoI5WkToCPPhrRWWFBvtp3IldDZ21QTn6R2bEAADAdZaQS1atZTR+NiFUtf29tPZqlEXM28jwbAIDbo4xUsiZ1q+vD4Z0V4OOppAOn9cjHKSqy2c2OBQCAaSgjJmgdEaQZQzvJx9Oq/+3K0F8XbJHNzrNsAADuiTJiks4Na2nqAzHytFq0eMsxjf9yGw/XAwC4JcqIiW5oXlf/6d9OFov08fpU/Xv5brMjAQBQ6SgjJusTHa6X72wjSZqycp+m/rDP5EQAAFQuykgVMLBzfY3r3VyS9Mo3u/T
J+lSTEwEAUHkoI1XEQ9c11iPXN5YkPbdoqxZvOWZyIgAAKofDZeTHH39Unz59FB4eLovFokWLFl1ym5UrV6pDhw7y8fFRkyZNNHv27CuI6vr+X69meqBLfRmGNHb+Zn2/K8PsSAAAVDiHy0hubq6io6M1efLky1r/wIEDuvXWW3XDDTdo8+bNevLJJ/Xggw9q+fLlDod1dRaLRf+4vbXuaBeuYruhh+cma/3+U2bHAgCgQlmMq7if1GKxaOHCherbt2+Z6zzzzDNasmSJtm3bVrJswIAByszM1LJlyy7r92RnZysoKEhZWVkKDAy80rhOo8hm18MfJSthV4YCfDz16aguah0RZHYsAAAccrmf3xU+ZiQxMVHx8fGllvXq1UuJiYkV/audlpeHVZPv76DYhrWUU1CswTOTtDfjrNmxAACoEBVeRtLS0hQSElJqWUhIiLKzs3Xu3LkLblNQUKDs7OxSL3fj6+WhD4Z0VJuIIJ3OLdSgGet15Eye2bEAACh3VfJumokTJyooKKjkFRkZaXYkUwT4emnO8M5qUre6jmfl64EP1utEToHZsQAAKFcVXkZCQ0OVnp5eall6eroCAwPl5+d3wW3GjRunrKysktfhw4crOmaVVcvfW3NHxKpeTT8dPJWnwTOTlJVXZHYsAADKTYWXkbi4OCUkJJRatmLFCsXFxZW5jY+PjwIDA0u93FlokK/mjohVcHUf7TyerWGzk5RXWGx2LAAAyoXDZeTs2bPavHmzNm/eLOn8rbubN29Waur5WUPHjRunwYMHl6z/8MMPa//+/frb3/6mXbt26b333tOCBQv01FNPlc8RuImoYH99NKKzAn09lZKaqYc+SlZBsc3sWAAAXDWHy8jGjRvVvn17tW/fXpI0duxYtW/fXuPHj5ckHT9+vKSYSFLDhg21ZMkSrVixQtHR0XrjjTf0wQcfqFevXuV0CO6jRVigZg/vrGreHlq156SenLdZxTa72bEAALgqVzXPSGVxt3lGLmX1npMaPnuDCm123Rdbv+RBewAAVCVVZp4RlL/u1wTrnYHtZbFIn6xP1cfrD5kdCQCAK0YZcVI3tw7V/+vVTJL04uLtSj502uREAABcGcqIExt9XWPd0iZURTZDD89NUXp2vtmRAABwGGXEiVksFr12T7SahQToRE6BHp7LHTYAAOdDGXFy/j6emjYoRoG+ntqUmqkXF+8wOxIAAA6hjLiAqGD/kgGtnyal6pP1qZfeCACAKoIy4iKub1ZXT/c8P6B1wuJtSj50xuREAABcHsqIC3nk+sbq3fr8gNbRc5MZ0AoAcAqUERdisVj0+r3RahpSXRk5BRo9N1mFxczQCgCo2igjLsbfx1PvD+pY8gybF7/abnYkAAAuijLigqKC/fX272Zo/TSJAa0AgKqLMuKibvj9gNYvtysllQGtAICqiTLiwh65vrFubhWqQptdD3+UrAwGtAIAqiDKiAuzWCx6vV+0rql7fkDrIx+nMKAVAFDlUEZcXHUfT70/uKMCfD218dAZ/Z0BrQCAKoYy4gYaBvvrnQHnB7R+vD5V8xjQCgCoQigjbuKG5nX11780lSSNZ0ArAKAKoYy4kUeub6JerUJUaLNr9NxkZeQwoBUAYD7KiBuxWi16o187NalbXenZBXpkLgNaAQDmo4y4meo+nnp/UEzJgNaXvt5hdiQAgJujjLihRnWq6+0B7WSxSB+tO6QFGw6bHQkA4MYoI27qxuYheir+/IDW5xdt0+bDmeYGAgC4LcqIG3v0hibq2TLktxlaGdAKADABZcSNWa0Wvdn//IDWtOx8jWGGVgCACSgjbq66j6emDYpRgI+nNhw8o38uYUArAKByUUagxnWq660B7SRJHyYe0oKNDGgFAFQeyggkSTe1KD2gdQsDWgEAlYQyghKP3dhEf2kZosJiux76KFkncgrMjgQAcAOUEZSwWi16s1+0GtfxLxnQWmRjQCsAoGJRRlBKgK+X3h/cUQE+nko
6eFrPL9wmwzDMjgUAcGGUEfxJ4zrV9fbAdrJapPkbD+vthD1mRwIAuDDKCC7oxuYh+scdrSVJb323hynjAQAVhjKCMj3QpYHG3NBYkjRu4VZ9vzvD5EQAAFdEGcFFPd2zme7qECGb3dCYj1P005FMsyMBAFwMZQQXZbFY9MpdbdXjmmDlFdo0fPYGpZ7KMzsWAMCFUEZwSd6eVk15IEYtwwJ18myhhsxK0uncQrNjAQBcBGUEl6W6j6dmD+ukiBp+OnAyVyPmbNC5QpvZsQAALoAygstWN9BXc4Z3UpCflzalZurxeZtkszMHCQDg6lBG4JAmdQM0Y0hHeXtatWJHuiYsZlI0AMDVoYzAYR2jaunt/u1ksUhz16XqvZX7zI4EAHBilBFckd5twjT+tpaSpNeW79YXKUdMTgQAcFaUEVyxYd0aatS1jSRJf/vsJ63ac8LkRAAAZ0QZwVV59ubm6hMdrmK7odFzU7T9WJbZkQAAToYygqtitVr0+r1tFdeots4WFGvorA06coZJ0QAAl48ygqvm4+mhqYNi1CwkQCdyCjRkZpIy85gUDQBweSgjKBdBfl6aPbyTwoJ8te9ErkZ+uFH5RUyKBgC4NMoIyk1YkJ9mD+usAF9PbTh4RmMXbJadSdEAAJdAGUG5ahYaoPcHdZS3h1VLt6bppSU7mBQNAHBRlBGUu7jGtfV6v2hJ0qw1B/XBqgMmJwIAVGWUEVSI26PD9X+3NJck/WvpTi3ecszkRACAqooyggozskcjDe0aJUl6esEWJe47ZW4gAECVRBlBhbFYLHrhtpbq3TpUhTa7Rn20UbvSss2OBQCoYigjqFAeVov+07+dOkXVVE5+sYbO3KDjWefMjgUAqEIoI6hwvl4emj64o5rUra607HwNnblBWeeKzI4FAKgiKCOoFDWqeWv2sE6qG+Cj3ek5euijjSooZlI0AABlBJWoXs1qmjWsk6r7eGrd/tN6+r8/MSkaAIAygsrVKjxIUx7oIE+rRV9tOaa/ff6TbBQSAHBrlBFUuh7X1NGb/dvJapE+Sz6i//ffLRQSAHBjlBGY4vbocL07sIM8rBZ9semoxi7YrGKb3exYAAATUEZgmlvbhmnyfe3labXoy83H9MT8zSqikACA26GMwFQ3tw7Te/d3kJeHRUt+Oq7HP91EIQEAN0MZgel6tgrV1Adi5O1h1Tfb0jTm4xQVFlNIAMBdUEZQJdzUIkTTBsfI29Oqb3ek65GPk5mHBADcBGUEVcYNzerqg8Ed5eNp1Xc7M/TwR8nKL6KQAICro4ygSrm2aR3NHNpJvl5Wfb/7hEZRSADA5VFGUOV0axKsWUM7y8/LQz/+fEIPztmoc4UUEgBwVVdURiZPnqyoqCj5+voqNjZWSUlJF13/rbfeUrNmzeTn56fIyEg99dRTys/Pv6LAcA9xjWtrzvDOqubtodV7T2r47A3KKyw2OxYAoAI4XEbmz5+vsWPHasKECUpJSVF0dLR69eqljIyMC67/ySef6Nlnn9WECRO0c+dOzZgxQ/Pnz9f//d//XXV4uLbODWvpw+GdVd3HU4n7T2norA06W0AhAQBX43AZefPNNzVy5EgNGzZMLVu21NSpU1WtWjXNnDnzguuvXbtW3bp103333aeoqCj17NlTAwcOvOTVFECSOkbV0ocjOivAx1NJB05r6Mwk5eQXmR0LAFCOHCojhYWFSk5OVnx8/G87sFoVHx+vxMTEC27TtWtXJScnl5SP/fv3a+nSpbrlllvK/D0FBQXKzs4u9YL76lC/puY+GKtAX09tPHRGg2cmKZtCAgAuw6EycvLkSdlsNoWEhJRaHhISorS0tAtuc9999+kf//iHunfvLi8vLzVu3FjXX3/9Rb+mmThxooKCgkpekZGRjsSEC4qOrKGPH+yiID8vbUrN1KAP1ivrHIUEAFxBhd9Ns3LlSr388st67733lJKSoi+++EJLlizRSy+9VOY248aNU1ZWVsnr8OHDFR0
TTqBNvSB9MjJWNat5acuRLD3wwXpl5hWaHQsAcJUcKiPBwcHy8PBQenp6qeXp6ekKDQ294DYvvPCCBg0apAcffFBt2rTRnXfeqZdfflkTJ06U3X7hKb99fHwUGBhY6gVIUqvwIH0ysotq+Xtr69Es3Td9vc7kUkgAwJk5VEa8vb0VExOjhISEkmV2u10JCQmKi4u74DZ5eXmyWkv/Gg8PD0mSYRiO5gXUIixQn47souDq3tpxPFsDp6/TqbMFZscCAFwhh7+mGTt2rKZPn645c+Zo586dGj16tHJzczVs2DBJ0uDBgzVu3LiS9fv06aMpU6Zo3rx5OnDggFasWKEXXnhBffr0KSklgKOahQZo3qguqhPgo11pORo4fZ1O5FBIAMAZeTq6Qf/+/XXixAmNHz9eaWlpateunZYtW1YyqDU1NbXUlZDnn39eFotFzz//vI4ePao6deqoT58++te//lV+RwG31KTu+UJy3/R1+jn9rAZOX6dPRsaqboCv2dEAAA6wGE7wXUl2draCgoKUlZXF+BH8ycGTuRo4fZ2OZ+WrUR1/fTqyi0ICKSQAYLbL/fzm2TRwelHB/po/Kk4RNfy0/0SuBry/TsezzpkdCwBwmSgjcAn1a1fTvFFdVK+mnw6czFX/aet0NJNCAgDOgDIClxFZ63whqV+rmlJP56nf1ETtP3HW7FgAgEugjMCl1KtZTfMf6qKGwf46mnlO905N1NYjWWbHAgBcBGUELicsyE//fThOrSMCdSq3UAPeT9TqPSfNjgUAKANlBC4puLqPPh3ZRV0b11ZuoU3DZifp65+OmR0LAHABlBG4rABfL80a1km3tglTkc3QY59u0keJB82OBQD4A8oIXJqPp4feGdheD3SpL8OQXvhyu/6z4mceRQAAVQhlBC7Pw2rRS3e01pPx10iS3k7Yo+cXbZPNTiEBgKqAMgK3YLFY9GR8U73Ut7UsFunj9al67NMUFRTbzI4GAG6PMgK3MqhLA00a2EFeHhYt3ZqmYbM2KCe/yOxYAODWKCNwO7e2DdPsYZ3l7+2htftO8cRfADAZZQRuqVuTYM0bFafa/t7adjRb905dq8On88yOBQBuiTICt9WmXpA+G91V9Wr66eCpPN01Za12Hs82OxYAuB3KCNxaw2B/fT66q5qHBuhEToH6TUvU+v2nzI4FAG6FMgK3FxLoq/kPxalTVE3l5Bdr0Mwkfbs9zexYAOA2KCOApCA/L300IlbxLUJUWGzXw3OTNX9DqtmxAMAtUEaAX/h6eWjqAx10b0w92Q3pmc+36r2Ve5mtFQAqGGUE+B1PD6v+fU9bPXxdY0nSv5ft1ktf75Sd2VoBoMJQRoA/sFgserZ3cz1/awtJ0sw1BzR2wWYVFttNTgYArokyApThwR6N9J/+0fK0WrRo8zE9+OFG5RUWmx0LAFwOZQS4iDvb19P0IR3l62XVjz+f0H3T1+tMbqHZsQDApVBGgEu4oVldffxgFwX5eWnz4UzdOy1RxzLPmR0LAFwGZQS4DDENauqzh+MUFuSrvRlndfeUtdqTnmN2LABwCZQR4DJdExKgz0Z3VeM6/jqela+7p6xltlYAKAeUEcABETX89N+Hu6pD/RrKzi/WoBlJWrzlmNmxAMCpUUYAB9Xy99YnI7uoV6sQFdrsevzTTZr2wz4mRwOAK0QZAa6Ar5eH3rs/RsO6RUmSJn6zS+O/3C4bk6MBgMMoI8AV8rBaNKFPKz1/awtZLNJH6w7poY+Sda7QZnY0AHAqlBHgKj3Yo5Em39dB3p5WfbczXQOmr9PJswVmxwIAp0EZAcrBLW3C9PGDsapRzUtbDmfqrvfWav+Js2bHAgCnQBkBykmnqFr6fHRXRdbyU+rpPN09Za2SD502OxYAVHmUEaAcNa5TXV+M7qa29YJ0Jq9I901fr2XbjpsdCwCqNMoIUM7qBPho3qguuql5XRUU2zX64xTNXH3A7FgAUGVRRoAKUM3bU9MGxej+2Po
yDOkfX+/QS1/vkJ1bfwHgTygjQAXx9LDqn31b6283N5MkzVh9QI9+mqL8Im79BYDfo4wAFchiseiR65vo7QHt5OVh0dKtaXrgg/U6k1todjQAqDIoI0AluKNdhD4cHqsAX09tPHRGd09Zq9RTeWbHAoAqgTICVJK4xrX1+eiuCg/y1f6TubpryhptOZxpdiwAMB1lBKhETUMCtHBMN7UMC9TJs4Ua8P46JexMNzsWAJiKMgJUspBAXy14OE7XNq2jc0U2jfxwo+auO2R2LAAwDWUEMEF1H0/NGNJR/TrWk92Qnl+0Ta8u28WtvwDcEmUEMImXh1Wv3t1WT8U3lSRNWblPTy3YrIJibv0F4F4oI4CJLBaLnoi/Rq/fGy1Pq0Vfbj6mITOTlHWuyOxoAFBpKCNAFXBPTD3NGtZJ1X08tW7/ad0zZa2OnOHWXwDugTICVBE9rqmjBQ/FKSTQR3syzqrv5DU89ReAW6CMAFVIy/BALXzkt1t/B76/Xgs3HTE7FgBUKMoIUMWE1/DTfx+OU8+WISq02fXU/C16bTl32gBwXZQRoAry9/HU1AdiNPr6xpKkyd/v0yMfpyivsNjkZABQ/igjQBVltVr0zM3N9ca90fL2sGrZ9jT1m5aotKx8s6MBQLmijABV3N0x9fTxyFjV8vfWtqPZun3Sav10JNPsWABQbigjgBPoFFVLX47ppqYh1ZWRU6B7pyZqyU/HzY4FAOWCMgI4icha1fT56K66oVkdFRTbNeaTFL2TsEeGwcBWAM6NMgI4kQBfL30wpJNGdG8oSXpzxc96Yt5m5RcxhTwA50UZAZyMh9WiF25rqYl3tZGn1aLFW45pwPvrlJHDwFYAzokyAjipgZ3r68MRnRXk56XNhzPVd9Ia7TiWbXYsAHAYZQRwYl0bB2vRmG5qFOyvY1n5umfqWq3YkW52LABwCGUEcHINg/218JFu6t4kWHmFNo36aKOm/rCPga0AnAZlBHABQdW8NGtYJz3Qpb4MQ3rlm136f5/9pIJiBrYCqPooI4CL8PKw6p992+jvt7eS1SJ9lnxEgz5I0qmzBWZHA4CLoowALmZI1yjNGtZZAT6eSjp4Wn3fW6Of03PMjgUAZaKMAC7ouqZ1tHBMV9WvVU2HT5/T3e+t1crdGWbHAoALoowALqpJ3QAtGtNNnRvWUk5BsYbP3qBZaw4wsBVAlUMZAVxYLX9vzR0Rq34d68luSH//aoeeW7RNRTa72dEAoARlBHBx3p5WvXp3W/3fLc1lsUifrE/VkJlJOpNbaHY0AJBEGQHcgsVi0ahrG2v6oI7y9/bQ2n2ndMfkNdqdxsBWAOa7ojIyefJkRUVFydfXV7GxsUpKSrro+pmZmRozZozCwsLk4+Ojpk2baunSpVcUGMCVi28Zos8f6arIWn5KPZ2nO99bo2Xb0syOBcDNOVxG5s+fr7Fjx2rChAlKSUlRdHS0evXqpYyMC4/ULyws1F/+8hcdPHhQn332mXbv3q3p06crIiLiqsMDcFzz0EAtHtNdXRvXVl6hTQ/PTdbb3+2R3c7AVgDmsBgODq2PjY1Vp06dNGnSJEmS3W5XZGSkHnvsMT377LN/Wn/q1Kl67bXXtGvXLnl5eV1RyOzsbAUFBSkrK0uBgYFXtA8ApRXb7PrX0p2ateagJOnmVqF6o1+0/H08zQ0GwGVc7ue3Q1dGCgsLlZycrPj4+N92YLUqPj5eiYmJF9xm8eLFiouL05gxYxQSEqLWrVvr5Zdfls1W9jTVBQUFys7OLvUCUL48Paya0KeV/n1PW3l7WLVse5ruem+tUk/lmR0NgJtxqIycPHlSNptNISEhpZaHhIQoLe3C3zvv379fn332mWw2m5YuXaoXXnhBb7zxhv75z3+W+XsmTpyooKCgkldkZKQjMQE4oF/HSM17qIvqBPhod3qObp+8Wmv2njQ7FgA3UuF309jtdtWtW1fvv/++YmJi1L9/fz333HOaOnVqmduMGzdOWVl
ZJa/Dhw9XdEzArXWoX1NfPdpd0ZE1lJlXpMEzk5ggDUClcaiMBAcHy8PDQ+np6aWWp6enKzQ09ILbhIWFqWnTpvLw8ChZ1qJFC6Wlpamw8MLzHPj4+CgwMLDUC0DFCg3y1fxRXXRXhwjZ7Ib+/tUO/Y0n/wKoBA6VEW9vb8XExCghIaFkmd1uV0JCguLi4i64Tbdu3bR3717Z7b/N+Pjzzz8rLCxM3t7eVxgbQEXw9fLQG/dG64XbWspqkf6bfEQD3l+njOx8s6MBcGEOf00zduxYTZ8+XXPmzNHOnTs1evRo5ebmatiwYZKkwYMHa9y4cSXrjx49WqdPn9YTTzyhn3/+WUuWLNHLL7+sMWPGlN9RACg3FotFI7o31JzhnRXk56VNqZnqM2m1Nh/ONDsaABfl8D18/fv314kTJzR+/HilpaWpXbt2WrZsWcmg1tTUVFmtv3WcyMhILV++XE899ZTatm2riIgIPfHEE3rmmWfK7ygAlLse19TRl2O6aeSHG7Un46z6TUvUxDvb6O6YemZHA+BiHJ5nxAzMMwKY52xBsZ6av1krdpwfKzaie0ON691cnh48TQLAxVXIPCMA3E91H09NeyBGj990jSRpxuoDGjprgzLzeNAegPJBGQFwSVarRWP/0lRT7u+gat4eWr33pO6YvEY/p/OgPQBXjzIC4LL1bhOmz0d3Vb2afjp0Kk93Tl6jb7fzoD0AV4cyAsAhLcICtfjR7oprVFu5hTaN+ihZ7yTwoD0AV44yAsBhtfy99eGIzhraNUqS9OaKnzXmkxTlFhSbGwyAU6KMALgiXh5WvXh7K/377vMP2vtmW5runrJWh0/zoD0AjqGMALgq/TpF6tNRXRRc3Ue70nJ0+6TVWsuD9gA4gDIC4KrFNKiprx7rprb1gnQmr0iDZibpg1X7edAegMtCGQFQLsKC/LTgobiSB+39c8lOPTFvs/IKGUcC4OIoIwDKza8P2vv77a3kabVo8ZZjuuu9tUo9xTgSAGWjjAAoVxaLRUO6RumTkb+NI+kzabVW7s4wOxqAKooyAqBCdG5YS18/1l3tImso61yRhs3eoMnf72UcCYA/oYwAqDChQb6a/1AXDexcX4YhvbZ8tx6em6yzzEcC4HcoIwAqlI+nhybe1UYT72ojbw+rlm9PV9/Ja7TvxFmzowGoIigjACrFwM71Nf+hLgoN9NXejLPqO2mNVuxINzsWgCqAMgKg0rSvX1NfPdZdnaNqKaegWCM/3Kg3v93Nc20AN0cZAVCp6gT46OORsSXPtXnnf3s1Ys4GZZ0rMjcYANNQRgBUul+fa/Nmv2j5eFr1/e4Tun3Sau1OyzE7GgATUEYAmOauDvX0+eiuiqjhp0On8tR38hp9/dMxs2MBqGSUEQCmah0RpK8e665uTWrrXJFNj36ySROX7lSxzW52NACVhDICwHS1/L01Z1hnPXRtI0nStB/3a8isJJ3OLTQ5GYDKQBkBUCV4elg17pYWendge/l5eWjN3lPq8+5qbTuaZXY0ABWMMgKgSukTHa6FY7qqQe1qOpp5TndPWasvUo6YHQtABaKMAKhymocGavGY7rqhWR0VFNs1dsEWvbh4u4oYRwK4JMoIgCopqJqXZgzppMdvbCJJmr32oO6fvl4ncgpMTgagvFFGAFRZVqtFY3s207RBMaru46mkg6d127urlHzotNnRAJQjygiAKq9Xq1AtGtNNjev4Kz27QP2mrdO7CXtkYxp5wCVQRgA4hSZ1q2vRmG66PTpcNruhN1b8rPumr9PxrHNmRwNwlSgjAJxGgK+X3h7QTq/fG61q3h5af+C0bn5rlZZtSzM7GoCrQBkB4FQsFovuiamnJY/3UJuIIGWdK9LDc5P13MKtOldoMzsegCtAGQHglBoG++vz0V016pdZWz9en6rbJ63WrrRsk5MBcBRlBIDT8va06v9uaaEPh3dWcHUf7ck4q9snrdGctQdlGAxuBZwFZQSA07u2aR0te7K
HbmhWR4XFdk1YvF0jP9zIs20AJ0EZAeASgqv7aObQThp/W0t5e1j13c4M9X77R63de9LsaAAugTICwGVYLBYN795QC8d0VaNf5iS5f8Z6vbpsF1PJA1UYZQSAy2kVHqSvH+uuAZ0iZRjSlJX7dM/URB06lWt2NAAXQBkB4JKqeXvqlbvb6r37OyjQ11NbDmfq1ndWa9Gmo2ZHA/AHlBEALu2WNmH65slr1Smqps4WFOvJ+Zs1dv5mnS0oNjsagF9QRgC4vIgafvp0ZBc9GX+NrBbpi01Hdds7q7TlcKbZ0QCIMgLATXh6WPVkfFPNfyhOETX8dPBUnu6eslZTf9gnOw/cA0xFGQHgVjpF1dLSx3voljahKrYbeuWbXRo8M0kZ2flmRwPcFmUEgNsJqualyfd10Ct3tZGfl4dW7z2pm99epYSd6WZHA9wSZQSAW7JYLBrQub6+eqy7WoYF6nRuoUbM2agXF29XfhEP3AMqE2UEgFtrUre6Fo7pquHdGkqSZq89qFveXqXkQ6dNTga4D8oIALfn4+mh8X1aatawTgoJ9NH+k7m6Z2qiXvp6h84VcpUEqGiUEQD4xQ3N6urbJ6/TPTH1ZBjSjNUH1PvtH5V0gKskQEWijADA7wRV89Lr90Zr1rBOCg301cFTeer/fqJeXLxdeYVMlAZUBMoIAFzADc3q6tux15Y832b22oO6+a1VWrf/lNnRAJdDGQGAMgT6eumVu9tqzvDOCg/yVerpPA14f53Gf7lNuUwnD5QbyggAXMJ1Teto+VPXamDn+pKkDxMPqddbP2rt3pMmJwNcA2UEAC5DgK+XJt7VRnNHxCqihp+OnDmn+z5Yr+cWbuWhe8BVoowAgAO6XxOs5U9dq0FdGkiSPl6fql7/+VGr93CVBLhSlBEAcFB1H0+91Le1PhkZq8hafjqaeU4PzFivcV/8pOz8IrPjAU6HMgIAV6hr42Ate+JaDe0aJUn6NOmwev3nR63cnWFuMMDJUEYA4Cr4+3jqxdtbaf6oLmpQu5qOZ+Vr6KwN+ttnW5R1jqskwOWgjABAOYhtVFvLnrhWw7s1lMUiLdh4RL3+86P+t4snAQOXQhkBgHLi533+GTf/fShODYP9lZadr+GzN+qvC7YoK4+rJEBZKCMAUM46RtXS0sd7aGSP81dJPk85or/85wd9t4OrJMCFUEYAoAL4eXvouVtb6rOHu6pRHX9l5BTowQ836ol5m3TybIHZ8YAqhTICABUopkFNLX28hx66rpGsFunLzcd04+sr9fH6Q7LbDbPjAVUCZQQAKpivl4fG9W6hhY90U+uIQGXnF+u5hdt099S12n4sy+x4gOkoIwBQSaIja+jLMd31Yp+Wqu7jqU2pmerz7mq99PUOppSHW6OMAEAl8rBaNLRbQyX89Trd2jZMdkOasfqA4t/4Qd9sPS7D4KsbuB/KCACYICTQV5Pv66A5wzurfq1qSsvO1+iPUzR89gYdPp1ndjygUlFGAMBE1zWto2+fulaP39hEXh4Wfb/7hP7ynx80+fu9Kiy2mx0PqBSUEQAwma+Xh8b2bKZvnrhWcY1qK7/IrteW79Yt76zSuv2nzI4HVLgrKiOTJ09WVFSUfH19FRsbq6SkpMvabt68ebJYLOrbt++V/FoAcGlN6lbXJyNj9Vb/dgqu7q29GWc14P11+uuCLTrF3CRwYQ6Xkfnz52vs2LGaMGGCUlJSFB0drV69eikj4+JPqTx48KCefvpp9ejR44rDAoCrs1gs6ts+Qgljr9f9sfVLZnC98Y0f9GlSKnOTwCVZDAeHbsfGxqpTp06aNGmSJMlutysyMlKPPfaYnn322QtuY7PZdO2112r48OFatWqVMjMztWjRosv+ndnZ2QoKClJWVpYCAwMdiQsATm1T6hk9t3CbdhzPliR1qF9D/7qzjVqE8d9CVH2X+/nt0JWRwsJCJScnKz4+/rcdWK2Kj49XYmJimdv94x//UN26dTVixIjL+j0FBQX
Kzs4u9QIAd9S+fk0tfrSbXritpfy9PZSSmqnb3l2tfy3ZoVzmJoGLcKiMnDx5UjabTSEhIaWWh4SEKC0t7YLbrF69WjNmzND06dMv+/dMnDhRQUFBJa/IyEhHYgKAS/H0sGpE94ZK+Ov1uqVNqGx2Q9NXHVD8mz9o2bY05iaB06vQu2lycnI0aNAgTZ8+XcHBwZe93bhx45SVlVXyOnz4cAWmBADnEBrkq/fuj9GsoZ0UWctPx7Py9fDcZD04ZyNzk8CpeTqycnBwsDw8PJSeXvox2Onp6QoNDf3T+vv27dPBgwfVp0+fkmV2+/n75j09PbV79241btz4T9v5+PjIx8fHkWgA4DZuaF5X3za6TpO/36tpP+5Twq4Mrdl3Uk/c1FQjujeUtyezNsC5OPQ31tvbWzExMUpISChZZrfblZCQoLi4uD+t37x5c23dulWbN28ued1+++264YYbtHnzZr5+AYAr5Oftoad7NdM3T/RQbMNayi+y69Vlu3Tz2z/qux3pfHUDp+LQlRFJGjt2rIYMGaKOHTuqc+fOeuutt5Sbm6thw4ZJkgYPHqyIiAhNnDhRvr6+at26danta9SoIUl/Wg4AcFyTugGaN6qLFm46qn8t2an9J3L14Icb1bVxbT13awu1Cg8yOyJwSQ6Xkf79++vEiRMaP3680tLS1K5dOy1btqxkUGtqaqqsVi4RAkBlsVgsuqtDPcW3DNGUlfs0Y/UBrd13Sre9u1r3dKinp3s1U0igr9kxgTI5PM+IGZhnBAAu3+HTeXpt+W4t3nJMkuTn5aGHrmukUdc2UjVvh/8fFLhil/v5TRkBABeVknpG//x6h1JSMyVJIYE+erpnM93doZ6sVou54eAWKCMAABmGoSVbj+vVZbt0+PQ5SVKr8EA9d2sLdW18+VMuAFeCMgIAKFFQbNOctQf1bsJe5fwyc2t8ixCNu6W5GtepbnI6uCrKCADgT06dLdDbCXv08fpU2eyGPK0WPdClgZ646RrV9Pc2Ox5cDGUEAFCmvRlnNXHpTiXsOv/E9UBfTz124zUa3LWBfDw9TE4HV0EZAQBc0pq9J/XPJTu185enAtevVU3P9m6u3q1DZbEwyBVXhzICALgsNruhz5OP6PVvdysjp0CS1LFBTT1/W0u1i6xhbjg4NcoIAMAhuQXFmvbjfr3/4z7lF51/jtgd7cL1t5ubK6KGn8np4IwoIwCAK3I865xeX/6zvth0RIYh+XhaNaJ7Q42+vrECfL3MjgcnQhkBAFyVbUez9M8lO7Ru/2lJUnB1bz1+0zUa0Kk+TwbGZaGMAACummEYWrEjXRO/2aUDJ3MlSRE1/PRk/DW6s32EPD0oJSgbZQQAUG4Ki+2av/Gw3k3YUzLItVEdf/31L83Uu3Uo08vjgigjAIByd67Qpo/WHdR7K/cpM69IktQyLFBP92qqG5rV5XZglEIZAQBUmJz8Is1YfUAfrDqgs79MLx/ToKae7tlMcY1rm5wOVQVlBABQ4c7kFmrqD/s0e+1BFRSfvx24xzXBerpnM0UzR4nbo4wAACpNena+Jv1vr+ZtSFWR7fzHSs+WIfprz2ZqFhpgcjqYhTICAKh0h0/n6a3v9mjhpiOyG5LFIt0RHa4n45sqKtjf7HioZJQRAIBp9mbk6M0VP2vp1jRJkofVon4d6+mxG69ROLO5ug3KCADAdNuOZun1b3dr5e4TkiRvT6seiG2gR25orODqPianQ0WjjAAAqowNB0/rteW7lXTg/Gyu1bw9NLxbQ428tpGC/Jhi3lVRRgAAVYphGFq156ReW75bW49mSZICfT310HWNNaxblKp5e5qcEOWNMgIAqJIMw9Dy7el649vd2pNxVpIUXN1HY25orIGd68vXy8PkhCgvlBEAQJVmsxtavOWo/rNij1JP50mSavt7a1i3KA3qEqWganx94+woIwAAp1Bks2vBxsOasnKfjpw5J0ny9/bQ/V0aaET3hgoJ9DU5Ia4UZQQA4FS
KbXYt2XpcU1bu0660HEmSt4dVd3WI0KhrG6lRneomJ4SjKCMAAKdkGIZW7j6hKSv3Keng+btvLBapd+tQPXxdY7WtV8PcgLhslBEAgNPbePC0pv6wT9/tzChZ1r1JsEZf31hdG9fmKcFVHGUEAOAydqflaNoP+/TllmOy2c9/bLWtF6TR1zVWz1ah8rBSSqoiyggAwOUcOZOnD1Yd0LwNqcovOv+U4EbB/nroukbq2z5CPp7cFlyVUEYAAC7r1NkCzVl7UHMSDynrXJEkKSTQRyO6N9R9sQ1U3YcJ1KoCyggAwOWdLSjWvKRUfbDqgNKy8yWdn9V1cFyUhnaL4vk3JqOMAADcRmGxXYs2HdXUH/dp/4lcSZKPp1X9O0VqZI9GiqxVzeSE7okyAgBwOza7oRU70jRl5T5tOXL++TceVov6tA3TqGsbq2U4nyGViTICAHBbhmEocd8pTflhn1btOVmyvEujWhrataHiW9SVp4fVxITugTICAICkbUezNOWHfVq2La3ktuCIGn4aFNdAAzpFqkY1b5MTui7KCAAAv3Ms85zmrjukT5NSdSbv/B04vl5W3dk+QkO6Rql5KJ8v5Y0yAgDABeQX2bR4yzHNWnNQO49nlyyPa1RbQ7tFKb5FCJOolRPKCAAAF2EYhjYcPKPZaw9o+fb0Ul/hDOnaQP071ldQNS+TUzo3yggAAJep7K9w6mlo1yg1Cw0wOaFzoowAAOCg/CKbFm8+pllrS3+F07VxbQ3tGqWb+ArHIZQRAACukGEYSjpwWrPXHtTy7Wn65Rsc1avpp8FxfIVzuSgjAACUg6OZ5/RR4iHN25CqzF++wvHz8tCdHSI0tGuUmobwFU5ZKCMAAJSj/CKbvtx8VLPWHNSutJyS5d2a1NaQOL7CuRDKCAAAFcAwDK0/cFqz1xzUtzt++wonooaf+neKVL+OkQoN8jU3ZBVBGQEAoIIdOZOnuetSS32FY7VINzYP0X2xkbquaV23vlpCGQEAoJLkF9m0bFuaPklKVdKB0yXLw4N81b9TffXrVE9hQX4mJjQHZQQAABPszcjRp0mH9XnKkT9cLamr+2Lru9XVEsoIAAAmyi+yafn2NH28/s9XS/p1ilT/TpEuf7WEMgIAQBWxN+Os5iWl6rMLXC0Z2Lm+rm/mmldLKCMAAFQxv14t+WR9qtb/7mpJWJBvyZ044TVc52oJZQQAgCps34lfrpYkHyl5Ho7VIt3Q7NerJXXk6WE1OeXVoYwAAOAECoptWr49XZ+sP6R1+0tfLenX8fzYEme9WkIZAQDAyew/cVbzNhzWZ8lHdDq3UNL5qyXXN6ur/p0idWPzuvJyoqsllBEAAJxUQbFN325P1yfrU5W4/1TJ8uDq3rqzfYTu7RjpFM/EoYwAAOAC9p84q/kbD+uLlKM6kVNQsrxdZA316xip26LDFOhbNZ8gTBkBAMCFFNvs+uHnE1qw8bASdmao+JeH4vh6WdW7dZju7VhPXRrWlrUK3SJMGQEAwEWdPFugRZuOav6Gw9qTcbZkeWQtP90bE6m7Y+opogoMeqWMAADg4gzD0JYjWVqw8bC+2nxMOQXFkiSLRereJFj9OkbqLy1D5OvlYUo+yggAAG7kXKFNy7Yf14INR0oNeg3y81LfduG6t2OkWkcEVWomyggAAG4q9VSePks+f4vwsaz8kuUtwwLVr2M93dEuQjX9vSs8B2UEAAA3Z7MbWrP3pBZsPKxvt6er0GaXJHl7WPWXViHq1zFS3ZsEV9hzcSgjAACgRGZeob7cfEwLNh7W9mPZJcvDgnx1d4d6Ghhbv9wHvVJGAADABW07mqXPko9o4aajyjp3/rk4s4Z10g3N6pbr77ncz2/Pcv2tAACgymsdEaTWEUF6tndzfbczXSt2pOvaa+qYlocyAgCAm/L18tBtbcN1W9twU3M4z9N2AACAS7qiMjJ58mRFRUXJ19dXsbGxSkpKKnPd6dOnq0ePHqpZs6Zq1qyp+Pj4i64
PAADci8NlZP78+Ro7dqwmTJiglJQURUdHq1evXsrIyLjg+itXrtTAgQP1/fffKzExUZGRkerZs6eOHj161eEBAIDzc/humtjYWHXq1EmTJk2SJNntdkVGRuqxxx7Ts88+e8ntbTabatasqUmTJmnw4MGX9Tu5mwYAAOdzuZ/fDl0ZKSwsVHJysuLj43/bgdWq+Ph4JSYmXtY+8vLyVFRUpFq1apW5TkFBgbKzs0u9AACAa3KojJw8eVI2m00hISGlloeEhCgtLe2y9vHMM88oPDy8VKH5o4kTJyooKKjkFRkZ6UhMAADgRCr1bppXXnlF8+bN08KFC+Xr61vmeuPGjVNWVlbJ6/Dhw5WYEgAAVCaH5hkJDg6Wh4eH0tPTSy1PT09XaGjoRbd9/fXX9corr+i7775T27ZtL7quj4+PfHx8HIkGAACclENXRry9vRUTE6OEhISSZXa7XQkJCYqLiytzu3//+9966aWXtGzZMnXs2PHK0wIAAJfj8AysY8eO1ZAhQ9SxY0d17txZb731lnJzczVs2DBJ0uDBgxUREaGJEydKkl599VWNHz9en3zyiaKiokrGllSvXl3Vq1cvx0MBAADOyOEy0r9/f504cULjx49XWlqa2rVrp2XLlpUMak1NTZXV+tsFlylTpqiwsFD33HNPqf1MmDBBL7744tWlBwAATo+n9gIAgApRIfOMAAAAlDeneGrvrxdvmPwMAADn8evn9qW+hHGKMpKTkyNJTH4GAIATysnJUVBQUJk/d4oxI3a7XceOHVNAQIAsFku57Tc7O1uRkZE6fPiwW4xFcafj5VhdlzsdL8fqutzleA3DUE5OjsLDw0vd3PJHTnFlxGq1ql69ehW2/8DAQJf+y/BH7nS8HKvrcqfj5Vhdlzsc78WuiPyKAawAAMBUlBEAAGAqty4jPj4+mjBhgts8B8edjpdjdV3udLwcq+tyt+O9FKcYwAoAAFyXW18ZAQAA5qOMAAAAU1FGAACAqSgjAADAVC5fRiZPnqyoqCj5+voqNjZWSUlJF13/v//9r5o3by5fX1+1adNGS5curaSkV2fixInq1KmTAgICVLduXfXt21e7d+++6DazZ8+WxWIp9fL19a2kxFfuxRdf/FPu5s2bX3QbZz2vUVFRfzpWi8WiMWPGXHB9ZzunP/74o/r06aPw8HBZLBYtWrSo1M8Nw9D48eMVFhYmPz8/xcfHa8+ePZfcr6Pv+8pwsWMtKirSM888ozZt2sjf31/h4eEaPHiwjh07dtF9Xsl7oTJc6rwOHTr0T7lvvvnmS+63Kp5X6dLHe6H3sMVi0WuvvVbmPqvqua0oLl1G5s+fr7Fjx2rChAlKSUlRdHS0evXqpYyMjAuuv3btWg0cOFAjRozQpk2b1LdvX/Xt21fbtm2r5OSO++GHHzRmzBitW7dOK1asUFFRkXr27Knc3NyLbhcYGKjjx4+XvA4dOlRJia9Oq1atSuVevXp1mes683ndsGFDqeNcsWKFJOnee+8tcxtnOqe5ubmKjo7W5MmTL/jzf//733rnnXc0depUrV+/Xv7+/urVq5fy8/PL3Kej7/vKcrFjzcvLU0pKil544QWlpKToiy++0O7du3X77bdfcr+OvBcqy6XOqyTdfPPNpXJ/+umnF91nVT2v0qWP9/fHefz4cc2cOVMWi0V33333RfdbFc9thTFcWOfOnY0xY8aU/Nlmsxnh4eHGxIkTL7h+v379jFtvvbXUstjYWOOhhx6q0JwVISMjw5Bk/PDDD2WuM2vWLCMoKKjyQpWTCRMmGNHR0Ze9viud1yeeeMJo3LixYbfbL/hzZz2nhmEYkoyFCxeW/NlutxuhoaHGa6+9VrIsMzPT8PHxMT799NMy9+Po+94MfzzWC0lKSjIkGYcOHSpzHUffC2a40LEOGTLEuOOOOxzajzOcV8O4vHN7xx13GDfeeONF13GGc1ueXPbKSGFhoZKTkxUfH1+yzGq1Kj4+XomJiRfcJjE
xsdT6ktSrV68y16/KsrKyJEm1atW66Hpnz55VgwYNFBkZqTvuuEPbt2+vjHhXbc+ePQoPD1ejRo10//33KzU1tcx1XeW8FhYWau7cuRo+fPhFHxjprOf0jw4cOKC0tLRS5y4oKEixsbFlnrsred9XVVlZWbJYLKpRo8ZF13PkvVCVrFy5UnXr1lWzZs00evRonTp1qsx1Xem8pqena8mSJRoxYsQl13XWc3slXLaMnDx5UjabTSEhIaWWh4SEKC0t7YLbpKWlObR+VWW32/Xkk0+qW7duat26dZnrNWvWTDNnztSXX36puXPnym63q2vXrjpy5EglpnVcbGysZs+erWXLlmnKlCk6cOCAevTooZycnAuu7yrnddGiRcrMzNTQoUPLXMdZz+mF/Hp+HDl3V/K+r4ry8/P1zDPPaODAgRd9iJqj74Wq4uabb9aHH36ohIQEvfrqq/rhhx/Uu3dv2Wy2C67vKudVkubMmaOAgADdddddF13PWc/tlXKKp/bCMWPGjNG2bdsu+f1iXFyc4uLiSv7ctWtXtWjRQtOmTdNLL71U0TGvWO/evUv+uW3btoqNjVWDBg20YMGCy/q/DWc1Y8YM9e7dW+Hh4WWu46znFL8pKipSv379ZBiGpkyZctF1nfW9MGDAgJJ/btOmjdq2bavGjRtr5cqVuummm0xMVvFmzpyp+++//5IDy5313F4pl70yEhwcLA8PD6Wnp5danp6ertDQ0AtuExoa6tD6VdGjjz6qr7/+Wt9//73q1avn0LZeXl5q37699u7dW0HpKkaNGjXUtGnTMnO7wnk9dOiQvvvuOz344IMObees51RSyflx5Nxdyfu+Kvm1iBw6dEgrVqxw+NHyl3ovVFWNGjVScHBwmbmd/bz+atWqVdq9e7fD72PJec/t5XLZMuLt7a2YmBglJCSULLPb7UpISCj1f46/FxcXV2p9SVqxYkWZ61clhmHo0Ucf1cKFC/W///1PDRs2dHgfNptNW7duVVhYWAUkrDhnz57Vvn37ysztzOf1V7NmzVLdunV16623OrSds55TSWrYsKFCQ0NLnbvs7GytX7++zHN3Je/7quLXIrJnzx599913ql27tsP7uNR7oao6cuSITp06VWZuZz6vvzdjxgzFxMQoOjra4W2d9dxeNrNH0FakefPmGT4+Psbs2bONHTt2GKNGjTJq1KhhpKWlGYZhGIMGDTKeffbZkvXXrFljeHp6Gq+//rqxc+dOY8KECYaXl5exdetWsw7hso0ePdoICgoyVq5caRw/frzklZeXV7LOH4/373//u7F8+XJj3759RnJysjFgwADD19fX2L59uxmHcNn++te/GitXrjQOHDhgrFmzxoiPjzeCg4ONjIwMwzBc67waxvm7BurXr28888wzf/qZs5/TnJwcY9OmTcamTZsMScabb75pbNq0qeQOkldeecWoUaOG8eWXXxo//fSTcccddxgNGzY0zp07V7KPG2+80Xj33XdL/nyp971ZLnashYWFxu23327Uq1fP2Lx5c6n3cEFBQck+/nisl3ovmOVix5qTk2M8/fTTRmJionHgwAHju+++Mzp06GBcc801Rn5+fsk+nOW8Gsal/x4bhmFkZWUZ1apVM6ZMmXLBfTjLua0oLl1GDMMw3n33XaN+/fqGt7e30blzZ2PdunUlP7vuuuuMIUOGlFp/wYIFRtOmTQ1vb2+jVatWxpIlSyo58ZWRdMHXrFmzStb54/E++eSTJf9uQkJCjFtuucVISUmp/PAO6t+/vxEWFmZ4e3sbERERRv/+/Y29e/eW/NyVzqthGMby5csNScbu3bv/9DNnP6fff//9Bf/e/npMdrvdeOGFF4yQkBDDx8fHuOmmm/7076FBgwbGhAkTSi272PveLBc71gMHDpT5Hv7+++9L9vHHY73Ue8EsFzvWvLw8o2fPnkadOnUMLy8vo0GDBsbIkSP/VCqc5bwaxqX/HhuGYUybNs3w8/MzMjMzL7g
PZzm3FcViGIZRoZdeAAAALsJlx4wAAADnQBkBAACmoowAAABTUUYAAICpKCMAAMBUlBEAAGAqyggAADAVZQQAAJiKMgIAAExFGQEAAKaijAAAAFNRRgAAgKn+P5chIWWn8BkHAAAAAElFTkSuQmCC\n", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "class CustomCallback(keras.callbacks.Callback):\n", + " def on_epoch_end(self, epoch, logs=None):\n", + " model_name = f\"/content/drive/MyDrive/{lora_name}_{lora_rank}_epoch{epoch+1}.lora.h5\"\n", + " gemma_lm.backbone.save_lora_weights(model_name)\n", + "\n", + " # Evaluate\n", + " text_gen(\"다음에 대한 이메일 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")\n", + "\n", + "history = gemma_lm.fit(train, epochs=train_epoch, batch_size=2, callbacks=[CustomCallback()])\n", + "\n", + "import matplotlib.pyplot as plt\n", + "plt.plot(history.history['loss'])\n", + "plt.show()" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "id": "gn5-eFiPUkSP" + }, + "outputs": [], + "source": [ + "# Example Code for Load LoRA\n", + "'''\n", + "train_epoch=17\n", + "gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset(model_id)\n", + "# Use the same LoRA rank that you trained\n", + "gemma_lm.backbone.enable_lora(rank=4)\n", + "\n", + "# Load pre-trained LoRA weights\n", + "gemma_lm.backbone.load_lora_weights(f\"/content/drive/MyDrive/{lora_name}_{lora_rank}_epoch{train_epoch}.lora.h5\")\n", + "'''" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "ipg1u_wEKTxG" + }, + "source": [ + "## Try a different sampler\n", + "\n", + "The top-K algorithm randomly picks the next token from the tokens of top K probability." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "id": "nV5mD_HqKZRF" + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "고객님, 안녕하세요.\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다! 3호 케이크는 다양한 종류의 맛과 장식이 가능합니다. 
원하시는 맛과 장식을 알려주시면 맞춤으로 디자인해 드리겠습니다.\n", + "\n", + "주문 날짜, 시간, 필요한 픽업 날짜 등 추가적인 문의를 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 5.88s\n", + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요.,\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다! 케이크 맛과 디자인에 대해 문의해주시면 맞춤 제작을 도와드리겠습니다. 혹시 특별히 원하시는 맛이나 디자인이 있으신가요?\n", + "\n", + "감사합니다.\n", + "\n", + "[가장점 케이크 문의 담당자 이름] 드림\n", + "TOTAL TIME ELAPSED: 4.75s\n", + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요,\n", + "\n", + "결혼기념일 3호 케이크 주문 가능합니다! 3호 케이크 1개를 주문하시려면, 픽업 날짜와 시간, 원하신 디자인 (혹은 사진)을 말씀해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 3.98s\n" + ] + } + ], + "source": [ + "gemma_lm.compile(sampler=\"top_k\")\n", + "text_gen(\"다음에 대한 이메일 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")\n", + "text_gen(\"다음에 대한 이메일 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")\n", + "text_gen(\"다음에 대한 이메일 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "3m1XaCrlMu3Y" + }, + "source": [ + "Try a slight different prompts" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "id": "qC-MLxYWM1HU" + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 답장을 작성해줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + " 안녕하세요. 결혼기념일 3호 케이크 1개 주문 가능합니다. 원하시는 디자인이나 특별사항이 있으시면 말씀해주세요. 감사합니다. 😊\n", + "TOTAL TIME ELAPSED: 2.69s\n", + "\n", + "Gemma output:\n", + "user\n", + "아래에 적절한 답장을 써줘.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요. 결혼기념일을 위한 3호 케이크 주문 가능합니다! 3호 케이크는 [가격]이며, 디자인은 다양하게 선택 가능합니다. 원하신 디자인과 함께 주문하시면 감사하겠습니다. 
궁금한 점 있으시면 언제든지 문의해주세요. 😊 \n", + "\n", + "TOTAL TIME ELAPSED: 4.41s\n", + "\n", + "Gemma output:\n", + "user\n", + "다음에 관한 답장을 써주세요.\n", + "\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\"\n", + "model\n", + "안녕하세요.,\n", + "\n", + "3호 케이크 1개 주문 가능합니다. 결혼기념일을 위한 특별한 케이크로 궁금하신 건, 정말 축하드립니다! \n", + "\n", + "주문하실 때 필요한 정보를 알려주시면 더욱 맞춤화해 드리려고 합니다. 예시로, 케이크 디자인, 크기, 문구, 픽업 날짜/시간 등을 말씀해주시면 됩니다.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 6.30s\n" + ] + } + ], + "source": [ + "text_gen(\"다음에 대한 답장을 작성해줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")\n", + "text_gen(\"아래에 적절한 답장을 써줘.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")\n", + "text_gen(\"다음에 관한 답장을 써주세요.\\n\\\"안녕하세요, 결혼기념일을 위해 3호 케이크 1개를 주문하고 싶은데 가능할까요?\\\"\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "UePc572JSUmd" + }, + "source": [ + "Try a different email inputs" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "id": "8n5LkXU8Sn6D" + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Gemma output:\n", + "user\n", + "다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요,\n", + "\n", + "6월 15일에 있을 행사 답례품으로 쿠키 & 머핀 세트를 대량 주문하고 싶습니다.\n", + "\n", + "수량: 50세트\n", + "구성: 쿠키 2개 + 머핀 1개 (개별 포장)\n", + "디자인: 심플하고 고급스러운 디자인 (리본 포장 등)\n", + "문구: \"감사합니다\" 스티커 부착\n", + "배송 날짜: 6월 14일\n", + "대량 주문 할인 혜택이 있는지, 있다면 견적과 함께 배송 가능 여부를 알려주시면 감사하겠습니다.\n", + "\n", + "감사합니다.\n", + "\n", + "박철수 드림\" \n", + "model\n", + "박철수 님, 안녕하세요.\n", + "\n", + "6월 15일 행사 답례품 주문 문의 감사합니다.\n", + "\n", + "- 50 세트 대량 주문 가능합니다.\n", + "- 쿠키 2개 + 머핀 1개 (개별 포장) 디자인, \"감사합니다\" 스티커 부착\n", + "- 6월 14일 배송 가능합니다.\n", + "- 대량 주문 시 10% 할인 혜택 제공됩니다.\n", + "\n", + "견적 및 배송 가능 여부를 원하시면 곧 전화해주세요.\n", + "\n", + "감사합니다.\n", + "\n", + "[가게 이름] 드림\n", + "TOTAL TIME ELAPSED: 7.54s\n" + ] + } + ], + "source": [ + "text_gen(\"\"\"다음에 대한 이메일 답장을 작성해줘.\n", + "\"안녕하세요,\n", + "\n", + "6월 15일에 있을 행사 답례품으로 쿠키 & 머핀 세트를 대량 주문하고 싶습니다.\n", + "\n", + "수량: 
50세트\n", + "구성: 쿠키 2개 + 머핀 1개 (개별 포장)\n", + "디자인: 심플하고 고급스러운 디자인 (리본 포장 등)\n", + "문구: \"감사합니다\" 스티커 부착\n", + "배송 날짜: 6월 14일\n", + "대량 주문 할인 혜택이 있는지, 있다면 견적과 함께 배송 가능 여부를 알려주시면 감사하겠습니다.\n", + "\n", + "감사합니다.\n", + "\n", + "박철수 드림\" \"\"\")\n" + ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "name": "spoken_language_tasks_with_gemma.ipynb", + "toc_visible": true + }, + "kernelspec": { + "display_name": "Python 3", + "name": "python3" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/k-gemma-it/tune_model.sh b/k-gemma-it/tune_model.sh new file mode 100755 index 0000000..fe4f8a2 --- /dev/null +++ b/k-gemma-it/tune_model.sh @@ -0,0 +1,9 @@ +#!/bin/bash + +# activate virtual environment +source ../venv/bin/activate + +# delete any previously generated weights +rm -f weights/*.h5 + +python3 main.py \ No newline at end of file diff --git a/k-gemma-it/weights/tuning-output-readme.txt b/k-gemma-it/weights/tuning-output-readme.txt new file mode 100644 index 0000000..1769704 --- /dev/null +++ b/k-gemma-it/weights/tuning-output-readme.txt @@ -0,0 +1 @@ +New weights generated from tuning are written here as *.h5 files. \ No newline at end of file diff --git a/k-mail-replier/.gitignore b/k-mail-replier/.gitignore new file mode 100644 index 0000000..62c1b88 --- /dev/null +++ b/k-mail-replier/.gitignore @@ -0,0 +1,10 @@ +.idea/ +.vscode/ +.venv*/ +venv*/ +__pycache__/ +dist/ +.coverage* +htmlcov/ +.tox/ +docs/_build/ diff --git a/k-mail-replier/README.md b/k-mail-replier/README.md new file mode 100644 index 0000000..af3cc34 --- /dev/null +++ b/k-mail-replier/README.md @@ -0,0 +1,105 @@ +# Spoken Language Tasks Assistant with Gemma + +This tutorial walks you through setting up, running, and extending a spoken +language task application built with Gemma and Python. The application provides +a basic web user interface that you can modify to fit your needs. 
The application
+is built to generate replies to customer emails for a fictitious Korean bakery,
+and all the language input and output is handled entirely in Korean. You can use
+this application pattern with any language and any business task that uses text
+input and text output.
+
+## Project setup
+
+These instructions walk you through getting this project set up for
+development and testing. The general steps are installing some prerequisite
+software, cloning the project from the code repository, setting a few environment
+variables, and running the configuration installation.
+
+### Install the prerequisites
+
+This project uses Python 3 and a Python virtual environment (`venv`) to manage
+packages and run the application. The following installation instructions are for a Linux
+host machine.
+
+To install the required software:
+
+* Install Python 3 and the `venv` virtual environment package for Python.
+
+<pre>
+sudo apt update
+sudo apt install git pip python3-venv
+</pre>
+
+### Clone and configure the project
+
+Download the project code and use the setup script to download
+the required dependencies and configure the project. You need
+[git](https://git-scm.com/) source control software to retrieve the
+project source code.
+
+To download the project code:
+
+1. Clone the git repository using the following command.
+<pre>
+git clone https://github.com/google-gemini/gemma-cookbook.git
+</pre>
+1. Optionally, configure your local git repository to use sparse checkout,
+   so you have only the files for the project.
+<pre>
+cd gemma-cookbook/
+git sparse-checkout init --cone
+git sparse-checkout set Gemma/spoken-language-tasks/
+</pre>
+
+To install the Python libraries:
+
+1. Configure and activate a Python virtual environment (`venv`) for this project:
+<pre>
+python3 -m venv venv
+source venv/bin/activate
+</pre>
+1. Install the required Python libraries for this project using the `setup_python.sh` script.
+<pre>
+./setup_python.sh
+</pre>
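After the script finishes, you can confirm that the key libraries resolve from inside the virtual environment. The package list below is an assumption based on the application code later in this change (Flask, python-dotenv, KerasNLP); adjust it to match your setup:

```python
import importlib.util

def missing_packages(names):
    """Return the import names from `names` that cannot be found in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Import names the mail replier uses (assumed from app.py and models/gemma.py).
required = ["flask", "dotenv", "keras_nlp"]
print(missing_packages(required))
```

An empty list means the dependencies installed correctly; any name printed here needs to be installed before the application will start.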
+
+### Set environment variables
+
+Set a few environment variables that are required to allow this code
+project to run, including a Kaggle username and a Kaggle token key.
+You must have a Kaggle account and request access to the Gemma model.
+
+You add your Kaggle username and Kaggle token key to two `.env` files,
+which are read by the web application and the tuning program, respectively.
+
+Caution: Treat your Kaggle token key like a password and protect it appropriately.
+Don't embed your key in publicly published code.
+
+To set the environment variables:
+
+1. Obtain your Kaggle username and your token key by following the instructions
+   in the [Kaggle documentation](https://www.kaggle.com/docs/api#authentication).
+1. Get access to the Gemma model by following the *Get access to Gemma*
+   instructions on the [Gemma Setup](/gemma/docs/setup#get-access) page.
+1. Create environment variable files for the project by creating a
+   `.env` text file at *each* of these locations in your clone of the project:
+<pre>
+k-mail-replier/k_mail_replier/.env
+k-gemma-it/.env
+</pre>
+1. After creating the `.env` text files, add the following settings to **both** files:
+<pre>
+KAGGLE_USERNAME=&lt;YOUR_KAGGLE_USERNAME_HERE&gt;
+KAGGLE_KEY=&lt;YOUR_KAGGLE_KEY_HERE&gt;
+</pre>
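Both programs load these files with the `python-dotenv` package (`load_dotenv()`). A `.env` file is plain `NAME=VALUE` text; the minimal parser below only illustrates the format and is not the library's actual implementation:

```python
def parse_env_file(path):
    """Parse NAME=VALUE pairs from a .env-style file, ignoring blanks and comments."""
    values = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            name, _, value = line.partition("=")
            values[name.strip()] = value.strip()
    return values
```

With the settings above, parsing either `.env` file would yield a dict containing your `KAGGLE_USERNAME` and `KAGGLE_KEY` values.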
+
+### Run and test the application
+
+1. In a terminal window, navigate to the `spoken-language-tasks/k-mail-replier/`
+   directory.
+<pre>
+cd spoken-language-tasks/k-mail-replier/
+</pre>
+1. Run the application using the `run_flask_app.sh` script:
+<pre>
+./run_flask_app.sh
+</pre>
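Each request follows a simple flow, visible in `app.py` later in this change: a fixed Korean instruction (asking for a polite reply that thanks the sender) is prepended to the customer's email, and the combined string is sent to the model. In sketch form:

```python
# Fixed instruction prefix, taken verbatim from get_prompt() in app.py.
INSTRUCTION = ("발신자에게 요청에 대한 감사를 전하고, "
               "곧 자세한 내용을 알려드리겠다고 정중하게 답장해 주세요:\n")

def build_prompt(customer_email: str) -> str:
    """Combine the fixed instruction with the customer's email text."""
    return INSTRUCTION + customer_email
```

The resulting string is what the Gemma message processor receives; keeping the instruction separate from the email makes it easy to swap in a different task or language.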
diff --git a/k-mail-replier/k_mail_replier/__init__.py b/k-mail-replier/k_mail_replier/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/k-mail-replier/k_mail_replier/app.py b/k-mail-replier/k_mail_replier/app.py
new file mode 100644
index 0000000..e29b563
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/app.py
@@ -0,0 +1,58 @@
+#
+# Copyright 2024 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+from flask import Flask, render_template, request
+#from k_mail_replier.models.gemini import create_message_processor
+from k_mail_replier.models.gemma import create_message_processor
+
+app = Flask(__name__, static_url_path='/static', static_folder='static')
+customer_request = None
+model_processor = create_message_processor() # initialize model
+
+@app.route('/', methods=['GET', 'POST'])
+def index():
+    """Set up the web interface and handle POST input."""
+    global customer_request
+    global model_processor
+    # First run behavior: load a test email
+    if customer_request is None:
+        customer_request = get_test_email()
+        return render_template('index.html', request=customer_request)
+
+    # Process email data
+    if request.method == 'POST':
+        prompt = get_prompt()
+        customer_request = request.form['request']
+        prompt += customer_request
+        result = model_processor(prompt)
+        # re-render page with data:
+        return render_template('index.html', request=customer_request, result=result)
+
+    return render_template('index.html')
+
+def get_prompt():
+    """Write a polite reply to this email that thanks the sender for the request and says that we will reply with more detail soon:"""
+    return "발신자에게 요청에 대한 감사를 전하고, 곧 자세한 내용을 알려드리겠다고 정중하게 답장해 주세요:\n"
+
+def get_test_email():
+    try:
+        with open('data/email-001-ko.txt', 'r') as file:
+            email_content = file.read()
+    except FileNotFoundError:
+        email_content = "Error: File not found!"
+    return email_content
+
+if __name__ == '__main__':
+    app.run(host="0.0.0.0", debug=True)
\ No newline at end of file
diff --git a/k-mail-replier/k_mail_replier/data/email-001-ko.txt b/k-mail-replier/k_mail_replier/data/email-001-ko.txt
new file mode 100644
index 0000000..0fae2a3
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/data/email-001-ko.txt
@@ -0,0 +1,10 @@
+발신자: birthday-parent-3458@pretend.mail
+제목: 생일 파티 케이크 요청
+
+안녕하세요!
+
+다음 주에 6살이 되는 아들을 위해 케이크를 주문하고 싶습니다. 그는 딸기 필링과 레이싱카가 얹힌 초콜릿 케이크를 좋아할 겁니다. 우리는 총 8명의 아이들과 파티를 열 예정입니다.
+
+케이크 크기를 추천해 주시고 예상 비용을 알려주세요. 감사합니다!
+
+-- 생일 부모
\ No newline at end of file
diff --git a/k-mail-replier/k_mail_replier/data/email-001.txt b/k-mail-replier/k_mail_replier/data/email-001.txt
new file mode 100644
index 0000000..f72fb38
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/data/email-001.txt
@@ -0,0 +1,10 @@
+From: birthday-parent-3458@pretend.mail
+Subject: Birthday Party Cake Request
+
+Hello!
+
+I would like to order a cake for my son who is turning 6 next week. He would love chocolate cake with strawberry filling and a race car on top. We will be having a party with 8 kids total.
+
+Please recommend a cake size and provide an estimated cost. Thank you!
+
+-- Birthday Parent
\ No newline at end of file
diff --git a/k-mail-replier/k_mail_replier/models/__init__.py b/k-mail-replier/k_mail_replier/models/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/k-mail-replier/k_mail_replier/models/gemini.py b/k-mail-replier/k_mail_replier/models/gemini.py
new file mode 100644
index 0000000..8b007b7
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/models/gemini.py
@@ -0,0 +1,38 @@
+#
+# Copyright 2024 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import google.generativeai as genai
+from dotenv import load_dotenv
+import os
+
+def initialize_model():
+    """Loads environment variables and configures the GenAI client."""
+    load_dotenv()
+    api_key = os.getenv('API_KEY')
+    if not api_key:
+        raise ValueError("API_KEY environment variable not found. Did you set it in your .env file?")
+    genai.configure(api_key=api_key)
+    return genai.GenerativeModel('gemini-1.5-flash') # Return the initialized model
+
+def create_message_processor():
+    """Creates a message processor function with a persistent model."""
+    model = initialize_model()
+
+    def process_message(message):
+        """Processes a message using the GenAI model."""
+        response = model.generate_content(message)
+        print(response.text) # REMOVE: FOR TESTING ONLY
+        return response.text
+    return process_message
\ No newline at end of file
diff --git a/k-mail-replier/k_mail_replier/models/gemma.py b/k-mail-replier/k_mail_replier/models/gemma.py
new file mode 100644
index 0000000..9a2d2d1
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/models/gemma.py
@@ -0,0 +1,86 @@
+#
+# Copyright 2024 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import os
+import re
+from dotenv import load_dotenv
+
+# Set the backend before importing Keras
+os.environ["KERAS_BACKEND"] = "jax"
+# Avoid memory fragmentation on JAX backend.
+os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"] = "1.00"
+
+import keras_nlp
+
+def initialize_model():
+    """Loads environment variables and configures the Gemma model."""
+    load_dotenv()
+    # load Kaggle account info for downloading Gemma
+    kaggle_username = os.getenv('KAGGLE_USERNAME')
+    if not kaggle_username:
+        raise ValueError("KAGGLE_USERNAME environment variable not found. Did you set it in your .env file?")
+    kaggle_key = os.getenv('KAGGLE_KEY')
+    if not kaggle_key:
+        raise ValueError("KAGGLE_KEY environment variable not found. Did you set it in your .env file?")
+
+    # create instance using Gemma 2 2B instruction tuned model
+    gemma = keras_nlp.models.GemmaCausalLM.from_preset("gemma2_instruct_2b_en")
+    #gemma.summary() # REMOVE: FOR TESTING ONLY
+
+    # load and compile tuned model weights
+    gemma.backbone.enable_lora(rank=4)
+    gemma.backbone.load_lora_weights("./weights/gemma2-2b_k-tuned.lora.h5")
+    #gemma.compile(sampler="top_k")
+    gemma.compile(sampler=keras_nlp.samplers.TopKSampler(k=3, temperature=0.1))
+
+    return gemma # Return the initialized model
+
+def create_message_processor():
+    """Creates a message processor function with a persistent model."""
+    model = initialize_model()
+
+    def process_message(prompt_text):
+        """Processes a message using a local Gemma model."""
+        input = f"user\n{prompt_text}\nmodel\n"
+        response = model.generate(input, max_length=512)
+        # remove response tags
+        response = extract_substring(response)
+
+        print(response) # REMOVE: FOR TESTING ONLY
+        return response
+
+    return process_message
+
+def extract_substring(text):
+    """
+    Extracts the model reply that follows the last "\nmodel\n" marker.
+
+    Args:
+        text: The input text.
+
+    Returns:
+        The extracted substring, or the original text if no match is found.
+ """ + match = re.search(r".*\nmodel\n(.*)", text, re.DOTALL) + if match: + return match.group(1).strip() + else: + return text + +# default method +if __name__ == "__main__": + process_message = create_message_processor() + process_message("roses are red") + #print(extract_substring("user\nTHE PROMPT\nmodel\nTHE TEXT RESPONSE")) \ No newline at end of file diff --git a/k-mail-replier/k_mail_replier/static/css/style.css b/k-mail-replier/k_mail_replier/static/css/style.css new file mode 100644 index 0000000..ff5a7bf --- /dev/null +++ b/k-mail-replier/k_mail_replier/static/css/style.css @@ -0,0 +1,203 @@ +:root { + --gradient-top: #fda64f; + --gradient-bottom: #fc9d3d; + --colorized-text: #e87500; + --black: #070600; + --white: #f7f7ff; + --button-bg: var(--colorized-text); + --button-bg-hover: #b75c01; + --button-text: white; + --border-radius: 1rem; + --elevation-1: 0px 1px 1px 0px rgba(0, 0, 0, 0.14), + 0px 2px 1px -1px rgba(0, 0, 0, 0.12), + 0px 1px 3px 0px rgba(0, 0, 0, 0.2); + --margin-left: 5rem; + --display-font: "Poppins", Roboto, sans-serif; + --text-font: Roboto, sans-serif; +} + +body { + background: var(--white); + color: var(--black); + font-family: var(--text-font); + margin: 0; +} + +.container { + display: grid; + grid-template-areas: "left-col main-col"; + grid-template-columns: 15rem auto; + height: 100vh; + width: 99vw; + overflow: hidden; +} + +.left-column { + background: linear-gradient(180deg, var(--gradient-top), var(--gradient-bottom)); + border-radius: 0 var(--border-radius) var(--border-radius) 0; + grid-area: left-col; + padding: 1.5rem; +} + +.right-column { + display: grid; + grid-area: main-col; + grid-template-areas: "topbar" + "form" + "output"; + grid-template-rows: 4rem .5fr 1fr; +} + +.logo { + position: relative; + text-align: center; + top: 3rem; +} + +.topbar { + padding: 1rem; + text-align: right; + color: var(--colorized-text); +} + +.settings { + content: 'settings'; +} + +.material-symbols-outlined { + cursor: pointer; + 
+    font-variation-settings: 'FILL' 1,
+        'wght' 400,
+        'GRAD' 0,
+        'opsz' 24;
+    font-size: 2rem !important;
+    user-select: none;
+}
+
+h1 {
+    color: var(--white);
+    font-family: var(--display-font);
+    font-size: 4rem;
+    margin: auto;
+    width: 100%;
+}
+
+h2 {
+    color: var(--colorized-text);
+    font-family: var(--display-font);
+}
+
+.form-container {
+    display: grid;
+    grid-area: form;
+    grid-template-areas: "formbox import";
+    grid-template-columns: min-content;
+    justify-content: left;
+    margin-left: var(--margin-left);
+}
+
+.import-button-group {
+    grid-area: import;
+}
+
+form {
+    display: inline-grid;
+    grid-area: formbox;
+    grid-template-rows: min-content;
+    width: fit-content;
+}
+
+textarea {
+    width: 45vw;
+    height: 20vh;
+    padding: 1rem;
+    font-family: var(--text-font);
+    font-size: 1rem;
+    border: 1px solid #efd6d6;
+    border-radius: 1rem;
+    resize: none;
+    box-shadow: var(--elevation-1);
+}
+
+textarea:focus {
+    outline-color: var(--gradient-bottom);
+}
+
+button {
+    padding: .7rem 2.5rem;
+    background-color: var(--button-bg);
+    color: var(--button-text);
+    border: none;
+    border-radius: 5px;
+    cursor: pointer;
+    height: fit-content;
+    width: fit-content;
+    margin: 1rem auto;
+    font-family: var(--display-font);
+    font-weight: 600;
+    transition: background .3s;
+}
+
+button:hover,
+button:focus {
+    background: var(--button-bg-hover);
+}
+
+.import,
+.copy-to-clipboard {
+    margin-left: 1rem;
+    padding: .5rem;
+}
+
+.button-group {
+    display: inline-flex;
+}
+
+.tooltip {
+    background: var(--black);
+    border-radius: .5em;
+    color: var(--white);
+    filter: opacity(.75);
+    font-family: var(--display-font);
+    font-size: .9rem;
+    font-weight: 500;
+    position: relative;
+    top: 5rem;
+    left: -3rem;
+    height: fit-content;
+    opacity: 0;
+    padding: .5em 1em;
+    transition: opacity .2s;
+    visibility: hidden;
+}
+
+button:hover + .tooltip,
+button:focus + .tooltip {
+    opacity: 1;
+    visibility: visible;
+}
+
+.output {
+    grid-area: output;
+    margin-left: var(--margin-left);
+}
+
+.form-container,
+.export {
+    grid-template-columns: min-content;
+    justify-content: left;
+}
+
+.export {
+    display: grid;
+    grid-template-areas: "output copybutton";
+    width: fit-content;
+}
+
+.copy-button-group {
+    grid-area: copybutton;
+}
+
+.json-output {
+    grid-area: output;
+}
diff --git a/k-mail-replier/k_mail_replier/static/js/script.js b/k-mail-replier/k_mail_replier/static/js/script.js
new file mode 100644
index 0000000..a2f05fd
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/static/js/script.js
@@ -0,0 +1,68 @@
+/**
+ * @fileoverview Functions used in events attached in templates/index.html.
+ */
+
+/**
+ * Opens a file select dialog to import the file text into a text area.
+ *
+ * @param {Element} inputTextArea - Text area to import the file text into.
+ */
+function openFileSelectDialog(inputTextArea) {
+  const fileInput = document.createElement('input');
+  fileInput.type = 'file';
+  fileInput.accept = '.txt';
+
+  fileInput.addEventListener('change', function() {
+    importTextFile(fileInput, inputTextArea);
+  });
+
+  fileInput.click();
+}
+
+/**
+ * Reads an input text file and imports the content into a text area.
+ *
+ * @param input - File to be imported
+ * @param {Element} inputTextArea - Text area to import the file text into
+ */
+function importTextFile(input, inputTextArea) {
+  const file = input.files[0];
+
+  if (file) {
+    const reader = new FileReader();
+
+    reader.onload = function(e) {
+      inputTextArea.value = e.target.result;
+    }
+
+    reader.onerror = function(e) {
+      console.error("Error reading file:", e);
+    }
+
+    reader.readAsText(file);
+  } else {
+    console.warn("Please select a text file.");
+  }
+}
+
+/**
+ * Copy the text of the output text area to the user's clipboard.
+ *
+ * @param {Element} copyTextArea - Text area that has the content to be copied
+ * @param {Element} copyTooltip - Tooltip associated with the copy button
+ */
+function copyToClipboard(copyTextArea, copyTooltip) {
+  copyTextArea.select();
+  document.execCommand('copy');
+  copyTooltip.textContent = 'Copied!';
+}
+
+/**
+ * Resets the copy to clipboard tooltip text when the user moves the cursor
+ * away from the clipboard button.
+ *
+ * @param {Element} copyTooltip - Tooltip associated with the copy button
+ */
+function copyTooltipReset(copyTooltip) {
+  copyTooltip.textContent = 'Copy to clipboard';
+}
diff --git a/k-mail-replier/k_mail_replier/templates/index.html b/k-mail-replier/k_mail_replier/templates/index.html
new file mode 100644
index 0000000..e9f66b3
--- /dev/null
+++ b/k-mail-replier/k_mail_replier/templates/index.html
@@ -0,0 +1,82 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>k-mail - 사업 문의 with Google AI</title>
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/style.css') }}">
+    <link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Material+Symbols+Outlined">
+    <link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Poppins:wght@600&display=swap">
+    <script src="{{ url_for('static', filename='js/script.js') }}" defer></script>
+</head>
+<body>
+    <div class="container">
+        <div class="left-column">
+            <div class="logo">
+                <h1>k-mail</h1>
+            </div>
+        </div>
+        <div class="right-column">
+            <div class="topbar">
+                <span class="material-symbols-outlined">settings</span>
+                <span class="material-symbols-outlined">account_circle</span>
+            </div>
+            <div class="form-container">
+                <form method="post">
+                    <textarea id="message" name="message"></textarea>
+                    <input type="submit" value="답변 작성">
+                </form>
+                <div class="import-button-group button-group">
+                    <button type="button" class="import"
+                            onclick="openFileSelectDialog(document.getElementById('message'))"></button>
+                    <span class="tooltip">Import text file</span>
+                </div>
+            </div>
+            <div class="output">
+                {% if result %}
+                <div class="export">
+                    <div class="json-output">
+                        <h2>생성된 답변</h2>
+                        <textarea id="result" readonly>{{ result }}</textarea>
+                    </div>
+                    <div class="copy-button-group button-group">
+                        <button type="button" class="copy-to-clipboard"
+                                onclick="copyToClipboard(document.getElementById('result'), document.getElementById('copy-tooltip'))"
+                                onmouseout="copyTooltipReset(document.getElementById('copy-tooltip'))"></button>
+                        <span id="copy-tooltip" class="tooltip">Copy to clipboard</span>
+                    </div>
+                </div>
+                {% endif %}
+            </div>
+        </div>
+    </div>
+</body>
+</html>
+
+
+
+
+
diff --git a/k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5 b/k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5
new file mode 100644
index 0000000..0886f12
Binary files /dev/null and b/k-mail-replier/k_mail_replier/weights/gemma2-2b_k-tuned.lora.h5 differ
diff --git a/k-mail-replier/run_flask_app.sh b/k-mail-replier/run_flask_app.sh
new file mode 100755
index 0000000..fbd093d
--- /dev/null
+++ b/k-mail-replier/run_flask_app.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+
+# run the Flask web application
+source ../venv/bin/activate
+
+# run from /k_mail_replier directory
+cd k_mail_replier/
+flask run
\ No newline at end of file
diff --git a/k-mail-replier/tests/__init__.py b/k-mail-replier/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/k-mail-replier/tests/gemma_generate.sh b/k-mail-replier/tests/gemma_generate.sh
new file mode 100755
index 0000000..593269a
--- /dev/null
+++ b/k-mail-replier/tests/gemma_generate.sh
@@ -0,0 +1,8 @@
+#!/bin/bash
+
+# command line test of Gemma model generate process
+# run from /test directory
+source ../../venv/bin/activate
+
+cd ../k_mail_replier/
+python3 models/gemma.py
\ No newline at end of file
diff --git a/requirements.txt b/requirements.txt
new file mode 100644
index 0000000..f639059
--- /dev/null
+++ b/requirements.txt
@@ -0,0 +1,113 @@
+absl-py
+aiohappyeyeballs
+aiohttp
+aiosignal
+annotated-types
+astunparse
+async-timeout
+attrs
+bleach
+blinker
+cachetools
+certifi
+charset-normalizer
+click
+contourpy
+cycler
+datasets
+dill
+filelock
+flask
+flatbuffers
+fonttools
+frozenlist
+fsspec
+gast
+google-ai-generativelanguage
+google-api-core
+google-api-python-client
+google-auth
+google-auth-httplib2
+google-generativeai
+google-pasta
+googleapis-common-protos
+grpcio
+grpcio-status
+h5py
+httplib2
+huggingface-hub
+idna
+importlib-metadata
+importlib-resources
+itsdangerous
+jax
+jax-cuda12-pjrt
+jax-cuda12-plugin
+jaxlib
+jinja2
+kagglehub
+keras
+keras-nlp
+kiwisolver
+libclang
+Markdown
+markdown-it-py
+MarkupSafe
+matplotlib
+mdurl
+ml-dtypes
+multidict
+multiprocess
+namex
+numpy
+nvidia-cublas-cu12
+nvidia-cuda-cupti-cu12
+nvidia-cuda-nvcc-cu12
+nvidia-cuda-runtime-cu12
+nvidia-cudnn-cu12
+nvidia-cufft-cu12
+nvidia-cusolver-cu12
+nvidia-cusparse-cu12
+nvidia-nccl-cu12
+nvidia-nvjitlink-cu12
+opt-einsum
+optree
+packaging
+pandas
+pillow
+proto-plus
+protobuf
+pyarrow
+pyasn1
+pyasn1-modules
+pydantic
+pydantic-core
+pygments
+pyparsing
+python-dateutil
+python-dotenv
+pytz
+PyYAML
+regex
+requests
+rich
+rsa
+scipy
+six
+tensorboard
+tensorboard-data-server
+tensorflow
+tensorflow-io-gcs-filesystem
+tensorflow-text
+termcolor
+tqdm
+typing-extensions
+tzdata
+uritemplate
+urllib3
+webencodings
+werkzeug
+wrapt
+xxhash
+yarl
+zipp
diff --git a/requirements.txt.bak b/requirements.txt.bak
new file mode 100644
index 0000000..3644e9a
--- /dev/null
+++ b/requirements.txt.bak
@@ -0,0 +1,113 @@
+absl-py==2.1.0
+aiohappyeyeballs==2.4.0
+aiohttp==3.10.5
+aiosignal==1.3.1
+annotated-types==0.7.0
+astunparse==1.6.3
+async-timeout==4.0.3
+attrs==24.2.0
+bleach==6.1.0
+blinker==1.8.2
+cachetools==5.5.0
+certifi==2024.8.30
+charset-normalizer==3.3.2
+click==8.1.7
+contourpy==1.3.0
+cycler==0.12.1
+datasets==2.21.0
+dill==0.3.8
+filelock==3.15.4
+flask==3.0.3
+flatbuffers==24.3.25
+fonttools==4.53.1
+frozenlist==1.4.1
+fsspec==2024.6.1
+gast==0.6.0
+google-ai-generativelanguage==0.6.6
+google-api-core==2.19.2
+google-api-python-client==2.143.0
+google-auth==2.34.0
+google-auth-httplib2==0.2.0
+google-generativeai==0.7.2
+google-pasta==0.2.0
+googleapis-common-protos==1.65.0
+grpcio==1.66.1
+grpcio-status==1.62.3
+h5py==3.11.0
+httplib2==0.22.0
+huggingface-hub==0.24.6
+idna==3.8
+importlib-metadata==8.4.0
+importlib-resources==6.4.4
+itsdangerous==2.2.0
+jax==0.4.30
+jax-cuda12-pjrt==0.4.30
+jax-cuda12-plugin==0.4.30
+jaxlib==0.4.30
+jinja2==3.1.4
+kagglehub==0.2.9
+keras==3.5.0
+keras-nlp==0.14.4
+kiwisolver==1.4.7
+libclang==18.1.1
+Markdown==3.7
+markdown-it-py==3.0.0
+MarkupSafe==2.1.5
+matplotlib==3.9.2
+mdurl==0.1.2
+ml-dtypes==0.4.0
+multidict==6.0.5
+multiprocess==0.70.16
+namex==0.0.8
+numpy==1.26.4
+nvidia-cublas-cu12==12.6.1.4
+nvidia-cuda-cupti-cu12==12.6.68
+nvidia-cuda-nvcc-cu12==12.6.68
+nvidia-cuda-runtime-cu12==12.6.68
+nvidia-cudnn-cu12==9.3.0.75
+nvidia-cufft-cu12==11.2.6.59
+nvidia-cusolver-cu12==11.6.4.69
+nvidia-cusparse-cu12==12.5.3.3
+nvidia-nccl-cu12==2.22.3
+nvidia-nvjitlink-cu12==12.6.68
+opt-einsum==3.3.0
+optree==0.12.1
+packaging==24.1
+pandas==2.2.2
+pillow==10.4.0
+proto-plus==1.24.0
+protobuf==4.25.4
+pyarrow==17.0.0
+pyasn1==0.6.0
+pyasn1-modules==0.4.0
+pydantic==2.8.2
+pydantic-core==2.20.1
+pygments==2.18.0
+pyparsing==3.1.4
+python-dateutil==2.9.0.post0
+python-dotenv==1.0.1
+pytz==2024.1
+PyYAML==6.0.2
+regex==2024.7.24
+requests==2.32.3
+rich==13.8.0
+rsa==4.9
+scipy==1.13.1
+six==1.16.0
+tensorboard==2.17.1
+tensorboard-data-server==0.7.2
+tensorflow==2.17.0
+tensorflow-io-gcs-filesystem==0.37.1
+tensorflow-text==2.17.0
+termcolor==2.4.0
+tqdm==4.66.5
+typing-extensions==4.12.2
+tzdata==2024.1
+uritemplate==4.1.1
+urllib3==2.2.2
+webencodings==0.5.1
+werkzeug==3.0.4
+wrapt==1.16.0
+xxhash==3.5.0
+yarl==1.9.7
+zipp==3.20.1
diff --git a/setup_python.sh b/setup_python.sh
new file mode 100755
index 0000000..2e6b4eb
--- /dev/null
+++ b/setup_python.sh
@@ -0,0 +1,24 @@
+#!/bin/bash
+
+# activate virtual environment
+source venv/bin/activate
+
+pip install -r requirements.txt
+
+# note: record python installation as follows
+# pip freeze > requirements.txt
+
+# -------------------------
+# key package installations (for manual installation)
+# pip install python-dotenv
+# pip install Flask bleach  # Only needed for web application
+
+# (Optional) for Gemini API testing
+# pip install google-generativeai
+
+# Gemma specific software
+# pip install keras-nlp
+# pip install "jax[cuda12]"  # install jax for CUDA 12 drivers
+
+# pip install datasets  # only needed for tuning
+# pip install matplotlib  # only needed for tuning evaluation
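For reference, the prompt round trip in `models/gemma.py` can be exercised without downloading the model: the user text is wrapped in Gemma's chat-template turn tokens, and the reply is recovered from the raw generation output. This is a minimal sketch; the `format_prompt` helper name is illustrative (not part of the diff), and `generated_text` is a stand-in for what `model.generate()` would return.

```python
import re

def format_prompt(prompt_text):
    """Wraps user text in the Gemma instruction-tuned chat template."""
    return f"<start_of_turn>user\n{prompt_text}<end_of_turn>\n<start_of_turn>model\n"

def extract_substring(text):
    """Returns the text after the model turn marker, or the input unchanged."""
    match = re.search(r".*\n<start_of_turn>model\n(.*)", text, re.DOTALL)
    if match:
        return match.group(1).strip()
    return text

# Simulate a generation result: prompt echo followed by the model's reply.
generated_text = format_prompt("roses are red") + "Violets are blue."
print(extract_substring(generated_text))  # -> Violets are blue.
```

Because `extract_substring` falls back to returning its input, the web app still shows raw output (rather than failing) if the model omits the turn markers.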