From b75c091c38fce49cd8a0afd210252e831da5024e Mon Sep 17 00:00:00 2001 From: Quynh Le Date: Fri, 22 Dec 2023 22:41:57 -0800 Subject: [PATCH] update most of the remaining docs --- README.md | 3 ++- docs/GETTING_STARTED.md | 24 ++++++++++++---------- docs/PROJECT_PHILOSOPHY.md | 4 ++-- docs/dev/howtos.md | 14 ++++++------- docs/integrations/lepton_ai.md | 5 +++-- docs/integrations/vectara.md | 7 ++++--- openssa/integrations/llama_index/README.md | 22 ++++++++++---------- 7 files changed, 42 insertions(+), 37 deletions(-) diff --git a/README.md b/README.md index 5c2dd65c0..3f80b7979 100644 --- a/README.md +++ b/README.md @@ -81,7 +81,8 @@ Our primary audience includes: 4. A committer to OpenSSA -### Getting Started as an End-User +## Getting Started as an End-User +Go straight to [OpenSSA Streamlit app](https://openssa.streamlit.app/) and start building your own SSA with your domain document today! ### Getting Started as a Developer diff --git a/docs/GETTING_STARTED.md b/docs/GETTING_STARTED.md index 36ae31321..79277276f 100644 --- a/docs/GETTING_STARTED.md +++ b/docs/GETTING_STARTED.md @@ -2,25 +2,27 @@ ## Who Are You? -1. An end-user of OpenSSM-based applications +1. An end-user of OpenSSA-based applications -2. A developer of applications or services using OpenSSM +2. A developer of applications or services using OpenSSA -3. An aspiring contributor to OpenSSM +3. An aspiring contributor to OpenSSA -4. A committer to OpenSSM +4. A committer to OpenSSA ## Getting Started as an End-User +Go straight to [OpenSSA Streamlit app](https://openssa.streamlit.app/) and start building your own SSA with your domain document today! + ## Getting Started as a Developer -See some example user programs in the [examples/notebooks](./examples/notebooks) directory. For example, the see the sample use case on ALD semiconductor knowledge, do: +See some example user programs in the [examples/notebooks](./examples/notebooks) directory. 
For example, to see the sample use case on ALD semiconductor knowledge, do: ```bash % cd examples/notebooks ``` -### Common `make` targets for OpenSSM developers +### Common `make` targets for OpenSSA developers See [MAKEFILE](dev/makefile_info.md) for more details. @@ -32,7 +34,7 @@ See [MAKEFILE](dev/makefile_info.md) for more details. % make poetry-init % make poetry-install -% make install # local installation of openssm +% make install # local installation of OpenSSA % make pypi-auth # only for maintainers % make publish # only for maintainers @@ -40,9 +42,9 @@ See [MAKEFILE](dev/makefile_info.md) for more details. ## Getting Started as an Aspiring Contributor -OpenSSM is a community-driven initiative, and we warmly welcome contributions. Whether it's enhancing existing models, creating new SSMs for different industrial domains, or improving our documentation, every contribution counts. See our [Contribution Guide](../CONTRIBUTING.md) for more details. +OpenSSA is a community-driven initiative, and we warmly welcome contributions. Whether it's enhancing existing models, creating new SSMs for different industrial domains, or improving our documentation, every contribution counts. See our [Contribution Guide](../CONTRIBUTING.md) for more details. -You can begin contributing to the OpenSSM project in the `contrib/` directory. +You can begin contributing to the OpenSSA project in the `contrib/` directory. ## Getting Started as a Committer @@ -50,11 +52,11 @@ You already know what to do. ## Community -Join our vibrant community of AI enthusiasts, researchers, developers, and businesses who are democratizing industrial AI through SSMs. Participate in the discussions, share your ideas, or ask for help on our [Community Discussions](https://github.com/aitomatic/openssm/discussions). +Join our vibrant community of AI enthusiasts, researchers, developers, and businesses who are democratizing industrial AI through SSMs. 
Participate in the discussions, share your ideas, or ask for help on our [Community Discussions](https://github.com/aitomatic/OpenSSA/discussions). ## License -OpenSSM is released under the [Apache 2.0 License](./LICENSE.md). +OpenSSA is released under the [Apache 2.0 License](./LICENSE.md). ## Links diff --git a/docs/PROJECT_PHILOSOPHY.md b/docs/PROJECT_PHILOSOPHY.md index 793213f54..feef9c6fd 100644 --- a/docs/PROJECT_PHILOSOPHY.md +++ b/docs/PROJECT_PHILOSOPHY.md @@ -1,6 +1,6 @@ -# OpenSSM Project Philosophy +# OpenSSA Project Philosophy -At OpenSSM, we believe in the democratization of AI. Our goal is to create an ecosystem where anyone, regardless of their resources, can have access to efficient and domain-specific AI solutions. We envision a future where AI is not only accessible but also robust, reliable, and trustworthy. +At OpenSSA, we believe in the democratization of AI. Our goal is to create an ecosystem where anyone, regardless of their resources, can have access to efficient and domain-specific AI solutions. We envision a future where AI is not only accessible but also robust, reliable, and trustworthy. Our project is guided by the following principles: diff --git a/docs/dev/howtos.md b/docs/dev/howtos.md index 59fc6880c..86887332f 100644 --- a/docs/dev/howtos.md +++ b/docs/dev/howtos.md @@ -2,21 +2,21 @@ ## Observability -`OpenSSM` has built-in observability and tracing. +`OpenSSA` has built-in observability and tracing. 
## Logging

-Users of `OpenSSM` can use the `logger` object provided by the `OpenSSM` package:
+Users of `OpenSSA` can use the `logger` object provided by the `OpenSSA` package:

 ```python
-from OpenSSM import logger
+from openssa import logger

 logger.warning("xyz = %s", xyz)
 ```

-If you are an `OpenSSM` contributor, you may use the `openssm` logger:
+If you are an `OpenSSA` contributor, you may use the `openssa` logger:

 ```python
-from openssm import mlogger
+from openssa import mlogger

 mlogger.warning("xyz = %s", xyz)
 ```

@@ -25,7 +25,7 @@ mlogger.warning("xyz = %s", xyz)

 There are some useful decorators for automatically logging function entry and exit.

 ```python
-from openssm import Logs
+from openssa import Logs

 @Logs.do_log_entry_and_exit()  # upon both entry and exit
 def func(param1, param2):
@@ -40,7 +40,7 @@ The above will automatically log function entry with its parameters, and functio

 If you want to use your own logger with its own name, use

 ```python
-from openssm import Logs
+from openssa import Logs

 my_logger = Logs.get_logger(app_name, logger.INFO)

 @Logs.do_log_entry_and_exit(logger=my_logger)
diff --git a/docs/integrations/lepton_ai.md b/docs/integrations/lepton_ai.md
index 0fa9d1fda..6ebe5981c 100644
--- a/docs/integrations/lepton_ai.md
+++ b/docs/integrations/lepton_ai.md
@@ -2,7 +2,7 @@

 [Lepton.AI](https://lepton.ai) is a developer-centric platform to build, fine-tune, and deploy large models.

-With OpenSSM, you can create SSMs by calling the Lepton pipeline with just a few lines of code.
+With OpenSSA, you can create SSMs by calling the Lepton pipeline with just a few lines of code.

 ```python
 from openssm import BaseSSM, LeptonSLMFactory
@@ -12,10 +12,11 @@ response = ssm.discuss(conversation_id, "what is abc?")

 ## Integration Architecture

-In the OpenSSM context, Lepton helps finetune and distill the SLM (small language model) that front-ends an SSM.
+In the OpenSSA context, Lepton helps finetune and distill the SLM (small language model) that front-ends an SSM.

-```python
-![Lepton Integration](../diagrams/ssm-lepton-integration.drawio.png)
-```
+![Lepton Integration](../diagrams/ssm-lepton-integration.drawio.png)

 ## Roadmap
+To be updated
\ No newline at end of file
diff --git a/docs/integrations/vectara.md b/docs/integrations/vectara.md
index d1a350ed6..f55ca29eb 100644
--- a/docs/integrations/vectara.md
+++ b/docs/integrations/vectara.md
@@ -2,10 +2,10 @@

 [Vectara](https://vectara.com/) is a developer-first API platform for easily building conversational search experiences that feature best-in-class Retrieval, Summarization, and “Grounded Generation” that all but eliminates hallucinations.

-With OpenSSM, you can simply use `Vectara` with just a few lines of code.
+With OpenSSA, you can simply use `Vectara` with just a few lines of code.

 ```python
-from openssm import VectaraSSM
+from openssa import VectaraSSM
 ssm = VectaraSSM()
 ssm.read_directory("path/to/directory")
 response = ssm.discuss(conversation_id, "what is xyz?")
@@ -13,10 +13,11 @@ response = ssm.discuss(conversation_id, "what is xyz?")

 ## Integration Architecture

-In the OpenSSM context, Vectara is treated as a backend, as shown below..
+In the OpenSSA context, Vectara is treated as a backend, as shown below.

 ![LlamaIndex Integration](../diagrams/ssm-llama-index-integration.drawio.png)

-`LlamaIndexSSM` is simply an SSM with a passthrough (dummy) SLM that sends user queries directory to the Vectara backend.
+`VectaraSSM` is simply an SSM with a passthrough (dummy) SLM that sends user queries directly to the Vectara backend.
## Roadmap
+To be updated
diff --git a/openssa/integrations/llama_index/README.md b/openssa/integrations/llama_index/README.md
index 64fbb7e44..52fbe67ba 100644
--- a/openssa/integrations/llama_index/README.md
+++ b/openssa/integrations/llama_index/README.md
@@ -1,6 +1,6 @@
-# OpenSSM and LlamaIndex Integration
+# OpenSSA and LlamaIndex Integration

-This guide provides an overview and examples of how Small Specialist Models (SSMs, from the [OpenSSM](https://github.com/aitomatic/openssm) project) integrate with LlamaIndex.
+This guide provides an overview and examples of how Small Specialist Models (SSMs, from the [OpenSSA](https://github.com/aitomatic/openssa) project) integrate with LlamaIndex.

 ## Overview

@@ -24,10 +24,10 @@ Here are some examples to get you started.

 ### Basic Integration

-OpenSSM makes using LlamaIndex as simple as 3 lines of code:
+OpenSSA makes using LlamaIndex as simple as 3 lines of code:

 ```python
-from openssm import LlamaIndexSSM  # Instantiate a LlamaIndexSSM
+from openssa import LlamaIndexSSM  # Instantiate a LlamaIndexSSM

 ssm = LlamaIndexSSM()
 ssm.read_directory('docs/ylecun')  # Read the docs for the first time
@@ -48,7 +48,7 @@ ssm.load('storage/ylecun')  # Load the index from storage

 In the example below, we put a domain-specific SSM (an SLM or small language model trained on data related to Yann LeCun’s work) in front of LlamaIndex.

 ```python
-from openssm import LlamaIndexSSM, FineTunedSLM
+from openssa import LlamaIndexSSM, FineTunedSLM

 slm = FineTunedSLM(...)  # Instantiate a domain-specific SLM
 ssm = LlamaIndexSSM(slm=slm)  # Instantiate a LlamaIndexSSM with the SLM
@@ -59,7 +59,7 @@ response = ssm.discuss("What is the main point made by Yann LeCun?")

 The response from this ssm would be much richer and more informed about Yann LeCun’s work than a generic SSM performing the same task.

-In all of the above examples, the SSM is using LlamaIndex as a [`Backend`](/openssm/core/backend/abstract_backend), as shown below.
+In all of the above examples, the SSM is using LlamaIndex as a [`Backend`](/openssa/core/backend/abstract_backend), as shown below.

 ![Integration Architecture](../../../docs/diagrams/ssm-llama-index-integration.drawio.png)

@@ -74,7 +74,7 @@ Here, we cover three primary use cases:

 An agent can retrieve context-specific data to inform responses. For example, in a financial setting:

 ```python
-from openssm import LlamaIndexSSM, ContextRetrievalAgent
+from openssa import LlamaIndexSSM, ContextRetrievalAgent

 context = """
 XYZ company reported Q2 revenues of $4.5 billion, up 18% YoY.
 The rise is primarily due to a 32% growth in their cloud division.
@@ -94,7 +94,7 @@ This agent can retrieve and analyze data from relevant financial reports, taking

 In cases where the set of tools is extensive, the agent can retrieve the most relevant ones dynamically during query time. For example, in a data analysis setting:

 ```python
-from openssm import LlamaIndexSSM, FunctionRetrievalAgent
+from openssa import LlamaIndexSSM, FunctionRetrievalAgent

 agent = FunctionRetrievalAgent('tools/data_tools')
-ssm = LlamaIndexSSM(agents=[tool_agent])
+ssm = LlamaIndexSSM(agents=[agent])
@@ -107,10 +107,10 @@ This allows the SSM to retrieve and apply the most suitable data analysis tool b

 #### Query Planning

-For more complex tasks, OpenSSM can be made capable of advanced query planning thanks to LlamaIndex. It could, for instance, plan and execute a series of queries to answer a question about a company’s revenue growth over specific months.
+For more complex tasks, OpenSSA can be made capable of advanced query planning thanks to LlamaIndex. It could, for instance, plan and execute a series of queries to answer a question about a company’s revenue growth over specific months.
 ```python
-from openssm import LlamaIndexSSM, QueryPlanningAgent
+from openssa import LlamaIndexSSM, QueryPlanningAgent

 query_plan_tool = QueryPlanTool.from_defaults(
     query_engine_tools=[query_tool_sept, query_tool_june, query_tool_march]
 )
@@ -127,7 +127,7 @@ This illustrates how an SSM with a Query Planning Agent can plan and execute a s

 ### Future Enhancements

-As we continue to enhance the integration between OpenSSM and LlamaIndex, here are a few promising directions:
+As we continue to enhance the integration between OpenSSA and LlamaIndex, here are a few promising directions:

 - **SSMs as agents for LlamaIndex**: We are exploring ways to make SSMs available as agents for LlamaIndex, allowing for more complex interactions between SSMs and LlamaIndex.
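
A rename of this scale tends to leave stragglers; indeed, several `from openssm import ...` context lines in the hunks above are untouched. After applying the patch, a small sweep over the docs tree can list any remaining occurrences of the old name. This is a minimal sketch, not part of the patch itself: the helper name and the Markdown-only scope are illustrative choices.

```python
import re
from pathlib import Path

# Case-insensitive match for the old project name ("OpenSSM", "openssm", ...).
STALE = re.compile(r"openssm", re.IGNORECASE)

def find_stale_references(root):
    """Yield (file, line_number, text) for every Markdown line still using the old name."""
    for path in sorted(Path(root).rglob("*.md")):
        for lineno, line in enumerate(path.read_text(encoding="utf-8").splitlines(), start=1):
            if STALE.search(line):
                yield str(path), lineno, line.strip()
```

For example, `for hit in find_stale_references("docs"): print(hit)` prints one `(file, line_number, text)` tuple per stale reference, so a clean run produces no output.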