Merge pull request #99 from aitomatic/docs
TheVinhLuong102 authored Dec 23, 2023
2 parents 05b7b04 + 49810e0 commit b913357
Showing 7 changed files with 42 additions and 37 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -81,7 +81,8 @@ Our primary audience includes:

4. A committer to OpenSSA

### Getting Started as an End-User
## Getting Started as an End-User
Go straight to [OpenSSA Streamlit app](https://openssa.streamlit.app/) and start building your own SSA with your domain document today!


### Getting Started as a Developer
24 changes: 13 additions & 11 deletions docs/GETTING_STARTED.md
@@ -2,25 +2,27 @@

## Who Are You?

1. An end-user of OpenSSM-based applications
1. An end-user of OpenSSA-based applications

2. A developer of applications or services using OpenSSM
2. A developer of applications or services using OpenSSA

3. An aspiring contributor to OpenSSM
3. An aspiring contributor to OpenSSA

4. A committer to OpenSSM
4. A committer to OpenSSA

## Getting Started as an End-User
Go straight to [OpenSSA Streamlit app](https://openssa.streamlit.app/) and start building your own SSA with your domain document today!


## Getting Started as a Developer

See some example user programs in the [examples/notebooks](./examples/notebooks) directory. For example, the see the sample use case on ALD semiconductor knowledge, do:
See some example user programs in the [examples/notebooks](./examples/notebooks) directory. For example, to see the sample use case on ALD semiconductor knowledge, do:

```bash
% cd examples/notebooks
```

### Common `make` targets for OpenSSM developers
### Common `make` targets for OpenSSA developers

See [MAKEFILE](dev/makefile_info.md) for more details.

@@ -32,29 +34,29 @@ See [MAKEFILE](dev/makefile_info.md) for more details.

% make poetry-init
% make poetry-install
% make install # local installation of openssm
% make install # local installation of OpenSSA

% make pypi-auth # only for maintainers
% make publish # only for maintainers
```

## Getting Started as an Aspiring Contributor

OpenSSM is a community-driven initiative, and we warmly welcome contributions. Whether it's enhancing existing models, creating new SSMs for different industrial domains, or improving our documentation, every contribution counts. See our [Contribution Guide](../CONTRIBUTING.md) for more details.
OpenSSA is a community-driven initiative, and we warmly welcome contributions. Whether it's enhancing existing models, creating new SSMs for different industrial domains, or improving our documentation, every contribution counts. See our [Contribution Guide](../CONTRIBUTING.md) for more details.

You can begin contributing to the OpenSSM project in the `contrib/` directory.
You can begin contributing to the OpenSSA project in the `contrib/` directory.

## Getting Started as a Committer

You already know what to do.

## Community

Join our vibrant community of AI enthusiasts, researchers, developers, and businesses who are democratizing industrial AI through SSMs. Participate in the discussions, share your ideas, or ask for help on our [Community Discussions](https://github.com/aitomatic/openssm/discussions).
Join our vibrant community of AI enthusiasts, researchers, developers, and businesses who are democratizing industrial AI through SSMs. Participate in the discussions, share your ideas, or ask for help on our [Community Discussions](https://github.com/aitomatic/OpenSSA/discussions).

## License

OpenSSM is released under the [Apache 2.0 License](./LICENSE.md).
OpenSSA is released under the [Apache 2.0 License](./LICENSE.md).

## Links

4 changes: 2 additions & 2 deletions docs/PROJECT_PHILOSOPHY.md
@@ -1,6 +1,6 @@
# OpenSSM Project Philosophy
# OpenSSA Project Philosophy

At OpenSSM, we believe in the democratization of AI. Our goal is to create an ecosystem where anyone, regardless of their resources, can have access to efficient and domain-specific AI solutions. We envision a future where AI is not only accessible but also robust, reliable, and trustworthy.
At OpenSSA, we believe in the democratization of AI. Our goal is to create an ecosystem where anyone, regardless of their resources, can have access to efficient and domain-specific AI solutions. We envision a future where AI is not only accessible but also robust, reliable, and trustworthy.

Our project is guided by the following principles:

14 changes: 7 additions & 7 deletions docs/dev/howtos.md
@@ -2,21 +2,21 @@

## Observability

`OpenSSM` has built-in observability and tracing.
`OpenSSA` has built-in observability and tracing.

## Logging

Users of `OpenSSM` can use the `logger` object provided by the `OpenSSM` package:
Users of `OpenSSA` can use the `logger` object provided by the `OpenSSA` package:

```python
from OpenSSM import logger
from openssa import logger
logger.warning("xyz = %s", xyz)
```

If you are an `OpenSSM` contributor, you may use the `openssm` logger:
If you are an `OpenSSA` contributor, you may use the `openssa` logger:

```python
from openssm import mlogger
from openssa import mlogger
mlogger.warning("xyz = %s", xyz)
```

@@ -25,7 +25,7 @@ mlogger.warning("xyz = %s", xyz)
There are some useful decorators for automatically logging function entry and exit.

```python
from openssm import Logs
from openssa import Logs

@Logs.do_log_entry_and_exit() # upon both entry and exit
def func(param1, param2):
@@ -40,7 +40,7 @@ The above will automatically log function entry with its parameters, and function exit with its return value.
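As a rough sketch of how such an entry/exit decorator could work (hypothetical names here, not the actual `Logs` implementation), the pattern is a standard parameterized Python decorator:

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)


def log_entry_and_exit(logger=None):
    """Hypothetical sketch of an entry/exit-logging decorator."""
    logger = logger or logging.getLogger(__name__)

    def decorate(func):
        @functools.wraps(func)  # preserve the wrapped function's name and docstring
        def wrapper(*args, **kwargs):
            logger.info("entering %s args=%r kwargs=%r", func.__name__, args, kwargs)
            result = func(*args, **kwargs)
            logger.info("exiting %s -> %r", func.__name__, result)
            return result

        return wrapper

    return decorate


@log_entry_and_exit()
def add(a, b):
    return a + b


print(add(1, 2))  # logs entry and exit, then prints 3
```

The real `Logs.do_log_entry_and_exit()` may differ in naming and options; this only illustrates the mechanism.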
If you want to use your own logger with its own name, use

```python
from openssm import Logs
from openssa import Logs
my_logger = Logs.get_logger(app_name, logger.INFO)

@Logs.do_log_entry_and_exit(logger=my_logger)
5 changes: 3 additions & 2 deletions docs/integrations/lepton_ai.md
@@ -2,7 +2,7 @@

[Lepton.AI](https://lepton.ai) is a developer-centric platform to build, fine-tune, and deploy large models.

With OpenSSM, you can create SSMs by calling the Lepton pipeline with just a few lines of code.
With OpenSSA, you can create SSMs by calling the Lepton pipeline with just a few lines of code.

```python
from openssa import BaseSSM, LeptonSLMFactory
@@ -12,10 +12,11 @@ response = ssm.discuss(conversation_id, "what is abc?")

## Integration Architecture

In the OpenSSM context, Lepton helps finetune and distill the SLM (small language model) that front-ends an SSM.
In the OpenSSA context, Lepton helps finetune and distill the SLM (small language model) that front-ends an SSM.

![Lepton Integration](../diagrams/ssm-lepton-integration.drawio.png)

## Roadmap
To be updated
7 changes: 4 additions & 3 deletions docs/integrations/vectara.md
@@ -2,21 +2,22 @@

[Vectara](https://vectara.com/) is a developer-first API platform for easily building conversational search experiences that feature best-in-class Retrieval, Summarization, and “Grounded Generation” that all but eliminates hallucinations.

With OpenSSM, you can simply use `Vectara` with just a few lines of code.
With OpenSSA, you can simply use `Vectara` with just a few lines of code.

```python
from openssm import VectaraSSM
from openssa import VectaraSSM
ssm = VectaraSSM()
ssm.read_directory("path/to/directory")
response = ssm.discuss(conversation_id, "what is xyz?")
```

## Integration Architecture

In the OpenSSM context, Vectara is treated as a backend, as shown below..
In the OpenSSA context, Vectara is treated as a backend, as shown below.

![LlamaIndex Integration](../diagrams/ssm-llama-index-integration.drawio.png)

`VectaraSSM` is simply an SSM with a passthrough (dummy) SLM that sends user queries directly to the Vectara backend.
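The passthrough idea can be illustrated with a minimal sketch (the class names below are hypothetical stand-ins, not the actual OpenSSA API): the SLM does no modeling of its own and simply forwards each query to the backend.

```python
class DummyBackend:
    """Stand-in for a Vectara-style retrieval backend."""

    def query(self, text):
        return f"backend answer for: {text}"


class PassthroughSLM:
    """Forwards user queries to the backend unchanged."""

    def __init__(self, backend):
        self.backend = backend

    def discuss(self, text):
        # No language-model processing: delegate straight to the backend.
        return self.backend.query(text)


ssm = PassthroughSLM(DummyBackend())
print(ssm.discuss("what is xyz?"))  # backend answer for: what is xyz?
```

This is why a passthrough SSM behaves exactly like its backend: all domain knowledge lives in the retrieval layer.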

## Roadmap
To be updated
22 changes: 11 additions & 11 deletions openssa/integrations/llama_index/README.md
@@ -1,6 +1,6 @@
# OpenSSM and LlamaIndex Integration
# OpenSSA and LlamaIndex Integration

This guide provides an overview and examples of how Small Specialist Models (SSMs, from the [OpenSSM](https://github.com/aitomatic/openssm) project) integrate with LlamaIndex.
This guide provides an overview and examples of how Small Specialist Models (SSMs, from the [OpenSSA](https://github.com/aitomatic/OpenSSA) project) integrate with LlamaIndex.

## Overview

@@ -24,10 +24,10 @@ Here are some examples to get you started.

### Basic Integration

OpenSSM makes using LlamaIndex as simple as 3 lines of code:
OpenSSA makes using LlamaIndex as simple as 3 lines of code:

```python
from openssm import LlamaIndexSSM # Instantiate a LlamaIndexSSM
from openssa import LlamaIndexSSM # Instantiate a LlamaIndexSSM

ssm = LlamaIndexSSM()
ssm.read_directory('docs/ylecun') # Read the docs for the first time
@@ -48,7 +48,7 @@ ssm.load('storage/ylecun') # Load the index from storage
In the example below, we put a domain-specific SLM (a small language model trained on data related to Yann LeCun’s work) in front of LlamaIndex.

```python
from openssm import LlamaIndexSSM, FineTunedSLM
from openssa import LlamaIndexSSM, FineTunedSLM

slm = FineTunedSLM(...) # Instantiate a domain-specific SLM
ssm = LlamaIndexSSM(slm=slm) # Instantiate a LlamaIndexSSM with the SLM
@@ -59,7 +59,7 @@ response = ssm.discuss("What is the main point made by Yann LeCun?")

The response from this ssm would be much richer and more informed about Yann LeCun’s work than a generic SSM performing the same task.

In all of the above examples, the SSM is using LlamaIndex as a [`Backend`](/openssm/core/backend/abstract_backend), as shown below.
In all of the above examples, the SSM is using LlamaIndex as a [`Backend`](/openssa/core/backend/abstract_backend), as shown below.

![Integration Architecture](../../../docs/diagrams/ssm-llama-index-integration.drawio.png)

@@ -74,7 +74,7 @@ Here, we cover three primary use cases:
An agent can retrieve context-specific data to inform responses. For example, in a financial setting:

```python
from openssm import LlamaIndexSSM, ContextRetrievalAgent
from openssa import LlamaIndexSSM, ContextRetrievalAgent

context = """
XYZ company reported Q2 revenues of $4.5 billion, up 18% YoY. The rise is primarily due to a 32% growth in their cloud division.
@@ -94,7 +94,7 @@ This agent can retrieve and analyze data from relevant financial reports, taking
In cases where the set of tools is extensive, the agent can retrieve the most relevant ones dynamically during query time. For example, in a data analysis setting:

```python
from openssm import LlamaIndexSSM, FunctionRetrievalAgent
from openssa import LlamaIndexSSM, FunctionRetrievalAgent

agent = FunctionRetrievalAgent('tools/data_tools')
ssm = LlamaIndexSSM(agents=[agent])
@@ -107,10 +107,10 @@ This allows the SSM to retrieve and apply the most suitable data analysis tool based on the query.

#### Query Planning

For more complex tasks, OpenSSM can be made capable of advanced query planning thanks to LlamaIndex. It could, for instance, plan and execute a series of queries to answer a question about a company’s revenue growth over specific months.
For more complex tasks, OpenSSA can be made capable of advanced query planning thanks to LlamaIndex. It could, for instance, plan and execute a series of queries to answer a question about a company’s revenue growth over specific months.

```python
from openssm import LlamaIndexSSM, QueryPlanningAgent
from openssa import LlamaIndexSSM, QueryPlanningAgent

query_plan_tool = QueryPlanTool.from_defaults(
query_engine_tools=[query_tool_sept, query_tool_june, query_tool_march]
@@ -127,7 +127,7 @@ This illustrates how an SSM with a Query Planning Agent can plan and execute a series of queries.
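The planning pattern can be sketched in plain Python, independent of the `QueryPlanTool` API (all names and numbers below are hypothetical): run each month-specific sub-query, then synthesize the results into a growth answer.

```python
def make_query_tool(month, revenue):
    """Hypothetical per-month query tool backed by canned data."""
    return lambda question: {"month": month, "revenue": revenue}


query_tool_march = make_query_tool("March", 100)
query_tool_june = make_query_tool("June", 120)
query_tool_sept = make_query_tool("September", 150)


def plan_and_execute(question, tools):
    """Run every sub-query in the plan, then combine the partial answers."""
    results = [tool(question) for tool in tools]
    first, last = results[0], results[-1]
    growth = (last["revenue"] - first["revenue"]) / first["revenue"]
    return f"Revenue grew {growth:.0%} from {first['month']} to {last['month']}"


answer = plan_and_execute(
    "How did revenue grow?",
    [query_tool_march, query_tool_june, query_tool_sept],
)
print(answer)  # Revenue grew 50% from March to September
```

A real query planner would also decide *which* sub-queries to issue; this sketch fixes the plan up front to keep the execute-then-combine step visible.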

### Future Enhancements

As we continue to enhance the integration between OpenSSM and LlamaIndex, here are a few promising directions:
As we continue to enhance the integration between OpenSSA and LlamaIndex, here are a few promising directions:

- **SSMs as agents for LlamaIndex**: We are exploring ways to make SSMs available as agents for LlamaIndex, allowing for more complex interactions between SSMs and LlamaIndex.

