
Commit e8c7e94

fix: add docs
1 parent 86afaf6 commit e8c7e94

File tree

15 files changed: +326 -8 lines

README.md

Lines changed: 0 additions & 2 deletions

@@ -17,8 +17,6 @@ _Limitations by design:_

_High level flow_

![hl-flow diagram](./docs/hl-flow.png)

Living documentation [link](https://lucid.app/lucidspark/4d09749b-ad86-456b-a5ad-2f9e70a2b546/edit?viewport_loc=-1181%2C-734%2C4250%2C2266%2C0_0&invitationId=inv_bc6af9c8-3b41-4b00-b535-255d109f0afb)

## Flow Overview

Walk the directory and create a list of all source files, excluding specified directories such as bin, dist, vendor, node_modules, etc.

docs/EventCatalog-Internal-Flow.png

405 KB
272 KB

docs/asyncapi.md

Lines changed: 110 additions & 0 deletions

@@ -0,0 +1,110 @@

# AsyncAPI Bindings

The [AsyncAPI standard spec](https://www.asyncapi.com/docs/reference/specification/v2.6.0#asyncapi-specification) describes all the possible elements that a valid AsyncAPI document can contain.

For the purposes of the existing setup and needs at Next Plc (Warehouse systems), we don't necessarily need all of them to be included in our bindings (either as annotation keys or model bindings to structs).

## Required

The current [AsyncAPI standard spec](https://www.asyncapi.com/docs/reference/specification/v2.6.0) is at version `2.6.0`.

The tool deals with all the relevant sections needed to build an AsyncAPI spec file from within a single repo.

The AsyncAPI document is built from the `Application` - i.e. the service - down. Each service has a top-level description under the `info` key, which in turn includes `channels`.

### Channels

`channels` is a map of Channel entries, where a channel is an entity describing the transport layer. This can be a ServiceBus Queue/Topic.

Whilst the name of a channel may contain all sorts of unicode and whitespace characters, it SHOULD conform to a machine-readable standard, as it will be processed multiple times during the generation process.

See [notes](./notes.md) for the relationship diagram and precedence hierarchy.
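Since channel names are processed multiple times during generation, normalizing them to a machine-readable form early avoids surprises downstream. A minimal sketch in Go (a hypothetical helper, not part of the tool):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// nonMachine matches any run of characters outside [a-z0-9._-].
var nonMachine = regexp.MustCompile(`[^a-z0-9._-]+`)

// normalizeChannelName lowercases the input and collapses whitespace and
// other non machine-readable characters into single hyphens.
func normalizeChannelName(name string) string {
	s := strings.ToLower(strings.TrimSpace(name))
	s = nonMachine.ReplaceAllString(s, "-")
	return strings.Trim(s, "-")
}

func main() {
	fmt.Println(normalizeChannelName("Packing Area  Events")) // packing-area-events
}
```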
### Annotateable Properties

|annotation key|required?|description|options|examples|
|---|---|---|---|---|
|`category`|yes|Which part of the AsyncAPI document this snippet relates to|`["root","info","server","channel","operation","subOperation","pubOperation","message"]`||
|`type`|yes|The type of a property in an AsyncAPI section|`["json_schema","example","description","title","nameId"]`||
|`id`|yes (except on root/info)|Name of the service. Defaults to the parent folder name unless overridden. Will be converted to this format: `urn:$business_domain:$bounded_context_domain:$service_name` => `urn:domain:packing:domain.packing.app`|||
|`parent`|no|The parent of this annotation, if a message or operation|||
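For illustration, the annotation keys above could map onto a struct along these lines (a hypothetical model, not the tool's actual types), including the `id` URN construction:

```go
package main

import (
	"fmt"
	"strings"
)

// Annotation is a hypothetical model of one parsed gendoc annotation.
type Annotation struct {
	Category string // e.g. "message"
	Type     string // e.g. "example"
	ID       string // service name, defaults to parent folder name
	Parent   string // optional parent annotation
}

// urn builds the identifier format described in the table:
// urn:$business_domain:$bounded_context_domain:$service_name
func urn(businessDomain, boundedContext, serviceName string) string {
	return strings.Join([]string{"urn", businessDomain, boundedContext, serviceName}, ":")
}

func main() {
	fmt.Println(urn("domain", "packing", "domain.packing.app")) // urn:domain:packing:domain.packing.app
}
```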
### Examples

- `type`: Example

```cs
//+gendoc id=PackingAreaEvent category=message type=example
namespace domain.Packing.Services.PackArea.Contracts.Events;

public class PackingAreaEvent : domainMessage<PackingAreaEventPayload>
{
    public PackingAreaEvent(PackingAreaEventPayload payload)
    {
        MessageTypeName = nameof(PackingAreaEvent);
        SourceSystem = PackAreaServiceConstants.Name;
        Guid = Guid.NewGuid();
        CreationDate = DateTime.UtcNow;
        Number = 1;
        NumberOf = 1;
        Owner = string.Empty;
        Stream = String.Empty;
        Payload = payload;
    }
}
//-gendoc
```
- `type`: JSON_schema can be defined in any file, like below

```cs
namespace domain.Packing.Services.PackArea.Contracts.Events;

public class Bar
{
    public string Type { get; set; }

    public string Name { get; set; }
}

/* below is an example of schema in code
//+gendoc id=PackingAreaEvent category=message type=json_schema
{
    "$schema": "http://json-schema.org/draft-07/schema",
    "$id": "http://example.com/example.json",
    "type": "object",
    "required": [
        "payload",
        "guid",
        "creationDate",
        "messageTypeName",
        "version",
        "owner",
        "stream",
        "sourceSystem"
    ],
    "properties": {
        "payload": {
            "$id": "#/properties/payload",
            "type": "object",
            "required": [
                "packAreaId",
                "enabled",
                "warehouseCode",
                "mode",
                "itemThreshold",
...truncated for brevity
}
//-gendoc
*/
```
> However, the recommended way is to keep your schema in a file named `MESSAGE_NAME.schema.json` [see example](../src/test/domain.sample/src/schemas/PackingAreaEvent.schema.json)

## Nice To Have

- `servers` keyword describes the technology providing the transport layer - e.g. Kafka/RabbitMQ, or specifically in the case of Next the `ServiceBus Namespace`
  - it may contain a map of multiple implementations - e.g. dev/preprod/prod
  - > in conjunction with a channel key, a full URL can be constructed for the client(s) to either publish or subscribe to messages on that ServiceBus's Topic/Queue/Topic-Subscription

docs/context-flows.png

25.5 KB

docs/hl-flow.png

586 KB

docs/internals.md

Lines changed: 66 additions & 0 deletions

@@ -0,0 +1,66 @@

# Internals

![Diagram of the internal flow](./EventCatalog-Internal-Flow.png)

## Lexer

TOKENs are kept to the following types - `token.TokenType` - not listed here as the list is likely to change/grow.

Special consideration will need to be given to files that cannot contain comments or anything outside of their existing syntax - e.g. `.json`, most commonly containing schemas.

> In these cases a convention will need to be followed whereby the name of the message being described must be in the name of the file.

## Parser

Not using an existing parser generator (CFG-, BNF-, or EBNF-based) is on purpose, as the input source will only ever be composed of the parts we care about - i.e. `gendoc` markers, their beginning and end, and the text they enclose.

We'll use an overly simplified Pratt Parser (top-down method), as we have no need for expression parsing - only statement node creation with the associated/helper attributes.

[more...]()
### Generation

Once a flat list of statements (`[]GenDocBlock`) is ready, we need to sort it in order of precedence. Precedence is set based on the `category` (shorthand `c`) found on an annotation.

[more...]()

Once sorted we need to build an interim tree; as at this point we have no idea how many nodes there will be, it has to be an `n-ary tree`.
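The precedence sort can be sketched as follows, assuming (hypothetically) that each category maps to a numeric rank mirroring the service > channel > operation > message hierarchy:

```go
package main

import (
	"fmt"
	"sort"
)

// GenDocBlock is a simplified stand-in for the parser's statement type.
type GenDocBlock struct {
	Category string
	ID       string
}

// rank assigns lower numbers to higher-precedence categories;
// the exact ordering here is illustrative.
var rank = map[string]int{
	"root": 0, "info": 1, "server": 2, "channel": 3,
	"operation": 4, "pubOperation": 4, "subOperation": 4, "message": 5,
}

// sortByPrecedence orders blocks so ancestors come before descendants.
func sortByPrecedence(blocks []GenDocBlock) {
	sort.SliceStable(blocks, func(i, j int) bool {
		return rank[blocks[i].Category] < rank[blocks[j].Category]
	})
}

func main() {
	blocks := []GenDocBlock{
		{Category: "message", ID: "PackingAreaEvent"},
		{Category: "channel", ID: "packing-events"},
		{Category: "info", ID: "domain.packing.app"},
	}
	sortByPrecedence(blocks)
	for _, b := range blocks {
		fmt.Println(b.Category, b.ID)
	}
}
```

`sort.SliceStable` keeps the original order of blocks within the same category, which matters when several messages share a parent.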
## Gendoc Tree

The tree looks like the below diagram.

![ServiceContext tree diagram](./EventCatalog-ServiceContextTree.png)

This highlights the order in which it's walked. It uses the __BFS (BreadthFirstSearch) algorithm__ to walk each level and perform the merging of information from all the *leaf* nodes.

Also worth noting: it uses an internal indexer for quicker O(n) lookups when performing the sort. The tree is walked multiple times to ensure orphans are assigned to parents in case they weren't yet in the tree when it was walked previously.
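The BFS walk over the n-ary tree can be sketched with a simple queue (illustrative only; the real tree also carries the indexer and merge logic):

```go
package main

import "fmt"

// Node is a minimal n-ary tree node.
type Node struct {
	Name     string
	Children []*Node
}

// walkBFS visits nodes level by level using a queue, calling visit on each.
func walkBFS(root *Node, visit func(*Node)) {
	queue := []*Node{root}
	for len(queue) > 0 {
		n := queue[0]
		queue = queue[1:]
		visit(n)
		queue = append(queue, n.Children...)
	}
}

func main() {
	root := &Node{Name: "root", Children: []*Node{
		{Name: "orphaned"},
		{Name: "parented", Children: []*Node{{Name: "serviceId"}}},
	}}
	// Prints root, orphaned, parented, serviceId - one level at a time.
	walkBFS(root, func(n *Node) { fmt.Println(n.Name) })
}
```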
```mermaid
flowchart TD
    root(0_""\nroot)
    orphaned(0_orphaned)
    any_any[any_any]

    parented(0_parented)
    srv(1_serviceId)
    chan(2_channelId)
    op(3_operationId)
    msg(4_messageId)
    usrv[unsorted srvs]
    umsg[unsorted messages]
    uop[unsorted operations]
    uchan[unsorted channels]

    root --> orphaned
    orphaned -.-> |1..n| any_any[any_any]
    root --> parented
    parented --> |1..n| srv
    srv -.->|1..n| usrv
    srv -->|1..n| chan
    chan -.->|1..n| uchan
    chan -->|1..1| op
    op -.->|1..n| uop
    op -->|1..1| msg
    msg -->|1..n| umsg
```

docs/mermaid_hierarchy.png

58.9 KB

docs/notes.md

Lines changed: 17 additions & 0 deletions

@@ -0,0 +1,17 @@

# Notes

- Step 1: extract all gendoc statements - a single file?? or a JSON-parseable list by service/repo
- Step 2: parse all the files containing only the step 1 (interim state) output, building a node tree by precedence, where the top of the tree has higher importance
  - e.g. service > channel > operation > message

Diagram in Code

```mermaid
flowchart TD
    M[Message]
    A[Service] -->|1..n| C(Channel)
    C --> OpType{Publish/Sub}
    OpType -->|Publish| Operation[Operation]
    OpType -->|Subscribe| Operation[Operation]
    Operation[Operation] --> M
```

docs/usage.md

Lines changed: 127 additions & 0 deletions

@@ -0,0 +1,127 @@

# Usage

## SourceCode Annotation

The program works by identifying markers in the source code, extracting them, and sorting them based on ancestral precedence - i.e. who is a parent/child/grandchild.

### Markers and Annotation

Markers have to be used in the exact form, beginning with `//+gendoc annotationKey=annotationVal` and ending with `//-gendoc`.

### Tips

You can do multiline annotations by adding `\` before a line break, e.g.
```csharp
public class Bar() {
    /*
    //+gendoc category=message \
    type=example
    */
    public class Foo() {

    }
    //-gendoc
}
```
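The begin/end markers lend themselves to a simple line scan. A rough sketch (not the tool's actual lexer) that collects the annotation header and the text enclosed between `//+gendoc` and `//-gendoc`:

```go
package main

import (
	"fmt"
	"strings"
)

// extractBlocks returns the annotation header and enclosed text for each
// //+gendoc ... //-gendoc pair found in src. Multiline headers (trailing \)
// are not handled here, to keep the sketch short.
func extractBlocks(src string) [][2]string {
	var blocks [][2]string
	var header string
	var body []string
	inBlock := false
	for _, line := range strings.Split(src, "\n") {
		trimmed := strings.TrimSpace(line)
		switch {
		case strings.HasPrefix(trimmed, "//+gendoc"):
			header = strings.TrimSpace(strings.TrimPrefix(trimmed, "//+gendoc"))
			inBlock = true
			body = nil
		case trimmed == "//-gendoc" && inBlock:
			blocks = append(blocks, [2]string{header, strings.Join(body, "\n")})
			inBlock = false
		case inBlock:
			body = append(body, line)
		}
	}
	return blocks
}

func main() {
	src := "//+gendoc category=message type=description id=foo\nsome description\n//-gendoc"
	for _, b := range extractBlocks(src) {
		fmt.Printf("header=%q body=%q\n", b[0], b[1])
	}
}
```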
## CLI

Download the published binary from [here](TODO).

> replace `darwin` in `.../gendoc-darwin/overview` with your platform = `linux|windows|darwin`

```sh
chmod +x ./gendoc-$PLATFORM
mv ./gendoc-$PLATFORM /usr/local/bin/gendoc
```

After moving it to a location on your $PATH, you should be able to run the following commands.

The CLI comes with the following commands; for further info and help use the `--help|-h` flag at any point.

```bash
gendoc -h
gendoc --version
```
### Commands

The CLI has 2 main commands: one is run against a single repo and generates the interim output; the other reads in the interim output and generates the AsyncAPI-compliant document.

- `--input`|`--output` options currently support 2 types of `"storage implementation"`
  - `local://` => pointing to a local filesystem
  - `azblob://` => pointing to an Azure storage account/blob in this format: `azblob://STORAGE_ACCOUNT_NAME/CONTAINER_NAME`. The utility handles the virtual path and object creation.
  - additional `storageClients` can be added easily by providing a new implementation of the storageAdapter

For ease of use, you can enable shell completion for your shell.

`gendoc completion -h` to see the options, e.g. for powershell `gendoc completion powershell`

> Not tested on Windows; users may need to suffix the binary with `.exe`.
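A new storage implementation only needs to satisfy a small interface; something along these lines (the tool's actual storageAdapter interface may differ):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// StorageClient is a hypothetical shape for a storage implementation
// selected by the URI scheme (local://, azblob://, ...).
type StorageClient interface {
	Write(path string, data []byte) error
}

// LocalClient writes to the local filesystem (local://).
type LocalClient struct{}

func (LocalClient) Write(path string, data []byte) error {
	if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
		return err
	}
	return os.WriteFile(path, data, 0o644)
}

// forScheme picks a client based on the --input/--output prefix.
func forScheme(uri string) (StorageClient, string, error) {
	switch {
	case strings.HasPrefix(uri, "local://"):
		return LocalClient{}, strings.TrimPrefix(uri, "local://"), nil
	default:
		return nil, "", fmt.Errorf("unsupported scheme in %q", uri)
	}
}

func main() {
	_, path, err := forScheme("local:///tmp/gendoc/out.yml")
	fmt.Println(path, err) // /tmp/gendoc/out.yml <nil>
}
```

An `azblob://` client would slot in as another `case` in `forScheme`, wrapping the Azure blob SDK behind the same `Write` method.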
#### SingleContext

This command is run against a single directory which holds source code of any type, and generates an interim output.

It can be run in validate-only mode by setting the `--dry-run` flag; it will ensure that any annotations that have been added are set correctly and there are no syntax errors.

```sh
gendoc single-context --input local:///path/to/src/domain.sample --output local:///path/to/out/interim --is-service --verbose --dry-run
```

If any errors occur they are returned to the terminal. In dry-run mode no files are emitted to the output location.

```sh
gendoc single-context --input local:///path/to/src/domain.sample --output local:///path/to/out/interim --is-service --business-ctx s2s --business-domain domain
```

`--business-ctx` and `--business-domain` are purely for tagging/description/name generation purposes.

> Currently `--input` for single-context can only be a `local://` - i.e. stored on the local filesystem
##### EnvVariable expansion

The content can include environment-variable-like text to avoid repetition; however, generation will fail if the variable is not set.

Example:

```text
//+gendoc category=message type=description id=foo
this is some description with $foo
//-gendoc
```

Ensure that the environment variable is present, otherwise it will fail with either an `unset` or `set but empty` error.

See tests for [more examples](../src/go/async-api-gen-doc/internal/parser/parser_test.go)
#### GlobalContext

This command is run against a directory containing zero or more interim output files generated across many repos (single-context sources).

```sh
gendoc global-context --input local:///path/to/src/domain.sample --output local:///path/to/out/interim
```
### Local Example

Point it to an input directory of any repo - e.g. `domain.Packing.DirectDespatchAggregation`.

This will generate the interim output that the `global-context` command then consumes.

```sh
gendoc single-context --input local://$FULL_PATH_TO/domain.Packing.DirectDespatchAggregation --is-service --bounded-ctx Packing --business-domain domain \
    --repo "https://github.com/repo" \
    --output local://$HOME/.gendoc/poc
```

The output will be populated with a directory called `current`, which will include the interim output(s) from the single-context runs.

This is then used as an input for the global-context, which will output a full AsyncAPI document in the output directory - in this case `local://$HOME/.gendoc/poc/processed`.

```sh
gendoc global-context --input local://$HOME/.gendoc/poc/current --output local://$HOME/.gendoc/poc/processed
```

The files are emitted with the `AsyncAPI.ID` as the name in the `asyncapi` directory, e.g.: `asyncapi/urn:domain:Packing:domain.Packing.DirectDespatchAggregation.yml`.
