Commit a1fe4a3

Merge pull request #42 from topcoder-platform/new-readme
New readme
2 parents 7f1c869 + 8e36ac8 · commit a1fe4a3

3 files changed: +93 −46 lines changed

README.md

Lines changed: 66 additions & 46 deletions
@@ -1,12 +1,35 @@
 # Topcoder Challenge API
 
-## Dependencies
+This microservice provides access and interaction with all sorts of Challenge data.
 
-- nodejs https://nodejs.org/en/ (v10)
-- DynamoDB
-- AWS S3
-- Elasticsearch v6
-- Docker, Docker Compose
+### Development deployment status
+[![CircleCI](https://circleci.com/gh/topcoder-platform/challenge-api/tree/develop.svg?style=svg)](https://circleci.com/gh/topcoder-platform/challenge-api/tree/develop)
+### Production deployment status
+[![CircleCI](https://circleci.com/gh/topcoder-platform/challenge-api/tree/master.svg?style=svg)](https://circleci.com/gh/topcoder-platform/challenge-api/tree/master)
+
+## Swagger definition
+
+- [Swagger](https://api.topcoder.com/v5/challenges/docs/)
+
+## Intended use
+
+- Production API
+
+## Related repos
+
+- [Resources API](https://github.com/topcoder-platform/resources-api)
+- [ES Processor](https://github.com/topcoder-platform/challenge-processor-es) - Updates data in ElasticSearch
+- [Legacy Processor](https://github.com/topcoder-platform/legacy-challenge-processor) - Moves data from DynamoDB back to Informix
+- [Legacy Migration Script](https://github.com/topcoder-platform/legacy-challenge-migration-script) - Moves data from Informix to DynamoDB
+- [Frontend App](https://github.com/topcoder-platform/challenge-engine-ui)
+
+## Prerequisites
+- [NodeJS](https://nodejs.org/en/) (v10)
+- [DynamoDB](https://aws.amazon.com/dynamodb/)
+- [AWS S3](https://aws.amazon.com/s3/)
+- [Elasticsearch v6](https://www.elastic.co/)
+- [Docker](https://www.docker.com/)
+- [Docker Compose](https://docs.docker.com/compose/)
 
 ## Configuration
 
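For reference, a minimal sketch of checking the prerequisites and the published Swagger definition from a shell. The version expectations come from the Prerequisites list above; the `https://api.topcoder.com/v5/challenges` base path is inferred from the Swagger URL and is an assumption, not something stated in this diff.

```bash
# Sketch: verify prerequisite tooling (versions per the Prerequisites section above).
node --version            # expect v10.x
docker --version
docker-compose --version

# The Swagger definition linked in the README:
curl -s https://api.topcoder.com/v5/challenges/docs/ | head

# Assumption: the challenge listing endpoint sits at the same base path.
curl -s "https://api.topcoder.com/v5/challenges?page=1&perPage=5"
```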
@@ -46,46 +69,9 @@ The following parameters can be set in config files or in env variables:
 - SCOPES: the configurable M2M token scopes, refer `config/default.js` for more details
 - M2M_AUDIT_HANDLE: the audit name used when perform create/update operation using M2M token
 
-Set the following environment variables so that the app can get TC M2M token (use 'set' insted of 'export' for Windows OS):
-
-- export AUTH0_CLIENT_ID=8QovDh27SrDu1XSs68m21A1NBP8isvOt
-- export AUTH0_CLIENT_SECRET=3QVxxu20QnagdH-McWhVz0WfsQzA1F8taDdGDI4XphgpEYZPcMTF4lX3aeOIeCzh
-- export AUTH0_URL=https://topcoder-dev.auth0.com/oauth/token
-- export AUTH0_AUDIENCE=https://m2m.topcoder-dev.com/
-
-Also properly configure AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, ATTACHMENT_S3_BUCKET, IS_LOCAL_DB config parameters.
-
-Test configuration is at `config/test.js`. You don't need to change them.
-The following test parameters can be set in config file or in env variables:
-
-- ADMIN_TOKEN: admin token
-- COPILOT_TOKEN: copilot token
-- USER_TOKEN: user token
-- EXPIRED_TOKEN: expired token
-- INVALID_TOKEN: invalid token
-- M2M_FULL_ACCESS_TOKEN: M2M full access token
-- M2M_READ_ACCESS_TOKEN: M2M read access token
-- M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
-- S3_ENDPOINT: endpoint of AWS S3 API, for unit and e2e test only; default to `localhost:9000`
-
-## AWS S3 Setup
-Go to https://console.aws.amazon.com/ and login. Choose S3 from Service folder and click `Create bucket`. Following the instruction to create S3 bucket.
-
-## Local services setup
-In the `local` folder, run `docker-compose up`
-It starts Elasticsearch, DynamoDB and S3 compatible server.
+You can find sample `.env` files inside the `/docs` directory.
 
-## Mock api
-For postman verification, please use the mock api under mock-api folder. It provides mock endpoint to fetch challenge resources and groups.
-You need to ensure DynamoDB configuration in `mock-api/config/default.js` is consistent with `config/default.js`
-Go to `mock-api` folder and run commands `npm i` and `npm start` to start the mock-api listening on port 4000
-
-## Create Tables
-1. Make sure DynamoDB are running as per instructions above.
-2. Make sure you have configured all config parameters. Refer [Configuration](#configuration)
-3. Run `npm run create-tables` to create tables.
-
-## Scripts
+## Available commands
 1. Drop/delete tables: `npm run drop-tables`
 2. Creating tables: `npm run create-tables`
 3. Seed/Insert data to tables: `npm run seed-tables`
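Taken together, the commands listed above give a typical local table-reset sequence, sketched below. The script names come straight from the README; this assumes the local/dev DynamoDB, since dropping tables against a production database would be destructive.

```bash
# Sketch: rebuild local DynamoDB tables using the npm scripts listed above.
npm run drop-tables    # delete existing tables (local/dev only)
npm run create-tables  # recreate the table definitions
npm run seed-tables    # insert seed data
```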
@@ -99,6 +85,22 @@ Go to `mock-api` folder and run commands `npm i` and `npm start` to start the mo
 
 ## Local Deployment
 
+### AWS S3 Setup
+Go to https://console.aws.amazon.com/ and log in. Choose S3 from the Services menu and click `Create bucket`. Follow the instructions to create an S3 bucket.
+
+### Local services setup
+In the `local` folder, run `docker-compose up` to start Elasticsearch, DynamoDB and an S3-compatible server.
+
+### Create Tables
+1. Make sure DynamoDB is running as per the instructions above.
+2. Make sure you have configured all config parameters. Refer to [Configuration](#configuration).
+3. Run `npm run create-tables` to create the tables.
+
+### Mock API
+The provided mock API exposes mock endpoints to fetch challenge resources and groups, so you don't have to deploy the related services locally.
+Ensure the DynamoDB configuration in `mock-api/config/default.js` is consistent with `config/default.js`.
+Go to the `mock-api` folder and run `npm i` and `npm start` to start the mock API listening on port 4000.
+
 - Install dependencies `npm install`
 - Run lint `npm run lint`
 - Run lint fix `npm run lint:fix`
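The Local Deployment subsections above suggest an order of operations roughly like the sketch below. Paths, ports and commands are taken from the README; running the mock API and the app from a second terminal is an assumption about workflow, not a documented requirement.

```bash
# Sketch: bring up the local stack in the order described above.
cd local && docker-compose up        # Elasticsearch, DynamoDB, S3-compatible server

# In another terminal, from the repo root:
npm install                          # install dependencies
npm run create-tables                # create DynamoDB tables
cd mock-api && npm i && npm start    # mock API listening on port 4000
```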
@@ -110,8 +112,27 @@ Go to `mock-api` folder and run commands `npm i` and `npm start` to start the mo
 - App is running at `http://localhost:3000`
 - Start mock-api, go to `mock-api` folder, run `npm i` and `npm start`, mock api is running at `http://localhost:4000`
 
+## Production deployment
+
+- TBD
+
 ## Running tests
 
+### Configuration
+
+Test configuration is at `config/test.js`. You don't need to change it.
+The following test parameters can be set in the config file or in env variables:
+
+- ADMIN_TOKEN: admin token
+- COPILOT_TOKEN: copilot token
+- USER_TOKEN: user token
+- EXPIRED_TOKEN: expired token
+- INVALID_TOKEN: invalid token
+- M2M_FULL_ACCESS_TOKEN: M2M full access token
+- M2M_READ_ACCESS_TOKEN: M2M read access token
+- M2M_UPDATE_ACCESS_TOKEN: M2M update (including 'delete') access token
+- S3_ENDPOINT: endpoint of the AWS S3 API, for unit and e2e tests only; defaults to `localhost:9000`
+
 ### Prepare
 - Start Local services.
 - Start Mock API.
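The test parameters listed above can be supplied as environment variables before running the suite. A hedged sketch follows; the token values are placeholders, and `npm test` is an assumed entry point since the exact test command is outside the lines shown in this diff.

```bash
# Sketch: override selected test parameters from config/test.js via env vars.
export S3_ENDPOINT=localhost:9000                 # default per the README
export ADMIN_TOKEN="<admin JWT placeholder>"      # placeholder, not a real token
export M2M_FULL_ACCESS_TOKEN="<m2m JWT placeholder>"
npm test                                          # assumed test entry point, not shown in this diff
```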
@@ -159,4 +180,3 @@ Refer to the verification document `Verification.md`
 
 - In the app-constants.js Topics field, the used topics are using a test topic,
 the suggested ones are commented out, because these topics are not created in TC dev Kafka yet.
-
docs/dev.env

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
+AUTH0_CLIENT_ID=8QovDh27SrDu1XSs68m21A1NBP8isvOt
+AUTH0_CLIENT_SECRET=3QVxxu20QnagdH-McWhVz0WfsQzA1F8taDdGDI4XphgpEYZPcMTF4lX3aeOIeCzh
+AUTH0_URL=https://topcoder-dev.auth0.com/oauth/token
+AUTH0_AUDIENCE=https://m2m.topcoder-dev.com/
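Since the README now points to sample `.env` files under `/docs`, one way to load `docs/dev.env` into the current shell before starting the app is sketched below. The old README used per-variable `export` (or `set` on Windows); the `set -a`/`source` approach here is an equivalent alternative, not something the repo prescribes, and `npm start` is an assumed start command.

```bash
# Sketch: export every variable defined in docs/dev.env into the current shell.
set -a                 # auto-export all variables assigned from here on
source docs/dev.env
set +a
npm start              # assumed start command; the app reads AUTH0_* from the environment
```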

docs/prod.env

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
+AUTH0_CLIENT_ID=
+AUTH0_CLIENT_SECRET=
+AUTH0_URL=
+AUTH0_AUDIENCE=
+AUTH_SECRET=
+BUSAPI_URL=
+AWS_ACCESS_KEY_ID=
+AWS_SECRET_ACCESS_KEY=
+AWS_REGION=
+IS_LOCAL_DB=false
+DYNAMODB_URL=
+ATTACHMENT_S3_BUCKET=
+S3_API_VERSION=
+ES_HOST=
+ES_API_VERSION=
+ES_INDEX=
+ES_TYPE=
+ES_REFRESH=true
+RESOURCES_API_URL=
+GROUPS_API_URL=
+PROJECTS_API_URL=
+COPILOT_RESOURCE_ROLE_IDS=
+M2M_AUDIT_HANDLE=
