
Commit 605accc

Adds readme
1 parent e882455 commit 605accc

3 files changed: +136 -0 lines changed

README.md

Lines changed: 133 additions & 0 deletions
@@ -0,0 +1,133 @@
# EmotionRecServer

A Ktor server that provides emotion recognition. Inference is performed by a trained TensorFlow model that is either loaded locally or hosted on Google Cloud Platform [ML Engine](https://cloud.google.com/ml-engine/).

I wrote an [article covering the specific TensorFlow model I trained](https://medium.com/@jsflo.dev/training-a-tensorflow-model-to-recognize-emotions-a20c3bcd6468).

## Modes of inference

* Local Inference
* GCP Inference (ML Engine)

To choose which inference mode the server uses, set a property in the application configuration found in `/api_ktor/src/main/resources/application.conf`:
```
ktor {
  ...
  application {
    ...
    gcp = false
  }
}
```
Setting *gcp* to *false* configures the server to use local inference; otherwise it uses the model hosted on GCP.
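As a rough illustration, the flag can be read from the Ktor environment config inside the application module. This is only a sketch (assuming Ktor 1.x package names), not the actual lookup in `Server.kt`:

```kotlin
import io.ktor.application.Application

// Sketch only: reads the custom "gcp" flag from application.conf.
// The real Server.kt may resolve this differently.
fun Application.isGcpInference(): Boolean =
    environment.config
        .propertyOrNull("ktor.application.gcp")
        ?.getString()
        ?.toBoolean()
        ?: false
```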
### Local Inference

Local inference uses the TensorFlow Java API to load the trained model.

To use local inference, you first have to point the server to the location of your model.

**TODO**: Currently this is done through a static variable; it should be moved to a config file.

`/api_ktor/src/main/kotlin/com/emotionrec/api/Server.kt`
```kotlin
val LOCAL_INF_MODEL = "./src/main/resources/1"
val LOCAL_INF_TAG = "serve"
```
The first value is the location of the model relative to the project; the second is the tag used when saving the model through the [SavedModelApi](https://medium.com/@jsflo.dev/saving-and-loading-a-tensorflow-model-using-the-savedmodel-api-17645576527).
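For context, loading such a model with the TensorFlow Java API typically looks like the sketch below (illustrative only; the helper function is hypothetical and not part of this project):

```kotlin
import org.tensorflow.SavedModelBundle

// Sketch: loads the SavedModel exported at LOCAL_INF_MODEL with the LOCAL_INF_TAG tag.
// bundle.session() can then be used to run inference on input tensors.
fun loadLocalModel(modelDir: String = "./src/main/resources/1", tag: String = "serve"): SavedModelBundle =
    SavedModelBundle.load(modelDir, tag)
```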
### GCP ML Engine Inference

To use predictions from the model hosted on GCP, you will have to upload a saved model to your GCP account. You will need two things:

* Path to your model
* Credential file
#### Path to your model

`"projects/ml-happy-rec/models/happy_rec_model/versions/v2:predict"`

This path is hardcoded in `RetrofitNetwork.kt` and should be changed to point to your specific model.
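For reference, the path lives in the Retrofit interface shown in the `RetrofitNetwork.kt` diff further down in this commit (the input and result types come from the project's `gcpinference` module):

```kotlin
import retrofit2.Call
import retrofit2.http.Body
import retrofit2.http.POST

interface GcpPredictionApi {
    // TODO: pull the project/model/version path out into config
    @POST("projects/ml-happy-rec/models/happy_rec_model/versions/v2:predict")
    fun getPredictions(@Body predictionsInput: GcpPredictionInput): Call<GcpPredictionResult>
}
```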
#### Credentials

**TODO**: This value should be set through the config file.

`/api_ktor/src/main/kotlin/com/emotionrec/api/Server.kt`
```kotlin
val GOOGLE_CRED_FILE = "happy_rec_cred.json"
```
Authentication is currently handled through the Google credential file provided by GCP.
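A rough sketch of how such a credential file can be turned into scoped credentials with the Google auth library (illustrative only; the helper name is hypothetical, and the scope matches the `GCP_SCOPE` constant in `Server.kt`):

```kotlin
import com.google.auth.oauth2.GoogleCredentials
import java.io.FileInputStream

// Sketch: loads the service-account JSON and scopes it for Cloud Platform access.
fun loadGcpCredentials(credFile: String = "happy_rec_cred.json"): GoogleCredentials =
    FileInputStream(credFile).use { stream ->
        GoogleCredentials.fromStream(stream)
            .createScoped(listOf("https://www.googleapis.com/auth/cloud-platform"))
    }
```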
## Running the Server

To run the api_ktor application: `./gradlew api_ktor:run`

This uses the default settings (such as the port) defined in `application.conf`:
```
ktor {
  deployment {
    port = 8378
    environment = development
    watch = [ emotionrec ]
  }

  application {
    id = emotionrec
    modules = [com.emotionrec.api.ServerKt.main]
    gcp = false
  }
}
```
### Simple Api

**GET** /ping
* Used for sanity checks; returns "pong"

**POST** /prediction
* Accepts [PostPredictionData].
* Expects the [PostPredictionData.image_array]:
  * to be an array of size **2304**
  * to be a string array separated by a delimiter [PostPredictionData.delimiter] (default: [DEFAULT_DELIMITER])
* Responds with [PredictionError] or [PredictionResponse] (see the example response and client sketch below)

**POST** /predictionImage
* Accepts a multipart image file upload
* Responds with [PredictionError] or [PredictionResponse]
#### PredictionResponse

```json
{
  "sortedPredictions": [
    {
      "probability": 0.99999285,
      "emotion": "ANGRY"
    },
    {
      "probability": 0.0000035176417,
      "emotion": "SAD"
    },
    {
      "probability": 0.0000018190486,
      "emotion": "FEAR"
    },
    {
      "probability": 0.0000018007337,
      "emotion": "NEUTRAL"
    },
    {
      "probability": 1.873281e-8,
      "emotion": "HAPPY"
    },
    {
      "probability": 3.4072745e-11,
      "emotion": "DISGUST"
    },
    {
      "probability": 2.9763858e-12,
      "emotion": "SURPRISE"
    }
  ],
  "guessedPrediction": {
    "probability": 0.99999285,
    "emotion": "ANGRY"
  }
}
```
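A minimal client sketch for **POST** /prediction, assuming the default port 8378 and that the wire format is a flat JSON object whose field names match the [PostPredictionData] properties referenced above (the exact body shape is an assumption of this sketch):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // 2304 pixel values (presumably a 48x48 grayscale image) joined by the delimiter.
    val pixels = List(2304) { "0" }.joinToString(",")

    // Assumed body shape; the real PostPredictionData may differ.
    val body = """{"image_array": "$pixels", "delimiter": ","}"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8378/prediction"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // Prints a PredictionResponse (as above) or a PredictionError.
    println(response.body())
}
```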

api_ktor/src/main/kotlin/com/emotionrec/api/Server.kt

Lines changed: 1 addition & 0 deletions
@@ -26,6 +26,7 @@ private val logger = KotlinLogging.logger { }
private const val GOOGLE_CRED_FILE = "happy_rec_cred.json"
private const val GCP_SCOPE = "https://www.googleapis.com/auth/cloud-platform"

+// TODO: This should be done through a config file too
private const val LOCAL_INF_MODEL = "./src/main/resources/1"
private const val LOCAL_INF_TAG = "serve"

infrastructure/src/main/kotlin/com/emotionrec/gcpinference/network/RetrofitNetwork.kt

Lines changed: 2 additions & 0 deletions
@@ -17,6 +17,8 @@ private const val BASE_URL = "https://$API_NAME.googleapis.com/$API_VERSION/"
private const val READ_TIMEOUT_SECONDS = 180L

interface GcpPredictionApi {
+
+    // TODO: Pull the specific path out and read it from config
    @POST("projects/ml-happy-rec/models/happy_rec_model/versions/v2:predict")
    fun getPredictions(@Body predictionsInput: GcpPredictionInput): Call<GcpPredictionResult>
}
