vladd-bit committed Oct 1, 2024
2 parents f60c88b + 365309a commit a452a00
Showing 4 changed files with 60 additions and 22 deletions.
14 changes: 11 additions & 3 deletions README.md
@@ -57,7 +57,7 @@ By default the MedCAT service will be running on port `5000`. MedCAT models will
If you have a GPU and wish to use it, please change the `docker/docker-compose.yml` file: use the `cogstacksystems/medcat-service-gpu:latest` image or change the `build:` directive to build `../Dockerfile_gpu`.

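A sketch of what the relevant part of the compose file might look like after either change (the full GPU compose file added by this commit is shown further below):

```
services:
  nlp-medcat-service-production:
    # option 1: use the prebuilt GPU image instead of the CPU one
    # image: cogstacksystems/medcat-service-gpu:latest
    # option 2: build locally from the GPU Dockerfile
    build:
      context: ../
      dockerfile: "Dockerfile_gpu"
```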
### <span style="color:red">IMPORTANT!</span>
-If you wish to run this Docker service manually, use the docker/docker-compose.yml file and execute `docker-compose up -d` whilst in the `docker` folder.
+If you wish to run this Docker service manually, use the docker/docker-compose.yml file and execute `docker compose up -d` whilst in the `docker` folder.

Alternatively, an example script `./docker/run_example_medmen.sh` is provided to run the Docker container with the MedCAT service. The script downloads an example model (using the `./models/download_medmen.sh` script), uses an example environment configuration, and then builds and starts the service using the provided Docker Compose file; the service <b><span style="color:red">WON'T WORK</span></b> without the model being present.
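For example, a typical invocation from the repository root might be (path taken from the paragraph above):

```
bash ./docker/run_example_medmen.sh
```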

@@ -70,7 +70,7 @@ All models should be mounted from the `models/` folder.
1. cd ./models/
2. bash ./download_medmen.sh
3. cd ../docker/
-4. docker-compose up -d
+4. docker compose up -d
DONE!
```
Or, if you wish to use the above-mentioned script (the sample model is downloaded by the script, so you don't need to do anything):
@@ -104,6 +104,14 @@ and the received result:
}
```

Additional DE-ID query sample (make sure you have a de-id model loaded):

```
curl -XPOST http://localhost:5555/api/process \
  -H 'Content-Type: application/json' \
  -d '{"content":{"text":"Patient Information: Full Name: John Michael Doe \n Gender: Male \n Date of Birth: January 15, 1975 (Age: 49) \n Patient ID: 567890123 \n Address: 1234 Elm Street, Springfield, IL 62701 \n Phone Number: (555) 123-4567 \n Email: [email protected] \n Emergency Contact: Jane Doe (Wife) \n Phone: (555) 987-6543 \n Relationship: Spouse"}}'
```

Make sure the following option is enabled in `envs/env_medcat`: `DEID_MODE=True`.

process_bulk example:

```
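# Illustrative sketch only (the original example is not shown in this view);
# the /api/process_bulk endpoint and the payload shape are assumptions and
# should be checked against the service API.
curl -XPOST http://localhost:5555/api/process_bulk \
  -H 'Content-Type: application/json' \
  -d '{"content": [{"text": "first example document"}, {"text": "second example document"}]}'
```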
@@ -273,7 +281,7 @@ The mode in which annotation entities should be outputted in the JSON response.
Newer versions of MedCAT (1.2+) output entities as a dict, where the entity id is the key and the rest of the data is the value, so for "dict",
the output is
```
{"annotations": [{"0": {"cui": "C0027361", "id": 0,.....}, "1": {"cui": "C001111", "id": 1......}]}
{"annotations": [{"0": {"cui": "C0027361", "id": 0,.....}, "1": {"cui": "C001111", "id": 1......}}]}
```
This setting can be configured in the `./envs/env_medcat` file, using the `ANNOTATIONS_ENTITY_OUTPUT_MODE` variable.
By default, the output of these entities is set to respect the output of the MedCAT package, hence the latter will be used. If you change the above-mentioned env variable, make sure your CogStack-NiFi annotation script is adapted accordingly.
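For illustration, setting the mode in `./envs/env_medcat` is a one-line change (the value `dict` shown here is just one of the modes mentioned above, not necessarily the one you want):

```
ANNOTATIONS_ENTITY_OUTPUT_MODE=dict
```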
40 changes: 40 additions & 0 deletions docker/docker-compose-gpu.yml
@@ -0,0 +1,40 @@
services:
  nlp-medcat-service-production:
    container_name: cogstack-medcat-service-production
    ### Multiple images available:
    ## default image, only CPU support: cogstacksystems/medcat-service:latest
    ## GPU support: cogstacksystems/medcat-service-gpu:latest
    # image: cogstacksystems/medcat-service:latest
    platform: linux
    restart: always
    ## Default dockerfile: ../Dockerfile
    ## GPU dockerfile: ../Dockerfile_gpu
    build:
      context: ../
      dockerfile: "Dockerfile_gpu"
    environment:
      - http_proxy=$HTTP_PROXY
      - https_proxy=$HTTPS_PROXY
      - no_proxy=$no_proxy
    env_file:
      - ../envs/env_app
      - ../envs/env_medcat
    volumes:
      - ../models:/cat/models/:rw
    ports:
      - "5555:5000"
    networks:
      - cognet

    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              device_ids: ['0']
              capabilities: ["gpu", "utility", "compute", "video"]

networks:
  cognet:
    name: cogstack-net
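A likely way to start the service with this new GPU compose file (the command itself is not part of the commit, just standard Docker Compose usage):

```
docker compose -f docker/docker-compose-gpu.yml up -d
```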
11 changes: 0 additions & 11 deletions docker/docker-compose.yml
@@ -1,5 +1,3 @@
-version: "3.6"
-
services:
  nlp-medcat-service-production:
    container_name: cogstack-medcat-service-production
@@ -27,15 +25,6 @@ services:
      - "5555:5000"
    networks:
      - cognet

-    # uncomment this only when you have gpu access
-    #deploy:
-    #  resources:
-    #    reservations:
-    #      devices:
-    #        - driver: nvidia
-    #          device_ids: ['0']
-    #          capabilities: ["gpu", "utility", "compute", "video"]

networks:
  cognet:
17 changes: 9 additions & 8 deletions medcat_service/requirements.txt
@@ -1,10 +1,11 @@
-Flask==3.0.2
-gunicorn==22.0.0
-injector==0.21.0
+Flask==3.0.3
+gunicorn==23.0.0
+injector==0.22.0
flask-injector==0.15.0
-setuptools==65.5.1
-simplejson==3.19.2
-werkzeug==3.0.3
-setuptools_rust==1.9.0
+setuptools==75.1.0
+simplejson==3.19.3
+werkzeug==3.0.4
+setuptools_rust==1.10.1
medcat==1.13.0
-transformers==4.42.0
+# pinned because of issues with de-id models and past models (it will not do any de-id)
+transformers==4.39.1
