This repository has been archived by the owner on Jan 23, 2021. It is now read-only.

CAF-3031: Use released component versions for java-postgres, policy-admin, worker-policy and classification-service. Update links to renamed internal deploy repository. Mark as ready for release.
Michael McAlynn committed Jun 1, 2017
1 parent 75afae4 commit 10f13c2
Showing 7 changed files with 12 additions and 15 deletions.
10 changes: 5 additions & 5 deletions docs/pages/en-us/Adding_a_Worker/Main.md
@@ -25,7 +25,7 @@ The process of adding a worker to the default Data Processing workflow consists
+ Update the Docker Compose overlay file that provides additional debug settings for Data Processing services.
+ Update the Data Processing workflow definition to include the new worker.

-A compose file that deploys only open source components is available [here](https://github.com/CAFDataProcessing/data-processing-service-deploy). If the new worker is open source then it may be added to this file if required. Otherwise it should be added to the Enterprise Edition compose file [here](https://github.hpe.com/caf/data-processing-service-deploy).
+A compose file that deploys only open source components is available [here](https://github.com/CAFDataProcessing/data-processing-service-deploy). If the new worker is open source then it may be added to this file if required. Otherwise it should be added to the Enterprise Edition compose file [here](https://github.hpe.com/caf/data-processing-service-internal-deploy).

For Enterprise Edition, in addition to the default Data Processing workflow, there is also a "minimal" form of the workflow, which may be updated with the new worker if required.

@@ -151,23 +151,23 @@ If your worker has any preconditions that must be satisfied for it to perform it

For Enterprise Edition, in addition to the default Data Processing workflow and the services that it requires, there is also a minimal Data Processing workflow.

-If it is desirable to include the new worker in the minimal workflow as well as the default workflow, then similar changes need to be made in the minimal workflow definitions. The minimal Data Processing definitions can be found [here](https://github.hpe.com/caf/data-processing-service-deploy/tree/develop/minimal).
+If it is desirable to include the new worker in the minimal workflow as well as the default workflow, then similar changes need to be made in the minimal workflow definitions. The minimal Data Processing definitions can be found [here](https://github.hpe.com/caf/data-processing-service-internal-deploy/tree/develop/minimal).

### Add the worker to the minimal workflow's Docker Compose file

-The minimal Data Processing workflow's Docker Compose file can be found [here](https://github.hpe.com/caf/data-processing-service-deploy/blob/develop/minimal/docker-compose.yml).
+The minimal Data Processing workflow's Docker Compose file can be found [here](https://github.hpe.com/caf/data-processing-service-internal-deploy/blob/develop/minimal/docker-compose.yml).

The changes to make in this file are analogous to those made for the default definition, as described in [Adding a Worker to the Docker Compose File](#adding-a-worker-to-the-docker-compose-file).
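For illustration, a new worker's service entry in the compose file follows the same shape as the existing services. The image name, queue names and environment variables below are hypothetical placeholders, not values taken from the actual deploy repository:

```yaml
# Hypothetical service entry for a new worker in docker-compose.yml.
# Image name, queue names and variables are illustrative only.
services:
  exampleworker:
    image: exampleorg/worker-example:1.0.0
    depends_on:
      - rabbitmq
    environment:
      - CAF_WORKER_INPUT_QUEUE=dataprocessing-example-in
      - CAF_WORKER_OUTPUT_QUEUE=dataprocessing-example-out
      - CAF_RABBITMQ_HOST=rabbitmq
```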

### Add the worker to the minimal workflow's debug Docker Compose file

-The minimal Data Processing workflow's debug Docker Compose overlay file can be found [here](https://github.hpe.com/caf/data-processing-service-deploy/blob/develop/minimal/docker-compose.debug.yml).
+The minimal Data Processing workflow's debug Docker Compose overlay file can be found [here](https://github.hpe.com/caf/data-processing-service-internal-deploy/blob/develop/minimal/docker-compose.debug.yml).

The changes to make in this file are analogous to those made for the default definition, as described in [Updating the Debug Docker Compose File](#updating-the-debug-docker-compose-file).
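As a sketch, a debug overlay typically re-declares the service and layers on extra settings such as exposed ports or remote-debug options; the service name, port number and JVM options here are hypothetical:

```yaml
# Hypothetical docker-compose.debug.yml overlay entry; service name,
# port mapping and JVM debug options are illustrative only.
services:
  exampleworker:
    ports:
      - "5005:5005"
    environment:
      - _JAVA_OPTIONS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005
```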

### Add the worker to the minimal workflow's definition

-The minimal Data Processing workflow definition can be found [here](https://github.hpe.com/caf/data-processing-service-deploy/blob/develop/minimal/processing-workflow.json).
+The minimal Data Processing workflow definition can be found [here](https://github.hpe.com/caf/data-processing-service-internal-deploy/blob/develop/minimal/processing-workflow.json).

The changes to make in this file are analogous to those made for the default workflow definition, as described in [Updating the Data Processing Workflow Definition](#updating-the-data-processing-workflow-definition).
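To illustrate the kind of change involved, the workflow definition gains a new action entry routing documents to the worker's queue; the field names and values below are an approximation of typical policy-workflow JSON, not the exact schema:

```json
{
  "name": "ExampleWorkerAction",
  "description": "Hypothetical action routing documents to the new worker",
  "order": 500,
  "settings": {
    "queueName": "dataprocessing-example-in"
  }
}
```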

2 changes: 1 addition & 1 deletion docs/pages/en-us/Getting-Started.md
@@ -19,7 +19,7 @@ The recommended method for deploying data processing components is through a [Do

* The open source Data Processing compose file [here](https://github.com/CAFDataProcessing/data-processing-service-deploy) uses the open source data processing components e.g. Binary Hash, Markup, Boilerplate, Language Detection.

-* A compose file for Enterprise Edition to deploy the full suite of data processing services is available [here](https://github.hpe.com/caf/data-processing-service-deploy). This includes actions that require a license such as Metadata Extraction, Speech to Text, Entity Extraction and more. Details on obtaining a license for Enterprise Edition can be found on the [Overview](./Overview) page.
+* A compose file for Enterprise Edition to deploy the full suite of data processing services is available [here](https://github.hpe.com/caf/data-processing-service-internal-deploy). This includes actions that require a license such as Metadata Extraction, Speech to Text, Entity Extraction and more. Details on obtaining a license for Enterprise Edition can be found on the [Overview](./Overview) page.

### Prerequisites

2 changes: 1 addition & 1 deletion docs/pages/en-us/Text_Extract/Main.md
@@ -34,7 +34,7 @@ The Text Extract processing operation is only available with the Enterprise Edit

### Deployment

-The components required for text extract are included as part of the Enterprise Edition data processing service compose file available [here](https://github.hpe.com/caf/data-processing-service-deploy). Refer to Data Processing Getting Started [here](../Getting-Started) for deploy instructions. If not using the compose file, you will need to deploy the workflow worker (with text extract handler and converter on its classpath), workflow database and extract worker for this action.
+The components required for text extract are included as part of the Enterprise Edition data processing service compose file available [here](https://github.hpe.com/caf/data-processing-service-internal-deploy). Refer to Data Processing Getting Started [here](../Getting-Started) for deploy instructions. If not using the compose file, you will need to deploy the workflow worker (with text extract handler and converter on its classpath), workflow database and extract worker for this action.
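If deploying without the compose file, the three components named above map to three containers. A minimal hypothetical skeleton, with all image names illustrative rather than taken from the real compose file:

```yaml
# Hypothetical skeleton only; real image names and configuration
# come from the Enterprise Edition compose file.
services:
  workflowworker:          # workflow worker with the text extract handler and converter on its classpath
    image: exampleorg/worker-workflow:1.0.0
  corepolicydb-postgres:   # workflow database
    image: postgres:9.6
  extractworker:           # performs the text extract action
    image: exampleorg/worker-extract:1.0.0
```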

### Creating Text Extract Actions

7 changes: 4 additions & 3 deletions pom.xml
@@ -83,9 +83,10 @@

<properties>
<caf.boilerplate.service.version>2.0.1-7</caf.boilerplate.service.version>
-<caf.classification.service.version>1.0.0-SNAPSHOT</caf.classification.service.version>
-<caf.corepolicy.version>1.0.0-SNAPSHOT</caf.corepolicy.version>
-<caf.worker.policy.version>1.0.0-SNAPSHOT</caf.worker.policy.version>
+<caf.classification.service.version>1.0.0-2</caf.classification.service.version>
+<caf.container.policy.admin.name>cafdataprocessing/policy-admin-elasticsearch:1.0.0</caf.container.policy.admin.name>
+<caf.corepolicy.version>1.0.0-4</caf.corepolicy.version>
+<caf.worker.policy.version>1.0.0-3</caf.worker.policy.version>
</properties>

<dependencyManagement>
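The released version properties above are typically consumed via `${...}` property references in the POM's `<dependencyManagement>` section; a hypothetical sketch, with artifact coordinates illustrative only:

```xml
<!-- Hypothetical dependencyManagement entry; the groupId/artifactId are
     illustrative, the ${...} property wiring is the point. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.github.cafdataprocessing</groupId>
      <artifactId>worker-policy-shared</artifactId>
      <version>${caf.worker.policy.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```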
1 change: 0 additions & 1 deletion processing-service-container/pom.xml
@@ -32,7 +32,6 @@

<properties>
<caf.container.processing.service.name>cafinternal/prereleases:processing-service-${project.version}</caf.container.processing.service.name>
-<caf.container.policy.admin.name>cafinternal/prereleases:policy-admin-elasticsearch-${caf.corepolicy.version}</caf.container.policy.admin.name>

<internal.hibernate.connectionstring>jdbc:postgresql://corepolicydb-postgres:5432/&lt;dbname&gt;</internal.hibernate.connectionstring>
<hibernate.user>postgres</hibernate.user>
2 changes: 0 additions & 2 deletions release-notes-1.0.0.md
@@ -1,5 +1,3 @@
-!not-ready-for-release!
-
#### Version Number
${version-number}

3 changes: 1 addition & 2 deletions utils/data-processing-databases-container/pom.xml
@@ -32,8 +32,7 @@

<properties>
<caf.container.databases.name>cafinternal/prereleases:data-processing-databases-${project.version}</caf.container.databases.name>
-<!-- TODO update to open source java-postgres image when available -->
-<caf.container.javapostgres.name>cafinternal/prereleases:java-postgres-1.0.0-SNAPSHOT</caf.container.javapostgres.name>
+<caf.container.javapostgres.name>cafapi/java-postgres:1.11.0-218</caf.container.javapostgres.name>
</properties>

<dependencies>
