We have done this for our PCF 1.7 installation as well.
Below are the notes and learnings from the spike we ran to try it out.
Spring Cloud Task lets you develop and run short-lived microservices using Spring Cloud, and run them locally, in the cloud, or on Spring Cloud Data Flow.
We used the sample Spring Cloud application thumbnail-generator as the short-lived microservice for our use case.
A detailed reference guide for Spring Cloud Task is available in the Spring docs.
Please note that we used the Rabbit binder; for it to work, a RabbitMQ instance is created in the PCF space and bound to the apps via the variable explained below.
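If the RabbitMQ service instance does not exist yet, it can be created in the target space with the cf CLI. A minimal sketch, assuming the marketplace service is labelled p-rabbitmq and offers a plan named standard (both vary by PCF installation):

```shell
# List available marketplace services and plans
cf marketplace

# Create the RabbitMQ service instance; the instance name must match the one
# listed later in SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES
cf create-service p-rabbitmq standard myMessageStream_rabbitmq_server

# Verify the instance exists in the space
cf services
```

These commands require a live Cloud Foundry endpoint and a logged-in cf session.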
Steps and Commands
Develop the Task app and install it into the local Maven repository - mvn clean install
Set the Maven remote repo if you want to use a private one for other dependency downloads:
export MAVEN_REMOTE_REPOSITORIES_REPO1_URL=YOUR_NEXUS_URL
Setup Cloud Foundry destination:
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_URL=<PCF API endpoint>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_ORG=<org name>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_SPACE=<space name>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_DOMAIN=<site domain>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_USERNAME=<user name>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_PASSWORD=<password>
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_SKIP_SSL_VALIDATION=true
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_STAGING_TIMEOUT=300s
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_STARTUP_TIMEOUT=300s
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_BUILDPACK=java_buildpack_offline
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_MEMORY=512
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_MEMORY=512
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_ENABLE_RANDOM_APP_NAME_PREFIX=false
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES=myMessageStream_rabbitmq_server
SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES can be a comma-separated list of the service instances to bind to the stream apps. Similarly, SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_SERVICES lists the services to bind to the task apps.
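For example, to bind the RabbitMQ instance plus a second, hypothetical database instance (the database name here is purely illustrative):

```shell
# Comma-separated service instance names; no spaces around the commas
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES=myMessageStream_rabbitmq_server,my_mysql_db
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_SERVICES=my_mysql_db
```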
Start the Data Flow Server -
Build the Data Flow Shell - mvn clean package
Start the Data Flow Shell - mvn spring-boot:run
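For reference, the local Data Flow Server can be started from its executable jar; the artifact path and version below are placeholders, adjust them to your setup:

```shell
# Start the local Spring Cloud Data Flow Server
# (<version> is a placeholder; use the release matching your environment)
java -jar spring-cloud-dataflow-server-local-<version>.jar
```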
Import the out-of-the-box apps - app import --uri use_the_bitly_url_for_stream-applications-rabbit-maven
To import a customised app list from the local file system, use a file URI: --uri=file://<full path to the properties file on the local system>
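The imported file is a plain properties file mapping <type>.<name> to an app URI, one app per line. A minimal sketch (the coordinates below are illustrative, not a definitive list):

```
source.rabbit=maven://org.springframework.cloud.stream.app:rabbit-source-rabbit:1.0.4.RELEASE
sink.log=maven://org.springframework.cloud.stream.app:log-sink-rabbit:1.0.4.RELEASE
```

Then, from the Data Flow Shell: app import --uri file://<full path to the properties file on the local system>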
Register apps -
app register --name task-processor --type processor --uri maven://org.springframework.cloud.stream.app:tasklaunchrequest-transform-processor-rabbit:jar:1.1.0.BUILD-SNAPSHOT
app register --name task-launcher-local --type sink --uri maven://org.springframework.cloud.stream.app:task-launcher-local-sink-rabbit:jar:1.0.4.RELEASE
Create & Deploy Stream -
On Local:
stream create myMessageStream --definition "rabbit --rabbit.queues=cloud-stream-source --rabbit.requeue=true --spring.rabbitmq.host=localhost --spring.rabbitmq.port=5672 --spring.cloud.stream.bindings.output.contentType='text/plain' | task-processor --uri=maven://com.example:thumbnail-generator:0.0.1-SNAPSHOT | task-launcher-local" --deploy
On PCF:
stream create myMessageStream --definition "rabbit --rabbit.queues=cloud-stream-source --rabbit.requeue=true --spring.rabbitmq.host=${vcap.services.p-rabbitmq.credentials.protocols.amqp.host} --spring.rabbitmq.port=${vcap.services.p-rabbitmq.credentials.protocols.amqp.port} --spring.cloud.stream.bindings.output.contentType='text/plain' | task-processor --uri=maven://com.example:thumbnail-generator:0.0.1-SNAPSHOT | task-launcher-local" --deploy
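The ${vcap.services...} placeholders in the PCF definition are resolved by Spring Boot from the VCAP_SERVICES environment variable that Cloud Foundry injects into each bound app. A small sketch of where those values live, using a made-up VCAP_SERVICES payload (the instance name, host, and port are illustrative):

```python
import json

# Illustrative VCAP_SERVICES payload; on CF the platform provides this JSON
# in the VCAP_SERVICES environment variable of every bound app.
vcap_services = json.loads("""
{
  "p-rabbitmq": [
    {
      "name": "myMessageStream_rabbitmq_server",
      "credentials": {
        "protocols": {
          "amqp": {"host": "10.0.0.15", "port": 5672}
        }
      }
    }
  ]
}
""")

# The placeholder vcap.services.p-rabbitmq.credentials.protocols.amqp.host
# walks this structure: service entry -> first instance -> credentials -> ...
amqp = vcap_services["p-rabbitmq"][0]["credentials"]["protocols"]["amqp"]
print(amqp["host"], amqp["port"])
```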
As a result, after a successful deployment to PCF you will find three running applications on CF: the rabbit listener, the task processor, and the task launcher. Together they enable the event-driven microservices architecture.
For IBM MQ as the source, the stream definition would be:
... --definition "jms --jms.destination=<value> --spring.jms.jndi-name=<value> --spring.cloud.stream.bindings.output.contentType='text/plain' ...
Note that all of the above operations can also be performed from the Data Flow Server dashboard.
I hope it helps.