1 vote

System Information

Spring Cloud Data Flow Cloud Foundry: v1.1.0.RELEASE
Pivotal Cloud Foundry: v1.7.12
CF Client (Windows): cf.exe version 6.23.1+a70deb3.2017-01-13
cf-v3-plugin: 0.6.7

Launching the timestamp task app with no parameters fails. The Spring Cloud Data Flow Server logs show the stack trace posted at https://gist.github.com/anonymous/420f3928b7831a11b378fc6792be1ffc.

Running cf v3-apps outputs

name          total_desired_instances
ticktock-ts   0

Then running cf v3-rt ticktock-ts start produces

OK

Running task start on app ticktock-ts...

Tailing logs for app ticktock-ts...

Failed to run task start:
{
  "code": 330002,
  "description": "Feature Disabled: task_creation",
  "error_code": "CF-FeatureDisabled"
}

The task_creation feature flag is disabled, and my PCF admin says it cannot be enabled on the PCF version we run.
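For context, task_creation is a platform-level feature flag, so whether it can be toggled depends on operator access. A guarded sketch of how one would inspect it with the cf CLI (assuming a logged-in session; the snippet is a no-op when cf is not installed):

```shell
# Inspect the task_creation feature flag via the Cloud Controller v2 API.
# Enabling it requires admin rights, hence the commented-out line.
FLAG="task_creation"
if command -v cf >/dev/null 2>&1; then
  cf curl "/v2/config/feature_flags/${FLAG}"   # response includes "enabled": true/false
  # cf enable-feature-flag "${FLAG}"           # admin only
fi
```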

Thank you.


2 Answers

0 votes

Tasks were considered 'experimental' prior to PCF 1.9.x. Since you are running the cf v3 plugin, you are probably already aware of this. Try setting the variable on the SCDF server that allows experimental tasks to run on older PCF versions:

cf set-env foundry-server SPRING_CLOUD_DATAFLOW_FEATURES_EXPERIMENTAL_TASKSENABLED true
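Note that an environment change only takes effect after the app is restaged. A guarded sketch to apply and verify it (assuming the server app is named foundry-server, as in the command above):

```shell
# Restage the SCDF server so the new variable is picked up, then verify it.
# Guarded so the snippet is a no-op when the cf CLI is not installed.
APP="foundry-server"            # SCDF server app name from the command above
if command -v cf >/dev/null 2>&1; then
  cf restage "$APP"             # env changes are only picked up on restage/restart
  cf env "$APP" | grep TASKS    # confirm the variable is present
fi
```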

Also, I got the same error you received with both SCDF 1.1.0.RELEASE and 1.2.0.BUILD-SNAPSHOT (the most recent I can find). In both cases the error only appeared after repeated launch attempts; after unregistering and destroying the task and starting over, it took several attempts before the error returned.

0 votes

We have done this on our PCF 1.7 installation as well.

Below are the notes and lessons learned from the spike we ran to try it out.

Spring Cloud Task lets you develop short-lived microservices with Spring Cloud and run them locally, in the cloud, or on Spring Cloud Data Flow. We used the sample Spring Cloud application thumbnail-generator as the short-lived microservice for our use case.

A detailed reference guide for Spring Cloud Task is available in the Spring docs.

Note that we used the Rabbit binder; for it to work, a RabbitMQ service instance is created in the PCF space and bound to the app using the variable explained below.

Steps and Commands

  • Develop the task app and install it to the local Maven repository: mvn clean install

  • Set the remote Maven repo, if you want to use a private one for other dependency downloads: export MAVEN_REMOTE_REPOSITORIES_REPO1_URL=YOUR_NEXUS_URL

  • Set up the Cloud Foundry deployment target:

    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_URL=PCF API Endpoint
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_ORG=Org Name
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_SPACE=Space Name
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_DOMAIN=Site domain
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_USERNAME=user name
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_PASSWORD=password
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_SKIP_SSL_VALIDATION=true
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_STAGING_TIMEOUT=300s
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_STARTUP_TIMEOUT=300s
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_BUILDPACK=java_buildpack_offline
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_MEMORY=512
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_MEMORY=512
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_ENABLE_RANDOM_APP_NAME_PREFIX=false
    export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES=myMessageStream_rabbitmq_server

    SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES can be a comma-separated list of service instances to bind to the stream apps; SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_SERVICES lists the services to bind to the task apps.

  • Start Dataflow Server -

    • If running Stream app locally:

      1. Build Dataflow Server Local
        • mvn clean package
      2. Start DF Server Local - mvn spring-boot:run
    • If running Stream app on PCF:

      1. Build Dataflow Server CF
        • mvn clean package
      2. Start DF Server CF (skip the Java options if not required) - java -Djavax.net.ssl.trustStore=./keystore.jks -Djavax.net.ssl.trustStorePassword=password -jar target/spring-cloud-dataflow-server-cloudfoundry-1.1.0.BUILD-SNAPSHOT.jar
  • Build Dataflow Shell - mvn clean package

  • Start DF Shell - mvn spring-boot:run

  • Import the out-of-the-box apps - app import --uri use_the_bitly_url_for_stream-applications-rabbit-maven

    To import a customised list from the local file system, use a file URI: --uri=file://<full path to the jar file on local system>

  • Register apps -

    1. app register --name task-processor --type processor --uri maven://org.springframework.cloud.stream.app:tasklaunchrequest-transform-processor-rabbit:jar:1.1.0.BUILD-SNAPSHOT
    2. app register --name task-launcher-local --type sink --uri maven://org.springframework.cloud.stream.app:task-launcher-local-sink-rabbit:jar:1.0.4.RELEASE
  • Create & Deploy Stream -

    1. On Local: stream create myMessageStream --definition "rabbit --rabbit.queues=cloud-stream-source --rabbit.requeue=true --spring.rabbitmq.host=localhost --spring.rabbitmq.port=5672 --spring.cloud.stream.bindings.output.contentType='text/plain' | task-processor --uri=maven://com.example:thumbnail-generator:0.0.1-SNAPSHOT | task-launcher-local" --deploy

    2. On PCF: stream create myMessageStream --definition "rabbit --rabbit.queues=cloud-stream-source --rabbit.requeue=true --spring.rabbitmq.host=${vcap.services.p-rabbitmq.credentials.protocols.amqp.host} --spring.rabbitmq.port=${vcap.services.p-rabbitmq.credentials.protocols.amqp.port} --spring.cloud.stream.bindings.output.contentType='text/plain' | task-processor --uri=maven://com.example:thumbnail-generator:0.0.1-SNAPSHOT | task-launcher-local" --deploy
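As a side note on the *_SERVICES settings mentioned earlier, both accept a comma-separated list of service instance names. A sketch with placeholder instance names (my_mysql_db is hypothetical):

```shell
# Bind multiple service instances by comma-separating them.
# Instance names below are placeholders for this sketch.
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_STREAM_SERVICES=myMessageStream_rabbitmq_server,my_mysql_db
export SPRING_CLOUD_DEPLOYER_CLOUDFOUNDRY_TASK_SERVICES=my_mysql_db
```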

If the deployment to PCF succeeds, you will find three applications running on CF: the rabbit listener (source), the task processor, and the task launcher. Together they enable the event-driven microservice architecture.
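A quick sanity check after deployment; with the random app name prefix disabled, the deployed app names typically include the stream name. Guarded so the snippet is a no-op when the cf CLI is not installed:

```shell
# List the deployed apps and filter for the stream's three applications.
STREAM="myMessageStream"
if command -v cf >/dev/null 2>&1; then
  cf apps | grep "$STREAM"   # expect the rabbit source, task-processor and task-launcher apps
fi
```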

For IBM MQ as the source, the stream definition would be

... --definition "jms --jms.destination=<value> --spring.jms.jndi-name=<value> --spring.cloud.stream.bindings.output.contentType='text/plain' ...

Note that all of the above operations can also be performed from the Dataflow Server Dashboard.

I hope it helps.