3 votes

I am required to parameterize all variables in my Kettle job and its transformation (the jobs will run in AWS, and all parameters are passed in as environment variables).

My connections, paths and various other parameters in the job and its attendant transformation use the ${SOURCE_DB_PASSWORD}, ${OUTPUT_DIRECTORY} style.

When I set these as environment variables in the Data Integration UI, they all work and the job runs successfully in the UI tool. When I run it from a bash script:

#!/bin/sh
export SOURCE_DB_HOST=services.db.dev
export SOURCE_DB_PORT=3306

kitchen.sh -param:SOURCE_DB_PORT=$SOURCE_DB_PORT -param:SOURCE_DB_HOST=$SOURCE_DB_HOST -file MY_JOB.kjb

the job and the transformation it calls do not pick up the variables. The error is:

Cannot load connection class because of underlying exception: 'java.lang.NumberFormatException: For input string: "${SOURCE_DB_PORT}"'

So, without using JNDI files or kettle.properties, I need some way of mapping environment variables to parameters/variables inside PDI jobs and transformations.

[PDI version 8.1 on Mac OS X 10.13]

Windows or Linux? – AlainD
sorry, issue appears on Mac OS X 10.13, Java 8 – Serge Merzliakov

5 Answers

3 votes

Using the -param:SOURCE_DB_HOST=value syntax on the command line and the ${SOURCE_DB_HOST} syntax inside jobs and transformations is the correct way to go.

What you need to do in transformations (but apparently not in jobs) is to add the parameters explicitly to the transformation properties (Ctrl-T, or Cmd-T on a Mac, in the transformation workspace). Screenshot attached.

Running the job or the transformation directly from a shell script will then work.

[screenshot: Transformation properties dialog, Parameters tab]
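For example, assuming SOURCE_DB_HOST and SOURCE_DB_PORT are declared under the transformation's Parameters tab (and passed down from the job), an invocation along the lines of the question's own script should then pick them up. This is only a sketch; the values, file name and paths are the question's, not a tested setup:

#!/bin/sh
# Sketch only: parameter names and job file come from the question.
export SOURCE_DB_HOST=services.db.dev
export SOURCE_DB_PORT=3306

# The -param names must match the parameters declared in the job
# and in the transformation properties (Ctrl-T / Cmd-T).
./kitchen.sh -file=MY_JOB.kjb \
    -param:SOURCE_DB_HOST="$SOURCE_DB_HOST" \
    -param:SOURCE_DB_PORT="$SOURCE_DB_PORT"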

1 vote

Maybe the issue is not caused by the parameters but by the Table input step. Can you check whether the Replace variables in script option is checked?

[screenshot: Table input step options]

0 votes

With the Get variables step you can turn the parameters into fields (columns) before the Table input step; don't forget to specify Insert data from step in the Table input.

As the issue comes from a number format error, you may want to see what value PDI actually has for the variable by using a Write to log step.

Tell me if the NumberFormatException persists.

0 votes

This is a typical error when Pentaho expects an integer for the port number but receives a string. Check whether there is any space after the port number in your shell script.
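For example (a quick illustration reusing the names from the question), a stray trailing space inside a quoted value ends up in the parameter and then fails the integer parse:

#!/bin/sh
export SOURCE_DB_PORT="3306 "    # trailing space: "3306 " cannot be parsed as an integer
export SOURCE_DB_PORT=3306       # correct
echo "[${SOURCE_DB_PORT}]"       # brackets make any stray whitespace visible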

0 votes

I've just been trying to get this stuff working. The -param (or /param) command line flags only seem to work if you specify the parameters in the job spec, and then explicitly pass them down to any transformations that need them.

To pass in system properties that are universally accessible within the job, I've used:

set "OPT=-Dname1=value -Dname2=value"

before calling kitchen. This puts the -D flags into the Java options. Hopefully this will work with database specs.
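The set line above is Windows batch syntax; on the asker's Mac the rough equivalent is an export before calling kitchen.sh. Whether the launch script honours a pre-set OPT (some PDI versions read extra Java options from PENTAHO_DI_JAVA_OPTIONS instead), and the property names, are assumptions here, so treat this as an untested sketch:

#!/bin/sh
# Untested sketch: name1/name2 are placeholder property names.
# Depending on the PDI version, PENTAHO_DI_JAVA_OPTIONS may be the variable to use instead of OPT.
export OPT="-Dname1=value1 -Dname2=value2"
./kitchen.sh -file=MY_JOB.kjb

Inside the job the properties should then resolve with the usual ${name1} syntax, since Kettle falls back to Java system properties when resolving variables.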

Of course you can put them in kettle.properties, but if you want to run against different environments that gets messy.