
I have a Pentaho BI server running on a CentOS machine (osapp), and the MySQL database with the repository, staging area and data warehouse on another CentOS machine (osdb). I run Spoon from my Windows PC. I created a transformation that runs a Pentaho Report, exports it to PDF and sends it by email, automatically every 8 hours. If I run the transformation in Spoon, it works as expected, but running it through the Carte server I set up on the BI server (osapp) gives an error in the Pentaho Reporting Output step, with no log, so I can't see the details of the error. If I run it with pan.sh, it's the same, but I get output like this:

sh pan.sh -rep=mitbirep -user=admin -pass=admin -trans="HighRisk Containers" -level=basic
INFO  02-01 18:56:06,187 - Pan - Logging is at level : Basic logging
INFO  02-01 18:56:06,188 - Pan - Start of run.
INFO  02-01 18:56:06,284 - RepositoriesMeta - Reading repositories XML file: /opt/pentaho/data-integration/repositories.xml
INFO  02-01 18:56:08,144 - HighRisk Containers - Dispatching started for transformation [HighRisk Containers]
INFO  02-01 18:56:08,241 - Data Grid - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO  02-01 18:56:08,594 - LibBase 1.2.2.13667 started.
INFO  02-01 18:56:08,840 - LibLoader 1.2.2.13667 started.
INFO  02-01 18:56:08,946 - LibFonts 1.2.3.13670 started.
INFO  02-01 18:56:09,008 - LibSerializer 1.2.2.13667 started.
INFO  02-01 18:56:09,169 - LibFormula 1.2.3.13667 started.
INFO  02-01 18:56:09,321 - LibFormat 1.2.3.13667 started.
INFO  02-01 18:56:09,384 - LibXML 1.2.2.13670 started.
INFO  02-01 18:56:09,439 - LibRepository 1.2.3.14252 started.
INFO  02-01 18:56:09,499 - LibDocBundle 1.2.3.14252 started.
WARN  02-01 18:56:10,312 - No configuration found. Configuring ehcache from ehcache-failsafe.xml  found in the classpath: jar:file:/opt/pentaho/data-integration/libext/reporting/ehcache-core-2.0.1.jar!/ehcache-failsafe.xml
INFO  02-01 18:56:12,674 - Completed font registration.
INFO  02-01 18:56:12,735 - Completed font registration.
INFO  02-01 18:56:12,797 - Completed font registration.
INFO  02-01 18:56:17,983 - Pentaho Reporting Engine Classic 3.8.2-GA.14313 started.
WARN  02-01 18:56:18,673 - Unknown tag <http://jfreereport.sourceforge.net/namespaces/datasources/sql:query-definitions>: Start to ignore this element and all of its childs.  [Location: Line=16 Column=27]
WARN  02-01 18:56:20,149 - No configuration found. Configuring ehcache from ehcache-failsafe.xml  found in the classpath: jar:file:/opt/pentaho/data-integration/libext/reporting/ehcache-core-2.0.1.jar!/ehcache-failsafe.xml
WARN  02-01 18:56:20,208 - Creating a new instance of CacheManager using the diskStorePath "/tmp" which is already used by an existing CacheManager.
The source of the configuration was net.sf.ehcache.config.generator.ConfigurationSource$DefaultConfigurationSource@463684dc.
The diskStore path for this CacheManager will be set to /tmp/ehcache_auto_created_1357170980208.
To avoid this warning consider using the CacheManager factory methods to create a singleton CacheManager or specifying a separate ehcache configuration (ehcache.xml) for each CacheManager instance.
INFO  02-01 18:56:20,326 - RepositoriesMeta - Reading repositories XML file: /opt/pentaho/data-integration/repositories.xml
ERROR 02-01 18:56:20,337 - 2132298740: Report processing failed.
ERROR 02-01 18:56:20,340 - Writing PDF failed.
org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Unable to load Kettle-Transformation
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:145)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQuery(AbstractKettleTransformationProducer.java:313)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:134)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:135)
    at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:421)
    at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryStatic(CachingDataFactory.java:183)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:130)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:85)
    at org.pentaho.reporting.engine.classic.core.states.datarow.ReportDataRow.createDataRow(ReportDataRow.java:97)
    at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQuery(DefaultFlowController.java:188)
    at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.initializeForMasterReport(ProcessState.java:287)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:469)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1522)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:122)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:70)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:154)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:219)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:108)
    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
    at java.lang.Thread.run(Thread.java:722)
ParentException:
org.pentaho.reporting.libraries.resourceloader.ResourceKeyCreationException: Unable to create key: No loader was able to handle the given key data: C:\Users\mocando.PANAMA\Documents\Infrastructure\ktrs\High Risk Containers.ktr
    at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.createKey(DefaultResourceManagerBackend.java:76)
    at org.pentaho.reporting.libraries.docbundle.BundleResourceManagerBackend.createKey(BundleResourceManagerBackend.java:88)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:146)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createKey(ResourceManager.java:132)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.createKey(KettleTransFromFileProducer.java:89)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleTransFromFileProducer.loadTransformation(KettleTransFromFileProducer.java:124)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQuery(AbstractKettleTransformationProducer.java:313)
    at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:134)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:135)
    at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:421)
    at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryStatic(CachingDataFactory.java:183)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:130)
    at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:85)
    at org.pentaho.reporting.engine.classic.core.states.datarow.ReportDataRow.createDataRow(ReportDataRow.java:97)
    at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQuery(DefaultFlowController.java:188)
    at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.initializeForMasterReport(ProcessState.java:287)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:469)
    at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1522)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:122)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:70)
    at org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil.createPDF(PdfReportUtil.java:154)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:219)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:108)
    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
    at java.lang.Thread.run(Thread.java:722)
log4j:WARN No appenders could be found for logger (org.pentaho.di).
log4j:WARN Please initialize the log4j system properly.
[root@mitosapp data-integration]#
[root@mitosapp data-integration]# sh pan.sh -rep=mitbirep -user=admin -pass=admin -trans="CMD Schedule" -level=basic
INFO  02-01 19:07:42,067 - Pan - Logging is at level : Basic logging
INFO  02-01 19:07:42,069 - Pan - Start of run.
INFO  02-01 19:07:42,196 - RepositoriesMeta - Reading repositories XML file: /opt/pentaho/data-integration/repositories.xml
INFO  02-01 19:07:44,686 - CMD Schedule - Dispatching started for transformation [CMD Schedule]
INFO  02-01 19:07:44,792 - Data Grid - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO  02-01 19:07:45,275 - LibBase 1.2.2.13667 started.
INFO  02-01 19:07:45,606 - LibLoader 1.2.2.13667 started.
INFO  02-01 19:07:45,730 - LibFonts 1.2.3.13670 started.
INFO  02-01 19:07:45,811 - LibSerializer 1.2.2.13667 started.
INFO  02-01 19:07:46,019 - LibFormula 1.2.3.13667 started.
INFO  02-01 19:07:46,221 - LibFormat 1.2.3.13667 started.
INFO  02-01 19:07:46,311 - LibXML 1.2.2.13670 started.
INFO  02-01 19:07:46,405 - LibRepository 1.2.3.14252 started.
INFO  02-01 19:07:46,494 - LibDocBundle 1.2.3.14252 started.
WARN  02-01 19:07:47,561 - No configuration found. Configuring ehcache from ehcache-failsafe.xml  found in the classpath: jar:file:/opt/pentaho/data-integration/libext/reporting/ehcache-core-2.0.1.jar!/ehcache-failsafe.xml
INFO  02-01 19:07:51,016 - Completed font registration.
INFO  02-01 19:07:51,114 - Completed font registration.
INFO  02-01 19:07:51,234 - Completed font registration.
INFO  02-01 19:07:53,342 - New update(s) found: 2.4.7 [http://www.terracotta.org/confluence/display/release/Release+Notes+Ehcache+Core+2.4]. Please check http://ehcache.org for the latest version.
INFO  02-01 19:07:59,056 - Pentaho Reporting Engine Classic 3.8.2-GA.14313 started.
ERROR 02-01 19:07:59,068 - Pentaho Reporting Output - Unexpected error
ERROR 02-01 19:07:59,079 - Pentaho Reporting Output - org.pentaho.di.core.exception.KettleException:
There was an unexpected error processing report '/var/lib/tomcat6/webapps/ROOT/CMD_schedule.prpt' to produce file '/var/lib/tomcat6/webapps/ROOT/cmd_schedule.pdf' with processor: PDF.
Failed to open URL connection

    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:239)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:108)
    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
    at java.lang.Thread.run(Thread.java:722)
Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceLoadingException: Failed to open URL connection
    at org.pentaho.reporting.libraries.resourceloader.loader.URLResourceData.getResourceAsStream(URLResourceData.java:128)
    at org.pentaho.reporting.libraries.resourceloader.loader.AbstractResourceData.getResource(AbstractResourceData.java:101)
    at org.pentaho.reporting.libraries.docbundle.bundleloader.ZipResourceBundleLoader.loadBundle(ZipResourceBundleLoader.java:81)
    at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.loadResourceBundle(DefaultResourceManagerBackend.java:389)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.loadResourceBundle(ResourceManager.java:262)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.load(ResourceManager.java:284)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:405)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:370)
    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createDirectly(ResourceManager.java:207)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.loadMasterReport(PentahoReportingOutput.java:144)
    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:160)
    ... 3 more

INFO  02-01 19:07:59,081 - Pentaho Reporting Output - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
INFO  02-01 19:07:59,081 - CMD Schedule - CMD Schedule
INFO  02-01 19:07:59,081 - CMD Schedule - CMD Schedule
INFO  02-01 19:07:59,162 - Pan - Finished!
INFO  02-01 19:07:59,163 - Pan - Start=2013/01/02 19:07:42.070, Stop=2013/01/02 19:07:59.162
INFO  02-01 19:07:59,163 - Pan - Processing ended after 17 seconds.

Any ideas on how I can accomplish this?

Thanks

UPDATE: I modified the data source in the report, and now it returns no data. The report is empty, and I get this error from pan.sh: Deprecated behavior: None of the data-factories was able to handle the query 'HighRisk Containers'. Returning empty tablemodel instead of failing hard.

1 Answer


The problem appears to be that Pan is not finding your transformation in the repository you've pointed it to. I'm not that familiar with this mechanism, but if you just need to get something working, you can instead launch Pan against a Kettle transformation file on disk, like so:

pan.sh -file='/path/to/my_transform.ktr'

This is normally how I execute transforms and it works perfectly.
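If that gets the transformation running from the command line, the "every 8 hours" part can be handled with cron on osapp. A minimal sketch, assuming the .ktr file exists on the server and that /tmp is an acceptable place for the log (both paths are placeholders to adapt; depending on your PDI version you may also need to cd into /opt/pentaho/data-integration before calling pan.sh):

# example crontab entry on osapp: run at minute 0 every 8 hours, append output to a log
# the .ktr path and log path below are placeholders
0 */8 * * * sh /opt/pentaho/data-integration/pan.sh -file=/path/to/my_transform.ktr -level=basic >> /tmp/highrisk_containers.log 2>&1

One more thing worth noting: whatever path you pass to -file has to exist on osapp itself. The first stack trace above shows the report trying to load C:\Users\mocando.PANAMA\Documents\Infrastructure\ktrs\High Risk Containers.ktr, a Windows path the Linux server cannot resolve, which fits with running Pan against a copy of the file placed on the server.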