
Posted by 乔帮主 on 2014-10-29 17:07:24

Deploying and Manually Running a Kettle Job on Linux

login as: hadoop
hadoop@192.168.0.2's password:
Last login: Sat Mar 2 17:48:03 2013 from 192.168.0.13
$ crontab -l
31 19 * * * /bin/sh /pentaho/Plan/a.sh
15 23 * * * /bin/sh /pentaho/Plan/b.sh
10 7 * * * /bin/sh /pentaho/Plan/c.sh
30 16 * * * /bin/sh /pentaho/Plan/d.sh
0 13 * * * /bin/sh /pentaho/Plan/e.sh
0 13 * * 7 /bin/sh /pentaho/Plan/week1.sh
10 13 * * 1 /bin/sh /pentaho/Plan/week2.sh
0 12 1 * * /bin/sh /pentaho/Plan/month1.sh
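For reference, each crontab line above has five time fields, in the order minute, hour, day-of-month, month, day-of-week (0 or 7 both mean Sunday), followed by the command, so "0 13 * * 7" runs week1.sh at 13:00 every Sunday. A minimal sketch of appending a new entry non-interactively follows; the script path /pentaho/Plan/f.sh is hypothetical, and the crontab command itself is left commented out so the sketch changes nothing:

```shell
#!/bin/sh
# Cron field order: minute hour day-of-month month day-of-week.
# "0 13 * * 7" therefore means 13:00 every Sunday.
# Hypothetical new job: run /pentaho/Plan/f.sh every day at 02:00.
entry="0 2 * * * /bin/sh /pentaho/Plan/f.sh"
# To install it without opening an interactive editor (commented out
# here so this sketch does not touch the real crontab):
# ( crontab -l; echo "$entry" ) | crontab -
echo "$entry"
```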
The shell script that cron runs on Linux has the following format:

------------------------------------------------------------------------------------------------------------
#!/bin/sh
export JAVA_HOME=/usr/local/java
export HADOOP_HOME=/hadoop/hadoop
cd /pentaho/pentaho/data-integration
./kitchen.sh -rep 192.168.0.13.PDI_Repository -user username -pass password -dir /<directory name> -job <job name> -level=basic >> /pentaho/Plan/job_loaddw.log
------------------------------------------------------------------------------------------------------------

Explanation: "-level=basic >> /pentaho/Plan/<job name>.log" sets the log level and redirects the output to the log file.

To run a Kettle job by hand, the commands are as follows:

login as: hadoop
hadoop@192.168.0.2's password:
Last login: Thu Apr 5 11:09:24 2012 from 192.168.0.13
$ cd ..
$ cd ..
$ ls
: esvn lib mnt pentaho srv
bin etc lib64 mysql proc sys
boot etl lost+found net root tmp
dev hadoop media netxtreme2-6.2.23-1.src.rpm sbin usr
eclipse home misc opt selinux var
$ cd pentaho
$ cd pentaho
$ cd data*
$ ./kitchen.sh -rep <IP address>.PDI_Repository -user username -pass password -dir /<job path> -job <job name>
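kitchen.sh reports success or failure through its exit status (0 means success), so a manual run can be checked right after it returns. A minimal sketch follows; check_status is a hypothetical helper, and "true" stands in for the real ./kitchen.sh invocation:

```shell
#!/bin/sh
# Sketch: inspect $? after kitchen.sh returns to see whether the job
# succeeded. In practice, call check_status with $? immediately after
# the real ./kitchen.sh command.
check_status() {
    if [ "$1" -eq 0 ]; then
        echo "kettle job finished successfully"
    else
        echo "kettle job failed with exit code $1, check the log"
    fi
}
# "true" stands in for a successful ./kitchen.sh invocation here:
true
check_status $?
```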
