Installing and Using Azkaban 3.x Plugins

Plugin Installation Prerequisites

1. Plugin download: github.com/azkaban/azk…

2. Extract the archive and copy the job types into Azkaban's plugin directory


[hadoop@xinxingdata001 software]$ tar -zxvf azkaban-plugins-3.0.0.tar.gz 

[hadoop@xinxingdata001 software]$ cd azkaban-plugins-3.0.0/plugins/jobtype/jobtypes

[hadoop@xinxingdata001 jobtypes]$ cp * ~/app/azkaban/plugins/jobtypes/
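
A quick sanity check after the copy (the exact set of job type folders depends on the plugin release you downloaded, but it should at least contain common.properties, commonprivate.properties, and one directory per job type such as hive and spark):

[hadoop@xinxingdata001 jobtypes]$ ls ~/app/azkaban/plugins/jobtypes/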

Editing the Shared Plugin Configuration

Official documentation: azkaban.readthedocs.io/en/latest/p…

1. Configure commonprivate.properties

[hadoop@xinxingdata001 jobtypes]$ cd ~/app/azkaban/plugins/jobtypes/

[hadoop@xinxingdata001 jobtypes]$ vim commonprivate.properties

## Hadoop security manager version; I am running Hadoop 2.x, so switch to the _H_2_0 class
#hadoop.security.manager.class=azkaban.security.HadoopSecurityManager_H_1_0
hadoop.security.manager.class=azkaban.security.HadoopSecurityManager_H_2_0

hadoop.share=/home/hadoop/app/hadoop/share/hadoop

# Required; without this line Azkaban reports an error, because it replaces hadoop.classpath with the jobtype.classpath from each private.properties
hadoop.classpath=

# Global classpath
# Put the Hadoop jars and configuration paths here (almost every plugin needs Hadoop)
jobtype.global.classpath=${hadoop.home}/etc/hadoop/*,${hadoop.share}/common/*,${hadoop.share}/common/lib/*,${hadoop.share}/hdfs/*,${hadoop.share}/hdfs/lib/*,${hadoop.share}/mapreduce/*,${hadoop.share}/mapreduce/lib/*,${hadoop.share}/yarn/*,${hadoop.share}/yarn/lib/*

# Home directories of the installed software
hadoop.home=/home/hadoop/app/hadoop
hive.home=/home/hadoop/app/hive
spark.home=/home/hadoop/app/spark

# Disable the execute-as-user security option
execute.as.user=false
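
If you do want secure execution later, my understanding is that setting execute.as.user=true makes jobs run as the proxy user and additionally requires Azkaban's native execute-as-user wrapper, roughly like this (the path is only illustrative):

execute.as.user=true
# directory containing the compiled execute-as-user binary (example path)
azkaban.native.lib=/home/hadoop/app/azkaban/lib/native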

2. Configure common.properties

[hadoop@xinxingdata001 jobtypes]$ vim common.properties

# The xxx.home values set here are passed through to the user programs (Spark, Hive)
hadoop.home=/home/hadoop/app/hadoop
hive.home=/home/hadoop/app/hive
spark.home=/home/hadoop/app/spark

Editing the Extra Configuration of an Individual Job Type (HiveType as an Example)

The Azkaban authors ship the Command job type built in, so it works without installing any plugin.

Official documentation: azkaban.readthedocs.io/en/latest/j…
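
For comparison, a job using the built-in Command type needs nothing beyond a type and a command line (file name and echoed text are only illustrative):

# echo-test.job (illustrative name)
type=command
command=echo "Hello Azkaban"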

[hadoop@xinxingdata001 jobtypes]$ cd hive/

Note: anything already covered by the shared (global) configuration does not have to be repeated here; add only what HiveType itself needs.

1. Configure private.properties

#jobtype.classpath=${hadoop.home}/conf,${hadoop.home}/lib/*,${hive.home}/lib/*,${hive.home}/conf,${hive.aux.jar.path}
# Hadoop's lib and conf directories are already on the global classpath
jobtype.classpath=${hive.home}/lib/*,${hive.home}/conf,${hive.aux.jar.path}

jobtype.class=azkaban.jobtype.HadoopHiveJob

#hive.aux.jar.path=${hive.home}/aux/lib
hive.aux.jar.path=${hive.home}/auxlib

2. Configure plugin.properties

#hive.aux.jars.path=${hive.home}/aux/lib
hive.aux.jars.path=${hive.home}/auxlib

hive.jvm.args=-Dhive.querylog.location=. -Dhive.exec.scratchdir=/tmp/hive-${user.to.proxy} -Dhive.aux.jars.path=${hive.aux.jars.path}

jobtype.jvm.args=${hive.jvm.args}
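
With these in place, a minimal Hive job can be defined roughly as follows (a sketch based on the azkaban-plugins Hive job type; the proxy user and query are placeholders):

# hive-test.job (illustrative name)
type=hive
user.to.proxy=hadoop
azk.hive.action=execute.query
hive.query=show databases;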

Testing

Update the job type configuration

spark

[hadoop@xinxingdata001 ~]$ cd /home/hadoop/app/azkaban/plugins/jobtypes/spark
[hadoop@xinxingdata001 spark]$ vim private.properties 
jobtype.class=azkaban.jobtype.HadoopSparkJob

HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop

jobtype.classpath=/home/hadoop/app/spark/conf/*,/home/hadoop/app/spark/jars/*
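
After updating a job type, the executor has to reload the plugins; the simplest way is to restart the exec server (the script names assume the standard azkaban-exec-server distribution):

[hadoop@xinxingdata001 azkaban]$ bin/shutdown-exec.sh
[hadoop@xinxingdata001 azkaban]$ bin/start-exec.sh

A Spark test job can then be defined roughly as follows (a sketch based on the azkaban-plugins Spark job type; the jar, class, and resource settings are placeholders):

# spark-test.job (illustrative name)
type=spark
master=yarn-cluster
execution-jar=lib/spark-examples.jar
class=org.apache.spark.examples.SparkPi
num-executors=2
executor-memory=1g
params=100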