Adding two new configuration parameters to the Hive data source


Implementing a small business feature as a way to study design patterns in Java code.

Original interface

// Controller entry point: parse the raw JSON into a typed DTO by data source type,
// then delegate creation to the service layer.
public Result createDataSource(@ApiIgnore @RequestAttribute(value = Constants.SESSION_USER) User loginUser,
                               @ApiParam(name = "dataSourceParam", value = "DATA_SOURCE_PARAM", required = true) @RequestBody String jsonStr) {
    BaseDataSourceParamDTO dataSourceParam = DataSourceUtils.buildDatasourceParam(jsonStr);
    return dataSourceService.createDataSource(loginUser, dataSourceParam);
}

Frontend request payload

{
	"type": "HIVE",
	"label": "HIVE",
	"name": "bbb",
	"note": "1111",
	"host": "10.11.14.30",
	"port": 10000,
	"principal": "",
	"javaSecurityKrb5Conf": "",
	"loginUserKeytabUsername": "",
	"loginUserKeytabPath": "",
	"mode": "",
	"userName": "",
	"password": "",
	"database": "default",
	"connectType": "",
	"other": null,
	"testFlag": -1,
	"endpoint": "",
	"MSIClientId": "",
	"dbUser": "",
	"authType": "0"
}

Requirement: add two new parameters, defaultFS and hadoopConfig.

Tracing the original interface logic

First, the raw JSON is parsed into a parameter DTO according to the data source type:

BaseDataSourceParamDTO dataSourceParam = DataSourceUtils.buildDatasourceParam(jsonStr);
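A minimal sketch of what the type-based parsing might look like, assuming buildDatasourceParam reads the "type" field from the JSON and dispatches to a matching DTO parser. All class and method names here other than those quoted above are hypothetical stand-ins, not the real project's implementation.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class DatasourceParamDispatch {

    // Stand-in for BaseDataSourceParamDTO.
    static class BaseParamDTO {
        final String type;
        BaseParamDTO(String type) { this.type = type; }
    }

    // Stand-in for HiveDataSourceParamDTO.
    static class HiveParamDTO extends BaseParamDTO {
        HiveParamDTO() { super("HIVE"); }
    }

    // Registry mapping the JSON "type" value to a DTO parser.
    static final Map<String, Function<String, BaseParamDTO>> PARSERS = new HashMap<>();
    static {
        PARSERS.put("HIVE", json -> new HiveParamDTO());
    }

    // Look up the parser for the given type and apply it to the raw JSON.
    static BaseParamDTO buildDatasourceParam(String type, String jsonStr) {
        Function<String, BaseParamDTO> parser = PARSERS.get(type);
        if (parser == null) {
            throw new IllegalArgumentException("unsupported datasource type: " + type);
        }
        return parser.apply(jsonStr);
    }

    public static void main(String[] args) {
        BaseParamDTO dto = buildDatasourceParam("HIVE", "{}");
        System.out.println(dto.type); // prints HIVE
    }
}
```

The registry keeps the controller free of per-type branching: adding a new data source type only means registering one more parser.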

Then the connectionParam is built from the DTO:

ConnectionParam connectionParam = DataSourceUtils.buildConnectionParams(dataSourceParam);
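A rough sketch of this step, assuming a processor turns the parsed DTO into a connection parameter object holding the JDBC address. The trimmed-down classes below are illustrative stand-ins for the real HiveDataSourceParamDTO and HiveConnectionParam.

```java
public class ConnectionParamSketch {

    // Trimmed-down DTO: only the fields needed to build the address.
    static class HiveDataSourceParamDTO {
        String host;
        int port;
        String database;
    }

    // Trimmed-down connection param holding the derived JDBC URL.
    static class HiveConnectionParam {
        String jdbcUrl;
    }

    // Plays the role of HiveDataSourceProcessor.createConnectionParams:
    // derive connection details from the user-facing DTO.
    static HiveConnectionParam createConnectionParams(HiveDataSourceParamDTO dto) {
        HiveConnectionParam param = new HiveConnectionParam();
        param.jdbcUrl = String.format("jdbc:hive2://%s:%d/%s", dto.host, dto.port, dto.database);
        return param;
    }

    public static void main(String[] args) {
        HiveDataSourceParamDTO dto = new HiveDataSourceParamDTO();
        dto.host = "10.11.14.30";
        dto.port = 10000;
        dto.database = "default";
        System.out.println(createConnectionParams(dto).jdbcUrl);
        // prints jdbc:hive2://10.11.14.30:10000/default
    }
}
```

This is why the new fields must be added on both sides: the DTO carries what the frontend sends, and the connection param carries what the runtime actually uses.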

Implementation

1) Modify BaseHDFSConnectionParam, BaseHDFSDataSourceParamDTO, HiveConnectionParam, HiveDataSourceParamDTO, and HiveDataSourceProcessor

Add the new fields:

protected String defaultFS;
protected String hadoopConfig;

Add the implementation logic:

hiveConnectionParam.setDefaultFS(hiveParam.getDefaultFS());
hiveConnectionParam.setHadoopConfig(hiveParam.getHadoopConfig());
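The two setter calls above can be exercised end to end with a small sketch. The stripped-down classes below stand in for the real DTO and connection param; only the two new fields and their accessors are shown, and the example values are made up.

```java
public class NewFieldsSketch {

    // Stand-in for HiveDataSourceParamDTO with only the two new fields.
    static class HiveDataSourceParamDTO {
        protected String defaultFS;
        protected String hadoopConfig;
        public String getDefaultFS() { return defaultFS; }
        public void setDefaultFS(String defaultFS) { this.defaultFS = defaultFS; }
        public String getHadoopConfig() { return hadoopConfig; }
        public void setHadoopConfig(String hadoopConfig) { this.hadoopConfig = hadoopConfig; }
    }

    // Stand-in for HiveConnectionParam with the same two fields.
    static class HiveConnectionParam {
        protected String defaultFS;
        protected String hadoopConfig;
        public String getDefaultFS() { return defaultFS; }
        public void setDefaultFS(String defaultFS) { this.defaultFS = defaultFS; }
        public String getHadoopConfig() { return hadoopConfig; }
        public void setHadoopConfig(String hadoopConfig) { this.hadoopConfig = hadoopConfig; }
    }

    // The same two mapping lines as in the article, wrapped for reuse.
    static HiveConnectionParam copyNewFields(HiveDataSourceParamDTO hiveParam) {
        HiveConnectionParam hiveConnectionParam = new HiveConnectionParam();
        hiveConnectionParam.setDefaultFS(hiveParam.getDefaultFS());
        hiveConnectionParam.setHadoopConfig(hiveParam.getHadoopConfig());
        return hiveConnectionParam;
    }

    public static void main(String[] args) {
        HiveDataSourceParamDTO hiveParam = new HiveDataSourceParamDTO();
        hiveParam.setDefaultFS("hdfs://nameservice1");                      // example value
        hiveParam.setHadoopConfig("{\"dfs.nameservices\":\"nameservice1\"}"); // example value

        HiveConnectionParam hiveConnectionParam = copyNewFields(hiveParam);
        System.out.println(hiveConnectionParam.getDefaultFS());
        // prints hdfs://nameservice1
    }
}
```

Since the fields live in the base classes (BaseHDFSDataSourceParamDTO and BaseHDFSConnectionParam), any other HDFS-backed data source type inherits them as well.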