* Add hive cli task plugin
* Add docs for hive-cli task plugin
* Fix hive-cli task related front-end nits and checkstyle
* Fix exception handling logic

Tag: 3.1.0-release
Eric Gao committed 2 years ago via GitHub
86 changed files with 1413 additions and 396 deletions
@@ -0,0 +1,56 @@

# Hive CLI

## Overview

Use the `Hive CLI Task` to create a `Hive CLI` type task and execute Hive SQL from scripts or files.

The workers run `hive -e` to execute Hive SQL from scripts, or `hive -f` to execute Hive SQL files stored in the `Resource Center`.
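The worker-side behavior can be sketched in shell (a simplified, hypothetical illustration; the actual plugin assembles this command string in Java and runs it through its shell executor):

```shell
# Pick how the SQL is supplied; "SCRIPT" and "FILE" are the internal
# values behind the FROM_SCRIPT / FROM_FILE options in the UI.
EXECUTION_TYPE="SCRIPT"
HIVE_SQL_SCRIPT="SHOW DATABASES;"
HIVE_SQL_FILE="sql_tasks/hive_task.sql"

if [ "$EXECUTION_TYPE" = "SCRIPT" ]; then
  COMMAND="hive -e \"$HIVE_SQL_SCRIPT\""   # inline SQL statements
else
  COMMAND="hive -f $HIVE_SQL_FILE"         # SQL file from the Resource Center
fi
echo "$COMMAND"
```
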
## Hive CLI Task vs SQL Task With Hive Datasource

In DolphinScheduler, both the `Hive CLI Task` and the `SQL Task With Hive Datasource` are available for different scenarios. You can choose between the two based on your needs.

- The `Hive CLI` task plugin connects directly to `HDFS` and the `Hive Metastore` to execute Hive tasks. This requires your workers to have access to those services, including the related `Hive` libs and the `Hive` and `HDFS` configuration files. In return, the `Hive CLI Task` provides better stability for scheduling in production.
- The `SQL Task With Hive Datasource` does not require access to the `Hive` libs or the `Hive` and `HDFS` configuration files, and it supports `Kerberos` authentication. However, you may encounter `HiveServer2` failures if your Hive SQL task scheduling puts significant pressure on it.
## Create Task

- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
- Drag <img src="../../../../img/tasks/icons/hivecli.png" width="15"/> from the toolbar to the canvas.
## Task Parameters

| **Parameter** | **Description** |
|---------------|-----------------|
| Node Name | The name of the task. Node names within the same workflow must be unique. |
| Run Flag | Indicates whether to schedule the task normally. If you do not need to execute the task, turn on the `Prohibition execution` switch. |
| Description | Describes the function of this node. |
| Task Priority | When the number of worker threads is insufficient, the worker executes tasks according to priority. When two tasks have the same priority, the worker executes them in a first-come-first-served fashion. |
| Worker Group | Machines that execute the task. If you choose `default`, the scheduler will send the task to a random worker. |
| Task Group Name | Resource group of the task. It does not take effect if not configured. |
| Environment Name | Environment in which to execute the task. |
| Number of Failed Retries | The number of retries on task failure. You can select it from the drop-down menu or fill it in manually. |
| Failure Retry Interval | The interval between retries on task failure. You can select it from the drop-down menu or fill it in manually. |
| CPU Quota | Assigns a CPU time quota to the task as a percentage value. The default `-1` means unlimited. For example, the full CPU load of one core is 100%, and that of 16 cores is 1600%. You can configure it with [task.resource.limit.state](../../architecture/configuration.md). |
| Max Memory | Assigns a maximum memory limit in MB to the task. Exceeding this limit triggers an OOM kill, and the task is not retried automatically. The default `-1` means unlimited. You can configure it with [task.resource.limit.state](../../architecture/configuration.md). |
| Timeout Alarm | Alarm on task timeout. When the task exceeds the "timeout threshold", an alarm email is sent. |
| Hive Cli Task Execution Type | How the Hive CLI task executes; choose either `FROM_SCRIPT` or `FROM_FILE`. |
| Hive SQL Script | If you choose `FROM_SCRIPT` for `Hive Cli Task Execution Type`, fill in your SQL script here. |
| Hive Cli Options | Extra options for the Hive CLI, such as `--verbose`. |
| Resources | If you choose `FROM_FILE` for `Hive Cli Task Execution Type`, select your SQL file here. |
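Taken together, the execution-type parameters above serialize into the task's parameter JSON. A `FROM_SCRIPT` example (field names follow the plugin's `HiveCliParameters`; the SQL and options values are only illustrative):

```json
{
  "hiveCliTaskExecutionType": "SCRIPT",
  "hiveSqlScript": "SHOW DATABASES;",
  "hiveCliOptions": "--verbose",
  "resourceList": []
}
```

Note that the front end maps the `FROM_SCRIPT`/`FROM_FILE` labels to the internal values `SCRIPT`/`FILE`.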
## Task Example

### Hive CLI Task Example

The example below illustrates how to create a `Hive CLI` task node and execute Hive SQL from a script:

![demo-hive-cli-from-script](../../../../img/tasks/demo/hive_cli_from_script.png)

The example below illustrates how to create a `Hive CLI` task node and execute Hive SQL from a file:

![demo-hive-cli-from-file](../../../../img/tasks/demo/hive_cli_from_file.png)
@@ -0,0 +1,57 @@

# Hive CLI

## Overview

Use the `Hive CLI` task plugin to create a `Hive CLI` type task that executes Hive SQL script statements or Hive SQL files.

The worker executing the task runs `hive -e` to execute Hive SQL script statements, or `hive -f` to execute a Hive SQL file from the `Resource Center`.

## Hive CLI Task vs SQL Task With Hive Datasource

In DolphinScheduler, both the `Hive CLI` task plugin and the `SQL` task plugin with a Hive datasource are provided for different scenarios; you can choose between them based on your needs.

- The `Hive CLI` task plugin connects directly to `HDFS` and the `Hive Metastore` to execute Hive tasks, so it needs access to those services. The worker nodes executing the task need the corresponding `Hive` jars as well as the `Hive` and `HDFS` configuration files. However, the `Hive CLI` task plugin provides better stability for production scheduling.
- The `SQL` task plugin with a Hive datasource does not require the `Hive` jars or the `Hive` and `HDFS` configuration files on the worker nodes, and it supports `Kerberos` authentication. However, under heavy scheduling pressure in production, this approach may run into problems such as `HiveServer2` overload failures.
## Create Task

- Click `Project Management -> Project Name -> Workflow Definition`, then click the "Create Workflow" button to enter the DAG editing page.
- Drag <img src="../../../../img/tasks/icons/hivecli.png" width="15"/> from the toolbar onto the canvas to complete the creation.

## Task Parameters

- Pre-tasks: select the pre-tasks of the current task; the selected pre-tasks are set as upstream of the current task.
| **Parameter** | **Description** |
|---------------|-----------------|
| Node Name | The name of the task. Node names within one workflow definition must be unique. |
| Run Flag | Indicates whether the node should be scheduled normally. If the task does not need to be executed, turn on the prohibition execution switch. |
| Description | Describes the function of this node. |
| Task Priority | When worker threads are insufficient, tasks are executed in order of priority from high to low; tasks with the same priority are executed first-come-first-served. |
| Worker Group | The task is dispatched to a machine in the chosen worker group; choosing `Default` picks a random worker. |
| Task Group Name | The resource group of the task. It does not take effect if not configured. |
| Environment Name | The environment in which to execute the task. |
| Number of Failed Retries | The number of times a failed task is resubmitted; can be chosen from the drop-down menu or filled in manually. |
| Failure Retry Interval | The interval between resubmissions of a failed task; can be chosen from the drop-down menu or filled in manually. |
| CPU Quota | Assigns a CPU time quota to the task as a percentage; the default `-1` means unlimited. For example, the full load of one CPU core is 100%, and that of 16 cores is 1600%. Controlled by [task.resource.limit.state](../../architecture/configuration.md). |
| Max Memory | Assigns a maximum memory size in MB to the task; exceeding it triggers an OOM kill without automatic retry. The default `-1` means unlimited. Controlled by [task.resource.limit.state](../../architecture/configuration.md). |
| Timeout Alarm | Check timeout alarm and timeout failure; when the task runs longer than the "timeout threshold", an alarm email is sent and the task fails. |
| Hive Cli Task Execution Type | How the Hive CLI task executes; choose either `FROM_SCRIPT` or `FROM_FILE`. |
| Hive SQL Script | Fill in your Hive SQL script statements manually. |
| Hive Cli Options | Other options of the Hive CLI, such as `--verbose`. |
| Resources | If you choose `FROM_FILE` as the Hive CLI task execution type, select your Hive SQL file in the resources. |
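With `Hive Cli Options` set to `--verbose`, the extra options are simply appended after the base command. A simplified shell sketch of that step (the real plugin joins the arguments in Java):

```shell
BASE_COMMAND='hive -e "SHOW DATABASES;"'
HIVE_CLI_OPTIONS="--verbose"

# Append extra CLI options, when present, to the end of the command.
COMMAND="$BASE_COMMAND"
if [ -n "$HIVE_CLI_OPTIONS" ]; then
  COMMAND="$COMMAND $HIVE_CLI_OPTIONS"
fi
echo "$COMMAND"
```
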
## Task Example

### Hive CLI Task Example

The example below shows how to use a `Hive CLI` task node to execute Hive SQL script statements:

![demo-hive-cli-from-script](../../../../img/tasks/demo/hive_cli_from_script.png)

The example below shows how to use a `Hive CLI` task node to execute a Hive SQL file from the Resource Center:

![demo-hive-cli-from-file](../../../../img/tasks/demo/hive_cli_from_file.png)
@@ -0,0 +1,40 @@

<?xml version="1.0" encoding="UTF-8"?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements.  See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License.  You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.apache.dolphinscheduler</groupId>
        <artifactId>dolphinscheduler-task-plugin</artifactId>
        <version>dev-SNAPSHOT</version>
    </parent>

    <artifactId>dolphinscheduler-task-hivecli</artifactId>
    <packaging>jar</packaging>

    <dependencies>
        <dependency>
            <groupId>org.apache.dolphinscheduler</groupId>
            <artifactId>dolphinscheduler-spi</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.dolphinscheduler</groupId>
            <artifactId>dolphinscheduler-task-api</artifactId>
        </dependency>
    </dependencies>
</project>
@@ -0,0 +1,33 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import lombok.experimental.UtilityClass;

@UtilityClass
public class HiveCliConstants {

    public static final String TYPE_SCRIPT = "SCRIPT";

    public static final String TYPE_FILE = "FILE";

    public static final String HIVE_CLI_EXECUTE_FILE = "hive -f";

    public static final String HIVE_CLI_EXECUTE_SCRIPT = "hive -e \"%s\"";

}
@@ -0,0 +1,59 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import org.apache.dolphinscheduler.plugin.task.api.model.ResourceInfo;
import org.apache.dolphinscheduler.plugin.task.api.parameters.AbstractParameters;
import org.apache.dolphinscheduler.spi.utils.StringUtils;

import java.util.List;

import lombok.Data;

@Data
public class HiveCliParameters extends AbstractParameters {

    private String hiveSqlScript;

    private String hiveCliTaskExecutionType;

    private String hiveCliOptions;

    private List<ResourceInfo> resourceList;

    @Override
    public boolean checkParameters() {
        if (StringUtils.isEmpty(hiveCliTaskExecutionType)) {
            return false;
        }

        if (HiveCliConstants.TYPE_SCRIPT.equals(hiveCliTaskExecutionType)) {
            return StringUtils.isNotEmpty(hiveSqlScript);
        } else if (HiveCliConstants.TYPE_FILE.equals(hiveCliTaskExecutionType)) {
            return (resourceList != null) && (!resourceList.isEmpty());
        } else {
            return false;
        }
    }

    @Override
    public List<ResourceInfo> getResourceFilesList() {
        return this.resourceList;
    }

}
@@ -0,0 +1,133 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import static org.apache.dolphinscheduler.plugin.task.api.TaskConstants.EXIT_CODE_FAILURE;

import org.apache.dolphinscheduler.plugin.task.api.AbstractTaskExecutor;
import org.apache.dolphinscheduler.plugin.task.api.ShellCommandExecutor;
import org.apache.dolphinscheduler.plugin.task.api.TaskException;
import org.apache.dolphinscheduler.plugin.task.api.TaskExecutionContext;
import org.apache.dolphinscheduler.plugin.task.api.model.Property;
import org.apache.dolphinscheduler.plugin.task.api.model.ResourceInfo;
import org.apache.dolphinscheduler.plugin.task.api.model.TaskResponse;
import org.apache.dolphinscheduler.plugin.task.api.parameters.AbstractParameters;
import org.apache.dolphinscheduler.plugin.task.api.parser.ParamUtils;
import org.apache.dolphinscheduler.plugin.task.api.parser.ParameterUtils;
import org.apache.dolphinscheduler.spi.utils.JSONUtils;

import org.apache.commons.lang3.StringUtils;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class HiveCliTask extends AbstractTaskExecutor {

    private HiveCliParameters hiveCliParameters;

    private final ShellCommandExecutor shellCommandExecutor;

    private final TaskExecutionContext taskExecutionContext;

    public HiveCliTask(TaskExecutionContext taskExecutionContext) {
        super(taskExecutionContext);
        this.taskExecutionContext = taskExecutionContext;

        this.shellCommandExecutor = new ShellCommandExecutor(this::logHandle,
                taskExecutionContext,
                logger);
    }

    @Override
    public void init() {
        logger.info("hiveCli task params {}", taskExecutionContext.getTaskParams());

        hiveCliParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), HiveCliParameters.class);

        if (!hiveCliParameters.checkParameters()) {
            throw new TaskException("hiveCli task params are not valid");
        }
    }

    @Override
    public void handle() throws TaskException {
        try {
            final TaskResponse taskResponse = shellCommandExecutor.run(buildCommand());
            setExitStatusCode(taskResponse.getExitStatusCode());
            setAppIds(taskResponse.getAppIds());
            setProcessId(taskResponse.getProcessId());
            setVarPool(shellCommandExecutor.getVarPool());
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            logger.error("The current HiveCLI Task has been interrupted", e);
            setExitStatusCode(EXIT_CODE_FAILURE);
            throw new TaskException("The current HiveCLI Task has been interrupted", e);
        } catch (Exception e) {
            logger.error("hiveCli task failure", e);
            setExitStatusCode(EXIT_CODE_FAILURE);
            throw new TaskException("run hiveCli task error", e);
        }
    }

    protected String buildCommand() {

        final List<String> args = new ArrayList<>();

        final String type = hiveCliParameters.getHiveCliTaskExecutionType();

        // TODO: make sure type is not unknown
        if (HiveCliConstants.TYPE_FILE.equals(type)) {
            args.add(HiveCliConstants.HIVE_CLI_EXECUTE_FILE);
            final List<ResourceInfo> resourceInfos = hiveCliParameters.getResourceList();
            if (resourceInfos.size() > 1) {
                logger.warn("more than 1 file detected, using the first one by default");
            }

            args.add(StringUtils.stripStart(resourceInfos.get(0).getResourceName(), "/"));
        } else {
            final String script = hiveCliParameters.getHiveSqlScript();
            args.add(String.format(HiveCliConstants.HIVE_CLI_EXECUTE_SCRIPT, script));
        }

        final String hiveCliOptions = hiveCliParameters.getHiveCliOptions();
        if (StringUtils.isNotEmpty(hiveCliOptions)) {
            args.add(hiveCliOptions);
        }

        final Map<String, Property> paramsMap = taskExecutionContext.getPrepareParamsMap();
        final String command =
                ParameterUtils.convertParameterPlaceholders(String.join(" ", args), ParamUtils.convert(paramsMap));

        logger.info("hiveCli task command: {}", command);

        return command;

    }

    @Override
    public AbstractParameters getParameters() {
        return hiveCliParameters;
    }

    @Override
    public void cancelApplication(boolean cancelApplication) throws Exception {
        shellCommandExecutor.cancelApplication();
    }

}
@@ -0,0 +1,49 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import org.apache.dolphinscheduler.plugin.task.api.AbstractTask;
import org.apache.dolphinscheduler.plugin.task.api.TaskChannel;
import org.apache.dolphinscheduler.plugin.task.api.TaskExecutionContext;
import org.apache.dolphinscheduler.plugin.task.api.parameters.AbstractParameters;
import org.apache.dolphinscheduler.plugin.task.api.parameters.ParametersNode;
import org.apache.dolphinscheduler.plugin.task.api.parameters.resource.ResourceParametersHelper;
import org.apache.dolphinscheduler.spi.utils.JSONUtils;

public class HiveCliTaskChannel implements TaskChannel {

    @Override
    public void cancelApplication(boolean status) {
    }

    @Override
    public AbstractTask createTask(TaskExecutionContext taskExecutionContext) {
        return new HiveCliTask(taskExecutionContext);
    }

    @Override
    public AbstractParameters parseParameters(ParametersNode parametersNode) {
        return JSONUtils.parseObject(parametersNode.getTaskParams(), HiveCliParameters.class);
    }

    @Override
    public ResourceParametersHelper getResources(String parameters) {
        return null;
    }

}
@@ -0,0 +1,45 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import org.apache.dolphinscheduler.plugin.task.api.TaskChannel;
import org.apache.dolphinscheduler.plugin.task.api.TaskChannelFactory;
import org.apache.dolphinscheduler.spi.params.base.PluginParams;

import java.util.List;

import com.google.auto.service.AutoService;

@AutoService(TaskChannelFactory.class)
public class HiveCliTaskChannelFactory implements TaskChannelFactory {

    @Override
    public TaskChannel create() {
        return new HiveCliTaskChannel();
    }

    @Override
    public String getName() {
        return "HIVECLI";
    }

    @Override
    public List<PluginParams> getParams() {
        return null;
    }

}
@@ -0,0 +1,105 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.plugin.task.hivecli;

import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.when;

import org.apache.dolphinscheduler.plugin.task.api.TaskExecutionContext;
import org.apache.dolphinscheduler.plugin.task.api.model.ResourceInfo;
import org.apache.dolphinscheduler.spi.utils.JSONUtils;

import java.util.ArrayList;
import java.util.List;

import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.mockito.junit.MockitoJUnitRunner;

@RunWith(MockitoJUnitRunner.class)
public class HiveCliTaskTest {

    public static final String EXPECTED_HIVE_CLI_TASK_EXECUTE_FROM_SCRIPT_COMMAND =
            "hive -e \"SHOW DATABASES;\"";

    public static final String EXPECTED_HIVE_CLI_TASK_EXECUTE_FROM_FILE_COMMAND =
            "hive -f sql_tasks/hive_task.sql";

    public static final String EXPECTED_HIVE_CLI_TASK_EXECUTE_WITH_OPTIONS =
            "hive -e \"SHOW DATABASES;\" --verbose";

    @Test
    public void hiveCliTaskExecuteSqlFromScript() throws Exception {
        String hiveCliTaskParameters = buildHiveCliTaskExecuteSqlFromScriptParameters();
        HiveCliTask hiveCliTask = prepareHiveCliTaskForTest(hiveCliTaskParameters);
        hiveCliTask.init();
        Assert.assertEquals(EXPECTED_HIVE_CLI_TASK_EXECUTE_FROM_SCRIPT_COMMAND, hiveCliTask.buildCommand());
    }

    @Test
    public void hiveCliTaskExecuteSqlFromFile() throws Exception {
        String hiveCliTaskParameters = buildHiveCliTaskExecuteSqlFromFileParameters();
        HiveCliTask hiveCliTask = prepareHiveCliTaskForTest(hiveCliTaskParameters);
        hiveCliTask.init();
        Assert.assertEquals(EXPECTED_HIVE_CLI_TASK_EXECUTE_FROM_FILE_COMMAND, hiveCliTask.buildCommand());
    }

    @Test
    public void hiveCliTaskExecuteWithOptions() throws Exception {
        String hiveCliTaskParameters = buildHiveCliTaskExecuteWithOptionsParameters();
        HiveCliTask hiveCliTask = prepareHiveCliTaskForTest(hiveCliTaskParameters);
        hiveCliTask.init();
        Assert.assertEquals(EXPECTED_HIVE_CLI_TASK_EXECUTE_WITH_OPTIONS, hiveCliTask.buildCommand());
    }

    private HiveCliTask prepareHiveCliTaskForTest(final String hiveCliTaskParameters) {
        TaskExecutionContext taskExecutionContext = Mockito.mock(TaskExecutionContext.class);
        when(taskExecutionContext.getTaskParams()).thenReturn(hiveCliTaskParameters);
        HiveCliTask hiveCliTask = spy(new HiveCliTask(taskExecutionContext));
        return hiveCliTask;
    }

    private String buildHiveCliTaskExecuteSqlFromScriptParameters() {
        final HiveCliParameters hiveCliParameters = new HiveCliParameters();
        hiveCliParameters.setHiveCliTaskExecutionType("SCRIPT");
        hiveCliParameters.setHiveSqlScript("SHOW DATABASES;");
        return JSONUtils.toJsonString(hiveCliParameters);
    }

    private String buildHiveCliTaskExecuteSqlFromFileParameters() {
        final HiveCliParameters hiveCliParameters = new HiveCliParameters();
        hiveCliParameters.setHiveCliTaskExecutionType("FILE");
        List<ResourceInfo> resources = new ArrayList<>();
        ResourceInfo sqlResource = new ResourceInfo();
        sqlResource.setResourceName("/sql_tasks/hive_task.sql");
        resources.add(sqlResource);
        hiveCliParameters.setResourceList(resources);
        return JSONUtils.toJsonString(hiveCliParameters);
    }

    private String buildHiveCliTaskExecuteWithOptionsParameters() {
        final HiveCliParameters hiveCliParameters = new HiveCliParameters();
        hiveCliParameters.setHiveCliTaskExecutionType("SCRIPT");
        hiveCliParameters.setHiveSqlScript("SHOW DATABASES;");
        hiveCliParameters.setHiveCliOptions("--verbose");
        return JSONUtils.toJsonString(hiveCliParameters);
    }

}
@@ -0,0 +1,62 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import { useI18n } from 'vue-i18n'
import { useCustomParams, useResources } from '.'
import type { IJsonItem } from '../types'

export function useHiveCli(model: { [field: string]: any }): IJsonItem[] {
  const { t } = useI18n()

  return [
    {
      type: 'select',
      field: 'hiveCliTaskExecutionType',
      span: 12,
      name: t('project.node.hive_cli_task_execution_type'),
      options: HIVE_CLI_TASK_EXECUTION_TYPES
    },
    {
      type: 'editor',
      field: 'hiveSqlScript',
      name: t('project.node.hive_sql_script'),
      props: {
        language: 'sql'
      }
    },
    {
      type: 'input',
      field: 'hiveCliOptions',
      name: t('project.node.hive_cli_options'),
      props: {
        placeholder: t('project.node.hive_cli_options_tips')
      }
    },
    useResources(),
    ...useCustomParams({ model, field: 'localParams', isSimple: false })
  ]
}

export const HIVE_CLI_TASK_EXECUTION_TYPES = [
  {
    label: 'FROM_SCRIPT',
    value: 'SCRIPT'
  },
  {
    label: 'FROM_FILE',
    value: 'FILE'
  }
]
@@ -0,0 +1,80 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import { reactive } from 'vue'
import * as Fields from '../fields/index'
import type { IJsonItem, INodeData, ITaskData } from '../types'

export function useHiveCli({
  projectCode,
  from = 0,
  readonly,
  data
}: {
  projectCode: number
  from?: number
  readonly?: boolean
  data?: ITaskData
}) {
  const model = reactive({
    name: '',
    taskType: 'HIVECLI',
    flag: 'YES',
    description: '',
    timeoutFlag: false,
    localParams: [],
    environmentCode: null,
    failRetryInterval: 1,
    failRetryTimes: 0,
    workerGroup: 'default',
    delayTime: 0,
    timeout: 30
  } as INodeData)

  let extra: IJsonItem[] = []
  if (from === 1) {
    extra = [
      Fields.useTaskType(model, readonly),
      Fields.useProcessName({
        model,
        projectCode,
        isCreate: !data?.id,
        from,
        processName: data?.processName
      })
    ]
  }

  return {
    json: [
      Fields.useName(from),
      ...extra,
      Fields.useRunFlag(),
      Fields.useDescription(),
      Fields.useTaskPriority(),
      Fields.useWorkerGroup(),
      Fields.useEnvironmentName(model, !model.id),
      ...Fields.useTaskGroup(model, projectCode),
      ...Fields.useFailed(),
      Fields.useDelayTime(model),
      ...Fields.useTimeoutAlarm(model),
      ...Fields.useHiveCli(model),
      Fields.usePreTasks()
    ] as IJsonItem[],
    model
  }
}