
[Feature][Plugin]Task (#6105)

* [Improvement][dao] When I search for the keyword "description", the web UI shows an empty result (#5952)

* [Bug][WorkerServer] SqlTask NullPointerException #5549

* [Improvement][dao] When I search for the keyword "Modify User", the web UI shows an empty result #5428

* [Improvement][dao] When I search for the keyword "description", the web UI shows an empty result #5428

* fix a README typo (#5998)

* Fix unchecked type conversions

* Use indentation level reported by checkstyle

* Reorganize CI workflows to reduce wasted time and resources (#6011)

* Add a standalone server module to make development easier (#6022)

* Add SWITCH task node (#5939)

* [Feature-#5273][server-master] Add SWITCH task node (#5922)



Co-authored-by: wangxj <wangxj31>

* remove description of bonecp (#6030)

Co-authored-by: shaojwu <shaojwu@ebay.com>

* [Improvement][Api Module]split alert group list-paging interface (#5941)

* [Improvement][Api Module]split alert group list-paging interface

* [FIX-#6007]Wrong complement date (#6026)

* [FIX-#6007]Wrong complement date

* [style]Wrong complement date

* [Improvement-6024][dist] Remove useless packaging commands (#6029)

* Remove useless packaging commands in dolphinscheduler-bin.xml

This closes #6024

Co-authored-by: mask <liuhu@zhiyoutec.com>

* [FIX-5908][MasterServer] When executing a compensation task, the execution thread could hit an NPE (#5909)

* fix the NPE in MasterExec

* fix the compile error

* Add `.asf.yaml` to easily set the GitHub metadata (#6035)

* fix dead server cannot stop (#6046)

* Translation enhancements (#6042)

* replaced Loading... with i18n

* modified the zh_CN translation of "Edit"

* Delete zh_CN.js

Co-authored-by: David <dailidong66@gmail.com>

* fix bug #6053: zh_CN.js was lost

* [Feature][Plugin]Task

* fix license header

* fix code style

* fix code style

* fix build error

Co-authored-by: didiaode18 <563646039@qq.com>
Co-authored-by: Roy <yongjuncao1213@gmail.com>
Co-authored-by: lyxell <alyxell@kth.se>
Co-authored-by: Wenjun Ruan <wenjun@apache.org>
Co-authored-by: kezhenxu94 <kezhenxu94@apache.org>
Co-authored-by: myangle1120 <942542838@qq.com>
Co-authored-by: wangxj3 <857234426@qq.com>
Co-authored-by: gabry.wu <gabrywu@apache.org>
Co-authored-by: shaojwu <shaojwu@ebay.com>
Co-authored-by: Shukun Zhang <60541766+andream7@users.noreply.github.com>
Co-authored-by: linquan <1175687813@qq.com>
Co-authored-by: mask <39329477+Narcasserun@users.noreply.github.com>
Co-authored-by: mask <liuhu@zhiyoutec.com>
Co-authored-by: kyoty <echohlne@gmail.com>
Co-authored-by: RichardStark <49977764+RichardStark@users.noreply.github.com>
Co-authored-by: David <dailidong66@gmail.com>
Co-authored-by: lenboo <baoliang.leon@gmail.com>
Branch: 2.0.0-release. Committed by Kirs via GitHub 3 years ago. Commit: 4715de11fc.
Changed files (lines changed, path):

47   .asf.yaml
1    .github/actions/reviewdog-setup
53   .github/actions/sanity-check/action.yml
38   .github/workflows/backend.yml
22   .github/workflows/e2e.yml
31   .github/workflows/frontend.yml
99   .github/workflows/unit-test.yml
3    .gitmodules
5    .licenserc.yaml
2    README_zh_CN.md
4    docker/build/hooks/build
13   dolphinscheduler-alert-plugin/dolphinscheduler-alert-email/pom.xml
23   dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/AlertGroupController.java
1    dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/enums/Status.java
8    dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/AlertGroupService.java
41   dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/AlertGroupServiceImpl.java
38   dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/controller/AlertGroupControllerTest.java
28   dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/AlertGroupServiceTest.java
1    dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/Constants.java
66   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/enums/DbType.java
4    dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/enums/TaskType.java
12   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/graph/DAG.java
19   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/TaskNode.java
91   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/task/switchtask/SwitchParameters.java
49   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/task/switchtask/SwitchResultVo.java
18   dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/utils/PropertyUtils.java
3    dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/utils/TaskParametersUtils.java
1    dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java
25   dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java
10   dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/mapper/AlertGroupMapper.java
50   dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/utils/DagHelper.java
38   dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/vo/AlertGroupVo.java
11   dolphinscheduler-dao/src/main/resources/org/apache/dolphinscheduler/dao/mapper/AlertGroupMapper.xml
3    dolphinscheduler-dao/src/main/resources/org/apache/dolphinscheduler/dao/mapper/WorkFlowLineageMapper.xml
99   dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/utils/DagHelperTest.java
2    dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/utils/ResourceProcessDefinitionUtilsTest.java
7    dolphinscheduler-dist/pom.xml
3    dolphinscheduler-dist/release-docs/LICENSE
9    dolphinscheduler-dist/src/main/assembly/dolphinscheduler-bin.xml
27   dolphinscheduler-dist/src/main/provisio/dolphinscheduler.xml
11   dolphinscheduler-registry-plugin/dolphinscheduler-registry-zookeeper/src/main/java/org/apache/dolphinscheduler/plugin/registry/zookeeper/ZookeeperRegistry.java
14   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/entity/TaskExecutionContext.java
2    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/dispatch/executor/NettyExecutorManager.java
15   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/MasterBaseTaskExecThread.java
19   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/MasterExecThread.java
180  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/SwitchTaskExecThread.java
38   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/utils/SwitchTaskUtils.java
2    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/WorkerServer.java
4    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/plugin/TaskPluginManager.java
25   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/processor/TaskExecuteProcessor.java
1    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/RetryReportTaskStatusThread.java
6    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/TaskExecuteThread.java
1    dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/WorkerManagerThread.java
553  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractCommandExecutor.java
193  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractTask.java
95   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractYarnTask.java
69   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/CommandExecuteResult.java
188  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/PythonCommandExecutor.java
119  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/ShellCommandExecutor.java
80   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/TaskManager.java
346  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/TaskProps.java
580  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/datax/DataxTask.java
143  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/flink/FlinkTask.java
332  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/http/HttpTask.java
145  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/mr/MapReduceTask.java
341  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/procedure/ProcedureTask.java
144  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/python/PythonTask.java
185  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/shell/ShellTask.java
155  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/spark/SparkTask.java
510  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sql/SqlTask.java
75   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopConstants.java
95   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTask.java
85   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/CommonGenerator.java
124  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/SqoopJobGenerator.java
63   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/HdfsSourceGenerator.java
73   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/HiveSourceGenerator.java
135  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/MysqlSourceGenerator.java
86   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/HdfsTargetGenerator.java
91   dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/HiveTargetGenerator.java
115  dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/MysqlTargetGenerator.java
6    dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/master/MasterExecThreadTest.java
167  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/master/SwitchTaskTest.java
4    dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/processor/TaskExecuteProcessorTest.java
192  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/runner/TaskExecuteThreadTest.java
190  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/runner/WorkerManagerThreadTest.java
250  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/shell/ShellCommandExecutorTest.java
144  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/sql/SqlExecutorTest.java
58   dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/PythonCommandExecutorTest.java
115  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/ShellTaskReturnTest.java
192  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/TaskManagerTest.java
77   dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/TaskParamsTest.java
493  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/datax/DataxTaskTest.java
208  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/http/HttpTaskTest.java
119  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/procedure/ProcedureTaskTest.java
125  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/shell/ShellTaskTest.java
134  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/spark/SparkTaskTest.java
241  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sql/SqlTaskTest.java
228  dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTaskTest.java
1    dolphinscheduler-service/src/main/java/org/apache/dolphinscheduler/service/process/ProcessService.java
14   dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java

(Some files were not shown because too many files changed in this diff.)

47
.asf.yaml

@@ -0,0 +1,47 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
github:
description: |
Apache DolphinScheduler is a distributed and extensible workflow scheduler platform with powerful DAG
visual interfaces, dedicated to solving complex job dependencies in the data pipeline and providing
various types of jobs available `out of the box`.
homepage: https://dolphinscheduler.apache.org/
labels:
- airflow
- schedule
- job-scheduler
- oozie
- task-scheduler
- azkaban
- distributed-schedule-system
- workflow-scheduling-system
- etl-dependency
- workflow-platform
- cronjob-schedule
- job-schedule
- task-schedule
- workflow-schedule
- data-schedule
enabled_merge_buttons:
squash: true
merge: false
rebase: false
protected_branches:
dev:
required_status_checks:
strict: true

1
.github/actions/reviewdog-setup

@@ -0,0 +1 @@
Subproject commit 2fc905b1875f2e6b91c4201a4dc6eaa21b86547e

53
.github/actions/sanity-check/action.yml

@@ -0,0 +1,53 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
name: "Sanity Check"
description: |
Action to perform some very basic lightweight checks, like code styles, license headers, etc.,
and fail fast to avoid wasting resources running heavyweight checks, like unit tests, e2e tests.
inputs:
token:
description: 'The GitHub API token'
required: false
runs:
using: "composite"
steps:
- name: Check License Header
uses: apache/skywalking-eyes@a63f4afcc287dfb3727ecc45a4afc55a5e69c15f
- uses: ./.github/actions/reviewdog-setup
with:
reviewdog_version: v0.10.2
- shell: bash
run: ./mvnw -B -q checkstyle:checkstyle-aggregate
- shell: bash
env:
REVIEWDOG_GITHUB_API_TOKEN: ${{ inputs.token }}
run: |
if [[ -n "${{ inputs.token }}" ]]; then
reviewdog -f=checkstyle \
-reporter="github-pr-check" \
-filter-mode="added" \
-fail-on-error="true" < target/checkstyle-result.xml
fi

38
.github/workflows/ci_backend.yml → .github/workflows/backend.yml

@@ -19,8 +19,10 @@ name: Backend
on:
push:
branches:
- dev
paths:
- '.github/workflows/ci_backend.yml'
- '.github/workflows/backend.yml'
- 'package.xml'
- 'pom.xml'
- 'dolphinscheduler-alert/**'
@@ -31,7 +33,7 @@ on:
- 'dolphinscheduler-server/**'
pull_request:
paths:
- '.github/workflows/ci_backend.yml'
- '.github/workflows/backend.yml'
- 'package.xml'
- 'pom.xml'
- 'dolphinscheduler-alert/**'
@@ -41,20 +43,34 @@ on:
- 'dolphinscheduler-rpc/**'
- 'dolphinscheduler-server/**'
concurrency:
group: backend-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
Compile-check:
build:
name: Build
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
submodule: true
- name: Check License Header
uses: apache/skywalking-eyes@ec88b7d850018c8983f87729ea88549e100c5c82
- name: Set up JDK 1.8
uses: actions/setup-java@v1
submodules: true
- name: Sanity Check
uses: ./.github/actions/sanity-check
with:
token: ${{ secrets.GITHUB_TOKEN }} # We only need to pass this token in one workflow
- uses: actions/cache@v2
with:
java-version: 1.8
- name: Compile
run: mvn -B clean install -Prelease -Dmaven.test.skip=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120
path: ~/.m2/repository
key: ${{ runner.os }}-maven
- name: Build and Package
run: |
./mvnw -B clean install \
-Prelease \
-Dmaven.test.skip=true \
-Dcheckstyle.skip=true \
-Dhttp.keepAlive=false \
-Dmaven.wagon.http.pool=false \
-Dmaven.wagon.httpconnectionManager.ttlSeconds=120
- name: Check dependency license
run: tools/dependencies/check-LICENSE.sh

22
.github/workflows/ci_e2e.yml → .github/workflows/e2e.yml

@@ -20,26 +20,26 @@ env:
DOCKER_DIR: ./docker
LOG_DIR: /tmp/dolphinscheduler
name: e2e Test
name: Test
jobs:
concurrency:
group: e2e-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
build:
name: Test
jobs:
test:
name: E2E
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
submodule: true
- name: Check License Header
uses: apache/skywalking-eyes@ec88b7d850018c8983f87729ea88549e100c5c82
submodules: true
- name: Sanity Check
uses: ./.github/actions/sanity-check
- uses: actions/cache@v1
with:
path: ~/.m2/repository
key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
restore-keys: |
${{ runner.os }}-maven-
key: ${{ runner.os }}-maven
- name: Build Image
run: |
sh ./docker/build/hooks/build

31
.github/workflows/ci_frontend.yml → .github/workflows/frontend.yml

@@ -19,31 +19,44 @@ name: Frontend
on:
push:
branches:
- dev
paths:
- '.github/workflows/ci_frontend.yml'
- '.github/workflows/frontend.yml'
- 'dolphinscheduler-ui/**'
pull_request:
paths:
- '.github/workflows/ci_frontend.yml'
- '.github/workflows/frontend.yml'
- 'dolphinscheduler-ui/**'
defaults:
run:
working-directory: dolphinscheduler-ui
concurrency:
group: frontend-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
Compile-check:
build:
name: Build
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
os: [ ubuntu-latest, macos-latest ]
steps:
- uses: actions/checkout@v2
with:
submodule: true
submodules: true
- if: matrix.os == 'ubuntu-latest'
name: Sanity Check
uses: ./.github/actions/sanity-check
- name: Set up Node.js
uses: actions/setup-node@v1
uses: actions/setup-node@v2
with:
version: 8
- name: Compile
node-version: 8
- name: Compile and Build
run: |
cd dolphinscheduler-ui
npm install node-sass --unsafe-perm
npm install
npm run lint

99
.github/workflows/ci_ut.yml → .github/workflows/unit-test.yml

@@ -15,69 +15,71 @@
# limitations under the License.
#
name: Test
on:
pull_request:
paths-ignore:
- '**/*.md'
- 'dolphinscheduler-ui'
push:
paths-ignore:
- '**/*.md'
- 'dolphinscheduler-ui'
branches:
- dev
env:
LOG_DIR: /tmp/dolphinscheduler
name: Unit Test
concurrency:
group: unit-test-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
build:
name: Build
unit-test:
name: Unit Test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
submodule: true
- name: Check License Header
uses: apache/skywalking-eyes@ec88b7d850018c8983f87729ea88549e100c5c82
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Only enable review / suggestion here
- uses: actions/cache@v1
submodules: true
- name: Sanity Check
uses: ./.github/actions/sanity-check
- name: Set up JDK 1.8
uses: actions/setup-java@v2
with:
java-version: 8
distribution: 'adopt'
- uses: actions/cache@v2
with:
path: ~/.m2/repository
key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
restore-keys: |
${{ runner.os }}-maven-
key: ${{ runner.os }}-maven
- name: Bootstrap database
run: |
sed -i "/image: bitnami\/postgresql/a\ ports:\n - 5432:5432" $(pwd)/docker/docker-swarm/docker-compose.yml
sed -i "/image: bitnami\/zookeeper/a\ ports:\n - 2181:2181" $(pwd)/docker/docker-swarm/docker-compose.yml
docker-compose -f $(pwd)/docker/docker-swarm/docker-compose.yml up -d dolphinscheduler-zookeeper dolphinscheduler-postgresql
until docker logs docker-swarm_dolphinscheduler-postgresql_1 2>&1 | grep 'listening on IPv4 address'; do echo "waiting for postgresql ready ..."; sleep 1; done
docker run --rm --network docker-swarm_dolphinscheduler -v $(pwd)/sql/dolphinscheduler_postgre.sql:/docker-entrypoint-initdb.d/dolphinscheduler_postgre.sql bitnami/postgresql:latest bash -c "PGPASSWORD=root psql -h docker-swarm_dolphinscheduler-postgresql_1 -U root -d dolphinscheduler -v ON_ERROR_STOP=1 -f /docker-entrypoint-initdb.d/dolphinscheduler_postgre.sql"
- name: Set up JDK 1.8
uses: actions/setup-java@v1
with:
java-version: 1.8
- name: Git fetch unshallow
run: |
git fetch --unshallow
git config remote.origin.fetch "+refs/heads/*:refs/remotes/origin/*"
git fetch origin
- name: Compile
run: |
export MAVEN_OPTS='-Dmaven.repo.local=.m2/repository -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120 -XX:+TieredCompilation -XX:TieredStopAtLevel=1 -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC -XX:-UseGCOverheadLimit -Xmx5g'
mvn clean verify -B -Dmaven.test.skip=false
docker run --rm --network docker-swarm_dolphinscheduler -v $(pwd)/sql/dolphinscheduler_postgre.sql:/docker-entrypoint-initdb.d/dolphinscheduler_postgre.sql bitnami/postgresql:11.11.0 bash -c "PGPASSWORD=root psql -h docker-swarm_dolphinscheduler-postgresql_1 -U root -d dolphinscheduler -v ON_ERROR_STOP=1 -f /docker-entrypoint-initdb.d/dolphinscheduler_postgre.sql"
- name: Run Unit tests
run: ./mvnw clean verify -B -Dmaven.test.skip=false
- name: Upload coverage report to codecov
run: |
CODECOV_TOKEN="09c2663f-b091-4258-8a47-c981827eb29a" bash <(curl -s https://codecov.io/bash)
run: CODECOV_TOKEN="09c2663f-b091-4258-8a47-c981827eb29a" bash <(curl -s https://codecov.io/bash)
# Set up JDK 11 for SonarCloud.
- name: Set up JDK 1.11
uses: actions/setup-java@v1
- name: Set up JDK 11
uses: actions/setup-java@v2
with:
java-version: 1.11
java-version: 11
distribution: 'adopt'
- name: Run SonarCloud Analysis
run: >
mvn --batch-mode verify sonar:sonar
./mvnw --batch-mode verify sonar:sonar
-Dsonar.coverage.jacoco.xmlReportPaths=target/site/jacoco/jacoco.xml
-Dmaven.test.skip=true
-Dcheckstyle.skip=true
-Dsonar.host.url=https://sonarcloud.io
-Dsonar.organization=apache
-Dsonar.core.codeCoveragePlugin=jacoco
@@ -88,31 +90,16 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
- name: Collect logs
continue-on-error: true
run: |
mkdir -p ${LOG_DIR}
docker-compose -f $(pwd)/docker/docker-swarm/docker-compose.yml logs dolphinscheduler-postgresql > ${LOG_DIR}/db.txt
continue-on-error: true
Checkstyle:
name: Check code style
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Upload logs
uses: actions/upload-artifact@v2
continue-on-error: true
with:
submodule: true
- name: check code style
env:
WORKDIR: ./
REVIEWDOG_GITHUB_API_TOKEN: ${{ secrets.GITHUB_TOKEN }}
CHECKSTYLE_CONFIG: style/checkstyle.xml
REVIEWDOG_VERSION: v0.10.2
run: |
wget -O - -q https://github.com/checkstyle/checkstyle/releases/download/checkstyle-8.43/checkstyle-8.43-all.jar > /opt/checkstyle.jar
wget -O - -q https://raw.githubusercontent.com/reviewdog/reviewdog/master/install.sh | sh -s -- -b /opt ${REVIEWDOG_VERSION}
java -jar /opt/checkstyle.jar "${WORKDIR}" -c "${CHECKSTYLE_CONFIG}" -f xml \
| /opt/reviewdog -f=checkstyle \
-reporter="${INPUT_REPORTER:-github-pr-check}" \
-filter-mode="${INPUT_FILTER_MODE:-added}" \
-fail-on-error="${INPUT_FAIL_ON_ERROR:-false}"
name: unit-test-logs
path: ${LOG_DIR}

3
.gitmodules

@@ -24,3 +24,6 @@
[submodule ".github/actions/translate-on-issue"]
path = .github/actions/translate-on-issue
url = https://github.com/xingchun-chen/translation-helper.git
[submodule ".github/actions/reviewdog-setup"]
path = .github/actions/reviewdog-setup
url = https://github.com/reviewdog/action-setup

5
.licenserc.yaml

@@ -40,5 +40,10 @@ header:
- '**/.gitignore'
- '**/LICENSE'
- '**/NOTICE'
- '**/node_modules/**'
- '.github/actions/comment-on-issue/**'
- '.github/actions/lable-on-issue/**'
- '.github/actions/reviewdog-setup/**'
- '.github/actions/translate-on-issue/**'
comment: on-failure

2
README_zh_CN.md

@@ -82,7 +82,7 @@ dolphinscheduler-dist/target/apache-dolphinscheduler-${latest.release.version}-s
## 感谢
Dolphin Scheduler使用了很多优秀的开源项目,比如google的guava、guice、grpc,netty,ali的bonecp,quartz,以及apache的众多开源项目等等,
Dolphin Scheduler使用了很多优秀的开源项目,比如google的guava、guice、grpc,netty,quartz,以及apache的众多开源项目等等,
正是由于站在这些开源项目的肩膀上,才有Dolphin Scheduler的诞生的可能。对此我们对使用的所有开源软件表示非常的感谢!我们也希望自己不仅是开源的受益者,也能成为开源的贡献者,也希望对开源有同样热情和信念的伙伴加入进来,一起为开源献出一份力!
## 获得帮助

4
docker/build/hooks/build

@@ -39,8 +39,8 @@ echo "Repo: $DOCKER_REPO"
echo -e "Current Directory is $(pwd)\n"
# maven package(Project Directory)
echo -e "mvn -B clean package -Prelease -Dmaven.test.skip=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120"
mvn -B clean package -Prelease -Dmaven.test.skip=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120
echo -e "./mvnw -B clean package -Prelease -Dmaven.test.skip=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120"
./mvnw -B clean package -Prelease -Dmaven.test.skip=true -Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false -Dmaven.wagon.httpconnectionManager.ttlSeconds=120
# mv dolphinscheduler-bin.tar.gz file to docker/build directory
echo -e "mv $(pwd)/dolphinscheduler-dist/target/apache-dolphinscheduler-${VERSION}-bin.tar.gz $(pwd)/docker/build/\n"

13
dolphinscheduler-alert-plugin/dolphinscheduler-alert-email/pom.xml

@@ -31,17 +31,6 @@
<packaging>dolphinscheduler-plugin</packaging>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-collections4</artifactId>
@@ -131,4 +120,4 @@
<finalName>dolphinscheduler-alert-email-${project.version}</finalName>
</build>
</project>
</project>

23
dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/controller/AlertGroupController.java

@@ -20,6 +20,7 @@ package org.apache.dolphinscheduler.api.controller;
import static org.apache.dolphinscheduler.api.enums.Status.CREATE_ALERT_GROUP_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.DELETE_ALERT_GROUP_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.LIST_PAGING_ALERT_GROUP_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.QUERY_ALERT_GROUP_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.QUERY_ALL_ALERTGROUP_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.UPDATE_ALERT_GROUP_ERROR;
@@ -139,6 +140,28 @@ public class AlertGroupController extends BaseController {
searchVal = ParameterUtils.handleEscapes(searchVal);
return alertGroupService.listPaging(loginUser, searchVal, pageNo, pageSize);
}
/**
* query alert group detail by id
*
* @param loginUser login user
* @param id alert group id
* @return one alert group
*/
@ApiOperation(value = "queryAlertGroupById", notes = "QUERY_ALERT_GROUP_BY_ID_NOTES")
@ApiImplicitParams({@ApiImplicitParam(name = "id", value = "ALERT_GROUP_ID", dataType = "Int", example = "1")
})
@PostMapping(value = "/query")
@ResponseStatus(HttpStatus.OK)
@ApiException(QUERY_ALERT_GROUP_ERROR)
@AccessLogAnnotation(ignoreRequestArgs = "loginUser")
public Result queryAlertGroupById(@ApiIgnore @RequestAttribute(value = Constants.SESSION_USER) User loginUser,
@RequestParam("id") Integer id) {
Map<String, Object> result = alertGroupService.queryAlertGroupById(loginUser, id);
return returnDataList(result);
}
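
For context on how the new endpoint's payload is consumed, here is a minimal sketch, assuming the usual putMsg convention that the Status lands under Constants.STATUS and that the group is returned under the "data" key (as in the service implementation further below); the class and method names are invented for illustration:

import java.util.Map;

import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.AlertGroupService;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.dao.entity.AlertGroup;
import org.apache.dolphinscheduler.dao.entity.User;

public class AlertGroupQueryExample {

    // 'service' stands in for the Spring-injected AlertGroupServiceImpl; the caller
    // must be an admin, since the implementation rejects non-admin users.
    static AlertGroup fetch(AlertGroupService service, User admin, int id) {
        Map<String, Object> result = service.queryAlertGroupById(admin, id);
        if (Status.SUCCESS.equals(result.get(Constants.STATUS))) {
            return (AlertGroup) result.get("data"); // key set by the implementation
        }
        return null; // non-admin caller, or ALERT_GROUP_NOT_EXIST
    }
}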
/**
* updateProcessInstance alert group

1
dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/enums/Status.java

@@ -211,6 +211,7 @@ public enum Status {
WORKER_ADDRESS_INVALID(10177, "worker address {0} invalid", "worker地址[{0}]无效"),
QUERY_WORKER_ADDRESS_LIST_FAIL(10178, "query worker address list fail ", "查询worker地址列表失败"),
TRANSFORM_PROJECT_OWNERSHIP(10179, "Please transform project ownership [{0}]", "请先转移项目所有权[{0}]"),
QUERY_ALERT_GROUP_ERROR(10180, "query alert group error", "查询告警组错误"),
UDF_FUNCTION_NOT_EXIST(20001, "UDF function not found", "UDF函数不存在"),
UDF_FUNCTION_EXISTS(20002, "UDF function already exists", "UDF函数已存在"),

8
dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/AlertGroupService.java

@@ -34,6 +34,14 @@ public interface AlertGroupService {
*/
Map<String, Object> queryAlertgroup();
/**
* query alert group by id
*
* @param loginUser login user
* @param id alert group id
* @return one alert group
*/
Map<String, Object> queryAlertGroupById(User loginUser, Integer id);
/**
* paging query alarm group list
*

41
dolphinscheduler-api/src/main/java/org/apache/dolphinscheduler/api/service/impl/AlertGroupServiceImpl.java

@@ -27,6 +27,7 @@ import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.dao.entity.AlertGroup;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.AlertGroupMapper;
import org.apache.dolphinscheduler.dao.vo.AlertGroupVo;
import java.util.Date;
import java.util.HashMap;
@@ -70,6 +71,33 @@ public class AlertGroupServiceImpl extends BaseServiceImpl implements AlertGroup
return result;
}
/**
* query alert group by id
*
* @param loginUser login user
* @param id alert group id
* @return one alert group
*/
@Override
public Map<String, Object> queryAlertGroupById(User loginUser, Integer id) {
Map<String, Object> result = new HashMap<>();
result.put(Constants.STATUS, false);
//only admin can operate
if (isNotAdmin(loginUser, result)) {
return result;
}
//check if exist
AlertGroup alertGroup = alertGroupMapper.selectById(id);
if (alertGroup == null) {
putMsg(result, Status.ALERT_GROUP_NOT_EXIST);
return result;
}
result.put("data", alertGroup);
putMsg(result, Status.SUCCESS);
return result;
}
/**
* paging query alarm group list
*
@@ -88,13 +116,14 @@ public class AlertGroupServiceImpl extends BaseServiceImpl implements AlertGroup
return result;
}
Page<AlertGroup> page = new Page<>(pageNo, pageSize);
IPage<AlertGroup> alertGroupIPage = alertGroupMapper.queryAlertGroupPage(
page, searchVal);
PageInfo<AlertGroup> pageInfo = new PageInfo<>(pageNo, pageSize);
pageInfo.setTotal((int) alertGroupIPage.getTotal());
pageInfo.setTotalList(alertGroupIPage.getRecords());
Page<AlertGroupVo> page = new Page<>(pageNo, pageSize);
IPage<AlertGroupVo> alertGroupVoIPage = alertGroupMapper.queryAlertGroupVo(page, searchVal);
PageInfo<AlertGroupVo> pageInfo = new PageInfo<>(pageNo, pageSize);
pageInfo.setTotal((int) alertGroupVoIPage.getTotal());
pageInfo.setTotalList(alertGroupVoIPage.getRecords());
result.setData(pageInfo);
putMsg(result, Status.SUCCESS);
return result;
}

38
dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/controller/AlertGroupControllerTest.java

@@ -49,6 +49,7 @@ public class AlertGroupControllerTest extends AbstractControllerTest {
paramsMap.add("groupName","cxc test group name");
paramsMap.add("groupType", AlertType.EMAIL.toString());
paramsMap.add("description","cxc junit 测试告警描述");
paramsMap.add("alertInstanceIds", "");
MvcResult mvcResult = mockMvc.perform(post("/alert-group/create")
.header("sessionId", sessionId)
.params(paramsMap))
@@ -93,40 +94,41 @@ public class AlertGroupControllerTest extends AbstractControllerTest {
}
@Test
public void testUpdateAlertgroup() throws Exception {
public void testQueryAlertGroupById() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("id","22");
paramsMap.add("groupName", "hd test group name");
paramsMap.add("groupType",AlertType.EMAIL.toString());
paramsMap.add("description","update alter group");
MvcResult mvcResult = mockMvc.perform(post("/alert-group/update")
MvcResult mvcResult = mockMvc.perform(post("/alert-group/query")
.header("sessionId", sessionId)
.params(paramsMap))
.andExpect(status().isOk())
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andReturn();
Result result = JSONUtils.parseObject(mvcResult.getResponse().getContentAsString(), Result.class);
Assert.assertTrue(result != null && result.isSuccess());
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_NOT_EXIST));
logger.info(mvcResult.getResponse().getContentAsString());
}
@Test
public void testVerifyGroupName() throws Exception {
public void testUpdateAlertgroup() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("groupName","hd test group name");
MvcResult mvcResult = mockMvc.perform(get("/alert-group/verify-group-name")
paramsMap.add("id","22");
paramsMap.add("groupName", "cxc test group name");
paramsMap.add("groupType",AlertType.EMAIL.toString());
paramsMap.add("description","update alter group");
paramsMap.add("alertInstanceIds", "");
MvcResult mvcResult = mockMvc.perform(post("/alert-group/update")
.header("sessionId", sessionId)
.params(paramsMap))
.andExpect(status().isOk())
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andReturn();
Result result = JSONUtils.parseObject(mvcResult.getResponse().getContentAsString(), Result.class);
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_EXIST));
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_NOT_EXIST));
logger.info(mvcResult.getResponse().getContentAsString());
}
@Test
public void testVerifyGroupNameNotExit() throws Exception {
public void testVerifyGroupName() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("groupName","cxc test group name");
MvcResult mvcResult = mockMvc.perform(get("/alert-group/verify-group-name")
@@ -136,24 +138,22 @@ public class AlertGroupControllerTest extends AbstractControllerTest {
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andReturn();
Result result = JSONUtils.parseObject(mvcResult.getResponse().getContentAsString(), Result.class);
Assert.assertTrue(result != null && result.isSuccess());
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_EXIST));
logger.info(mvcResult.getResponse().getContentAsString());
}
@Test
public void testGrantUser() throws Exception {
public void testVerifyGroupNameNotExit() throws Exception {
MultiValueMap<String, String> paramsMap = new LinkedMultiValueMap<>();
paramsMap.add("alertgroupId","2");
paramsMap.add("userIds","2");
MvcResult mvcResult = mockMvc.perform(post("/alert-group/grant-user")
paramsMap.add("groupName","cxc test group name");
MvcResult mvcResult = mockMvc.perform(get("/alert-group/verify-group-name")
.header("sessionId", sessionId)
.params(paramsMap))
.andExpect(status().isOk())
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andReturn();
Result result = JSONUtils.parseObject(mvcResult.getResponse().getContentAsString(), Result.class);
Assert.assertTrue(result != null && result.isSuccess());
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_EXIST));
logger.info(mvcResult.getResponse().getContentAsString());
}
@@ -168,7 +168,7 @@ public class AlertGroupControllerTest extends AbstractControllerTest {
.andExpect(content().contentType(MediaType.APPLICATION_JSON_UTF8))
.andReturn();
Result result = JSONUtils.parseObject(mvcResult.getResponse().getContentAsString(), Result.class);
Assert.assertTrue(result != null && result.isSuccess());
Assert.assertTrue(result != null && result.isStatus(Status.ALERT_GROUP_NOT_EXIST));
logger.info(mvcResult.getResponse().getContentAsString());
}
}

28
dolphinscheduler-api/src/test/java/org/apache/dolphinscheduler/api/service/AlertGroupServiceTest.java

@@ -30,6 +30,7 @@ import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.dao.entity.AlertGroup;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.AlertGroupMapper;
import org.apache.dolphinscheduler.dao.vo.AlertGroupVo;
import java.util.ArrayList;
import java.util.List;
@@ -77,10 +78,10 @@ public class AlertGroupServiceTest {
@Test
public void testListPaging() {
IPage<AlertGroup> page = new Page<>(1, 10);
IPage<AlertGroupVo> page = new Page<>(1, 10);
page.setTotal(1L);
page.setRecords(getList());
Mockito.when(alertGroupMapper.queryAlertGroupPage(any(Page.class), eq(groupName))).thenReturn(page);
page.setRecords(getAlertGroupVoList());
Mockito.when(alertGroupMapper.queryAlertGroupVo(any(Page.class), eq(groupName))).thenReturn(page);
User user = new User();
// no operate
Result result = alertGroupService.listPaging(user, groupName, 1, 10);
@@ -90,7 +91,7 @@ public class AlertGroupServiceTest {
user.setUserType(UserType.ADMIN_USER);
result = alertGroupService.listPaging(user, groupName, 1, 10);
logger.info(result.toString());
PageInfo<AlertGroup> pageInfo = (PageInfo<AlertGroup>) result.getData();
PageInfo<AlertGroupVo> pageInfo = (PageInfo<AlertGroupVo>) result.getData();
Assert.assertTrue(CollectionUtils.isNotEmpty(pageInfo.getTotalList()));
}
@@ -216,4 +217,23 @@ public class AlertGroupServiceTest {
return alertGroup;
}
/**
* get AlertGroupVo list
*/
private List<AlertGroupVo> getAlertGroupVoList() {
List<AlertGroupVo> alertGroupVos = new ArrayList<>();
alertGroupVos.add(getAlertGroupVoEntity());
return alertGroupVos;
}
/**
* get AlertGroupVo entity
*/
private AlertGroupVo getAlertGroupVoEntity() {
AlertGroupVo alertGroupVo = new AlertGroupVo();
alertGroupVo.setId(1);
alertGroupVo.setGroupName(groupName);
return alertGroupVo;
}
}

1
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/Constants.java

@@ -776,6 +776,7 @@ public final class Constants {
public static final String PROCESS_INSTANCE_STATE = "processInstanceState";
public static final String PARENT_WORKFLOW_INSTANCE = "parentWorkflowInstance";
public static final String CONDITION_RESULT = "conditionResult";
public static final String SWITCH_RESULT = "switchResult";
public static final String DEPENDENCE = "dependence";
public static final String TASK_TYPE = "taskType";
public static final String TASK_LIST = "taskList";

66
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/enums/DbType.java

@@ -14,65 +14,45 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.common.enums;
import com.baomidou.mybatisplus.annotation.EnumValue;
import static java.util.stream.Collectors.toMap;
import java.util.HashMap;
import java.util.Arrays;
import java.util.Map;
/**
* data base types
*/
public enum DbType {
/**
* 0 mysql
* 1 postgresql
* 2 hive
* 3 spark
* 4 clickhouse
* 5 oracle
* 6 sqlserver
* 7 db2
* 8 presto
*/
MYSQL(0, "mysql"),
POSTGRESQL(1, "postgresql"),
HIVE(2, "hive"),
SPARK(3, "spark"),
CLICKHOUSE(4, "clickhouse"),
ORACLE(5, "oracle"),
SQLSERVER(6, "sqlserver"),
DB2(7, "db2"),
PRESTO(8, "presto");
import com.baomidou.mybatisplus.annotation.EnumValue;
import com.google.common.base.Functions;
DbType(int code, String descp) {
public enum DbType {
MYSQL(0),
POSTGRESQL(1),
HIVE(2),
SPARK(3),
CLICKHOUSE(4),
ORACLE(5),
SQLSERVER(6),
DB2(7),
PRESTO(8),
H2(9);
DbType(int code) {
this.code = code;
this.descp = descp;
}
@EnumValue
private final int code;
private final String descp;
public int getCode() {
return code;
}
public String getDescp() {
return descp;
}
private static HashMap<Integer, DbType> DB_TYPE_MAP =new HashMap<>();
static {
for (DbType dbType:DbType.values()){
DB_TYPE_MAP.put(dbType.getCode(),dbType);
}
}
private static final Map<Integer, DbType> DB_TYPE_MAP =
Arrays.stream(DbType.values()).collect(toMap(DbType::getCode, Functions.identity()));
public static DbType of(int type){
if(DB_TYPE_MAP.containsKey(type)){
public static DbType of(int type) {
if (DB_TYPE_MAP.containsKey(type)) {
return DB_TYPE_MAP.get(type);
}
return null;
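
A minimal, runnable sketch of the lookup this refactor preserves; the codes are those declared above, and the class name is invented:

import org.apache.dolphinscheduler.common.enums.DbType;

public class DbTypeLookupExample {
    public static void main(String[] args) {
        // of() resolves a persisted code back to its enum constant via DB_TYPE_MAP
        System.out.println(DbType.of(1));  // POSTGRESQL
        System.out.println(DbType.of(9));  // H2, the constant added in this change
        System.out.println(DbType.of(42)); // unmapped codes return null, so callers must check
    }
}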

4
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/enums/TaskType.java

@@ -51,7 +51,9 @@ public enum TaskType {
DATAX(10, "DATAX"),
CONDITIONS(11, "CONDITIONS"),
SQOOP(12, "SQOOP"),
WATERDROP(13, "WATERDROP");
WATERDROP(13, "WATERDROP"),
SWITCH(14, "SWITCH"),
;
TaskType(int code, String desc) {
this.code = code;

12
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/graph/DAG.java

@@ -432,13 +432,11 @@ public class DAG<Node, NodeInfo, EdgeInfo> {
* @return all neighbor nodes of the node
*/
private Set<Node> getNeighborNodes(Node node, final Map<Node, Map<Node, EdgeInfo>> edges) {
final Map<Node, EdgeInfo> neighborEdges = edges.get(node);
if (neighborEdges == null) {
return Collections.EMPTY_MAP.keySet();
}
return neighborEdges.keySet();
final Map<Node, EdgeInfo> neighborEdges = edges.get(node);
if (neighborEdges == null) {
return Collections.emptySet();
}
return neighborEdges.keySet();
}

19
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/model/TaskNode.java

@@ -14,6 +14,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.common.model;
import org.apache.dolphinscheduler.common.Constants;
@@ -33,7 +34,6 @@ import java.util.Objects;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
public class TaskNode {
/**
@@ -129,6 +129,10 @@ public class TaskNode {
@JsonSerialize(using = JSONUtils.JsonDataSerializer.class)
private String conditionResult;
@JsonDeserialize(using = JSONUtils.JsonDataDeserializer.class)
@JsonSerialize(using = JSONUtils.JsonDataSerializer.class)
private String switchResult;
/**
* task instance priority
*/
@@ -365,6 +369,10 @@ public class TaskNode {
return TaskType.CONDITIONS.getDesc().equalsIgnoreCase(this.getType());
}
public boolean isSwitchTask() {
return TaskType.SWITCH.toString().equalsIgnoreCase(this.getType());
}
public List<PreviousTaskNode> getPreTaskNodeList() {
return preTaskNodeList;
}
@@ -380,6 +388,7 @@
}
taskParams.put(Constants.CONDITION_RESULT, this.conditionResult);
taskParams.put(Constants.DEPENDENCE, this.dependence);
taskParams.put(Constants.SWITCH_RESULT, this.switchResult);
return JSONUtils.toJsonString(taskParams);
}
@@ -417,4 +426,12 @@
+ ", delayTime=" + delayTime
+ '}';
}
public String getSwitchResult() {
return switchResult;
}
public void setSwitchResult(String switchResult) {
this.switchResult = switchResult;
}
}

91
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/task/switchtask/SwitchParameters.java

@@ -0,0 +1,91 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.common.task.switchtask;
import org.apache.dolphinscheduler.common.enums.DependentRelation;
import org.apache.dolphinscheduler.common.process.ResourceInfo;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import java.util.ArrayList;
import java.util.List;
public class SwitchParameters extends AbstractParameters {
private DependentRelation dependRelation;
private String relation;
private List<String> nextNode;
@Override
public boolean checkParameters() {
return true;
}
@Override
public List<ResourceInfo> getResourceFilesList() {
return new ArrayList<>();
}
private int resultConditionLocation;
private List<SwitchResultVo> dependTaskList;
public DependentRelation getDependRelation() {
return dependRelation;
}
public void setDependRelation(DependentRelation dependRelation) {
this.dependRelation = dependRelation;
}
public int getResultConditionLocation() {
return resultConditionLocation;
}
public void setResultConditionLocation(int resultConditionLocation) {
this.resultConditionLocation = resultConditionLocation;
}
public String getRelation() {
return relation;
}
public void setRelation(String relation) {
this.relation = relation;
}
public List<SwitchResultVo> getDependTaskList() {
return dependTaskList;
}
public void setDependTaskList(List<SwitchResultVo> dependTaskList) {
this.dependTaskList = dependTaskList;
}
public List<String> getNextNode() {
return nextNode;
}
public void setNextNode(Object nextNode) {
if (nextNode instanceof String) {
List<String> nextNodeList = new ArrayList<>();
nextNodeList.add(String.valueOf(nextNode));
this.nextNode = nextNodeList;
} else {
this.nextNode = (ArrayList) nextNode;
}
}
}
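
setNextNode takes Object because the front end may serialize a single successor as a bare string instead of a one-element array; the setter normalizes both shapes (the same coercion appears in SwitchResultVo below). A small sketch with invented task names; note the raw cast means only an ArrayList is accepted for the list form:

import java.util.ArrayList;
import java.util.Arrays;

import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;

public class NextNodeCoercionExample {
    public static void main(String[] args) {
        SwitchParameters params = new SwitchParameters();

        params.setNextNode("task-a");              // bare string form
        System.out.println(params.getNextNode());  // [task-a]

        params.setNextNode(new ArrayList<>(Arrays.asList("task-a", "task-b"))); // list form
        System.out.println(params.getNextNode());  // [task-a, task-b]
    }
}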

49
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/task/switchtask/SwitchResultVo.java

@@ -0,0 +1,49 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.common.task.switchtask;
import java.util.ArrayList;
import java.util.List;
public class SwitchResultVo {
private String condition;
private List<String> nextNode;
public String getCondition() {
return condition;
}
public void setCondition(String condition) {
this.condition = condition;
}
public List<String> getNextNode() {
return nextNode;
}
public void setNextNode(Object nextNode) {
if (nextNode instanceof String) {
List<String> nextNodeList = new ArrayList<>();
nextNodeList.add(String.valueOf(nextNode));
this.nextNode = nextNodeList;
} else {
this.nextNode = (ArrayList) nextNode;
}
}
}

18
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/utils/PropertyUtils.java

@@ -34,15 +34,7 @@ import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* property utils
* single instance
*/
public class PropertyUtils {
/**
* logger
*/
private static final Logger logger = LoggerFactory.getLogger(PropertyUtils.class);
private static final Properties properties = new Properties();
@@ -55,9 +47,6 @@
loadPropertyFile(COMMON_PROPERTIES_PATH);
}
/**
* init properties
*/
public static synchronized void loadPropertyFile(String... propertyFiles) {
for (String fileName : propertyFiles) {
try (InputStream fis = PropertyUtils.class.getResourceAsStream(fileName);) {
@@ -68,6 +57,13 @@
System.exit(1);
}
}
// Override from system properties
System.getProperties().forEach((k, v) -> {
final String key = String.valueOf(k);
logger.info("Overriding property from system property: {}", key);
PropertyUtils.setValue(key, String.valueOf(v));
});
}
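
The effect of the new loop is that any -D flag passed to the JVM overrides the value loaded from the property files. A sketch assuming the existing PropertyUtils.getString accessor; the key is invented for the example:

import org.apache.dolphinscheduler.common.utils.PropertyUtils;

public class PropertyOverrideExample {
    public static void main(String[] args) {
        // stands in for launching the server with -Dexample.key=from-jvm;
        // it must run before PropertyUtils is first touched, because the
        // override happens inside loadPropertyFile in the static initializer
        System.setProperty("example.key", "from-jvm");
        System.out.println(PropertyUtils.getString("example.key")); // from-jvm
    }
}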
/**

3
dolphinscheduler-common/src/main/java/org/apache/dolphinscheduler/common/utils/TaskParametersUtils.java

@@ -31,6 +31,7 @@ import org.apache.dolphinscheduler.common.task.spark.SparkParameters;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.subprocess.SubProcessParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -82,6 +83,8 @@ public class TaskParametersUtils {
return JSONUtils.parseObject(parameter, ConditionsParameters.class);
case "SQOOP":
return JSONUtils.parseObject(parameter, SqoopParameters.class);
case "SWITCH":
return JSONUtils.parseObject(parameter, SwitchParameters.class);
default:
logger.error("not support task type: {}", taskType);
return null;

1
dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/datasource/SpringConnectionFactory.java

@@ -166,6 +166,7 @@ public class SpringConnectionFactory {
Properties properties = new Properties();
properties.setProperty("MySQL", "mysql");
properties.setProperty("PostgreSQL", "pg");
properties.setProperty("h2", "h2");
databaseIdProvider.setProperties(properties);
return databaseIdProvider;
}

25
dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/entity/TaskInstance.java

@@ -23,6 +23,7 @@ import org.apache.dolphinscheduler.common.enums.Flag;
import org.apache.dolphinscheduler.common.enums.Priority;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.task.dependent.DependentParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import java.io.Serializable;
@@ -174,6 +175,12 @@ public class TaskInstance implements Serializable {
@TableField(exist = false)
private DependentParameters dependency;
/**
* switch dependency
*/
@TableField(exist = false)
private SwitchParameters switchDependency;
/**
* duration
*/
@@ -426,6 +433,20 @@
this.dependency = dependency;
}
public SwitchParameters getSwitchDependency() {
if (this.switchDependency == null) {
Map<String, Object> taskParamsMap = JSONUtils.toMap(this.getTaskParams(), String.class, Object.class);
this.switchDependency = JSONUtils.parseObject((String) taskParamsMap.get(Constants.SWITCH_RESULT), SwitchParameters.class);
}
return this.switchDependency;
}
public void setSwitchDependency(SwitchParameters switchDependency) {
Map<String, Object> taskParamsMap = JSONUtils.toMap(this.getTaskParams(), String.class, Object.class);
taskParamsMap.put(Constants.SWITCH_RESULT,JSONUtils.toJsonString(switchDependency));
this.setTaskParams(JSONUtils.toJsonString(taskParamsMap));
}
public Flag getFlag() {
return flag;
}
@@ -510,6 +531,10 @@
return TaskType.CONDITIONS.getDesc().equalsIgnoreCase(this.taskType);
}
public boolean isSwitchTask() {
return TaskType.SWITCH.getDesc().equalsIgnoreCase(this.taskType);
}
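
The new accessors round-trip the switch configuration through the taskParams JSON rather than a plain field: the setter writes it under the switchResult key and the getter lazily re-parses it on first access. A self-contained sketch (class name invented):

import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;

public class SwitchDependencyRoundTrip {
    public static void main(String[] args) {
        TaskInstance instance = new TaskInstance();
        instance.setTaskParams("{}"); // start from empty task params

        SwitchParameters params = new SwitchParameters();
        params.setResultConditionLocation(2);
        instance.setSwitchDependency(params); // serialized into taskParams

        // parsed back out of taskParams, since the field itself is not assigned by the setter
        System.out.println(instance.getSwitchDependency().getResultConditionLocation()); // 2
    }
}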
/**
* determine if you can try again
*

10
dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/mapper/AlertGroupMapper.java

@@ -18,6 +18,7 @@
package org.apache.dolphinscheduler.dao.mapper;
import org.apache.dolphinscheduler.dao.entity.AlertGroup;
import org.apache.dolphinscheduler.dao.vo.AlertGroupVo;
import org.apache.ibatis.annotations.Param;
@@ -82,4 +83,13 @@ public interface AlertGroupMapper extends BaseMapper<AlertGroup> {
* @return
*/
String queryAlertGroupInstanceIdsById(@Param("alertGroupId") int alertGroupId);
/**
* query alertGroupVo page list
* @param page page
* @param groupName groupName
* @return IPage<AlertGroupVo>: include alert group id and group_name
*/
IPage<AlertGroupVo> queryAlertGroupVo(Page<AlertGroupVo> page,
@Param("groupName") String groupName);
}

50
dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/utils/DagHelper.java

@@ -14,8 +14,8 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.dao.utils;
package org.apache.dolphinscheduler.dao.utils;
import org.apache.dolphinscheduler.common.enums.TaskDependType;
import org.apache.dolphinscheduler.common.graph.DAG;
@@ -23,6 +23,8 @@ import org.apache.dolphinscheduler.common.model.TaskNodeRelation;
import org.apache.dolphinscheduler.common.model.TaskNodeRelation;
import org.apache.dolphinscheduler.common.process.ProcessDag;
import org.apache.dolphinscheduler.common.task.conditions.ConditionsParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchResultVo;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.ProcessTaskRelation;
@@ -281,6 +283,9 @@ public class DagHelper {
} else if (dag.getNode(preNodeName).isConditionsTask()) {
List<String> conditionTaskList = parseConditionTask(preNodeName, skipTaskNodeList, dag, completeTaskList);
startVertexes.addAll(conditionTaskList);
} else if (dag.getNode(preNodeName).isSwitchTask()) {
List<String> conditionTaskList = parseSwitchTask(preNodeName, skipTaskNodeList, dag, completeTaskList);
startVertexes.addAll(conditionTaskList);
} else {
startVertexes = dag.getSubsequentNodes(preNodeName);
}
@@ -355,6 +360,49 @@
return conditionTaskList;
}
/**
* parse switch task: find the branch to follow and
* set the skip flag for the other branches.
*
* @param nodeName
* @return
*/
public static List<String> parseSwitchTask(String nodeName,
Map<String, TaskNode> skipTaskNodeList,
DAG<String, TaskNode, TaskNodeRelation> dag,
Map<String, TaskInstance> completeTaskList) {
List<String> conditionTaskList = new ArrayList<>();
TaskNode taskNode = dag.getNode(nodeName);
if (!taskNode.isSwitchTask()) {
return conditionTaskList;
}
if (!completeTaskList.containsKey(nodeName)) {
return conditionTaskList;
}
conditionTaskList = skipTaskNode4Switch(taskNode, skipTaskNodeList, completeTaskList, dag);
return conditionTaskList;
}
private static List<String> skipTaskNode4Switch(TaskNode taskNode, Map<String, TaskNode> skipTaskNodeList,
Map<String, TaskInstance> completeTaskList,
DAG<String, TaskNode, TaskNodeRelation> dag) {
SwitchParameters switchParameters = completeTaskList.get(taskNode.getName()).getSwitchDependency();
int resultConditionLocation = switchParameters.getResultConditionLocation();
List<SwitchResultVo> conditionResultVoList = switchParameters.getDependTaskList();
List<String> switchTaskList = conditionResultVoList.get(resultConditionLocation).getNextNode();
if (CollectionUtils.isEmpty(switchTaskList)) {
switchTaskList = new ArrayList<>();
}
conditionResultVoList.remove(resultConditionLocation);
for (SwitchResultVo info : conditionResultVoList) {
if (CollectionUtils.isEmpty(info.getNextNode())) {
continue;
}
setTaskNodeSkip(info.getNextNode().get(0), dag, completeTaskList, skipTaskNodeList);
}
return switchTaskList;
}
/**
* set task node and the post nodes skip flag
*/

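In effect, skipTaskNode4Switch keeps only the branch stored at resultConditionLocation: its next nodes are returned for submission, while the first next node of every other branch (and everything after it) gets the skip flag. A toy illustration of the selection rule, detached from the real DAG types (all names hypothetical):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SwitchBranchSelection {
    public static void main(String[] args) {
        // branches.get(i) holds the next nodes of condition i
        List<List<String>> branches = new ArrayList<>(Arrays.asList(
                Arrays.asList("2"),   // condition 0
                Arrays.asList("4"),   // condition 1
                Arrays.asList("6"))); // default branch
        int resultConditionLocation = 1; // the condition that evaluated to true

        List<String> toRun = branches.remove(resultConditionLocation);
        System.out.println("submit: " + toRun);  // submit: [4]
        System.out.println("skip: " + branches); // skip: [[2], [6]]
    }
}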
38
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/ITargetGenerator.java → dolphinscheduler-dao/src/main/java/org/apache/dolphinscheduler/dao/vo/AlertGroupVo.java

@@ -15,22 +15,36 @@
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
package org.apache.dolphinscheduler.dao.vo;
/**
* Target Generator Interface
* AlertGroupVo
*/
public interface ITargetGenerator {
public class AlertGroupVo {
/**
* generate the target script
*
* @param sqoopParameters sqoopParameters
* @param taskExecutionContext taskExecutionContext
* @return target script
* primary key
*/
private int id;
/**
* group_name
*/
String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext);
private String groupName;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getGroupName() {
return groupName;
}
public void setGroupName(String groupName) {
this.groupName = groupName;
}
}

11
dolphinscheduler-dao/src/main/resources/org/apache/dolphinscheduler/dao/mapper/AlertGroupMapper.xml

@@ -32,6 +32,15 @@
</if>
order by update_time desc
</select>
<select id="queryAlertGroupVo" resultType="org.apache.dolphinscheduler.dao.vo.AlertGroupVo">
select id, group_name
from t_ds_alertgroup
where 1 = 1
<if test="groupName != null and groupName != ''">
and group_name like concat('%', #{groupName}, '%')
</if>
order by update_time desc
</select>
<select id="queryByGroupName" resultType="org.apache.dolphinscheduler.dao.entity.AlertGroup">
select
<include refid="baseSql"/>
@@ -70,4 +79,4 @@
select alert_instance_ids from t_ds_alertgroup
where id = #{alertGroupId}
</select>
</mapper>
</mapper>

3
dolphinscheduler-dao/src/main/resources/org/apache/dolphinscheduler/dao/mapper/WorkFlowLineageMapper.xml

@@ -64,8 +64,7 @@
and project_code = #{projectCode}
</select>
<select id="queryWorkFlowLineageByCode" resultType="org.apache.dolphinscheduler.dao.entity.WorkFlowLineage"
databaseId="mysql">
<select id="queryWorkFlowLineageByCode" resultType="org.apache.dolphinscheduler.dao.entity.WorkFlowLineage">
select tepd.id as work_flow_id,tepd.name as work_flow_name,
"" as source_work_flow_id,
tepd.release_state as work_flow_publish_status,

99
dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/utils/DagHelperTest.java

@@ -25,6 +25,8 @@ import org.apache.dolphinscheduler.common.graph.DAG;
import org.apache.dolphinscheduler.common.model.TaskNode;
import org.apache.dolphinscheduler.common.model.TaskNodeRelation;
import org.apache.dolphinscheduler.common.process.ProcessDag;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchResultVo;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.ProcessData;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;
@@ -251,6 +253,10 @@ public class DagHelperTest {
skipNodeList.clear();
completeTaskList.remove("3");
taskInstance = new TaskInstance();
Map<String, Object> taskParamsMap = new HashMap<>();
taskParamsMap.put(Constants.SWITCH_RESULT, "");
taskInstance.setTaskParams(JSONUtils.toJsonString(taskParamsMap));
taskInstance.setState(ExecutionStatus.FAILURE);
completeTaskList.put("3", taskInstance);
postNodes = DagHelper.parsePostNodes(null, skipNodeList, dag, completeTaskList);
@@ -259,6 +265,17 @@
Assert.assertEquals(2, skipNodeList.size());
Assert.assertTrue(skipNodeList.containsKey("5"));
Assert.assertTrue(skipNodeList.containsKey("7"));
// dag2: switch node 1 with branches 2/4/5
// after node 1 completes, only the chosen branch is submitted
dag = generateDag2();
skipNodeList.clear();
completeTaskList.clear();
taskInstance.setSwitchDependency(getSwitchNode());
completeTaskList.put("1", taskInstance);
postNodes = DagHelper.parsePostNodes("1", skipNodeList, dag, completeTaskList);
Assert.assertEquals(1, postNodes.size());
}
/**
@@ -286,7 +303,6 @@
node2.setPreTasks(JSONUtils.toJsonString(dep2));
taskNodeList.add(node2);
TaskNode node4 = new TaskNode();
node4.setId("4");
node4.setName("4");
@@ -351,6 +367,87 @@
return DagHelper.buildDagGraph(processDag);
}
/**
* 0 (standalone)
* 1(switch)->2
* 1->4
* 1->5
*
* @return dag
* @throws IOException if the task node JSON cannot be processed
*/
private DAG<String, TaskNode, TaskNodeRelation> generateDag2() throws IOException {
List<TaskNode> taskNodeList = new ArrayList<>();
TaskNode node = new TaskNode();
node.setId("0");
node.setName("0");
node.setType("SHELL");
taskNodeList.add(node);
TaskNode node1 = new TaskNode();
node1.setId("1");
node1.setName("1");
node1.setType("switch");
node1.setDependence(JSONUtils.toJsonString(getSwitchNode()));
taskNodeList.add(node1);
TaskNode node2 = new TaskNode();
node2.setId("2");
node2.setName("2");
node2.setType("SHELL");
List<String> dep2 = new ArrayList<>();
dep2.add("1");
node2.setPreTasks(JSONUtils.toJsonString(dep2));
taskNodeList.add(node2);
TaskNode node4 = new TaskNode();
node4.setId("4");
node4.setName("4");
node4.setType("SHELL");
List<String> dep4 = new ArrayList<>();
dep4.add("1");
node4.setPreTasks(JSONUtils.toJsonString(dep4));
taskNodeList.add(node4);
TaskNode node5 = new TaskNode();
node5.setId("4");
node5.setName("4");
node5.setType("SHELL");
List<String> dep5 = new ArrayList<>();
dep5.add("1");
node5.setPreTasks(JSONUtils.toJsonString(dep5));
taskNodeList.add(node5);
List<String> startNodes = new ArrayList<>();
List<String> recoveryNodes = new ArrayList<>();
List<TaskNode> destTaskNodeList = DagHelper.generateFlowNodeListByStartNode(taskNodeList,
startNodes, recoveryNodes, TaskDependType.TASK_POST);
List<TaskNodeRelation> taskNodeRelations = DagHelper.generateRelationListByFlowNodes(destTaskNodeList);
ProcessDag processDag = new ProcessDag();
processDag.setEdges(taskNodeRelations);
processDag.setNodes(destTaskNodeList);
return DagHelper.buildDagGraph(processDag);
}
private SwitchParameters getSwitchNode() {
SwitchParameters switchParameters = new SwitchParameters();
SwitchResultVo switchResultVo1 = new SwitchResultVo();
switchResultVo1.setCondition(" 2 == 1");
switchResultVo1.setNextNode("2");
SwitchResultVo switchResultVo2 = new SwitchResultVo();
switchResultVo2.setCondition(" 2 == 2");
switchResultVo2.setNextNode("4");
List<SwitchResultVo> list = new ArrayList<>();
list.add(switchResultVo1);
list.add(switchResultVo2);
switchParameters.setDependTaskList(list);
switchParameters.setNextNode("5");
switchParameters.setRelation("AND");
return switchParameters;
}
@Test
public void testBuildDagGraph() {
String shellJson = "{\"globalParams\":[],\"tasks\":[{\"type\":\"SHELL\",\"id\":\"tasks-9527\",\"name\":\"shell-1\","

2
dolphinscheduler-dao/src/test/java/org/apache/dolphinscheduler/dao/utils/ResourceProcessDefinitionUtilsTest.java

@@ -31,7 +31,7 @@ public class ResourceProcessDefinitionUtilsTest {
@Test
public void getResourceProcessDefinitionMapTest(){
List<Map<String,Object>> mapList = new ArrayList<>();
Map<String,Object> map = new HashMap();
Map<String,Object> map = new HashMap<>();
map.put("code",1L);
map.put("resource_ids","1,2,3");
mapList.add(map);

7
dolphinscheduler-dist/pom.xml vendored

@@ -37,6 +37,11 @@
<artifactId>dolphinscheduler-server</artifactId>
</dependency>
<dependency>
<groupId>org.apache.dolphinscheduler</groupId>
<artifactId>dolphinscheduler-standalone-server</artifactId>
</dependency>
<dependency>
<groupId>org.apache.dolphinscheduler</groupId>
<artifactId>dolphinscheduler-api</artifactId>
@@ -377,4 +382,4 @@
</profiles>
</project>
</project>

3
dolphinscheduler-dist/release-docs/LICENSE vendored

@@ -249,6 +249,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
curator-client 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-client/4.3.0, Apache 2.0
curator-framework 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-framework/4.3.0, Apache 2.0
curator-recipes 4.3.0: https://mvnrepository.com/artifact/org.apache.curator/curator-recipes/4.3.0, Apache 2.0
curator-test 2.12.0: https://mvnrepository.com/artifact/org.apache.curator/curator-test/2.12.0, Apache 2.0
datanucleus-api-jdo 4.2.1: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-api-jdo/4.2.1, Apache 2.0
datanucleus-core 4.1.6: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-core/4.1.6, Apache 2.0
datanucleus-rdbms 4.1.7: https://mvnrepository.com/artifact/org.datanucleus/datanucleus-rdbms/4.1.7, Apache 2.0
@@ -557,4 +558,4 @@ Apache 2.0 licenses
========================================
BSD licenses
========================================
d3 3.5.17: https://github.com/d3/d3 BSD-3-Clause
d3 3.5.17: https://github.com/d3/d3 BSD-3-Clause

9
dolphinscheduler-dist/src/main/assembly/dolphinscheduler-bin.xml vendored

@@ -61,15 +61,6 @@
<outputDirectory>conf</outputDirectory>
</fileSet>
<fileSet>
<directory>${basedir}/../dolphinscheduler-common/src/main/resources/bin</directory>
<includes>
<include>*.*</include>
</includes>
<directoryMode>755</directoryMode>
<outputDirectory>bin</outputDirectory>
</fileSet>
<fileSet>
<directory>${basedir}/../dolphinscheduler-dao/src/main/resources</directory>
<includes>

27
dolphinscheduler-dist/src/main/provisio/dolphinscheduler.xml vendored

@@ -15,6 +15,12 @@
~ limitations under the License.
-->
<runtime>
<artifactSet to="/lib">
<exclusion>org.slf4j:slf4j-log4j12</exclusion>
<exclusion>org.slf4j:slf4j-api</exclusion>
<exclusion>ch.qos.logback:logback-classic</exclusion>
</artifactSet>
<!-- Target -->
<archive name="${project.artifactId}-${project.version}.tar.gz" hardLinkIncludes="**/*.jar"/>
<!-- Notices -->
@@ -80,4 +86,25 @@
<unpack/>
</artifact>
</artifactSet>
<artifactSet to="lib/plugin/task/python">
<artifact id="${project.groupId}:dolphinscheduler-task-python:zip:${project.version}">
<unpack/>
</artifact>
</artifactSet>
<artifactSet to="lib/plugin/task/flink">
<artifact id="${project.groupId}:dolphinscheduler-task-flink:zip:${project.version}">
<unpack/>
</artifact>
</artifactSet>
<artifactSet to="lib/plugin/task/spark">
<artifact id="${project.groupId}:dolphinscheduler-task-spark:zip:${project.version}">
<unpack/>
</artifact>
</artifactSet>
<artifactSet to="lib/plugin/task/http">
<artifact id="${project.groupId}:dolphinscheduler-task-http:zip:${project.version}">
<unpack/>
</artifact>
</artifactSet>
</runtime>

11
dolphinscheduler-registry-plugin/dolphinscheduler-registry-zookeeper/src/main/java/org/apache/dolphinscheduler/plugin/registry/zookeeper/ZookeeperRegistry.java

@@ -47,6 +47,7 @@ import org.apache.curator.framework.recipes.locks.InterProcessMutex;
import org.apache.curator.retry.ExponentialBackoffRetry;
import org.apache.curator.utils.CloseableUtils;
import org.apache.zookeeper.CreateMode;
import org.apache.zookeeper.KeeperException;
import org.apache.zookeeper.ZooDefs;
import org.apache.zookeeper.data.ACL;
@@ -195,12 +196,7 @@ public class ZookeeperRegistry implements Registry {
@Override
public void remove(String key) {
try {
client.delete().deletingChildrenIfNeeded().forPath(key);
} catch (Exception e) {
throw new RegistryException("zookeeper remove error", e);
}
delete(key);
}
@Override
@@ -269,6 +265,9 @@
client.delete()
.deletingChildrenIfNeeded()
.forPath(nodePath);
} catch (KeeperException.NoNodeException ignore) {
// the node is not exist, we can believe the node has been removed
} catch (Exception e) {
throw new RegistryException("zookeeper delete key error", e);
}

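Deleting a znode that is already gone now counts as success, which makes remove/delete idempotent under races between masters and workers. A condensed sketch of the pattern, assuming only the Curator calls visible above (the exception wrapper here is a plain RuntimeException, not the project's RegistryException):

import org.apache.curator.framework.CuratorFramework;
import org.apache.zookeeper.KeeperException;

public class IdempotentZkDelete {
    static void delete(CuratorFramework client, String nodePath) {
        try {
            client.delete().deletingChildrenIfNeeded().forPath(nodePath);
        } catch (KeeperException.NoNodeException ignore) {
            // the node does not exist, so it can be treated as already removed
        } catch (Exception e) {
            throw new RuntimeException("zookeeper delete key error", e);
        }
    }
}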
14
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/entity/TaskExecutionContext.java

@@ -19,6 +19,7 @@ package org.apache.dolphinscheduler.server.entity;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.remote.command.Command;
import org.apache.dolphinscheduler.remote.command.TaskExecuteRequestCommand;
@@ -221,6 +222,19 @@ public class TaskExecutionContext implements Serializable {
*/
private String varPool;
/**
* business param
*/
private Map<String, Property> paramsMap;
public Map<String, Property> getParamsMap() {
return paramsMap;
}
public void setParamsMap(Map<String, Property> paramsMap) {
this.paramsMap = paramsMap;
}
/**
* procedure TaskExecutionContext
*/

2
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/dispatch/executor/NettyExecutorManager.java

@@ -178,7 +178,7 @@ public class NettyExecutorManager extends AbstractExecutorManager<Boolean>{
* @return nodes
*/
private Set<String> getAllNodes(ExecutionContext context){
Set<String> nodes = Collections.EMPTY_SET;
Set<String> nodes = Collections.emptySet();
/**
* executor type
*/

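Collections.EMPTY_SET is a raw Set, so assigning it to Set<String> only compiles through an unchecked conversion; Collections.emptySet() infers the element type and is warning-free. A two-line contrast (demo class hypothetical):

import java.util.Collections;
import java.util.Set;

public class EmptySetContrast {
    public static void main(String[] args) {
        @SuppressWarnings("unchecked")
        Set<String> raw = Collections.EMPTY_SET;    // raw type: needs an unchecked conversion
        Set<String> typed = Collections.emptySet(); // generic method: type-safe
        System.out.println(raw.isEmpty() && typed.isEmpty()); // true
    }
}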
15
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/MasterBaseTaskExecThread.java

@@ -17,11 +17,13 @@
package org.apache.dolphinscheduler.server.master.runner;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.enums.TimeoutFlag;
import org.apache.dolphinscheduler.common.task.TaskTimeoutParameter;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.dao.AlertDao;
import org.apache.dolphinscheduler.dao.entity.ProcessInstance;
import org.apache.dolphinscheduler.dao.entity.TaskDefinition;
@@ -201,7 +203,9 @@ public class MasterBaseTaskExecThread implements Callable<Boolean> {
try {
if (taskInstance.isConditionsTask()
|| taskInstance.isDependTask()
|| taskInstance.isSubProcess()) {
|| taskInstance.isSubProcess()
|| taskInstance.isSwitchTask()
) {
return true;
}
if (taskInstance.getState().typeIsFinished()) {
@@ -321,4 +325,13 @@ public class MasterBaseTaskExecThread implements Callable<Boolean> {
long usedTime = (System.currentTimeMillis() - startTime.getTime()) / 1000;
return timeoutSeconds - usedTime;
}
protected String getThreadName() {
logger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(LoggerUtils.TASK_LOGGER_INFO_PREFIX,
processInstance.getProcessDefinitionCode(),
processInstance.getProcessDefinitionVersion(),
taskInstance.getProcessInstanceId(),
taskInstance.getId()));
return String.format(Constants.TASK_LOG_INFO_FORMAT, processService.formatTaskAppId(this.taskInstance));
}
}

19
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/MasterExecThread.java

@@ -46,6 +46,7 @@ import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.dao.entity.ProcessDefinition;
import org.apache.dolphinscheduler.dao.entity.ProcessInstance;
import org.apache.dolphinscheduler.dao.entity.ProjectUser;
import org.apache.dolphinscheduler.dao.entity.Schedule;
@@ -433,6 +434,8 @@ public class MasterExecThread implements Runnable {
abstractExecThread = new DependentTaskExecThread(taskInstance);
} else if (taskInstance.isConditionsTask()) {
abstractExecThread = new ConditionsTaskExecThread(taskInstance);
} else if (taskInstance.isSwitchTask()) {
abstractExecThread = new SwitchTaskExecThread(taskInstance);
} else {
abstractExecThread = new MasterTaskExecThread(taskInstance);
}
@@ -523,9 +526,9 @@
return taskInstance;
}
public void getPreVarPool(TaskInstance taskInstance, Set<String> preTask) {
Map<String,Property> allProperty = new HashMap<>();
Map<String,TaskInstance> allTaskInstance = new HashMap<>();
public void getPreVarPool(TaskInstance taskInstance, Set<String> preTask) {
Map<String, Property> allProperty = new HashMap<>();
Map<String, TaskInstance> allTaskInstance = new HashMap<>();
if (CollectionUtils.isNotEmpty(preTask)) {
for (String preTaskName : preTask) {
TaskInstance preTaskInstance = completeTaskList.get(preTaskName);
@@ -563,17 +566,17 @@
TaskInstance otherTask = allTaskInstance.get(proName);
if (otherTask.getEndTime().getTime() > preTaskInstance.getEndTime().getTime()) {
allProperty.put(proName, thisProperty);
allTaskInstance.put(proName,preTaskInstance);
allTaskInstance.put(proName, preTaskInstance);
} else {
allProperty.put(proName, otherPro);
}
} else {
allProperty.put(proName, thisProperty);
allTaskInstance.put(proName,preTaskInstance);
allTaskInstance.put(proName, preTaskInstance);
}
} else {
allProperty.put(proName, thisProperty);
allTaskInstance.put(proName,preTaskInstance);
allTaskInstance.put(proName, preTaskInstance);
}
}
@@ -945,7 +948,7 @@
if (!sendTimeWarning && checkProcessTimeOut(processInstance)) {
processAlertManager.sendProcessTimeoutAlert(processInstance,
processService.findProcessDefinition(processInstance.getProcessDefinitionCode(),
processInstance.getProcessDefinitionVersion()));
processInstance.getProcessDefinitionVersion()));
sendTimeWarning = true;
}
for (Map.Entry<MasterBaseTaskExecThread, Future<Boolean>> entry : activeTaskNode.entrySet()) {
@@ -974,7 +977,9 @@
task.getName(), task.getId(), task.getState());
// node success , post node submit
if (task.getState() == ExecutionStatus.SUCCESS) {
ProcessDefinition relatedProcessDefinition = processInstance.getProcessDefinition();
processInstance = processService.findProcessInstanceById(processInstance.getId());
processInstance.setProcessDefinition(relatedProcessDefinition);
processInstance.setVarPool(task.getVarPool());
processService.updateProcessInstance(processInstance);
completeTaskList.put(task.getName(), task);

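The getPreVarPool merge above resolves duplicate property names by task end time: when two upstream tasks export the same property, the value from the task that finished later wins. A toy sketch of just that rule (hypothetical types, not the real Property/TaskInstance classes):

import java.util.HashMap;
import java.util.Map;

public class VarPoolMergeRule {
    public static void main(String[] args) {
        Map<String, String> merged = new HashMap<>(); // property name -> value
        Map<String, Long> endTimes = new HashMap<>(); // property name -> producer end time

        accept(merged, endTimes, "path", "a.txt", 1_000L);
        accept(merged, endTimes, "path", "b.txt", 2_000L); // finished later: wins
        System.out.println(merged.get("path"));            // prints b.txt
    }

    static void accept(Map<String, String> merged, Map<String, Long> endTimes,
                       String name, String value, long endTime) {
        Long previous = endTimes.get(name);
        if (previous == null || endTime > previous) { // keep the later-finishing producer
            merged.put(name, value);
            endTimes.put(name, endTime);
        }
    }
}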
180
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/master/runner/SwitchTaskExecThread.java

@@ -0,0 +1,180 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.master.runner;
import org.apache.dolphinscheduler.common.enums.DependResult;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchResultVo;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.NetUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;
import org.apache.dolphinscheduler.server.utils.LogUtils;
import org.apache.dolphinscheduler.server.utils.SwitchTaskUtils;
import java.util.Date;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
public class SwitchTaskExecThread extends MasterBaseTaskExecThread {
protected final String rgex = "['\"]*\\$\\{(.*?)\\}['\"]*";
/**
* complete task map
*/
private Map<String, ExecutionStatus> completeTaskList = new ConcurrentHashMap<>();
/**
* switch result
*/
private DependResult conditionResult;
/**
* constructor of SwitchTaskExecThread
*
* @param taskInstance task instance
*/
public SwitchTaskExecThread(TaskInstance taskInstance) {
super(taskInstance);
taskInstance.setStartTime(new Date());
}
@Override
public Boolean submitWaitComplete() {
try {
this.taskInstance = submit();
logger.info("taskInstance submit end");
Thread.currentThread().setName(getThreadName());
initTaskParameters();
logger.info("switch task start");
waitTaskQuit();
updateTaskState();
} catch (Exception e) {
logger.error("switch task run exception", e);
}
return true;
}
private void waitTaskQuit() {
List<TaskInstance> taskInstances = processService.findValidTaskListByProcessId(
taskInstance.getProcessInstanceId()
);
for (TaskInstance task : taskInstances) {
completeTaskList.putIfAbsent(task.getName(), task.getState());
}
SwitchParameters switchParameters = taskInstance.getSwitchDependency();
List<SwitchResultVo> switchResultVos = switchParameters.getDependTaskList();
SwitchResultVo switchResultVo = new SwitchResultVo();
switchResultVo.setNextNode(switchParameters.getNextNode());
switchResultVos.add(switchResultVo);
int finalConditionLocation = switchResultVos.size() - 1;
int i = 0;
conditionResult = DependResult.SUCCESS;
for (SwitchResultVo info : switchResultVos) {
logger.info("the {} execution ", (i + 1));
logger.info("original condition sentence:{}", info.getCondition());
if (StringUtils.isEmpty(info.getCondition())) {
finalConditionLocation = i;
break;
}
String content = setTaskParams(info.getCondition().replaceAll("'", "\""), rgex);
logger.info("format condition sentence::{}", content);
Boolean result = null;
try {
result = SwitchTaskUtils.evaluate(content);
} catch (Exception e) {
logger.info("error sentence : {}", content);
conditionResult = DependResult.FAILED;
//result = false;
break;
}
logger.info("condition result : {}", result);
if (result) {
finalConditionLocation = i;
break;
}
i++;
}
switchParameters.setDependTaskList(switchResultVos);
switchParameters.setResultConditionLocation(finalConditionLocation);
taskInstance.setSwitchDependency(switchParameters);
logger.info("the switch task depend result : {}", conditionResult);
}
/**
* update task state
*/
private void updateTaskState() {
ExecutionStatus status;
if (this.cancel) {
status = ExecutionStatus.KILL;
} else {
status = (conditionResult == DependResult.SUCCESS) ? ExecutionStatus.SUCCESS : ExecutionStatus.FAILURE;
}
taskInstance.setEndTime(new Date());
taskInstance.setState(status);
processService.updateTaskInstance(taskInstance);
}
private void initTaskParameters() {
taskInstance.setLogPath(LogUtils.getTaskLogPath(processInstance.getProcessDefinitionCode(),
processInstance.getProcessDefinitionVersion(),
taskInstance.getProcessInstanceId(),
taskInstance.getId()));
this.taskInstance.setStartTime(new Date());
this.taskInstance.setHost(NetUtils.getAddr(masterConfig.getListenPort()));
this.taskInstance.setState(ExecutionStatus.RUNNING_EXECUTION);
this.processService.saveTaskInstance(taskInstance);
}
public String setTaskParams(String content, String rgex) {
Pattern pattern = Pattern.compile(rgex);
Matcher m = pattern.matcher(content);
Map<String, Property> globalParams = JSONUtils.toList(processInstance.getGlobalParams(), Property.class).stream().collect(Collectors.toMap(Property::getProp, p -> p));
Map<String, Property> varParams = JSONUtils.toList(taskInstance.getVarPool(), Property.class).stream().collect(Collectors.toMap(Property::getProp, p -> p));
if (varParams.size() > 0) {
varParams.putAll(globalParams);
globalParams = varParams;
}
while (m.find()) {
String paramName = m.group(1);
Property property = globalParams.get(paramName);
if (property == null) {
return "";
}
String value = property.getValue();
if (!org.apache.commons.lang.math.NumberUtils.isNumber(value)) {
value = "\"" + value + "\"";
}
logger.info("paramName:{},paramValue{}", paramName, value);
content = content.replace("${" + paramName + "}", value);
}
return content;
}
}

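setTaskParams above rewrites every ${name} reference using the merged global/var pools and quotes non-numeric values, so the condition becomes a sentence the script engine can evaluate. A standalone sketch of the same substitution (a plain map stands in for the real Property pool; all names hypothetical):

import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ConditionInterpolationSketch {
    private static final String RGEX = "['\"]*\\$\\{(.*?)\\}['\"]*";

    public static void main(String[] args) {
        Map<String, String> pool = new HashMap<>();
        pool.put("count", "3");
        pool.put("env", "prod");
        // prints: 3 > 2 && "prod" == "prod"
        System.out.println(render("${count} > 2 && ${env} == \"prod\"", pool));
    }

    static String render(String content, Map<String, String> pool) {
        Matcher m = Pattern.compile(RGEX).matcher(content);
        while (m.find()) {
            String value = pool.get(m.group(1));
            if (value == null) {
                return ""; // an unknown parameter empties the sentence, as in the thread above
            }
            if (!value.matches("-?\\d+(\\.\\d+)?")) {
                value = "\"" + value + "\""; // quote non-numeric values
            }
            content = content.replace("${" + m.group(1) + "}", value);
        }
        return content;
    }
}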
38
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/utils/SwitchTaskUtils.java

@@ -0,0 +1,38 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.utils;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;
public class SwitchTaskUtils {
private static ScriptEngineManager manager;
private static ScriptEngine engine;
static {
manager = new ScriptEngineManager();
engine = manager.getEngineByName("js");
}
public static boolean evaluate(String expression) throws ScriptException {
Object result = engine.eval(expression);
return (Boolean) result;
}
}

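The interpolated sentence is finally handed to the JVM's JavaScript engine, and the boolean result picks the branch. A minimal usage sketch (the "js" engine resolves to Nashorn on the JDK 8 baseline this codebase targets):

import javax.script.ScriptException;

import org.apache.dolphinscheduler.server.utils.SwitchTaskUtils;

public class EvaluateSketch {
    public static void main(String[] args) throws ScriptException {
        System.out.println(SwitchTaskUtils.evaluate("3 > 2"));          // true
        System.out.println(SwitchTaskUtils.evaluate("\"a\" == \"b\"")); // false
    }
}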
2
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/WorkerServer.java

@@ -167,6 +167,7 @@ public class WorkerServer implements IStoppable {
}));
}
// todo better
private void initTaskPlugin() {
taskPluginManager = new TaskPluginManager();
DolphinPluginManagerConfig taskPluginManagerConfig = new DolphinPluginManagerConfig();
@@ -211,6 +212,7 @@
this.nettyRemotingServer.close();
this.workerRegistryClient.unRegistry();
this.alertClientService.close();
this.springApplicationContext.close();
} catch (Exception e) {
logger.error("worker server stop exception ", e);
}

4
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/plugin/TaskPluginManager.java

@@ -1,4 +1,4 @@
package org.apache.dolphinscheduler.server.worker.plugin;/*
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
@@ -15,6 +15,8 @@ package org.apache.dolphinscheduler.server.worker.plugin;/*
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.plugin;
import static java.lang.String.format;
import static java.util.Objects.requireNonNull;

25
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/processor/TaskExecuteProcessor.java

@@ -114,10 +114,10 @@ public class TaskExecuteProcessor implements NettyRequestProcessor {
@Override
public void process(Channel channel, Command command) {
Preconditions.checkArgument(CommandType.TASK_EXECUTE_REQUEST == command.getType(),
String.format("invalid command type : %s", command.getType()));
String.format("invalid command type : %s", command.getType()));
TaskExecuteRequestCommand taskRequestCommand = JSONUtils.parseObject(
command.getBody(), TaskExecuteRequestCommand.class);
command.getBody(), TaskExecuteRequestCommand.class);
logger.info("received command : {}", taskRequestCommand);
@@ -135,12 +135,7 @@
}
setTaskCache(taskExecutionContext);
// custom logger
Logger taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(LoggerUtils.TASK_LOGGER_INFO_PREFIX,
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId()));
// todo custom logger
taskExecutionContext.setHost(NetUtils.getAddr(workerConfig.getListenPort()));
taskExecutionContext.setLogPath(LogUtils.getTaskLogPath(taskExecutionContext));
@@ -150,7 +145,6 @@
logger.info("task instance local execute path : {}", execLocalPath);
taskExecutionContext.setExecutePath(execLocalPath);
FileUtils.taskLoggerThreadLocal.set(taskLogger);
try {
FileUtils.createWorkDirIfAbsent(execLocalPath);
if (CommonUtils.isSudoEnable() && workerConfig.getWorkerTenantAutoCreate()) {
@@ -159,13 +153,12 @@
} catch (Throwable ex) {
String errorLog = String.format("create execLocalPath : %s", execLocalPath);
LoggerUtils.logError(Optional.of(logger), errorLog, ex);
LoggerUtils.logError(Optional.ofNullable(taskLogger), errorLog, ex);
taskExecutionContextCacheManager.removeByTaskInstanceId(taskExecutionContext.getTaskInstanceId());
}
FileUtils.taskLoggerThreadLocal.remove();
taskCallbackService.addRemoteChannel(taskExecutionContext.getTaskInstanceId(),
new NettyRemoteChannel(channel, command.getOpaque()));
new NettyRemoteChannel(channel, command.getOpaque()));
// delay task process
long remainTime = DateUtils.getRemainTime(taskExecutionContext.getFirstSubmitTime(), taskExecutionContext.getDelayTime() * 60L);
@@ -181,7 +174,7 @@
this.doAck(taskExecutionContext);
// submit task to manager
if (!workerManager.offer(new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService, taskPluginManager))) {
if (!workerManager.offer(new TaskExecuteThread(taskExecutionContext, taskCallbackService, alertClientService, taskPluginManager))) {
logger.info("submit task to manager error, queue is full, queue size is {}", workerManager.getQueueSize());
}
}
@@ -223,9 +216,9 @@
*/
private String getExecLocalPath(TaskExecutionContext taskExecutionContext) {
return FileUtils.getProcessExecDir(taskExecutionContext.getProjectCode(),
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId());
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId());
}
}

1
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/RetryReportTaskStatusThread.java

@@ -49,6 +49,7 @@ public class RetryReportTaskStatusThread implements Runnable {
public void start(){
Thread thread = new Thread(this,"RetryReportTaskStatusThread");
thread.setDaemon(true);
thread.start();
}

6
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/TaskExecuteThread.java

@@ -114,24 +114,20 @@ public class TaskExecuteThread implements Runnable, Delayed {
*/
public TaskExecuteThread(TaskExecutionContext taskExecutionContext,
TaskCallbackService taskCallbackService,
Logger taskLogger,
AlertClientService alertClientService) {
this.taskExecutionContext = taskExecutionContext;
this.taskCallbackService = taskCallbackService;
this.taskExecutionContextCacheManager = SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class);
this.taskLogger = taskLogger;
this.alertClientService = alertClientService;
}
public TaskExecuteThread(TaskExecutionContext taskExecutionContext,
TaskCallbackService taskCallbackService,
Logger taskLogger,
AlertClientService alertClientService,
TaskPluginManager taskPluginManager) {
this.taskExecutionContext = taskExecutionContext;
this.taskCallbackService = taskCallbackService;
this.taskExecutionContextCacheManager = SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class);
this.taskLogger = taskLogger;
this.alertClientService = alertClientService;
this.taskPluginManager = taskPluginManager;
}
@@ -176,7 +172,7 @@
//TODO Temporary operation, To be adjusted
TaskRequest taskRequest = JSONUtils.parseObject(JSONUtils.toJsonString(taskExecutionContext), TaskRequest.class);
task = taskChannel.createTask(taskRequest, taskLogger);
task = taskChannel.createTask(taskRequest);
// task init
this.task.init();
//init varPool

1
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/runner/WorkerManagerThread.java

@@ -123,6 +123,7 @@ public class WorkerManagerThread implements Runnable {
public void start() {
Thread thread = new Thread(this, this.getClass().getName());
thread.setDaemon(true);
thread.start();
}

553
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractCommandExecutor.java

@@ -1,553 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import static org.apache.dolphinscheduler.common.Constants.EXIT_CODE_FAILURE;
import static org.apache.dolphinscheduler.common.Constants.EXIT_CODE_KILL;
import static org.apache.dolphinscheduler.common.Constants.EXIT_CODE_SUCCESS;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.thread.Stopper;
import org.apache.dolphinscheduler.common.thread.ThreadUtils;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.HadoopUtils;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ProcessUtils;
import org.apache.dolphinscheduler.server.worker.cache.TaskExecutionContextCacheManager;
import org.apache.dolphinscheduler.server.worker.cache.impl.TaskExecutionContextCacheManagerImpl;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.reflect.Field;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.slf4j.Logger;
/**
* abstract command executor
*/
public abstract class AbstractCommandExecutor {
/**
* rules for extracting application ID
*/
protected static final Pattern APPLICATION_REGEX = Pattern.compile(Constants.APPLICATION_REGEX);
protected StringBuilder varPool = new StringBuilder();
/**
* process
*/
private Process process;
/**
* log handler
*/
protected Consumer<LinkedBlockingQueue<String>> logHandler;
/**
* logger
*/
protected Logger logger;
/**
* log collection
*/
protected final LinkedBlockingQueue<String> logBuffer;
protected boolean logOutputIsScuccess = false;
/**
* taskExecutionContext
*/
protected TaskExecutionContext taskExecutionContext;
/**
* taskExecutionContextCacheManager
*/
private TaskExecutionContextCacheManager taskExecutionContextCacheManager;
public AbstractCommandExecutor(Consumer<LinkedBlockingQueue<String>> logHandler,
TaskExecutionContext taskExecutionContext,
Logger logger) {
this.logHandler = logHandler;
this.taskExecutionContext = taskExecutionContext;
this.logger = logger;
this.logBuffer = new LinkedBlockingQueue<>();
this.taskExecutionContextCacheManager = SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class);
}
/**
* build process
*
* @param commandFile command file
* @throws IOException IO Exception
*/
private void buildProcess(String commandFile) throws IOException {
// setting up user to run commands
List<String> command = new LinkedList<>();
//init process builder
ProcessBuilder processBuilder = new ProcessBuilder();
// setting up a working directory
processBuilder.directory(new File(taskExecutionContext.getExecutePath()));
// merge error information to standard output stream
processBuilder.redirectErrorStream(true);
// setting up user to run commands
if (!OSUtils.isWindows() && CommonUtils.isSudoEnable()) {
command.add("sudo");
command.add("-u");
command.add(taskExecutionContext.getTenantCode());
}
command.add(commandInterpreter());
command.addAll(commandOptions());
command.add(commandFile);
// setting commands
processBuilder.command(command);
process = processBuilder.start();
// print command
printCommand(command);
}
/**
* task specific execution logic
*
* @param execCommand execCommand
* @return CommandExecuteResult
* @throws Exception if error throws Exception
*/
public CommandExecuteResult run(String execCommand) throws Exception {
CommandExecuteResult result = new CommandExecuteResult();
int taskInstanceId = taskExecutionContext.getTaskInstanceId();
// If the task has been killed, then the task in the cache is null
if (null == taskExecutionContextCacheManager.getByTaskInstanceId(taskInstanceId)) {
result.setExitStatusCode(EXIT_CODE_KILL);
return result;
}
if (StringUtils.isEmpty(execCommand)) {
taskExecutionContextCacheManager.removeByTaskInstanceId(taskInstanceId);
return result;
}
String commandFilePath = buildCommandFilePath();
// create command file if not exists
createCommandFileIfNotExists(execCommand, commandFilePath);
//build process
buildProcess(commandFilePath);
// parse process output
parseProcessOutput(process);
Integer processId = getProcessId(process);
result.setProcessId(processId);
// cache processId
taskExecutionContext.setProcessId(processId);
boolean updateTaskExecutionContextStatus = taskExecutionContextCacheManager.updateTaskExecutionContext(taskExecutionContext);
if (Boolean.FALSE.equals(updateTaskExecutionContextStatus)) {
ProcessUtils.kill(taskExecutionContext);
result.setExitStatusCode(EXIT_CODE_KILL);
return result;
}
// print process id
logger.info("process start, process id is: {}", processId);
// if timeout occurs, exit directly
long remainTime = getRemaintime();
// waiting for the run to finish
boolean status = process.waitFor(remainTime, TimeUnit.SECONDS);
// if SHELL task exit
if (status) {
// set appIds
List<String> appIds = getAppIds(taskExecutionContext.getLogPath());
result.setAppIds(String.join(Constants.COMMA, appIds));
// SHELL task state
result.setExitStatusCode(process.exitValue());
// if yarn task , yarn state is final state
if (process.exitValue() == 0) {
result.setExitStatusCode(isSuccessOfYarnState(appIds) ? EXIT_CODE_SUCCESS : EXIT_CODE_FAILURE);
}
} else {
logger.error("process has failure , exitStatusCode:{}, processExitValue:{}, ready to kill ...",
result.getExitStatusCode(), process.exitValue());
ProcessUtils.kill(taskExecutionContext);
result.setExitStatusCode(EXIT_CODE_FAILURE);
}
logger.info("process has exited, execute path:{}, processId:{} ,exitStatusCode:{} ,processWaitForStatus:{} ,processExitValue:{}",
taskExecutionContext.getExecutePath(), processId, result.getExitStatusCode(), status, process.exitValue());
return result;
}
public String getVarPool() {
return varPool.toString();
}
/**
* cancel application
*
* @throws Exception exception
*/
public void cancelApplication() throws Exception {
if (process == null) {
return;
}
// clear log
clear();
int processId = getProcessId(process);
logger.info("cancel process: {}", processId);
// kill , waiting for completion
boolean killed = softKill(processId);
if (!killed) {
// hard kill
hardKill(processId);
// destroy
process.destroy();
process = null;
}
}
/**
* soft kill
*
* @param processId process id
* @return process is alive
* @throws InterruptedException interrupted exception
*/
private boolean softKill(int processId) {
if (processId != 0 && process.isAlive()) {
try {
// sudo -u user command to run command
String cmd = String.format("kill %d", processId);
cmd = OSUtils.getSudoCmd(taskExecutionContext.getTenantCode(), cmd);
logger.info("soft kill task:{}, process id:{}, cmd:{}", taskExecutionContext.getTaskAppId(), processId, cmd);
Runtime.getRuntime().exec(cmd);
} catch (IOException e) {
logger.info("kill attempt failed", e);
}
}
return !process.isAlive();
}
/**
* hard kill
*
* @param processId process id
*/
private void hardKill(int processId) {
if (processId != 0 && process.isAlive()) {
try {
String cmd = String.format("kill -9 %d", processId);
cmd = OSUtils.getSudoCmd(taskExecutionContext.getTenantCode(), cmd);
logger.info("hard kill task:{}, process id:{}, cmd:{}", taskExecutionContext.getTaskAppId(), processId, cmd);
Runtime.getRuntime().exec(cmd);
} catch (IOException e) {
logger.error("kill attempt failed ", e);
}
}
}
/**
* print command
*
* @param commands process builder
*/
private void printCommand(List<String> commands) {
String cmdStr;
try {
cmdStr = ProcessUtils.buildCommandStr(commands);
logger.info("task run command:\n{}", cmdStr);
} catch (Exception e) {
logger.error(e.getMessage(), e);
}
}
/**
* clear
*/
private void clear() {
LinkedBlockingQueue<String> markerLog = new LinkedBlockingQueue<>();
markerLog.add(ch.qos.logback.classic.ClassicConstants.FINALIZE_SESSION_MARKER.toString());
if (!logBuffer.isEmpty()) {
// log handle
logHandler.accept(logBuffer);
}
logHandler.accept(markerLog);
}
/**
* get the standard output of the process
*
* @param process process
*/
private void parseProcessOutput(Process process) {
String threadLoggerInfoName = String.format(LoggerUtils.TASK_LOGGER_THREAD_NAME + "-%s", taskExecutionContext.getTaskAppId());
ExecutorService getOutputLogService = ThreadUtils.newDaemonSingleThreadExecutor(threadLoggerInfoName + "-" + "getOutputLogService");
getOutputLogService.submit(() -> {
try (BufferedReader inReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
String line;
logBuffer.add("welcome to use bigdata scheduling system...");
while ((line = inReader.readLine()) != null) {
if (line.startsWith("${setValue(")) {
varPool.append(line.substring("${setValue(".length(), line.length() - 2));
varPool.append("$VarPool$");
} else {
logBuffer.add(line);
}
}
} catch (Exception e) {
logger.error(e.getMessage(), e);
} finally {
logOutputIsScuccess = true;
}
});
getOutputLogService.shutdown();
ExecutorService parseProcessOutputExecutorService = ThreadUtils.newDaemonSingleThreadExecutor(threadLoggerInfoName);
parseProcessOutputExecutorService.submit(() -> {
try {
long lastFlushTime = System.currentTimeMillis();
while (logBuffer.size() > 0 || !logOutputIsScuccess) {
if (logBuffer.size() > 0) {
lastFlushTime = flush(lastFlushTime);
} else {
Thread.sleep(Constants.DEFAULT_LOG_FLUSH_INTERVAL);
}
}
} catch (Exception e) {
logger.error(e.getMessage(), e);
} finally {
clear();
}
});
parseProcessOutputExecutorService.shutdown();
}
/**
* check yarn state
*
* @param appIds application id list
* @return is success of yarn task state
*/
public boolean isSuccessOfYarnState(List<String> appIds) {
boolean result = true;
try {
for (String appId : appIds) {
logger.info("check yarn application status, appId:{}", appId);
while (Stopper.isRunning()) {
ExecutionStatus applicationStatus = HadoopUtils.getInstance().getApplicationStatus(appId);
if (logger.isDebugEnabled()) {
logger.debug("check yarn application status, appId:{}, final state:{}", appId, applicationStatus.name());
}
if (applicationStatus.equals(ExecutionStatus.FAILURE)
|| applicationStatus.equals(ExecutionStatus.KILL)) {
return false;
}
if (applicationStatus.equals(ExecutionStatus.SUCCESS)) {
break;
}
ThreadUtils.sleep(Constants.SLEEP_TIME_MILLIS);
}
}
} catch (Exception e) {
logger.error("yarn applications: {} , query status failed, exception:{}", StringUtils.join(appIds, ","), e);
result = false;
}
return result;
}
public int getProcessId() {
return getProcessId(process);
}
/**
* get app links
*
* @param logPath log path
* @return app id list
*/
private List<String> getAppIds(String logPath) {
List<String> logs = convertFile2List(logPath);
List<String> appIds = new ArrayList<>();
/**
* analyze the log to get submitted yarn application ids
*/
for (String log : logs) {
String appId = findAppId(log);
if (StringUtils.isNotEmpty(appId) && !appIds.contains(appId)) {
logger.info("find app id: {}", appId);
appIds.add(appId);
}
}
return appIds;
}
/**
* convert file to list
*
* @param filename file name
* @return line list
*/
private List<String> convertFile2List(String filename) {
List<String> lineList = new ArrayList<>(100);
File file = new File(filename);
if (!file.exists()) {
return lineList;
}
try (BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream(filename), StandardCharsets.UTF_8))) {
String line = null;
while ((line = br.readLine()) != null) {
lineList.add(line);
}
} catch (Exception e) {
logger.error(String.format("read file: %s failed : ", filename), e);
}
return lineList;
}
/**
* find app id
*
* @param line line
* @return appid
*/
private String findAppId(String line) {
Matcher matcher = APPLICATION_REGEX.matcher(line);
if (matcher.find()) {
return matcher.group();
}
return null;
}
/**
* get remain times
*
* @return remain time
*/
private long getRemaintime() {
long usedTime = (System.currentTimeMillis() - taskExecutionContext.getStartTime().getTime()) / 1000;
long remainTime = taskExecutionContext.getTaskTimeout() - usedTime;
if (remainTime < 0) {
throw new RuntimeException("task execution time out");
}
return remainTime;
}
/**
* get process id
*
* @param process process
* @return process id
*/
private int getProcessId(Process process) {
int processId = 0;
try {
Field f = process.getClass().getDeclaredField(Constants.PID);
f.setAccessible(true);
processId = f.getInt(process);
} catch (Throwable e) {
logger.error(e.getMessage(), e);
}
return processId;
}
/**
* when the log buffer size or the flush interval reaches its threshold, then flush
*
* @param lastFlushTime last flush time
* @return last flush time
*/
private long flush(long lastFlushTime) {
long now = System.currentTimeMillis();
/**
* when the log buffer size or the flush interval reaches its threshold, then flush
*/
if (logBuffer.size() >= Constants.DEFAULT_LOG_ROWS_NUM || now - lastFlushTime > Constants.DEFAULT_LOG_FLUSH_INTERVAL) {
lastFlushTime = now;
/** log handle */
logHandler.accept(logBuffer);
}
return lastFlushTime;
}
protected List<String> commandOptions() {
return Collections.emptyList();
}
protected abstract String buildCommandFilePath();
protected abstract String commandInterpreter();
protected abstract void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException;
}

193
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractTask.java

@@ -1,193 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import static ch.qos.logback.classic.ClassicConstants.FINALIZE_SESSION_MARKER;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import java.util.StringJoiner;
import java.util.concurrent.LinkedBlockingQueue;
import org.slf4j.Logger;
/**
* executive task
*/
public abstract class AbstractTask {
/**
* taskExecutionContext
**/
TaskExecutionContext taskExecutionContext;
/**
* log record
*/
protected Logger logger;
/**
* SHELL process pid
*/
protected int processId;
/**
* other resource manager appId , for example : YARN etc
*/
protected String appIds;
/**
* cancel
*/
protected volatile boolean cancel = false;
/**
* exit code
*/
protected volatile int exitStatusCode = -1;
/**
* constructor
*
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
protected AbstractTask(TaskExecutionContext taskExecutionContext, Logger logger) {
this.taskExecutionContext = taskExecutionContext;
this.logger = logger;
}
/**
* init task
*
* @throws Exception exception
*/
public void init() throws Exception {
}
/**
* task handle
*
* @throws Exception exception
*/
public abstract void handle() throws Exception;
/**
* result processing
*
* @throws Exception exception
*/
public void after() throws Exception {
}
/**
* cancel application
*
* @param status status
* @throws Exception exception
*/
public void cancelApplication(boolean status) throws Exception {
this.cancel = status;
}
/**
* log handle
*
* @param logs log list
*/
public void logHandle(LinkedBlockingQueue<String> logs) {
// note that the "new line" is added here to facilitate log parsing
if (logs.contains(FINALIZE_SESSION_MARKER.toString())) {
logger.info(FINALIZE_SESSION_MARKER, FINALIZE_SESSION_MARKER.toString());
} else {
StringJoiner joiner = new StringJoiner("\n\t");
while (!logs.isEmpty()) {
joiner.add(logs.poll());
}
logger.info(" -> {}", joiner);
}
}
/**
* get exit status code
*
* @return exit status code
*/
public int getExitStatusCode() {
return exitStatusCode;
}
public void setExitStatusCode(int exitStatusCode) {
this.exitStatusCode = exitStatusCode;
}
public String getAppIds() {
return appIds;
}
public void setAppIds(String appIds) {
this.appIds = appIds;
}
public int getProcessId() {
return processId;
}
public void setProcessId(int processId) {
this.processId = processId;
}
/**
* get task parameters
*
* @return AbstractParameters
*/
public abstract AbstractParameters getParameters();
private boolean typeIsNormalTask(String taskType) {
return !(TaskType.SUB_PROCESS.getDesc().equalsIgnoreCase(taskType) || TaskType.DEPENDENT.getDesc().equalsIgnoreCase(taskType));
}
/**
* get exit status according to exitCode
*
* @return exit status
*/
public ExecutionStatus getExitStatus() {
ExecutionStatus status;
switch (getExitStatusCode()) {
case Constants.EXIT_CODE_SUCCESS:
status = ExecutionStatus.SUCCESS;
break;
case Constants.EXIT_CODE_KILL:
status = ExecutionStatus.KILL;
break;
default:
status = ExecutionStatus.FAILURE;
break;
}
return status;
}
}

95
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/AbstractYarnTask.java

@@ -1,95 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ProcessUtils;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import org.slf4j.Logger;
/**
* abstract yarn task
*/
public abstract class AbstractYarnTask extends AbstractTask {
/**
* process task
*/
private ShellCommandExecutor shellCommandExecutor;
/**
* process database access
*/
protected ProcessService processService;
/**
* Abstract Yarn Task
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public AbstractYarnTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.processService = SpringApplicationContext.getBean(ProcessService.class);
this.shellCommandExecutor = new ShellCommandExecutor(this::logHandle,
taskExecutionContext,
logger);
}
@Override
public void handle() throws Exception {
try {
// SHELL task exit code
CommandExecuteResult commandExecuteResult = shellCommandExecutor.run(buildCommand());
setExitStatusCode(commandExecuteResult.getExitStatusCode());
setAppIds(commandExecuteResult.getAppIds());
setProcessId(commandExecuteResult.getProcessId());
} catch (Exception e) {
logger.error("yarn process failure", e);
exitStatusCode = -1;
throw e;
}
}
/**
* cancel application
* @param status status
* @throws Exception exception
*/
@Override
public void cancelApplication(boolean status) throws Exception {
cancel = true;
// cancel process
shellCommandExecutor.cancelApplication();
TaskInstance taskInstance = processService.findTaskInstanceById(taskExecutionContext.getTaskInstanceId());
if (status && taskInstance != null){
ProcessUtils.killYarnJob(taskExecutionContext);
}
}
/**
* create command
* @return String
* @throws Exception exception
*/
protected abstract String buildCommand() throws Exception;
/**
* set main jar name
*/
protected abstract void setMainJarName();
}

69
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/CommandExecuteResult.java

@@ -1,69 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
/**
* command execute result
*/
public class CommandExecuteResult {
/**
* command exit code
*/
private Integer exitStatusCode;
/**
* appIds
*/
private String appIds;
/**
* process id
*/
private Integer processId;
public CommandExecuteResult(){
this.exitStatusCode = 0;
}
public Integer getExitStatusCode() {
return exitStatusCode;
}
public void setExitStatusCode(Integer exitStatusCode) {
this.exitStatusCode = exitStatusCode;
}
public String getAppIds() {
return appIds;
}
public void setAppIds(String appIds) {
this.appIds = appIds;
}
public Integer getProcessId() {
return processId;
}
public void setProcessId(Integer processId) {
this.processId = processId;
}
}
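
For orientation, a tiny sketch of how an executor fills this holder before a task copies it into its own state (as AbstractYarnTask.handle() does above); the ids are made up:

CommandExecuteResult result = new CommandExecuteResult();
result.setExitStatusCode(0);                         // success
result.setAppIds("application_1630000000000_0001");  // hypothetical YARN application id
result.setProcessId(4242);                           // hypothetical OS process id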

188
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/PythonCommandExecutor.java

@@ -1,188 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.utils.FileUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;
import java.util.regex.Pattern;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* python command executor
*/
public class PythonCommandExecutor extends AbstractCommandExecutor {
/**
* logger
*/
private static final Logger logger = LoggerFactory.getLogger(PythonCommandExecutor.class);
/**
* python
*/
public static final String PYTHON = "python";
private static final Pattern PYTHON_PATH_PATTERN = Pattern.compile("/bin/python[\\d.]*$");
/**
* constructor
* @param logHandler log handler
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public PythonCommandExecutor(Consumer<LinkedBlockingQueue<String>> logHandler,
TaskExecutionContext taskExecutionContext,
Logger logger) {
super(logHandler, taskExecutionContext, logger);
}
/**
* build command file path
*
* @return command file path
*/
@Override
protected String buildCommandFilePath() {
return String.format("%s/py_%s.command", taskExecutionContext.getExecutePath(), taskExecutionContext.getTaskAppId());
}
/**
* create command file if not exists
* @param execCommand exec command
* @param commandFile command file
* @throws IOException io exception
*/
@Override
protected void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException {
logger.info("tenantCode :{}, task dir:{}", taskExecutionContext.getTenantCode(), taskExecutionContext.getExecutePath());
if (!Files.exists(Paths.get(commandFile))) {
logger.info("generate command file:{}", commandFile);
StringBuilder sb = new StringBuilder();
sb.append("#-*- encoding=utf8 -*-\n");
sb.append("\n\n");
sb.append(execCommand);
logger.info(sb.toString());
// write data to file
FileUtils.writeStringToFile(new File(commandFile),
sb.toString(),
StandardCharsets.UTF_8);
}
}
/**
* get command options
* @return command options list
*/
@Override
protected List<String> commandOptions() {
// unbuffered binary stdout and stderr
return Collections.singletonList("-u");
}
/**
 * Gets the command path of the Python interpreter to execute with
* @return python command path
*/
@Override
protected String commandInterpreter() {
String pythonHome = getPythonHome(taskExecutionContext.getEnvFile());
return getPythonCommand(pythonHome);
}
/**
* get python command
*
* @param pythonHome python home
* @return python command
*/
public static String getPythonCommand(String pythonHome) {
if (StringUtils.isEmpty(pythonHome)) {
return PYTHON;
}
File file = new File(pythonHome);
if (file.exists() && file.isFile()) {
return pythonHome;
}
if (PYTHON_PATH_PATTERN.matcher(pythonHome).find()) {
return pythonHome;
}
return Paths.get(pythonHome, "/bin/python").toString();
}
/**
* get python home
*
* @param envPath env path
* @return python home
*/
public static String getPythonHome(String envPath) {
    // scan the env file for the PYTHON_HOME entry and return its value
    try (BufferedReader br = new BufferedReader(new InputStreamReader(new FileInputStream(envPath)))) {
        String line;
        while ((line = br.readLine()) != null) {
            if (line.contains(Constants.PYTHON_HOME)) {
                String[] arrs = line.split(Constants.EQUAL_SIGN);
                if (arrs.length == 2) {
                    return arrs[1];
                }
                break;
            }
        }
    } catch (IOException e) {
        logger.error("read file failure", e);
    }
    return null;
}
}
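
A sketch of the resolution rules in getPythonCommand(String) above; the paths are hypothetical:

// fallback when no PYTHON_HOME is configured
String fallback = PythonCommandExecutor.getPythonCommand(null);                   // -> "python"
// a value already naming an interpreter (matches /bin/python[\d.]*$) is returned as-is
String direct = PythonCommandExecutor.getPythonCommand("/opt/py/bin/python3.8");  // -> "/opt/py/bin/python3.8"
// a plain home directory gets /bin/python appended
String fromHome = PythonCommandExecutor.getPythonCommand("/opt/py");              // -> "/opt/py/bin/python"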

119
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/ShellCommandExecutor.java

@@ -1,119 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.commons.io.FileUtils;
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;
import org.slf4j.Logger;
/**
* shell command executor
*/
public class ShellCommandExecutor extends AbstractCommandExecutor {
/**
 * For Unix-like systems, use sh
*/
public static final String SH = "sh";
/**
 * For Windows, use cmd.exe
*/
public static final String CMD = "cmd.exe";
/**
* constructor
* @param logHandler logHandler
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public ShellCommandExecutor(Consumer<LinkedBlockingQueue<String>> logHandler,
TaskExecutionContext taskExecutionContext,
Logger logger) {
super(logHandler, taskExecutionContext, logger);
}
@Override
protected String buildCommandFilePath() {
// command file
return String.format("%s/%s.%s"
, taskExecutionContext.getExecutePath()
, taskExecutionContext.getTaskAppId()
, OSUtils.isWindows() ? "bat" : "command");
}
/**
* get command type
* @return command type
*/
@Override
protected String commandInterpreter() {
return OSUtils.isWindows() ? CMD : SH;
}
/**
* create command file if not exists
* @param execCommand exec command
* @param commandFile command file
* @throws IOException io exception
*/
@Override
protected void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException {
logger.info("tenantCode user:{}, task dir:{}", taskExecutionContext.getTenantCode(),
taskExecutionContext.getTaskAppId());
// create if non existence
if (!Files.exists(Paths.get(commandFile))) {
logger.info("create command file:{}", commandFile);
StringBuilder sb = new StringBuilder();
if (OSUtils.isWindows()) {
sb.append("@echo off\n");
sb.append("cd /d %~dp0\n");
if (taskExecutionContext.getEnvFile() != null) {
sb.append("call ").append(taskExecutionContext.getEnvFile()).append("\n");
}
} else {
sb.append("#!/bin/sh\n");
sb.append("BASEDIR=$(cd `dirname $0`; pwd)\n");
sb.append("cd $BASEDIR\n");
if (taskExecutionContext.getEnvFile() != null) {
sb.append("source ").append(taskExecutionContext.getEnvFile()).append("\n");
}
}
sb.append(execCommand);
logger.info("command : {}", sb.toString());
// write data to file
FileUtils.writeStringToFile(new File(commandFile), sb.toString(), StandardCharsets.UTF_8);
}
}
}
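
A sketch of the Unix wrapper script that createCommandFileIfNotExists writes above; the env-file path and exec command are hypothetical:

StringBuilder sb = new StringBuilder();
sb.append("#!/bin/sh\n");
sb.append("BASEDIR=$(cd `dirname $0`; pwd)\n");
sb.append("cd $BASEDIR\n");
sb.append("source /opt/ds/conf/env/dolphinscheduler_env.sh\n"); // hypothetical getEnvFile() value
sb.append("echo hello");                                        // the exec command passed in
// FileUtils.writeStringToFile(new File(".../<taskAppId>.command"), sb.toString(), StandardCharsets.UTF_8)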

80
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/TaskManager.java

@@ -1,80 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.datax.DataxTask;
import org.apache.dolphinscheduler.server.worker.task.flink.FlinkTask;
import org.apache.dolphinscheduler.server.worker.task.http.HttpTask;
import org.apache.dolphinscheduler.server.worker.task.mr.MapReduceTask;
import org.apache.dolphinscheduler.server.worker.task.procedure.ProcedureTask;
import org.apache.dolphinscheduler.server.worker.task.python.PythonTask;
import org.apache.dolphinscheduler.server.worker.task.shell.ShellTask;
import org.apache.dolphinscheduler.server.worker.task.spark.SparkTask;
import org.apache.dolphinscheduler.server.worker.task.sql.SqlTask;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopTask;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.slf4j.Logger;
/**
* task manager
*/
public class TaskManager {
/**
* create new task
* @param taskExecutionContext taskExecutionContext
* @param logger logger
* @return AbstractTask
* @throws IllegalArgumentException illegal argument exception
*/
public static AbstractTask newTask(TaskExecutionContext taskExecutionContext, Logger logger, AlertClientService alertClientService) throws IllegalArgumentException {
String taskType = taskExecutionContext.getTaskType();
if (taskType == null) {
logger.error("task type is null");
throw new IllegalArgumentException("task type is null");
}
switch (taskType) {
case "SHELL":
case "WATERDROP":
return new ShellTask(taskExecutionContext, logger);
case "PROCEDURE":
return new ProcedureTask(taskExecutionContext, logger);
case "SQL":
return new SqlTask(taskExecutionContext, logger, alertClientService);
case "MR":
return new MapReduceTask(taskExecutionContext, logger);
case "SPARK":
return new SparkTask(taskExecutionContext, logger);
case "FLINK":
return new FlinkTask(taskExecutionContext, logger);
case "PYTHON":
return new PythonTask(taskExecutionContext, logger);
case "HTTP":
return new HttpTask(taskExecutionContext, logger);
case "DATAX":
return new DataxTask(taskExecutionContext, logger);
case "SQOOP":
return new SqoopTask(taskExecutionContext, logger);
default:
logger.error("not support task type: {}", taskType);
throw new IllegalArgumentException("not support task type");
}
}
}
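
A sketch of the dispatch above, assuming a no-arg TaskExecutionContext with a task-type setter and an slf4j logger in scope; alertClientService is only consumed by the SQL branch:

TaskExecutionContext ctx = new TaskExecutionContext();
ctx.setTaskType("SHELL");
AbstractTask task = TaskManager.newTask(ctx, logger, null); // -> ShellTask
task.init();
task.handle(); // declared to throw Exception; handle or propagate in real code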

346
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/TaskProps.java

@@ -1,346 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import com.fasterxml.jackson.annotation.JsonFormat;
import org.apache.dolphinscheduler.common.enums.CommandType;
import org.apache.dolphinscheduler.common.enums.DataType;
import org.apache.dolphinscheduler.common.enums.Direct;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.process.Property;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
/**
* task props
*/
public class TaskProps {
/**
* task node name
**/
private String taskName;
/**
* task instance id
**/
private int taskInstanceId;
/**
* tenant code , execute task linux user
**/
private String tenantCode;
/**
* task type
*/
private String taskType;
/**
* task parameters
**/
private String taskParams;
/**
* queue
**/
private String queue;
/**
* env file
**/
private String envFile;
/**
* defined params
**/
private Map<String, String> definedParams;
/**
* task app id
*/
private String taskAppId;
/**
* task start time
*/
@JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss",timezone="GMT+8")
private Date taskStartTime;
/**
* task timeout
*/
private int taskTimeout;
/**
* task timeout strategy
*/
private TaskTimeoutStrategy taskTimeoutStrategy;
/**
* task dependence
*/
private String dependence;
/**
* schedule time
*/
@JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss",timezone="GMT+8")
private Date scheduleTime;
/**
* command type is complement
*/
private CommandType cmdTypeIfComplement;
/**
* host
*/
private String host;
/**
* log path
*/
private String logPath;
/**
* execute path
*/
private String executePath;
/**
* constructor
*/
public TaskProps(){}
/**
* constructor
* @param taskParams taskParams
* @param scheduleTime scheduleTime
* @param nodeName nodeName
* @param taskType taskType
* @param taskInstanceId taskInstanceId
* @param envFile envFile
* @param tenantCode tenantCode
* @param queue queue
* @param taskStartTime taskStartTime
* @param definedParams definedParams
* @param dependence dependence
* @param cmdTypeIfComplement cmdTypeIfComplement
* @param host host
* @param logPath logPath
* @param executePath executePath
*/
public TaskProps(String taskParams,
Date scheduleTime,
String nodeName,
String taskType,
int taskInstanceId,
String envFile,
String tenantCode,
String queue,
Date taskStartTime,
Map<String, String> definedParams,
String dependence,
CommandType cmdTypeIfComplement,
String host,
String logPath,
String executePath){
this.taskParams = taskParams;
this.scheduleTime = scheduleTime;
this.taskName = nodeName;
this.taskType = taskType;
this.taskInstanceId = taskInstanceId;
this.envFile = envFile;
this.tenantCode = tenantCode;
this.queue = queue;
this.taskStartTime = taskStartTime;
this.definedParams = definedParams;
this.dependence = dependence;
this.cmdTypeIfComplement = cmdTypeIfComplement;
this.host = host;
this.logPath = logPath;
this.executePath = executePath;
}
public String getTenantCode() {
return tenantCode;
}
public void setTenantCode(String tenantCode) {
this.tenantCode = tenantCode;
}
public String getTaskParams() {
return taskParams;
}
public void setTaskParams(String taskParams) {
this.taskParams = taskParams;
}
public String getExecutePath() {
return executePath;
}
public void setExecutePath(String executePath) {
this.executePath = executePath;
}
public Map<String, String> getDefinedParams() {
return definedParams;
}
public void setDefinedParams(Map<String, String> definedParams) {
this.definedParams = definedParams;
}
public String getEnvFile() {
return envFile;
}
public void setEnvFile(String envFile) {
this.envFile = envFile;
}
public String getTaskName() {
return taskName;
}
public void setTaskName(String taskName) {
this.taskName = taskName;
}
public int getTaskInstanceId() {
return taskInstanceId;
}
public void setTaskInstanceId(int taskInstanceId) {
this.taskInstanceId = taskInstanceId;
}
public String getQueue() {
return queue;
}
public void setQueue(String queue) {
this.queue = queue;
}
public String getTaskAppId() {
return taskAppId;
}
public void setTaskAppId(String taskAppId) {
this.taskAppId = taskAppId;
}
public Date getTaskStartTime() {
return taskStartTime;
}
public void setTaskStartTime(Date taskStartTime) {
this.taskStartTime = taskStartTime;
}
public int getTaskTimeout() {
return taskTimeout;
}
public void setTaskTimeout(int taskTimeout) {
this.taskTimeout = taskTimeout;
}
public TaskTimeoutStrategy getTaskTimeoutStrategy() {
return taskTimeoutStrategy;
}
public void setTaskTimeoutStrategy(TaskTimeoutStrategy taskTimeoutStrategy) {
this.taskTimeoutStrategy = taskTimeoutStrategy;
}
public String getTaskType() {
return taskType;
}
public void setTaskType(String taskType) {
this.taskType = taskType;
}
public String getDependence() {
return dependence;
}
public void setDependence(String dependence) {
this.dependence = dependence;
}
public Date getScheduleTime() {
return scheduleTime;
}
public void setScheduleTime(Date scheduleTime) {
this.scheduleTime = scheduleTime;
}
public CommandType getCmdTypeIfComplement() {
return cmdTypeIfComplement;
}
public void setCmdTypeIfComplement(CommandType cmdTypeIfComplement) {
this.cmdTypeIfComplement = cmdTypeIfComplement;
}
public String getHost() {
return host;
}
public void setHost(String host) {
this.host = host;
}
public String getLogPath() {
return logPath;
}
public void setLogPath(String logPath) {
this.logPath = logPath;
}
/**
* get parameters map
* @return user defined params map
*/
public Map<String,Property> getUserDefParamsMap() {
if (definedParams != null) {
Map<String,Property> userDefParamsMaps = new HashMap<>();
Iterator<Map.Entry<String, String>> iter = definedParams.entrySet().iterator();
while (iter.hasNext()){
Map.Entry<String, String> en = iter.next();
Property property = new Property(en.getKey(), Direct.IN, DataType.VARCHAR , en.getValue());
userDefParamsMaps.put(property.getProp(),property);
}
return userDefParamsMaps;
}
return null;
}
}
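
A sketch of getUserDefParamsMap() above, which wraps each defined param as an IN/VARCHAR Property; the key and value are made up:

TaskProps props = new TaskProps();
Map<String, String> defined = new HashMap<>();
defined.put("bizDate", "2021-08-30"); // hypothetical user-defined parameter
props.setDefinedParams(defined);
Map<String, Property> userParams = props.getUserDefParamsMap();
// userParams.get("bizDate") -> Property("bizDate", Direct.IN, DataType.VARCHAR, "2021-08-30")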

580
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/datax/DataxTask.java

@@ -1,580 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.datax;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.BaseConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.enums.Flag;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.datax.DataxParameters;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.DataxTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.DataxUtils;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.CommandExecuteResult;
import org.apache.dolphinscheduler.server.worker.task.ShellCommandExecutor;
import org.apache.commons.io.FileUtils;
import java.io.File;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.slf4j.Logger;
import com.alibaba.druid.sql.ast.SQLStatement;
import com.alibaba.druid.sql.ast.expr.SQLIdentifierExpr;
import com.alibaba.druid.sql.ast.expr.SQLPropertyExpr;
import com.alibaba.druid.sql.ast.statement.SQLSelect;
import com.alibaba.druid.sql.ast.statement.SQLSelectItem;
import com.alibaba.druid.sql.ast.statement.SQLSelectQueryBlock;
import com.alibaba.druid.sql.ast.statement.SQLSelectStatement;
import com.alibaba.druid.sql.ast.statement.SQLUnionQuery;
import com.alibaba.druid.sql.parser.SQLStatementParser;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
/**
* DataX task
*/
public class DataxTask extends AbstractTask {
/**
* jvm parameters
*/
public static final String JVM_PARAM = " --jvm=\"-Xms%sG -Xmx%sG\" ";
/**
 * python process (DataX only supports Python 2.7 by default)
*/
private static final String DATAX_PYTHON = "python2.7";
private static final Pattern PYTHON_PATH_PATTERN = Pattern.compile("/bin/python[\\d.]*$");
/**
* datax path
*/
private static final String DATAX_PATH = "${DATAX_HOME}/bin/datax.py";
/**
* datax channel count
*/
private static final int DATAX_CHANNEL_COUNT = 1;
/**
* datax parameters
*/
private DataxParameters dataXParameters;
/**
* shell command executor
*/
private ShellCommandExecutor shellCommandExecutor;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
*
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public DataxTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
this.shellCommandExecutor = new ShellCommandExecutor(this::logHandle,
taskExecutionContext, logger);
}
/**
* init DataX config
*/
@Override
public void init() {
logger.info("datax task params {}", taskExecutionContext.getTaskParams());
dataXParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), DataxParameters.class);
if (!dataXParameters.checkParameters()) {
throw new RuntimeException("datax task params is not valid");
}
}
/**
* run DataX process
*
* @throws Exception if error throws Exception
*/
@Override
public void handle() throws Exception {
try {
// set the name of the current thread
String threadLoggerInfoName = String.format("TaskLogInfo-%s", taskExecutionContext.getTaskAppId());
Thread.currentThread().setName(threadLoggerInfoName);
// replace placeholders and combine local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
// run the datax process
String jsonFilePath = buildDataxJsonFile(paramsMap);
String shellCommandFilePath = buildShellCommandFile(jsonFilePath, paramsMap);
CommandExecuteResult commandExecuteResult = shellCommandExecutor.run(shellCommandFilePath);
setExitStatusCode(commandExecuteResult.getExitStatusCode());
setAppIds(commandExecuteResult.getAppIds());
setProcessId(commandExecuteResult.getProcessId());
} catch (Exception e) {
setExitStatusCode(Constants.EXIT_CODE_FAILURE);
throw e;
}
}
/**
* cancel DataX process
*
* @param cancelApplication cancelApplication
* @throws Exception if error throws Exception
*/
@Override
public void cancelApplication(boolean cancelApplication)
throws Exception {
// cancel process
shellCommandExecutor.cancelApplication();
}
/**
* build datax configuration file
*
* @return datax json file name
* @throws Exception if error throws Exception
*/
private String buildDataxJsonFile(Map<String, Property> paramsMap)
throws Exception {
// generate json
String fileName = String.format("%s/%s_job.json",
taskExecutionContext.getExecutePath(),
taskExecutionContext.getTaskAppId());
String json;
Path path = new File(fileName).toPath();
if (Files.exists(path)) {
return fileName;
}
if (dataXParameters.getCustomConfig() == Flag.YES.ordinal()) {
json = dataXParameters.getJson().replaceAll("\\r\\n", "\n");
} else {
ObjectNode job = JSONUtils.createObjectNode();
job.putArray("content").addAll(buildDataxJobContentJson());
job.set("setting", buildDataxJobSettingJson());
ObjectNode root = JSONUtils.createObjectNode();
root.set("job", job);
root.set("core", buildDataxCoreJson());
json = root.toString();
}
// replace placeholder
json = ParameterUtils.convertParameterPlaceholders(json, ParamUtils.convert(paramsMap));
logger.debug("datax job json : {}", json);
// create datax json file
FileUtils.writeStringToFile(new File(fileName), json, StandardCharsets.UTF_8);
return fileName;
}
/**
* build datax job config
*
 * @return list of datax job content ObjectNode
*/
private List<ObjectNode> buildDataxJobContentJson() {
DataxTaskExecutionContext dataxTaskExecutionContext = taskExecutionContext.getDataxTaskExecutionContext();
BaseConnectionParam dataSourceCfg = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
DbType.of(dataxTaskExecutionContext.getSourcetype()),
dataxTaskExecutionContext.getSourceConnectionParams());
BaseConnectionParam dataTargetCfg = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
DbType.of(dataxTaskExecutionContext.getTargetType()),
dataxTaskExecutionContext.getTargetConnectionParams());
List<ObjectNode> readerConnArr = new ArrayList<>();
ObjectNode readerConn = JSONUtils.createObjectNode();
ArrayNode sqlArr = readerConn.putArray("querySql");
for (String sql : new String[]{dataXParameters.getSql()}) {
sqlArr.add(sql);
}
ArrayNode urlArr = readerConn.putArray("jdbcUrl");
urlArr.add(DatasourceUtil.getJdbcUrl(DbType.valueOf(dataXParameters.getDsType()), dataSourceCfg));
readerConnArr.add(readerConn);
ObjectNode readerParam = JSONUtils.createObjectNode();
readerParam.put("username", dataSourceCfg.getUser());
readerParam.put("password", CommonUtils.decodePassword(dataSourceCfg.getPassword()));
readerParam.putArray("connection").addAll(readerConnArr);
ObjectNode reader = JSONUtils.createObjectNode();
reader.put("name", DataxUtils.getReaderPluginName(DbType.of(dataxTaskExecutionContext.getSourcetype())));
reader.set("parameter", readerParam);
List<ObjectNode> writerConnArr = new ArrayList<>();
ObjectNode writerConn = JSONUtils.createObjectNode();
ArrayNode tableArr = writerConn.putArray("table");
tableArr.add(dataXParameters.getTargetTable());
writerConn.put("jdbcUrl", DatasourceUtil.getJdbcUrl(DbType.valueOf(dataXParameters.getDtType()), dataTargetCfg));
writerConnArr.add(writerConn);
ObjectNode writerParam = JSONUtils.createObjectNode();
writerParam.put("username", dataTargetCfg.getUser());
writerParam.put("password", CommonUtils.decodePassword(dataTargetCfg.getPassword()));
String[] columns = parsingSqlColumnNames(DbType.of(dataxTaskExecutionContext.getSourcetype()),
DbType.of(dataxTaskExecutionContext.getTargetType()),
dataSourceCfg, dataXParameters.getSql());
ArrayNode columnArr = writerParam.putArray("column");
for (String column : columns) {
columnArr.add(column);
}
writerParam.putArray("connection").addAll(writerConnArr);
if (CollectionUtils.isNotEmpty(dataXParameters.getPreStatements())) {
ArrayNode preSqlArr = writerParam.putArray("preSql");
for (String preSql : dataXParameters.getPreStatements()) {
preSqlArr.add(preSql);
}
}
if (CollectionUtils.isNotEmpty(dataXParameters.getPostStatements())) {
ArrayNode postSqlArr = writerParam.putArray("postSql");
for (String postSql : dataXParameters.getPostStatements()) {
postSqlArr.add(postSql);
}
}
ObjectNode writer = JSONUtils.createObjectNode();
writer.put("name", DataxUtils.getWriterPluginName(DbType.of(dataxTaskExecutionContext.getTargetType())));
writer.set("parameter", writerParam);
List<ObjectNode> contentList = new ArrayList<>();
ObjectNode content = JSONUtils.createObjectNode();
content.set("reader", reader);
content.set("writer", writer);
contentList.add(content);
return contentList;
}
/**
* build datax setting config
*
* @return datax setting config JSONObject
*/
private ObjectNode buildDataxJobSettingJson() {
ObjectNode speed = JSONUtils.createObjectNode();
speed.put("channel", DATAX_CHANNEL_COUNT);
if (dataXParameters.getJobSpeedByte() > 0) {
speed.put("byte", dataXParameters.getJobSpeedByte());
}
if (dataXParameters.getJobSpeedRecord() > 0) {
speed.put("record", dataXParameters.getJobSpeedRecord());
}
ObjectNode errorLimit = JSONUtils.createObjectNode();
errorLimit.put("record", 0);
errorLimit.put("percentage", 0);
ObjectNode setting = JSONUtils.createObjectNode();
setting.set("speed", speed);
setting.set("errorLimit", errorLimit);
return setting;
}
private ObjectNode buildDataxCoreJson() {
ObjectNode speed = JSONUtils.createObjectNode();
speed.put("channel", DATAX_CHANNEL_COUNT);
if (dataXParameters.getJobSpeedByte() > 0) {
speed.put("byte", dataXParameters.getJobSpeedByte());
}
if (dataXParameters.getJobSpeedRecord() > 0) {
speed.put("record", dataXParameters.getJobSpeedRecord());
}
ObjectNode channel = JSONUtils.createObjectNode();
channel.set("speed", speed);
ObjectNode transport = JSONUtils.createObjectNode();
transport.set("channel", channel);
ObjectNode core = JSONUtils.createObjectNode();
core.set("transport", transport);
return core;
}
/**
* create command
*
* @return shell command file name
* @throws Exception if error throws Exception
*/
private String buildShellCommandFile(String jobConfigFilePath, Map<String, Property> paramsMap)
throws Exception {
// generate scripts
String fileName = String.format("%s/%s_node.%s",
taskExecutionContext.getExecutePath(),
taskExecutionContext.getTaskAppId(),
OSUtils.isWindows() ? "bat" : "sh");
Path path = new File(fileName).toPath();
if (Files.exists(path)) {
return fileName;
}
// datax python command
StringBuilder sbr = new StringBuilder();
sbr.append(getPythonCommand());
sbr.append(" ");
sbr.append(DATAX_PATH);
sbr.append(" ");
sbr.append(loadJvmEnv(dataXParameters));
sbr.append(jobConfigFilePath);
// replace placeholder
String dataxCommand = ParameterUtils.convertParameterPlaceholders(sbr.toString(), ParamUtils.convert(paramsMap));
logger.debug("raw script : {}", dataxCommand);
// create shell command file
Set<PosixFilePermission> perms = PosixFilePermissions.fromString(Constants.RWXR_XR_X);
FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(perms);
if (OSUtils.isWindows()) {
Files.createFile(path);
} else {
Files.createFile(path, attr);
}
Files.write(path, dataxCommand.getBytes(), StandardOpenOption.APPEND);
return fileName;
}
public String getPythonCommand() {
String pythonHome = System.getenv("PYTHON_HOME");
return getPythonCommand(pythonHome);
}
public String getPythonCommand(String pythonHome) {
if (StringUtils.isEmpty(pythonHome)) {
return DATAX_PYTHON;
}
String pythonBinPath = "/bin/" + DATAX_PYTHON;
Matcher matcher = PYTHON_PATH_PATTERN.matcher(pythonHome);
if (matcher.find()) {
return matcher.replaceAll(pythonBinPath);
}
return Paths.get(pythonHome, pythonBinPath).toString();
}
public String loadJvmEnv(DataxParameters dataXParameters) {
int xms = Math.max(dataXParameters.getXms(), 1);
int xmx = Math.max(dataXParameters.getXmx(), 1);
return String.format(JVM_PARAM, xms, xmx);
}
/**
* parsing synchronized column names in SQL statements
*
* @param sourceType the database type of the data source
* @param targetType the database type of the data target
* @param dataSourceCfg the database connection parameters of the data source
* @param sql sql for data synchronization
* @return Keyword converted column names
*/
private String[] parsingSqlColumnNames(DbType sourceType, DbType targetType, BaseConnectionParam dataSourceCfg, String sql) {
String[] columnNames = tryGrammaticalAnalysisSqlColumnNames(sourceType, sql);
if (columnNames == null || columnNames.length == 0) {
logger.info("try to execute sql analysis query column name");
columnNames = tryExecuteSqlResolveColumnNames(sourceType, dataSourceCfg, sql);
}
notNull(columnNames, String.format("parsing sql columns failed : %s", sql));
return DataxUtils.convertKeywordsColumns(targetType, columnNames);
}
/**
* try grammatical parsing column
*
* @param dbType database type
* @param sql sql for data synchronization
* @return column name array
* @throws RuntimeException if error throws RuntimeException
*/
private String[] tryGrammaticalAnalysisSqlColumnNames(DbType dbType, String sql) {
String[] columnNames;
try {
SQLStatementParser parser = DataxUtils.getSqlStatementParser(dbType, sql);
if (parser == null) {
logger.warn("database driver [{}] is not support grammatical analysis sql", dbType);
return new String[0];
}
SQLStatement sqlStatement = parser.parseStatement();
SQLSelectStatement sqlSelectStatement = (SQLSelectStatement) sqlStatement;
SQLSelect sqlSelect = sqlSelectStatement.getSelect();
List<SQLSelectItem> selectItemList = null;
if (sqlSelect.getQuery() instanceof SQLSelectQueryBlock) {
SQLSelectQueryBlock block = (SQLSelectQueryBlock) sqlSelect.getQuery();
selectItemList = block.getSelectList();
} else if (sqlSelect.getQuery() instanceof SQLUnionQuery) {
SQLUnionQuery unionQuery = (SQLUnionQuery) sqlSelect.getQuery();
SQLSelectQueryBlock block = (SQLSelectQueryBlock) unionQuery.getRight();
selectItemList = block.getSelectList();
}
notNull(selectItemList,
String.format("select query type [%s] is not support", sqlSelect.getQuery().toString()));
columnNames = new String[selectItemList.size()];
for (int i = 0; i < selectItemList.size(); i++) {
SQLSelectItem item = selectItemList.get(i);
String columnName = null;
if (item.getAlias() != null) {
columnName = item.getAlias();
} else if (item.getExpr() != null) {
if (item.getExpr() instanceof SQLPropertyExpr) {
SQLPropertyExpr expr = (SQLPropertyExpr) item.getExpr();
columnName = expr.getName();
} else if (item.getExpr() instanceof SQLIdentifierExpr) {
SQLIdentifierExpr expr = (SQLIdentifierExpr) item.getExpr();
columnName = expr.getName();
}
} else {
throw new RuntimeException(
String.format("grammatical analysis sql column [ %s ] failed", item.toString()));
}
if (columnName == null) {
throw new RuntimeException(
String.format("grammatical analysis sql column [ %s ] failed", item.toString()));
}
columnNames[i] = columnName;
}
} catch (Exception e) {
logger.warn(e.getMessage(), e);
return new String[0];
}
return columnNames;
}
/**
* try to execute sql to resolve column names
*
 * @param sourceType the database type of the data source
 * @param baseDataSource the database connection parameters
* @param sql sql for data synchronization
* @return column name array
*/
public String[] tryExecuteSqlResolveColumnNames(DbType sourceType, BaseConnectionParam baseDataSource, String sql) {
String[] columnNames;
sql = String.format("SELECT t.* FROM ( %s ) t WHERE 0 = 1", sql);
sql = sql.replace(";", "");
try (
Connection connection = DatasourceUtil.getConnection(sourceType, baseDataSource);
PreparedStatement stmt = connection.prepareStatement(sql);
ResultSet resultSet = stmt.executeQuery()) {
ResultSetMetaData md = resultSet.getMetaData();
int num = md.getColumnCount();
columnNames = new String[num];
for (int i = 1; i <= num; i++) {
columnNames[i - 1] = md.getColumnName(i);
}
} catch (SQLException e) {
logger.warn(e.getMessage(), e);
return null;
}
return columnNames;
}
@Override
public AbstractParameters getParameters() {
return dataXParameters;
}
private void notNull(Object obj, String message) {
if (obj == null) {
throw new RuntimeException(message);
}
}
}
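
A sketch of the pieces buildShellCommandFile glues together above; the heap sizes and paths are hypothetical:

// loadJvmEnv clamps both heap sizes to at least 1 GiB before formatting JVM_PARAM
String jvm = String.format(DataxTask.JVM_PARAM, Math.max(2, 1), Math.max(4, 1)); // -> " --jvm=\"-Xms2G -Xmx4G\" "
// the generated script then holds a single line shaped like:
// python2.7 ${DATAX_HOME}/bin/datax.py  --jvm="-Xms2G -Xmx4G" /tmp/exec/42_job.json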

143
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/flink/FlinkTask.java

@@ -1,143 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.flink;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.process.ResourceInfo;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.flink.FlinkParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.dao.entity.Resource;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.FlinkArgsUtils;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractYarnTask;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.slf4j.Logger;
/**
* flink task
*/
public class FlinkTask extends AbstractYarnTask {
/**
* flink command
* usage: flink run [OPTIONS] <jar-file> <arguments>
*/
private static final String FLINK_COMMAND = "flink";
private static final String FLINK_RUN = "run";
/**
* flink parameters
*/
private FlinkParameters flinkParameters;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
public FlinkTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
}
@Override
public void init() {
logger.info("flink task params {}", taskExecutionContext.getTaskParams());
flinkParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), FlinkParameters.class);
if (flinkParameters == null || !flinkParameters.checkParameters()) {
throw new RuntimeException("flink task params is not valid");
}
flinkParameters.setQueue(taskExecutionContext.getQueue());
setMainJarName();
if (StringUtils.isNotEmpty(flinkParameters.getMainArgs())) {
String args = flinkParameters.getMainArgs();
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext,getParameters());
logger.info("param Map : {}", paramsMap);
if (paramsMap != null) {
args = ParameterUtils.convertParameterPlaceholders(args, ParamUtils.convert(paramsMap));
logger.info("param args : {}", args);
}
flinkParameters.setMainArgs(args);
}
}
/**
* create command
* @return command
*/
@Override
protected String buildCommand() {
// flink run [OPTIONS] <jar-file> <arguments>
List<String> args = new ArrayList<>();
args.add(FLINK_COMMAND);
args.add(FLINK_RUN);
logger.info("flink task args : {}", args);
// other parameters
args.addAll(FlinkArgsUtils.buildArgs(flinkParameters));
String command = ParameterUtils
.convertParameterPlaceholders(String.join(" ", args), taskExecutionContext.getDefinedParams());
logger.info("flink task command : {}", command);
return command;
}
@Override
protected void setMainJarName() {
// main jar
ResourceInfo mainJar = flinkParameters.getMainJar();
if (mainJar != null) {
int resourceId = mainJar.getId();
String resourceName;
if (resourceId == 0) {
resourceName = mainJar.getRes();
} else {
Resource resource = processService.getResourceById(flinkParameters.getMainJar().getId());
if (resource == null) {
logger.error("resource id: {} not exist", resourceId);
throw new RuntimeException(String.format("resource id: %d not exist", resourceId));
}
resourceName = resource.getFullName().replaceFirst("/", "");
}
mainJar.setRes(resourceName);
flinkParameters.setMainJar(mainJar);
}
}
@Override
public AbstractParameters getParameters() {
return flinkParameters;
}
}
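
A sketch of the command buildCommand assembles above; the options shown stand in for FlinkArgsUtils.buildArgs(...) output and are hypothetical:

List<String> args = new ArrayList<>();
args.add("flink");
args.add("run");
args.add("-m");
args.add("yarn-cluster");  // stand-in for FlinkArgsUtils.buildArgs(flinkParameters)
args.add("demo-job.jar");  // hypothetical main jar
String command = String.join(" ", args); // -> "flink run -m yarn-cluster demo-job.jar"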

332
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/http/HttpTask.java

@@ -1,332 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.http;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.HttpMethod;
import org.apache.dolphinscheduler.common.enums.HttpParametersType;
import org.apache.dolphinscheduler.common.process.HttpProperty;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.http.HttpParameters;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.commons.io.Charsets;
import org.apache.http.HttpEntity;
import org.apache.http.ParseException;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpUriRequest;
import org.apache.http.client.methods.RequestBuilder;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.slf4j.Logger;
import com.fasterxml.jackson.databind.node.ObjectNode;
/**
* http task
*/
public class HttpTask extends AbstractTask {
/**
* application json
*/
protected static final String APPLICATION_JSON = "application/json";
/**
* output
*/
protected String output;
/**
* http parameters
*/
private HttpParameters httpParameters;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
*
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public HttpTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
}
@Override
public void init() {
logger.info("http task params {}", taskExecutionContext.getTaskParams());
this.httpParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), HttpParameters.class);
if (!httpParameters.checkParameters()) {
throw new RuntimeException("http task params is not valid");
}
}
@Override
public void handle() throws Exception {
String threadLoggerInfoName = String.format(Constants.TASK_LOG_INFO_FORMAT, taskExecutionContext.getTaskAppId());
Thread.currentThread().setName(threadLoggerInfoName);
long startTime = System.currentTimeMillis();
String formatTimeStamp = DateUtils.formatTimeStamp(startTime);
String statusCode = null;
String body = null;
try (CloseableHttpClient client = createHttpClient();
CloseableHttpResponse response = sendRequest(client)) {
statusCode = String.valueOf(getStatusCode(response));
body = getResponseBody(response);
exitStatusCode = validResponse(body, statusCode);
long costTime = System.currentTimeMillis() - startTime;
logger.info("startTime: {}, httpUrl: {}, httpMethod: {}, costTime : {} milliseconds, statusCode : {}, body : {}, log : {}",
formatTimeStamp, httpParameters.getUrl(),
httpParameters.getHttpMethod(), costTime, statusCode, body, output);
} catch (Exception e) {
appendMessage(e.toString());
exitStatusCode = -1;
logger.error("httpUrl[" + httpParameters.getUrl() + "] connection failed:" + output, e);
throw e;
}
}
/**
* send request
*
* @param client client
* @return CloseableHttpResponse
* @throws IOException io exception
*/
protected CloseableHttpResponse sendRequest(CloseableHttpClient client) throws IOException {
RequestBuilder builder = createRequestBuilder();
// replace placeholders and combine local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
List<HttpProperty> httpPropertyList = new ArrayList<>();
if (CollectionUtils.isNotEmpty(httpParameters.getHttpParams())) {
for (HttpProperty httpProperty : httpParameters.getHttpParams()) {
String jsonObject = JSONUtils.toJsonString(httpProperty);
String params = ParameterUtils.convertParameterPlaceholders(jsonObject, ParamUtils.convert(paramsMap));
logger.info("http request params:{}", params);
httpPropertyList.add(JSONUtils.parseObject(params, HttpProperty.class));
}
}
addRequestParams(builder, httpPropertyList);
String requestUrl = ParameterUtils.convertParameterPlaceholders(httpParameters.getUrl(), ParamUtils.convert(paramsMap));
HttpUriRequest request = builder.setUri(requestUrl).build();
setHeaders(request, httpPropertyList);
return client.execute(request);
}
/**
* get response body
*
* @param httpResponse http response
* @return response body
* @throws ParseException parse exception
* @throws IOException io exception
*/
protected String getResponseBody(CloseableHttpResponse httpResponse) throws ParseException, IOException {
if (httpResponse == null) {
return null;
}
HttpEntity entity = httpResponse.getEntity();
if (entity == null) {
return null;
}
return EntityUtils.toString(entity, StandardCharsets.UTF_8.name());
}
/**
* get status code
*
* @param httpResponse http response
* @return status code
*/
protected int getStatusCode(CloseableHttpResponse httpResponse) {
return httpResponse.getStatusLine().getStatusCode();
}
/**
* valid response
*
* @param body body
* @param statusCode status code
* @return exit status code
*/
protected int validResponse(String body, String statusCode) {
int exitStatusCode = 0;
switch (httpParameters.getHttpCheckCondition()) {
case BODY_CONTAINS:
if (StringUtils.isEmpty(body) || !body.contains(httpParameters.getCondition())) {
appendMessage(httpParameters.getUrl() + " does not contain "
+ httpParameters.getCondition());
exitStatusCode = -1;
}
break;
case BODY_NOT_CONTAINS:
if (StringUtils.isEmpty(body) || body.contains(httpParameters.getCondition())) {
appendMessage(httpParameters.getUrl() + " contains "
+ httpParameters.getCondition());
exitStatusCode = -1;
}
break;
case STATUS_CODE_CUSTOM:
if (!statusCode.equals(httpParameters.getCondition())) {
appendMessage(httpParameters.getUrl() + " statuscode: " + statusCode + ", Must be: " + httpParameters.getCondition());
exitStatusCode = -1;
}
break;
default:
if (!"200".equals(statusCode)) {
appendMessage(httpParameters.getUrl() + " statuscode: " + statusCode + ", Must be: 200");
exitStatusCode = -1;
}
break;
}
return exitStatusCode;
}
public String getOutput() {
return output;
}
/**
* append message
*
* @param message message
*/
protected void appendMessage(String message) {
if (output == null) {
output = "";
}
if (message != null && !message.trim().isEmpty()) {
output += message;
}
}
/**
* add request params
*
 * @param builder builder
* @param httpPropertyList http property list
*/
protected void addRequestParams(RequestBuilder builder, List<HttpProperty> httpPropertyList) {
if (CollectionUtils.isNotEmpty(httpPropertyList)) {
ObjectNode jsonParam = JSONUtils.createObjectNode();
for (HttpProperty property : httpPropertyList) {
if (property.getHttpParametersType() != null) {
if (property.getHttpParametersType().equals(HttpParametersType.PARAMETER)) {
builder.addParameter(property.getProp(), property.getValue());
} else if (property.getHttpParametersType().equals(HttpParametersType.BODY)) {
jsonParam.put(property.getProp(), property.getValue());
}
}
}
StringEntity postingString = new StringEntity(jsonParam.toString(), Charsets.UTF_8);
postingString.setContentEncoding(StandardCharsets.UTF_8.name());
postingString.setContentType(APPLICATION_JSON);
builder.setEntity(postingString);
}
}
/**
* set headers
*
* @param request request
* @param httpPropertyList http property list
*/
protected void setHeaders(HttpUriRequest request, List<HttpProperty> httpPropertyList) {
if (CollectionUtils.isNotEmpty(httpPropertyList)) {
for (HttpProperty property : httpPropertyList) {
if (HttpParametersType.HEADERS.equals(property.getHttpParametersType())) {
request.addHeader(property.getProp(), property.getValue());
}
}
}
}
/**
* create http client
*
* @return CloseableHttpClient
*/
protected CloseableHttpClient createHttpClient() {
final RequestConfig requestConfig = requestConfig();
HttpClientBuilder httpClientBuilder;
httpClientBuilder = HttpClients.custom().setDefaultRequestConfig(requestConfig);
return httpClientBuilder.build();
}
/**
* request config
*
* @return RequestConfig
*/
private RequestConfig requestConfig() {
return RequestConfig.custom().setSocketTimeout(httpParameters.getSocketTimeout()).setConnectTimeout(httpParameters.getConnectTimeout()).build();
}
/**
* create request builder
*
* @return RequestBuilder
*/
protected RequestBuilder createRequestBuilder() {
if (httpParameters.getHttpMethod().equals(HttpMethod.GET)) {
return RequestBuilder.get();
} else if (httpParameters.getHttpMethod().equals(HttpMethod.POST)) {
return RequestBuilder.post();
} else if (httpParameters.getHttpMethod().equals(HttpMethod.HEAD)) {
return RequestBuilder.head();
} else if (httpParameters.getHttpMethod().equals(HttpMethod.PUT)) {
return RequestBuilder.put();
} else if (httpParameters.getHttpMethod().equals(HttpMethod.DELETE)) {
return RequestBuilder.delete();
} else {
return null;
}
}
@Override
public AbstractParameters getParameters() {
return this.httpParameters;
}
}
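
A sketch of the default branch of validResponse above: anything other than HTTP 200 fails the task; the status code is made up:

String statusCode = "500";                                // hypothetical response code
int exitStatusCode = "200".equals(statusCode) ? 0 : -1;   // -1 marks the task failed
// BODY_CONTAINS / BODY_NOT_CONTAINS / STATUS_CODE_CUSTOM apply the configured condition instead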

145
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/mr/MapReduceTask.java

@@ -1,145 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.mr;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ProgramType;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.process.ResourceInfo;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.mr.MapReduceParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.dao.entity.Resource;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.MapReduceArgsUtils;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractYarnTask;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.slf4j.Logger;
/**
* mapreduce task
*/
public class MapReduceTask extends AbstractYarnTask {
/**
* mapreduce command
* usage: hadoop jar <jar> [mainClass] [GENERIC_OPTIONS] args...
*/
private static final String MAPREDUCE_COMMAND = Constants.HADOOP;
/**
* mapreduce parameters
*/
private MapReduceParameters mapreduceParameters;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public MapReduceTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
}
@Override
public void init() {
logger.info("mapreduce task params {}", taskExecutionContext.getTaskParams());
this.mapreduceParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), MapReduceParameters.class);
// check parameters
if (mapreduceParameters == null || !mapreduceParameters.checkParameters()) {
throw new RuntimeException("mapreduce task params is not valid");
}
mapreduceParameters.setQueue(taskExecutionContext.getQueue());
setMainJarName();
// replace placeholders and combine local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
if (paramsMap != null) {
String args = ParameterUtils.convertParameterPlaceholders(mapreduceParameters.getMainArgs(), ParamUtils.convert(paramsMap));
mapreduceParameters.setMainArgs(args);
if (mapreduceParameters.getProgramType() != null && mapreduceParameters.getProgramType() == ProgramType.PYTHON) {
String others = ParameterUtils.convertParameterPlaceholders(mapreduceParameters.getOthers(), ParamUtils.convert(paramsMap));
mapreduceParameters.setOthers(others);
}
}
}
/**
* build command
* @return command
*/
@Override
protected String buildCommand() {
// hadoop jar <jar> [mainClass] [GENERIC_OPTIONS] args...
List<String> args = new ArrayList<>();
args.add(MAPREDUCE_COMMAND);
// other parameters
args.addAll(MapReduceArgsUtils.buildArgs(mapreduceParameters));
String command = ParameterUtils.convertParameterPlaceholders(String.join(" ", args),
taskExecutionContext.getDefinedParams());
logger.info("mapreduce task command: {}", command);
return command;
}
@Override
protected void setMainJarName() {
// main jar
ResourceInfo mainJar = mapreduceParameters.getMainJar();
if (mainJar != null) {
int resourceId = mainJar.getId();
String resourceName;
if (resourceId == 0) {
resourceName = mainJar.getRes();
} else {
Resource resource = processService.getResourceById(mapreduceParameters.getMainJar().getId());
if (resource == null) {
logger.error("resource id: {} not exist", resourceId);
throw new RuntimeException(String.format("resource id: %d not exist", resourceId));
}
resourceName = resource.getFullName().replaceFirst("/", "");
}
mainJar.setRes(resourceName);
mapreduceParameters.setMainJar(mainJar);
}
}
@Override
public AbstractParameters getParameters() {
return mapreduceParameters;
}
}
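A minimal, self-contained sketch of the command-assembly pattern that buildCommand() above follows: collect the executable and its arguments in a list, join them with spaces, then substitute ${...} placeholders from a parameter map. The jar name, main class, and paths are invented for illustration, and the replace loop is a naive stand-in for ParameterUtils.convertParameterPlaceholders:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CommandAssemblySketch {
    public static void main(String[] args) {
        List<String> parts = new ArrayList<>();
        parts.add("hadoop");
        parts.add("jar");
        parts.add("${mainJar}");
        parts.add("com.example.WordCount"); // hypothetical main class
        parts.add("${inputDir}");
        parts.add("${outputDir}");
        String command = String.join(" ", parts);
        // naive placeholder substitution, standing in for ParameterUtils
        Map<String, String> params = Map.of(
                "mainJar", "wordcount.jar",
                "inputDir", "/tmp/in",
                "outputDir", "/tmp/out");
        for (Map.Entry<String, String> e : params.entrySet()) {
            command = command.replace("${" + e.getKey() + "}", e.getValue());
        }
        System.out.println(command);
        // prints: hadoop jar wordcount.jar com.example.WordCount /tmp/in /tmp/out
    }
}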

341
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/procedure/ProcedureTask.java

@@ -1,341 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.procedure;
import static org.apache.dolphinscheduler.common.enums.DataType.BOOLEAN;
import static org.apache.dolphinscheduler.common.enums.DataType.DATE;
import static org.apache.dolphinscheduler.common.enums.DataType.DOUBLE;
import static org.apache.dolphinscheduler.common.enums.DataType.FLOAT;
import static org.apache.dolphinscheduler.common.enums.DataType.INTEGER;
import static org.apache.dolphinscheduler.common.enums.DataType.LONG;
import static org.apache.dolphinscheduler.common.enums.DataType.TIME;
import static org.apache.dolphinscheduler.common.enums.DataType.TIMESTAMP;
import static org.apache.dolphinscheduler.common.enums.DataType.VARCHAR;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.ConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DataType;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.enums.Direct;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.procedure.ProcedureParameters;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Collection;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import org.slf4j.Logger;
/**
* procedure task
*/
public class ProcedureTask extends AbstractTask {
/**
* procedure parameters
*/
private ProcedureParameters procedureParameters;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
*
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public ProcedureTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
logger.info("procedure task params {}", taskExecutionContext.getTaskParams());
this.procedureParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), ProcedureParameters.class);
// check parameters
if (!procedureParameters.checkParameters()) {
throw new RuntimeException("procedure task params is not valid");
}
}
@Override
public void handle() throws Exception {
// set the name of the current thread
String threadLoggerInfoName = String.format(Constants.TASK_LOG_INFO_FORMAT, taskExecutionContext.getTaskAppId());
Thread.currentThread().setName(threadLoggerInfoName);
logger.info("procedure type : {}, datasource : {}, method : {} , localParams : {}",
procedureParameters.getType(),
procedureParameters.getDatasource(),
procedureParameters.getMethod(),
procedureParameters.getLocalParams());
Connection connection = null;
CallableStatement stmt = null;
try {
// load class
DbType dbType = DbType.valueOf(procedureParameters.getType());
// get datasource
ConnectionParam connectionParam = DatasourceUtil.buildConnectionParams(DbType.valueOf(procedureParameters.getType()),
taskExecutionContext.getProcedureTaskExecutionContext().getConnectionParams());
// get jdbc connection
connection = DatasourceUtil.getConnection(dbType, connectionParam);
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
// call method
stmt = connection.prepareCall(procedureParameters.getMethod());
// set timeout
setTimeout(stmt);
// outParameterMap
Map<Integer, Property> outParameterMap = getOutParameterMap(stmt, paramsMap);
stmt.executeUpdate();
// print the output parameters to the log
printOutParameter(stmt, outParameterMap);
setExitStatusCode(Constants.EXIT_CODE_SUCCESS);
} catch (Exception e) {
setExitStatusCode(Constants.EXIT_CODE_FAILURE);
logger.error("procedure task error", e);
throw e;
} finally {
close(stmt, connection);
}
}
/**
* print outParameter
*
* @param stmt CallableStatement
* @param outParameterMap outParameterMap
* @throws SQLException SQLException
*/
private void printOutParameter(CallableStatement stmt,
Map<Integer, Property> outParameterMap) throws SQLException {
Iterator<Map.Entry<Integer, Property>> iter = outParameterMap.entrySet().iterator();
while (iter.hasNext()) {
Map.Entry<Integer, Property> en = iter.next();
int index = en.getKey();
Property property = en.getValue();
String prop = property.getProp();
DataType dataType = property.getType();
// get output parameter
getOutputParameter(stmt, index, prop, dataType);
}
}
/**
* get output parameter
*
* @param stmt CallableStatement
* @param paramsMap paramsMap
* @return outParameterMap
* @throws Exception Exception
*/
private Map<Integer, Property> getOutParameterMap(CallableStatement stmt, Map<String, Property> paramsMap) throws Exception {
Map<Integer, Property> outParameterMap = new HashMap<>();
if (procedureParameters.getLocalParametersMap() == null) {
return outParameterMap;
}
Collection<Property> userDefParamsList = procedureParameters.getLocalParametersMap().values();
if (CollectionUtils.isEmpty(userDefParamsList)) {
return outParameterMap;
}
int index = 1;
for (Property property : userDefParamsList) {
logger.info("localParams : prop : {} , dirct : {} , type : {} , value : {}"
, property.getProp(),
property.getDirect(),
property.getType(),
property.getValue());
// set parameters
if (property.getDirect().equals(Direct.IN)) {
ParameterUtils.setInParameter(index, stmt, property.getType(), paramsMap.get(property.getProp()).getValue());
} else if (property.getDirect().equals(Direct.OUT)) {
setOutParameter(index, stmt, property.getType(), paramsMap.get(property.getProp()).getValue());
property.setValue(paramsMap.get(property.getProp()).getValue());
outParameterMap.put(index, property);
}
index++;
}
return outParameterMap;
}
/**
* set timeout
*
* @param stmt CallableStatement
*/
private void setTimeout(CallableStatement stmt) throws SQLException {
boolean failed = taskExecutionContext.getTaskTimeoutStrategy() == TaskTimeoutStrategy.FAILED;
boolean warnFailed = taskExecutionContext.getTaskTimeoutStrategy() == TaskTimeoutStrategy.WARNFAILED;
if (failed || warnFailed) {
stmt.setQueryTimeout(taskExecutionContext.getTaskTimeout());
}
}
/**
* close jdbc resource
*
* @param stmt stmt
* @param connection connection
*/
private void close(PreparedStatement stmt, Connection connection) {
if (stmt != null) {
try {
stmt.close();
} catch (SQLException e) {
logger.error("close prepared statement error : {}", e.getMessage(), e);
}
}
if (connection != null) {
try {
connection.close();
} catch (SQLException e) {
logger.error("close connection error : {}", e.getMessage(), e);
}
}
}
/**
* get output parameter
*
* @param stmt stmt
* @param index index
* @param prop prop
* @param dataType dataType
* @throws SQLException SQLException
*/
private void getOutputParameter(CallableStatement stmt, int index, String prop, DataType dataType) throws SQLException {
switch (dataType) {
case VARCHAR:
logger.info("out parameter varchar key : {} , value : {}", prop, stmt.getString(index));
break;
case INTEGER:
logger.info("out parameter integer key : {} , value : {}", prop, stmt.getInt(index));
break;
case LONG:
logger.info("out parameter long key : {} , value : {}", prop, stmt.getLong(index));
break;
case FLOAT:
logger.info("out parameter float key : {} , value : {}", prop, stmt.getFloat(index));
break;
case DOUBLE:
logger.info("out parameter double key : {} , value : {}", prop, stmt.getDouble(index));
break;
case DATE:
logger.info("out parameter date key : {} , value : {}", prop, stmt.getDate(index));
break;
case TIME:
logger.info("out parameter time key : {} , value : {}", prop, stmt.getTime(index));
break;
case TIMESTAMP:
logger.info("out parameter timestamp key : {} , value : {}", prop, stmt.getTimestamp(index));
break;
case BOOLEAN:
logger.info("out parameter boolean key : {} , value : {}", prop, stmt.getBoolean(index));
break;
default:
break;
}
}
@Override
public AbstractParameters getParameters() {
return procedureParameters;
}
/**
* set out parameter
*
* @param index index
* @param stmt stmt
* @param dataType dataType
* @param value value
* @throws Exception exception
*/
private void setOutParameter(int index, CallableStatement stmt, DataType dataType, String value) throws Exception {
int sqlType;
switch (dataType) {
case VARCHAR:
sqlType = Types.VARCHAR;
break;
case INTEGER:
case LONG:
sqlType = Types.INTEGER;
break;
case FLOAT:
sqlType = Types.FLOAT;
break;
case DOUBLE:
sqlType = Types.DOUBLE;
break;
case DATE:
sqlType = Types.DATE;
break;
case TIME:
sqlType = Types.TIME;
break;
case TIMESTAMP:
sqlType = Types.TIMESTAMP;
break;
case BOOLEAN:
sqlType = Types.BOOLEAN;
break;
default:
throw new IllegalStateException("Unexpected value: " + dataType);
}
if (StringUtils.isEmpty(value)) {
stmt.registerOutParameter(index, sqlType);
} else {
stmt.registerOutParameter(index, sqlType, value);
}
}
}
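For reference, a bare JDBC sketch of the IN/OUT handling that getOutParameterMap and getOutputParameter implement above. It compiles against plain JDBC, but the URL, credentials, and the procedure name add_one are invented, and running it assumes a reachable database with a matching driver on the classpath:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class ProcedureCallSketch {
    public static void main(String[] args) throws Exception {
        // hypothetical connection and stored procedure, for illustration only
        try (Connection conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/demo", "user", "pwd");
             CallableStatement stmt = conn.prepareCall("{call add_one(?, ?)}")) {
            stmt.setInt(1, 41);                          // IN parameter, like Direct.IN above
            stmt.registerOutParameter(2, Types.INTEGER); // OUT parameter, like Direct.OUT above
            stmt.executeUpdate();
            System.out.println("out value: " + stmt.getInt(2)); // read back, like getOutputParameter
        }
    }
}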

144
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/python/PythonTask.java

@@ -1,144 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.python;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.python.PythonParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.VarPoolUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.CommandExecuteResult;
import org.apache.dolphinscheduler.server.worker.task.PythonCommandExecutor;
import java.util.Map;
import org.slf4j.Logger;
/**
* python task
*/
public class PythonTask extends AbstractTask {
/**
* python parameters
*/
private PythonParameters pythonParameters;
/**
* task dir
*/
private String taskDir;
/**
* python command executor
*/
private PythonCommandExecutor pythonCommandExecutor;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public PythonTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
this.pythonCommandExecutor = new PythonCommandExecutor(this::logHandle,
taskExecutionContext,
logger);
}
@Override
public void init() {
logger.info("python task params {}", taskExecutionContext.getTaskParams());
pythonParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), PythonParameters.class);
if (!pythonParameters.checkParameters()) {
throw new RuntimeException("python task params is not valid");
}
}
@Override
public void handle() throws Exception {
try {
// construct process
CommandExecuteResult commandExecuteResult = pythonCommandExecutor.run(buildCommand());
setExitStatusCode(commandExecuteResult.getExitStatusCode());
setAppIds(commandExecuteResult.getAppIds());
setProcessId(commandExecuteResult.getProcessId());
pythonParameters.dealOutParam(pythonCommandExecutor.getVarPool());
}
catch (Exception e) {
logger.error("python task failure", e);
setExitStatusCode(Constants.EXIT_CODE_FAILURE);
throw e;
}
}
@Override
public void cancelApplication(boolean cancelApplication) throws Exception {
// cancel process
pythonCommandExecutor.cancelApplication();
}
/**
* build command
* @return raw python script
* @throws Exception exception
*/
private String buildCommand() throws Exception {
String rawPythonScript = pythonParameters.getRawScript().replaceAll("\\r\\n", "\n");
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
try {
rawPythonScript = VarPoolUtils.convertPythonScriptPlaceholders(rawPythonScript);
}
catch (StringIndexOutOfBoundsException e) {
logger.error("setShareVar field format error, raw python script : {}", rawPythonScript);
}
if (paramsMap != null) {
rawPythonScript = ParameterUtils.convertParameterPlaceholders(rawPythonScript, ParamUtils.convert(paramsMap));
}
logger.info("raw python script : {}", pythonParameters.getRawScript());
logger.info("task dir : {}", taskDir);
return rawPythonScript;
}
@Override
public AbstractParameters getParameters() {
return pythonParameters;
}
}
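A small sketch, with invented inputs, of the two transformations buildCommand() applies to the raw script: normalizing \r\n line endings, then substituting ${...} placeholders (the replace loop is a naive stand-in for ParameterUtils.convertParameterPlaceholders):

import java.util.Map;

public class PythonScriptSketch {
    public static void main(String[] args) {
        // hypothetical raw script with Windows line endings and a ${...} placeholder
        String rawScript = "print('hello ${name}')\r\nprint('done')\r\n";
        // normalize line endings, as buildCommand() does before substitution
        String script = rawScript.replaceAll("\\r\\n", "\n");
        // naive placeholder substitution from a parameter map
        for (Map.Entry<String, String> e : Map.of("name", "world").entrySet()) {
            script = script.replace("${" + e.getKey() + "}", e.getValue());
        }
        System.out.print(script);
    }
}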

185
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/shell/ShellTask.java

@@ -1,185 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.shell;
import static java.util.Calendar.DAY_OF_MONTH;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.CommandType;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.shell.ShellParameters;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.CommandExecuteResult;
import org.apache.dolphinscheduler.server.worker.task.ShellCommandExecutor;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import org.slf4j.Logger;
/**
* shell task
*/
public class ShellTask extends AbstractTask {
/**
* shell parameters
*/
private ShellParameters shellParameters;
/**
* shell command executor
*/
private ShellCommandExecutor shellCommandExecutor;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
/**
* constructor
*
* @param taskExecutionContext taskExecutionContext
* @param logger logger
*/
public ShellTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
this.shellCommandExecutor = new ShellCommandExecutor(this::logHandle,
taskExecutionContext,
logger);
}
@Override
public void init() {
logger.info("shell task params {}", taskExecutionContext.getTaskParams());
shellParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), ShellParameters.class);
if (!shellParameters.checkParameters()) {
throw new RuntimeException("shell task params is not valid");
}
}
@Override
public void handle() throws Exception {
try {
// construct process
String command = buildCommand();
CommandExecuteResult commandExecuteResult = shellCommandExecutor.run(command);
setExitStatusCode(commandExecuteResult.getExitStatusCode());
setAppIds(commandExecuteResult.getAppIds());
setProcessId(commandExecuteResult.getProcessId());
shellParameters.dealOutParam(shellCommandExecutor.getVarPool());
} catch (Exception e) {
logger.error("shell task error", e);
setExitStatusCode(Constants.EXIT_CODE_FAILURE);
throw e;
}
}
@Override
public void cancelApplication(boolean cancelApplication) throws Exception {
// cancel process
shellCommandExecutor.cancelApplication();
}
/**
* create command
*
* @return file name
* @throws Exception exception
*/
private String buildCommand() throws Exception {
// generate scripts
String fileName = String.format("%s/%s_node.%s",
taskExecutionContext.getExecutePath(),
taskExecutionContext.getTaskAppId(), OSUtils.isWindows() ? "bat" : "sh");
Path path = new File(fileName).toPath();
if (Files.exists(path)) {
return fileName;
}
String script = shellParameters.getRawScript().replaceAll("\\r\\n", "\n");
script = parseScript(script);
shellParameters.setRawScript(script);
logger.info("raw script : {}", shellParameters.getRawScript());
logger.info("task execute path : {}", taskExecutionContext.getExecutePath());
Set<PosixFilePermission> perms = PosixFilePermissions.fromString(Constants.RWXR_XR_X);
FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(perms);
if (OSUtils.isWindows()) {
Files.createFile(path);
} else {
Files.createFile(path, attr);
}
Files.write(path, shellParameters.getRawScript().getBytes(), StandardOpenOption.APPEND);
return fileName;
}
@Override
public AbstractParameters getParameters() {
return shellParameters;
}
private String parseScript(String script) {
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
// replace variable TIME with $[YYYYmmdd...] in the shell file when running a history job or a batch complement job
if (taskExecutionContext.getScheduleTime() != null) {
if (paramsMap == null) {
paramsMap = new HashMap<>();
}
Date date = taskExecutionContext.getScheduleTime();
if (CommandType.COMPLEMENT_DATA.getCode() == taskExecutionContext.getCmdTypeIfComplement()) {
date = DateUtils.add(taskExecutionContext.getScheduleTime(), DAY_OF_MONTH, 1);
}
String dateTime = DateUtils.format(date, Constants.PARAMETER_FORMAT_TIME);
Property p = new Property();
p.setValue(dateTime);
p.setProp(Constants.PARAMETER_DATETIME);
paramsMap.put(Constants.PARAMETER_DATETIME, p);
}
return ParameterUtils.convertParameterPlaceholders(script, ParamUtils.convert(paramsMap));
}
}
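A POSIX-only sketch of the script-file creation in buildCommand() above: create the file with rwxr-xr-x permissions, then append the script body. The directory, file name, and script content are invented for illustration; the real task derives them from the task execution context, and the Windows branch skips the permission attribute:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class ScriptFileSketch {
    public static void main(String[] args) throws Exception {
        // hypothetical execute path and task app id
        Path path = Files.createTempDirectory("demo").resolve("42_node.sh");
        Set<PosixFilePermission> perms = PosixFilePermissions.fromString("rwxr-xr-x");
        FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(perms);
        Files.createFile(path, attr); // non-Windows branch of buildCommand()
        Files.write(path, "#!/bin/sh\necho hello\n".getBytes(), StandardOpenOption.APPEND);
        System.out.println("script written to " + path);
    }
}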

155
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/spark/SparkTask.java

@@ -1,155 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.spark;
import org.apache.dolphinscheduler.common.enums.SparkVersion;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.process.ResourceInfo;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.spark.SparkParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.dao.entity.Resource;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.utils.SparkArgsUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractYarnTask;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.slf4j.Logger;
/**
* spark task
*/
public class SparkTask extends AbstractYarnTask {
/**
* spark1 command
* usage: spark-submit [options] <app jar | python file> [app arguments]
*/
private static final String SPARK1_COMMAND = "${SPARK_HOME1}/bin/spark-submit";
/**
* spark2 command
* usage: spark-submit [options] <app jar | python file> [app arguments]
*/
private static final String SPARK2_COMMAND = "${SPARK_HOME2}/bin/spark-submit";
/**
* spark parameters
*/
private SparkParameters sparkParameters;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
public SparkTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
}
@Override
public void init() {
logger.info("spark task params {}", taskExecutionContext.getTaskParams());
sparkParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), SparkParameters.class);
if (null == sparkParameters) {
logger.error("Spark params is null");
return;
}
if (!sparkParameters.checkParameters()) {
throw new RuntimeException("spark task params is not valid");
}
sparkParameters.setQueue(taskExecutionContext.getQueue());
setMainJarName();
}
/**
* create command
* @return command
*/
@Override
protected String buildCommand() {
// spark-submit [options] <app jar | python file> [app arguments]
List<String> args = new ArrayList<>();
// spark version
String sparkCommand = SPARK2_COMMAND;
if (SparkVersion.SPARK1.name().equals(sparkParameters.getSparkVersion())) {
sparkCommand = SPARK1_COMMAND;
}
args.add(sparkCommand);
// other parameters
args.addAll(SparkArgsUtils.buildArgs(sparkParameters));
// replace placeholder, and combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
String command = null;
if (null != paramsMap) {
command = ParameterUtils.convertParameterPlaceholders(String.join(" ", args), ParamUtils.convert(paramsMap));
}
logger.info("spark task command: {}", command);
return command;
}
@Override
protected void setMainJarName() {
// main jar
ResourceInfo mainJar = sparkParameters.getMainJar();
if (null == mainJar) {
throw new RuntimeException("Spark task jar params is null");
}
int resourceId = mainJar.getId();
String resourceName;
if (resourceId == 0) {
resourceName = mainJar.getRes();
} else {
Resource resource = processService.getResourceById(sparkParameters.getMainJar().getId());
if (resource == null) {
logger.error("resource id: {} not exist", resourceId);
throw new RuntimeException(String.format("resource id: %d not exist", resourceId));
}
resourceName = resource.getFullName().replaceFirst("/", "");
}
mainJar.setRes(resourceName);
sparkParameters.setMainJar(mainJar);
}
@Override
public AbstractParameters getParameters() {
return sparkParameters;
}
}

510
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sql/SqlTask.java

@@ -1,510 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sql;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.BaseConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.enums.Direct;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.sql.SqlBinds;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import org.apache.dolphinscheduler.common.task.sql.SqlType;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.dao.AlertDao;
import org.apache.dolphinscheduler.remote.command.alert.AlertSendResponseCommand;
import org.apache.dolphinscheduler.server.entity.SQLTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.utils.UDFUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.commons.collections.MapUtils;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import org.slf4j.Logger;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
/**
* sql task
*/
public class SqlTask extends AbstractTask {
/**
* sql parameters
*/
private SqlParameters sqlParameters;
/**
* alert dao
*/
private AlertDao alertDao;
/**
* base datasource
*/
private BaseConnectionParam baseConnectionParam;
/**
* taskExecutionContext
*/
private TaskExecutionContext taskExecutionContext;
private AlertClientService alertClientService;
public SqlTask(TaskExecutionContext taskExecutionContext, Logger logger, AlertClientService alertClientService) {
super(taskExecutionContext, logger);
this.taskExecutionContext = taskExecutionContext;
logger.info("sql task params {}", taskExecutionContext.getTaskParams());
this.sqlParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), SqlParameters.class);
if (!sqlParameters.checkParameters()) {
throw new RuntimeException("sql task params is not valid");
}
this.alertClientService = alertClientService;
this.alertDao = SpringApplicationContext.getBean(AlertDao.class);
}
@Override
public void handle() throws Exception {
// set the name of the current thread
String threadLoggerInfoName = String.format(Constants.TASK_LOG_INFO_FORMAT, taskExecutionContext.getTaskAppId());
Thread.currentThread().setName(threadLoggerInfoName);
logger.info("Full sql parameters: {}", sqlParameters);
logger.info("sql type : {}, datasource : {}, sql : {} , localParams : {},udfs : {},showType : {},connParams : {},varPool : {} ,query max result limit {}",
sqlParameters.getType(),
sqlParameters.getDatasource(),
sqlParameters.getSql(),
sqlParameters.getLocalParams(),
sqlParameters.getUdfs(),
sqlParameters.getShowType(),
sqlParameters.getConnParams(),
sqlParameters.getVarPool(),
sqlParameters.getLimit());
try {
SQLTaskExecutionContext sqlTaskExecutionContext = taskExecutionContext.getSqlTaskExecutionContext();
// get datasource
baseConnectionParam = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
DbType.valueOf(sqlParameters.getType()),
sqlTaskExecutionContext.getConnectionParams());
// ready to execute SQL and parameter entity Map
SqlBinds mainSqlBinds = getSqlAndSqlParamsMap(sqlParameters.getSql());
List<SqlBinds> preStatementSqlBinds = Optional.ofNullable(sqlParameters.getPreStatements())
.orElse(new ArrayList<>())
.stream()
.map(this::getSqlAndSqlParamsMap)
.collect(Collectors.toList());
List<SqlBinds> postStatementSqlBinds = Optional.ofNullable(sqlParameters.getPostStatements())
.orElse(new ArrayList<>())
.stream()
.map(this::getSqlAndSqlParamsMap)
.collect(Collectors.toList());
List<String> createFuncs = UDFUtils.createFuncs(sqlTaskExecutionContext.getUdfFuncTenantCodeMap(),
logger);
// execute sql task
executeFuncAndSql(mainSqlBinds, preStatementSqlBinds, postStatementSqlBinds, createFuncs);
setExitStatusCode(Constants.EXIT_CODE_SUCCESS);
} catch (Exception e) {
setExitStatusCode(Constants.EXIT_CODE_FAILURE);
logger.error("sql task error: {}", e.toString());
throw e;
}
}
/**
* ready to execute SQL and parameter entity Map
*
* @return SqlBinds
*/
private SqlBinds getSqlAndSqlParamsMap(String sql) {
Map<Integer, Property> sqlParamsMap = new HashMap<>();
StringBuilder sqlBuilder = new StringBuilder();
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(taskExecutionContext, getParameters());
// assemble the SQL according to the final user-defined variables
if (paramsMap == null) {
sqlBuilder.append(sql);
return new SqlBinds(sqlBuilder.toString(), sqlParamsMap);
}
if (StringUtils.isNotEmpty(sqlParameters.getTitle())) {
String title = ParameterUtils.convertParameterPlaceholders(sqlParameters.getTitle(),
ParamUtils.convert(paramsMap));
logger.info("SQL title : {}", title);
sqlParameters.setTitle(title);
}
// replace variable TIME with $[YYYYmmdd...] in sql when running a history job or a batch complement job
sql = ParameterUtils.replaceScheduleTime(sql, taskExecutionContext.getScheduleTime());
// special characters need to be escaped; collect the ${} placeholders for binding
String rgex = "['\"]*\\$\\{(.*?)\\}['\"]*";
setSqlParamsMap(sql, rgex, sqlParamsMap, paramsMap);
// replace the original value for !{...} in sql; it does not participate in precompilation
String rgexo = "['\"]*\\!\\{(.*?)\\}['\"]*";
sql = replaceOriginalValue(sql, rgexo, paramsMap);
// replace the ${} of the SQL statement with the Placeholder
String formatSql = sql.replaceAll(rgex, "?");
sqlBuilder.append(formatSql);
// print replaced sql
printReplacedSql(sql, formatSql, rgex, sqlParamsMap);
return new SqlBinds(sqlBuilder.toString(), sqlParamsMap);
}
public String replaceOriginalValue(String content, String rgex, Map<String, Property> sqlParamsMap) {
Pattern pattern = Pattern.compile(rgex);
while (true) {
Matcher m = pattern.matcher(content);
if (!m.find()) {
break;
}
String paramName = m.group(1);
String paramValue = sqlParamsMap.get(paramName).getValue();
content = m.replaceFirst(Matcher.quoteReplacement(paramValue));
}
return content;
}
@Override
public AbstractParameters getParameters() {
return this.sqlParameters;
}
/**
* execute function and sql
*
* @param mainSqlBinds main sql binds
* @param preStatementsBinds pre statements binds
* @param postStatementsBinds post statements binds
* @param createFuncs create functions
*/
public void executeFuncAndSql(SqlBinds mainSqlBinds,
List<SqlBinds> preStatementsBinds,
List<SqlBinds> postStatementsBinds,
List<String> createFuncs) throws Exception {
Connection connection = null;
PreparedStatement stmt = null;
ResultSet resultSet = null;
try {
// create connection
connection = DatasourceUtil.getConnection(DbType.valueOf(sqlParameters.getType()), baseConnectionParam);
// create temp function
if (CollectionUtils.isNotEmpty(createFuncs)) {
createTempFunction(connection, createFuncs);
}
// pre sql
preSql(connection, preStatementsBinds);
stmt = prepareStatementAndBind(connection, mainSqlBinds);
String result = null;
// decide whether to executeQuery or executeUpdate based on sqlType
if (sqlParameters.getSqlType() == SqlType.QUERY.ordinal()) {
// query statements need to be converted to a JsonArray and inserted into Alert to send
resultSet = stmt.executeQuery();
result = resultProcess(resultSet);
} else if (sqlParameters.getSqlType() == SqlType.NON_QUERY.ordinal()) {
// non query statement
String updateResult = String.valueOf(stmt.executeUpdate());
result = setNonQuerySqlReturn(updateResult, sqlParameters.getLocalParams());
}
//deal out params
sqlParameters.dealOutParam(result);
postSql(connection, postStatementsBinds);
} catch (Exception e) {
logger.error("execute sql error: {}", e.getMessage());
throw e;
} finally {
close(resultSet, stmt, connection);
}
}
public String setNonQuerySqlReturn(String updateResult, List<Property> properties) {
String result = null;
for (Property info : properties) {
if (Direct.OUT == info.getDirect()) {
List<Map<String, String>> updateRL = new ArrayList<>();
Map<String, String> updateRM = new HashMap<>();
updateRM.put(info.getProp(), updateResult);
updateRL.add(updateRM);
result = JSONUtils.toJsonString(updateRL);
break;
}
}
return result;
}
/**
* result process
*
* @param resultSet resultSet
* @throws Exception Exception
*/
private String resultProcess(ResultSet resultSet) throws Exception {
ArrayNode resultJSONArray = JSONUtils.createArrayNode();
if (resultSet != null) {
ResultSetMetaData md = resultSet.getMetaData();
int num = md.getColumnCount();
int rowCount = 0;
while (rowCount < sqlParameters.getLimit() && resultSet.next()) {
ObjectNode mapOfColValues = JSONUtils.createObjectNode();
for (int i = 1; i <= num; i++) {
mapOfColValues.set(md.getColumnLabel(i), JSONUtils.toJsonNode(resultSet.getObject(i)));
}
resultJSONArray.add(mapOfColValues);
rowCount++;
}
int displayRows = sqlParameters.getDisplayRows() > 0 ? sqlParameters.getDisplayRows() : Constants.DEFAULT_DISPLAY_ROWS;
displayRows = Math.min(displayRows, resultJSONArray.size());
logger.info("display sql result {} rows as follows:", displayRows);
for (int i = 0; i < displayRows; i++) {
String row = JSONUtils.toJsonString(resultJSONArray.get(i));
logger.info("row {} : {}", i + 1, row);
}
}
String result = JSONUtils.toJsonString(resultJSONArray);
if (sqlParameters.getSendEmail() == null || sqlParameters.getSendEmail()) {
sendAttachment(sqlParameters.getGroupId(), StringUtils.isNotEmpty(sqlParameters.getTitle())
? sqlParameters.getTitle()
: taskExecutionContext.getTaskName() + " query result sets", result);
}
logger.debug("execute sql result : {}", result);
return result;
}
/**
* pre sql
*
* @param connection connection
* @param preStatementsBinds preStatementsBinds
*/
private void preSql(Connection connection,
List<SqlBinds> preStatementsBinds) throws Exception {
for (SqlBinds sqlBind : preStatementsBinds) {
try (PreparedStatement pstmt = prepareStatementAndBind(connection, sqlBind)) {
int result = pstmt.executeUpdate();
logger.info("pre statement execute result: {}, for sql: {}", result, sqlBind.getSql());
}
}
}
/**
* post sql
*
* @param connection connection
* @param postStatementsBinds postStatementsBinds
*/
private void postSql(Connection connection,
List<SqlBinds> postStatementsBinds) throws Exception {
for (SqlBinds sqlBind : postStatementsBinds) {
try (PreparedStatement pstmt = prepareStatementAndBind(connection, sqlBind)) {
int result = pstmt.executeUpdate();
logger.info("post statement execute result: {},for sql: {}", result, sqlBind.getSql());
}
}
}
/**
* create temp function
*
* @param connection connection
* @param createFuncs createFuncs
*/
private void createTempFunction(Connection connection,
List<String> createFuncs) throws Exception {
try (Statement funcStmt = connection.createStatement()) {
for (String createFunc : createFuncs) {
logger.info("hive create function sql: {}", createFunc);
funcStmt.execute(createFunc);
}
}
}
/**
* close jdbc resource
*
* @param resultSet resultSet
* @param pstmt pstmt
* @param connection connection
*/
private void close(ResultSet resultSet,
PreparedStatement pstmt,
Connection connection) {
if (resultSet != null) {
try {
resultSet.close();
} catch (SQLException e) {
logger.error("close result set error : {}", e.getMessage(), e);
}
}
if (pstmt != null) {
try {
pstmt.close();
} catch (SQLException e) {
logger.error("close prepared statement error : {}", e.getMessage(), e);
}
}
if (connection != null) {
try {
connection.close();
} catch (SQLException e) {
logger.error("close connection error : {}", e.getMessage(), e);
}
}
}
/**
* preparedStatement bind
*
* @param connection connection
* @param sqlBinds sqlBinds
* @return PreparedStatement
* @throws Exception Exception
*/
private PreparedStatement prepareStatementAndBind(Connection connection, SqlBinds sqlBinds) throws Exception {
// whether a query timeout should be applied
boolean timeoutFlag = taskExecutionContext.getTaskTimeoutStrategy() == TaskTimeoutStrategy.FAILED
|| taskExecutionContext.getTaskTimeoutStrategy() == TaskTimeoutStrategy.WARNFAILED;
PreparedStatement stmt = connection.prepareStatement(sqlBinds.getSql());
if (timeoutFlag) {
stmt.setQueryTimeout(taskExecutionContext.getTaskTimeout());
}
Map<Integer, Property> params = sqlBinds.getParamsMap();
if (params != null) {
for (Map.Entry<Integer, Property> entry : params.entrySet()) {
Property prop = entry.getValue();
ParameterUtils.setInParameter(entry.getKey(), stmt, prop.getType(), prop.getValue());
}
}
logger.info("prepare statement replace sql : {} ", stmt);
return stmt;
}
/**
* send mail as an attachment
*
* @param title title
* @param content content
*/
public void sendAttachment(int groupId, String title, String content) {
AlertSendResponseCommand alertSendResponseCommand = alertClientService.sendAlert(groupId, title, content);
if (!alertSendResponseCommand.getResStatus()) {
throw new RuntimeException("send mail failed!");
}
}
/**
* regular expressions match the contents between two specified strings
*
* @param content content
* @param rgex rgex
* @param sqlParamsMap sql params map
* @param paramsPropsMap params props map
*/
public void setSqlParamsMap(String content, String rgex, Map<Integer, Property> sqlParamsMap, Map<String, Property> paramsPropsMap) {
Pattern pattern = Pattern.compile(rgex);
Matcher m = pattern.matcher(content);
int index = 1;
while (m.find()) {
String paramName = m.group(1);
Property prop = paramsPropsMap.get(paramName);
if (prop == null) {
logger.error("setSqlParamsMap: No Property with paramName: {} is found in paramsPropsMap of task instance"
+ " with id: {}. So couldn't put Property in sqlParamsMap.", paramName, taskExecutionContext.getTaskInstanceId());
}
else {
sqlParamsMap.put(index, prop);
index++;
logger.info("setSqlParamsMap: Property with paramName: {} put in sqlParamsMap of content {} successfully.", paramName, content);
}
}
}
/**
* print replace sql
*
* @param content content
* @param formatSql format sql
* @param rgex rgex
* @param sqlParamsMap sql params map
*/
public void printReplacedSql(String content, String formatSql, String rgex, Map<Integer, Property> sqlParamsMap) {
// parameter print style
logger.info("after replacing sql, preparing : {}", formatSql);
if (MapUtils.isEmpty(sqlParamsMap)) {
logger.info("sqlParamsMap is empty, no parameters to print");
return;
}
StringBuilder logPrint = new StringBuilder("replaced sql , parameters:");
// the null case is already covered by MapUtils.isEmpty above
for (int i = 1; i <= sqlParamsMap.size(); i++) {
logPrint.append(sqlParamsMap.get(i).getValue()).append("(").append(sqlParamsMap.get(i).getType()).append(")");
}
logger.info("Sql Params are {}", logPrint);
}
}
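A self-contained sketch of the placeholder-binding scheme used by getSqlAndSqlParamsMap and setSqlParamsMap above: the same regex collects ${...} parameters in order of appearance, and replaceAll turns them into JDBC ? markers. The table, parameter names, and values are invented for illustration:

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SqlBindSketch {
    public static void main(String[] args) {
        String sql = "select * from t_user where name = ${name} and age > ${age}";
        Map<String, String> paramsMap = Map.of("name", "alice", "age", "30"); // hypothetical task params
        String rgex = "['\"]*\\$\\{(.*?)\\}['\"]*"; // same pattern SqlTask uses
        Map<Integer, String> sqlParamsMap = new LinkedHashMap<>();
        Matcher m = Pattern.compile(rgex).matcher(sql);
        int index = 1;
        while (m.find()) {
            // bind order follows order of appearance, as in setSqlParamsMap
            sqlParamsMap.put(index++, paramsMap.get(m.group(1)));
        }
        String formatSql = sql.replaceAll(rgex, "?"); // placeholders become bind markers
        System.out.println(formatSql);    // select * from t_user where name = ? and age > ?
        System.out.println(sqlParamsMap); // {1=alice, 2=30}
    }
}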

75
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopConstants.java

@@ -1,75 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop;
public final class SqoopConstants {
private SqoopConstants() {
}
//sqoop general param
public static final String SQOOP = "sqoop";
public static final String SQOOP_MR_JOB_NAME = "mapred.job.name";
public static final String SQOOP_PARALLELISM = "-m";
public static final String FIELDS_TERMINATED_BY = "--fields-terminated-by";
public static final String LINES_TERMINATED_BY = "--lines-terminated-by";
public static final String FIELD_NULL_PLACEHOLDER = "--null-non-string 'NULL' --null-string 'NULL'";
//sqoop db
public static final String DB_CONNECT = "--connect";
public static final String DB_USERNAME = "--username";
public static final String DB_PWD = "--password";
public static final String TABLE = "--table";
public static final String COLUMNS = "--columns";
public static final String QUERY_WHERE = "where";
public static final String QUERY = "--query";
public static final String QUERY_CONDITION = "AND \\$CONDITIONS";
public static final String QUERY_WITHOUT_CONDITION = "WHERE \\$CONDITIONS";
public static final String MAP_COLUMN_HIVE = "--map-column-hive";
public static final String MAP_COLUMN_JAVA = "--map-column-java";
//sqoop hive source
public static final String HCATALOG_DATABASE = "--hcatalog-database";
public static final String HCATALOG_TABLE = "--hcatalog-table";
public static final String HCATALOG_PARTITION_KEYS = "--hcatalog-partition-keys";
public static final String HCATALOG_PARTITION_VALUES = "--hcatalog-partition-values";
//sqoop hdfs
public static final String HDFS_EXPORT_DIR = "--export-dir";
public static final String TARGET_DIR = "--target-dir";
public static final String COMPRESSION_CODEC = "--compression-codec";
//sqoop hive
public static final String HIVE_IMPORT = "--hive-import";
public static final String HIVE_DATABASE = "--hive-database";
public static final String HIVE_TABLE = "--hive-table";
public static final String CREATE_HIVE_TABLE = "--create-hive-table";
public static final String HIVE_DROP_IMPORT_DELIMS = "--hive-drop-import-delims";
public static final String HIVE_OVERWRITE = "--hive-overwrite";
public static final String DELETE_TARGET_DIR = "--delete-target-dir";
public static final String HIVE_DELIMS_REPLACEMENT = "--hive-delims-replacement";
public static final String HIVE_PARTITION_KEY = "--hive-partition-key";
public static final String HIVE_PARTITION_VALUE = "--hive-partition-value";
//sqoop update model
public static final String UPDATE_KEY = "--update-key";
public static final String UPDATE_MODE = "--update-mode";
}

95
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTask.java

@@ -1,95 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.ParamUtils;
import org.apache.dolphinscheduler.server.worker.task.AbstractYarnTask;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.SqoopJobGenerator;
import java.util.Map;
import org.slf4j.Logger;
/**
* sqoop task extends the shell task
*/
public class SqoopTask extends AbstractYarnTask {
/**
* sqoop task params
*/
private SqoopParameters sqoopParameters;
/**
* taskExecutionContext
*/
private final TaskExecutionContext sqoopTaskExecutionContext;
public SqoopTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
this.sqoopTaskExecutionContext = taskExecutionContext;
}
@Override
public void init() {
logger.info("sqoop task params {}", sqoopTaskExecutionContext.getTaskParams());
sqoopParameters =
JSONUtils.parseObject(sqoopTaskExecutionContext.getTaskParams(), SqoopParameters.class);
//check sqoop task params
if (null == sqoopParameters) {
throw new IllegalArgumentException("Sqoop Task params is null");
}
if (!sqoopParameters.checkParameters()) {
throw new IllegalArgumentException("Sqoop Task params check fail");
}
}
@Override
protected String buildCommand() {
//get sqoop scripts
SqoopJobGenerator generator = new SqoopJobGenerator();
String script = generator.generateSqoopJob(sqoopParameters, sqoopTaskExecutionContext);
// combining local and global parameters
Map<String, Property> paramsMap = ParamUtils.convert(sqoopTaskExecutionContext, getParameters());
if (paramsMap != null) {
String resultScripts = ParameterUtils.convertParameterPlaceholders(script, ParamUtils.convert(paramsMap));
logger.info("sqoop script: {}", resultScripts);
return resultScripts;
}
return null;
}
@Override
protected void setMainJarName() {
}
@Override
public AbstractParameters getParameters() {
return sqoopParameters;
}
}

85
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/CommonGenerator.java

@@ -1,85 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.utils.CollectionUtils;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* common script generator
*/
public class CommonGenerator {
private static final Logger logger = LoggerFactory.getLogger(CommonGenerator.class);
public String generate(SqoopParameters sqoopParameters) {
StringBuilder commonSb = new StringBuilder();
try {
//sqoop task model
commonSb.append(SqoopConstants.SQOOP)
.append(Constants.SPACE)
.append(sqoopParameters.getModelType());
//sqoop map-reduce job name
commonSb.append(Constants.SPACE).append(Constants.D).append(Constants.SPACE)
.append(String.format("%s%s%s", SqoopConstants.SQOOP_MR_JOB_NAME,
Constants.EQUAL_SIGN, sqoopParameters.getJobName()));
//hadoop custom param
List<Property> hadoopCustomParams = sqoopParameters.getHadoopCustomParams();
if (CollectionUtils.isNotEmpty(hadoopCustomParams)) {
for (Property hadoopCustomParam : hadoopCustomParams) {
String hadoopCustomParamStr = String.format("%s%s%s", hadoopCustomParam.getProp(),
Constants.EQUAL_SIGN, hadoopCustomParam.getValue());
commonSb.append(Constants.SPACE).append(Constants.D)
.append(Constants.SPACE).append(hadoopCustomParamStr);
}
}
//sqoop custom params
List<Property> sqoopAdvancedParams = sqoopParameters.getSqoopAdvancedParams();
if (CollectionUtils.isNotEmpty(sqoopAdvancedParams)) {
for (Property sqoopAdvancedParam : sqoopAdvancedParams) {
commonSb.append(Constants.SPACE).append(sqoopAdvancedParam.getProp())
.append(Constants.SPACE).append(sqoopAdvancedParam.getValue());
}
}
//sqoop parallelism
if (sqoopParameters.getConcurrency() > 0) {
commonSb.append(Constants.SPACE).append(SqoopConstants.SQOOP_PARALLELISM)
.append(Constants.SPACE).append(sqoopParameters.getConcurrency());
}
} catch (Exception e) {
logger.error(String.format("Sqoop task general param build failed: [%s]", e.getMessage()));
}
return commonSb.toString();
}
}
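A sketch of the string generate() above produces for a hypothetical job, assuming Constants.D is "-D" and Constants.EQUAL_SIGN is "=" (consistent with how they are used here):

public class SqoopCommandSketch {
    public static void main(String[] args) {
        // hypothetical values, standing in for SqoopParameters
        String modelType = "import";
        String jobName = "demo_job";
        int concurrency = 1;
        StringBuilder sb = new StringBuilder();
        sb.append("sqoop").append(' ').append(modelType);
        sb.append(' ').append("-D").append(' ')
          .append("mapred.job.name").append('=').append(jobName);
        sb.append(' ').append("-m").append(' ').append(concurrency);
        System.out.println(sb);
        // prints: sqoop import -D mapred.job.name=demo_job -m 1
    }
}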

124
dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/SqoopJobGenerator.java

@@ -1,124 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator;
import org.apache.dolphinscheduler.common.enums.SqoopJobType;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources.HdfsSourceGenerator;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources.HiveSourceGenerator;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources.MysqlSourceGenerator;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets.HdfsTargetGenerator;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets.HiveTargetGenerator;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets.MysqlTargetGenerator;
/**
* Sqoop Job Scripts Generator
*/
public class SqoopJobGenerator {
private static final String MYSQL = "MYSQL";
private static final String HIVE = "HIVE";
private static final String HDFS = "HDFS";
/**
* target script generator
*/
private ITargetGenerator targetGenerator;
/**
* source script generator
*/
private ISourceGenerator sourceGenerator;
/**
* common script generator
*/
private final CommonGenerator commonGenerator;
public SqoopJobGenerator() {
commonGenerator = new CommonGenerator();
}
private void createSqoopJobGenerator(String sourceType, String targetType) {
sourceGenerator = createSourceGenerator(sourceType);
targetGenerator = createTargetGenerator(targetType);
}
/**
* get the final sqoop scripts
*
* @param sqoopParameters sqoop params
* @return sqoop scripts
*/
public String generateSqoopJob(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
String sqoopScripts = "";
if (SqoopJobType.TEMPLATE.getDescp().equals(sqoopParameters.getJobType())) {
createSqoopJobGenerator(sqoopParameters.getSourceType(), sqoopParameters.getTargetType());
if (sourceGenerator == null || targetGenerator == null) {
throw new RuntimeException("sqoop task source type or target type is null");
}
sqoopScripts = String.format("%s%s%s", commonGenerator.generate(sqoopParameters),
sourceGenerator.generate(sqoopParameters, taskExecutionContext),
targetGenerator.generate(sqoopParameters, taskExecutionContext));
} else if (SqoopJobType.CUSTOM.getDescp().equals(sqoopParameters.getJobType())) {
sqoopScripts = sqoopParameters.getCustomShell().replaceAll("\\r\\n", "\n");
}
return sqoopScripts;
}
/**
* get the source generator
*
* @param sourceType sqoop source type
* @return sqoop source generator
*/
private ISourceGenerator createSourceGenerator(String sourceType) {
switch (sourceType) {
case MYSQL:
return new MysqlSourceGenerator();
case HIVE:
return new HiveSourceGenerator();
case HDFS:
return new HdfsSourceGenerator();
default:
return null;
}
}
/**
* get the target generator
*
* @param targetType sqoop target type
* @return sqoop target generator
*/
private ITargetGenerator createTargetGenerator(String targetType) {
switch (targetType) {
case MYSQL:
return new MysqlTargetGenerator();
case HIVE:
return new HiveTargetGenerator();
case HDFS:
return new HdfsTargetGenerator();
default:
return null;
}
}
}
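For orientation, a minimal usage sketch of the class above in template mode; this is not project code, the setter names are assumed from the getters used here, and taskExecutionContext stands in for a real TaskExecutionContext:
// Hedged sketch: illustrates the TEMPLATE path only. CUSTOM jobs skip
// both generators and return the user's shell verbatim (with \r\n
// normalized to \n).
SqoopParameters params = new SqoopParameters();
params.setJobType(SqoopJobType.TEMPLATE.getDescp());
params.setSourceType("MYSQL"); // resolves to MysqlSourceGenerator
params.setTargetType("HDFS");  // resolves to HdfsTargetGenerator
// sourceParams/targetParams carry JSON strings that the generators parse.
String script = new SqoopJobGenerator().generateSqoopJob(params, taskExecutionContext);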

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/HdfsSourceGenerator.java

@@ -1,63 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.sources.SourceHdfsParameter;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ISourceGenerator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* hdfs source generator
*/
public class HdfsSourceGenerator implements ISourceGenerator {
private static final Logger logger = LoggerFactory.getLogger(HdfsSourceGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder hdfsSourceSb = new StringBuilder();
try {
SourceHdfsParameter sourceHdfsParameter
= JSONUtils.parseObject(sqoopParameters.getSourceParams(), SourceHdfsParameter.class);
if (null != sourceHdfsParameter) {
if (StringUtils.isNotEmpty(sourceHdfsParameter.getExportDir())) {
hdfsSourceSb.append(Constants.SPACE).append(SqoopConstants.HDFS_EXPORT_DIR)
.append(Constants.SPACE).append(sourceHdfsParameter.getExportDir());
} else {
throw new IllegalArgumentException("Sqoop hdfs export dir is null");
}
}
} catch (Exception e) {
logger.error(String.format("Sqoop hdfs source parmas build failed: [%s]", e.getMessage()));
}
return hdfsSourceSb.toString();
}
}
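Assuming SqoopConstants.HDFS_EXPORT_DIR resolves to Sqoop's standard --export-dir flag, a hypothetical export dir of /user/out/orders would make this generator contribute the fragment:
--export-dir /user/out/orders
If the export dir is empty the generator throws IllegalArgumentException, but note that the catch block above swallows it into an error log and the method returns an empty string.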

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/HiveSourceGenerator.java

@@ -1,73 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.sources.SourceHiveParameter;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ISourceGenerator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* hive source generator
*/
public class HiveSourceGenerator implements ISourceGenerator {
private static final Logger logger = LoggerFactory.getLogger(HiveSourceGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder hiveSourceSb = new StringBuilder();
try {
SourceHiveParameter sourceHiveParameter
= JSONUtils.parseObject(sqoopParameters.getSourceParams(), SourceHiveParameter.class);
if (null != sourceHiveParameter) {
if (StringUtils.isNotEmpty(sourceHiveParameter.getHiveDatabase())) {
hiveSourceSb.append(Constants.SPACE).append(SqoopConstants.HCATALOG_DATABASE)
.append(Constants.SPACE).append(sourceHiveParameter.getHiveDatabase());
}
if (StringUtils.isNotEmpty(sourceHiveParameter.getHiveTable())) {
hiveSourceSb.append(Constants.SPACE).append(SqoopConstants.HCATALOG_TABLE)
.append(Constants.SPACE).append(sourceHiveParameter.getHiveTable());
}
if (StringUtils.isNotEmpty(sourceHiveParameter.getHivePartitionKey())
&& StringUtils.isNotEmpty(sourceHiveParameter.getHivePartitionValue())) {
hiveSourceSb.append(Constants.SPACE).append(SqoopConstants.HCATALOG_PARTITION_KEYS)
.append(Constants.SPACE).append(sourceHiveParameter.getHivePartitionKey())
.append(Constants.SPACE).append(SqoopConstants.HCATALOG_PARTITION_VALUES)
.append(Constants.SPACE).append(sourceHiveParameter.getHivePartitionValue());
}
}
} catch (Exception e) {
logger.error(String.format("Sqoop hive source params build failed: [%s]", e.getMessage()));
}
return hiveSourceSb.toString();
}
}
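Assuming the HCATALOG_* constants resolve to the Sqoop flags of the same name, hypothetical values (database dw, table orders, partition dt=20210901) would yield:
--hcatalog-database dw --hcatalog-table orders --hcatalog-partition-keys dt --hcatalog-partition-values 20210901
The partition pair is emitted only when both key and value are non-empty.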

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/sources/MysqlSourceGenerator.java

@@ -1,135 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.sources;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.BaseConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.enums.SqoopQueryType;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.sources.SourceMysqlParameter;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.SqoopTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ISourceGenerator;
import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* mysql source generator
*/
public class MysqlSourceGenerator implements ISourceGenerator {
private static final Logger logger = LoggerFactory.getLogger(MysqlSourceGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder mysqlSourceSb = new StringBuilder();
try {
SourceMysqlParameter sourceMysqlParameter = JSONUtils.parseObject(sqoopParameters.getSourceParams(), SourceMysqlParameter.class);
SqoopTaskExecutionContext sqoopTaskExecutionContext = taskExecutionContext.getSqoopTaskExecutionContext();
if (null != sourceMysqlParameter) {
BaseConnectionParam baseDataSource = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
DbType.of(sqoopTaskExecutionContext.getSourcetype()),
sqoopTaskExecutionContext.getSourceConnectionParams());
if (null != baseDataSource) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.DB_CONNECT)
.append(Constants.SPACE).append(Constants.DOUBLE_QUOTES)
.append(DatasourceUtil.getJdbcUrl(DbType.MYSQL, baseDataSource)).append(Constants.DOUBLE_QUOTES)
.append(Constants.SPACE).append(SqoopConstants.DB_USERNAME)
.append(Constants.SPACE).append(baseDataSource.getUser())
.append(Constants.SPACE).append(SqoopConstants.DB_PWD)
.append(Constants.SPACE).append(Constants.DOUBLE_QUOTES)
.append(CommonUtils.decodePassword(baseDataSource.getPassword())).append(Constants.DOUBLE_QUOTES);
//sqoop table & sql query
if (sourceMysqlParameter.getSrcQueryType() == SqoopQueryType.FORM.getCode()) {
if (StringUtils.isNotEmpty(sourceMysqlParameter.getSrcTable())) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.TABLE)
.append(Constants.SPACE).append(sourceMysqlParameter.getSrcTable());
}
if (StringUtils.isNotEmpty(sourceMysqlParameter.getSrcColumns())) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.COLUMNS)
.append(Constants.SPACE).append(sourceMysqlParameter.getSrcColumns());
}
} else if (sourceMysqlParameter.getSrcQueryType() == SqoopQueryType.SQL.getCode()
&& StringUtils.isNotEmpty(sourceMysqlParameter.getSrcQuerySql())) {
String srcQuery = sourceMysqlParameter.getSrcQuerySql();
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.QUERY)
.append(Constants.SPACE).append(Constants.DOUBLE_QUOTES).append(srcQuery);
if (srcQuery.toLowerCase().contains(SqoopConstants.QUERY_WHERE)) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.QUERY_CONDITION).append(Constants.DOUBLE_QUOTES);
} else {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.QUERY_WITHOUT_CONDITION).append(Constants.DOUBLE_QUOTES);
}
}
//sqoop hive map column
List<Property> mapColumnHive = sourceMysqlParameter.getMapColumnHive();
if (null != mapColumnHive && !mapColumnHive.isEmpty()) {
StringBuilder columnMap = new StringBuilder();
for (Property item : mapColumnHive) {
columnMap.append(item.getProp()).append(Constants.EQUAL_SIGN).append(item.getValue()).append(Constants.COMMA);
}
if (StringUtils.isNotEmpty(columnMap.toString())) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.MAP_COLUMN_HIVE)
.append(Constants.SPACE).append(columnMap.substring(0, columnMap.length() - 1));
}
}
//sqoop map column java
List<Property> mapColumnJava = sourceMysqlParameter.getMapColumnJava();
if (null != mapColumnJava && !mapColumnJava.isEmpty()) {
StringBuilder columnMap = new StringBuilder();
for (Property item : mapColumnJava) {
columnMap.append(item.getProp()).append(Constants.EQUAL_SIGN).append(item.getValue()).append(Constants.COMMA);
}
if (StringUtils.isNotEmpty(columnMap.toString())) {
mysqlSourceSb.append(Constants.SPACE).append(SqoopConstants.MAP_COLUMN_JAVA)
.append(Constants.SPACE).append(columnMap.substring(0, columnMap.length() - 1));
}
}
}
}
} catch (Exception e) {
logger.error(String.format("Sqoop task mysql source params build failed: [%s]", e.getMessage()));
}
return mysqlSourceSb.toString();
}
}
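Assuming the DB_* and QUERY* constants resolve to Sqoop's --connect, --username, --password and --query flags, a SQL-mode source with hypothetical values would contribute roughly:
--connect "jdbc:mysql://host:3306/db" --username etl --password "***" --query "select id, name from orders where dt = '2021-09-01' AND $CONDITIONS"
Sqoop requires a $CONDITIONS token in free-form queries, which is presumably why the code branches on QUERY_WHERE: the token is appended with AND when the query already contains a where clause and as a fresh WHERE clause otherwise (the exact token text lives in SqoopConstants). The two map-column loops build comma-separated k=v lists and trim the trailing comma via substring.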

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/HdfsTargetGenerator.java

@@ -1,86 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.targets.TargetHdfsParameter;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ITargetGenerator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* hdfs target generator
*/
public class HdfsTargetGenerator implements ITargetGenerator {
private static final Logger logger = LoggerFactory.getLogger(HdfsTargetGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder hdfsTargetSb = new StringBuilder();
try {
TargetHdfsParameter targetHdfsParameter =
JSONUtils.parseObject(sqoopParameters.getTargetParams(), TargetHdfsParameter.class);
if (null != targetHdfsParameter) {
if (StringUtils.isNotEmpty(targetHdfsParameter.getTargetPath())) {
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.TARGET_DIR)
.append(Constants.SPACE).append(targetHdfsParameter.getTargetPath());
}
if (StringUtils.isNotEmpty(targetHdfsParameter.getCompressionCodec())) {
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.COMPRESSION_CODEC)
.append(Constants.SPACE).append(targetHdfsParameter.getCompressionCodec());
}
if (StringUtils.isNotEmpty(targetHdfsParameter.getFileType())) {
hdfsTargetSb.append(Constants.SPACE).append(targetHdfsParameter.getFileType());
}
if (targetHdfsParameter.isDeleteTargetDir()) {
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.DELETE_TARGET_DIR);
}
if (StringUtils.isNotEmpty(targetHdfsParameter.getFieldsTerminated())) {
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.FIELDS_TERMINATED_BY)
.append(Constants.SPACE).append(Constants.SINGLE_QUOTES).append(targetHdfsParameter.getFieldsTerminated()).append(Constants.SINGLE_QUOTES);
}
if (StringUtils.isNotEmpty(targetHdfsParameter.getLinesTerminated())) {
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.LINES_TERMINATED_BY)
.append(Constants.SPACE).append(Constants.SINGLE_QUOTES).append(targetHdfsParameter.getLinesTerminated()).append(Constants.SINGLE_QUOTES);
}
hdfsTargetSb.append(Constants.SPACE).append(SqoopConstants.FIELD_NULL_PLACEHOLDER);
}
} catch (Exception e) {
logger.error(String.format("Sqoop hdfs target params build failed: [%s]", e.getMessage()));
}
return hdfsTargetSb.toString();
}
}
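Assuming the constants resolve to the Sqoop flags of the same name, a hypothetical comma-delimited import into /data/out with the target dir pre-deleted would produce roughly:
--target-dir /data/out --delete-target-dir --fields-terminated-by ',' --lines-terminated-by '\n'
followed by whatever FIELD_NULL_PLACEHOLDER expands to, which is appended unconditionally whenever the target params parse successfully.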

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/HiveTargetGenerator.java

@@ -1,91 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.targets.TargetHiveParameter;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ITargetGenerator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* hive target generator
*/
public class HiveTargetGenerator implements ITargetGenerator {
private static final Logger logger = LoggerFactory.getLogger(HiveTargetGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder hiveTargetSb = new StringBuilder();
try {
TargetHiveParameter targetHiveParameter =
JSONUtils.parseObject(sqoopParameters.getTargetParams(), TargetHiveParameter.class);
if (null != targetHiveParameter) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_IMPORT);
if (StringUtils.isNotEmpty(targetHiveParameter.getHiveDatabase())
&& StringUtils.isNotEmpty(targetHiveParameter.getHiveTable())) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_DATABASE)
.append(Constants.SPACE).append(targetHiveParameter.getHiveDatabase())
.append(Constants.SPACE).append(SqoopConstants.HIVE_TABLE)
.append(Constants.SPACE).append(targetHiveParameter.getHiveTable());
}
if (targetHiveParameter.isCreateHiveTable()) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.CREATE_HIVE_TABLE);
}
if (targetHiveParameter.isDropDelimiter()) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_DROP_IMPORT_DELIMS);
}
if (targetHiveParameter.isHiveOverWrite()) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_OVERWRITE)
.append(Constants.SPACE).append(SqoopConstants.DELETE_TARGET_DIR);
}
if (StringUtils.isNotEmpty(targetHiveParameter.getReplaceDelimiter())) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_DELIMS_REPLACEMENT)
.append(Constants.SPACE).append(targetHiveParameter.getReplaceDelimiter());
}
if (StringUtils.isNotEmpty(targetHiveParameter.getHivePartitionKey())
&& StringUtils.isNotEmpty(targetHiveParameter.getHivePartitionValue())) {
hiveTargetSb.append(Constants.SPACE).append(SqoopConstants.HIVE_PARTITION_KEY)
.append(Constants.SPACE).append(targetHiveParameter.getHivePartitionKey())
.append(Constants.SPACE).append(SqoopConstants.HIVE_PARTITION_VALUE)
.append(Constants.SPACE).append(targetHiveParameter.getHivePartitionValue());
}
}
} catch (Exception e) {
logger.error(String.format("Sqoop hive target params build failed: [%s]", e.getMessage()));
}
return hiveTargetSb.toString();
}
}
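Assuming the HIVE_* constants resolve to the Sqoop flags of the same name, a hypothetical overwrite import into dw.orders would produce roughly:
--hive-import --hive-database dw --hive-table orders --hive-overwrite --delete-target-dir
Database and table are emitted only when both are set, and, as the code above shows, the overwrite flag also forces --delete-target-dir.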

dolphinscheduler-server/src/main/java/org/apache/dolphinscheduler/server/worker/task/sqoop/generator/targets/MysqlTargetGenerator.java

@@ -1,115 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop.generator.targets;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.BaseConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.task.sqoop.targets.TargetMysqlParameter;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.StringUtils;
import org.apache.dolphinscheduler.server.entity.SqoopTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.SqoopConstants;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.ITargetGenerator;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* mysql target generator
*/
public class MysqlTargetGenerator implements ITargetGenerator {
private static final Logger logger = LoggerFactory.getLogger(MysqlTargetGenerator.class);
@Override
public String generate(SqoopParameters sqoopParameters, TaskExecutionContext taskExecutionContext) {
StringBuilder mysqlTargetSb = new StringBuilder();
try {
TargetMysqlParameter targetMysqlParameter =
JSONUtils.parseObject(sqoopParameters.getTargetParams(), TargetMysqlParameter.class);
SqoopTaskExecutionContext sqoopTaskExecutionContext = taskExecutionContext.getSqoopTaskExecutionContext();
if (null != targetMysqlParameter && targetMysqlParameter.getTargetDatasource() != 0) {
// get datasource
BaseConnectionParam baseDataSource = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
DbType.of(sqoopTaskExecutionContext.getTargetType()),
sqoopTaskExecutionContext.getTargetConnectionParams());
if (null != baseDataSource) {
mysqlTargetSb.append(Constants.SPACE).append(SqoopConstants.DB_CONNECT)
.append(Constants.SPACE).append(Constants.DOUBLE_QUOTES)
.append(DatasourceUtil.getJdbcUrl(DbType.MYSQL, baseDataSource)).append(Constants.DOUBLE_QUOTES)
.append(Constants.SPACE).append(SqoopConstants.DB_USERNAME)
.append(Constants.SPACE).append(baseDataSource.getUser())
.append(Constants.SPACE).append(SqoopConstants.DB_PWD)
.append(Constants.SPACE).append(Constants.DOUBLE_QUOTES)
.append(CommonUtils.decodePassword(baseDataSource.getPassword())).append(Constants.DOUBLE_QUOTES)
.append(Constants.SPACE).append(SqoopConstants.TABLE)
.append(Constants.SPACE).append(targetMysqlParameter.getTargetTable());
if (StringUtils.isNotEmpty(targetMysqlParameter.getTargetColumns())) {
mysqlTargetSb.append(Constants.SPACE).append(SqoopConstants.COLUMNS)
.append(Constants.SPACE).append(targetMysqlParameter.getTargetColumns());
}
if (StringUtils.isNotEmpty(targetMysqlParameter.getFieldsTerminated())) {
mysqlTargetSb.append(Constants.SPACE).append(SqoopConstants.FIELDS_TERMINATED_BY);
if (targetMysqlParameter.getFieldsTerminated().contains("'")) {
mysqlTargetSb.append(Constants.SPACE).append(targetMysqlParameter.getFieldsTerminated());
} else {
mysqlTargetSb.append(Constants.SPACE).append(Constants.SINGLE_QUOTES).append(targetMysqlParameter.getFieldsTerminated()).append(Constants.SINGLE_QUOTES);
}
}
if (StringUtils.isNotEmpty(targetMysqlParameter.getLinesTerminated())) {
mysqlTargetSb.append(Constants.SPACE).append(SqoopConstants.LINES_TERMINATED_BY);
if (targetMysqlParameter.getLinesTerminated().contains(Constants.SINGLE_QUOTES)) {
mysqlTargetSb.append(Constants.SPACE).append(targetMysqlParameter.getLinesTerminated());
} else {
mysqlTargetSb.append(Constants.SPACE).append(Constants.SINGLE_QUOTES).append(targetMysqlParameter.getLinesTerminated()).append(Constants.SINGLE_QUOTES);
}
}
if (targetMysqlParameter.getIsUpdate()
&& StringUtils.isNotEmpty(targetMysqlParameter.getTargetUpdateKey())
&& StringUtils.isNotEmpty(targetMysqlParameter.getTargetUpdateMode())) {
mysqlTargetSb.append(Constants.SPACE).append(SqoopConstants.UPDATE_KEY)
.append(Constants.SPACE).append(targetMysqlParameter.getTargetUpdateKey())
.append(Constants.SPACE).append(SqoopConstants.UPDATE_MODE)
.append(Constants.SPACE).append(targetMysqlParameter.getTargetUpdateMode());
}
}
}
} catch (Exception e) {
logger.error(String.format("Sqoop mysql target params build failed: [%s]", e.getMessage()));
}
return mysqlTargetSb.toString();
}
}
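One subtlety in the terminator handling above: a value is wrapped in single quotes only when it does not already contain a quote character, so pre-quoted values pass through untouched. Both hypothetical inputs below therefore render identically:
--fields-terminated-by ','   (input was , so the generator added the quotes)
--fields-terminated-by ','   (input was ',' so it was emitted as-is)
Also note the guard targetMysqlParameter.getTargetDatasource() != 0, which treats datasource id 0 as not configured.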

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/master/MasterExecThreadTest.java

@@ -101,8 +101,8 @@ public class MasterExecThreadTest {
cmdParam.put(CMDPARAM_COMPLEMENT_DATA_END_DATE, "2020-01-20 23:00:00");
Mockito.when(processInstance.getCommandParam()).thenReturn(JSONUtils.toJsonString(cmdParam));
ProcessDefinition processDefinition = new ProcessDefinition();
-processDefinition.setGlobalParamMap(Collections.EMPTY_MAP);
-processDefinition.setGlobalParamList(Collections.EMPTY_LIST);
+processDefinition.setGlobalParamMap(Collections.emptyMap());
+processDefinition.setGlobalParamList(Collections.emptyList());
Mockito.when(processInstance.getProcessDefinition()).thenReturn(processDefinition);
masterExecThread = PowerMockito.spy(new MasterExecThread(processInstance, processService, null, null, config));
@@ -256,7 +256,7 @@
}
private List<Schedule> zeroSchedulerList() {
-return Collections.EMPTY_LIST;
+return Collections.emptyList();
}
private List<Schedule> oneSchedulerList() {

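The EMPTY_MAP/EMPTY_LIST swaps in this hunk trade raw-typed constants for factory methods that infer the element type, which removes unchecked-conversion warnings. A minimal self-contained illustration (not project code):
import java.util.Collections;
import java.util.List;
import java.util.Map;

class EmptyCollectionsDemo {
    static void demo() {
        // raw constants: these compile, but with unchecked-conversion warnings
        Map<String, String> rawMap = Collections.EMPTY_MAP;
        List<String> rawList = Collections.EMPTY_LIST;
        // typed factories: the target type is inferred, no warning
        Map<String, String> typedMap = Collections.emptyMap();
        List<String> typedList = Collections.emptyList();
    }
}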
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/master/SwitchTaskTest.java

@@ -0,0 +1,167 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.master;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskTimeoutStrategy;
import org.apache.dolphinscheduler.common.enums.TimeoutFlag;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchParameters;
import org.apache.dolphinscheduler.common.task.switchtask.SwitchResultVo;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.ProcessInstance;
import org.apache.dolphinscheduler.dao.entity.TaskDefinition;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;
import org.apache.dolphinscheduler.server.master.config.MasterConfig;
import org.apache.dolphinscheduler.server.master.runner.SwitchTaskExecThread;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.mockito.junit.MockitoJUnitRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
@RunWith(MockitoJUnitRunner.Silent.class)
public class SwitchTaskTest {
private static final Logger logger = LoggerFactory.getLogger(SwitchTaskTest.class);
/**
* TaskNode.runFlag : task can be run normally
*/
public static final String FLOWNODE_RUN_FLAG_NORMAL = "NORMAL";
private ProcessService processService;
private ProcessInstance processInstance;
@Before
public void before() {
ApplicationContext applicationContext = Mockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
MasterConfig config = new MasterConfig();
Mockito.when(applicationContext.getBean(MasterConfig.class)).thenReturn(config);
config.setMasterTaskCommitRetryTimes(3);
config.setMasterTaskCommitInterval(1000);
processService = Mockito.mock(ProcessService.class);
Mockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
processInstance = getProcessInstance();
Mockito.when(processService
.findProcessInstanceById(processInstance.getId()))
.thenReturn(processInstance);
}
private TaskInstance testBasicInit(ExecutionStatus expectResult) {
TaskDefinition taskDefinition = new TaskDefinition();
taskDefinition.setTimeoutFlag(TimeoutFlag.OPEN);
taskDefinition.setTimeoutNotifyStrategy(TaskTimeoutStrategy.WARN);
taskDefinition.setTimeout(0);
Mockito.when(processService.findTaskDefinition(1L, 1))
.thenReturn(taskDefinition);
TaskInstance taskInstance = getTaskInstance(getTaskNode(), processInstance);
// for MasterBaseTaskExecThread.submit
Mockito.when(processService
.submitTask(taskInstance))
.thenReturn(taskInstance);
// for MasterBaseTaskExecThread.call
Mockito.when(processService
.findTaskInstanceById(taskInstance.getId()))
.thenReturn(taskInstance);
// for SwitchTaskExecThread.initTaskParameters
Mockito.when(processService
.saveTaskInstance(taskInstance))
.thenReturn(true);
// for SwitchTaskExecThread.updateTaskState
Mockito.when(processService
.updateTaskInstance(taskInstance))
.thenReturn(true);
return taskInstance;
}
@Test
public void testExe() throws Exception {
TaskInstance taskInstance = testBasicInit(ExecutionStatus.SUCCESS);
taskInstance.setState(ExecutionStatus.SUBMITTED_SUCCESS);
SwitchTaskExecThread taskExecThread = new SwitchTaskExecThread(taskInstance);
taskExecThread.call();
Assert.assertEquals(ExecutionStatus.SUCCESS, taskExecThread.getTaskInstance().getState());
}
private SwitchParameters getTaskNode() {
SwitchParameters conditionsParameters = new SwitchParameters();
SwitchResultVo switchResultVo1 = new SwitchResultVo();
switchResultVo1.setCondition(" 2 == 1");
switchResultVo1.setNextNode("t1");
SwitchResultVo switchResultVo2 = new SwitchResultVo();
switchResultVo2.setCondition(" 2 == 2");
switchResultVo2.setNextNode("t2");
SwitchResultVo switchResultVo3 = new SwitchResultVo();
switchResultVo3.setCondition(" 3 == 2");
switchResultVo3.setNextNode("t3");
List<SwitchResultVo> list = new ArrayList<>();
list.add(switchResultVo1);
list.add(switchResultVo2);
list.add(switchResultVo3);
conditionsParameters.setDependTaskList(list);
conditionsParameters.setNextNode("t");
conditionsParameters.setRelation("AND");
return conditionsParameters;
}
private ProcessInstance getProcessInstance() {
ProcessInstance processInstance = new ProcessInstance();
processInstance.setId(1000);
processInstance.setState(ExecutionStatus.RUNNING_EXECUTION);
processInstance.setProcessDefinitionCode(1L);
return processInstance;
}
private TaskInstance getTaskInstance(SwitchParameters conditionsParameters, ProcessInstance processInstance) {
TaskInstance taskInstance = new TaskInstance();
taskInstance.setId(1000);
Map<String, Object> taskParamsMap = new HashMap<>();
taskParamsMap.put(Constants.SWITCH_RESULT, "");
taskInstance.setTaskParams(JSONUtils.toJsonString(taskParamsMap));
taskInstance.setSwitchDependency(conditionsParameters);
taskInstance.setName("C");
taskInstance.setTaskType("SWITCH");
taskInstance.setProcessInstanceId(processInstance.getId());
taskInstance.setTaskCode(1L);
taskInstance.setTaskDefinitionVersion(1);
return taskInstance;
}
}
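A hedged reading of the fixture above (the branching logic itself lives in SwitchTaskExecThread, which this file only drives):
// " 2 == 1" -> false, so branch t1 is skipped
// " 2 == 2" -> true, so t2 is presumably the branch taken
// " 3 == 2" -> never reached if the first true condition wins
with the top-level setNextNode("t") apparently supplying the default branch when no condition matches.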

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/processor/TaskExecuteProcessorTest.java

@@ -119,7 +119,7 @@ public class TaskExecuteProcessorTest {
taskExecutionContext.getTaskInstanceId()));
workerManager = PowerMockito.mock(WorkerManagerThread.class);
-PowerMockito.when(workerManager.offer(new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService))).thenReturn(Boolean.TRUE);
+PowerMockito.when(workerManager.offer(new TaskExecuteThread(taskExecutionContext, taskCallbackService, alertClientService))).thenReturn(Boolean.TRUE);
PowerMockito.when(SpringApplicationContext.getBean(WorkerManagerThread.class))
.thenReturn(workerManager);
@@ -188,7 +188,7 @@
private static class SimpleTaskExecuteThread extends TaskExecuteThread {
public SimpleTaskExecuteThread(TaskExecutionContext taskExecutionContext, TaskCallbackService taskCallbackService, Logger taskLogger, AlertClientService alertClientService) {
-super(taskExecutionContext, taskCallbackService, taskLogger, alertClientService);
+super(taskExecutionContext, taskCallbackService, alertClientService);
}
@Override

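Both hunks above track a single signature change: TaskExecuteThread drops the explicit task Logger parameter (the thread presumably obtains its logger from the TaskExecutionContext instead; that part is not shown in this diff). In outline:
// old: new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService)
// new: new TaskExecuteThread(taskExecutionContext, taskCallbackService, alertClientService)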
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/runner/TaskExecuteThreadTest.java

@@ -1,192 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.runner;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.model.TaskNode;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.remote.command.Command;
import org.apache.dolphinscheduler.remote.command.TaskExecuteAckCommand;
import org.apache.dolphinscheduler.remote.command.TaskExecuteResponseCommand;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.cache.impl.TaskExecutionContextCacheManagerImpl;
import org.apache.dolphinscheduler.server.worker.processor.TaskCallbackService;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.TaskManager;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.util.Collections;
import java.util.Date;
import java.util.List;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* test task execute thread.
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest({TaskManager.class, JSONUtils.class, CommonUtils.class, SpringApplicationContext.class, OSUtils.class})
public class TaskExecuteThreadTest {
private TaskExecutionContext taskExecutionContext;
private TaskCallbackService taskCallbackService;
private Command ackCommand;
private Command responseCommand;
private Logger taskLogger;
private TaskExecutionContextCacheManagerImpl taskExecutionContextCacheManager;
private AlertClientService alertClientService;
@Before
public void before() {
// init task execution context, logger
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setProcessId(12345);
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setProcessDefineCode(1L);
taskExecutionContext.setProcessDefineVersion(1);
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setFirstSubmitTime(new Date());
taskExecutionContext.setDelayTime(0);
taskExecutionContext.setLogPath("/tmp/test.log");
taskExecutionContext.setHost("localhost");
taskExecutionContext.setExecutePath("/tmp/dolphinscheduler/exec/process/1/2/3/4");
ackCommand = new TaskExecuteAckCommand().convert2Command();
responseCommand = new TaskExecuteResponseCommand(taskExecutionContext.getTaskInstanceId()).convert2Command();
taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(
LoggerUtils.TASK_LOGGER_INFO_PREFIX,
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId()
));
taskExecutionContextCacheManager = new TaskExecutionContextCacheManagerImpl();
taskExecutionContextCacheManager.cacheTaskExecutionContext(taskExecutionContext);
taskCallbackService = PowerMockito.mock(TaskCallbackService.class);
PowerMockito.doNothing().when(taskCallbackService).sendAck(taskExecutionContext.getTaskInstanceId(), ackCommand);
PowerMockito.doNothing().when(taskCallbackService).sendResult(taskExecutionContext.getTaskInstanceId(), responseCommand);
PowerMockito.mockStatic(SpringApplicationContext.class);
PowerMockito.when(SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class))
.thenReturn(taskExecutionContextCacheManager);
alertClientService = PowerMockito.mock(AlertClientService.class);
PowerMockito.mockStatic(TaskManager.class);
PowerMockito.when(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService))
.thenReturn(new SimpleTask(taskExecutionContext, taskLogger));
PowerMockito.mockStatic(JSONUtils.class);
PowerMockito.when(JSONUtils.parseObject(taskExecutionContext.getTaskJson(), TaskNode.class))
.thenReturn(new TaskNode());
PowerMockito.mockStatic(CommonUtils.class);
PowerMockito.when(CommonUtils.getSystemEnvPath()).thenReturn("/user_home/.bash_profile");
List<String> osUserList = Collections.singletonList("test");
PowerMockito.mockStatic(OSUtils.class);
PowerMockito.when(OSUtils.getUserList()).thenReturn(osUserList);
}
@Test
public void testNormalExecution() {
taskExecutionContext.setTaskType(TaskType.SQL.getDesc());
taskExecutionContext.setStartTime(new Date());
taskExecutionContext.setCurrentExecutionStatus(ExecutionStatus.RUNNING_EXECUTION);
taskExecutionContext.setTenantCode("test");
TaskExecuteThread taskExecuteThread = new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService);
taskExecuteThread.run();
taskExecutionContext.getCurrentExecutionStatus();
taskExecuteThread.run();
Assert.assertEquals(ExecutionStatus.RUNNING_EXECUTION, taskExecutionContext.getCurrentExecutionStatus());
}
@Test
public void testDelayExecution() {
taskExecutionContext.setTaskType(TaskType.PYTHON.getDesc());
taskExecutionContext.setStartTime(null);
taskExecutionContext.setDelayTime(1);
taskExecutionContext.setCurrentExecutionStatus(ExecutionStatus.DELAY_EXECUTION);
taskExecutionContext.setTenantCode("test");
TaskExecuteThread taskExecuteThread = new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService);
taskExecuteThread.run();
Assert.assertEquals(ExecutionStatus.RUNNING_EXECUTION, taskExecutionContext.getCurrentExecutionStatus());
}
private class SimpleTask extends AbstractTask {
protected SimpleTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
// pid
this.processId = taskExecutionContext.getProcessId();
}
@Override
public AbstractParameters getParameters() {
return new SqlParameters();
}
@Override
public void init() {
}
@Override
public void handle() throws Exception {
}
@Override
public void after() {
}
@Override
public ExecutionStatus getExitStatus() {
return ExecutionStatus.SUCCESS;
}
}
}

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/runner/WorkerManagerThreadTest.java

@@ -1,190 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.runner;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.model.TaskNode;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.thread.Stopper;
import org.apache.dolphinscheduler.common.utils.CommonUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.remote.command.Command;
import org.apache.dolphinscheduler.remote.command.TaskExecuteAckCommand;
import org.apache.dolphinscheduler.remote.command.TaskExecuteResponseCommand;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.cache.impl.TaskExecutionContextCacheManagerImpl;
import org.apache.dolphinscheduler.server.worker.config.WorkerConfig;
import org.apache.dolphinscheduler.server.worker.processor.TaskCallbackService;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.TaskManager;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.util.Collections;
import java.util.Date;
import java.util.List;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* test worker manager thread.
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest({
Stopper.class,
TaskManager.class,
JSONUtils.class,
CommonUtils.class,
SpringApplicationContext.class,
OSUtils.class})
public class WorkerManagerThreadTest {
private TaskCallbackService taskCallbackService;
private WorkerManagerThread workerManager;
private TaskExecutionContext taskExecutionContext;
private AlertClientService alertClientService;
private Logger taskLogger;
@Before
public void before() {
// init task execution context, logger
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setProcessId(12345);
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setProcessDefineCode(1L);
taskExecutionContext.setProcessDefineVersion(1);
taskExecutionContext.setTenantCode("test");
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setFirstSubmitTime(new Date());
taskExecutionContext.setDelayTime(0);
taskExecutionContext.setLogPath("/tmp/test.log");
taskExecutionContext.setHost("localhost");
taskExecutionContext.setExecutePath("/tmp/dolphinscheduler/exec/process/1/2/3/4");
Command ackCommand = new TaskExecuteAckCommand().convert2Command();
Command responseCommand = new TaskExecuteResponseCommand(taskExecutionContext.getTaskInstanceId()).convert2Command();
taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(
LoggerUtils.TASK_LOGGER_INFO_PREFIX,
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId()
));
TaskExecutionContextCacheManagerImpl taskExecutionContextCacheManager = new TaskExecutionContextCacheManagerImpl();
taskExecutionContextCacheManager.cacheTaskExecutionContext(taskExecutionContext);
alertClientService = PowerMockito.mock(AlertClientService.class);
WorkerConfig workerConfig = PowerMockito.mock(WorkerConfig.class);
taskCallbackService = PowerMockito.mock(TaskCallbackService.class);
PowerMockito.doNothing().when(taskCallbackService).sendAck(taskExecutionContext.getTaskInstanceId(), ackCommand);
PowerMockito.doNothing().when(taskCallbackService).sendResult(taskExecutionContext.getTaskInstanceId(), responseCommand);
PowerMockito.mockStatic(SpringApplicationContext.class);
PowerMockito.when(SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class))
.thenReturn(taskExecutionContextCacheManager);
PowerMockito.when(SpringApplicationContext.getBean(WorkerConfig.class))
.thenReturn(workerConfig);
PowerMockito.when(SpringApplicationContext.getBean(TaskCallbackService.class))
.thenReturn(taskCallbackService);
PowerMockito.when(workerConfig.getWorkerExecThreads()).thenReturn(5);
workerManager = new WorkerManagerThread();
PowerMockito.mockStatic(TaskManager.class);
PowerMockito.when(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService))
.thenReturn(new SimpleTask(taskExecutionContext, taskLogger));
PowerMockito.mockStatic(JSONUtils.class);
PowerMockito.when(JSONUtils.parseObject(taskExecutionContext.getTaskJson(), TaskNode.class))
.thenReturn(new TaskNode());
PowerMockito.mockStatic(CommonUtils.class);
PowerMockito.when(CommonUtils.getSystemEnvPath()).thenReturn("/user_home/.bash_profile");
List<String> osUserList = Collections.singletonList("test");
PowerMockito.mockStatic(OSUtils.class);
PowerMockito.when(OSUtils.getUserList()).thenReturn(osUserList);
PowerMockito.mockStatic(Stopper.class);
PowerMockito.when(Stopper.isRunning()).thenReturn(true, false);
}
@Test
public void testSendTaskKillResponse() {
TaskExecuteThread taskExecuteThread = new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService);
workerManager.offer(taskExecuteThread);
Assert.assertEquals(1, workerManager.getQueueSize());
workerManager.killTaskBeforeExecuteByInstanceId(1);
Assert.assertEquals(0, workerManager.getQueueSize());
}
@Test
public void testRun() {
TaskExecuteThread taskExecuteThread = new TaskExecuteThread(taskExecutionContext, taskCallbackService, taskLogger, alertClientService);
workerManager.offer(taskExecuteThread);
Assert.assertEquals(1, workerManager.getQueueSize());
workerManager.run();
Assert.assertEquals(0, workerManager.getQueueSize());
}
private static class SimpleTask extends AbstractTask {
protected SimpleTask(TaskExecutionContext taskExecutionContext, Logger logger) {
super(taskExecutionContext, logger);
// pid
this.processId = taskExecutionContext.getProcessId();
}
@Override
public AbstractParameters getParameters() {
return null;
}
@Override
public void init() {
}
@Override
public void handle() {
}
@Override
public void after() {
}
@Override
public ExecutionStatus getExitStatus() {
return ExecutionStatus.SUCCESS;
}
}
}
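A detail of the deleted fixture worth noting: Stopper.isRunning() is stubbed with consecutive return values (true, false), Mockito's consecutive stubbing, so a while (Stopper.isRunning()) style loop inside workerManager.run() would execute exactly one iteration and then exit, which is what lets testRun() assert on the drained queue.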

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/shell/ShellCommandExecutorTest.java

@@ -1,250 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.shell;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.AbstractCommandExecutor;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.lang.reflect.Method;
import java.util.Date;
import java.util.List;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PowerMockIgnore;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
/**
* shell command executor test
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest(OSUtils.class)
@PowerMockIgnore({"javax.management.*"})
public class ShellCommandExecutorTest {
private static final Logger logger = LoggerFactory.getLogger(ShellCommandExecutorTest.class);
private ProcessService processService = null;
private ApplicationContext applicationContext;
@Before
public void before() {
applicationContext = PowerMockito.mock(ApplicationContext.class);
processService = PowerMockito.mock(ProcessService.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
PowerMockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
}
@Ignore
@Test
public void test() throws Exception {
TaskProps taskProps = new TaskProps();
// processDefineId_processInstanceId_taskInstanceId
taskProps.setExecutePath("/opt/soft/program/tmp/dolphinscheduler/exec/flow/5/36/2864/7657");
taskProps.setTaskAppId("36_2864_7657");
// set tenant -> task execute linux user
taskProps.setTenantCode("hdfs");
taskProps.setTaskStartTime(new Date());
taskProps.setTaskTimeout(360000);
taskProps.setTaskInstanceId(7657);
// TaskInstance taskInstance = processService.findTaskInstanceById(7657);
//
// TaskNode taskNode = JSON.parseObject(taskJson, TaskNode.class);
// taskProps.setTaskParams(taskNode.getParams());
//
//
// // custom logger
// Logger taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(LoggerUtils.TASK_LOGGER_INFO_PREFIX,
// taskInstance.getProcessDefine().getCode(),
// taskInstance.getProcessDefine().getVersion(),
// taskInstance.getProcessInstanceId(),
// taskInstance.getId()));
//
//
// AbstractTask task = TaskManager.newTask(taskInstance.getTaskType(), taskProps, taskLogger);
AbstractTask task = null;
logger.info("task info : {}", task);
// job init
task.init();
// job handle
task.handle();
ExecutionStatus status = ExecutionStatus.SUCCESS;
if (task.getExitStatusCode() == Constants.EXIT_CODE_SUCCESS) {
status = ExecutionStatus.SUCCESS;
} else if (task.getExitStatusCode() == Constants.EXIT_CODE_KILL) {
status = ExecutionStatus.KILL;
} else {
status = ExecutionStatus.FAILURE;
}
logger.info(status.toString());
}
@Test
public void testParseProcessOutput() {
Class<AbstractCommandExecutor> shellCommandExecutorClass = AbstractCommandExecutor.class;
try {
Method method = shellCommandExecutorClass.getDeclaredMethod("parseProcessOutput", Process.class);
method.setAccessible(true);
Object[] arg1s = {new Process() {
@Override
public OutputStream getOutputStream() {
return new OutputStream() {
@Override
public void write(int b) throws IOException {
logger.info("unit test");
}
};
}
@Override
public InputStream getInputStream() {
return new InputStream() {
@Override
public int read() throws IOException {
return 0;
}
};
}
@Override
public InputStream getErrorStream() {
return null;
}
@Override
public int waitFor() throws InterruptedException {
return 0;
}
@Override
public int exitValue() {
return 0;
}
@Override
public void destroy() {
logger.info("unit test");
}
} };
method.invoke(new AbstractCommandExecutor(null, new TaskExecutionContext(), logger) {
@Override
protected String buildCommandFilePath() {
return null;
}
@Override
protected String commandInterpreter() {
return null;
}
@Override
protected void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException {
logger.info("unit test");
}
}, arg1s);
} catch (Exception e) {
logger.error(e.getMessage());
}
}
@Test
public void testFindAppId() {
Class<AbstractCommandExecutor> shellCommandExecutorClass = AbstractCommandExecutor.class;
try {
Method method = shellCommandExecutorClass.getDeclaredMethod("findAppId", new Class[]{String.class});
method.setAccessible(true);
Object[] arg1s = {"11111"};
String result = (String) method.invoke(new AbstractCommandExecutor(null, null, null) {
@Override
protected String buildCommandFilePath() {
return null;
}
@Override
protected String commandInterpreter() {
return null;
}
@Override
protected void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException {
logger.info("unit test");
}
}, arg1s);
} catch (Exception e) {
logger.error(e.getMessage());
}
}
@Test
public void testConvertFile2List() {
Class<AbstractCommandExecutor> shellCommandExecutorClass = AbstractCommandExecutor.class;
try {
Method method = shellCommandExecutorClass.getDeclaredMethod("convertFile2List", String.class);
method.setAccessible(true);
Object[] arg1s = {"/opt/1.txt"};
List<String> result = (List<String>) method.invoke(new AbstractCommandExecutor(null, null, null) {
@Override
protected String buildCommandFilePath() {
return null;
}
@Override
protected String commandInterpreter() {
return null;
}
@Override
protected void createCommandFileIfNotExists(String execCommand, String commandFile) throws IOException {
logger.info("unit test");
}
}, arg1s);
Assert.assertTrue(true);
} catch (Exception e) {
logger.error(e.getMessage());
}
}
}
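Two oddities in this deleted test are worth flagging: test() dereferences a null AbstractTask and only avoids the NPE because of the @Ignore, and testConvertFile2List() ends in Assert.assertTrue(true), a vacuous assertion; since exceptions from the reflective call are caught and merely logged, that test passes either way.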

dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/sql/SqlExecutorTest.java

@@ -1,144 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.sql;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.CommandType;
import org.apache.dolphinscheduler.common.enums.ExecutionStatus;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.dao.entity.TaskInstance;
import org.apache.dolphinscheduler.server.worker.task.AbstractTask;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.util.Date;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* SQL task executor test
*/
@Ignore
public class SqlExecutorTest {
private static final Logger logger = LoggerFactory.getLogger(SqlExecutorTest.class);
private ProcessService processService = null;
@Before
public void before() {
processService = SpringApplicationContext.getBean(ProcessService.class);
}
@Test
public void test() throws Exception {
String nodeName = "mysql sql test";
String taskAppId = "51_11282_263978";
String tenantCode = "hdfs";
int taskInstId = 7;
sharedTestSqlTask(nodeName, taskAppId, tenantCode, taskInstId);
}
@Test
public void testClickhouse() throws Exception {
String nodeName = "ClickHouse sql test";
String taskAppId = "1_11_20";
String tenantCode = "default";
int taskInstId = 20;
sharedTestSqlTask(nodeName, taskAppId, tenantCode, taskInstId);
}
@Test
public void testOracle() throws Exception {
String nodeName = "oracle sql test";
String taskAppId = "2_13_25";
String tenantCode = "demo";
int taskInstId = 25;
sharedTestSqlTask(nodeName, taskAppId, tenantCode, taskInstId);
}
@Test
public void testSQLServer() throws Exception {
String nodeName = "SQL Server sql test";
String taskAppId = "3_14_27";
String tenantCode = "demo";
int taskInstId = 27;
sharedTestSqlTask(nodeName, taskAppId, tenantCode, taskInstId);
}
/**
* Basic test template for SQL tasks, mainly exercising different DBMS types
*
* @param nodeName node name for selected task
* @param taskAppId task app id
* @param tenantCode tenant code
* @param taskInstId task instance id
*/
private void sharedTestSqlTask(String nodeName, String taskAppId, String tenantCode, int taskInstId) throws Exception {
TaskProps taskProps = new TaskProps();
taskProps.setExecutePath("");
// processDefineId_processInstanceId_taskInstanceId
taskProps.setTaskAppId(taskAppId);
// set tenant -> task execute linux user
taskProps.setTenantCode(tenantCode);
taskProps.setTaskStartTime(new Date());
taskProps.setTaskTimeout(360000);
taskProps.setTaskInstanceId(taskInstId);
taskProps.setTaskName(nodeName);
taskProps.setCmdTypeIfComplement(CommandType.START_PROCESS);
TaskInstance taskInstance = processService.findTaskInstanceById(taskInstId);
taskProps.setTaskParams(taskInstance.getTaskParams());
// custom logger
Logger taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(LoggerUtils.TASK_LOGGER_INFO_PREFIX,
1L,
1,
taskInstance.getProcessInstanceId(),
taskInstance.getId()));
// NOTE: the commented-out call uses an older TaskManager.newTask signature; with
// task left null, init()/handle() below throw NullPointerException, which is why
// this class is annotated with @Ignore.
//AbstractTask task = TaskManager.newTask(taskInstance.getTaskType(), taskProps, taskLogger);
AbstractTask task = null;
logger.info("task info : {}", task);
// job init
task.init();
// job handle
task.handle();
ExecutionStatus status = ExecutionStatus.SUCCESS;
if (task.getExitStatusCode() == Constants.EXIT_CODE_SUCCESS) {
status = ExecutionStatus.SUCCESS;
} else if (task.getExitStatusCode() == Constants.EXIT_CODE_KILL) {
status = ExecutionStatus.KILL;
} else {
status = ExecutionStatus.FAILURE;
}
logger.info(status.toString());
}
}

58
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/PythonCommandExecutorTest.java

@@ -1,58 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.junit.Assert;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class PythonCommandExecutorTest {
private static final Logger logger = LoggerFactory.getLogger(PythonCommandExecutorTest.class);
@Test
public void testGetPythonHome() {
String path = System.getProperty("user.dir") + "/script/env/dolphinscheduler_env.sh";
if (path.contains("dolphinscheduler-server/")) {
path = path.replace("dolphinscheduler-server/", "");
}
String pythonHome = PythonCommandExecutor.getPythonHome(path);
logger.info(pythonHome);
Assert.assertNotNull(pythonHome);
}
@Test
public void testGetPythonCommand() {
String pythonCommand = PythonCommandExecutor.getPythonCommand(null);
Assert.assertEquals(PythonCommandExecutor.PYTHON, pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("");
Assert.assertEquals(PythonCommandExecutor.PYTHON, pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("/usr/bin/python");
Assert.assertEquals("/usr/bin/python", pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("/usr/local/bin/python2");
Assert.assertEquals("/usr/local/bin/python2", pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("/opt/python/bin/python3.8");
Assert.assertEquals("/opt/python/bin/python3.8", pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("/opt/soft/python");
Assert.assertEquals("/opt/soft/python/bin/python", pythonCommand);
pythonCommand = PythonCommandExecutor.getPythonCommand("/opt/soft/python-3.8");
Assert.assertEquals("/opt/soft/python-3.8/bin/python", pythonCommand);
}
}

115
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/ShellTaskReturnTest.java

@@ -1,115 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import static org.mockito.ArgumentMatchers.anyString;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.shell.ShellTask;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* shell task return test.
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest({ShellTask.class})
public class ShellTaskReturnTest {
private static final Logger logger = LoggerFactory.getLogger(ShellTaskReturnTest.class);
private ShellTask shellTask;
private ShellCommandExecutor shellCommandExecutor;
private TaskExecutionContext taskExecutionContext;
private CommandExecuteResult commandExecuteResult;
@Before
public void before() throws Exception {
System.setProperty("log4j2.disable.jmx", Boolean.TRUE.toString());
shellCommandExecutor = PowerMockito.mock(ShellCommandExecutor.class);
PowerMockito.whenNew(ShellCommandExecutor.class).withAnyArguments().thenReturn(shellCommandExecutor);
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setTaskName("kris test");
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setHost("127.0.0.1:1234");
taskExecutionContext.setExecutePath("/tmp");
taskExecutionContext.setLogPath("/log");
taskExecutionContext.setTaskJson(
"{\"conditionResult\":\"{\\\"successNode\\\":[\\\"\\\"],"
+ "\\\"failedNode\\\":[\\\"\\\"]}\",\"conditionsTask\":false,"
+ "\"depList\":[],\"dependence\":\"{}\",\"forbidden\":false,\"id\":\""
+ "tasks-16849\",\"maxRetryTimes\":0,\"name\":\"shell test 001\","
+ "\"params\":\"{\\\"rawScript\\\":\\\"#!/bin/sh\\\\necho $[yyyy-MM-dd HH:mm:ss +3]\\\\necho \\\\\\\" ?? "
+ "${time1} \\\\\\\"\\\\necho \\\\\\\" ????? ${time2}\\\\\\\"\\\\n\\\","
+ "\\\"localParams\\\":[{\\\"prop\\\":\\\"time1\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":"
+ "\\\"VARCHAR\\\",\\\"value\\\":\\\"$[yyyy-MM-dd HH:mm:ss]\\\"},"
+ "{\\\"prop\\\":\\\"time2\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":\\\"VARCHAR\\\",\\\"value\\\":\\\"${time_gb}\\\"}"
+ "],\\\"resourceList\\\":[]}\",\"preTasks\":\"[]\",\"retryInterval\":1,"
+ "\"runFlag\":\"NORMAL\",\"taskInstancePriority\":\"MEDIUM\",\"taskTimeoutParameter\":"
+ "{\"enable\":false,\"interval\":0},\"timeout\":\"{\\\"enable\\\":false,\\\"strategy\\\":\\\"\\\"}\","
+ "\"type\":\"SHELL\",\"workerGroup\":\"default\"}");
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setGlobalParams("[{\"direct\":\"IN\",\"prop\":\"time_gb\",\"type\":\"VARCHAR\",\"value\":\"2020-12-16 17:18:33\"}]");
taskExecutionContext.setExecutorId(1);
taskExecutionContext.setCmdTypeIfComplement(5);
taskExecutionContext.setTenantCode("roo");
taskExecutionContext.setScheduleTime(new Date());
taskExecutionContext.setQueue("default");
taskExecutionContext.setTaskParams(
"{\"rawScript\":\"#!/bin/sh\\necho $[yyyy-MM-dd HH:mm:ss +3]\\necho \\\" ?? ${time1} \\\"\\necho \\\" ????? ${time2}\\\"\\n\",\"localParams\":"
+
"[{\"prop\":\"time1\",\"direct\":\"OUT\",\"type\":\"VARCHAR\",\"value\":\"$[yyyy-MM-dd HH:mm:ss]\"},{\"prop\":\"time2\",\"direct\":\"IN\",\"type\":\"VARCHAR"
+ "\",\"value\":\"${time_gb}\"}],\"resourceList\":[]}");
Map<String, String> definedParams = new HashMap<>();
definedParams.put("time_gb", "2020-12-16 00:00:00");
taskExecutionContext.setDefinedParams(definedParams);
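// stub Files.exists(...) so the task believes its command file is already on disk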
PowerMockito.mockStatic(Files.class);
PowerMockito.when(Files.exists(Paths.get(anyString()))).thenReturn(true);
commandExecuteResult = new CommandExecuteResult();
commandExecuteResult.setAppIds("appId");
commandExecuteResult.setExitStatusCode(0);
commandExecuteResult.setProcessId(1);
}
@Test
public void testShellReturnString() {
shellTask = new ShellTask(taskExecutionContext, logger);
shellTask.init();
try {
PowerMockito.when(shellCommandExecutor.run(anyString())).thenReturn(commandExecuteResult);
} catch (Exception e) {
e.printStackTrace();
}
}
}

192
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/TaskManagerTest.java

@@ -1,192 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.LoggerUtils;
import org.apache.dolphinscheduler.server.entity.SQLTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.cache.impl.TaskExecutionContextCacheManagerImpl;
import org.apache.dolphinscheduler.server.worker.task.shell.ShellTask;
import org.apache.dolphinscheduler.server.worker.task.sql.SqlTask;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@RunWith(PowerMockRunner.class)
@PrepareForTest({SpringApplicationContext.class})
public class TaskManagerTest {
private static final Logger logger = LoggerFactory.getLogger(TaskManagerTest.class);
private TaskExecutionContext taskExecutionContext;
private Logger taskLogger;
private TaskExecutionContextCacheManagerImpl taskExecutionContextCacheManager;
private AlertClientService alertClientService;
@Before
public void before() {
// init task execution context, logger
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setProcessId(12345);
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setProcessDefineCode(1L);
taskExecutionContext.setProcessDefineVersion(1);
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setFirstSubmitTime(new Date());
taskExecutionContext.setDelayTime(0);
taskExecutionContext.setLogPath("/tmp/test.log");
taskExecutionContext.setHost("localhost");
taskExecutionContext.setExecutePath("/tmp/dolphinscheduler/exec/process/1/2/3/4");
taskLogger = LoggerFactory.getLogger(LoggerUtils.buildTaskId(
LoggerUtils.TASK_LOGGER_INFO_PREFIX,
taskExecutionContext.getProcessDefineCode(),
taskExecutionContext.getProcessDefineVersion(),
taskExecutionContext.getProcessInstanceId(),
taskExecutionContext.getTaskInstanceId()
));
taskExecutionContextCacheManager = new TaskExecutionContextCacheManagerImpl();
taskExecutionContextCacheManager.cacheTaskExecutionContext(taskExecutionContext);
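// TaskManager resolves the cache manager through the Spring context, so stub the static lookup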
PowerMockito.mockStatic(SpringApplicationContext.class);
PowerMockito.when(SpringApplicationContext.getBean(TaskExecutionContextCacheManagerImpl.class))
.thenReturn(taskExecutionContextCacheManager);
alertClientService = PowerMockito.mock(AlertClientService.class);
}
@Test
public void testNewTask() {
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.WATERDROP.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.HTTP.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.MR.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.SPARK.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.FLINK.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.PYTHON.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.DATAX.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
taskExecutionContext.setTaskType(TaskType.SQOOP.getDesc());
Assert.assertNotNull(TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService));
}
@Test(expected = IllegalArgumentException.class)
public void testNewTaskIsNull() {
taskExecutionContext.setTaskType(null);
TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService);
}
@Test(expected = IllegalArgumentException.class)
public void testNewTaskIsNotExists() {
taskExecutionContext.setTaskType("ttt");
TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService);
}
@Test
public void testShellTaskReturnString() {
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setTaskName("kris test");
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setHost("127.0.0.1:1234");
taskExecutionContext.setExecutePath("/tmp");
taskExecutionContext.setLogPath("/log");
taskExecutionContext.setTaskJson(
"{\"conditionResult\":\"{\\\"successNode\\\":[\\\"\\\"],\\\"failedNode\\\":[\\\"\\\"]}\","
+ "\"conditionsTask\":false,\"depList\":[],\"dependence\":\"{}\",\"forbidden\":false,\"id\":\""
+ "tasks-16849\",\"maxRetryTimes\":0,\"name\":\"shell test 001\","
+ "\"params\":\"{\\\"rawScript\\\":\\\"#!/bin/sh\\\\necho $[yyyy-MM-dd HH:mm:ss +3]\\\\necho \\\\\\\" ?? "
+ "${time1} \\\\\\\"\\\\necho \\\\\\\" ????? ${time2}\\\\\\\"\\\\n\\\","
+ "\\\"localParams\\\":[{\\\"prop\\\":\\\"time1\\\",\\\"direct\\\":\\\"OUT\\\",\\\"type\\\":"
+ "\\\"VARCHAR\\\",\\\"value\\\":\\\"$[yyyy-MM-dd HH:mm:ss]\\\"},"
+ "{\\\"prop\\\":\\\"time2\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":\\\"VARCHAR\\\",\\\"value\\\":\\\"${time_gb}\\\"}"
+ "],\\\"resourceList\\\":[]}\",\"preTasks\":\"[]\",\"retryInterval\":1,\"runFlag\":\"NORMAL\",\"taskInstancePriority\":\"MEDIUM\",\"taskTimeoutParameter\":"
+ "{\"enable\":false,\"interval\":0},\"timeout\":\"{\\\"enable\\\":false,\\\"strategy\\\":\\\"\\\"}\",\"type\":\"SHELL\",\"workerGroup\":\"default\"}");
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setGlobalParams("[{\"direct\":\"IN\",\"prop\":\"time_gb\",\"type\":\"VARCHAR\",\"value\":\"2020-12-16 17:18:33\"}]");
taskExecutionContext.setExecutorId(1);
taskExecutionContext.setCmdTypeIfComplement(5);
taskExecutionContext.setTenantCode("roo");
taskExecutionContext.setScheduleTime(new Date());
taskExecutionContext.setQueue("default");
taskExecutionContext.setTaskParams(
"{\"rawScript\":\"#!/bin/sh\\necho $[yyyy-MM-dd HH:mm:ss +3]\\necho \\\" ?? ${time1} \\\"\\necho \\\" ????? ${time2}\\\"\\n\",\"localParams\":"
+
"[{\"prop\":\"time1\",\"direct\":\"OUT\",\"type\":\"VARCHAR\",\"value\":\"$[yyyy-MM-dd HH:mm:ss]\"},{\"prop\":\"time2\",\"direct\":\"IN\",\"type\":\"VARCHAR"
+ "\",\"value\":\"${time_gb}\"}],\"resourceList\":[]}");
Map<String, String> definedParams = new HashMap<>();
definedParams.put("time_gb", "2020-12-16 00:00:00");
taskExecutionContext.setDefinedParams(definedParams);
ShellTask shellTask = (ShellTask) TaskManager.newTask(taskExecutionContext, taskLogger, alertClientService);
Assert.assertNotNull(shellTask);
}
@Test
public void testSqlTaskReturnString() {
String params = "{\"user\":\"root\",\"password\":\"123456\",\"address\":\"jdbc:mysql://127.0.0.1:3306\","
+ "\"database\":\"test\",\"jdbcUrl\":\"jdbc:mysql://127.0.0.1:3306/test\"}";
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setTaskParams("{\"localParams\":[{\"prop\":\"ret\", \"direct\":\"OUT\", \"type\":\"VARCHAR\", \"value\":\"\"}],"
+ "\"type\":\"POSTGRESQL\",\"datasource\":1,\"sql\":\"insert into tb_1 values('1','2')\","
+ "\"sqlType\":1}");
taskExecutionContext.setExecutePath("/tmp");
taskExecutionContext.setTaskAppId("1");
taskExecutionContext.setTenantCode("root");
taskExecutionContext.setStartTime(new Date());
taskExecutionContext.setTaskTimeout(10000);
taskExecutionContext.setLogPath("/tmp/dx");
SQLTaskExecutionContext sqlTaskExecutionContext = new SQLTaskExecutionContext();
sqlTaskExecutionContext.setConnectionParams(params);
taskExecutionContext.setSqlTaskExecutionContext(sqlTaskExecutionContext);
SqlTask sqlTask = new SqlTask(taskExecutionContext, logger, null);
SqlParameters sqlParameters = JSONUtils.parseObject(taskExecutionContext.getTaskParams(), SqlParameters.class);
List<Property> properties = sqlParameters.getLocalParams();
sqlTask.setNonQuerySqlReturn("sql return", properties);
}
}

77
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/TaskParamsTest.java

@@ -1,77 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task;
import static org.junit.Assert.assertNotNull;
import org.apache.dolphinscheduler.common.enums.DataType;
import org.apache.dolphinscheduler.common.enums.Direct;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.shell.ShellParameters;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import java.util.ArrayList;
import java.util.List;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* task parameter handling test.
*/
@RunWith(PowerMockRunner.class)
public class TaskParamsTest {
private static final Logger logger = LoggerFactory.getLogger(TaskParamsTest.class);
@Test
public void testDealOutParam() {
List<Property> properties = new ArrayList<>();
Property property = new Property();
property.setProp("test1");
property.setDirect(Direct.OUT);
property.setType(DataType.VARCHAR);
property.setValue("test1");
properties.add(property);
ShellParameters shellParameters = new ShellParameters();
String resultShell = "key1=value1$VarPoolkey2=value2";
shellParameters.varPool = new ArrayList<>();
shellParameters.setLocalParams(properties);
shellParameters.dealOutParam(resultShell);
assertNotNull(shellParameters.getVarPool().get(0));
String sqlResult = "[{\"id\":6,\"test1\":\"6\"},{\"id\":70002,\"test1\":\"+1\"}]";
SqlParameters sqlParameters = new SqlParameters();
String sqlResult1 = "[{\"id\":6,\"test1\":\"6\"}]";
sqlParameters.setLocalParams(properties);
sqlParameters.varPool = new ArrayList<>();
sqlParameters.dealOutParam(sqlResult1);
assertNotNull(sqlParameters.getVarPool().get(0));
property.setType(DataType.LIST);
properties.clear();
properties.add(property);
sqlParameters.setLocalParams(properties);
sqlParameters.dealOutParam(sqlResult);
assertNotNull(sqlParameters.getVarPool().get(0));
}
}

493
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/datax/DataxTaskTest.java

@@ -1,493 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.datax;
import static org.apache.dolphinscheduler.common.enums.CommandType.START_PROCESS;
import org.apache.dolphinscheduler.common.datasource.BaseConnectionParam;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.enums.DbType;
import org.apache.dolphinscheduler.common.task.datax.DataxParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.DataSource;
import org.apache.dolphinscheduler.dao.entity.ProcessInstance;
import org.apache.dolphinscheduler.server.entity.DataxTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.utils.DataxUtils;
import org.apache.dolphinscheduler.server.worker.task.ShellCommandExecutor;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.UUID;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.mockito.Mockito;
import org.powermock.api.mockito.PowerMockito;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
/**
* DataxTask Tester.
*/
public class DataxTaskTest {
private static final Logger logger = LoggerFactory.getLogger(DataxTaskTest.class);
private static final String CONNECTION_PARAMS = " {\n"
+ " \"user\":\"root\",\n"
+ " \"password\":\"123456\",\n"
+ " \"address\":\"jdbc:mysql://127.0.0.1:3306\",\n"
+ " \"database\":\"test\",\n"
+ " \"jdbcUrl\":\"jdbc:mysql://127.0.0.1:3306/test\"\n"
+ "}";
private DataxTask dataxTask;
private ProcessService processService;
private ShellCommandExecutor shellCommandExecutor;
private ApplicationContext applicationContext;
private TaskExecutionContext taskExecutionContext;
@Before
public void before()
throws Exception {
setTaskParams(0);
}
private void setTaskParams(Integer customConfig) {
processService = Mockito.mock(ProcessService.class);
shellCommandExecutor = Mockito.mock(ShellCommandExecutor.class);
applicationContext = Mockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
Mockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
props.setEnvFile(".dolphinscheduler_env.sh");
props.setTaskStartTime(new Date());
props.setTaskTimeout(0);
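// customConfig == 1 supplies a raw DataX job JSON; otherwise the job json is generated from datasource metadata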
if (customConfig == 1) {
props.setTaskParams(
"{\n"
+ " \"customConfig\":1,\n"
+ " \"localParams\":[\n"
+ " {\n"
+ " \"prop\":\"test\",\n"
+ " \"value\":\"38294729\"\n"
+ " }\n"
+ " ],\n"
+ " \"json\":\""
+ "{\"job\":{\"setting\":{\"speed\":{\"byte\":1048576},\"errorLimit\":{\"record\":0,\"percentage\":0.02}},\"content\":["
+ "{\"reader\":{\"name\":\"rdbmsreader\",\"parameter\":{\"username\":\"xxx\",\"password\":\"${test}\",\"column\":[\"id\",\"name\"],\"splitPk\":\"pk\",\""
+ "connection\":[{\"querySql\":[\"SELECT * from dual\"],\"jdbcUrl\":[\"jdbc:dm://ip:port/database\"]}],\"fetchSize\":1024,\"where\":\"1 = 1\"}},\""
+ "writer\":{\"name\":\"streamwriter\",\"parameter\":{\"print\":true}}}]}}\"\n"
+ "}");
} else {
props.setTaskParams(
"{\n"
+ " \"customConfig\":0,\n"
+ " \"targetTable\":\"test\",\n"
+ " \"postStatements\":[\n"
+ " \"delete from test\"\n"
+ " ],\n"
+ " \"jobSpeedByte\":0,\n"
+ " \"jobSpeedRecord\":1000,\n"
+ " \"dtType\":\"MYSQL\",\n"
+ " \"dataSource\":1,\n"
+ " \"dsType\":\"MYSQL\",\n"
+ " \"dataTarget\":2,\n"
+ " \"sql\":\"select 1 as test from dual\",\n"
+ " \"preStatements\":[\n"
+ " \"delete from test\"\n"
+ " ]\n"
+ "}");
}
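// back the execution context with a mock so the getters below return the canned properties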
taskExecutionContext = Mockito.mock(TaskExecutionContext.class);
Mockito.when(taskExecutionContext.getTaskParams()).thenReturn(props.getTaskParams());
Mockito.when(taskExecutionContext.getExecutePath()).thenReturn("/tmp");
Mockito.when(taskExecutionContext.getTaskAppId()).thenReturn(UUID.randomUUID().toString());
Mockito.when(taskExecutionContext.getTenantCode()).thenReturn("root");
Mockito.when(taskExecutionContext.getStartTime()).thenReturn(new Date());
Mockito.when(taskExecutionContext.getTaskTimeout()).thenReturn(10000);
Mockito.when(taskExecutionContext.getLogPath()).thenReturn("/tmp/dx");
DataxTaskExecutionContext dataxTaskExecutionContext = new DataxTaskExecutionContext();
dataxTaskExecutionContext.setSourcetype(0);
dataxTaskExecutionContext.setTargetType(0);
dataxTaskExecutionContext.setSourceConnectionParams(CONNECTION_PARAMS);
dataxTaskExecutionContext.setTargetConnectionParams(CONNECTION_PARAMS);
Mockito.when(taskExecutionContext.getDataxTaskExecutionContext()).thenReturn(dataxTaskExecutionContext);
props.setCmdTypeIfComplement(START_PROCESS);
Mockito.when(processService.findDataSourceById(1)).thenReturn(getDataSource());
Mockito.when(processService.findDataSourceById(2)).thenReturn(getDataSource());
Mockito.when(processService.findProcessInstanceByTaskId(1)).thenReturn(getProcessInstance());
String fileName = String.format("%s/%s_node.sh", props.getExecutePath(), props.getTaskAppId());
try {
Mockito.when(shellCommandExecutor.run(fileName)).thenReturn(null);
} catch (Exception e) {
e.printStackTrace();
}
dataxTask = PowerMockito.spy(new DataxTask(taskExecutionContext, logger));
dataxTask.init();
}
private DataSource getDataSource() {
DataSource dataSource = new DataSource();
dataSource.setType(DbType.MYSQL);
dataSource.setConnectionParams(CONNECTION_PARAMS);
dataSource.setUserId(1);
return dataSource;
}
private ProcessInstance getProcessInstance() {
ProcessInstance processInstance = new ProcessInstance();
processInstance.setCommandType(START_PROCESS);
processInstance.setScheduleTime(new Date());
return processInstance;
}
@After
public void after()
throws Exception {
}
/**
* Method: DataxTask()
*/
@Test
public void testDataxTask()
throws Exception {
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
Assert.assertNotNull(new DataxTask(null, logger));
}
/**
* Method: init
*/
@Test
public void testInit()
throws Exception {
try {
dataxTask.init();
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: handle()
*/
@Test
public void testHandle()
throws Exception {
}
/**
* Method: cancelApplication()
*/
@Test
public void testCancelApplication()
throws Exception {
try {
dataxTask.cancelApplication(true);
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: parsingSqlColumnNames(DbType dsType, DbType dtType, BaseDataSource
* dataSourceCfg, String sql)
*/
@Test
public void testParsingSqlColumnNames()
throws Exception {
try {
BaseConnectionParam dataSource = (BaseConnectionParam) DatasourceUtil.buildConnectionParams(
getDataSource().getType(),
getDataSource().getConnectionParams());
Method method = DataxTask.class.getDeclaredMethod("parsingSqlColumnNames", DbType.class, DbType.class, BaseConnectionParam.class, String.class);
method.setAccessible(true);
String[] columns = (String[]) method.invoke(dataxTask, DbType.MYSQL, DbType.MYSQL, dataSource, "select 1 as a, 2 as `table` from dual");
Assert.assertNotNull(columns);
Assert.assertEquals(2, columns.length);
Assert.assertEquals("[`a`, `table`]", Arrays.toString(columns));
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: tryGrammaticalParsingSqlColumnNames(DbType dbType, String sql)
*/
@Test
public void testTryGrammaticalAnalysisSqlColumnNames()
throws Exception {
try {
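// grammatical analysis derives column names purely from the SQL text, with no database round trip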
Method method = DataxTask.class.getDeclaredMethod("tryGrammaticalAnalysisSqlColumnNames", DbType.class, String.class);
method.setAccessible(true);
String[] columns = (String[]) method.invoke(dataxTask, DbType.MYSQL, "select t1.a, t1.b from test t1 union all select a, t2.b from (select a, b from test) t2");
Assert.assertNotNull(columns);
Assert.assertEquals(2, columns.length);
Assert.assertEquals("[a, b]", Arrays.toString(columns));
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: tryExecuteSqlResolveColumnNames(BaseDataSource baseDataSource,
* String sql)
*/
@Test
public void testTryExecuteSqlResolveColumnNames()
throws Exception {
// TODO: Test goes here...
}
/**
* Method: buildDataxJsonFile()
*/
@Test
@Ignore("method not found")
public void testBuildDataxJsonFile()
throws Exception {
try {
setTaskParams(1);
Method method = DataxTask.class.getDeclaredMethod("buildDataxJsonFile");
method.setAccessible(true);
String filePath = (String) method.invoke(dataxTask, null);
Assert.assertNotNull(filePath);
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: buildDataxJsonFile()
*/
@Test
@Ignore("method not found")
public void testBuildDataxJsonFile0()
throws Exception {
try {
setTaskParams(0);
Method method = DataxTask.class.getDeclaredMethod("buildDataxJsonFile");
method.setAccessible(true);
String filePath = (String) method.invoke(dataxTask, null);
Assert.assertNotNull(filePath);
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: buildDataxJobContentJson()
*/
@Test
public void testBuildDataxJobContentJson()
throws Exception {
try {
Method method = DataxTask.class.getDeclaredMethod("buildDataxJobContentJson");
method.setAccessible(true);
List<ObjectNode> contentList = (List<ObjectNode>) method.invoke(dataxTask, null);
Assert.assertNotNull(contentList);
ObjectNode content = contentList.get(0);
JsonNode reader = JSONUtils.parseObject(content.path("reader").toString());
Assert.assertNotNull(reader);
Assert.assertEquals("{\"name\":\"mysqlreader\",\"parameter\":{\"username\":\"root\","
+ "\"password\":\"123456\",\"connection\":[{\"querySql\":[\"select 1 as test from dual\"],"
+ "\"jdbcUrl\":[\"jdbc:mysql://127.0.0.1:3306/test?allowLoadLocalInfile=false"
+ "&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\"]}]}}",
reader.toString());
String readerPluginName = reader.path("name").asText();
Assert.assertEquals(DataxUtils.DATAX_READER_PLUGIN_MYSQL, readerPluginName);
JsonNode writer = JSONUtils.parseObject(content.path("writer").toString());
Assert.assertNotNull(writer);
Assert.assertEquals("{\"name\":\"mysqlwriter\",\"parameter\":{\"username\":\"root\","
+ "\"password\":\"123456\",\"column\":[\"`test`\"],\"connection\":[{\"table\":[\"test\"],"
+ "\"jdbcUrl\":\"jdbc:mysql://127.0.0.1:3306/test?allowLoadLocalInfile=false&"
+ "autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\"}],"
+ "\"preSql\":[\"delete from test\"],\"postSql\":[\"delete from test\"]}}",
writer.toString());
String writerPluginName = writer.path("name").asText();
Assert.assertEquals(DataxUtils.DATAX_WRITER_PLUGIN_MYSQL, writerPluginName);
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: buildDataxJobSettingJson()
*/
@Test
public void testBuildDataxJobSettingJson()
throws Exception {
try {
Method method = DataxTask.class.getDeclaredMethod("buildDataxJobSettingJson");
method.setAccessible(true);
JsonNode setting = (JsonNode) method.invoke(dataxTask, null);
Assert.assertNotNull(setting);
Assert.assertEquals("{\"channel\":1,\"record\":1000}", setting.get("speed").toString());
Assert.assertEquals("{\"record\":0,\"percentage\":0}", setting.get("errorLimit").toString());
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: buildDataxCoreJson()
*/
@Test
public void testBuildDataxCoreJson()
throws Exception {
try {
Method method = DataxTask.class.getDeclaredMethod("buildDataxCoreJson");
method.setAccessible(true);
ObjectNode coreConfig = (ObjectNode) method.invoke(dataxTask, null);
Assert.assertNotNull(coreConfig);
Assert.assertNotNull(coreConfig.get("transport"));
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: buildShellCommandFile(String jobConfigFilePath)
*/
@Test
@Ignore("method not found")
public void testBuildShellCommandFile()
throws Exception {
try {
Method method = DataxTask.class.getDeclaredMethod("buildShellCommandFile", String.class);
method.setAccessible(true);
Assert.assertNotNull(method.invoke(dataxTask, "test.json"));
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
/**
* Method: getParameters
*/
@Test
public void testGetParameters()
throws Exception {
Assert.assertNotNull(dataxTask.getParameters());
}
/**
* Method: notNull(Object obj, String message)
*/
@Test
public void testNotNull()
throws Exception {
try {
Method method = DataxTask.class.getDeclaredMethod("notNull", Object.class, String.class);
method.setAccessible(true);
method.invoke(dataxTask, "abc", "test throw RuntimeException");
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
@Test
public void testGetPythonCommand() {
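// the assertions below show DataxTask normalizing every configured interpreter path to a python2.7 binary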
String pythonCommand = dataxTask.getPythonCommand();
Assert.assertEquals("python2.7", pythonCommand);
pythonCommand = dataxTask.getPythonCommand("");
Assert.assertEquals("python2.7", pythonCommand);
pythonCommand = dataxTask.getPythonCommand("/usr/bin/python");
Assert.assertEquals("/usr/bin/python2.7", pythonCommand);
pythonCommand = dataxTask.getPythonCommand("/usr/local/bin/python2");
Assert.assertEquals("/usr/local/bin/python2.7", pythonCommand);
pythonCommand = dataxTask.getPythonCommand("/opt/python/bin/python3.8");
Assert.assertEquals("/opt/python/bin/python2.7", pythonCommand);
pythonCommand = dataxTask.getPythonCommand("/opt/soft/python");
Assert.assertEquals("/opt/soft/python/bin/python2.7", pythonCommand);
}
@Test
public void testLoadJvmEnv() {
DataxTask dataxTask = new DataxTask(null, null);
DataxParameters dataxParameters = new DataxParameters();
dataxParameters.setXms(0);
dataxParameters.setXmx(-100);
String actual = dataxTask.loadJvmEnv(dataxParameters);
// non-positive Xms/Xmx values fall back to the 1G defaults
String expected = " --jvm=\"-Xms1G -Xmx1G\" ";
Assert.assertEquals(expected, actual);
dataxParameters.setXms(13);
dataxParameters.setXmx(14);
actual = dataxTask.loadJvmEnv(dataxParameters);
expected = " --jvm=\"-Xms13G -Xmx14G\" ";
Assert.assertEquals(expected, actual);
}
}

208
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/http/HttpTaskTest.java

@@ -1,208 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.http;
import static org.apache.dolphinscheduler.common.enums.CommandType.*;
import java.io.IOException;
import java.util.Date;
import org.apache.dolphinscheduler.common.enums.HttpCheckCondition;
import org.apache.dolphinscheduler.common.enums.HttpMethod;
import org.apache.dolphinscheduler.common.task.http.HttpParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.common.utils.OSUtils;
import org.apache.dolphinscheduler.dao.entity.ProcessInstance;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.ShellCommandExecutor;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.RequestBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PowerMockIgnore;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
@RunWith(PowerMockRunner.class)
@PrepareForTest(OSUtils.class)
@PowerMockIgnore({"javax.management.*","javax.net.ssl.*"})
public class HttpTaskTest {
private static final Logger logger = LoggerFactory.getLogger(HttpTaskTest.class);
private HttpTask httpTask;
private ProcessService processService;
private ShellCommandExecutor shellCommandExecutor;
private ApplicationContext applicationContext;
private TaskExecutionContext taskExecutionContext;
@Before
public void before() throws Exception {
taskExecutionContext = new TaskExecutionContext();
PowerMockito.mockStatic(OSUtils.class);
processService = PowerMockito.mock(ProcessService.class);
shellCommandExecutor = PowerMockito.mock(ShellCommandExecutor.class);
applicationContext = PowerMockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
PowerMockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
props.setEnvFile(".dolphinscheduler_env.sh");
props.setTaskStartTime(new Date());
props.setTaskTimeout(0);
props.setTaskParams(
"{\"localParams\":[],\"httpParams\":[],\"url\":\"https://github.com/\",\"httpMethod\":\"GET\"," +
"\"httpCheckCondition\":\"STATUS_CODE_DEFAULT\",\"condition\":\"https://github.com/\"," +
"\"connectTimeout\":\"1000\",\"socketTimeout\":\"1000\"}");
taskExecutionContext = Mockito.mock(TaskExecutionContext.class);
Mockito.when(taskExecutionContext.getTaskParams()).thenReturn(props.getTaskParams());
Mockito.when(taskExecutionContext.getExecutePath()).thenReturn("/tmp");
Mockito.when(taskExecutionContext.getTaskAppId()).thenReturn("1");
Mockito.when(taskExecutionContext.getTenantCode()).thenReturn("root");
Mockito.when(taskExecutionContext.getStartTime()).thenReturn(new Date());
Mockito.when(taskExecutionContext.getTaskTimeout()).thenReturn(10000);
Mockito.when(taskExecutionContext.getLogPath()).thenReturn("/tmp/dx");
httpTask = new HttpTask(taskExecutionContext, logger);
httpTask.init();
}
@Test
public void testGetParameters() {
Assert.assertNotNull(httpTask.getParameters());
}
@Test
public void testCheckParameters() {
Assert.assertTrue(httpTask.getParameters().checkParameters());
}
@Test
public void testGenerator(){
String paramJson = "{\"localParams\":[],\"httpParams\":[],\"url\":\"https://github.com/\"," +
"\"httpMethod\":\"GET\",\"httpCheckCondition\":\"STATUS_CODE_DEFAULT\",\"condition\":\"\",\"connectTimeout\":\"10000\",\"socketTimeout\":\"10000\"}";
HttpParameters httpParameters = JSONUtils.parseObject(paramJson, HttpParameters.class);
Assert.assertEquals(10000,httpParameters.getConnectTimeout() );
Assert.assertEquals(10000,httpParameters.getSocketTimeout());
Assert.assertEquals("https://github.com/",httpParameters.getUrl());
Assert.assertEquals(HttpMethod.GET,httpParameters.getHttpMethod());
Assert.assertEquals(HttpCheckCondition.STATUS_CODE_DEFAULT,httpParameters.getHttpCheckCondition());
Assert.assertEquals("",httpParameters.getCondition());
}
@Test
public void testHandle(){
boolean flag = true;
try {
httpTask.handle();
} catch (Exception e) {
flag = false;
e.printStackTrace();
}
Assert.assertTrue(flag);
}
@Test
public void testSendRequest(){
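// fires a real HTTP GET at the configured URL, so this test needs network access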
CloseableHttpClient client = httpTask.createHttpClient();
String statusCode = null;
String body = null;
try {
CloseableHttpResponse response = httpTask.sendRequest(client);
statusCode = String.valueOf(httpTask.getStatusCode(response));
body = httpTask.getResponseBody(response);
int exitStatusCode = httpTask.validResponse(body, statusCode);
Assert.assertNotEquals(-1,exitStatusCode);
} catch (IOException e) {
e.printStackTrace();
}
}
@Test
public void testValidResponse(){
String body = "body";
String statusCode = "200" ;
int exitStatusCode = httpTask.validResponse(body,statusCode);
Assert.assertNotEquals(-1,exitStatusCode);
}
@Test
public void testAppendMessage(){
httpTask.appendMessage("message");
Assert.assertEquals("message",httpTask.getOutput());
}
@Test
public void testCreateHttpClient(){
Assert.assertNotNull(httpTask.createHttpClient());
}
@Test
public void testCreateRequestBuilder(){
RequestBuilder requestBuilder = httpTask.createRequestBuilder();
Assert.assertEquals(RequestBuilder.get().getMethod(),requestBuilder.getMethod());
}
private ProcessInstance getProcessInstance() {
ProcessInstance processInstance = new ProcessInstance();
processInstance.setCommandType(START_PROCESS);
processInstance.setScheduleTime(new Date());
return processInstance;
}
}

119
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/procedure/ProcedureTaskTest.java

@@ -1,119 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.procedure;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.server.entity.ProcedureTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Date;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
@RunWith(PowerMockRunner.class)
@PrepareForTest({ProcedureTask.class,DriverManager.class})
public class ProcedureTaskTest {
private static final Logger logger = LoggerFactory.getLogger(ProcedureTaskTest.class);
private static final String CONNECTION_PARAMS = "{\"user\":\"root\",\"password\":\"123456\",\"address\":\"jdbc:mysql://127.0.0.1:3306\","
+ "\"database\":\"test\",\"jdbcUrl\":\"jdbc:mysql://127.0.0.1:3306/test\"}";
private ProcedureTask procedureTask;
private ProcessService processService;
private ApplicationContext applicationContext;
private TaskExecutionContext taskExecutionContext;
@Before
public void before() throws Exception {
taskExecutionContext = new TaskExecutionContext();
processService = PowerMockito.mock(ProcessService.class);
applicationContext = PowerMockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
PowerMockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
props.setEnvFile(".dolphinscheduler_env.sh");
props.setTaskStartTime(new Date());
props.setTaskTimeout(0);
props.setTaskParams(
"{\"localParams\":[],\"type\":\"POSTGRESQL\",\"datasource\":1,\"method\":\"add\"}");
taskExecutionContext = PowerMockito.mock(TaskExecutionContext.class);
PowerMockito.when(taskExecutionContext.getTaskParams()).thenReturn(props.getTaskParams());
PowerMockito.when(taskExecutionContext.getExecutePath()).thenReturn("/tmp");
PowerMockito.when(taskExecutionContext.getTaskAppId()).thenReturn("1");
PowerMockito.when(taskExecutionContext.getTenantCode()).thenReturn("root");
PowerMockito.when(taskExecutionContext.getStartTime()).thenReturn(new Date());
PowerMockito.when(taskExecutionContext.getTaskTimeout()).thenReturn(10000);
PowerMockito.when(taskExecutionContext.getLogPath()).thenReturn("/tmp/dx");
ProcedureTaskExecutionContext procedureTaskExecutionContext = new ProcedureTaskExecutionContext();
procedureTaskExecutionContext.setConnectionParams(CONNECTION_PARAMS);
PowerMockito.when(taskExecutionContext.getProcedureTaskExecutionContext()).thenReturn(procedureTaskExecutionContext);
procedureTask = new ProcedureTask(taskExecutionContext, logger);
procedureTask.init();
}
@Test
public void testGetParameters() {
Assert.assertNotNull(procedureTask.getParameters());
}
@Test
public void testHandle() throws SQLException {
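// mock the JDBC layer: intercept DriverManager.getConnection so no real database is needed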
Connection connection = PowerMockito.mock(Connection.class);
PowerMockito.mockStatic(DriverManager.class);
PowerMockito.when(DriverManager.getConnection(Mockito.any(), Mockito.any(), Mockito.any())).thenReturn(connection);
CallableStatement callableStatement = PowerMockito.mock(CallableStatement.class);
PowerMockito.when(connection.prepareCall(Mockito.any())).thenReturn(callableStatement);
try {
procedureTask.handle();
Assert.assertEquals(Constants.EXIT_CODE_SUCCESS,procedureTask.getExitStatusCode());
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
}

125
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/shell/ShellTaskTest.java

@@ -1,125 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.shell;
import static org.mockito.ArgumentMatchers.anyString;
import org.apache.dolphinscheduler.common.enums.TaskType;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.CommandExecuteResult;
import org.apache.dolphinscheduler.server.worker.task.ShellCommandExecutor;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.DriverManager;
import java.util.ArrayList;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* shell task test.
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest(value = {ShellTask.class, DriverManager.class, SpringApplicationContext.class, ParameterUtils.class})
public class ShellTaskTest {
private static final Logger logger = LoggerFactory.getLogger(ShellTaskTest.class);
private ShellTask shellTask;
private ShellCommandExecutor shellCommandExecutor;
private TaskExecutionContext taskExecutionContext;
private CommandExecuteResult commandExecuteResult;
@Before
public void before() throws Exception {
System.setProperty("log4j2.disable.jmx", Boolean.TRUE.toString());
shellCommandExecutor = PowerMockito.mock(ShellCommandExecutor.class);
PowerMockito.whenNew(ShellCommandExecutor.class).withAnyArguments().thenReturn(shellCommandExecutor);
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setTaskInstanceId(1);
taskExecutionContext.setTaskName("kris test");
taskExecutionContext.setTaskType(TaskType.SHELL.getDesc());
taskExecutionContext.setHost("127.0.0.1:1234");
taskExecutionContext.setExecutePath("/tmp");
taskExecutionContext.setLogPath("/log");
taskExecutionContext.setTaskJson(
"{\"conditionResult\":\"{\\\"successNode\\\":[\\\"\\\"],\\\"failedNode\\\":[\\\"\\\"]}\",\"conditionsTask\":false,\"depList\":[],\"dependence\":\"{}\",\"forbidden\":false,\"id\":\""
+
"tasks-16849\",\"maxRetryTimes\":0,\"name\":\"shell test 001\",\"params\":\"{\\\"rawScript\\\":\\\"#!/bin/sh\\\\necho $[yyyy-MM-dd HH:mm:ss +3]\\\\necho \\\\\\\" ?? "
+ "${time1} \\\\\\\"\\\\necho \\\\\\\" ????? ${time2}\\\\\\\"\\\\n\\\",\\\"localParams\\\":[{\\\"prop\\\":\\\"time1\\\",\\\"direct\\\":\\\"OUT\\\",\\\"type\\\":"
+ "\\\"VARCHAR\\\",\\\"value\\\":\\\"$[yyyy-MM-dd HH:mm:ss]\\\"},{\\\"prop\\\":\\\"time2\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":\\\"VARCHAR\\\",\\\"value\\\":\\\"${time_gb}\\\"}"
+ "],\\\"resourceList\\\":[]}\",\"preTasks\":\"[]\",\"retryInterval\":1,\"runFlag\":\"NORMAL\",\"taskInstancePriority\":\"MEDIUM\",\"taskTimeoutParameter\":"
+ "{\"enable\":false,\"interval\":0},\"timeout\":\"{\\\"enable\\\":false,\\\"strategy\\\":\\\"\\\"}\",\"type\":\"SHELL\",\"workerGroup\":\"default\"}");
taskExecutionContext.setProcessInstanceId(1);
taskExecutionContext.setGlobalParams("[{\"direct\":\"IN\",\"prop\":\"time_gb\",\"type\":\"VARCHAR\",\"value\":\"2020-12-16 17:18:33\"}]");
taskExecutionContext.setExecutorId(1);
taskExecutionContext.setCmdTypeIfComplement(5);
taskExecutionContext.setTenantCode("roo");
taskExecutionContext.setScheduleTime(new Date());
taskExecutionContext.setQueue("default");
taskExecutionContext.setVarPool("[{\"direct\":\"IN\",\"prop\":\"test\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
taskExecutionContext.setTaskParams(
"{\"rawScript\":\"#!/bin/sh\\necho $[yyyy-MM-dd HH:mm:ss +3]\\necho \\\" ?? ${time1} \\\"\\necho \\\" ????? ${time2}\\\"\\n\",\"localParams\":"
+
"[{\"prop\":\"time1\",\"direct\":\"OUT\",\"type\":\"VARCHAR\",\"value\":\"$[yyyy-MM-dd HH:mm:ss]\"},{\"prop\":\"time2\",\"direct\":\"IN\",\"type\":\"VARCHAR"
+ "\",\"value\":\"${time_gb}\"}],\"resourceList\":[]}");
Map<String, String> definedParams = new HashMap<>();
definedParams.put("time_gb", "2020-12-16 00:00:00");
taskExecutionContext.setDefinedParams(definedParams);
PowerMockito.mockStatic(Files.class);
PowerMockito.when(Files.exists(Paths.get(anyString()))).thenReturn(true);
commandExecuteResult = new CommandExecuteResult();
commandExecuteResult.setAppIds("appId");
commandExecuteResult.setExitStatusCode(0);
commandExecuteResult.setProcessId(1);
}
@Test
public void testComplementData() throws Exception {
shellTask = new ShellTask(taskExecutionContext, logger);
shellTask.init();
shellTask.getParameters().setVarPool(taskExecutionContext.getVarPool());
shellCommandExecutor.isSuccessOfYarnState(new ArrayList<>());
shellCommandExecutor.isSuccessOfYarnState(null);
PowerMockito.when(shellCommandExecutor.run(anyString())).thenReturn(commandExecuteResult);
shellTask.handle();
}
@Test
public void testStartProcess() throws Exception {
taskExecutionContext.setCmdTypeIfComplement(0);
shellTask = new ShellTask(taskExecutionContext, logger);
shellTask.init();
shellTask.getParameters().setVarPool(taskExecutionContext.getVarPool());
PowerMockito.when(shellCommandExecutor.run(anyString())).thenReturn(commandExecuteResult);
shellTask.handle();
}
}
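
The test above can only swap in a mocked ShellCommandExecutor because PowerMock intercepts the constructor call inside the prepared class. Below is a minimal, self-contained sketch of that whenNew(...) pattern; Greeter and GreeterCaller are hypothetical classes invented purely for illustration, while the PowerMockito calls mirror the ones used in the test.

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

@RunWith(PowerMockRunner.class)
// Note: @PrepareForTest must name the class that performs the `new`,
// not the class being mocked.
@PrepareForTest(GreeterCallerTest.GreeterCaller.class)
public class GreeterCallerTest {

    public static class Greeter {
        public String greet() {
            return "real";
        }
    }

    // The collaborator is created with `new` inside the class under test,
    // which is why plain Mockito cannot replace it and whenNew(...) is needed.
    public static class GreeterCaller {
        public String call() {
            return new Greeter().greet();
        }
    }

    @Test
    public void interceptsConstructor() throws Exception {
        Greeter mocked = PowerMockito.mock(Greeter.class);
        PowerMockito.when(mocked.greet()).thenReturn("mocked");
        PowerMockito.whenNew(Greeter.class).withAnyArguments().thenReturn(mocked);
        assertEquals("mocked", new GreeterCaller().call());
    }
}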

134
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/spark/SparkTaskTest.java

@@ -1,134 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.spark;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.common.utils.placeholder.PlaceholderUtils;
import org.apache.dolphinscheduler.common.utils.placeholder.PropertyPlaceholderHelper;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.util.Date;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
@RunWith(PowerMockRunner.class)
@PrepareForTest({ParameterUtils.class, PlaceholderUtils.class, PropertyPlaceholderHelper.class})
public class SparkTaskTest {
private static final Logger logger = LoggerFactory.getLogger(SparkTaskTest.class);
private TaskExecutionContext taskExecutionContext;
private ApplicationContext applicationContext;
private ProcessService processService;
private SparkTask spark2Task;
String spark1Params = "{"
+ "\"mainArgs\":\"\", "
+ "\"driverMemory\":\"1G\", "
+ "\"executorMemory\":\"2G\", "
+ "\"programType\":\"SCALA\", "
+ "\"mainClass\":\"basicetl.GlobalUserCar\", "
+ "\"driverCores\":\"2\", "
+ "\"deployMode\":\"cluster\", "
+ "\"executorCores\":2, "
+ "\"mainJar\":{\"res\":\"test-1.0-SNAPSHOT.jar\"}, "
+ "\"sparkVersion\":\"SPARK1\", "
+ "\"numExecutors\":\"10\", "
+ "\"localParams\":[], "
+ "\"others\":\"\", "
+ "\"resourceList\":[]"
+ "}";
String spark2Params = "{"
+ "\"mainArgs\":\"\", "
+ "\"driverMemory\":\"1G\", "
+ "\"executorMemory\":\"2G\", "
+ "\"programType\":\"SCALA\", "
+ "\"mainClass\":\"basicetl.GlobalUserCar\", "
+ "\"driverCores\":\"2\", "
+ "\"deployMode\":\"cluster\", "
+ "\"executorCores\":2, "
+ "\"mainJar\":{\"res\":\"test-1.0-SNAPSHOT.jar\"}, "
+ "\"sparkVersion\":\"SPARK2\", "
+ "\"numExecutors\":\"10\", "
+ "\"localParams\":[], "
+ "\"others\":\"\", "
+ "\"resourceList\":[]"
+ "}";
@Before
public void setTaskExecutionContext() {
taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setTaskParams(spark2Params);
taskExecutionContext.setQueue("dev");
taskExecutionContext.setTaskAppId(String.valueOf(System.currentTimeMillis()));
taskExecutionContext.setTenantCode("1");
taskExecutionContext.setEnvFile(".dolphinscheduler_env.sh");
taskExecutionContext.setStartTime(new Date());
taskExecutionContext.setTaskTimeout(0);
processService = Mockito.mock(ProcessService.class);
applicationContext = Mockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
Mockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
spark2Task = new SparkTask(taskExecutionContext, logger);
spark2Task.init();
}
@Test
public void testSparkTaskInit() {
TaskExecutionContext sparkTaskCtx = new TaskExecutionContext();
SparkTask sparkTask = new SparkTask(sparkTaskCtx, logger);
sparkTask.init();
sparkTask.getParameters();
Assert.assertNull(sparkTaskCtx.getTaskParams());
String spark2Command = spark2Task.buildCommand();
String spark2Expected = "${SPARK_HOME2}/bin/spark-submit --master yarn --deploy-mode cluster "
+ "--class basicetl.GlobalUserCar --driver-cores 2 --driver-memory 1G --num-executors 10 "
+ "--executor-cores 2 --executor-memory 2G --queue dev test-1.0-SNAPSHOT.jar";
Assert.assertEquals(spark2Expected, spark2Command);
taskExecutionContext.setTaskParams(spark1Params);
SparkTask spark1Task = new SparkTask(taskExecutionContext, logger);
spark1Task.init();
String spark1Command = spark1Task.buildCommand();
String spark1Expected = "${SPARK_HOME1}/bin/spark-submit --master yarn --deploy-mode cluster "
+ "--class basicetl.GlobalUserCar --driver-cores 2 --driver-memory 1G --num-executors 10 "
+ "--executor-cores 2 --executor-memory 2G --queue dev test-1.0-SNAPSHOT.jar";
Assert.assertEquals(spark1Expected, spark1Command);
}
}
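
The assertions in testSparkTaskInit pin down a fixed flag order for the generated spark-submit command. As a hedged illustration only (this is not the project's command builder), the sketch below reconstructs the SPARK2 expectation from plain parameters; all class and method names here are invented for the example.

import java.util.StringJoiner;

public class SparkSubmitSketch {

    // Assembles flags in the same order the expected strings above imply.
    public static String build(String home, String mainClass, int driverCores,
                               String driverMemory, String numExecutors,
                               int executorCores, String executorMemory,
                               String queue, String mainJar) {
        StringJoiner cmd = new StringJoiner(" ");
        cmd.add(home + "/bin/spark-submit")
           .add("--master yarn")
           .add("--deploy-mode cluster")
           .add("--class " + mainClass)
           .add("--driver-cores " + driverCores)
           .add("--driver-memory " + driverMemory)
           .add("--num-executors " + numExecutors)
           .add("--executor-cores " + executorCores)
           .add("--executor-memory " + executorMemory)
           .add("--queue " + queue)
           .add(mainJar);
        return cmd.toString();
    }

    public static void main(String[] args) {
        // Mirrors the SPARK2 expectation asserted in the test above.
        System.out.println(build("${SPARK_HOME2}", "basicetl.GlobalUserCar", 2,
                "1G", "10", 2, "2G", "dev", "test-1.0-SNAPSHOT.jar"));
    }
}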

241
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sql/SqlTaskTest.java

@@ -1,241 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sql;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.datasource.DatasourceUtil;
import org.apache.dolphinscheduler.common.process.Property;
import org.apache.dolphinscheduler.common.task.sql.SqlParameters;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.dao.AlertDao;
import org.apache.dolphinscheduler.remote.command.alert.AlertSendResponseCommand;
import org.apache.dolphinscheduler.server.entity.SQLTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.TaskProps;
import org.apache.dolphinscheduler.service.alert.AlertClientService;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;
import org.powermock.reflect.Whitebox;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* sql task test
*/
@RunWith(PowerMockRunner.class)
@PrepareForTest(value = {SqlTask.class, DatasourceUtil.class, SpringApplicationContext.class,
ParameterUtils.class, AlertSendResponseCommand.class})
public class SqlTaskTest {
private static final Logger logger = LoggerFactory.getLogger(SqlTaskTest.class);
private static final String CONNECTION_PARAMS = "{\"user\":\"root\",\"password\":\"123456\",\"address\":\"jdbc:mysql://127.0.0.1:3306\","
+ "\"database\":\"test\",\"jdbcUrl\":\"jdbc:mysql://127.0.0.1:3306/test\"}";
private SqlTask sqlTask;
private TaskExecutionContext taskExecutionContext;
private AlertClientService alertClientService;
@Before
public void before() throws Exception {
taskExecutionContext = new TaskExecutionContext();
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
props.setEnvFile(".dolphinscheduler_env.sh");
props.setTaskStartTime(new Date());
props.setTaskTimeout(0);
props.setTaskParams(
"{\"localParams\":[{\"prop\":\"ret\", \"direct\":\"OUT\", \"type\":\"VARCHAR\", \"value\":\"\"}],"
+ "\"type\":\"POSTGRESQL\",\"datasource\":1,\"sql\":\"insert into tb_1 values('1','2')\","
+ "\"sqlType\":1}");
taskExecutionContext = PowerMockito.mock(TaskExecutionContext.class);
PowerMockito.when(taskExecutionContext.getTaskParams()).thenReturn(props.getTaskParams());
PowerMockito.when(taskExecutionContext.getExecutePath()).thenReturn("/tmp");
PowerMockito.when(taskExecutionContext.getTaskAppId()).thenReturn("1");
PowerMockito.when(taskExecutionContext.getTenantCode()).thenReturn("root");
PowerMockito.when(taskExecutionContext.getStartTime()).thenReturn(new Date());
PowerMockito.when(taskExecutionContext.getTaskTimeout()).thenReturn(10000);
PowerMockito.when(taskExecutionContext.getLogPath()).thenReturn("/tmp/dx");
PowerMockito.when(taskExecutionContext.getVarPool()).thenReturn("[{\"direct\":\"IN\",\"prop\":\"test\",\"type\":\"VARCHAR\",\"value\":\"\"}]");
SQLTaskExecutionContext sqlTaskExecutionContext = new SQLTaskExecutionContext();
sqlTaskExecutionContext.setConnectionParams(CONNECTION_PARAMS);
PowerMockito.when(taskExecutionContext.getSqlTaskExecutionContext()).thenReturn(sqlTaskExecutionContext);
PowerMockito.mockStatic(SpringApplicationContext.class);
PowerMockito.when(SpringApplicationContext.getBean(Mockito.any())).thenReturn(new AlertDao());
alertClientService = PowerMockito.mock(AlertClientService.class);
sqlTask = new SqlTask(taskExecutionContext, logger, alertClientService);
sqlTask.getParameters().setVarPool(taskExecutionContext.getVarPool());
sqlTask.init();
}
@Test
public void testGetParameters() {
Assert.assertNotNull(sqlTask.getParameters());
}
@Test
public void testHandle() throws Exception {
Connection connection = PowerMockito.mock(Connection.class);
PreparedStatement preparedStatement = PowerMockito.mock(PreparedStatement.class);
PowerMockito.when(connection.prepareStatement(Mockito.any())).thenReturn(preparedStatement);
PowerMockito.mockStatic(ParameterUtils.class);
PowerMockito.when(ParameterUtils.replaceScheduleTime(Mockito.any(), Mockito.any())).thenReturn("insert into tb_1 values('1','2')");
PowerMockito.mockStatic(DatasourceUtil.class);
PowerMockito.when(DatasourceUtil.getConnection(Mockito.any(), Mockito.any())).thenReturn(connection);
sqlTask.handle();
assertEquals(Constants.EXIT_CODE_SUCCESS, sqlTask.getExitStatusCode());
}
@Test
public void testResultProcess() throws Exception {
// test that a null input does not throw an exception
AlertSendResponseCommand mockResponseCommand = PowerMockito.mock(AlertSendResponseCommand.class);
PowerMockito.when(mockResponseCommand.getResStatus()).thenReturn(true);
PowerMockito.when(alertClientService.sendAlert(0, "null query result sets", "[]")).thenReturn(mockResponseCommand);
String result = Whitebox.invokeMethod(sqlTask, "resultProcess", null);
Assert.assertNotNull(result);
}
@Test
public void testResultProcess02() throws Exception {
// test input not null
ResultSet resultSet = PowerMockito.mock(ResultSet.class);
ResultSetMetaData mockResultMetaData = PowerMockito.mock(ResultSetMetaData.class);
PowerMockito.when(resultSet.getMetaData()).thenReturn(mockResultMetaData);
PowerMockito.when(mockResultMetaData.getColumnCount()).thenReturn(2);
PowerMockito.when(resultSet.next()).thenReturn(true);
PowerMockito.when(resultSet.getObject(Mockito.anyInt())).thenReturn(1);
PowerMockito.when(mockResultMetaData.getColumnLabel(Mockito.anyInt())).thenReturn("a");
AlertSendResponseCommand mockResponseCommand = PowerMockito.mock(AlertSendResponseCommand.class);
PowerMockito.when(mockResponseCommand.getResStatus()).thenReturn(true);
PowerMockito.when(alertClientService.sendAlert(Mockito.anyInt(), Mockito.anyString(), Mockito.anyString())).thenReturn(mockResponseCommand);
String result = Whitebox.invokeMethod(sqlTask, "resultProcess", resultSet);
Assert.assertNotNull(result);
}
@Test
public void shouldntThrowNullPointerException_When_SqlParamsMapIsNull_printReplacedSql() {
try {
sqlTask.printReplacedSql("", "", "", null);
assertTrue(true);
} catch (NullPointerException err) {
fail();
}
}
@Test
public void shouldntPutPropertyInSqlParamsMap_When_paramNameIsNotFoundInparamsPropsMap_setSqlParamsMap() {
Map<Integer, Property> sqlParamsMap = new HashMap<>();
Map<String, Property> paramsPropsMap = new HashMap<>();
paramsPropsMap.put("validPropertyName", new Property());
taskExecutionContext = PowerMockito.mock(TaskExecutionContext.class);
PowerMockito.when(taskExecutionContext.getTaskInstanceId()).thenReturn(1);
sqlTask.setSqlParamsMap("notValidPropertyName", "(notValidPropertyName)", sqlParamsMap, paramsPropsMap);
assertEquals(0, sqlParamsMap.size());
}
@Test
public void testQueryBySQLUsingLimit() throws Exception {
TaskExecutionContext localTaskExecutionContext;
TaskProps props = new TaskProps();
props.setExecutePath("/tmp");
props.setTaskAppId(String.valueOf(System.currentTimeMillis()));
props.setTaskInstanceId(1);
props.setTenantCode("1");
props.setEnvFile(".dolphinscheduler_env.sh");
props.setTaskStartTime(new Date());
props.setTaskTimeout(0);
props.setTaskParams(
"{\"localParams\":[{\"prop\":\"ret\", \"direct\":\"OUT\", \"type\":\"VARCHAR\", \"value\":\"\"}],"
+ "\"type\":\"POSTGRESQL\",\"datasource\":1,\"sql\":\"SELECT * FROM tb_1\","
+ "\"sqlType\":0, \"limit\":1, \"sendEmail\":\"false\"}");
localTaskExecutionContext = PowerMockito.mock(TaskExecutionContext.class);
PowerMockito.when(localTaskExecutionContext.getTaskParams()).thenReturn(props.getTaskParams());
PowerMockito.when(localTaskExecutionContext.getExecutePath()).thenReturn("/tmp");
PowerMockito.when(localTaskExecutionContext.getTaskAppId()).thenReturn("1");
PowerMockito.when(localTaskExecutionContext.getTenantCode()).thenReturn("root");
PowerMockito.when(localTaskExecutionContext.getStartTime()).thenReturn(new Date());
PowerMockito.when(localTaskExecutionContext.getTaskTimeout()).thenReturn(10000);
PowerMockito.when(localTaskExecutionContext.getLogPath()).thenReturn("/tmp/dx");
SQLTaskExecutionContext sqlTaskExecutionContext = new SQLTaskExecutionContext();
sqlTaskExecutionContext.setConnectionParams(CONNECTION_PARAMS);
PowerMockito.when(localTaskExecutionContext.getSqlTaskExecutionContext()).thenReturn(sqlTaskExecutionContext);
PowerMockito.mockStatic(SpringApplicationContext.class);
PowerMockito.when(SpringApplicationContext.getBean(Mockito.any())).thenReturn(new AlertDao());
AlertClientService localAlertClientService = PowerMockito.mock(AlertClientService.class);
SqlTask localSqlTask = new SqlTask(localTaskExecutionContext, logger, localAlertClientService);
localSqlTask.init();
ResultSet resultSet = PowerMockito.mock(ResultSet.class);
ResultSetMetaData mockResultMetaData = PowerMockito.mock(ResultSetMetaData.class);
PowerMockito.when(resultSet.getMetaData()).thenReturn(mockResultMetaData);
PowerMockito.when(mockResultMetaData.getColumnCount()).thenReturn(2);
PowerMockito.when(resultSet.next()).thenReturn(true);
PowerMockito.when(resultSet.getObject(Mockito.anyInt())).thenReturn(1);
PowerMockito.when(mockResultMetaData.getColumnLabel(Mockito.anyInt())).thenReturn("a");
AlertSendResponseCommand mockResponseCommand = PowerMockito.mock(AlertSendResponseCommand.class);
PowerMockito.when(mockResponseCommand.getResStatus()).thenReturn(true);
PowerMockito.when(localAlertClientService.sendAlert(Mockito.anyInt(), Mockito.anyString(), Mockito.anyString()))
.thenReturn(mockResponseCommand);
String result = Whitebox.invokeMethod(localSqlTask, "resultProcess", resultSet);
Assert.assertEquals(1, ((SqlParameters) localSqlTask.getParameters()).getLimit());
// In fact, the target table has 2 rows; as we set the limit to 1, if the limit works, the `resultProcess` method
// should return [{"a":1}] rather than [{"a":1},{"a":1}]
Assert.assertEquals("[{\"a\":1}]", result);
}
}
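
testQueryBySQLUsingLimit depends on resultProcess reading only `limit` rows even though the mocked ResultSet.next() always returns true. A standalone sketch of that read-up-to-limit behaviour follows; LimitedResultReader and its method are hypothetical stand-ins, not the project's resultProcess implementation.

import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LimitedResultReader {

    public static List<Map<String, Object>> read(ResultSet rs, int limit) throws SQLException {
        ResultSetMetaData md = rs.getMetaData();
        List<Map<String, Object>> rows = new ArrayList<>();
        // Stop at `limit` rows even if the cursor could produce more,
        // which is exactly what the mock with next() == true exercises.
        while (rows.size() < limit && rs.next()) {
            Map<String, Object> row = new LinkedHashMap<>();
            for (int i = 1; i <= md.getColumnCount(); i++) {
                row.put(md.getColumnLabel(i), rs.getObject(i));
            }
            rows.add(row);
        }
        return rows;
    }
}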

228
dolphinscheduler-server/src/test/java/org/apache/dolphinscheduler/server/worker/task/sqoop/SqoopTaskTest.java

@@ -1,228 +0,0 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.dolphinscheduler.server.worker.task.sqoop;
import org.apache.dolphinscheduler.common.task.sqoop.SqoopParameters;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.server.entity.SqoopTaskExecutionContext;
import org.apache.dolphinscheduler.server.entity.TaskExecutionContext;
import org.apache.dolphinscheduler.server.worker.task.sqoop.generator.SqoopJobGenerator;
import org.apache.dolphinscheduler.service.bean.SpringApplicationContext;
import org.apache.dolphinscheduler.service.process.ProcessService;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Date;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.mockito.junit.MockitoJUnitRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
/**
* sqoop task test
*/
@RunWith(MockitoJUnitRunner.Silent.class)
public class SqoopTaskTest {
private static final Logger logger = LoggerFactory.getLogger(SqoopTaskTest.class);
private SqoopTask sqoopTask;
@Before
public void before() {
ProcessService processService = Mockito.mock(ProcessService.class);
ApplicationContext applicationContext = Mockito.mock(ApplicationContext.class);
SpringApplicationContext springApplicationContext = new SpringApplicationContext();
springApplicationContext.setApplicationContext(applicationContext);
Mockito.when(applicationContext.getBean(ProcessService.class)).thenReturn(processService);
TaskExecutionContext taskExecutionContext = new TaskExecutionContext();
taskExecutionContext.setTaskAppId(String.valueOf(System.currentTimeMillis()));
taskExecutionContext.setTenantCode("1");
taskExecutionContext.setEnvFile(".dolphinscheduler_env.sh");
taskExecutionContext.setStartTime(new Date());
taskExecutionContext.setTaskTimeout(0);
taskExecutionContext.setTaskParams("{\"jobName\":\"sqoop_import\",\"jobType\":\"TEMPLATE\",\"concurrency\":1,"
+ "\"modelType\":\"import\",\"sourceType\":\"MYSQL\",\"targetType\":\"HIVE\",\"sourceParams\":\"{\\\"srcDatasource\\\":2,\\\"srcTable\\\":\\\"person_2\\\","
+ "\\\"srcQueryType\\\":\\\"1\\\",\\\"srcQuerySql\\\":\\\"SELECT * FROM person_2\\\",\\\"srcColumnType\\\":\\\"0\\\","
+ "\\\"srcColumns\\\":\\\"\\\",\\\"srcConditionList\\\":[],\\\"mapColumnHive\\\":[],"
+ "\\\"mapColumnJava\\\":[{\\\"prop\\\":\\\"id\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":\\\"VARCHAR\\\",\\\"value\\\":\\\"Integer\\\"}]}\""
+ ",\"targetParams\":\"{\\\"hiveDatabase\\\":\\\"stg\\\",\\\"hiveTable\\\":\\\"person_internal_2\\\",\\\"createHiveTable\\\":true,"
+ "\\\"dropDelimiter\\\":false,\\\"hiveOverWrite\\\":true,\\\"replaceDelimiter\\\":\\\"\\\",\\\"hivePartitionKey\\\":\\\"date\\\","
+ "\\\"hivePartitionValue\\\":\\\"2020-02-16\\\"}\",\"localParams\":[]}");
sqoopTask = new SqoopTask(taskExecutionContext, logger);
// test sqoop task init method
sqoopTask.init();
}
/**
* test SqoopJobGenerator
*/
@Test
public void testGenerator() {
TaskExecutionContext mysqlTaskExecutionContext = getMysqlTaskExecutionContext();
//sqoop TEMPLATE job
//import mysql to HDFS with hadoop
String mysqlToHdfs =
"{\"jobName\":\"sqoop_import\",\"hadoopCustomParams\":[{\"prop\":\"mapreduce.map.memory.mb\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"4096\"}],"
+ "\"sqoopAdvancedParams\":[{\"prop\":\"--direct\",\"direct\":\"IN\",\"type\":\"VARCHAR\",\"value\":\"\"}],\"jobType\":\"TEMPLATE\",\"concurrency\":1,"
+ "\"modelType\":\"import\",\"sourceType\":\"MYSQL\",\"targetType\":\"HDFS\","
+ "\"sourceParams\":\"{\\\"srcDatasource\\\":2,\\\"srcTable\\\":\\\"person_2\\\",\\\"srcQueryType\\\":\\\"0\\\",\\\"srcQuerySql\\\":\\\"\\\",\\\"srcColumnType\\\":\\\"0\\\","
+ "\\\"srcColumns\\\":\\\"\\\",\\\"srcConditionList\\\":[],\\\"mapColumnHive\\\":[],\\\"mapColumnJava\\\":[]}\",\"targetParams\":\"{\\\"targetPath\\\":\\\"/ods/tmp/test/person7\\\","
+ "\\\"deleteTargetDir\\\":true,\\\"fileType\\\":\\\"--as-textfile\\\",\\\"compressionCodec\\\":\\\"\\\",\\\"fieldsTerminated\\\":\\\"@\\\","
+ "\\\"linesTerminated\\\":\\\"\\\\\\\\n\\\"}\",\"localParams\":[]}";
SqoopParameters mysqlToHdfsParams = JSONUtils.parseObject(mysqlToHdfs, SqoopParameters.class);
SqoopJobGenerator generator = new SqoopJobGenerator();
String mysqlToHdfsScript = generator.generateSqoopJob(mysqlToHdfsParams, mysqlTaskExecutionContext);
String mysqlToHdfsExpected =
"sqoop import -D mapred.job.name=sqoop_import -D mapreduce.map.memory.mb=4096 --direct -m 1 --connect "
+ "\"jdbc:mysql://192.168.0.111:3306/test?allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\" "
+ "--username kylo --password \"123456\" --table person_2 --target-dir /ods/tmp/test/person7 --as-textfile "
+ "--delete-target-dir --fields-terminated-by '@' --lines-terminated-by '\\n' --null-non-string 'NULL' --null-string 'NULL'";
Assert.assertEquals(mysqlToHdfsExpected, mysqlToHdfsScript);
//export hdfs to mysql using update mode
String hdfsToMysql = "{\"jobName\":\"sqoop_import\",\"jobType\":\"TEMPLATE\",\"concurrency\":1,\"modelType\":\"export\",\"sourceType\":\"HDFS\","
+ "\"targetType\":\"MYSQL\",\"sourceParams\":\"{\\\"exportDir\\\":\\\"/ods/tmp/test/person7\\\"}\","
+ "\"targetParams\":\"{\\\"targetDatasource\\\":2,\\\"targetTable\\\":\\\"person_3\\\",\\\"targetColumns\\\":\\\"id,name,age,sex,create_time\\\","
+ "\\\"preQuery\\\":\\\"\\\",\\\"isUpdate\\\":true,\\\"targetUpdateKey\\\":\\\"id\\\",\\\"targetUpdateMode\\\":\\\"allowinsert\\\","
+ "\\\"fieldsTerminated\\\":\\\"@\\\",\\\"linesTerminated\\\":\\\"\\\\\\\\n\\\"}\",\"localParams\":[]}";
SqoopParameters hdfsToMysqlParams = JSONUtils.parseObject(hdfsToMysql, SqoopParameters.class);
String hdfsToMysqlScript = generator.generateSqoopJob(hdfsToMysqlParams, mysqlTaskExecutionContext);
String hdfsToMysqlScriptExpected =
"sqoop export -D mapred.job.name=sqoop_import -m 1 --export-dir /ods/tmp/test/person7 --connect "
+ "\"jdbc:mysql://192.168.0.111:3306/test?allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\" "
+ "--username kylo --password \"123456\" --table person_3 --columns id,name,age,sex,create_time --fields-terminated-by '@' "
+ "--lines-terminated-by '\\n' --update-key id --update-mode allowinsert";
Assert.assertEquals(hdfsToMysqlScriptExpected, hdfsToMysqlScript);
//export hive to mysql
String hiveToMysql =
"{\"jobName\":\"sqoop_import\",\"jobType\":\"TEMPLATE\",\"concurrency\":1,\"modelType\":\"export\",\"sourceType\":\"HIVE\","
+ "\"targetType\":\"MYSQL\",\"sourceParams\":\"{\\\"hiveDatabase\\\":\\\"stg\\\",\\\"hiveTable\\\":\\\"person_internal\\\","
+ "\\\"hivePartitionKey\\\":\\\"date\\\",\\\"hivePartitionValue\\\":\\\"2020-02-17\\\"}\","
+ "\"targetParams\":\"{\\\"targetDatasource\\\":2,\\\"targetTable\\\":\\\"person_3\\\",\\\"targetColumns\\\":\\\"\\\",\\\"preQuery\\\":\\\"\\\","
+ "\\\"isUpdate\\\":false,\\\"targetUpdateKey\\\":\\\"\\\",\\\"targetUpdateMode\\\":\\\"allowinsert\\\",\\\"fieldsTerminated\\\":\\\"@\\\","
+ "\\\"linesTerminated\\\":\\\"\\\\\\\\n\\\"}\",\"localParams\":[]}";
SqoopParameters hiveToMysqlParams = JSONUtils.parseObject(hiveToMysql, SqoopParameters.class);
String hiveToMysqlScript = generator.generateSqoopJob(hiveToMysqlParams, mysqlTaskExecutionContext);
String hiveToMysqlExpected =
"sqoop export -D mapred.job.name=sqoop_import -m 1 --hcatalog-database stg --hcatalog-table person_internal --hcatalog-partition-keys date "
+ "--hcatalog-partition-values 2020-02-17 --connect \"jdbc:mysql://192.168.0.111:3306/test?allowLoadLocalInfile="
+ "false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\" --username kylo --password \"123456\" --table person_3 "
+ "--fields-terminated-by '@' --lines-terminated-by '\\n'";
Assert.assertEquals(hiveToMysqlExpected, hiveToMysqlScript);
//import mysql to hive
String mysqlToHive =
"{\"jobName\":\"sqoop_import\",\"jobType\":\"TEMPLATE\",\"concurrency\":1,\"modelType\":\"import\",\"sourceType\":\"MYSQL\",\"targetType\":\"HIVE\","
+ "\"sourceParams\":\"{\\\"srcDatasource\\\":2,\\\"srcTable\\\":\\\"person_2\\\",\\\"srcQueryType\\\":\\\"1\\\","
+ "\\\"srcQuerySql\\\":\\\"SELECT * FROM person_2\\\",\\\"srcColumnType\\\":\\\"0\\\",\\\"srcColumns\\\":\\\"\\\",\\\"srcConditionList\\\":[],"
+ "\\\"mapColumnHive\\\":[],\\\"mapColumnJava\\\":[{\\\"prop\\\":\\\"id\\\",\\\"direct\\\":\\\"IN\\\",\\\"type\\\":\\\"VARCHAR\\\",\\\"value\\\":\\\"Integer\\\"}]}\","
+ "\"targetParams\":\"{\\\"hiveDatabase\\\":\\\"stg\\\",\\\"hiveTable\\\":\\\"person_internal_2\\\",\\\"createHiveTable\\\":true,\\\"dropDelimiter\\\":false,"
+ "\\\"hiveOverWrite\\\":true,\\\"replaceDelimiter\\\":\\\"\\\",\\\"hivePartitionKey\\\":\\\"date\\\",\\\"hivePartitionValue\\\":\\\"2020-02-16\\\"}\",\"localParams\":[]}";
SqoopParameters mysqlToHiveParams = JSONUtils.parseObject(mysqlToHive, SqoopParameters.class);
String mysqlToHiveScript = generator.generateSqoopJob(mysqlToHiveParams, mysqlTaskExecutionContext);
String mysqlToHiveExpected =
"sqoop import -D mapred.job.name=sqoop_import -m 1 --connect \"jdbc:mysql://192.168.0.111:3306/"
+ "test?allowLoadLocalInfile=false&autoDeserialize=false&allowLocalInfile=false&allowUrlInLocalInfile=false\" "
+ "--username kylo --password \"123456\" "
+ "--query \"SELECT * FROM person_2 WHERE \\$CONDITIONS\" --map-column-java id=Integer --hive-import --hive-database stg --hive-table person_internal_2 "
+ "--create-hive-table --hive-overwrite --delete-target-dir --hive-partition-key date --hive-partition-value 2020-02-16";
Assert.assertEquals(mysqlToHiveExpected, mysqlToHiveScript);
//sqoop CUSTOM job
String sqoopCustomString = "{\"jobType\":\"CUSTOM\",\"localParams\":[],\"customShell\":\"sqoop import\"}";
SqoopParameters sqoopCustomParams = JSONUtils.parseObject(sqoopCustomString, SqoopParameters.class);
String sqoopCustomScript = generator.generateSqoopJob(sqoopCustomParams, new TaskExecutionContext());
String sqoopCustomExpected = "sqoop import";
Assert.assertEquals(sqoopCustomExpected, sqoopCustomScript);
}
/**
* get a TaskExecutionContext that includes MySQL connection params
*
* @return TaskExecutionContext
*/
private TaskExecutionContext getMysqlTaskExecutionContext() {
TaskExecutionContext taskExecutionContext = new TaskExecutionContext();
SqoopTaskExecutionContext sqoopTaskExecutionContext = new SqoopTaskExecutionContext();
String mysqlSourceConnectionParams =
"{\"address\":\"jdbc:mysql://192.168.0.111:3306\",\"database\":\"test\",\"jdbcUrl\":\"jdbc:mysql://192.168.0.111:3306/test\",\"user\":\"kylo\",\"password\":\"123456\"}";
String mysqlTargetConnectionParams =
"{\"address\":\"jdbc:mysql://192.168.0.111:3306\",\"database\":\"test\",\"jdbcUrl\":\"jdbc:mysql://192.168.0.111:3306/test\",\"user\":\"kylo\",\"password\":\"123456\"}";
sqoopTaskExecutionContext.setDataSourceId(2);
sqoopTaskExecutionContext.setDataTargetId(2);
sqoopTaskExecutionContext.setSourcetype(0);
sqoopTaskExecutionContext.setTargetConnectionParams(mysqlTargetConnectionParams);
sqoopTaskExecutionContext.setSourceConnectionParams(mysqlSourceConnectionParams);
sqoopTaskExecutionContext.setTargetType(0);
taskExecutionContext.setSqoopTaskExecutionContext(sqoopTaskExecutionContext);
return taskExecutionContext;
}
@Test
public void testGetParameters() {
Assert.assertNotNull(sqoopTask.getParameters());
}
/**
* Method: init
*/
@Test
public void testInit() {
try {
sqoopTask.init();
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
@Test
public void testLogHandler() throws InterruptedException {
LinkedBlockingQueue<String> loggerBuffer = new LinkedBlockingQueue<>();
Thread thread1 = new Thread(() -> {
for (int i = 0; i < 10; i++) {
loggerBuffer.add("test add log");
}
});
Thread thread2 = new Thread(() -> {
for (int i = 0; i < 10; i++) {
sqoopTask.logHandle(loggerBuffer);
}
});
thread1.start();
thread2.start();
thread1.join();
thread2.join();
// if no exception is thrown, assert true
Assert.assertTrue(true);
}
}
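
testLogHandler above exercises a producer/consumer handoff: one thread appends lines to a LinkedBlockingQueue while another consumes them concurrently. A minimal sketch of that shape follows; `drain` is a hypothetical stand-in for sqoopTask.logHandle(...), and the queue API calls are real java.util.concurrent methods.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;

public class LogDrainSketch {

    // Non-blocking consumer: takes whatever is queued right now.
    static void drain(LinkedBlockingQueue<String> buffer) {
        List<String> batch = new ArrayList<>();
        buffer.drainTo(batch);
        batch.forEach(System.out::println);
    }

    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingQueue<String> buffer = new LinkedBlockingQueue<>();
        Thread producer = new Thread(() -> {
            for (int i = 0; i < 10; i++) {
                buffer.add("test add log " + i);
            }
        });
        Thread consumer = new Thread(() -> drain(buffer));
        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        drain(buffer); // pick up anything the consumer raced past
    }
}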

1
dolphinscheduler-service/src/main/java/org/apache/dolphinscheduler/service/process/ProcessService.java

@@ -2458,6 +2458,7 @@ public class ProcessService {
         v.setRetryInterval(taskDefinitionLog.getFailRetryInterval());
         Map<String, Object> taskParamsMap = v.taskParamsToJsonObj(taskDefinitionLog.getTaskParams());
         v.setConditionResult((String) taskParamsMap.get(Constants.CONDITION_RESULT));
+        v.setSwitchResult((String) taskParamsMap.get(Constants.SWITCH_RESULT));
         v.setDependence((String) taskParamsMap.get(Constants.DEPENDENCE));
         taskParamsMap.remove(Constants.CONDITION_RESULT);
         taskParamsMap.remove(Constants.DEPENDENCE);
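
For readers skimming the hunk: it extends an extract-then-strip pattern, where keys with dedicated fields are read out of the task-params map and then removed so the remaining map carries only ordinary parameters. The self-contained sketch below uses the key names implied by the diff; the class and values are invented, and whether SWITCH_RESULT is also stripped is not visible in the shown context.

import java.util.HashMap;
import java.util.Map;

public class TaskParamsSplitSketch {
    public static void main(String[] args) {
        Map<String, Object> taskParamsMap = new HashMap<>();
        taskParamsMap.put("conditionResult", "{}");
        taskParamsMap.put("switchResult", "{}");
        taskParamsMap.put("dependence", "{}");
        taskParamsMap.put("rawScript", "echo hello");

        String conditionResult = (String) taskParamsMap.get("conditionResult");
        String switchResult = (String) taskParamsMap.get("switchResult"); // the field the hunk adds
        String dependence = (String) taskParamsMap.get("dependence");

        taskParamsMap.remove("conditionResult");
        taskParamsMap.remove("dependence");
        // Whether switchResult is removed as well lies outside the shown hunk.

        System.out.println(taskParamsMap); // rawScript remains (plus switchResult, per the note above)
    }
}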

14
dolphinscheduler-service/src/test/java/org/apache/dolphinscheduler/service/process/ProcessServiceTest.java

@@ -424,12 +424,14 @@ public class ProcessServiceTest {
     @Test
     public void testGenProcessData() {
-        String processDefinitionJson = "{\"tasks\":[{\"id\":null,\"code\":3,\"version\":0,\"name\":\"1-test\",\"desc\":null,"
-                + "\"type\":\"SHELL\",\"runFlag\":\"FORBIDDEN\",\"loc\":null,\"maxRetryTimes\":0,\"retryInterval\":0,"
-                + "\"params\":{},\"preTasks\":[\"unit-test\"],\"preTaskNodeList\":[{\"code\":2,\"name\":\"unit-test\","
-                + "\"version\":0}],\"extras\":null,\"depList\":[\"unit-test\"],\"dependence\":null,\"conditionResult\":null,"
-                + "\"taskInstancePriority\":null,\"workerGroup\":null,\"timeout\":{\"enable\":false,\"strategy\":null,"
-                + "\"interval\":0},\"delayTime\":0}],\"globalParams\":[],\"timeout\":0,\"tenantId\":0}";
+        String processDefinitionJson = "{\"tasks\":[{\"id\":null,\"code\":3,\"version\":0,\"name\":\"1-test\","
+                + "\"desc\":null,\"type\":\"SHELL\",\"runFlag\":\"FORBIDDEN\",\"loc\":null,\"maxRetryTimes\":0,"
+                + "\"retryInterval\":0,\"params\":{},\"preTasks\":[\"unit-test\"],"
+                + "\"preTaskNodeList\":[{\"code\":2,\"name\":\"unit-test\",\"version\":0}],"
+                + "\"extras\":null,\"depList\":[\"unit-test\"],\"dependence\":null,\"conditionResult\":null,"
+                + "\"switchResult\":null,\"taskInstancePriority\":null,\"workerGroup\":null,"
+                + "\"timeout\":{\"enable\":false,\"strategy\":null,\"interval\":0},\"delayTime\":0}],"
+                + "\"globalParams\":[],\"timeout\":0,\"tenantId\":0}";
         ProcessDefinition processDefinition = new ProcessDefinition();
         processDefinition.setCode(1L);
