diff --git a/docs/docs/en/guide/task/conditions.md b/docs/docs/en/guide/task/conditions.md
index 9ffd8e3db3..75576cdd46 100644
--- a/docs/docs/en/guide/task/conditions.md
+++ b/docs/docs/en/guide/task/conditions.md
@@ -4,7 +4,7 @@ Condition is a conditional node, that determines which downstream task should ru
 
 ## Create Task
 
-- Click `Project Management -> Project Name -> Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar task node to canvas.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/emr.md b/docs/docs/en/guide/task/emr.md
index 9d8021ad4a..050d7c2397 100644
--- a/docs/docs/en/guide/task/emr.md
+++ b/docs/docs/en/guide/task/emr.md
@@ -6,8 +6,8 @@ Amazon EMR task type, for creating EMR clusters on AWS and running computing tas
 
 ## Create Task
 
-* Click `Project Management -> Project Name -> Workflow Definition`, click the "`Create Workflow`" button to enter the DAG editing page.
-* Drag `AmazonEMR` task from the toolbar to the artboard to complete the creation.
+* Click `Project Management -> Project Name -> Workflow Definition`, click the `Create Workflow` button to enter the DAG editing page.
+* Drag `AmazonEMR` task from the toolbar to the artboard to complete the creation.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/flink.md b/docs/docs/en/guide/task/flink.md
index 44c9461803..b32080ac67 100644
--- a/docs/docs/en/guide/task/flink.md
+++ b/docs/docs/en/guide/task/flink.md
@@ -45,34 +45,6 @@ Flink task type, used to execute Flink programs. For Flink nodes:
 | Resource | Appoint resource files in the `Resource` if parameters refer to them. |
 | Custom parameter | It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script. |
 | Predecessor task | Selecting a predecessor task for the current task, will set the selected predecessor task as upstream of the current task. |
 
-- **Node name**: The node name in a workflow definition is unique.
-- **Run flag**: Identifies whether this node schedules normally, if it does not need to execute, select the `prohibition execution`.
-- **Descriptive information**: Describe the function of the node.
-- **Task priority**: When the number of worker threads is insufficient, execute in the order of priority from high to low, and tasks with the same priority will execute in a first-in first-out order.
-- **Worker grouping**: Assign tasks to the machines of the worker group to execute. If `Default` is selected, randomly select a worker machine for execution.
-- **Environment Name**: Configure the environment name in which run the script.
-- **Times of failed retry attempts**: The number of times the task failed to resubmit.
-- **Failed retry interval**: The time interval (unit minute) for resubmitting the task after a failed task.
-- **Delayed execution time**: The time (unit minute) that a task delays in execution.
-- **Timeout alarm**: Check the timeout alarm and timeout failure. When the task runs exceed the "timeout", an alarm email will send and the task execution will fail.
-- **Program type**: Support Java, Scala, Python and SQL four languages.
-- **The class of main function**: The **full path** of Main Class, the entry point of the Flink program.
-- **Main jar package**: The jar package of the Flink program (upload by Resource Center).
-- **Deployment mode**: Support 3 deployment modes: cluster, local and application (Flink 1.11 and later. See also [Run an application in Application Mode](https://nightlies.apache.org/flink/flink-docs-release-1.11/ops/deployment/yarn_setup.html#run-an-application-in-application-mode)).
-- **Initialization script**: Script file to initialize session context.
-- **Script**: The sql script file developed by the user that should be executed.
-- **Flink version**: Select version according to the execution env.
-- **Task name** (optional): Flink task name.
-- **JobManager memory size**: Used to set the size of jobManager memories, which can be set according to the actual production environment.
-- **Number of slots**: Used to set the number of slots, which can be set according to the actual production environment.
-- **TaskManager memory size**: Used to set the size of taskManager memories, which can be set according to the actual production environment.
-- **Number of TaskManager**: Used to set the number of taskManagers, which can be set according to the actual production environment.
-- **Parallelism**: Used to set the degree of parallelism for executing Flink tasks.
-- **Main program parameters**: Set the input parameters for the Flink program and support the substitution of custom parameter variables.
-- **Optional parameters**: Support `--jar`, `--files`,` --archives`, `--conf` format.
-- **Resource**: Appoint resource files in the `Resource` if parameters refer to them.
-- **Custom parameter**: It is a local user-defined parameter for Flink, and will replace the content with `${variable}` in the script.
-- **Predecessor task**: Selecting a predecessor task for the current task, will set the selected predecessor task as upstream of the current task.
 
 ## Task Example
diff --git a/docs/docs/en/guide/task/http.md b/docs/docs/en/guide/task/http.md
index 9aa10500b1..75509afb6b 100644
--- a/docs/docs/en/guide/task/http.md
+++ b/docs/docs/en/guide/task/http.md
@@ -6,7 +6,7 @@ This node is used to perform http type tasks such as the common POST and GET req
 
 ## Create Task
 
-- Click `Project Management -> Project Name -> Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+- Click `Project Management -> Project Name -> Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag the from the toolbar to the drawing board.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/jupyter.md b/docs/docs/en/guide/task/jupyter.md
index b69aad6374..648c42a651 100644
--- a/docs/docs/en/guide/task/jupyter.md
+++ b/docs/docs/en/guide/task/jupyter.md
@@ -42,13 +42,13 @@ Click [here](https://docs.conda.io/en/latest/) for more information about `conda
 └── ssl
 ```
 
-> NOTE: Please follow the `conda pack` instructions above strictly, and DO NOT modify `bin/activate`.
+> NOTICE: Please follow the `conda pack` instructions above strictly, and DO NOT modify `bin/activate`.
 > `Jupyter Task Plugin` uses `source` command to activate your packed conda environment.
 > If you are concerned about using `source`, choose other options to manage your python dependency.
 
 ## Create Task
 
-- Click `Project Management-Project Name-Workflow Definition`, and click the "`Create Workflow`" button to enter the DAG editing page.
+- Click `Project Management-Project Name-Workflow Definition`, and click the `Create Workflow` button to enter the DAG editing page.
 - Drag from the toolbar to the canvas.
 
 ## Task Parameters
diff --git a/docs/docs/en/guide/task/mlflow.md b/docs/docs/en/guide/task/mlflow.md
index 5c61660481..e4af9a15e3 100644
--- a/docs/docs/en/guide/task/mlflow.md
+++ b/docs/docs/en/guide/task/mlflow.md
@@ -117,9 +117,14 @@ You can now use this feature to run all MLFlow projects on Github (For example [
 
 ![mlflow-models-docker-compose](../../../../img/tasks/demo/mlflow-models-docker-compose.png)
 
+| **Parameter** | **Description** |
+| ------- | ---------- |
+| Max Cpu Limit | For example, `1.0` or `0.5`, the same as docker compose. |
+| Max Memory Limit | For example `1G` or `500M`, the same as docker compose. |
+
 ## Environment to Prepare
 
-### Conda environment
+### Conda Environment
 
 You need to enter the admin account to configure a conda environment variable(Please install [anaconda](https://docs.continuum.io/anaconda/install/)