[Feature][DataQuality] Add data quality module, task backend, and task ui

Squashed commit; the main changes were merged through:

* [Feature][DataQuality] Add data quality task ui (#5054)
* [Feature][DataQuality] Add data quality module (#4830)
* [Feature][DataQuality] Add data quality task backend (#4883)

Summary of the squashed history:

* add data quality module, license headers, and package configuration in the dist pom
* refactor jdbc-connector, writer, and the data quality common entities; rename parameters in HiveConnector
* add dq relevant enums and parameters; change the data quality enum value to 14 in TaskType
* add the data quality task and its ui, the getDatasourceOptions interface, and datasource tables/columns queries
* add rule input entry index; change statistic_comparison_check logic; remove the check type change
* add data quality task error handling, result alerting, and show-error-output-path in the ui
* optimize the multi_table_accuracy ui; fix the src_connector_type select, threshold validation, form-create json, and v-show bugs
* fix spark read/write postgresql bug; change the mysql driver scope; add more sql drivers; remove hive-jdbc from pom.xml; mssql-jdbc excludes azure-keyvault
* fix checkstyle errors, code smells, security hotspots, unit tests, and sql scripts; add jacoco dependencies and unit tests
* replace apache/skywalking-eyes@9bd5feb SHA; remove all chinese words in sql

Co-authored-by: sunchaohe <sunzhaohe@linklogis.com>

3.0.0/version-upgrade
zixi0825 committed 3 years ago (committed by GitHub)
212 changed files with 20472 additions and 69 deletions
@@ -0,0 +1,192 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.controller;

import static org.apache.dolphinscheduler.api.enums.Status.GET_DATASOURCE_OPTIONS_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.GET_RULE_FORM_CREATE_JSON_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.QUERY_EXECUTE_RESULT_LIST_PAGING_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.QUERY_RULE_LIST_ERROR;
import static org.apache.dolphinscheduler.api.enums.Status.QUERY_RULE_LIST_PAGING_ERROR;

import org.apache.dolphinscheduler.api.exceptions.ApiException;
import org.apache.dolphinscheduler.api.service.DqExecuteResultService;
import org.apache.dolphinscheduler.api.service.DqRuleService;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.utils.ParameterUtils;
import org.apache.dolphinscheduler.dao.entity.User;

import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

import io.swagger.annotations.Api;
import io.swagger.annotations.ApiImplicitParam;
import io.swagger.annotations.ApiImplicitParams;
import io.swagger.annotations.ApiOperation;
import springfox.documentation.annotations.ApiIgnore;

/**
 * data quality controller
 */
@Api(tags = "DATA_QUALITY_SERVICE")
@RestController
@RequestMapping("/data-quality")
public class DataQualityController extends BaseController {

    @Autowired
    private DqRuleService dqRuleService;

    @Autowired
    private DqExecuteResultService dqExecuteResultService;

    /**
     * get rule form-create json
     *
     * @param ruleId ruleId
     * @return form-create json
     */
    @ApiOperation(value = "getRuleFormCreateJson", notes = "GET_RULE_FORM_CREATE_JSON_NOTES")
    @ApiImplicitParams({
            @ApiImplicitParam(name = "ruleId", value = "RULE_ID", dataType = "Int", example = "1")
    })
    @GetMapping(value = "/getRuleFormCreateJson")
    @ResponseStatus(HttpStatus.OK)
    @ApiException(GET_RULE_FORM_CREATE_JSON_ERROR)
    public Result getRuleFormCreateJsonById(@RequestParam(value = "ruleId") int ruleId) {
        Map<String, Object> result = dqRuleService.getRuleFormCreateJsonById(ruleId);
        return returnDataList(result);
    }

    /**
     * query rule list paging
     *
     * @param loginUser login user
     * @param searchVal search value
     * @param ruleType rule type
     * @param startTime start time
     * @param endTime end time
     * @param pageNo page number
     * @param pageSize page size
     * @return rule page
     */
    @ApiOperation(value = "queryRuleListPaging", notes = "QUERY_RULE_LIST_PAGING_NOTES")
    @ApiImplicitParams({
            @ApiImplicitParam(name = "searchVal", value = "SEARCH_VAL", type = "String"),
            @ApiImplicitParam(name = "ruleType", value = "RULE_TYPE", dataType = "Int", example = "1"),
            @ApiImplicitParam(name = "startDate", value = "START_DATE", type = "String"),
            @ApiImplicitParam(name = "endDate", value = "END_DATE", type = "String"),
            @ApiImplicitParam(name = "pageNo", value = "PAGE_NO", dataType = "Int", example = "1"),
            @ApiImplicitParam(name = "pageSize", value = "PAGE_SIZE", dataType = "Int", example = "10")
    })
    @GetMapping(value = "/rule/page")
    @ResponseStatus(HttpStatus.OK)
    @ApiException(QUERY_RULE_LIST_PAGING_ERROR)
    public Result queryRuleListPaging(@ApiIgnore @RequestAttribute(value = Constants.SESSION_USER) User loginUser,
                                      @RequestParam(value = "searchVal", required = false) String searchVal,
                                      @RequestParam(value = "ruleType", required = false) Integer ruleType,
                                      @RequestParam(value = "startDate", required = false) String startTime,
                                      @RequestParam(value = "endDate", required = false) String endTime,
                                      @RequestParam("pageNo") Integer pageNo,
                                      @RequestParam("pageSize") Integer pageSize) {
        Result result = checkPageParams(pageNo, pageSize);
        if (!result.checkResult()) {
            return result;
        }
        searchVal = ParameterUtils.handleEscapes(searchVal);

        return dqRuleService.queryRuleListPaging(loginUser, searchVal, ruleType, startTime, endTime, pageNo, pageSize);
    }

    /**
     * query all rule list
     *
     * @return rule list
     */
    @ApiOperation(value = "queryRuleList", notes = "QUERY_RULE_LIST_NOTES")
    @GetMapping(value = "/ruleList")
    @ResponseStatus(HttpStatus.OK)
    @ApiException(QUERY_RULE_LIST_ERROR)
    public Result queryRuleList() {
        Map<String, Object> result = dqRuleService.queryAllRuleList();
        return returnDataList(result);
    }

    /**
     * query task execute result list paging
     *
     * @param loginUser loginUser
     * @param searchVal searchVal
     * @param ruleType ruleType
     * @param state state
     * @param startTime startTime
     * @param endTime endTime
     * @param pageNo pageNo
     * @param pageSize pageSize
     * @return execute result page
     */
    @ApiOperation(value = "queryExecuteResultListPaging", notes = "QUERY_EXECUTE_RESULT_LIST_PAGING_NOTES")
    @ApiImplicitParams({
            @ApiImplicitParam(name = "searchVal", value = "SEARCH_VAL", type = "String"),
            @ApiImplicitParam(name = "ruleType", value = "RULE_TYPE", dataType = "Int", example = "1"),
            @ApiImplicitParam(name = "state", value = "STATE", dataType = "Int", example = "1"),
            @ApiImplicitParam(name = "startDate", value = "START_DATE", type = "String"),
            @ApiImplicitParam(name = "endDate", value = "END_DATE", type = "String"),
            @ApiImplicitParam(name = "pageNo", value = "PAGE_NO", dataType = "Int", example = "1"),
            @ApiImplicitParam(name = "pageSize", value = "PAGE_SIZE", dataType = "Int", example = "10")
    })
    @GetMapping(value = "/result/page")
    @ResponseStatus(HttpStatus.OK)
    @ApiException(QUERY_EXECUTE_RESULT_LIST_PAGING_ERROR)
    public Result queryExecuteResultListPaging(@ApiIgnore @RequestAttribute(value = Constants.SESSION_USER) User loginUser,
                                               @RequestParam(value = "searchVal", required = false) String searchVal,
                                               @RequestParam(value = "ruleType", required = false) Integer ruleType,
                                               @RequestParam(value = "state", required = false) Integer state,
                                               @RequestParam(value = "startDate", required = false) String startTime,
                                               @RequestParam(value = "endDate", required = false) String endTime,
                                               @RequestParam("pageNo") Integer pageNo,
                                               @RequestParam("pageSize") Integer pageSize) {
        Result result = checkPageParams(pageNo, pageSize);
        if (!result.checkResult()) {
            return result;
        }
        searchVal = ParameterUtils.handleEscapes(searchVal);

        return dqExecuteResultService.queryResultListPaging(loginUser, searchVal, state, ruleType, startTime, endTime, pageNo, pageSize);
    }

    /**
     * get datasource options by id
     *
     * @param datasourceId datasourceId
     * @return datasource options
     */
    @ApiOperation(value = "getDatasourceOptionsById", notes = "GET_DATASOURCE_OPTIONS_NOTES")
    @ApiImplicitParams({
            @ApiImplicitParam(name = "datasourceId", value = "DATA_SOURCE_ID", dataType = "Int", example = "1")
    })
    @GetMapping(value = "/getDatasourceOptionsById")
    @ResponseStatus(HttpStatus.OK)
    @ApiException(GET_DATASOURCE_OPTIONS_ERROR)
    public Result getDatasourceOptionsById(@RequestParam(value = "datasourceId") int datasourceId) {
        Map<String, Object> result = dqRuleService.getDatasourceOptionsById(datasourceId);
        return returnDataList(result);
    }
}
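Both paging endpoints above return early when `checkPageParams` rejects the page arguments. That method lives in `BaseController` and is not part of this diff; the sketch below is a hypothetical stand-alone version of that kind of guard (class name and message strings are illustrative, not DolphinScheduler's actual implementation):

```java
// Hypothetical sketch of a page-parameter guard in the spirit of
// BaseController.checkPageParams (the real method is not shown in this diff).
public class PageParamCheck {

    /** Returns an error message, or null when pageNo and pageSize are valid. */
    public static String check(Integer pageNo, Integer pageSize) {
        if (pageNo == null || pageNo <= 0) {
            return "pageNo should be a positive integer";
        }
        if (pageSize == null || pageSize <= 0) {
            return "pageSize should be a positive integer";
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(check(0, 10));  // rejected: non-positive page number
        System.out.println(check(1, 10));  // accepted: prints null
    }
}
```

Rejecting bad paging input before touching the service layer keeps the mapper queries from ever seeing a zero or negative limit/offset.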
@@ -0,0 +1,63 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.dto;

import org.apache.dolphinscheduler.dao.entity.DqRuleExecuteSql;
import org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry;

import java.util.List;

/**
 * RuleDefinition
 */
public class RuleDefinition {

    /**
     * rule input entry list
     */
    private List<DqRuleInputEntry> ruleInputEntryList;

    /**
     * rule execute sql list
     */
    private List<DqRuleExecuteSql> executeSqlList;

    public RuleDefinition() {
    }

    public RuleDefinition(List<DqRuleInputEntry> ruleInputEntryList, List<DqRuleExecuteSql> executeSqlList) {
        this.ruleInputEntryList = ruleInputEntryList;
        this.executeSqlList = executeSqlList;
    }

    public List<DqRuleInputEntry> getRuleInputEntryList() {
        return ruleInputEntryList;
    }

    public void setRuleInputEntryList(List<DqRuleInputEntry> ruleInputEntryList) {
        this.ruleInputEntryList = ruleInputEntryList;
    }

    public List<DqRuleExecuteSql> getExecuteSqlList() {
        return executeSqlList;
    }

    public void setExecuteSqlList(List<DqRuleExecuteSql> executeSqlList) {
        this.executeSqlList = executeSqlList;
    }
}
@@ -0,0 +1,35 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service;

import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.dao.entity.User;

/**
 * DqExecuteResultService
 */
public interface DqExecuteResultService {

    Result queryResultListPaging(User loginUser,
                                 String searchVal,
                                 Integer state,
                                 Integer ruleType,
                                 String startTime,
                                 String endTime,
                                 Integer pageNo,
                                 Integer pageSize);
}
@@ -0,0 +1,42 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service;

import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.dao.entity.User;

import java.util.Map;

/**
 * DqRuleService
 */
public interface DqRuleService {

    Map<String, Object> getRuleFormCreateJsonById(int id);

    Map<String, Object> queryAllRuleList();

    Result queryRuleListPaging(User loginUser,
                               String searchVal,
                               Integer ruleType,
                               String startTime,
                               String endTime,
                               Integer pageNo,
                               Integer pageSize);

    Map<String, Object> getDatasourceOptionsById(int datasourceId);
}
@@ -0,0 +1,101 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service.impl;

import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.DqExecuteResultService;
import org.apache.dolphinscheduler.api.utils.PageInfo;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.dao.entity.DqExecuteResult;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.DqExecuteResultMapper;
import org.apache.dolphinscheduler.spi.utils.StringUtils;

import java.util.Date;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;

/**
 * DqExecuteResultServiceImpl
 */
@Service
public class DqExecuteResultServiceImpl extends BaseServiceImpl implements DqExecuteResultService {

    @Autowired
    private DqExecuteResultMapper dqExecuteResultMapper;

    @Override
    public Result queryResultListPaging(User loginUser,
                                        String searchVal,
                                        Integer state,
                                        Integer ruleType,
                                        String startTime,
                                        String endTime,
                                        Integer pageNo,
                                        Integer pageSize) {

        Result result = new Result();
        int[] statusArray = null;
        // filter by state
        if (state != null) {
            statusArray = new int[]{state};
        }

        Date start = null;
        Date end = null;
        try {
            if (StringUtils.isNotEmpty(startTime)) {
                start = DateUtils.getScheduleDate(startTime);
            }
            if (StringUtils.isNotEmpty(endTime)) {
                end = DateUtils.getScheduleDate(endTime);
            }
        } catch (Exception e) {
            putMsg(result, Status.REQUEST_PARAMS_NOT_VALID_ERROR, "startTime,endTime");
            return result;
        }

        Page<DqExecuteResult> page = new Page<>(pageNo, pageSize);
        PageInfo<DqExecuteResult> pageInfo = new PageInfo<>(pageNo, pageSize);

        if (ruleType == null) {
            ruleType = -1;
        }

        IPage<DqExecuteResult> dqsResultPage =
                dqExecuteResultMapper.queryResultListPaging(
                        page,
                        searchVal,
                        loginUser.getId(),
                        statusArray,
                        ruleType,
                        start,
                        end);

        pageInfo.setTotal((int) dqsResultPage.getTotal());
        pageInfo.setTotalList(dqsResultPage.getRecords());
        result.setData(pageInfo);
        putMsg(result, Status.SUCCESS);
        return result;
    }
}
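`queryResultListPaging` above normalizes its optional filters before calling the mapper: a non-null `state` becomes a one-element status array, and empty time bounds stay `null`. The sketch below isolates that logic in plain JDK code; the `yyyy-MM-dd HH:mm:ss` date format is an assumption, since `DateUtils.getScheduleDate` is not part of this diff:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

// Illustrative sketch of the filter normalization done in queryResultListPaging.
// The schedule-date format here is an assumption, not taken from this diff.
public class ResultFilterSketch {

    private static final String SCHEDULE_FORMAT = "yyyy-MM-dd HH:mm:ss";

    /** A null state means "no status filter"; otherwise filter on that single status. */
    public static int[] toStatusArray(Integer state) {
        return state == null ? null : new int[]{state};
    }

    /**
     * Parse an optional time bound; null/empty input yields no bound, and
     * unparsable input is treated as absent (the real service instead returns
     * REQUEST_PARAMS_NOT_VALID_ERROR to the caller).
     */
    public static Date parseTime(String value) {
        if (value == null || value.isEmpty()) {
            return null;
        }
        try {
            return new SimpleDateFormat(SCHEDULE_FORMAT).parse(value);
        } catch (ParseException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(toStatusArray(null));               // no status filter
        System.out.println(toStatusArray(1).length);           // one-element filter
        System.out.println(parseTime("2021-01-01 00:00:00"));  // parsed lower bound
    }
}
```

Keeping the null/empty handling in one place means the mapper's SQL can simply skip the corresponding `WHERE` clauses when a filter is absent.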
@@ -0,0 +1,340 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service.impl;

import static org.apache.dolphinscheduler.common.Constants.DATA_LIST;
import static org.apache.dolphinscheduler.spi.utils.Constants.CHANGE;
import static org.apache.dolphinscheduler.spi.utils.Constants.SMALL;

import org.apache.dolphinscheduler.api.dto.RuleDefinition;
import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.DqRuleService;
import org.apache.dolphinscheduler.api.utils.PageInfo;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.DataSource;
import org.apache.dolphinscheduler.dao.entity.DqComparisonType;
import org.apache.dolphinscheduler.dao.entity.DqRule;
import org.apache.dolphinscheduler.dao.entity.DqRuleExecuteSql;
import org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.DataSourceMapper;
import org.apache.dolphinscheduler.dao.mapper.DqComparisonTypeMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleExecuteSqlMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleInputEntryMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleMapper;
import org.apache.dolphinscheduler.dao.utils.DqRuleUtils;
import org.apache.dolphinscheduler.spi.enums.DbType;
import org.apache.dolphinscheduler.spi.params.base.FormType;
import org.apache.dolphinscheduler.spi.params.base.ParamsOptions;
import org.apache.dolphinscheduler.spi.params.base.PluginParams;
import org.apache.dolphinscheduler.spi.params.base.PropsType;
import org.apache.dolphinscheduler.spi.params.base.Validate;
import org.apache.dolphinscheduler.spi.params.group.GroupParam;
import org.apache.dolphinscheduler.spi.params.group.GroupParamsProps;
import org.apache.dolphinscheduler.spi.params.input.InputParam;
import org.apache.dolphinscheduler.spi.params.input.InputParamProps;
import org.apache.dolphinscheduler.spi.params.select.SelectParam;
import org.apache.dolphinscheduler.spi.task.dq.enums.OptionSourceType;
import org.apache.dolphinscheduler.spi.utils.StringUtils;

import org.apache.commons.collections4.CollectionUtils;

import java.util.ArrayList;
import java.util.Collections;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

/**
 * DqRuleServiceImpl
 */
@Service
public class DqRuleServiceImpl extends BaseServiceImpl implements DqRuleService {

    private final Logger logger = LoggerFactory.getLogger(DqRuleServiceImpl.class);

    @Autowired
    private DqRuleMapper dqRuleMapper;

    @Autowired
    private DqRuleInputEntryMapper dqRuleInputEntryMapper;

    @Autowired
    private DqRuleExecuteSqlMapper dqRuleExecuteSqlMapper;

    @Autowired
    private DataSourceMapper dataSourceMapper;

    @Autowired
    private DqComparisonTypeMapper dqComparisonTypeMapper;

    @Override
    public Map<String, Object> getRuleFormCreateJsonById(int id) {

        Map<String, Object> result = new HashMap<>();

        List<DqRuleInputEntry> ruleInputEntryList = dqRuleInputEntryMapper.getRuleInputEntryList(id);

        if (ruleInputEntryList == null || ruleInputEntryList.isEmpty()) {
            putMsg(result, Status.QUERY_RULE_INPUT_ENTRY_LIST_ERROR);
        } else {
            result.put(DATA_LIST, getRuleFormCreateJson(DqRuleUtils.transformInputEntry(ruleInputEntryList)));
            putMsg(result, Status.SUCCESS);
        }

        return result;
    }

    @Override
    public Map<String, Object> queryAllRuleList() {
        Map<String, Object> result = new HashMap<>();

        List<DqRule> ruleList =
                dqRuleMapper.selectList(new QueryWrapper<>());

        result.put(DATA_LIST, ruleList);
        putMsg(result, Status.SUCCESS);

        return result;
    }

    @Override
    public Map<String, Object> getDatasourceOptionsById(int datasourceId) {
        Map<String, Object> result = new HashMap<>();

        List<DataSource> dataSourceList = dataSourceMapper.listAllDataSourceByType(datasourceId);
        List<ParamsOptions> options = null;
        if (CollectionUtils.isNotEmpty(dataSourceList)) {
            options = new ArrayList<>();

            for (DataSource dataSource : dataSourceList) {
                ParamsOptions childrenOption =
                        new ParamsOptions(dataSource.getName(), dataSource.getId(), false);
                options.add(childrenOption);
            }
        }

        result.put(DATA_LIST, options);
        putMsg(result, Status.SUCCESS);

        return result;
    }

    @Override
    public Result queryRuleListPaging(User loginUser,
                                      String searchVal,
                                      Integer ruleType,
                                      String startTime,
                                      String endTime,
                                      Integer pageNo,
                                      Integer pageSize) {
        Result result = new Result();

        Date start = null;
        Date end = null;
        try {
            if (StringUtils.isNotEmpty(startTime)) {
                start = DateUtils.getScheduleDate(startTime);
            }
            if (StringUtils.isNotEmpty(endTime)) {
                end = DateUtils.getScheduleDate(endTime);
            }
        } catch (Exception e) {
            putMsg(result, Status.REQUEST_PARAMS_NOT_VALID_ERROR, "startTime,endTime");
            return result;
        }

        Page<DqRule> page = new Page<>(pageNo, pageSize);
        PageInfo<DqRule> pageInfo = new PageInfo<>(pageNo, pageSize);

        if (ruleType == null) {
            ruleType = -1;
        }

        IPage<DqRule> dqRulePage =
                dqRuleMapper.queryRuleListPaging(
                        page,
                        searchVal,
                        ruleType,
                        start,
                        end);
        if (dqRulePage != null) {
            List<DqRule> dataList = dqRulePage.getRecords();
            dataList.forEach(dqRule -> {
                List<DqRuleInputEntry> ruleInputEntryList =
                        DqRuleUtils.transformInputEntry(dqRuleInputEntryMapper.getRuleInputEntryList(dqRule.getId()));
                List<DqRuleExecuteSql> ruleExecuteSqlList = dqRuleExecuteSqlMapper.getExecuteSqlList(dqRule.getId());

                RuleDefinition ruleDefinition = new RuleDefinition(ruleInputEntryList, ruleExecuteSqlList);
                dqRule.setRuleJson(JSONUtils.toJsonString(ruleDefinition));
            });

            pageInfo.setTotal((int) dqRulePage.getTotal());
            pageInfo.setTotalList(dataList);
        }

        result.setData(pageInfo);
        putMsg(result, Status.SUCCESS);
        return result;
    }

    private String getRuleFormCreateJson(List<DqRuleInputEntry> ruleInputEntryList) {
        List<PluginParams> params = new ArrayList<>();

        for (DqRuleInputEntry inputEntry : ruleInputEntryList) {
            if (Boolean.TRUE.equals(inputEntry.getShow())) {
                switch (FormType.of(inputEntry.getType())) {
                    case INPUT:
                        params.add(getInputParam(inputEntry));
                        break;
                    case SELECT:
                        params.add(getSelectParam(inputEntry));
                        break;
                    case TEXTAREA:
                        params.add(getTextareaParam(inputEntry));
                        break;
                    case GROUP:
                        params.add(getGroupParam(inputEntry));
                        break;
                    default:
                        break;
                }
            }
        }

        ObjectMapper mapper = new ObjectMapper();
        mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
        String result = null;

        try {
            result = mapper.writeValueAsString(params);
        } catch (JsonProcessingException e) {
            logger.error("json parse error : {}", e.getMessage(), e);
        }

        return result;
    }

    private InputParam getTextareaParam(DqRuleInputEntry inputEntry) {

        InputParamProps paramProps =
                new InputParamProps();
        paramProps.setDisabled(!inputEntry.getCanEdit());
        paramProps.setSize(SMALL);
        paramProps.setType(PropsType.TEXTAREA.getPropsType());
        paramProps.setRows(1);

        return InputParam
                .newBuilder(inputEntry.getField(), inputEntry.getTitle())
                .addValidate(Validate.newBuilder()
                        .setRequired(inputEntry.getValidate())
                        .build())
                .setProps(paramProps)
                .setValue(inputEntry.getValue())
                .setPlaceholder(inputEntry.getPlaceholder())
                .setEmit(Boolean.TRUE.equals(inputEntry.getEmit()) ? Collections.singletonList(CHANGE) : null)
                .build();
    }

    private SelectParam getSelectParam(DqRuleInputEntry inputEntry) {
        List<ParamsOptions> options = null;

        switch (OptionSourceType.of(inputEntry.getOptionSourceType())) {
            case DEFAULT:
case DEFAULT: |
||||
String optionStr = inputEntry.getOptions(); |
||||
if (StringUtils.isNotEmpty(optionStr)) { |
||||
options = JSONUtils.toList(optionStr, ParamsOptions.class); |
||||
} |
||||
break; |
||||
case DATASOURCE_TYPE: |
||||
options = new ArrayList<>(); |
||||
ParamsOptions paramsOptions = null; |
||||
for (DbType dbtype: DbType.values()) { |
||||
paramsOptions = new ParamsOptions(dbtype.name(),dbtype.getCode(),false); |
||||
options.add(paramsOptions); |
||||
} |
||||
break; |
||||
case COMPARISON_TYPE: |
||||
options = new ArrayList<>(); |
||||
ParamsOptions comparisonOptions = null; |
||||
List<DqComparisonType> list = dqComparisonTypeMapper.selectList(new QueryWrapper<DqComparisonType>().orderByAsc("id")); |
||||
|
||||
for (DqComparisonType type: list) { |
||||
comparisonOptions = new ParamsOptions(type.getType(), type.getId(),false); |
||||
options.add(comparisonOptions); |
||||
} |
||||
break; |
||||
default: |
||||
break; |
||||
} |
||||
|
||||
return SelectParam |
||||
.newBuilder(inputEntry.getField(),inputEntry.getTitle()) |
||||
.setOptions(options) |
||||
.setValue(inputEntry.getValue()) |
||||
.setSize(SMALL) |
||||
.setPlaceHolder(inputEntry.getPlaceholder()) |
||||
.setEmit(Boolean.TRUE.equals(inputEntry.getEmit()) ? Collections.singletonList(CHANGE) : null) |
||||
.build(); |
||||
} |
||||
|
||||
private InputParam getInputParam(DqRuleInputEntry inputEntry) { |
||||
InputParamProps paramProps = |
||||
new InputParamProps(); |
||||
paramProps.setDisabled(!inputEntry.getCanEdit()); |
||||
paramProps.setSize(SMALL); |
||||
paramProps.setRows(2); |
||||
|
||||
return InputParam |
||||
.newBuilder(inputEntry.getField(),inputEntry.getTitle()) |
||||
.addValidate(Validate.newBuilder() |
||||
.setRequired(inputEntry.getValidate()) |
||||
.build()) |
||||
.setProps(paramProps) |
||||
.setValue(inputEntry.getValue()) |
||||
.setPlaceholder(inputEntry.getPlaceholder()) |
||||
.setEmit(Boolean.TRUE.equals(inputEntry.getEmit()) ? Collections.singletonList(CHANGE) : null) |
||||
.build(); |
||||
} |
||||
|
||||
private GroupParam getGroupParam(DqRuleInputEntry inputEntry) { |
||||
return GroupParam |
||||
.newBuilder(inputEntry.getField(),inputEntry.getTitle()) |
||||
.addValidate(Validate.newBuilder() |
||||
.setRequired(inputEntry.getValidate()) |
||||
.build()) |
||||
.setProps(new GroupParamsProps().setRules(JSONUtils.toList(inputEntry.getOptions(),PluginParams.class)).setFontSize(20)) |
||||
.setEmit(Boolean.TRUE.equals(inputEntry.getEmit()) ? Collections.singletonList(CHANGE) : null) |
||||
.build(); |
||||
} |
||||
} |
@@ -0,0 +1,180 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.controller;

import static org.mockito.Mockito.when;

import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.impl.DqExecuteResultServiceImpl;
import org.apache.dolphinscheduler.api.service.impl.DqRuleServiceImpl;
import org.apache.dolphinscheduler.api.utils.PageInfo;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.UserType;
import org.apache.dolphinscheduler.dao.entity.DqRule;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.spi.task.dq.enums.RuleType;

import java.text.MessageFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.junit.MockitoJUnitRunner;

/**
 * data quality controller test
 */
@RunWith(MockitoJUnitRunner.Silent.class)
public class DataQualityControllerTest {

    @InjectMocks
    private DataQualityController dataQualityController;

    @Mock
    private DqRuleServiceImpl dqRuleService;

    @Mock
    private DqExecuteResultServiceImpl dqExecuteResultService;

    protected User user;

    @Before
    public void before() {
        User loginUser = new User();
        loginUser.setId(1);
        loginUser.setUserType(UserType.GENERAL_USER);
        loginUser.setUserName("admin");

        user = loginUser;
    }

    @Test
    public void testGetRuleFormCreateJsonById() throws Exception {
        Map<String, Object> result = new HashMap<>();
        putMsg(result, Status.SUCCESS);
        result.put(Constants.DATA_LIST, 1);

        Mockito.when(dqRuleService.getRuleFormCreateJsonById(1)).thenReturn(result);

        Result response = dataQualityController.getRuleFormCreateJsonById(1);
        Assert.assertEquals(Status.SUCCESS.getCode(), response.getCode().intValue());
    }

    private void putMsg(Map<String, Object> result, Status status, Object... statusParams) {
        result.put(Constants.STATUS, status);
        if (statusParams != null && statusParams.length > 0) {
            result.put(Constants.MSG, MessageFormat.format(status.getMsg(), statusParams));
        } else {
            result.put(Constants.MSG, status.getMsg());
        }
    }

    public void putMsg(Result result, Status status, Object... statusParams) {
        result.setCode(status.getCode());
        if (statusParams != null && statusParams.length > 0) {
            result.setMsg(MessageFormat.format(status.getMsg(), statusParams));
        } else {
            result.setMsg(status.getMsg());
        }
    }

    private List<DqRule> getRuleList() {
        List<DqRule> list = new ArrayList<>();
        DqRule rule = new DqRule();
        rule.setId(1);
        rule.setName("空值检测");
        rule.setType(RuleType.SINGLE_TABLE.getCode());
        rule.setUserId(1);
        rule.setUserName("admin");
        rule.setCreateTime(new Date());
        rule.setUpdateTime(new Date());

        list.add(rule);

        return list;
    }

    @Test
    public void testQueryRuleListPaging() throws Exception {
        String searchVal = "";
        int ruleType = 0;
        String start = "2020-01-01 00:00:00";
        String end = "2020-01-02 00:00:00";

        PageInfo<DqRule> pageInfo = new PageInfo<>(1, 10);
        pageInfo.setTotal(10);
        pageInfo.setTotalList(getRuleList());

        Result result = new Result();
        result.setData(pageInfo);
        putMsg(result, Status.SUCCESS);

        when(dqRuleService.queryRuleListPaging(
                user, searchVal, ruleType, start, end, 1, 10)).thenReturn(result);

        Result response = dataQualityController.queryRuleListPaging(user, searchVal, ruleType, start, end, 1, 10);
        Assert.assertEquals(Status.SUCCESS.getCode(), response.getCode().intValue());
    }

    @Test
    public void testQueryRuleList() throws Exception {
        Map<String, Object> result = new HashMap<>();
        putMsg(result, Status.SUCCESS);
        result.put(Constants.DATA_LIST, getRuleList());

        when(dqRuleService.queryAllRuleList()).thenReturn(result);

        Result response = dataQualityController.queryRuleList();
        Assert.assertEquals(Status.SUCCESS.getCode(), response.getCode().intValue());
    }

    @Test
    public void testQueryResultListPaging() throws Exception {
        String searchVal = "";
        int ruleType = 0;
        String start = "2020-01-01 00:00:00";
        String end = "2020-01-02 00:00:00";

        PageInfo<DqRule> pageInfo = new PageInfo<>(1, 10);
        pageInfo.setTotal(10);

        Result result = new Result();
        result.setData(pageInfo);
        putMsg(result, Status.SUCCESS);

        when(dqExecuteResultService.queryResultListPaging(
                user, searchVal, 0, ruleType, start, end, 1, 10)).thenReturn(result);

        Result response = dataQualityController.queryExecuteResultListPaging(user, searchVal, ruleType, 0, start, end, 1, 10);
        Assert.assertEquals(Status.SUCCESS.getCode(), response.getCode().intValue());
    }
}
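The two `putMsg` helpers above fall back to the raw status message when no parameters are supplied and otherwise format it with `java.text.MessageFormat`. A minimal standalone sketch of that pattern (the status template string here is hypothetical, not the real `Status` enum text):

```java
import java.text.MessageFormat;

public class PutMsgFormatDemo {

    // Mirrors the putMsg fallback: format only when parameters are present,
    // otherwise return the template untouched.
    static String format(String template, Object... params) {
        if (params != null && params.length > 0) {
            return MessageFormat.format(template, params);
        }
        return template;
    }

    public static void main(String[] args) {
        // Hypothetical status message template.
        String template = "request parameter {0} is not valid";
        System.out.println(format(template, "startTime,endTime"));
        // prints "request parameter startTime,endTime is not valid"
        System.out.println(format(template));
        // prints "request parameter {0} is not valid"
    }
}
```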
@@ -0,0 +1,96 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service;

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.when;

import org.apache.dolphinscheduler.api.ApiApplicationServer;
import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.impl.DqExecuteResultServiceImpl;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.enums.UserType;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.dao.entity.DqExecuteResult;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.DqExecuteResultMapper;
import org.apache.dolphinscheduler.spi.task.dq.enums.DqTaskState;

import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.test.context.SpringBootTest;

import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;

@RunWith(MockitoJUnitRunner.Silent.class)
@SpringBootTest(classes = ApiApplicationServer.class)
public class DqExecuteResultServiceTest {

    private static final Logger logger = LoggerFactory.getLogger(DqExecuteResultServiceTest.class);

    @InjectMocks
    private DqExecuteResultServiceImpl dqExecuteResultService;

    @Mock
    DqExecuteResultMapper dqExecuteResultMapper;

    @Test
    public void testQueryResultListPaging() {
        String searchVal = "";
        int ruleType = 0;
        Date start = DateUtils.getScheduleDate("2020-01-01 00:00:00");
        Date end = DateUtils.getScheduleDate("2020-01-02 00:00:00");

        User loginUser = new User();
        loginUser.setId(1);
        loginUser.setUserType(UserType.ADMIN_USER);

        Page<DqExecuteResult> page = new Page<>(1, 10);
        page.setTotal(1);
        page.setRecords(getExecuteResultList());
        when(dqExecuteResultMapper.queryResultListPaging(
                any(IPage.class), eq(""), eq(loginUser.getId()), any(), eq(ruleType), eq(start), eq(end))).thenReturn(page);

        Result result = dqExecuteResultService.queryResultListPaging(
                loginUser, searchVal, 1, 0, "2020-01-01 00:00:00", "2020-01-02 00:00:00", 1, 10);
        Assert.assertEquals(Integer.valueOf(Status.SUCCESS.getCode()), result.getCode());
    }

    public List<DqExecuteResult> getExecuteResultList() {
        List<DqExecuteResult> list = new ArrayList<>();
        DqExecuteResult dqExecuteResult = new DqExecuteResult();
        dqExecuteResult.setId(1);
        dqExecuteResult.setState(DqTaskState.FAILURE.getCode());
        list.add(dqExecuteResult);

        return list;
    }
}
@@ -0,0 +1,237 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.api.service;

import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.when;

import org.apache.dolphinscheduler.api.ApiApplicationServer;
import org.apache.dolphinscheduler.api.enums.Status;
import org.apache.dolphinscheduler.api.service.impl.DqRuleServiceImpl;
import org.apache.dolphinscheduler.api.utils.Result;
import org.apache.dolphinscheduler.common.Constants;
import org.apache.dolphinscheduler.common.enums.UserType;
import org.apache.dolphinscheduler.common.utils.DateUtils;
import org.apache.dolphinscheduler.dao.entity.DataSource;
import org.apache.dolphinscheduler.dao.entity.DqRule;
import org.apache.dolphinscheduler.dao.entity.DqRuleExecuteSql;
import org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry;
import org.apache.dolphinscheduler.dao.entity.User;
import org.apache.dolphinscheduler.dao.mapper.DataSourceMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleExecuteSqlMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleInputEntryMapper;
import org.apache.dolphinscheduler.dao.mapper.DqRuleMapper;
import org.apache.dolphinscheduler.spi.enums.DbType;
import org.apache.dolphinscheduler.spi.params.base.FormType;
import org.apache.dolphinscheduler.spi.task.dq.enums.ExecuteSqlType;
import org.apache.dolphinscheduler.spi.task.dq.enums.InputType;
import org.apache.dolphinscheduler.spi.task.dq.enums.OptionSourceType;
import org.apache.dolphinscheduler.spi.task.dq.enums.RuleType;
import org.apache.dolphinscheduler.spi.task.dq.enums.ValueType;

import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Map;

import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.boot.test.context.SpringBootTest;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;

@RunWith(MockitoJUnitRunner.Silent.class)
@SpringBootTest(classes = ApiApplicationServer.class)
public class DqRuleServiceTest {

    @InjectMocks
    private DqRuleServiceImpl dqRuleService;

    @Mock
    DqRuleMapper dqRuleMapper;

    @Mock
    DqRuleInputEntryMapper dqRuleInputEntryMapper;

    @Mock
    DqRuleExecuteSqlMapper dqRuleExecuteSqlMapper;

    @Mock
    DataSourceMapper dataSourceMapper;

    @Test
    public void testGetRuleFormCreateJsonById() {
        String json = "[{\"field\":\"src_connector_type\",\"name\":\"源数据类型\",\"props\":{\"placeholder\":"
                + "\"Please select the source connector type\",\"size\":\"small\"},\"type\":\"select\",\"title\":"
                + "\"源数据类型\",\"value\":\"JDBC\",\"emit\":[\"change\"],\"options\":[{\"label\":\"HIVE\",\"value\":"
                + "\"HIVE\",\"disabled\":false},{\"label\":\"JDBC\",\"value\":\"JDBC\",\"disabled\":false}]},{\"props\":"
                + "{\"disabled\":false,\"rows\":2,\"placeholder\":\"Please enter statistics name, the alias in "
                + "statistics execute sql\",\"size\":\"small\"},\"field\":\"statistics_name\",\"name\":"
                + "\"统计值名\",\"type\":\"input\",\"title\":\"统计值名\",\"validate\":[{\"required\":true,\"type\":"
                + "\"string\",\"trigger\":\"blur\"}]},{\"props\":{\"disabled\":false,\"type\":\"textarea\",\"rows\":"
                + "1,\"placeholder\":\"Please enter the statistics execute sql\",\"size\":\"small\"},\"field\":"
                + "\"statistics_execute_sql\",\"name\":\"统计值计算SQL\",\"type\":\"input\",\"title\":"
                + "\"统计值计算SQL\",\"validate\":[{\"required\":true,\"type\":\"string\",\"trigger\":\"blur\"}]}]";
        when(dqRuleInputEntryMapper.getRuleInputEntryList(1)).thenReturn(getRuleInputEntryList());
        Map<String, Object> result = dqRuleService.getRuleFormCreateJsonById(1);
        Assert.assertEquals(json, result.get(Constants.DATA_LIST));
    }

    @Test
    public void testQueryAllRuleList() {
        when(dqRuleMapper.selectList(new QueryWrapper<>())).thenReturn(getRuleList());
        Map<String, Object> result = dqRuleService.queryAllRuleList();
        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
    }

    @Test
    public void testGetDatasourceOptionsById() {
        when(dataSourceMapper.listAllDataSourceByType(DbType.MYSQL.getCode())).thenReturn(dataSourceList());
        Map<String, Object> result = dqRuleService.queryAllRuleList();
        Assert.assertEquals(Status.SUCCESS, result.get(Constants.STATUS));
    }

    @Test
    public void testQueryRuleListPaging() {
        String searchVal = "";
        int ruleType = 0;
        Date start = DateUtils.getScheduleDate("2020-01-01 00:00:00");
        Date end = DateUtils.getScheduleDate("2020-01-02 00:00:00");

        User loginUser = new User();
        loginUser.setId(1);
        loginUser.setUserType(UserType.ADMIN_USER);

        Page<DqRule> page = new Page<>(1, 10);
        page.setTotal(1);
        page.setRecords(getRuleList());

        when(dqRuleMapper.queryRuleListPaging(
                any(IPage.class), eq(""), eq(ruleType), eq(start), eq(end))).thenReturn(page);

        when(dqRuleInputEntryMapper.getRuleInputEntryList(1)).thenReturn(getRuleInputEntryList());
        when(dqRuleExecuteSqlMapper.getExecuteSqlList(1)).thenReturn(getRuleExecuteSqlList());

        Result result = dqRuleService.queryRuleListPaging(
                loginUser, searchVal, 0, "2020-01-01 00:00:00", "2020-01-02 00:00:00", 1, 10);
        Assert.assertEquals(Integer.valueOf(Status.SUCCESS.getCode()), result.getCode());
    }

    private List<DataSource> dataSourceList() {
        List<DataSource> dataSourceList = new ArrayList<>();
        DataSource dataSource = new DataSource();
        dataSource.setId(1);
        dataSource.setName("dolphinscheduler");
        dataSource.setType(DbType.MYSQL);
        dataSource.setUserId(1);
        dataSource.setUserName("admin");
        dataSource.setConnectionParams("");
        dataSource.setCreateTime(new Date());
        dataSource.setUpdateTime(new Date());
        dataSourceList.add(dataSource);

        return dataSourceList;
    }

    private List<DqRule> getRuleList() {
        List<DqRule> list = new ArrayList<>();
        DqRule rule = new DqRule();
        rule.setId(1);
        rule.setName("空值检测");
        rule.setType(RuleType.SINGLE_TABLE.getCode());
        rule.setUserId(1);
        rule.setUserName("admin");
        rule.setCreateTime(new Date());
        rule.setUpdateTime(new Date());

        list.add(rule);

        return list;
    }

    private List<DqRuleInputEntry> getRuleInputEntryList() {
        List<DqRuleInputEntry> list = new ArrayList<>();

        DqRuleInputEntry srcConnectorType = new DqRuleInputEntry();
        srcConnectorType.setTitle("源数据类型");
        srcConnectorType.setField("src_connector_type");
        srcConnectorType.setType(FormType.SELECT.getFormType());
        srcConnectorType.setCanEdit(true);
        srcConnectorType.setShow(true);
        srcConnectorType.setValue("JDBC");
        srcConnectorType.setPlaceholder("Please select the source connector type");
        srcConnectorType.setOptionSourceType(OptionSourceType.DEFAULT.getCode());
        srcConnectorType.setOptions("[{\"label\":\"HIVE\",\"value\":\"HIVE\"},{\"label\":\"JDBC\",\"value\":\"JDBC\"}]");
        srcConnectorType.setInputType(InputType.DEFAULT.getCode());
        srcConnectorType.setValueType(ValueType.NUMBER.getCode());
        srcConnectorType.setEmit(true);
        srcConnectorType.setValidate(true);

        DqRuleInputEntry statisticsName = new DqRuleInputEntry();
        statisticsName.setTitle("统计值名");
        statisticsName.setField("statistics_name");
        statisticsName.setType(FormType.INPUT.getFormType());
        statisticsName.setCanEdit(true);
        statisticsName.setShow(true);
        statisticsName.setPlaceholder("Please enter statistics name, the alias in statistics execute sql");
        statisticsName.setOptionSourceType(OptionSourceType.DEFAULT.getCode());
        statisticsName.setInputType(InputType.DEFAULT.getCode());
        statisticsName.setValueType(ValueType.STRING.getCode());
        statisticsName.setEmit(false);
        statisticsName.setValidate(true);

        DqRuleInputEntry statisticsExecuteSql = new DqRuleInputEntry();
        statisticsExecuteSql.setTitle("统计值计算SQL");
        statisticsExecuteSql.setField("statistics_execute_sql");
        statisticsExecuteSql.setType(FormType.TEXTAREA.getFormType());
        statisticsExecuteSql.setCanEdit(true);
        statisticsExecuteSql.setShow(true);
        statisticsExecuteSql.setPlaceholder("Please enter the statistics execute sql");
        statisticsExecuteSql.setOptionSourceType(OptionSourceType.DEFAULT.getCode());
        statisticsExecuteSql.setValueType(ValueType.LIKE_SQL.getCode());
        statisticsExecuteSql.setEmit(false);
        statisticsExecuteSql.setValidate(true);

        list.add(srcConnectorType);
        list.add(statisticsName);
        list.add(statisticsExecuteSql);

        return list;
    }

    private List<DqRuleExecuteSql> getRuleExecuteSqlList() {
        List<DqRuleExecuteSql> list = new ArrayList<>();

        DqRuleExecuteSql executeSqlDefinition = new DqRuleExecuteSql();
        executeSqlDefinition.setIndex(0);
        executeSqlDefinition.setSql("SELECT COUNT(*) AS total FROM ${src_table} WHERE (${src_filter})");
        executeSqlDefinition.setTableAlias("total_count");
        executeSqlDefinition.setType(ExecuteSqlType.COMPARISON.getCode());
        list.add(executeSqlDefinition);

        return list;
    }
}
@@ -0,0 +1,103 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.common.task.dq;

import org.apache.dolphinscheduler.common.process.ResourceInfo;
import org.apache.dolphinscheduler.common.task.AbstractParameters;
import org.apache.dolphinscheduler.common.task.spark.SparkParameters;

import org.apache.commons.collections.MapUtils;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * DataQualityParameters
 */
public class DataQualityParameters extends AbstractParameters {

    private static final Logger logger = LoggerFactory.getLogger(DataQualityParameters.class);

    /**
     * rule id
     */
    private int ruleId;
    /**
     * rule input entry value map
     */
    private Map<String, String> ruleInputParameter;
    /**
     * spark parameters
     */
    private SparkParameters sparkParameters;

    public int getRuleId() {
        return ruleId;
    }

    public void setRuleId(int ruleId) {
        this.ruleId = ruleId;
    }

    public Map<String, String> getRuleInputParameter() {
        return ruleInputParameter;
    }

    public void setRuleInputParameter(Map<String, String> ruleInputParameter) {
        this.ruleInputParameter = ruleInputParameter;
    }

    /**
     * Here we check every parameter in more detail;
     * if any parameter is non-conformant this returns false
     * @return boolean result
     */
    @Override
    public boolean checkParameters() {
        if (ruleId == 0) {
            logger.error("rule id is not set");
            return false;
        }

        if (MapUtils.isEmpty(ruleInputParameter)) {
            logger.error("rule input parameter is empty");
            return false;
        }

        return sparkParameters != null;
    }

    @Override
    public List<ResourceInfo> getResourceFilesList() {
        return new ArrayList<>();
    }

    public SparkParameters getSparkParameters() {
        return sparkParameters;
    }

    public void setSparkParameters(SparkParameters sparkParameters) {
        this.sparkParameters = sparkParameters;
    }

}
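The `checkParameters()` contract above can be sketched as a standalone check (a hypothetical minimal version, not the actual class): a rule id of 0, an empty rule input map, or missing Spark settings all fail validation.

```java
import java.util.HashMap;
import java.util.Map;

public class DataQualityParamCheckSketch {

    // Mirrors the three checks in DataQualityParameters.checkParameters().
    static boolean check(int ruleId, Map<String, String> ruleInput, Object sparkParameters) {
        if (ruleId == 0) {
            return false; // rule id not set
        }
        if (ruleInput == null || ruleInput.isEmpty()) {
            return false; // no rule input entries
        }
        return sparkParameters != null; // Spark settings are mandatory
    }

    public static void main(String[] args) {
        Map<String, String> input = new HashMap<>();
        input.put("src_connector_type", "JDBC");
        System.out.println(check(1, input, new Object())); // prints "true"
        System.out.println(check(0, input, new Object())); // prints "false"
    }
}
```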
@@ -0,0 +1,132 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.common.task;

import org.apache.dolphinscheduler.common.task.dq.DataQualityParameters;
import org.apache.dolphinscheduler.common.task.spark.SparkParameters;
import org.apache.dolphinscheduler.spi.params.base.ParamsOptions;
import org.apache.dolphinscheduler.spi.params.base.PluginParams;
import org.apache.dolphinscheduler.spi.params.base.TriggerType;
import org.apache.dolphinscheduler.spi.params.base.Validate;
import org.apache.dolphinscheduler.spi.params.input.InputParam;
import org.apache.dolphinscheduler.spi.params.input.InputParamProps;
import org.apache.dolphinscheduler.spi.params.select.SelectParam;
import org.apache.dolphinscheduler.spi.params.select.SelectParamProps;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

/**
 * DataQualityParameterTest
 */
public class DataQualityParameterTest {

    private DataQualityParameters dataQualityParameters = null;

    @Before
    public void before() {
        dataQualityParameters = new DataQualityParameters();
        dataQualityParameters.setRuleId(1);
        dataQualityParameters.setSparkParameters(new SparkParameters());
    }

    @Test
    public void testCheckParameterNormal() {
        Map<String, String> inputParameterValue = new HashMap<>();
        inputParameterValue.put("src_connector_type", "JDBC");
        inputParameterValue.put("src_datasource_id", "1");
        inputParameterValue.put("src_table", "test1");
        inputParameterValue.put("src_filter", "date=2012-10-05");
        inputParameterValue.put("src_field", "id");

        inputParameterValue.put("rule_type", "1");
        inputParameterValue.put("process_definition_id", "1");
        inputParameterValue.put("task_instance_id", "1");
        inputParameterValue.put("check_type", "1");
        inputParameterValue.put("threshold", "1000");
        inputParameterValue.put("create_time", "2012-10-05");
        inputParameterValue.put("update_time", "2012-10-05");

        dataQualityParameters.setRuleInputParameter(inputParameterValue);

        Assert.assertTrue(dataQualityParameters.checkParameters());
    }

    @Test
    public void testRuleInputParameter() {
        String formCreateJson = "[{\"field\":\"src_connector_type\",\"name\":\"源数据类型\","
                + "\"props\":{\"disabled\":false,\"multiple\":false,\"size\":\"small\"},"
                + "\"type\":\"select\",\"title\":\"源数据类型\",\"value\":\"JDBC\","
                + "\"options\":[{\"label\":\"HIVE\",\"value\":\"HIVE\",\"disabled\":false},"
                + "{\"label\":\"JDBC\",\"value\":\"JDBC\",\"disabled\":false}]},"
                + "{\"props\":{\"disabled\":false,\"rows\":0,\"placeholder\":\"Please enter source table name\","
                + "\"size\":\"small\"},\"field\":\"src_table\",\"name\":\"源数据表\","
||||
+ "\"type\":\"input\",\"title\":\"源数据表\",\"validate\":[{\"required\":true,\"type\":\"string\"," |
||||
+ "\"trigger\":\"blur\"}]}]"; |
||||
|
||||
List<PluginParams> pluginParamsList = new ArrayList<>(); |
||||
SelectParamProps selectParamProps = new SelectParamProps(); |
||||
selectParamProps.setMultiple(false); |
||||
selectParamProps.setDisabled(false); |
||||
selectParamProps.setSize("small"); |
||||
|
||||
SelectParam srcConnectorType = SelectParam.newBuilder("src_connector_type","源数据类型") |
||||
.setProps(selectParamProps) |
||||
.addOptions(new ParamsOptions("HIVE","HIVE",false)) |
||||
.addOptions(new ParamsOptions("JDBC","JDBC",false)) |
||||
.setValue("JDBC") |
||||
.build(); |
||||
|
||||
InputParamProps inputParamProps = new InputParamProps(); |
||||
inputParamProps.setPlaceholder("Please enter source table name"); |
||||
inputParamProps.setDisabled(false); |
||||
inputParamProps.setSize("small"); |
||||
inputParamProps.setRows(0); |
||||
|
||||
InputParam srcTable = InputParam.newBuilder("src_table","源数据表") |
||||
.setProps(inputParamProps) |
||||
.addValidate(Validate.newBuilder().setType("string").setRequired(true).setTrigger(TriggerType.BLUR.getTriggerType()).build()) |
||||
.build(); |
||||
|
||||
pluginParamsList.add(srcConnectorType); |
||||
pluginParamsList.add(srcTable); |
||||
|
||||
ObjectMapper mapper = new ObjectMapper(); |
||||
mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL); |
||||
String result = null; |
||||
|
||||
try { |
||||
result = mapper.writeValueAsString(pluginParamsList); |
||||
} catch (JsonProcessingException e) { |
||||
Assert.fail(); |
||||
} |
||||
|
||||
Assert.assertEquals(formCreateJson,result); |
||||
} |
||||
} |
@ -0,0 +1,151 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import com.fasterxml.jackson.annotation.JsonFormat;

@TableName("t_ds_dq_comparison_type")
public class DqComparisonType implements Serializable {

    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * type
     */
    @TableField(value = "type")
    private String type;
    /**
     * execute sql
     */
    @TableField(value = "execute_sql")
    private String executeSql;
    /**
     * output table
     */
    @TableField(value = "output_table")
    private String outputTable;
    /**
     * comparison name
     */
    @TableField(value = "name")
    private String name;
    /**
     * is inner source
     */
    @TableField(value = "is_inner_source")
    private Boolean isInnerSource;
    /**
     * create_time
     */
    @TableField(value = "create_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date createTime;
    /**
     * update_time
     */
    @TableField(value = "update_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getType() {
        return type;
    }

    public void setType(String type) {
        this.type = type;
    }

    public String getExecuteSql() {
        return executeSql;
    }

    public void setExecuteSql(String executeSql) {
        this.executeSql = executeSql;
    }

    public String getOutputTable() {
        return outputTable;
    }

    public void setOutputTable(String outputTable) {
        this.outputTable = outputTable;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public Boolean getInnerSource() {
        return isInnerSource;
    }

    public void setInnerSource(Boolean innerSource) {
        isInnerSource = innerSource;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    @Override
    public String toString() {
        return "DqComparisonType{"
                + "id=" + id
                + ", type='" + type + '\''
                + ", executeSql='" + executeSql + '\''
                + ", outputTable='" + outputTable + '\''
                + ", name='" + name + '\''
                + ", isInnerSource='" + isInnerSource + '\''
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
@ -0,0 +1,389 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import com.fasterxml.jackson.annotation.JsonFormat;

@TableName("t_ds_dq_execute_result")
public class DqExecuteResult implements Serializable {

    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * process definition id
     */
    @TableField(value = "process_definition_id")
    private long processDefinitionId;
    /**
     * process definition name
     */
    @TableField(exist = false)
    private String processDefinitionName;
    /**
     * process definition code
     */
    @TableField(exist = false)
    private long processDefinitionCode;
    /**
     * process instance id
     */
    @TableField(value = "process_instance_id")
    private long processInstanceId;
    /**
     * process instance name
     */
    @TableField(exist = false)
    private String processInstanceName;
    /**
     * project code
     */
    @TableField(exist = false)
    private long projectCode;
    /**
     * task instance id
     */
    @TableField(value = "task_instance_id")
    private long taskInstanceId;
    /**
     * task name
     */
    @TableField(exist = false)
    private String taskName;
    /**
     * rule type
     */
    @TableField(value = "rule_type")
    private int ruleType;
    /**
     * rule name
     */
    @TableField(value = "rule_name")
    private String ruleName;
    /**
     * statistics value
     */
    @TableField(value = "statistics_value")
    private double statisticsValue;
    /**
     * comparison value
     */
    @TableField(value = "comparison_value")
    private double comparisonValue;
    /**
     * comparison type
     */
    @TableField(value = "comparison_type")
    private int comparisonType;
    /**
     * comparison type name
     */
    @TableField(exist = false)
    private String comparisonTypeName;
    /**
     * check type
     */
    @TableField(value = "check_type")
    private int checkType;
    /**
     * threshold
     */
    @TableField(value = "threshold")
    private double threshold;
    /**
     * operator
     */
    @TableField(value = "operator")
    private int operator;
    /**
     * failure strategy
     */
    @TableField(value = "failure_strategy")
    private int failureStrategy;
    /**
     * user id
     */
    @TableField(value = "user_id")
    private int userId;
    /**
     * user name
     */
    @TableField(exist = false)
    private String userName;
    /**
     * state
     */
    @TableField(value = "state")
    private int state;
    /**
     * error output path
     */
    @TableField(value = "error_output_path")
    private String errorOutputPath;
    /**
     * create_time
     */
    @TableField(value = "create_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date createTime;
    /**
     * update_time
     */
    @TableField(value = "update_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public long getProcessDefinitionId() {
        return processDefinitionId;
    }

    public void setProcessDefinitionId(long processDefinitionId) {
        this.processDefinitionId = processDefinitionId;
    }

    public long getTaskInstanceId() {
        return taskInstanceId;
    }

    public void setTaskInstanceId(long taskInstanceId) {
        this.taskInstanceId = taskInstanceId;
    }

    public long getProcessInstanceId() {
        return processInstanceId;
    }

    public void setProcessInstanceId(long processInstanceId) {
        this.processInstanceId = processInstanceId;
    }

    public String getProcessInstanceName() {
        return processInstanceName;
    }

    public void setProcessInstanceName(String processInstanceName) {
        this.processInstanceName = processInstanceName;
    }

    public long getProjectCode() {
        return projectCode;
    }

    public void setProjectCode(long projectCode) {
        this.projectCode = projectCode;
    }

    public String getRuleName() {
        return ruleName;
    }

    public void setRuleName(String ruleName) {
        this.ruleName = ruleName;
    }

    public double getStatisticsValue() {
        return statisticsValue;
    }

    public void setStatisticsValue(double statisticsValue) {
        this.statisticsValue = statisticsValue;
    }

    public double getComparisonValue() {
        return comparisonValue;
    }

    public void setComparisonValue(double comparisonValue) {
        this.comparisonValue = comparisonValue;
    }

    public double getThreshold() {
        return threshold;
    }

    public void setThreshold(double threshold) {
        this.threshold = threshold;
    }

    public int getOperator() {
        return operator;
    }

    public void setOperator(int operator) {
        this.operator = operator;
    }

    public int getFailureStrategy() {
        return failureStrategy;
    }

    public void setFailureStrategy(int failureStrategy) {
        this.failureStrategy = failureStrategy;
    }

    public int getUserId() {
        return userId;
    }

    public void setUserId(int userId) {
        this.userId = userId;
    }

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    public int getRuleType() {
        return ruleType;
    }

    public void setRuleType(int ruleType) {
        this.ruleType = ruleType;
    }

    public int getCheckType() {
        return checkType;
    }

    public void setCheckType(int checkType) {
        this.checkType = checkType;
    }

    public int getState() {
        return state;
    }

    public void setState(int state) {
        this.state = state;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    public String getProcessDefinitionName() {
        return processDefinitionName;
    }

    public void setProcessDefinitionName(String processDefinitionName) {
        this.processDefinitionName = processDefinitionName;
    }

    public long getProcessDefinitionCode() {
        return processDefinitionCode;
    }

    public void setProcessDefinitionCode(long processDefinitionCode) {
        this.processDefinitionCode = processDefinitionCode;
    }

    public String getTaskName() {
        return taskName;
    }

    public void setTaskName(String taskName) {
        this.taskName = taskName;
    }

    public int getComparisonType() {
        return comparisonType;
    }

    public void setComparisonType(int comparisonType) {
        this.comparisonType = comparisonType;
    }

    public String getComparisonTypeName() {
        return comparisonTypeName;
    }

    public void setComparisonTypeName(String comparisonTypeName) {
        this.comparisonTypeName = comparisonTypeName;
    }

    public String getErrorOutputPath() {
        return errorOutputPath;
    }

    public void setErrorOutputPath(String errorOutputPath) {
        this.errorOutputPath = errorOutputPath;
    }

    @Override
    public String toString() {
        return "DqExecuteResult{"
                + "id=" + id
                + ", processDefinitionId=" + processDefinitionId
                + ", processDefinitionName='" + processDefinitionName + '\''
                + ", processDefinitionCode='" + processDefinitionCode + '\''
                + ", processInstanceId=" + processInstanceId
                + ", processInstanceName='" + processInstanceName + '\''
                + ", projectCode='" + projectCode + '\''
                + ", taskInstanceId=" + taskInstanceId
                + ", taskName='" + taskName + '\''
                + ", ruleType=" + ruleType
                + ", ruleName='" + ruleName + '\''
                + ", statisticsValue=" + statisticsValue
                + ", comparisonValue=" + comparisonValue
                + ", comparisonType=" + comparisonType
                + ", comparisonTypeName=" + comparisonTypeName
                + ", checkType=" + checkType
                + ", threshold=" + threshold
                + ", operator=" + operator
                + ", failureStrategy=" + failureStrategy
                + ", userId=" + userId
                + ", userName='" + userName + '\''
                + ", state=" + state
                + ", errorOutputPath=" + errorOutputPath
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
@ -0,0 +1,257 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import java.io.Serializable;

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.annotation.JsonProperty;

@JsonInclude(Include.NON_NULL)
public class DqExecuteResultAlertContent implements Serializable {

    /**
     * process definition id
     */
    @JsonProperty(value = "processDefinitionId")
    private long processDefinitionId;
    /**
     * process definition name
     */
    @JsonProperty("processDefinitionName")
    private String processDefinitionName;
    /**
     * process instance id
     */
    @JsonProperty(value = "processInstanceId")
    private long processInstanceId;
    /**
     * process instance name
     */
    @JsonProperty("processInstanceName")
    private String processInstanceName;
    /**
     * task instance id
     */
    @JsonProperty(value = "taskInstanceId")
    private long taskInstanceId;
    /**
     * task name
     */
    @JsonProperty("taskName")
    private String taskName;
    /**
     * rule type
     */
    @JsonProperty(value = "ruleType")
    private int ruleType;
    /**
     * rule name
     */
    @JsonProperty(value = "ruleName")
    private String ruleName;
    /**
     * statistics value
     */
    @JsonProperty(value = "statisticsValue")
    private double statisticsValue;
    /**
     * comparison value
     */
    @JsonProperty(value = "comparisonValue")
    private double comparisonValue;
    /**
     * check type
     */
    @JsonProperty(value = "checkType")
    private int checkType;
    /**
     * threshold
     */
    @JsonProperty(value = "threshold")
    private double threshold;
    /**
     * operator
     */
    @JsonProperty(value = "operator")
    private int operator;
    /**
     * failure strategy
     */
    @JsonProperty(value = "failureStrategy")
    private int failureStrategy;
    /**
     * user id
     */
    @JsonProperty(value = "userId")
    private int userId;
    /**
     * user name
     */
    @JsonProperty("userName")
    private String userName;
    /**
     * state
     */
    @JsonProperty(value = "state")
    private int state;

    @JsonProperty(value = "errorDataPath")
    private String errorDataPath;

    public DqExecuteResultAlertContent(Builder builder) {
        this.processDefinitionId = builder.processDefinitionId;
        this.processDefinitionName = builder.processDefinitionName;
        this.processInstanceId = builder.processInstanceId;
        this.processInstanceName = builder.processInstanceName;
        this.taskInstanceId = builder.taskInstanceId;
        this.taskName = builder.taskName;
        this.ruleType = builder.ruleType;
        this.ruleName = builder.ruleName;
        this.statisticsValue = builder.statisticsValue;
        this.comparisonValue = builder.comparisonValue;
        this.checkType = builder.checkType;
        this.threshold = builder.threshold;
        this.operator = builder.operator;
        this.failureStrategy = builder.failureStrategy;
        this.userId = builder.userId;
        this.userName = builder.userName;
        this.state = builder.state;
        this.errorDataPath = builder.errorDataPath;
    }

    public static Builder newBuilder() {
        return new Builder();
    }

    public static class Builder {
        private long processDefinitionId;
        private String processDefinitionName;
        private long processInstanceId;
        private String processInstanceName;
        private long taskInstanceId;
        private String taskName;
        private int ruleType;
        private String ruleName;
        private double statisticsValue;
        private double comparisonValue;
        private int checkType;
        private double threshold;
        private int operator;
        private int failureStrategy;
        private int userId;
        private String userName;
        private int state;
        private String errorDataPath;

        public Builder processDefinitionId(long processDefinitionId) {
            this.processDefinitionId = processDefinitionId;
            return this;
        }

        public Builder processDefinitionName(String processDefinitionName) {
            this.processDefinitionName = processDefinitionName;
            return this;
        }

        public Builder processInstanceId(long processInstanceId) {
            this.processInstanceId = processInstanceId;
            return this;
        }

        public Builder processInstanceName(String processInstanceName) {
            this.processInstanceName = processInstanceName;
            return this;
        }

        public Builder taskInstanceId(long taskInstanceId) {
            this.taskInstanceId = taskInstanceId;
            return this;
        }

        public Builder taskName(String taskName) {
            this.taskName = taskName;
            return this;
        }

        public Builder ruleType(int ruleType) {
            this.ruleType = ruleType;
            return this;
        }

        public Builder ruleName(String ruleName) {
            this.ruleName = ruleName;
            return this;
        }

        public Builder statisticsValue(double statisticsValue) {
            this.statisticsValue = statisticsValue;
            return this;
        }

        public Builder comparisonValue(double comparisonValue) {
            this.comparisonValue = comparisonValue;
            return this;
        }

        public Builder checkType(int checkType) {
            this.checkType = checkType;
            return this;
        }

        public Builder threshold(double threshold) {
            this.threshold = threshold;
            return this;
        }

        public Builder operator(int operator) {
            this.operator = operator;
            return this;
        }

        public Builder failureStrategy(int failureStrategy) {
            this.failureStrategy = failureStrategy;
            return this;
        }

        public Builder userId(int userId) {
            this.userId = userId;
            return this;
        }

        public Builder userName(String userName) {
            this.userName = userName;
            return this;
        }

        public Builder state(int state) {
            this.state = state;
            return this;
        }

        public Builder errorDataPath(String errorDataPath) {
            this.errorDataPath = errorDataPath;
            return this;
        }

        public DqExecuteResultAlertContent build() {
            return new DqExecuteResultAlertContent(this);
        }
    }
}
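The alert-content class above follows the classic static-builder idiom: an immutable-ish value object whose public constructor takes a `Builder` holding every field. A minimal, self-contained sketch of the same pattern is below; `AlertContentSketch` and its three fields are illustrative stand-ins, not the actual DolphinScheduler API, which carries the full field set shown above.

```java
// Minimal stand-in for the DqExecuteResultAlertContent builder pattern.
// Class and field names here are hypothetical; only the idiom matches the source.
public class AlertContentSketch {
    private final String ruleName;
    private final double statisticsValue;
    private final double threshold;

    private AlertContentSketch(Builder builder) {
        // Copy every builder field into the finished object, as the real class does.
        this.ruleName = builder.ruleName;
        this.statisticsValue = builder.statisticsValue;
        this.threshold = builder.threshold;
    }

    public static Builder newBuilder() {
        return new Builder();
    }

    public static class Builder {
        private String ruleName;
        private double statisticsValue;
        private double threshold;

        // Each setter returns this, allowing fluent chaining.
        public Builder ruleName(String ruleName) {
            this.ruleName = ruleName;
            return this;
        }

        public Builder statisticsValue(double statisticsValue) {
            this.statisticsValue = statisticsValue;
            return this;
        }

        public Builder threshold(double threshold) {
            this.threshold = threshold;
            return this;
        }

        public AlertContentSketch build() {
            return new AlertContentSketch(this);
        }
    }

    @Override
    public String toString() {
        return "AlertContentSketch{ruleName='" + ruleName
                + "', statisticsValue=" + statisticsValue
                + ", threshold=" + threshold + '}';
    }
}
```

Callers chain the builder the same way the data quality task presumably assembles an alert, e.g. `AlertContentSketch.newBuilder().ruleName("null_check").statisticsValue(5.0).threshold(10.0).build()`, and the resulting object can then be serialized (in the real class, Jackson's `@JsonInclude(Include.NON_NULL)` drops unset string fields from the alert JSON).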
@ -0,0 +1,147 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;

@TableName("t_ds_dq_rule")
public class DqRule implements Serializable {

    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * name
     */
    @TableField(value = "name")
    private String name;
    /**
     * type
     */
    @TableField(value = "type")
    private int type;
    /**
     * rule definition json, not persisted in this table
     */
    @TableField(exist = false)
    private String ruleJson;
    /**
     * user_id
     */
    @TableField(value = "user_id")
    private int userId;
    /**
     * user_name
     */
    @TableField(exist = false)
    private String userName;
    /**
     * create_time
     */
    @TableField(value = "create_time")
    private Date createTime;
    /**
     * update_time
     */
    @TableField(value = "update_time")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public int getType() {
        return type;
    }

    public void setType(int type) {
        this.type = type;
    }

    public String getRuleJson() {
        return ruleJson;
    }

    public void setRuleJson(String ruleJson) {
        this.ruleJson = ruleJson;
    }

    public int getUserId() {
        return userId;
    }

    public void setUserId(int userId) {
        this.userId = userId;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    @Override
    public String toString() {
        return "DqRule{"
                + "id=" + id
                + ", name='" + name + '\''
                + ", type=" + type
                + ", userId=" + userId
                + ", userName='" + userName + '\''
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
@@ -0,0 +1,156 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import org.apache.dolphinscheduler.spi.task.dq.enums.ExecuteSqlType;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import com.fasterxml.jackson.annotation.JsonFormat;

/**
 * RuleExecuteSql
 */
@TableName("t_ds_dq_rule_execute_sql")
public class DqRuleExecuteSql implements Serializable {
    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * index, ensures the execution order of the SQL statements
     */
    @TableField(value = "index")
    private int index;
    /**
     * SQL statement
     */
    @TableField(value = "sql")
    private String sql;
    /**
     * table alias name
     */
    @TableField(value = "table_alias")
    private String tableAlias;
    /**
     * execute sql type: middle, statistics, comparison
     */
    @TableField(value = "type")
    private int type = ExecuteSqlType.MIDDLE.getCode();
    /**
     * whether this sql is used for error output
     */
    @TableField(value = "is_error_output_sql")
    private boolean isErrorOutputSql;
    /**
     * create_time
     */
    @TableField(value = "create_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date createTime;
    /**
     * update_time
     */
    @TableField(value = "update_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public int getIndex() {
        return index;
    }

    public void setIndex(int index) {
        this.index = index;
    }

    public String getSql() {
        return sql;
    }

    public void setSql(String sql) {
        this.sql = sql;
    }

    public String getTableAlias() {
        return tableAlias;
    }

    public void setTableAlias(String tableAlias) {
        this.tableAlias = tableAlias;
    }

    public int getType() {
        return type;
    }

    public void setType(int type) {
        this.type = type;
    }

    public boolean isErrorOutputSql() {
        return isErrorOutputSql;
    }

    public void setErrorOutputSql(boolean errorOutputSql) {
        isErrorOutputSql = errorOutputSql;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    @Override
    public String toString() {
        return "DqRuleExecuteSql{"
                + "id=" + id
                + ", index=" + index
                + ", sql='" + sql + '\''
                + ", tableAlias='" + tableAlias + '\''
                + ", type=" + type
                + ", isErrorOutputSql=" + isErrorOutputSql
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
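The `type` column above is persisted as the integer code of `ExecuteSqlType` rather than as the enum itself. A minimal self-contained sketch of this code-backed enum pattern, including the reverse lookup a DAO layer needs when decoding the column (the constants and codes here are illustrative, not necessarily the project's actual values):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for a code-backed enum such as ExecuteSqlType.
public enum SqlTypeSketch {
    MIDDLE(0),
    STATISTICS(1),
    COMPARISON(2);

    private final int code;

    SqlTypeSketch(int code) {
        this.code = code;
    }

    public int getCode() {
        return code;
    }

    // Reverse lookup table so a persisted int column can be decoded back to an enum.
    private static final Map<Integer, SqlTypeSketch> BY_CODE = new HashMap<>();

    static {
        for (SqlTypeSketch t : values()) {
            BY_CODE.put(t.getCode(), t);
        }
    }

    public static SqlTypeSketch of(int code) {
        SqlTypeSketch type = BY_CODE.get(code);
        if (type == null) {
            throw new IllegalArgumentException("invalid execute sql type code: " + code);
        }
        return type;
    }
}
```

Storing the code instead of the enum name keeps the schema stable if constants are renamed, at the cost of needing the explicit `of(int)` decoder.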
@@ -0,0 +1,300 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import org.apache.dolphinscheduler.spi.task.dq.enums.InputType;
import org.apache.dolphinscheduler.spi.task.dq.enums.OptionSourceType;
import org.apache.dolphinscheduler.spi.task.dq.enums.ValueType;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import com.fasterxml.jackson.annotation.JsonFormat;

/**
 * RuleInputEntry
 */
@TableName("t_ds_dq_rule_input_entry")
public class DqRuleInputEntry implements Serializable {
    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * form field name
     */
    @TableField(value = "field")
    private String field;
    /**
     * form type
     */
    @TableField(value = "type")
    private String type;
    /**
     * form title
     */
    @TableField(value = "title")
    private String title;
    /**
     * default value, can be null
     */
    @TableField(value = "value")
    private String value;
    /**
     * default options, can be null
     * [{label:"",value:""}]
     */
    @TableField(value = "options")
    private String options;
    /**
     * ${field}
     */
    @TableField(value = "placeholder")
    private String placeholder;
    /**
     * the source type of options: use default options or other
     */
    @TableField(value = "option_source_type")
    private int optionSourceType = OptionSourceType.DEFAULT.getCode();
    /**
     * value type: string, array, number, etc.
     */
    @TableField(value = "value_type")
    private int valueType = ValueType.NUMBER.getCode();
    /**
     * input entry type: default, statistics, comparison
     */
    @TableField(value = "input_type")
    private int inputType = InputType.DEFAULT.getCode();
    /**
     * whether to display on the front end
     */
    @TableField(value = "is_show")
    private Boolean isShow;
    /**
     * whether to edit on the front end
     */
    @TableField(value = "can_edit")
    private Boolean canEdit;
    /**
     * whether to emit an event
     */
    @TableField(value = "is_emit")
    private Boolean isEmit;
    /**
     * whether to validate
     */
    @TableField(value = "is_validate")
    private Boolean isValidate;
    /**
     * values map
     */
    @TableField(exist = false)
    private String valuesMap;

    /**
     * index of the input entry
     */
    @TableField(exist = false)
    private Integer index;
    /**
     * create_time
     */
    @TableField(value = "create_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date createTime;
    /**
     * update_time
     */
    @TableField(value = "update_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getField() {
        return field;
    }

    public void setField(String field) {
        this.field = field;
    }

    public String getType() {
        return type;
    }

    public void setType(String type) {
        this.type = type;
    }

    public String getTitle() {
        return title;
    }

    public void setTitle(String title) {
        this.title = title;
    }

    public String getValue() {
        return value;
    }

    public void setValue(String value) {
        this.value = value;
    }

    public String getOptions() {
        return options;
    }

    public void setOptions(String options) {
        this.options = options;
    }

    public String getPlaceholder() {
        return placeholder;
    }

    public void setPlaceholder(String placeholder) {
        this.placeholder = placeholder;
    }

    public int getOptionSourceType() {
        return optionSourceType;
    }

    public void setOptionSourceType(int optionSourceType) {
        this.optionSourceType = optionSourceType;
    }

    public int getValueType() {
        return valueType;
    }

    public void setValueType(int valueType) {
        this.valueType = valueType;
    }

    public int getInputType() {
        return inputType;
    }

    public void setInputType(int inputType) {
        this.inputType = inputType;
    }

    public Boolean getShow() {
        return isShow;
    }

    public void setShow(Boolean show) {
        isShow = show;
    }

    public Boolean getCanEdit() {
        return canEdit;
    }

    public void setCanEdit(Boolean canEdit) {
        this.canEdit = canEdit;
    }

    public Boolean getEmit() {
        return isEmit;
    }

    public void setEmit(Boolean emit) {
        isEmit = emit;
    }

    public Boolean getValidate() {
        return isValidate;
    }

    public void setValidate(Boolean validate) {
        isValidate = validate;
    }

    public String getValuesMap() {
        return valuesMap;
    }

    public void setValuesMap(String valuesMap) {
        this.valuesMap = valuesMap;
    }

    public Integer getIndex() {
        return index;
    }

    public void setIndex(Integer index) {
        this.index = index;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    @Override
    public String toString() {
        return "DqRuleInputEntry{"
                + "id=" + id
                + ", field='" + field + '\''
                + ", type=" + type
                + ", title='" + title + '\''
                + ", value='" + value + '\''
                + ", options='" + options + '\''
                + ", placeholder='" + placeholder + '\''
                + ", optionSourceType=" + optionSourceType
                + ", valueType=" + valueType
                + ", inputType=" + inputType
                + ", isShow=" + isShow
                + ", canEdit=" + canEdit
                + ", isEmit=" + isEmit
                + ", isValidate=" + isValidate
                + ", valuesMap='" + valuesMap + '\''
                + ", index=" + index
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
@@ -0,0 +1,222 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import java.io.Serializable;
import java.util.Date;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import com.fasterxml.jackson.annotation.JsonFormat;

@TableName("t_ds_dq_task_statistics_value")
public class DqTaskStatisticsValue implements Serializable {
    /**
     * primary key
     */
    @TableId(value = "id", type = IdType.AUTO)
    private int id;
    /**
     * process definition id
     */
    @TableField(value = "process_definition_id")
    private long processDefinitionId;
    /**
     * process definition name
     */
    @TableField(exist = false)
    private String processDefinitionName;
    /**
     * task instance id
     */
    @TableField(value = "task_instance_id")
    private long taskInstanceId;
    /**
     * task name
     */
    @TableField(exist = false)
    private String taskName;
    /**
     * rule id
     */
    @TableField(value = "rule_id")
    private long ruleId;
    /**
     * rule type
     */
    @TableField(exist = false)
    private int ruleType;
    /**
     * rule name
     */
    @TableField(exist = false)
    private String ruleName;
    /**
     * statistics value
     */
    @TableField(value = "statistics_value")
    private double statisticsValue;
    /**
     * statistics name
     */
    @TableField(value = "statistics_name")
    private String statisticsName;
    /**
     * data time
     */
    @TableField(value = "data_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date dataTime;
    /**
     * create time
     */
    @TableField(value = "create_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date createTime;
    /**
     * update time
     */
    @TableField(value = "update_time")
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    private Date updateTime;

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public long getProcessDefinitionId() {
        return processDefinitionId;
    }

    public void setProcessDefinitionId(long processDefinitionId) {
        this.processDefinitionId = processDefinitionId;
    }

    public String getProcessDefinitionName() {
        return processDefinitionName;
    }

    public void setProcessDefinitionName(String processDefinitionName) {
        this.processDefinitionName = processDefinitionName;
    }

    public long getTaskInstanceId() {
        return taskInstanceId;
    }

    public void setTaskInstanceId(long taskInstanceId) {
        this.taskInstanceId = taskInstanceId;
    }

    public String getTaskName() {
        return taskName;
    }

    public void setTaskName(String taskName) {
        this.taskName = taskName;
    }

    public long getRuleId() {
        return ruleId;
    }

    public void setRuleId(long ruleId) {
        this.ruleId = ruleId;
    }

    public int getRuleType() {
        return ruleType;
    }

    public void setRuleType(int ruleType) {
        this.ruleType = ruleType;
    }

    public String getRuleName() {
        return ruleName;
    }

    public void setRuleName(String ruleName) {
        this.ruleName = ruleName;
    }

    public double getStatisticsValue() {
        return statisticsValue;
    }

    public void setStatisticsValue(double statisticsValue) {
        this.statisticsValue = statisticsValue;
    }

    public String getStatisticsName() {
        return statisticsName;
    }

    public void setStatisticsName(String statisticsName) {
        this.statisticsName = statisticsName;
    }

    public Date getDataTime() {
        return dataTime;
    }

    public void setDataTime(Date dataTime) {
        this.dataTime = dataTime;
    }

    public Date getCreateTime() {
        return createTime;
    }

    public void setCreateTime(Date createTime) {
        this.createTime = createTime;
    }

    public Date getUpdateTime() {
        return updateTime;
    }

    public void setUpdateTime(Date updateTime) {
        this.updateTime = updateTime;
    }

    @Override
    public String toString() {
        return "DqTaskStatisticsValue{"
                + "id=" + id
                + ", processDefinitionId=" + processDefinitionId
                + ", processDefinitionName='" + processDefinitionName + '\''
                + ", taskInstanceId=" + taskInstanceId
                + ", taskName='" + taskName + '\''
                + ", ruleId=" + ruleId
                + ", ruleType=" + ruleType
                + ", ruleName='" + ruleName + '\''
                + ", statisticsValue=" + statisticsValue
                + ", statisticsName='" + statisticsName + '\''
                + ", dataTime=" + dataTime
                + ", createTime=" + createTime
                + ", updateTime=" + updateTime
                + '}';
    }
}
@@ -0,0 +1,156 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.entity;

import org.apache.dolphinscheduler.common.enums.ExecutionStatus;

import java.io.Serializable;
import java.util.Date;

import com.fasterxml.jackson.annotation.JsonFormat;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonInclude.Include;
import com.fasterxml.jackson.annotation.JsonProperty;

@JsonInclude(Include.NON_NULL)
public class TaskAlertContent implements Serializable {
    @JsonProperty("taskInstanceId")
    private int taskInstanceId;
    @JsonProperty("taskName")
    private String taskName;
    @JsonProperty("taskType")
    private String taskType;
    @JsonProperty("processDefinitionId")
    private int processDefinitionId;
    @JsonProperty("processDefinitionName")
    private String processDefinitionName;
    @JsonProperty("processInstanceId")
    private int processInstanceId;
    @JsonProperty("processInstanceName")
    private String processInstanceName;
    @JsonProperty("state")
    private ExecutionStatus state;
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    @JsonProperty("startTime")
    private Date startTime;
    @JsonFormat(pattern = "yyyy-MM-dd HH:mm:ss", timezone = "GMT+8")
    @JsonProperty("endTime")
    private Date endTime;
    @JsonProperty("host")
    private String host;
    @JsonProperty("logPath")
    private String logPath;

    private TaskAlertContent(Builder builder) {
        this.taskInstanceId = builder.taskInstanceId;
        this.taskName = builder.taskName;
        this.taskType = builder.taskType;
        this.processDefinitionId = builder.processDefinitionId;
        this.processDefinitionName = builder.processDefinitionName;
        this.processInstanceId = builder.processInstanceId;
        this.processInstanceName = builder.processInstanceName;
        this.state = builder.state;
        this.startTime = builder.startTime;
        this.endTime = builder.endTime;
        this.host = builder.host;
        this.logPath = builder.logPath;
    }

    public static Builder newBuilder() {
        return new Builder();
    }

    public static class Builder {
        private int taskInstanceId;
        private String taskName;
        private String taskType;
        private int processDefinitionId;
        private String processDefinitionName;
        private int processInstanceId;
        private String processInstanceName;
        private ExecutionStatus state;
        private Date startTime;
        private Date endTime;
        private String host;
        private String logPath;

        public Builder taskInstanceId(int taskInstanceId) {
            this.taskInstanceId = taskInstanceId;
            return this;
        }

        public Builder taskName(String taskName) {
            this.taskName = taskName;
            return this;
        }

        public Builder taskType(String taskType) {
            this.taskType = taskType;
            return this;
        }

        public Builder processDefinitionId(int processDefinitionId) {
            this.processDefinitionId = processDefinitionId;
            return this;
        }

        public Builder processDefinitionName(String processDefinitionName) {
            this.processDefinitionName = processDefinitionName;
            return this;
        }

        public Builder processInstanceId(int processInstanceId) {
            this.processInstanceId = processInstanceId;
            return this;
        }

        public Builder processInstanceName(String processInstanceName) {
            this.processInstanceName = processInstanceName;
            return this;
        }

        public Builder state(ExecutionStatus state) {
            this.state = state;
            return this;
        }

        public Builder startTime(Date startTime) {
            this.startTime = startTime;
            return this;
        }

        public Builder endTime(Date endTime) {
            this.endTime = endTime;
            return this;
        }

        public Builder host(String host) {
            this.host = host;
            return this;
        }

        public Builder logPath(String logPath) {
            this.logPath = logPath;
            return this;
        }

        public TaskAlertContent build() {
            return new TaskAlertContent(this);
        }
    }
}
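`TaskAlertContent` is built through a static nested `Builder` whose setters return `this` for fluent chaining, and its only constructor is private, so instances can only come from `build()`. The pattern in isolation looks like this self-contained sketch (a two-field payload for illustration, not the full alert content):

```java
// Illustrative stand-in for the TaskAlertContent builder pattern.
public class AlertContentSketch {
    private final int taskInstanceId;
    private final String taskName;

    // Private constructor: instances are only created via Builder.build().
    private AlertContentSketch(Builder builder) {
        this.taskInstanceId = builder.taskInstanceId;
        this.taskName = builder.taskName;
    }

    public static Builder newBuilder() {
        return new Builder();
    }

    public int getTaskInstanceId() {
        return taskInstanceId;
    }

    public String getTaskName() {
        return taskName;
    }

    public static class Builder {
        private int taskInstanceId;
        private String taskName;

        public Builder taskInstanceId(int taskInstanceId) {
            this.taskInstanceId = taskInstanceId;
            return this; // returning this enables fluent chaining
        }

        public Builder taskName(String taskName) {
            this.taskName = taskName;
            return this;
        }

        public AlertContentSketch build() {
            return new AlertContentSketch(this);
        }
    }
}
```

A caller chains `AlertContentSketch.newBuilder().taskInstanceId(1).taskName("dq-check").build()`; combined with `@JsonInclude(Include.NON_NULL)` on the real class, unset fields are simply dropped from the serialized alert payload.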
@@ -0,0 +1,29 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqComparisonType;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;

/**
 * DqComparisonTypeMapper
 */
public interface DqComparisonTypeMapper extends BaseMapper<DqComparisonType> {

}
@@ -0,0 +1,59 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqExecuteResult;

import org.apache.ibatis.annotations.Param;

import java.util.Date;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.baomidou.mybatisplus.core.metadata.IPage;

/**
 * DqExecuteResultMapper
 */
public interface DqExecuteResultMapper extends BaseMapper<DqExecuteResult> {

    /**
     * data quality task execute result page
     *
     * @param page page
     * @param searchVal searchVal
     * @param userId userId
     * @param statusArray states
     * @param ruleType ruleType
     * @param startTime startTime
     * @param endTime endTime
     * @return data quality task execute result page
     */
    IPage<DqExecuteResult> queryResultListPaging(IPage<DqExecuteResult> page,
                                                 @Param("searchVal") String searchVal,
                                                 @Param("userId") int userId,
                                                 @Param("states") int[] statusArray,
                                                 @Param("ruleType") int ruleType,
                                                 @Param("startTime") Date startTime,
                                                 @Param("endTime") Date endTime);

    /**
     * get execute result by task instance id
     *
     * @param taskInstanceId taskInstanceId
     * @return DqExecuteResult
     */
    DqExecuteResult getExecuteResultById(@Param("taskInstanceId") int taskInstanceId);
}
@@ -0,0 +1,39 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqRuleExecuteSql;

import org.apache.ibatis.annotations.Param;

import java.util.List;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;

/**
 * DqRuleExecuteSqlMapper
 */
public interface DqRuleExecuteSqlMapper extends BaseMapper<DqRuleExecuteSql> {

    /**
     * get execute sql list by rule id
     *
     * @param ruleId ruleId
     * @return execute sql list of the rule
     */
    List<DqRuleExecuteSql> getExecuteSqlList(@Param("ruleId") Integer ruleId);
}
@@ -0,0 +1,39 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry;

import org.apache.ibatis.annotations.Param;

import java.util.List;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;

/**
 * DqRuleInputEntryMapper
 */
public interface DqRuleInputEntryMapper extends BaseMapper<DqRuleInputEntry> {

    /**
     * get rule input entry list by rule id
     *
     * @param ruleId ruleId
     * @return rule input entry list of the rule
     */
    List<DqRuleInputEntry> getRuleInputEntryList(@Param("ruleId") Integer ruleId);
}
@@ -0,0 +1,48 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqRule;

import org.apache.ibatis.annotations.Param;

import java.util.Date;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.baomidou.mybatisplus.core.metadata.IPage;

/**
 * DqRuleMapper
 */
public interface DqRuleMapper extends BaseMapper<DqRule> {

    /**
     * data quality rule page
     *
     * @param page page
     * @param searchVal searchVal
     * @param ruleType ruleType
     * @param startTime startTime
     * @param endTime endTime
     * @return data quality rule page
     */
    IPage<DqRule> queryRuleListPaging(IPage<DqRule> page,
                                      @Param("searchVal") String searchVal,
                                      @Param("ruleType") int ruleType,
                                      @Param("startTime") Date startTime,
                                      @Param("endTime") Date endTime);
}
@@ -0,0 +1,29 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.mapper;

import org.apache.dolphinscheduler.dao.entity.DqTaskStatisticsValue;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;

/**
 * DqTaskStatisticsValueMapper
 */
public interface DqTaskStatisticsValueMapper extends BaseMapper<DqTaskStatisticsValue> {

}
@ -0,0 +1,57 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.dao.utils;

import org.apache.dolphinscheduler.common.utils.JSONUtils;
import org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry;

import java.util.List;
import java.util.Map;

/**
 * DqRuleUtils
 */
public class DqRuleUtils {

    private DqRuleUtils() {
        throw new IllegalStateException("Utility class");
    }

    public static List<DqRuleInputEntry> transformInputEntry(List<DqRuleInputEntry> ruleInputEntryList) {
        for (DqRuleInputEntry dqRuleInputEntry : ruleInputEntryList) {
            Map<String, Object> valuesMap = JSONUtils.toMap(dqRuleInputEntry.getValuesMap(), String.class, Object.class);
            if (valuesMap != null) {

                if (valuesMap.get(dqRuleInputEntry.getField()) != null) {
                    String value = String.valueOf(valuesMap.get(dqRuleInputEntry.getField()));
                    dqRuleInputEntry.setValue(value);
                }

                if (valuesMap.get("is_show") != null) {
                    dqRuleInputEntry.setShow(Boolean.parseBoolean(String.valueOf(valuesMap.get("is_show"))));
                }

                if (valuesMap.get("can_edit") != null) {
                    dqRuleInputEntry.setCanEdit(Boolean.parseBoolean(String.valueOf(valuesMap.get("can_edit"))));
                }
            }
        }

        return ruleInputEntryList;
    }
}
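The override logic above can be illustrated with a self-contained sketch (class and field values below are hypothetical, not part of the PR): a rule's `values_map` JSON can replace an input entry's default value and toggle its `is_show` / `can_edit` flags, using the same null checks and `Boolean.parseBoolean` conversions as `transformInputEntry`.

```java
import java.util.HashMap;
import java.util.Map;

// Standalone illustration of the values_map override precedence in
// DqRuleUtils.transformInputEntry. The map simulates the parsed JSON
// that JSONUtils.toMap would return for one input entry.
public class ValuesMapDemo {
    public static void main(String[] args) {
        Map<String, Object> valuesMap = new HashMap<>();
        valuesMap.put("threshold", 50);      // overrides the entry whose field is "threshold"
        valuesMap.put("is_show", "true");
        valuesMap.put("can_edit", "false");

        String field = "threshold";
        String value = null;                 // entry defaults before the override
        boolean show = false;
        boolean canEdit = true;

        if (valuesMap.get(field) != null) {
            value = String.valueOf(valuesMap.get(field));
        }
        if (valuesMap.get("is_show") != null) {
            show = Boolean.parseBoolean(String.valueOf(valuesMap.get("is_show")));
        }
        if (valuesMap.get("can_edit") != null) {
            canEdit = Boolean.parseBoolean(String.valueOf(valuesMap.get("can_edit")));
        }

        System.out.println(value + " " + show + " " + canEdit); // 50 true false
    }
}
```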
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqComparisonTypeMapper">

</mapper>
@ -0,0 +1,105 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqExecuteResultMapper">
    <select id="queryResultListPaging" resultType="org.apache.dolphinscheduler.dao.entity.DqExecuteResult">
        SELECT a.id,
               a.process_definition_id,
               b.name as process_definition_name,
               b.code as process_definition_code,
               a.process_instance_id,
               e.name as process_instance_name,
               b.project_code,
               a.task_instance_id,
               c.name as task_name,
               a.rule_type,
               a.rule_name,
               a.statistics_value,
               a.comparison_value,
               a.check_type,
               a.threshold,
               cp.type as comparison_type_name,
               a.operator,
               a.failure_strategy,
               a.state,
               a.user_id,
               d.user_name,
               a.error_output_path,
               a.create_time,
               a.update_time
        FROM t_ds_dq_execute_result a
        left join t_ds_process_definition b on a.process_definition_id = b.id
        left join t_ds_task_instance c on a.task_instance_id = c.id
        left join t_ds_process_instance e on a.process_instance_id = e.id
        left join t_ds_user d on d.id = a.user_id
        left join t_ds_dq_comparison_type cp on cp.id = a.comparison_type
        <where>
            <if test="searchVal != null and searchVal != ''">
                and c.name like concat('%', #{searchVal}, '%')
            </if>
            <if test="startTime != null">
                and a.update_time > #{startTime} and a.update_time <![CDATA[ <= ]]> #{endTime}
            </if>
            <if test="states != null and states != ''">
                and a.state in
                <foreach collection="states" index="index" item="i" open="(" separator="," close=")">
                    #{i}
                </foreach>
            </if>
            <if test="userId != 1">
                and a.user_id = #{userId}
            </if>
            <if test="ruleType != -1">
                and a.rule_type = #{ruleType}
            </if>
        </where>
        order by a.update_time desc
    </select>

    <select id="getExecuteResultById" resultType="org.apache.dolphinscheduler.dao.entity.DqExecuteResult">
        SELECT a.id,
               a.process_definition_id,
               a.process_instance_id,
               a.task_instance_id,
               a.rule_type,
               a.rule_name,
               a.statistics_value,
               a.comparison_value,
               a.check_type,
               a.threshold,
               a.operator,
               a.failure_strategy,
               a.state,
               a.user_id,
               a.comparison_type,
               a.error_output_path,
               b.name as process_definition_name,
               e.name as process_instance_name,
               c.name as task_name,
               cp.type as comparison_type_name,
               d.user_name
        FROM t_ds_dq_execute_result a
        left join t_ds_process_definition b on a.process_definition_id = b.id
        left join t_ds_task_instance c on a.task_instance_id = c.id
        left join t_ds_process_instance e on a.process_instance_id = e.id
        left join t_ds_user d on d.id = a.user_id
        left join t_ds_dq_comparison_type cp on cp.id = a.comparison_type
        where task_instance_id = #{taskInstanceId}
    </select>
</mapper>
@ -0,0 +1,27 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqRuleExecuteSqlMapper">

    <select id="getExecuteSqlList" resultType="org.apache.dolphinscheduler.dao.entity.DqRuleExecuteSql">
        SELECT * FROM t_ds_dq_rule_execute_sql a join ( SELECT *
        FROM t_ds_relation_rule_execute_sql where rule_id = #{ruleId}) b
        on a.id = b.execute_sql_id
    </select>
</mapper>
@ -0,0 +1,43 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqRuleInputEntryMapper">

    <select id="getRuleInputEntryList" resultType="org.apache.dolphinscheduler.dao.entity.DqRuleInputEntry">
        SELECT a.id,
               a.field,
               a.`type`,
               a.title,
               a.value,
               a.`options`,
               a.placeholder,
               a.option_source_type,
               a.value_type,
               a.input_type,
               a.is_show,
               a.can_edit,
               a.is_emit,
               a.is_validate,
               b.values_map,
               b.index
        FROM t_ds_dq_rule_input_entry a join ( SELECT *
        FROM t_ds_relation_rule_input_entry where rule_id = #{ruleId} ) b
        on a.id = b.rule_input_entry_id order by b.index
    </select>
</mapper>
@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqRuleMapper">
    <select id="queryRuleListPaging" resultType="org.apache.dolphinscheduler.dao.entity.DqRule">
        SELECT a.id, a.name, a.type, b.user_name, a.create_time, a.update_time
        FROM t_ds_dq_rule a left join t_ds_user b on a.user_id = b.id
        <where>
            <if test="searchVal != null and searchVal != ''">
                and a.name like concat('%', #{searchVal}, '%')
            </if>
            <if test="startTime != null">
                and a.update_time > #{startTime} and a.update_time <![CDATA[ <= ]]> #{endTime}
            </if>
            <if test="ruleType != -1">
                and a.type = #{ruleType}
            </if>
        </where>
        order by a.update_time desc
    </select>
</mapper>
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->

<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd" >
<mapper namespace="org.apache.dolphinscheduler.dao.mapper.DqTaskStatisticsValueMapper">

</mapper>
@ -0,0 +1,203 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one or more
  ~ contributor license agreements. See the NOTICE file distributed with
  ~ this work for additional information regarding copyright ownership.
  ~ The ASF licenses this file to You under the Apache License, Version 2.0
  ~ (the "License"); you may not use this file except in compliance with
  ~ the License. You may obtain a copy of the License at
  ~
  ~     http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing, software
  ~ distributed under the License is distributed on an "AS IS" BASIS,
  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  ~ See the License for the specific language governing permissions and
  ~ limitations under the License.
  -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>dolphinscheduler</artifactId>
        <groupId>org.apache.dolphinscheduler</groupId>
        <version>2.0.4-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>dolphinscheduler-data-quality</artifactId>
    <name>dolphinscheduler-data-quality</name>

    <packaging>jar</packaging>

    <properties>
        <scala.binary.version>2.11</scala.binary.version>
        <spark.version>2.4.0</spark.version>
        <jackson.version>2.9.0</jackson.version>
        <scope>provided</scope>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
            <scope>${scope}</scope>
            <exclusions>
                <exclusion>
                    <artifactId>jackson-module-scala_2.11</artifactId>
                    <groupId>com.fasterxml.jackson.module</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
            <scope>${scope}</scope>
            <exclusions>
                <exclusion>
                    <artifactId>jackson-core</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>commons-httpclient</groupId>
                    <artifactId>commons-httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.httpcomponents</groupId>
                    <artifactId>httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <artifactId>jackson-core-asl</artifactId>
                    <groupId>org.codehaus.jackson</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>jackson-mapper-asl</artifactId>
                    <groupId>org.codehaus.jackson</groupId>
                </exclusion>
            </exclusions>
            <scope>${scope}</scope>
        </dependency>

        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
        </dependency>

        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
        </dependency>

        <dependency>
            <groupId>ru.yandex.clickhouse</groupId>
            <artifactId>clickhouse-jdbc</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>jackson-core</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.microsoft.sqlserver</groupId>
            <artifactId>mssql-jdbc</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>azure-keyvault</artifactId>
                    <groupId>com.microsoft.azure</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.facebook.presto</groupId>
            <artifactId>presto-jdbc</artifactId>
        </dependency>

        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson.version}</version>
            <scope>${scope}</scope>
            <exclusions>
                <exclusion>
                    <artifactId>jackson-core</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.11</artifactId>
            <version>${jackson.version}</version>
            <scope>${scope}</scope>
            <exclusions>
                <exclusion>
                    <artifactId>jackson-core</artifactId>
                    <groupId>com.fasterxml.jackson.core</groupId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.codehaus.janino</groupId>
            <artifactId>janino</artifactId>
            <version>3.0.8</version>
            <scope>${scope}</scope>
        </dependency>

    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.2</version>
                <configuration>
                    <appendAssemblyId>false</appendAssemblyId>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass>org.apache.dolphinscheduler.data.quality.DataQualityApplication</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>assembly</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
@ -0,0 +1,62 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality;

/**
 * Constants
 */
public final class Constants {

    private Constants() {
        throw new IllegalStateException("Construct Constants");
    }

    public static final String DATABASE = "database";

    public static final String TABLE = "table";

    public static final String URL = "url";

    public static final String USER = "user";

    public static final String PASSWORD = "password";

    public static final String DRIVER = "driver";

    public static final String EMPTY = "";

    public static final String SQL = "sql";

    public static final String DOTS = ".";

    public static final String INPUT_TABLE = "input_table";

    public static final String OUTPUT_TABLE = "output_table";

    public static final String TMP_TABLE = "tmp_table";

    public static final String DB_TABLE = "dbtable";

    public static final String JDBC = "jdbc";

    public static final String SAVE_MODE = "save_mode";

    public static final String APPEND = "append";

    public static final String SPARK_APP_NAME = "spark.app.name";
}
@ -0,0 +1,72 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality;

import static org.apache.dolphinscheduler.data.quality.Constants.SPARK_APP_NAME;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.DataQualityConfiguration;
import org.apache.dolphinscheduler.data.quality.config.EnvConfig;
import org.apache.dolphinscheduler.data.quality.context.DataQualityContext;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.utils.JsonUtils;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Strings;

/**
 * DataQualityApplication is the Spark application entry point.
 * It wires together three components: reader, transformer and writer,
 * which connect to the source data, execute the intermediate SQL,
 * and write execution results and error data to the specified storage engine.
 */
public class DataQualityApplication {

    private static final Logger logger = LoggerFactory.getLogger(DataQualityApplication.class);

    public static void main(String[] args) throws Exception {

        if (args.length < 1) {
            logger.error("Cannot find DataQualityConfiguration");
            System.exit(-1);
        }

        String dataQualityParameter = args[0];

        DataQualityConfiguration dataQualityConfiguration = JsonUtils.fromJson(dataQualityParameter, DataQualityConfiguration.class);
        if (dataQualityConfiguration == null) {
            logger.error("DataQualityConfiguration is null");
            System.exit(-1);
        } else {
            dataQualityConfiguration.validate();
        }

        EnvConfig envConfig = dataQualityConfiguration.getEnvConfig();
        Config config = new Config(envConfig.getConfig());
        config.put("type", envConfig.getType());
        if (Strings.isNullOrEmpty(config.getString(SPARK_APP_NAME))) {
            config.put(SPARK_APP_NAME, dataQualityConfiguration.getName());
        }

        SparkRuntimeEnvironment sparkRuntimeEnvironment = new SparkRuntimeEnvironment(config);
        DataQualityContext dataQualityContext = new DataQualityContext(sparkRuntimeEnvironment, dataQualityConfiguration);
        dataQualityContext.execute();
    }
}
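For reference, the string passed as `args[0]` deserializes into `DataQualityConfiguration`. Based on the `@JsonProperty` names in this PR (`name`, `env`, `readers`, `transformers`, `writers`, each with `type`/`config`) and the keys defined in `Constants`, a minimal configuration might look like the following sketch; all concrete values (URLs, table names, SQL) are illustrative, not taken from the PR:

```json
{
    "name": "dq_null_check",
    "env": {
        "type": "batch",
        "config": null
    },
    "readers": [
        {
            "type": "JDBC",
            "config": {
                "database": "test",
                "table": "person",
                "url": "jdbc:mysql://localhost:3306/test",
                "user": "test",
                "password": "test",
                "output_table": "test_person"
            }
        }
    ],
    "transformers": [
        {
            "type": "sql",
            "config": {
                "sql": "SELECT COUNT(*) AS nulls FROM test_person WHERE name IS NULL",
                "output_table": "null_items"
            }
        }
    ],
    "writers": [
        {
            "type": "JDBC",
            "config": {
                "database": "dolphinscheduler",
                "table": "t_ds_dq_execute_result",
                "url": "jdbc:mysql://localhost:3306/dolphinscheduler",
                "user": "test",
                "password": "test",
                "save_mode": "append"
            }
        }
    ]
}
```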
@ -0,0 +1,66 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.Map;

import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.Preconditions;
import com.google.common.base.Strings;

/**
 * BaseConfig
 */
public class BaseConfig implements IConfig {

    @JsonProperty("type")
    private String type;

    @JsonProperty("config")
    private Map<String, Object> config;

    public BaseConfig() {
    }

    public BaseConfig(String type, Map<String, Object> config) {
        this.type = type;
        this.config = config;
    }

    public String getType() {
        return type;
    }

    public void setType(String type) {
        this.type = type;
    }

    public Map<String, Object> getConfig() {
        return config;
    }

    public void setConfig(Map<String, Object> config) {
        this.config = config;
    }

    @Override
    public void validate() {
        Preconditions.checkArgument(!Strings.isNullOrEmpty(type), "type should not be empty");
        Preconditions.checkArgument(config != null, "config should not be null");
    }
}
@ -0,0 +1,94 @@
|
||||
/* |
||||
* Licensed to the Apache Software Foundation (ASF) under one or more |
||||
* contributor license agreements. See the NOTICE file distributed with |
||||
* this work for additional information regarding copyright ownership. |
||||
* The ASF licenses this file to You under the Apache License, Version 2.0 |
||||
* (the "License"); you may not use this file except in compliance with |
||||
* the License. You may obtain a copy of the License at |
||||
* |
||||
* http://www.apache.org/licenses/LICENSE-2.0
|
||||
* |
||||
* Unless required by applicable law or agreed to in writing, software |
||||
* distributed under the License is distributed on an "AS IS" BASIS, |
||||
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;

/**
 * Config
 */
public class Config {

    private Map<String, Object> configuration = new HashMap<>();

    public Config() {
    }

    public Config(Map<String, Object> configuration) {
        if (configuration != null) {
            this.configuration = configuration;
        }
    }

    public String getString(String key) {
        return configuration.get(key) == null ? null : String.valueOf(configuration.get(key));
    }

    @SuppressWarnings("unchecked")
    public List<String> getStringList(String key) {
        return (List<String>) configuration.get(key);
    }

    public Integer getInt(String key) {
        return Integer.valueOf(String.valueOf(configuration.get(key)));
    }

    public Boolean getBoolean(String key) {
        return Boolean.valueOf(String.valueOf(configuration.get(key)));
    }

    public Double getDouble(String key) {
        return Double.valueOf(String.valueOf(configuration.get(key)));
    }

    public Long getLong(String key) {
        return Long.valueOf(String.valueOf(configuration.get(key)));
    }

    public Boolean has(String key) {
        return configuration.get(key) != null;
    }

    public Set<Entry<String, Object>> entrySet() {
        return configuration.entrySet();
    }

    public boolean isEmpty() {
        return configuration.isEmpty();
    }

    public boolean isNotEmpty() {
        return !configuration.isEmpty();
    }

    public void put(String key, Object value) {
        this.configuration.put(key, value);
    }

    public void merge(Map<String, Object> configuration) {
        configuration.forEach(this.configuration::putIfAbsent);
    }

    public Map<String, Object> configurationMap() {
        return this.configuration;
    }
}
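A minimal standalone sketch of the `merge` semantics above, using only `java.util` rather than the project's `Config` class (the class name `MergeDemo` is hypothetical): because `merge` delegates to `putIfAbsent`, keys already present in the configuration win over merged-in defaults.

```java
import java.util.HashMap;
import java.util.Map;

public class MergeDemo {
    public static void main(String[] args) {
        Map<String, Object> config = new HashMap<>();
        config.put("type", "batch");

        Map<String, Object> defaults = new HashMap<>();
        defaults.put("type", "stream");           // already present: existing "batch" is kept
        defaults.put("spark.app.name", "dq-app"); // absent: filled in from defaults

        // Config.merge(other) does exactly this for each entry of other
        defaults.forEach(config::putIfAbsent);

        System.out.println(config.get("type"));           // batch
        System.out.println(config.get("spark.app.name")); // dq-app
    }
}
```

This makes `merge` suitable for layering default settings under user-supplied configuration without clobbering explicit values.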
@@ -0,0 +1,132 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.List;

import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.Preconditions;
import com.google.common.base.Strings;

/**
 * DataQualityConfiguration
 */
public class DataQualityConfiguration implements IConfig {

    @JsonProperty("name")
    private String name;

    @JsonProperty("env")
    private EnvConfig envConfig;

    @JsonProperty("readers")
    private List<ReaderConfig> readerConfigs;

    @JsonProperty("transformers")
    private List<TransformerConfig> transformerConfigs;

    @JsonProperty("writers")
    private List<WriterConfig> writerConfigs;

    public DataQualityConfiguration() {
    }

    public DataQualityConfiguration(String name,
                                    EnvConfig envConfig,
                                    List<ReaderConfig> readerConfigs,
                                    List<WriterConfig> writerConfigs,
                                    List<TransformerConfig> transformerConfigs) {
        this.name = name;
        this.envConfig = envConfig;
        this.readerConfigs = readerConfigs;
        this.writerConfigs = writerConfigs;
        this.transformerConfigs = transformerConfigs;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public EnvConfig getEnvConfig() {
        return envConfig;
    }

    public void setEnvConfig(EnvConfig envConfig) {
        this.envConfig = envConfig;
    }

    public List<ReaderConfig> getReaderConfigs() {
        return readerConfigs;
    }

    public void setReaderConfigs(List<ReaderConfig> readerConfigs) {
        this.readerConfigs = readerConfigs;
    }

    public List<TransformerConfig> getTransformerConfigs() {
        return transformerConfigs;
    }

    public void setTransformerConfigs(List<TransformerConfig> transformerConfigs) {
        this.transformerConfigs = transformerConfigs;
    }

    public List<WriterConfig> getWriterConfigs() {
        return writerConfigs;
    }

    public void setWriterConfigs(List<WriterConfig> writerConfigs) {
        this.writerConfigs = writerConfigs;
    }

    @Override
    public void validate() {
        Preconditions.checkArgument(!Strings.isNullOrEmpty(name), "name should not be empty");

        Preconditions.checkArgument(envConfig != null, "env config should not be empty");

        Preconditions.checkArgument(readerConfigs != null, "reader config should not be empty");
        for (ReaderConfig readerConfig : readerConfigs) {
            readerConfig.validate();
        }

        Preconditions.checkArgument(transformerConfigs != null, "transform config should not be empty");
        for (TransformerConfig transformerConfig : transformerConfigs) {
            transformerConfig.validate();
        }

        Preconditions.checkArgument(writerConfigs != null, "writer config should not be empty");
        for (WriterConfig writerConfig : writerConfigs) {
            writerConfig.validate();
        }
    }

    @Override
    public String toString() {
        return "DataQualityConfiguration{"
                + "name='" + name + '\''
                + ", envConfig=" + envConfig
                + ", readerConfigs=" + readerConfigs
                + ", transformerConfigs=" + transformerConfigs
                + ", writerConfigs=" + writerConfigs
                + '}';
    }
}
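From the `@JsonProperty` names above and the `(String type, Map<String, Object> config)` shape of the reader/transformer/writer configs, a JSON document that deserializes into `DataQualityConfiguration` would look roughly like this. The rule name, connector types, and option keys are illustrative only; `input_table`/`output_table` follow the constants referenced elsewhere in this PR.

```json
{
  "name": "null_check",
  "env": {
    "type": "batch",
    "config": null
  },
  "readers": [
    { "type": "JDBC", "config": { "output_table": "src_table" } }
  ],
  "transformers": [
    { "type": "SQL", "config": { "input_table": "src_table", "output_table": "tmp_table" } }
  ],
  "writers": [
    { "type": "JDBC", "config": { "input_table": "tmp_table" } }
  ]
}
```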
@@ -0,0 +1,34 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.Map;

/**
 * EnvConfig
 */
public class EnvConfig extends BaseConfig {

    public EnvConfig() {
    }

    public EnvConfig(String type, Map<String, Object> config) {
        super(type, config);
    }
}
@@ -0,0 +1,29 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

/**
 * IConfig
 */
public interface IConfig {

    /**
     * check whether the configuration parameters are valid
     */
    void validate();
}
@@ -0,0 +1,32 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.Map;

/**
 * ReaderConfig
 */
public class ReaderConfig extends BaseConfig {

    public ReaderConfig() {
    }

    public ReaderConfig(String type, Map<String, Object> config) {
        super(type, config);
    }
}
@@ -0,0 +1,32 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.Map;

/**
 * TransformerConfig
 */
public class TransformerConfig extends BaseConfig {

    public TransformerConfig() {
    }

    public TransformerConfig(String type, Map<String, Object> config) {
        super(type, config);
    }
}
@@ -0,0 +1,46 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

/**
 * ValidateResult
 */
public class ValidateResult {

    private boolean success;

    private String msg;

    public ValidateResult(boolean success, String msg) {
        this.success = success;
        this.msg = msg;
    }

    public boolean isSuccess() {
        return success;
    }

    public void setSuccess(boolean success) {
        this.success = success;
    }

    public String getMsg() {
        return msg;
    }

    public void setMsg(String msg) {
        this.msg = msg;
    }
}
@@ -0,0 +1,32 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.config;

import java.util.Map;

/**
 * WriterConfig
 */
public class WriterConfig extends BaseConfig {

    public WriterConfig() {
    }

    public WriterConfig(String type, Map<String, Object> config) {
        super(type, config);
    }
}
@@ -0,0 +1,67 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.context;

import org.apache.dolphinscheduler.data.quality.config.DataQualityConfiguration;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchTransformer;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;
import org.apache.dolphinscheduler.data.quality.flow.batch.reader.ReaderFactory;
import org.apache.dolphinscheduler.data.quality.flow.batch.transformer.TransformerFactory;
import org.apache.dolphinscheduler.data.quality.flow.batch.writer.WriterFactory;

import java.util.List;

/**
 * DataQualityContext
 */
public class DataQualityContext {

    private SparkRuntimeEnvironment sparkRuntimeEnvironment;

    private DataQualityConfiguration dataQualityConfiguration;

    public DataQualityContext() {
    }

    public DataQualityContext(SparkRuntimeEnvironment sparkRuntimeEnvironment,
                              DataQualityConfiguration dataQualityConfiguration) {
        this.sparkRuntimeEnvironment = sparkRuntimeEnvironment;
        this.dataQualityConfiguration = dataQualityConfiguration;
    }

    public void execute() throws DataQualityException {
        List<BatchReader> readers = ReaderFactory
                .getInstance()
                .getReaders(this.sparkRuntimeEnvironment, dataQualityConfiguration.getReaderConfigs());
        List<BatchTransformer> transformers = TransformerFactory
                .getInstance()
                .getTransformer(this.sparkRuntimeEnvironment, dataQualityConfiguration.getTransformerConfigs());
        List<BatchWriter> writers = WriterFactory
                .getInstance()
                .getWriters(this.sparkRuntimeEnvironment, dataQualityConfiguration.getWriterConfigs());

        if (sparkRuntimeEnvironment.isBatch()) {
            sparkRuntimeEnvironment.getBatchExecution().execute(readers, transformers, writers);
        } else {
            throw new DataQualityException("stream mode is not supported now");
        }
    }
}
@@ -0,0 +1,40 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.enums;

/**
 * ReaderType
 */
public enum ReaderType {
    /**
     * JDBC
     * HIVE
     */
    JDBC,
    HIVE;

    public static ReaderType getType(String name) {
        for (ReaderType type : ReaderType.values()) {
            if (type.name().equalsIgnoreCase(name)) {
                return type;
            }
        }

        return null;
    }
}
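A standalone re-statement of the lookup pattern shared by the `ReaderType`, `TransformerType`, and `WriterType` enums (the class name `ReaderTypeDemo` is hypothetical; the real enum lives in the package above): `getType` matches case-insensitively, so user-supplied config may say `"jdbc"`, `"JDBC"`, or `"Jdbc"`, and an unknown name yields `null` rather than an exception.

```java
public class ReaderTypeDemo {

    enum ReaderType { JDBC, HIVE }

    // Mirrors ReaderType.getType: case-insensitive match, null when unknown
    static ReaderType getType(String name) {
        for (ReaderType type : ReaderType.values()) {
            if (type.name().equalsIgnoreCase(name)) {
                return type;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(getType("hive"));  // HIVE
        System.out.println(getType("Jdbc"));  // JDBC
        System.out.println(getType("kafka")); // null
    }
}
```

Returning `null` for unknown names means the factories that call `getType` must handle the missing case themselves; throwing `IllegalArgumentException` would be the stricter alternative.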
@@ -0,0 +1,38 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.enums;

/**
 * TransformerType
 */
public enum TransformerType {
    /**
     * SQL
     */
    SQL;

    public static TransformerType getType(String name) {
        for (TransformerType type : TransformerType.values()) {
            if (type.name().equalsIgnoreCase(name)) {
                return type;
            }
        }

        return null;
    }
}
@@ -0,0 +1,40 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.enums;

/**
 * WriterType
 */
public enum WriterType {
    /**
     * JDBC
     * LOCAL_FILE
     * HDFS_FILE
     */
    JDBC,
    LOCAL_FILE,
    HDFS_FILE;

    public static WriterType getType(String name) {
        for (WriterType type : WriterType.values()) {
            if (type.name().equalsIgnoreCase(name)) {
                return type;
            }
        }

        return null;
    }
}
@@ -0,0 +1,40 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.exception;

/**
 * ConfigRuntimeException
 */
public class ConfigRuntimeException extends RuntimeException {

    public ConfigRuntimeException() {
        super();
    }

    public ConfigRuntimeException(String message) {
        super(message);
    }

    public ConfigRuntimeException(String message, Throwable cause) {
        super(message, cause);
    }

    public ConfigRuntimeException(Throwable cause) {
        super(cause);
    }
}
@@ -0,0 +1,57 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.exception;

/**
 * data quality exception
 */
public class DataQualityException extends Exception {

    public DataQualityException() {
        super();
    }

    /**
     * Construct a new exception with the detail message
     *
     * @param message detail message
     */
    public DataQualityException(String message) {
        super(message);
    }

    /**
     * Construct a new exception with the detail message and cause
     *
     * @param message the detail message
     * @param cause the cause
     */
    public DataQualityException(String message, Throwable cause) {
        super(message, cause);
    }

    /**
     * Construct a new exception with the cause
     *
     * @param cause the cause
     */
    public DataQualityException(Throwable cause) {
        super(cause);
    }
}
@@ -0,0 +1,35 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.execution;

import org.apache.dolphinscheduler.data.quality.flow.Component;

import java.util.List;

/**
 * Execution
 */
public interface Execution<R extends Component, T extends Component, W extends Component> {

    /**
     * execute
     * @param readers readers
     * @param transformers transformers
     * @param writers writers
     */
    void execute(List<R> readers, List<T> transformers, List<W> writers);
}
@@ -0,0 +1,132 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.execution;

import static org.apache.dolphinscheduler.data.quality.Constants.INPUT_TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.OUTPUT_TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.TMP_TABLE;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.exception.ConfigRuntimeException;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchTransformer;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.List;

/**
 * SparkBatchExecution is responsible for executing readers, transformers and writers
 */
public class SparkBatchExecution implements Execution<BatchReader, BatchTransformer, BatchWriter> {

    private final SparkRuntimeEnvironment environment;

    public SparkBatchExecution(SparkRuntimeEnvironment environment) throws ConfigRuntimeException {
        this.environment = environment;
    }

    @Override
    public void execute(List<BatchReader> readers, List<BatchTransformer> transformers, List<BatchWriter> writers) {
        readers.forEach(reader -> registerInputTempView(reader, environment));

        if (!readers.isEmpty()) {
            Dataset<Row> ds = readers.get(0).read(environment);
            for (BatchTransformer tf : transformers) {
                ds = executeTransformer(environment, tf, ds);
                registerTransformTempView(tf, ds);
            }

            for (BatchWriter sink : writers) {
                executeWriter(environment, sink, ds);
            }
        }

        environment.sparkSession().stop();
    }

    private void registerTempView(String tableName, Dataset<Row> ds) {
        if (ds != null) {
            ds.createOrReplaceTempView(tableName);
        } else {
            throw new ConfigRuntimeException("dataset is null, can not createOrReplaceTempView");
        }
    }

    private void registerInputTempView(BatchReader reader, SparkRuntimeEnvironment environment) {
        Config conf = reader.getConfig();
        if (Boolean.TRUE.equals(conf.has(OUTPUT_TABLE))) {
            String tableName = conf.getString(OUTPUT_TABLE);
            registerTempView(tableName, reader.read(environment));
        } else {
            throw new ConfigRuntimeException(
                    "[" + reader.getClass().getName() + "] must be registered as dataset, please set \"output_table\" config");
        }
    }

    private Dataset<Row> executeTransformer(SparkRuntimeEnvironment environment, BatchTransformer transformer, Dataset<Row> dataset) {
        Config config = transformer.getConfig();
        Dataset<Row> inputDataset;
        Dataset<Row> outputDataset = null;
        if (Boolean.TRUE.equals(config.has(INPUT_TABLE))) {
            String[] tableNames = config.getString(INPUT_TABLE).split(",");

            for (String sourceTableName : tableNames) {
                inputDataset = environment.sparkSession().read().table(sourceTableName);

                if (outputDataset == null) {
                    outputDataset = inputDataset;
                } else {
                    outputDataset = outputDataset.union(inputDataset);
                }
            }
        } else {
            outputDataset = dataset;
        }

        if (Boolean.TRUE.equals(config.has(TMP_TABLE))) {
            if (outputDataset == null) {
                outputDataset = dataset;
            }
            String tableName = config.getString(TMP_TABLE);
            registerTempView(tableName, outputDataset);
        }

        return transformer.transform(outputDataset, environment);
    }

    private void registerTransformTempView(BatchTransformer transformer, Dataset<Row> ds) {
        Config config = transformer.getConfig();
        if (Boolean.TRUE.equals(config.has(OUTPUT_TABLE))) {
            String tableName = config.getString(OUTPUT_TABLE);
            registerTempView(tableName, ds);
        }
    }

    private void executeWriter(SparkRuntimeEnvironment environment, BatchWriter writer, Dataset<Row> ds) {
        Config config = writer.getConfig();
        Dataset<Row> inputDataSet = ds;
        if (Boolean.TRUE.equals(config.has(INPUT_TABLE))) {
            String sourceTableName = config.getString(INPUT_TABLE);
            inputDataSet = environment.sparkSession().read().table(sourceTableName);
        }
        writer.write(inputDataSet, environment);
    }
}
@@ -0,0 +1,72 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.execution;

import org.apache.dolphinscheduler.data.quality.config.Config;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

/**
 * The SparkRuntimeEnvironment is responsible for creating the SparkSession and the SparkBatchExecution
 */
public class SparkRuntimeEnvironment {

    private static final String TYPE = "type";
    private static final String BATCH = "batch";

    private SparkSession sparkSession;

    private Config config = new Config();

    public SparkRuntimeEnvironment(Config config) {
        if (config != null) {
            this.config = config;
        }

        this.prepare();
    }

    public Config getConfig() {
        return this.config;
    }

    public void prepare() {
        sparkSession = SparkSession.builder().config(createSparkConf()).getOrCreate();
    }

    private SparkConf createSparkConf() {
        SparkConf conf = new SparkConf();
        this.config.entrySet()
                .forEach(entry -> conf.set(entry.getKey(), String.valueOf(entry.getValue())));
        conf.set("spark.sql.crossJoin.enabled", "true");
        return conf;
    }

    public SparkSession sparkSession() {
        return sparkSession;
    }

    public boolean isBatch() {
        return BATCH.equalsIgnoreCase(config.getString(TYPE));
    }

    public SparkBatchExecution getBatchExecution() {
        return new SparkBatchExecution(this);
    }
}
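For context, the conversion from the job `Config` into Spark settings above can be sketched in plain Java, with `SparkConf` replaced by an ordinary map (the `SparkConfSketch` class and `buildConf` helper are illustrative stand-ins, not part of the module):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Stand-in for createSparkConf(): every entry of the job Config is stringified
// via String.valueOf and copied into the conf, and cross joins are always
// forced on because some rule SQL relies on them.
class SparkConfSketch {

    static Map<String, String> buildConf(Map<String, Object> config) {
        Map<String, String> conf = new LinkedHashMap<>();
        // mirror conf.set(entry.getKey(), String.valueOf(entry.getValue()))
        config.forEach((k, v) -> conf.put(k, String.valueOf(v)));
        // mirror conf.set("spark.sql.crossJoin.enabled", "true")
        conf.put("spark.sql.crossJoin.enabled", "true");
        return conf;
    }
}
```

The same pattern applies unchanged to the real `SparkConf`, since `SparkConf.set` also takes string keys and values.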
@@ -0,0 +1,56 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

/**
 * Component is the common base of readers, transformers and writers
 */
public interface Component {

    Config getConfig();

    ValidateResult validateConfig();

    default ValidateResult validate(List<String> requiredOptions) {
        List<String> nonExistsOptions = new ArrayList<>();
        requiredOptions.forEach(x -> {
            if (Boolean.FALSE.equals(getConfig().has(x))) {
                nonExistsOptions.add(x);
            }
        });

        if (!nonExistsOptions.isEmpty()) {
            return new ValidateResult(
                    false,
                    nonExistsOptions.stream().map(option ->
                            "[" + option + "]").collect(Collectors.joining(",")) + " does not exist");
        } else {
            return new ValidateResult(true, "");
        }
    }

    void prepare(SparkRuntimeEnvironment prepareEnv);
}
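The default `validate` method above boils down to a required-keys check plus message formatting. A self-contained sketch, with the `Config` object replaced by a plain map (class and method names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Mirror of Component#validate: collect the required keys that are absent
// from the config and join them into a single error message; an empty
// string means the config is valid.
class RequiredOptionsCheck {

    static String missingMessage(Map<String, Object> config, List<String> requiredOptions) {
        List<String> nonExists = new ArrayList<>();
        for (String option : requiredOptions) {
            if (!config.containsKey(option)) {
                nonExists.add(option);
            }
        }
        if (nonExists.isEmpty()) {
            return "";
        }
        return nonExists.stream()
                .map(option -> "[" + option + "]")
                .collect(Collectors.joining(",")) + " does not exist";
    }
}
```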
@@ -0,0 +1,37 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch;

import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.Component;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

/**
 * BatchReader
 */
public interface BatchReader extends Component {

    /**
     * Read data from the source and return it as a dataset
     * @param env env
     * @return Dataset<Row>
     */
    Dataset<Row> read(SparkRuntimeEnvironment env);
}
@@ -0,0 +1,38 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch;

import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.Component;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

/**
 * BatchTransformer
 */
public interface BatchTransformer extends Component {

    /**
     * Transform the input dataset and return the result
     * @param data data
     * @param env env
     * @return Dataset<Row>
     */
    Dataset<Row> transform(Dataset<Row> data, SparkRuntimeEnvironment env);
}
@@ -0,0 +1,37 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch;

import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.Component;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

/**
 * BatchWriter
 */
public interface BatchWriter extends Component {

    /**
     * Write data to the target storage
     * @param data data
     * @param environment environment
     */
    void write(Dataset<Row> data, SparkRuntimeEnvironment environment);
}
@@ -0,0 +1,69 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.reader;

import static org.apache.dolphinscheduler.data.quality.Constants.DATABASE;
import static org.apache.dolphinscheduler.data.quality.Constants.SQL;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Arrays;

import com.google.common.base.Strings;

/**
 * HiveReader
 */
public class HiveReader implements BatchReader {

    private final Config config;

    public HiveReader(Config config) {
        this.config = config;
    }

    @Override
    public Config getConfig() {
        return config;
    }

    @Override
    public ValidateResult validateConfig() {
        return validate(Arrays.asList(DATABASE, TABLE));
    }

    @Override
    public void prepare(SparkRuntimeEnvironment prepareEnv) {
        if (Strings.isNullOrEmpty(config.getString(SQL))) {
            config.put(SQL, "select * from " + config.getString(DATABASE) + "." + config.getString(TABLE));
        }
    }

    @Override
    public Dataset<Row> read(SparkRuntimeEnvironment env) {
        return env.sparkSession().sql(config.getString(SQL));
    }
}
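The `prepare` fallback in `HiveReader` is plain string assembly: when no SQL is configured, a full-table scan statement is built from the database and table names. A minimal stand-alone sketch of that defaulting rule (class and helper names are hypothetical):

```java
// Mirror of HiveReader#prepare: use the configured SQL if present,
// otherwise default to "select * from <database>.<table>".
class HiveSqlDefault {

    static String defaultSql(String configuredSql, String database, String table) {
        if (configuredSql == null || configuredSql.isEmpty()) {
            return "select * from " + database + "." + table;
        }
        return configuredSql;
    }
}
```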
@@ -0,0 +1,95 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.reader;

import static org.apache.dolphinscheduler.data.quality.Constants.DB_TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.DOTS;
import static org.apache.dolphinscheduler.data.quality.Constants.DRIVER;
import static org.apache.dolphinscheduler.data.quality.Constants.JDBC;
import static org.apache.dolphinscheduler.data.quality.Constants.PASSWORD;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.URL;
import static org.apache.dolphinscheduler.data.quality.Constants.USER;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;
import org.apache.dolphinscheduler.data.quality.utils.ConfigUtils;

import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

/**
 * JdbcReader
 */
public class JdbcReader implements BatchReader {

    private final Config config;

    public JdbcReader(Config config) {
        this.config = config;
    }

    @Override
    public Config getConfig() {
        return config;
    }

    @Override
    public ValidateResult validateConfig() {
        return validate(Arrays.asList(URL, TABLE, USER, PASSWORD));
    }

    @Override
    public void prepare(SparkRuntimeEnvironment prepareEnv) {
        // Do nothing
    }

    @Override
    public Dataset<Row> read(SparkRuntimeEnvironment env) {
        return jdbcReader(env.sparkSession()).load();
    }

    private DataFrameReader jdbcReader(SparkSession sparkSession) {

        DataFrameReader reader = sparkSession.read()
                .format(JDBC)
                .option(URL, config.getString(URL))
                .option(DB_TABLE, config.getString(TABLE))
                .option(USER, config.getString(USER))
                .option(PASSWORD, config.getString(PASSWORD))
                .option(DRIVER, config.getString(DRIVER));

        Config jdbcConfig = ConfigUtils.extractSubConfig(config, JDBC + DOTS, false);

        if (!jdbcConfig.isEmpty()) {
            Map<String, String> optionMap = new HashMap<>(16);
            jdbcConfig.entrySet().forEach(x -> optionMap.put(x.getKey(), String.valueOf(x.getValue())));
            reader.options(optionMap);
        }

        return reader;
    }
}
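`ConfigUtils.extractSubConfig` is used above to pull the `jdbc.`-prefixed entries out of the full config so they can be passed through as extra JDBC options. A plain-map sketch of that assumed behaviour (the `SubConfigSketch` class is illustrative; the real utility operates on the module's `Config` type):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Assumed behaviour of extractSubConfig: keep only the entries whose key
// starts with the prefix, and either keep or strip the prefix from the key.
class SubConfigSketch {

    static Map<String, Object> extract(Map<String, Object> source, String prefix, boolean keepPrefix) {
        Map<String, Object> result = new LinkedHashMap<>();
        source.forEach((key, value) -> {
            if (key.startsWith(prefix)) {
                result.put(keepPrefix ? key : key.substring(prefix.length()), value);
            }
        });
        return result;
    }
}
```

With the prefix stripped, an entry such as `jdbc.fetchsize` becomes a plain `fetchsize` option on the Spark `DataFrameReader`.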
@@ -0,0 +1,76 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.reader;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ReaderConfig;
import org.apache.dolphinscheduler.data.quality.enums.ReaderType;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;

import java.util.ArrayList;
import java.util.List;

/**
 * ReaderFactory
 */
public class ReaderFactory {

    private static class Singleton {
        static ReaderFactory instance = new ReaderFactory();
    }

    public static ReaderFactory getInstance() {
        return Singleton.instance;
    }

    public List<BatchReader> getReaders(SparkRuntimeEnvironment sparkRuntimeEnvironment, List<ReaderConfig> readerConfigs) throws DataQualityException {

        List<BatchReader> readerList = new ArrayList<>();

        for (ReaderConfig readerConfig : readerConfigs) {
            BatchReader reader = getReader(readerConfig);
            if (reader != null) {
                reader.validateConfig();
                reader.prepare(sparkRuntimeEnvironment);
                readerList.add(reader);
            }
        }

        return readerList;
    }

    private BatchReader getReader(ReaderConfig readerConfig) throws DataQualityException {
        ReaderType readerType = ReaderType.getType(readerConfig.getType());
        Config config = new Config(readerConfig.getConfig());
        if (readerType != null) {
            switch (readerType) {
                case JDBC:
                    return new JdbcReader(config);
                case HIVE:
                    return new HiveReader(config);
                default:
                    throw new DataQualityException("reader type " + readerType + " is not supported!");
            }
        }

        return null;
    }

}
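`ReaderFactory`, `TransformerFactory`, and `WriterFactory` all use the same initialization-on-demand holder idiom: the nested `Singleton` class is loaded, and the instance created, only on the first `getInstance()` call, and JVM class loading makes that creation thread-safe without explicit locking. A stripped-down illustration (the demo class name is illustrative):

```java
// Initialization-on-demand holder: the INSTANCE field is initialized
// exactly once, when the nested Singleton class is first loaded.
class HolderSingletonDemo {

    private HolderSingletonDemo() {
        // private constructor prevents outside instantiation
    }

    private static class Singleton {
        static final HolderSingletonDemo INSTANCE = new HolderSingletonDemo();
    }

    static HolderSingletonDemo getInstance() {
        return Singleton.INSTANCE;
    }
}
```

Every call to `getInstance()` returns the same object, so factory state (if any were added later) would be shared process-wide.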
@@ -0,0 +1,62 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.transformer;

import static org.apache.dolphinscheduler.data.quality.Constants.SQL;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchTransformer;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Collections;

/**
 * SqlTransformer
 */
public class SqlTransformer implements BatchTransformer {

    private final Config config;

    public SqlTransformer(Config config) {
        this.config = config;
    }

    @Override
    public Config getConfig() {
        return config;
    }

    @Override
    public ValidateResult validateConfig() {
        return validate(Collections.singletonList(SQL));
    }

    @Override
    public void prepare(SparkRuntimeEnvironment prepareEnv) {
        // Do nothing
    }

    @Override
    public Dataset<Row> transform(Dataset<Row> data, SparkRuntimeEnvironment env) {
        return env.sparkSession().sql(config.getString(SQL));
    }
}
@@ -0,0 +1,72 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.transformer;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.TransformerConfig;
import org.apache.dolphinscheduler.data.quality.enums.TransformerType;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchTransformer;

import java.util.ArrayList;
import java.util.List;

/**
 * TransformerFactory
 */
public class TransformerFactory {

    private static class Singleton {
        static TransformerFactory instance = new TransformerFactory();
    }

    public static TransformerFactory getInstance() {
        return Singleton.instance;
    }

    public List<BatchTransformer> getTransformer(SparkRuntimeEnvironment sparkRuntimeEnvironment, List<TransformerConfig> transformerConfigs) throws DataQualityException {

        List<BatchTransformer> transformers = new ArrayList<>();

        for (TransformerConfig transformerConfig : transformerConfigs) {
            BatchTransformer transformer = getTransformer(transformerConfig);
            if (transformer != null) {
                transformer.validateConfig();
                transformer.prepare(sparkRuntimeEnvironment);
                transformers.add(transformer);
            }
        }

        return transformers;
    }

    private BatchTransformer getTransformer(TransformerConfig transformerConfig) throws DataQualityException {
        TransformerType transformerType = TransformerType.getType(transformerConfig.getType());
        Config config = new Config(transformerConfig.getConfig());
        if (transformerType != null) {
            if (transformerType == TransformerType.SQL) {
                return new SqlTransformer(config);
            }
            throw new DataQualityException("transformer type " + transformerType + " is not supported!");
        }

        return null;
    }

}
@@ -0,0 +1,87 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.writer;

import static org.apache.dolphinscheduler.data.quality.Constants.APPEND;
import static org.apache.dolphinscheduler.data.quality.Constants.DB_TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.DRIVER;
import static org.apache.dolphinscheduler.data.quality.Constants.JDBC;
import static org.apache.dolphinscheduler.data.quality.Constants.PASSWORD;
import static org.apache.dolphinscheduler.data.quality.Constants.SAVE_MODE;
import static org.apache.dolphinscheduler.data.quality.Constants.SQL;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.URL;
import static org.apache.dolphinscheduler.data.quality.Constants.USER;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Arrays;

import com.google.common.base.Strings;

/**
 * JdbcWriter
 */
public class JdbcWriter implements BatchWriter {

    private final Config config;

    public JdbcWriter(Config config) {
        this.config = config;
    }

    @Override
    public Config getConfig() {
        return config;
    }

    @Override
    public ValidateResult validateConfig() {
        return validate(Arrays.asList(URL, TABLE, USER, PASSWORD));
    }

    @Override
    public void prepare(SparkRuntimeEnvironment prepareEnv) {
        if (Strings.isNullOrEmpty(config.getString(SAVE_MODE))) {
            config.put(SAVE_MODE, APPEND);
        }
    }

    @Override
    public void write(Dataset<Row> data, SparkRuntimeEnvironment env) {
        if (!Strings.isNullOrEmpty(config.getString(SQL))) {
            data = env.sparkSession().sql(config.getString(SQL));
        }

        data.write()
                .format(JDBC)
                .option(DRIVER, config.getString(DRIVER))
                .option(URL, config.getString(URL))
                .option(DB_TABLE, config.getString(TABLE))
                .option(USER, config.getString(USER))
                .option(PASSWORD, config.getString(PASSWORD))
                .mode(config.getString(SAVE_MODE))
                .save();
    }
}
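`JdbcWriter#prepare` only back-fills the save mode: if none is configured, `append` is used so repeated runs add rows instead of failing. The same put-if-absent rule on a plain map (class and helper names are hypothetical stand-ins for the module's `Config`):

```java
import java.util.HashMap;
import java.util.Map;

// Mirror of JdbcWriter#prepare: default save_mode to "append" when it is
// missing or empty, and leave an explicit setting untouched.
class SaveModeDefault {

    static final String SAVE_MODE = "save_mode";

    static Map<String, Object> withDefaultSaveMode(Map<String, Object> config) {
        Map<String, Object> result = new HashMap<>(config);
        Object mode = result.get(SAVE_MODE);
        if (mode == null || String.valueOf(mode).isEmpty()) {
            result.put(SAVE_MODE, "append");
        }
        return result;
    }
}
```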
@@ -0,0 +1,81 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.writer;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.WriterConfig;
import org.apache.dolphinscheduler.data.quality.enums.WriterType;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;
import org.apache.dolphinscheduler.data.quality.flow.batch.writer.file.HdfsFileWriter;
import org.apache.dolphinscheduler.data.quality.flow.batch.writer.file.LocalFileWriter;

import java.util.ArrayList;
import java.util.List;

/**
 * WriterFactory
 */
public class WriterFactory {

    private static class Singleton {
        static WriterFactory instance = new WriterFactory();
    }

    public static WriterFactory getInstance() {
        return Singleton.instance;
    }

    public List<BatchWriter> getWriters(SparkRuntimeEnvironment sparkRuntimeEnvironment, List<WriterConfig> writerConfigs) throws DataQualityException {

        List<BatchWriter> writerList = new ArrayList<>();

        for (WriterConfig writerConfig : writerConfigs) {
            BatchWriter writer = getWriter(writerConfig);
            if (writer != null) {
                writer.validateConfig();
                writer.prepare(sparkRuntimeEnvironment);
                writerList.add(writer);
            }
        }

        return writerList;
    }

    private BatchWriter getWriter(WriterConfig writerConfig) throws DataQualityException {

        WriterType writerType = WriterType.getType(writerConfig.getType());
        Config config = new Config(writerConfig.getConfig());
        if (writerType != null) {
            switch (writerType) {
                case JDBC:
                    return new JdbcWriter(config);
                case LOCAL_FILE:
                    return new LocalFileWriter(config);
                case HDFS_FILE:
                    return new HdfsFileWriter(config);
                default:
                    throw new DataQualityException("writer type " + writerType + " is not supported!");
            }
        }

        return null;
    }

}
@ -0,0 +1,131 @@
|
||||
/* |
||||
* Licensed to the Apache Software Foundation (ASF) under one or more |
||||
* contributor license agreements. See the NOTICE file distributed with |
||||
* this work for additional information regarding copyright ownership. |
||||
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.writer.file;

import static org.apache.dolphinscheduler.data.quality.Constants.SAVE_MODE;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;
import org.apache.dolphinscheduler.data.quality.utils.ConfigUtils;

import org.apache.commons.collections.CollectionUtils;
import org.apache.spark.sql.DataFrameWriter;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.google.common.base.Strings;

/**
 * BaseFileWriter
 */
public abstract class BaseFileWriter implements BatchWriter {

    public static final String PARTITION_BY = "partition_by";
    public static final String SERIALIZER = "serializer";
    public static final String PATH = "path";

    private final Config config;

    protected BaseFileWriter(Config config) {
        this.config = config;
    }

    @Override
    public Config getConfig() {
        return config;
    }

    @Override
    public void prepare(SparkRuntimeEnvironment prepareEnv) {
        Map<String, Object> defaultConfig = new HashMap<>();

        defaultConfig.put(PARTITION_BY, Collections.emptyList());
        defaultConfig.put(SAVE_MODE, "error");
        defaultConfig.put(SERIALIZER, "csv");

        config.merge(defaultConfig);
    }

    protected ValidateResult checkConfigImpl(List<String> allowedUri) {

        if (Boolean.TRUE.equals(config.has(PATH)) && !Strings.isNullOrEmpty(config.getString(PATH))) {
            String dir = config.getString(PATH);
            if (dir.startsWith("/") || uriInAllowedSchema(dir, allowedUri)) {
                return new ValidateResult(true, "");
            } else {
                return new ValidateResult(false, "invalid path URI, please use one of the allowed schemas: " + String.join(",", allowedUri));
            }
        } else {
            return new ValidateResult(false, "please specify [path] as a non-empty string");
        }
    }

    protected boolean uriInAllowedSchema(String uri, List<String> allowedUri) {
        // a URI is acceptable when it starts with ANY of the allowed schemas
        return allowedUri.stream().anyMatch(uri::startsWith);
    }

    protected String buildPathWithDefaultSchema(String uri, String defaultUriSchema) {
        return uri.startsWith("/") ? defaultUriSchema + uri : uri;
    }

    protected void outputImpl(Dataset<Row> df, String defaultUriSchema) {

        DataFrameWriter<Row> writer = df.write().mode(config.getString(SAVE_MODE));

        if (CollectionUtils.isNotEmpty(config.getStringList(PARTITION_BY))) {
            List<String> partitionKeys = config.getStringList(PARTITION_BY);
            writer.partitionBy(partitionKeys.toArray(new String[]{}));
        }

        Config fileConfig = ConfigUtils.extractSubConfig(config, "options.", false);
        if (fileConfig.isNotEmpty()) {
            Map<String, String> optionMap = new HashMap<>(16);
            fileConfig.entrySet().forEach(x -> optionMap.put(x.getKey(), String.valueOf(x.getValue())));
            writer.options(optionMap);
        }

        String path = buildPathWithDefaultSchema(config.getString(PATH), defaultUriSchema);

        switch (config.getString(SERIALIZER)) {
            case "csv":
                writer.csv(path);
                break;
            case "json":
                writer.json(path);
                break;
            case "parquet":
                writer.parquet(path);
                break;
            case "text":
                writer.text(path);
                break;
            case "orc":
                writer.orc(path);
                break;
            default:
                break;
        }
    }
}
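The two path helpers above are small enough to demonstrate standalone. The following is an illustrative sketch, not the project's class: it mirrors `buildPathWithDefaultSchema` and the intended any-match semantics of `uriInAllowedSchema`, with the class name `PathHelperSketch` being an assumption for the example.

```java
import java.util.Arrays;
import java.util.List;

// Standalone sketch of BaseFileWriter's path helpers (illustrative only).
public class PathHelperSketch {

    // A schema-less path is resolved against the writer's default schema.
    static String buildPathWithDefaultSchema(String uri, String defaultUriSchema) {
        return uri.startsWith("/") ? defaultUriSchema + uri : uri;
    }

    // A URI passes validation when it starts with any of the allowed schemas.
    static boolean uriInAllowedSchema(String uri, List<String> allowedUri) {
        return allowedUri.stream().anyMatch(uri::startsWith);
    }

    public static void main(String[] args) {
        System.out.println(buildPathWithDefaultSchema("/tmp/dq/result", "hdfs://")); // hdfs:///tmp/dq/result
        System.out.println(buildPathWithDefaultSchema("file:///tmp/out", "hdfs://")); // unchanged
        System.out.println(uriInAllowedSchema("hdfs://nn:8020/x", Arrays.asList("hdfs://"))); // true
    }
}
```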
@@ -0,0 +1,47 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.writer.file;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Collections;

/**
 * HdfsFileWriter
 */
public class HdfsFileWriter extends BaseFileWriter {

    public HdfsFileWriter(Config config) {
        super(config);
    }

    @Override
    public void write(Dataset<Row> data, SparkRuntimeEnvironment environment) {
        outputImpl(data, "hdfs://");
    }

    @Override
    public ValidateResult validateConfig() {
        return checkConfigImpl(Collections.singletonList("hdfs://"));
    }
}
@@ -0,0 +1,47 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.batch.writer.file;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.config.ValidateResult;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

import java.util.Collections;

/**
 * LocalFileWriter
 */
public class LocalFileWriter extends BaseFileWriter {

    public LocalFileWriter(Config config) {
        super(config);
    }

    @Override
    public void write(Dataset<Row> data, SparkRuntimeEnvironment environment) {
        outputImpl(data, "file://");
    }

    @Override
    public ValidateResult validateConfig() {
        return checkConfigImpl(Collections.singletonList("file://"));
    }
}
@@ -0,0 +1,56 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.utils;

import org.apache.dolphinscheduler.data.quality.config.Config;

import java.util.LinkedHashMap;
import java.util.Map;

public class ConfigUtils {

    private ConfigUtils() {
        throw new IllegalStateException("Construct ConfigUtils");
    }

    /**
     * Extract sub config with fixed prefix
     *
     * @param source config source
     * @param prefix config prefix
     * @param keepPrefix true if keep prefix
     */
    public static Config extractSubConfig(Config source, String prefix, boolean keepPrefix) {
        Map<String, Object> values = new LinkedHashMap<>();

        for (Map.Entry<String, Object> entry : source.entrySet()) {
            final String key = entry.getKey();
            final String value = String.valueOf(entry.getValue());

            if (key.startsWith(prefix)) {
                if (keepPrefix) {
                    values.put(key, value);
                } else {
                    values.put(key.substring(prefix.length()), value);
                }
            }
        }

        return new Config(values);
    }
}
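`extractSubConfig` is what lets `BaseFileWriter` forward every `options.*` key to Spark's `DataFrameWriter.options(...)`. A minimal standalone sketch of the same prefix-extraction logic, with the project's `Config` class replaced by a plain `Map` and the class name `SubConfigSketch` invented for the example:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of ConfigUtils.extractSubConfig (illustrative only).
public class SubConfigSketch {

    static Map<String, String> extractSubConfig(Map<String, Object> source, String prefix, boolean keepPrefix) {
        Map<String, String> values = new LinkedHashMap<>();
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            String key = entry.getKey();
            if (key.startsWith(prefix)) {
                // optionally strip the prefix, e.g. "options.header" -> "header"
                values.put(keepPrefix ? key : key.substring(prefix.length()),
                        String.valueOf(entry.getValue()));
            }
        }
        return values;
    }

    public static void main(String[] args) {
        Map<String, Object> source = new LinkedHashMap<>();
        source.put("options.header", "true");
        source.put("options.delimiter", ",");
        source.put("path", "/tmp/out");
        // keys without the prefix, such as "path", are dropped
        System.out.println(extractSubConfig(source, "options.", false));
    }
}
```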
@@ -0,0 +1,71 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.utils;

import static com.fasterxml.jackson.databind.DeserializationFeature.ACCEPT_EMPTY_ARRAY_AS_NULL_OBJECT;
import static com.fasterxml.jackson.databind.DeserializationFeature.ACCEPT_EMPTY_STRING_AS_NULL_OBJECT;
import static com.fasterxml.jackson.databind.DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES;
import static com.fasterxml.jackson.databind.DeserializationFeature.READ_UNKNOWN_ENUM_VALUES_AS_NULL;
import static com.fasterxml.jackson.databind.MapperFeature.REQUIRE_SETTERS_FOR_GETTERS;
import static com.fasterxml.jackson.databind.SerializationFeature.FAIL_ON_EMPTY_BEANS;

import java.util.TimeZone;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.base.Strings;

/**
 * JsonUtils
 */
public class JsonUtils {

    private static final Logger logger = LoggerFactory.getLogger(JsonUtils.class);

    /**
     * ObjectMapper is thread-safe once configured, so a single static instance is reused
     */
    private static final ObjectMapper MAPPER = new ObjectMapper()
            .configure(FAIL_ON_UNKNOWN_PROPERTIES, false)
            .configure(ACCEPT_EMPTY_ARRAY_AS_NULL_OBJECT, true)
            .configure(ACCEPT_EMPTY_STRING_AS_NULL_OBJECT, true)
            .configure(READ_UNKNOWN_ENUM_VALUES_AS_NULL, true)
            .configure(REQUIRE_SETTERS_FOR_GETTERS, true)
            .configure(FAIL_ON_EMPTY_BEANS, false)
            .setTimeZone(TimeZone.getDefault());

    private JsonUtils() {
        throw new UnsupportedOperationException("Construct JsonUtils");
    }

    public static <T> T fromJson(String json, Class<T> clazz) {
        if (Strings.isNullOrEmpty(json)) {
            return null;
        }

        try {
            return MAPPER.readValue(json, clazz);
        } catch (Exception e) {
            logger.error("parse object exception!", e);
        }

        return null;
    }
}
@@ -0,0 +1,22 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p [%c] - %m%n
@@ -0,0 +1,46 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.execution.SparkRuntimeEnvironment;

import java.util.HashMap;
import java.util.Map;

import org.junit.Before;

/**
 * SparkApplicationTestBase
 */
public class SparkApplicationTestBase {

    protected SparkRuntimeEnvironment sparkRuntimeEnvironment;

    @Before
    public void init() {
        Map<String, Object> config = new HashMap<>();
        config.put("spark.app.name", "data quality test");
        config.put("spark.sql.crossJoin.enabled", "true");
        config.put("spark.driver.bindAddress", "127.0.0.1");
        config.put("spark.ui.port", 13000);
        config.put("spark.master", "local[4]");

        sparkRuntimeEnvironment = new SparkRuntimeEnvironment(new Config(config));
    }
}
@@ -0,0 +1,61 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.configuration;

import org.apache.dolphinscheduler.data.quality.config.DataQualityConfiguration;
import org.apache.dolphinscheduler.data.quality.utils.JsonUtils;

import org.junit.Assert;
import org.junit.Test;

/**
 * ConfigurationParserTest
 */
public class ConfigurationParserTest {

    @Test
    public void testConfigurationValidate() {
        Assert.assertEquals(1, verifyConfigurationValidate());
    }

    private int verifyConfigurationValidate() {
        int flag = 1;
        try {
            String parameterStr = "{\"name\":\"data quality test\",\"env\":{\"type\":\"batch\",\"config\":null},"
                    + "\"readers\":[{\"type\":\"JDBC\",\"config\":{\"database\":\"test\",\"password\":\"Test@123!\","
                    + "\"driver\":\"com.mysql.jdbc.Driver\",\"user\":\"test\",\"output_table\":\"test1\",\"table\":\"test1\","
                    + "\"url\":\"jdbc:mysql://172.16.100.199:3306/test\"} }],\"transformers\":[{\"type\":\"sql\",\"config\":"
                    + "{\"index\":1,\"output_table\":\"miss_count\",\"sql\":\"SELECT COUNT(*) AS miss FROM test1 WHERE (c1 is null or c1 = '') \"} },"
                    + "{\"type\":\"sql\",\"config\":{\"index\":2,\"output_table\":\"total_count\",\"sql\":\"SELECT COUNT(*) AS total FROM test1 \"} }],"
                    + "\"writers\":[{\"type\":\"JDBC\",\"config\":{\"database\":\"dolphinscheduler\",\"password\":\"test\","
                    + "\"driver\":\"org.postgresql.Driver\",\"user\":\"test\",\"table\":\"t_ds_dq_execute_result\","
                    + "\"url\":\"jdbc:postgresql://172.16.100.199:5432/dolphinscheduler?stringtype=unspecified\","
                    + "\"sql\":\"SELECT 0 as rule_type,'data quality test' as rule_name,7 as process_definition_id,80 as process_instance_id,"
                    + "80 as task_instance_id,miss_count.miss AS statistics_value, total_count.total AS comparison_value,2 as check_type,10 as"
                    + " threshold, 3 as operator, 0 as failure_strategy, '2021-06-29 10:18:59' as create_time,'2021-06-29 10:18:59' as update_time "
                    + "from miss_count FULL JOIN total_count\"} }]}";

            DataQualityConfiguration dataQualityConfiguration = JsonUtils.fromJson(parameterStr, DataQualityConfiguration.class);
            dataQualityConfiguration.validate();
        } catch (Exception e) {
            flag = 0;
            e.printStackTrace();
        }
        return flag;
    }
}
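The escaped single-line `parameterStr` above is hard to review. It corresponds to this JSON document (same content, pretty-printed): an `env` block, one JDBC reader, two SQL transformers that compute the miss and total counts, and one JDBC writer that persists the check result.

```json
{
  "name": "data quality test",
  "env": { "type": "batch", "config": null },
  "readers": [
    {
      "type": "JDBC",
      "config": {
        "database": "test",
        "password": "Test@123!",
        "driver": "com.mysql.jdbc.Driver",
        "user": "test",
        "output_table": "test1",
        "table": "test1",
        "url": "jdbc:mysql://172.16.100.199:3306/test"
      }
    }
  ],
  "transformers": [
    {
      "type": "sql",
      "config": {
        "index": 1,
        "output_table": "miss_count",
        "sql": "SELECT COUNT(*) AS miss FROM test1 WHERE (c1 is null or c1 = '')"
      }
    },
    {
      "type": "sql",
      "config": {
        "index": 2,
        "output_table": "total_count",
        "sql": "SELECT COUNT(*) AS total FROM test1"
      }
    }
  ],
  "writers": [
    {
      "type": "JDBC",
      "config": {
        "database": "dolphinscheduler",
        "password": "test",
        "driver": "org.postgresql.Driver",
        "user": "test",
        "table": "t_ds_dq_execute_result",
        "url": "jdbc:postgresql://172.16.100.199:5432/dolphinscheduler?stringtype=unspecified",
        "sql": "SELECT 0 as rule_type,'data quality test' as rule_name,7 as process_definition_id,80 as process_instance_id,80 as task_instance_id,miss_count.miss AS statistics_value, total_count.total AS comparison_value,2 as check_type,10 as threshold, 3 as operator, 0 as failure_strategy, '2021-06-29 10:18:59' as create_time,'2021-06-29 10:18:59' as update_time from miss_count FULL JOIN total_count"
      }
    }
  ]
}
```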
@@ -0,0 +1,45 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow;

import org.apache.dolphinscheduler.data.quality.SparkApplicationTestBase;

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

/**
 * FlowTestBase
 */
public class FlowTestBase extends SparkApplicationTestBase {

    protected String url = "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1";

    protected String driver = "org.h2.Driver";

    protected Connection getConnection() throws Exception {
        Properties properties = new Properties();
        properties.setProperty("user", "test");
        properties.setProperty("password", "123456");
        properties.setProperty("rowId", "false");
        DriverManager.registerDriver(new org.h2.Driver());
        Class.forName(driver, false, this.getClass().getClassLoader());
        return DriverManager.getConnection(url, properties);
    }

}
@@ -0,0 +1,99 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.reader;

import static org.apache.dolphinscheduler.data.quality.Constants.DATABASE;
import static org.apache.dolphinscheduler.data.quality.Constants.DRIVER;
import static org.apache.dolphinscheduler.data.quality.Constants.PASSWORD;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.URL;
import static org.apache.dolphinscheduler.data.quality.Constants.USER;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.flow.FlowTestBase;
import org.apache.dolphinscheduler.data.quality.flow.batch.reader.JdbcReader;

import java.sql.Connection;
import java.util.HashMap;
import java.util.Map;

import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

/**
 * JdbcReaderTest
 */
public class JdbcReaderTest extends FlowTestBase {

    @Before
    public void before() {
        super.init();
        createConnectorTable();
    }

    @Test
    public void testJdbcConnectorExecute() {
        JdbcReader jdbcReader = new JdbcReader(buildReaderConfig());
        Assert.assertNotNull(jdbcReader.read(sparkRuntimeEnvironment));
    }

    private Config buildReaderConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put(DATABASE, "test");
        config.put(TABLE, "test.test1");
        config.put(URL, url);
        config.put(USER, "test");
        config.put(PASSWORD, "123456");
        config.put(DRIVER, driver);
        return new Config(config);
    }

    private void createConnectorTable() {
        try {
            Connection connection = getConnection();
            connection.prepareStatement("create schema if not exists test").executeUpdate();

            connection.prepareStatement("drop table if exists test.test1").executeUpdate();
            connection
                    .prepareStatement(
                            "CREATE TABLE test.test1 (\n"
                                    + "  `id` int(11) NOT NULL AUTO_INCREMENT,\n"
                                    + "  `company` varchar(255) DEFAULT NULL,\n"
                                    + "  `date` varchar(255) DEFAULT NULL,\n"
                                    + "  `c1` varchar(255) DEFAULT NULL,\n"
                                    + "  `c2` varchar(255) DEFAULT NULL,\n"
                                    + "  `c3` varchar(255) DEFAULT NULL,\n"
                                    + "  `c4` int(11) DEFAULT NULL,\n"
                                    + "  PRIMARY KEY (`id`)\n"
                                    + ")")
                    .executeUpdate();
            connection.prepareStatement("INSERT INTO test.test1 (company,`date`,c1,c2,c3,c4) VALUES\n"
                    + "\t ('1','2019-03-01','11','12','13',1),\n"
                    + "\t ('2','2019-06-01','21','22','23',1),\n"
                    + "\t ('3','2019-09-01','31','32','33',1),\n"
                    + "\t ('4','2019-12-01','41','42','43',1),\n"
                    + "\t ('5','2013','42','43','54',1),\n"
                    + "\t ('6','2020','42','43','54',1);").executeUpdate();
            connection.commit();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

}
@@ -0,0 +1,70 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.reader;

import static org.apache.dolphinscheduler.data.quality.Constants.DATABASE;
import static org.apache.dolphinscheduler.data.quality.Constants.DRIVER;
import static org.apache.dolphinscheduler.data.quality.Constants.PASSWORD;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.URL;
import static org.apache.dolphinscheduler.data.quality.Constants.USER;

import org.apache.dolphinscheduler.data.quality.config.ReaderConfig;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchReader;
import org.apache.dolphinscheduler.data.quality.flow.batch.reader.ReaderFactory;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.junit.Assert;
import org.junit.Test;

/**
 * ReaderFactoryTest
 */
public class ReaderFactoryTest {

    @Test
    public void testConnectorGenerate() throws DataQualityException {

        List<ReaderConfig> readerConfigs = new ArrayList<>();
        ReaderConfig readerConfig = new ReaderConfig();
        readerConfig.setType("JDBC");
        Map<String, Object> config = new HashMap<>();
        config.put(DATABASE, "test");
        config.put(TABLE, "test1");
        config.put(URL, "jdbc:mysql://localhost:3306/test");
        config.put(USER, "test");
        config.put(PASSWORD, "123456");
        config.put(DRIVER, "com.mysql.jdbc.Driver");
        readerConfig.setConfig(config);
        readerConfigs.add(readerConfig);

        int flag = 0;

        List<BatchReader> readers = ReaderFactory.getInstance().getReaders(null, readerConfigs);
        if (readers != null && readers.size() >= 1) {
            flag = 1;
        }

        Assert.assertEquals(1, flag);
    }
}
@@ -0,0 +1,101 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.writer;

import static org.apache.dolphinscheduler.data.quality.Constants.DATABASE;
import static org.apache.dolphinscheduler.data.quality.Constants.DRIVER;
import static org.apache.dolphinscheduler.data.quality.Constants.PASSWORD;
import static org.apache.dolphinscheduler.data.quality.Constants.TABLE;
import static org.apache.dolphinscheduler.data.quality.Constants.URL;
import static org.apache.dolphinscheduler.data.quality.Constants.USER;

import org.apache.dolphinscheduler.data.quality.config.Config;
import org.apache.dolphinscheduler.data.quality.flow.FlowTestBase;
import org.apache.dolphinscheduler.data.quality.flow.batch.reader.JdbcReader;
import org.apache.dolphinscheduler.data.quality.flow.batch.writer.JdbcWriter;

import java.sql.Connection;
import java.util.HashMap;
import java.util.Map;

import org.junit.Before;
import org.junit.Test;

/**
 * JdbcWriterTest
 */
public class JdbcWriterTest extends FlowTestBase {

    @Before
    public void before() {
        super.init();
        createWriterTable();
    }

    @Test
    public void testJdbcWriterExecute() {
        JdbcReader jdbcConnector = new JdbcReader(buildJdbcReaderConfig());
        JdbcWriter jdbcWriter = new JdbcWriter(buildJdbcConfig());
        jdbcWriter.write(jdbcConnector.read(sparkRuntimeEnvironment), sparkRuntimeEnvironment);
    }

    private Config buildJdbcConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put(DATABASE, "test");
        config.put(TABLE, "test.test2");
        config.put(URL, url);
        config.put(USER, "test");
        config.put(PASSWORD, "123456");
        config.put(DRIVER, driver);
        config.put("save_mode", "append");
        return new Config(config);
    }

    private Config buildJdbcReaderConfig() {
        Config config = buildJdbcConfig();
        config.put("sql", "SELECT '1' as company,'1' as date,'2' as c1,'2' as c2,'2' as c3, 2 as c4");
        return config;
    }

    private void createWriterTable() {
        try {
            Connection connection = getConnection();
            connection.prepareStatement("create schema if not exists test").executeUpdate();

            connection.prepareStatement("drop table if exists test.test2").executeUpdate();
            connection
                    .prepareStatement(
                            "CREATE TABLE test.test2 (\n"
                                    + "  `id` int(11) NOT NULL AUTO_INCREMENT,\n"
                                    + "  `company` varchar(255) DEFAULT NULL,\n"
                                    + "  `date` varchar(255) DEFAULT NULL,\n"
                                    + "  `c1` varchar(255) DEFAULT NULL,\n"
                                    + "  `c2` varchar(255) DEFAULT NULL,\n"
                                    + "  `c3` varchar(255) DEFAULT NULL,\n"
                                    + "  `c4` int(11) DEFAULT NULL,\n"
                                    + "  PRIMARY KEY (`id`)\n"
                                    + ")")
                    .executeUpdate();
            connection.prepareStatement("set schema test").executeUpdate();
            connection.commit();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

}
@ -0,0 +1,54 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.flow.writer;

import org.apache.dolphinscheduler.data.quality.config.WriterConfig;
import org.apache.dolphinscheduler.data.quality.exception.DataQualityException;
import org.apache.dolphinscheduler.data.quality.flow.batch.BatchWriter;
import org.apache.dolphinscheduler.data.quality.flow.batch.writer.WriterFactory;

import java.util.ArrayList;
import java.util.List;

import org.junit.Assert;
import org.junit.Test;

/**
 * WriterFactoryTest
 */
public class WriterFactoryTest {

    @Test
    public void testWriterGenerate() throws DataQualityException {

        List<WriterConfig> writerConfigs = new ArrayList<>();
        WriterConfig writerConfig = new WriterConfig();
        writerConfig.setType("JDBC");
        writerConfig.setConfig(null);
        writerConfigs.add(writerConfig);

        int flag = 0;

        List<BatchWriter> writers = WriterFactory.getInstance().getWriters(null, writerConfigs);
        if (writers != null && writers.size() >= 1) {
            flag = 1;
        }

        Assert.assertEquals(1, flag);
    }
}
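The test above exercises a singleton factory that turns a list of typed configs into writer instances. A minimal dependency-free sketch of that dispatch-by-type-string pattern (class and method names here are illustrative stand-ins, not the real `WriterFactory` API) could look like:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical simplified factory mirroring the shape tested above:
// a singleton that maps a "type" string in each config to a writer.
public class WriterFactorySketch {

    private static final WriterFactorySketch INSTANCE = new WriterFactorySketch();

    private WriterFactorySketch() {
    }

    public static WriterFactorySketch getInstance() {
        return INSTANCE;
    }

    // Returns one entry per recognized config; unknown types are skipped.
    public List<String> getWriters(List<Map<String, String>> configs) {
        List<String> writers = new ArrayList<>();
        for (Map<String, String> cfg : configs) {
            if ("JDBC".equalsIgnoreCase(cfg.get("type"))) {
                writers.add("JdbcWriter"); // stand-in for constructing a BatchWriter
            }
        }
        return writers;
    }

    public static void main(String[] args) {
        Map<String, String> cfg = new HashMap<>();
        cfg.put("type", "JDBC");
        System.out.println(WriterFactorySketch.getInstance()
                .getWriters(Collections.singletonList(cfg)).size()); // prints 1
    }
}
```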
@@ -0,0 +1,46 @@
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.dolphinscheduler.data.quality.utils;

import org.apache.dolphinscheduler.data.quality.config.Config;

import java.util.HashMap;
import java.util.Map;

import org.junit.Assert;
import org.junit.Test;

public class ConfigUtilsTest {

    @Test
    public void testExtractSubConfig() {
        // Setup
        Map<String, Object> configMap = new HashMap<>();
        configMap.put("aaa.www", "1");
        configMap.put("bbb.www", "1");

        final Config source = new Config(configMap);

        // Run the test
        final Config result = ConfigUtils.extractSubConfig(source, "aaa", false);
        int expect = 1;
        int actual = result.entrySet().size();

        Assert.assertEquals(expect, actual);
    }
}
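The behavior this test pins down is prefix-based sub-config extraction: keep only the entries whose key starts with `"<prefix>."`, optionally stripping the prefix from the resulting keys. A plain-`Map` re-implementation of that logic (the real `ConfigUtils.extractSubConfig` operates on the module's `Config` type; this sketch only illustrates the key filtering) could look like:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical re-implementation of the prefix extraction tested above,
// written against plain maps instead of the module's Config type.
public class ExtractSubConfigSketch {

    static Map<String, Object> extractSubConfig(Map<String, Object> source,
                                                String prefix,
                                                boolean keepPrefix) {
        Map<String, Object> result = new HashMap<>();
        String fullPrefix = prefix + ".";
        for (Map.Entry<String, Object> entry : source.entrySet()) {
            if (entry.getKey().startsWith(fullPrefix)) {
                // With keepPrefix=false, "aaa.www" becomes just "www".
                String key = keepPrefix
                        ? entry.getKey()
                        : entry.getKey().substring(fullPrefix.length());
                result.put(key, entry.getValue());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> configMap = new HashMap<>();
        configMap.put("aaa.www", "1");
        configMap.put("bbb.www", "1");
        Map<String, Object> sub = extractSubConfig(configMap, "aaa", false);
        System.out.println(sub); // prints {www=1}
    }
}
```

With the same input as the test above, the extracted map has exactly one entry, matching the `entrySet().size()` assertion.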