mirror of https://github.com/nocodb/nocodb
Naveen MR
2 years ago
30 changed files with 0 additions and 20971 deletions
@ -1,82 +0,0 @@
# OS
# ===========
.DS_Store
ehthumbs.db
Icon?
Thumbs.db

# oracle instantclient_18_5
instantclient_*

# Node and related ecosystem
# ==========================
.nodemonignore
.sass-cache/
node_modules/
public/lib/
app/tests/coverage/
.bower-*/
.idea/
coverage/
*.report

# General
# =======
*.log
*.report
*.csv
*.dat
*.out
*.pid
*.gz
*.tmp
*.bak
*.swp
logs/
build/
uploads/
builds/
out/

# Sublime editor
# ==============
.sublime-project
*.sublime-project
*.sublime-workspace

# Eclipse project files
# =====================
.project
.settings/
.*.md.html
.metadata
*~.nib
local.properties

# IntelliJ
# ========
*.iml

# Cloud9 IDE
# =========
.c9/
data/
mongod

# Visual Studio
# =========
*.suo
*.ntvs*
*.njsproj
*.sln
!/v

# project related
help/
xmigrator/
docs/
xmigrator.json
a
.nyc_output
@ -1,25 +0,0 @@
variables:
  DOCKER_DRIVER: overlay

client-test:
  image: tmaier/docker-compose:latest
  services:
    - docker:dind
  stage: test
  cache:
    paths:
      - node_modules/
  script:
    - docker info
    - docker-compose --version
    - if command -v apk > /dev/null; then apk add --no-cache bash; fi
    # - apk add --no-cache bash
    # - bash ./docker-compose/client/run_client_tests.sh
    - bash ./docker-compose/migrations/run_migration_tests.sh mysql
    # - time npm test
    # - echo "tests ran?"
  when: manual
  # tags:
  #   - shell
  #   - docker
@ -1,289 +0,0 @@
# Install & setup

# API Reference
**Kind**: global class
**Extends**: <code>SqlMigrator</code>

* [KnexMigrator](#KnexMigrator) ⇐ <code>SqlMigrator</code>
    * [new KnexMigrator()](#new_KnexMigrator_new)
    * _instance_
        * [.init(args)](#KnexMigrator+init)
        * [.sync()](#KnexMigrator+sync)
        * [.clean(args)](#KnexMigrator+clean)
        * [.migrationsCreate(args)](#KnexMigrator+migrationsCreate) ⇒ <code>object</code> \| <code>String</code> \| <code>String</code>
        * [.migrationsDelete(args)](#KnexMigrator+migrationsDelete) ⇒ <code>String</code> \| <code>String</code>
        * [.migrationsUp(args)](#KnexMigrator+migrationsUp)
        * [.migrationsDown(args)](#KnexMigrator+migrationsDown)
        * [.migrationsWrite(args)](#KnexMigrator+migrationsWrite)
        * [.migrationsList(args)](#KnexMigrator+migrationsList) ⇒ <code>Object</code> \| <code>Object</code> \| <code>Object</code> \| <code>String</code> \| <code>String</code>
        * [.migrationsToSql(args)](#KnexMigrator+migrationsToSql) ⇒ <code>Object</code> \| <code>Object</code> \| <code>Object</code> \| <code>String</code> \| <code>String</code>
        * [.migrationsSquash(args)](#KnexMigrator+migrationsSquash)
        * [.migrationsCreateManually(args)](#KnexMigrator+migrationsCreateManually)
        * [.migrationsRenameProjectKey(args)](#KnexMigrator+migrationsRenameProjectKey) ⇒ <code>Result</code>
        * [.migrationsCreateEnv(args)](#KnexMigrator+migrationsCreateEnv) ⇒ <code>Promise.&lt;void&gt;</code>
        * [.migrationsDeleteEnv(args)](#KnexMigrator+migrationsDeleteEnv) ⇒ <code>Promise.&lt;void&gt;</code>
        * [.migrationsCreateEnvDb(args)](#KnexMigrator+migrationsCreateEnvDb) ⇒ <code>Result</code>
    * _static_
        * [.KnexMigrator](#KnexMigrator.KnexMigrator)
            * [new KnexMigrator()](#new_KnexMigrator.KnexMigrator_new)

<a name="new_KnexMigrator_new"></a>

### new KnexMigrator()
Class to create an instance of KnexMigrator

<a name="KnexMigrator+init"></a>

### knexMigrator.init(args)
Initialises a migration project.
Creates a project JSON file in the pwd where the command is run.
Creates an xmigrator folder in the pwd, within which migrations for all DBs will be stored.

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type | Description |
| --- | --- | --- |
| args | <code>object</code> |  |
| args.type | <code>String</code> | type of database (mysql \| pg \| oracle \| mssql \| sqlite) |
| args.title | <code>String</code> | name of the project |
| args.folder | <code>String</code> | project directory |

<a name="KnexMigrator+sync"></a>

### knexMigrator.sync()
Sync is called after init() or any change to the config.xc.json file.
This initialises databases and migration tables within each connection of config.xc.json.

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)
<a name="KnexMigrator+clean"></a>

### knexMigrator.clean(args)
**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args | <code>Object</code> |
| args.env | <code>Object</code> |
| args.dbAlias | <code>Object</code> |
| args.json | <code>Object</code> |

<a name="KnexMigrator+migrationsCreate"></a>

### knexMigrator.migrationsCreate(args) ⇒ <code>object</code> \| <code>String</code> \| <code>String</code>
Creates up and down migration files within migration folders

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)
**Returns**: <code>object</code> - files<code>String</code> - files.up<code>String</code> - files.down

| Param | Type | Description |
| --- | --- | --- |
| args | <code>object</code> |  |
| args.dbAlias | <code>String</code> | database alias within environment |

<a name="KnexMigrator+migrationsDelete"></a>

### knexMigrator.migrationsDelete(args) ⇒ <code>String</code> \| <code>String</code>
Deletes up and down migration files within migration folders

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)
**Returns**: <code>String</code> - files.up<code>String</code> - files.down

| Param | Type |
| --- | --- |
| args | <code>object</code> |
| args.env | <code>String</code> |
| args.dbAlias | <code>String</code> |

<a name="KnexMigrator+migrationsUp"></a>

### knexMigrator.migrationsUp(args)
migrationsUp

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type | Description |
| --- | --- | --- |
| args | <code>object</code> |  |
| args.env | <code>String</code> |  |
| args.dbAlias | <code>String</code> |  |
| args.folder | <code>String</code> |  |
| args.steps | <code>Number</code> | number of steps to migrate |
| args.file | <code>String</code> | file up to which to migrate |
| args.sqlContentMigrate | <code>Number</code> | defaults to 1; when 0, sqlContent is ignored and only filenames are recorded in the _evolution table |

<a name="KnexMigrator+migrationsDown"></a>

### knexMigrator.migrationsDown(args)
migrationsDown

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type | Description |
| --- | --- | --- |
| args | <code>object</code> |  |
| args.env | <code>String</code> |  |
| args.dbAlias | <code>String</code> |  |
| args.folder | <code>String</code> |  |
| args.steps | <code>Number</code> | number of steps to migrate |
| args.file | <code>String</code> | file up to which to migrate |
| args.sqlContentMigrate | <code>Number</code> | defaults to 1; when 0, sqlContent is ignored and only filenames are recorded in the _evolution table |

<a name="KnexMigrator+migrationsWrite"></a>

### knexMigrator.migrationsWrite(args)
Migrations write

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type | Description |
| --- | --- | --- |
| args | <code>\*</code> |  |
| args.env | <code>String</code> |  |
| args.dbAlias | <code>String</code> |  |
| args.folder | <code>String</code> |  |
| args.upStatement | <code>Array.&lt;Object&gt;</code> | array of SQL statements as objects |
| args.upStatement[].sql | <code>String</code> | SQL statement without ';' |
| args.downStatement | <code>Array.&lt;Object&gt;</code> |  |
| args.downStatement[].sql | <code>String</code> | SQL statement without ';' |
| args.up | <code>String</code> | up filename (only name, not entire path) |
| args.down | <code>String</code> | down filename (only name, not entire path) |
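Since the statements passed to `migrationsWrite` are stored without their trailing `';'`, assembling them into a migration file body amounts to re-adding the terminators and joining. The helper below (`toSqlFile`) is a hypothetical illustration of that step, not part of the KnexMigrator API:

```javascript
// Hypothetical helper: join an upStatement/downStatement-shaped array
// (objects with a `sql` field, stored without trailing ';') into one
// SQL file body, re-adding the statement terminators.
function toSqlFile(statements) {
  return statements
    .map(({sql}) => `${sql};`)
    .join('\n');
}

// Example input shaped like args.upStatement above.
const upStatement = [
  {sql: 'CREATE TABLE country (country_id int)'},
  {sql: 'CREATE TABLE city (city_id int)'}
];

console.log(toSqlFile(upStatement));
```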

<a name="KnexMigrator+migrationsList"></a>

### knexMigrator.migrationsList(args) ⇒ <code>Object</code> \| <code>Object</code> \| <code>Object</code> \| <code>String</code> \| <code>String</code>
Migrations List

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)
**Returns**: <code>Object</code> - Result<code>Object</code> - Result.data<code>Object</code> - Result.data.object<code>String</code> - Result.data.object.list<code>String</code> - Result.data.object.pending

| Param | Type | Description |
| --- | --- | --- |
| args | <code>object</code> |  |
| args.env | <code>String</code> |  |
| args.dbAlias | <code>String</code> |  |
| args.steps | <code>Number</code> | number of steps to migrate |
| args.file | <code>String</code> | file up to which to migrate |

<a name="KnexMigrator+migrationsToSql"></a>

### knexMigrator.migrationsToSql(args) ⇒ <code>Object</code> \| <code>Object</code> \| <code>Object</code> \| <code>String</code> \| <code>String</code>
Migrations to SQL

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)
**Returns**: <code>Object</code> - Result<code>Object</code> - Result.data<code>Object</code> - Result.data.object<code>String</code> - Result.data.object.up<code>String</code> - Result.data.object.down

| Param | Type |
| --- | --- |
| args | <code>\*</code> |
| args.env | <code>String</code> |
| args.dbAlias | <code>String</code> |
| args.folder | <code>String</code> |

<a name="KnexMigrator+migrationsSquash"></a>

### knexMigrator.migrationsSquash(args)
Migrations Squash

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args | <code>\*</code> |
| args.env | <code>String</code> |
| args.dbAlias | <code>String</code> |
| args.folder | <code>String</code> |
| args.file | <code>String</code> |
| args.steps | <code>String</code> |
| args.up | <code>String</code> |
| args.down | <code>String</code> |

<a name="KnexMigrator+migrationsCreateManually"></a>

### knexMigrator.migrationsCreateManually(args)
Migrations Create Manually

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args | <code>\*</code> |
| args.env | <code>String</code> |
| args.dbAlias | <code>String</code> |
| args.folder | <code>String</code> |
| args.file | <code>String</code> |
| args.steps | <code>String</code> |
| args.up | <code>String</code> |
| args.down | <code>String</code> |

<a name="KnexMigrator+migrationsRenameProjectKey"></a>

### knexMigrator.migrationsRenameProjectKey(args) ⇒ <code>Result</code>
**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type | Description |
| --- | --- | --- |
| args |  |  |
| args.folder | <code>String</code> | defaults to process.cwd() |
| args.key | <code>String</code> |  |
| args.value | <code>String</code> |  |

<a name="KnexMigrator+migrationsCreateEnv"></a>

### knexMigrator.migrationsCreateEnv(args) ⇒ <code>Promise.&lt;void&gt;</code>
Updates the JSON config, updates SQLite and reopens the project.

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args |  |
| args.folder | <code>String</code> |
| args.env | <code>String</code> |
| args.envValue | <code>String</code> |

<a name="KnexMigrator+migrationsDeleteEnv"></a>

### knexMigrator.migrationsDeleteEnv(args) ⇒ <code>Promise.&lt;void&gt;</code>
Updates the JSON config, updates SQLite and reopens the project.

**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args |  |
| args.folder | <code>String</code> |
| args.env | <code>String</code> |

<a name="KnexMigrator+migrationsCreateEnvDb"></a>

### knexMigrator.migrationsCreateEnvDb(args) ⇒ <code>Result</code>
**Kind**: instance method of [<code>KnexMigrator</code>](#KnexMigrator)

| Param | Type |
| --- | --- |
| args |  |
| args.folder | <code>String</code> |
| args.env | <code>String</code> |
| args.db | <code>String</code> |

<a name="KnexMigrator.KnexMigrator"></a>

### KnexMigrator.KnexMigrator
**Kind**: static class of [<code>KnexMigrator</code>](#KnexMigrator)
<a name="new_KnexMigrator.KnexMigrator_new"></a>

#### new KnexMigrator()
Creates an instance of KnexMigrator.
@ -1,465 +0,0 @@
const {expect} = require('chai');

const models = require('./models');

const country = models.country;
const city = models.city;

// describe group of tests
describe('xc-data-mapper tests', function () {

  before(function (done) {
    // start application
    done();
  });

  after(function (done) {
    // stop application
    done();
  });

  beforeEach(function (done) {
    // init common variables for each test
    done();
  });

  afterEach(function (done) {
    // terminate common variables for each test
    done();
  });

  it('create, read, update, delete', async function () {

    let data = await country.insert({country: 'xc'});
    expect(data['country']).to.equal('xc');

    data = await country.readByPk(data.country_id);
    expect(data['country']).to.equal('xc');

    data.country = 'xgc';
    const updated = await country.updateByPk(data.country_id, data);
    expect(updated).to.equal(1);

    const deleted = await country.delByPk(data.country_id);
    expect(deleted).to.equal(1);

  });

  it('create : transaction + commit', async function () {

    let trx = null;

    try {

      trx = await country.transaction();

      let data = await country.insert({country: 'xc'}, trx);
      expect(data['country']).to.equal('xc');

      // not committed yet, so the row is not visible outside the transaction
      const find = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(find['country']).to.equal(undefined);

      await country.commit(trx);

      data = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(data['country']).to.equal('xc');

      const deleted = await country.delByPk(data.country_id);
      expect(deleted).to.equal(1);

    } catch (e) {
      if (trx)
        await country.rollback(trx);
      console.log(e);
      throw e;
    }

  });

  it('create: transaction + rollback ', async function () {

    let trx = null;
    let data = null;
    try {

      trx = await country.transaction();

      data = await country.insert({country: 'xc'}, trx);
      expect(data['country']).to.equal('xc');

      // row should not be visible outside the uncommitted transaction
      let find = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(find['country']).to.equal(undefined);

      await country.rollback(trx);

      find = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(find['country']).to.equal(undefined);

    } catch (e) {
      await country.rollback(trx);
      console.log(e);
      throw e;
    }

  });

  it('update : transaction + commit', async function () {

    let trx = null;
    let data = null;
    try {

      data = await country.insert({country: 'xc'});

      trx = await country.transaction();
      data.country = 'xgc';
      const updated = await country.updateByPk(data.country_id, data, trx);
      expect(updated).to.equal(1);

      await country.commit(trx);

      data = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(data['country']).to.equal('xgc');

    } catch (e) {
      await country.rollback(trx);
      console.log(e);
      throw e;
    } finally {
      if (data)
        await country.delByPk(data.country_id);
    }

  });

  it('update: transaction + rollback ', async function () {

    let trx = null;
    let data = null;
    try {

      data = await country.insert({country: 'xc'});

      trx = await country.transaction();
      data.country = 'xgc';
      const updated = await country.updateByPk(data.country_id, data, trx);
      expect(updated).to.equal(1);

      await country.rollback(trx);

      data = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(data['country']).to.equal('xc');

    } catch (e) {
      await country.rollback(trx);
      console.log(e);
      throw e;
    } finally {
      if (data)
        await country.delByPk(data.country_id);
    }

  });

  it('delete : transaction + commit', async function () {

    let trx = null;
    let data = null;
    try {

      data = await country.insert({country: 'xc'});

      trx = await country.transaction();
      const deleted = await country.delByPk(data.country_id, trx);
      expect(deleted).to.equal(1);

      await country.commit(trx);

      data = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(data['country']).to.equal(undefined);

      data = null;

    } catch (e) {
      await country.rollback(trx);
      console.log(e);
      throw e;
    } finally {
      if (data)
        await country.delByPk(data.country_id);
    }

  });

  it('delete: transaction + rollback ', async function () {

    let trx = null;
    let data = null;
    try {

      data = await country.insert({country: 'xc'});

      trx = await country.transaction();
      const deleted = await country.delByPk(data.country_id, trx);
      expect(deleted).to.equal(1);

      await country.rollback(trx);

      data = await country.findOne({where: `(country_id,eq,${data.country_id})`});
      expect(data['country']).to.equal('xc');

    } catch (e) {
      await country.rollback(trx);
      console.log(e);
      throw e;
    } finally {
      if (data)
        await country.delByPk(data.country_id);
    }

  });

  it('read invalid', async function () {
    const data = await country.readByPk('xys');
    expect(Object.keys(data).length).to.equal(0);
  });

  it('list', async function () {
    const data = await country.list({});
    expect(data.length).to.not.equal(0);
  });

  it('list + fields', async function () {

    let data = await country.list({fields: 'country'});
    expect(Object.keys(data[0]).length).to.equal(1);
    expect(Object.keys(data[0])[0]).to.equal('country');

    data = await country.list({f: 'country'});
    expect(Object.keys(data[0]).length).to.equal(1);
    expect(Object.keys(data[0])[0]).to.equal('country');

    data = await country.list({f: 'country_id,country'});
    expect(Object.keys(data[0]).length).to.equal(2);
    expect(Object.keys(data[0])[0]).to.equal('country_id');
    expect(Object.keys(data[0])[1]).to.equal('country');

  });

  it('list + limit', async function () {
    let data = await country.list({limit: 2});
    expect(data.length).to.equal(2);

    data = await country.list({l: 2});
    expect(data.length).to.equal(2);

  });

  it('list + offset', async function () {
    const data = await country.list({offset: 1});
    expect(data[0]['country']).to.not.equal('Afghanistan');
  });

  it('list + where', async function () {

    let data = await country.list({where: '(country,eq,India)'});
    expect(data.length).to.equal(1);

    data = await country.list({w: '(country,eq,India)'});
    expect(data.length).to.equal(1);

  });
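The `(field,op,value)` condition strings used in these `where` filters can be parsed with a few lines of string handling. The sketch below is illustrative only — it is not the data mapper's actual parser, and it handles only a single, un-nested condition:

```javascript
// Illustrative parser for a single `(field,op,value)` condition string,
// e.g. '(country,eq,India)' — not the library's real implementation.
function parseCondition(str) {
  const m = /^\((.+?),(\w+),(.*)\)$/.exec(str);
  if (!m) throw new Error(`invalid condition: ${str}`);
  return {field: m[1], op: m[2], value: m[3]};
}

console.log(parseCondition('(country,eq,India)'));
// {field: 'country', op: 'eq', value: 'India'}
```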

  it('list + sort', async function () {

    let data = await country.list({sort: 'country'});
    expect(data[0]['country']).to.equal('Afghanistan');

    data = await country.list({sort: '-country'});
    expect(data[0]['country']).to.equal('Zambia');

  });

  it('findOne', async function () {

    let data = await country.findOne({where: '(country,eq,India)'});
    expect(data['country']).to.equal('India');

    data = await country.findOne({sort: '-country'});
    expect(data['country']).to.equal('Zambia');

    data = await country.findOne({offset: '1'});
    expect(data['country']).not.to.equal('Afghanistan');

    data = await country.findOne();
    expect(data['country']).to.equal('Afghanistan');

  });

  it('count', async function () {
    const data = await country.countByPk({});
    expect(data['count']).to.be.above(100);
  });

  it('groupBy', async function () {
    let data = await city.groupBy({cn: 'country_id', limit: 2});
    expect(data[0]['count']).to.not.equal(0);
    expect(data.length).to.be.equal(2);

    data = await city.groupBy({cn: 'country_id', having: '(count,gt,50)'});
    expect(data.length).to.be.equal(2);
  });

  it('aggregate', async function () {

    let data = await city.aggregate({cn: 'country_id', fields: 'country_id', func: 'count,sum,avg'});
    expect(data[0]['count']).to.not.equal(0);
    expect(data[0]['sum']).to.not.equal(0);
    expect(data[0]['avg']).to.not.equal(0);

    data = await city.aggregate({
      cn: 'country_id',
      fields: 'country_id',
      func: 'count,sum,avg',
      having: '(count,gt,50)'
    });
    expect(data.length).to.be.equal(2);

  });

  it('distinct', async function () {

    let data = await city.distinct({cn: 'country_id'});
    expect(Object.keys(data[0]).length).to.be.equal(1);
    expect(data[0]['country_id']).to.not.equal(0);

    data = await city.distinct({cn: 'country_id', fields: 'city'});
    expect(Object.keys(data[0]).length).to.be.equal(2);

  });

  it('distribution', async function () {

    let data = await city.distribution({cn: 'country_id', steps: '0,20,40,100'});
    expect(data.length).to.be.equal(3);
    expect(data[0]['range']).to.be.equal('0-20');
    expect(data[0]['count']).to.be.above(70);
    expect(data[1]['range']).to.be.equal('21-40');
    expect(data[1]['count']).to.be.equal(100);
    expect(data[2]['range']).to.be.equal('41-100');

    data = await city.distribution({cn: 'country_id', step: 20, min: 0, max: 100});
    expect(data.length).to.be.equal(5);
    expect(data[0]['range']).to.be.equal('0-20');
    expect(data[0]['count']).to.be.above(70);
    expect(data[3]['count']).to.be.equal(106);

  });
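The range labels the distribution test asserts ('0-20', '21-40', '41-100') can be derived from the `steps` string. The function below sketches only that labeling logic, assuming the inclusive-boundary convention the assertions show; it is not the mapper's actual code:

```javascript
// Derive distribution range labels from a steps string such as '0,20,40,100'.
// The first bucket starts at the first step; each later bucket starts one
// past the previous boundary, matching '0-20', '21-40', '41-100' above.
function rangesFromSteps(steps) {
  const bounds = steps.split(',').map(Number);
  const ranges = [];
  for (let i = 1; i < bounds.length; ++i) {
    const start = i === 1 ? bounds[0] : bounds[i - 1] + 1;
    ranges.push(`${start}-${bounds[i]}`);
  }
  return ranges;
}

console.log(rangesFromSteps('0,20,40,100')); // ['0-20', '21-40', '41-100']
```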

  it('hasManyList', async function () {

    let data = await country.hasManyList({childs: 'city'});
    for (let i = 0; i < data.length; ++i) {
      expect(data[i].city.length).to.not.equal(0);
      expect(Object.keys(data[i]).length).to.be.equal(4);
      expect(Object.keys(data[i].city[0]).length).to.be.equal(4);
    }

    data = await country.hasManyList({childs: 'city', fields: 'country', fields1: 'city'});
    for (let i = 0; i < data.length; ++i) {
      expect(data[i].city.length).to.not.equal(0);
      expect(Object.keys(data[i]).length).to.be.equal(3);
      expect(Object.keys(data[i].city[0]).length).to.be.equal(2);
    }

    data = await models.film.hasManyList({childs: 'inventory.film_category'});
    for (let i = 0; i < data.length; ++i) {
      expect(data[i].inventory.length).to.not.equal(0);
      expect(data[i].film_category.length).to.not.equal(0);
    }

  });

  it('belongsTo', async function () {

    const data = await city.belongsTo({parents: 'country'});
    for (let i = 0; i < data.length; ++i) {
      expect(data[i].country).to.not.equal(null);
    }

  });

  it('hasManyListGQL', async function () {
    const data = await country.hasManyListGQL({ids: [1, 2], child: 'city'});
    expect(data['1'].length).to.not.equal(0);
    expect(data['2'].length).to.not.equal(0);
  });
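`hasManyListGQL` returns child rows keyed by parent id, as the `data['1']` / `data['2']` accesses above show. The grouping step can be sketched as a plain reduce — a hypothetical shape meant only to mirror that access pattern, not the library's implementation:

```javascript
// Group child rows by their parent key into an {id: rows[]} map,
// mirroring the shape the hasManyListGQL test reads (data['1'], data['2']).
function groupByParent(rows, parentKey) {
  return rows.reduce((acc, row) => {
    const k = String(row[parentKey]);
    (acc[k] = acc[k] || []).push(row);
    return acc;
  }, {});
}

// Illustrative rows shaped like the sakila city table.
const cities = [
  {city: 'Kabul', country_id: 1},
  {city: 'Batna', country_id: 2},
  {city: 'Bchar', country_id: 2}
];
console.log(groupByParent(cities, 'country_id'));
```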

  it('bulk - create, update, delete', async function () {
    // let data = await country.insertb([{country: 'IN'}, {country: 'US'}]);
    // console.log(data);
  });

  it('update and delete with where', async function () {

    const data = await country.insertb([{country: 'abc'}, {country: 'abc'}]);

    const updateCount = await country.update({
      data: {country: 'abcd'},
      where: '(country,eq,abc)'
    });

    expect(updateCount).to.be.equal(2);

    const deleteCount = await country.del({
      where: '(country,eq,abcd)'
    });

    expect(deleteCount).to.be.equal(2);
  });

});
/**
 * @copyright Copyright (c) 2021, Xgene Cloud Ltd
 *
 * @author Naveen MR <oof1lab@gmail.com>
 * @author Pranav C Balan <pranavxc@gmail.com>
 *
 * @license GNU AGPL version 3 or any later version
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as
 * published by the Free Software Foundation, either version 3 of the
 * License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <http://www.gnu.org/licenses/>.
 *
 */
@ -1,22 +0,0 @@
exports.Migrator = require("./lib/SqlMigrator/lib/KnexMigrator.js");
/**
 * @copyright Copyright (c) 2021, Xgene Cloud Ltd
 *
 * @author Naveen MR <oof1lab@gmail.com>
 * @author Pranav C Balan <pranavxc@gmail.com>
 *
 * @license GNU AGPL version 3 or any later version
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as
 * published by the Free Software Foundation, either version 3 of the
 * License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <http://www.gnu.org/licenses/>.
 *
 */
@ -1,22 +0,0 @@
{
  "tags": {
    "allowUnknownTags": false
  },
  "source": {
    "include": "./lib/SqlMigrator/lib",
    "includePattern": ".js$",
    "excludePattern": "(node_modules/|docs)"
  },
  "plugins": ["plugins/markdown"],
  "opts": {
    "template": "node_modules/minami",
    "encoding": "utf8",
    "destination": "docs/",
    "recurse": true,
    "verbose": true
  },
  "templates": {
    "cleverLinks": false,
    "monospaceLinks": false
  }
}
@ -1,177 +0,0 @@
/* eslint-disable func-names */
const KnexMigrator = require("../../SqlMigrator/lib/KnexMigrator");

process.on("uncaughtException", function(err) {
  console.error(err.stack);
  console.log("Node NOT Exiting...");
});

process.on("unhandledRejection", function(err) {
  console.error("> > > ", err);
  process.exit(1);
});

class Cli {

  constructor(commander) {
    this.program = commander;
    this.args = commander.args;
    this.migrator = new KnexMigrator();
  }

  async _handleMigrationUp() {
    // when a target file is given, steps are ignored
    let migrationSteps = this.program.steps || 9999;

    if (this.program.file) {
      migrationSteps = 0;
    }

    await this.migrator.migrationsUp({
      env: this.program.env || "dev",
      dbAlias: this.program.dbAlias || "primary",
      migrationSteps,
      onlyList: this.program.list,
      file: this.program.file || null
    });
  }
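The steps/file interplay in `_handleMigrationUp` (a target file forces `migrationSteps` to 0) suggests selection logic along these lines. This is a hedged sketch of how pending migrations could be chosen, not the migrator's real code; `selectMigrations` and the filenames are hypothetical:

```javascript
// Sketch: choose which pending migration files to run, given either a step
// count or a target filename. A file wins over steps, matching the CLI
// setting migrationSteps to 0 whenever --file is passed.
function selectMigrations(pending, {steps = 9999, file = null} = {}) {
  if (file) {
    const idx = pending.indexOf(file);
    if (idx === -1) throw new Error(`unknown migration file: ${file}`);
    return pending.slice(0, idx + 1); // everything up to and including file
  }
  return pending.slice(0, steps);
}

const pending = ['001.up.sql', '002.up.sql', '003.up.sql'];
console.log(selectMigrations(pending, {steps: 2}));           // first two
console.log(selectMigrations(pending, {file: '002.up.sql'})); // up to 002
```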

  async _handleMigration() {
    if (this.args.length < 2) {
      console.log("Show CLI help");
      return;
    }

    switch (this.args[1]) {
      case "create":
      case "c": {
        await this.migrator.migrationsCreate({
          dbAlias: this.program.dbAlias || "primary"
        });
        break;
      }

      case "up":
      case "u":
        await this._handleMigrationUp(this.args);
        break;

      case "down":
      case "d": {
        let migrationSteps = this.program.steps || 9999;

        if (this.program.file) {
          migrationSteps = 0;
        }

        await this.migrator.migrationsDown({
          env: this.program.env || "dev",
          dbAlias: this.program.dbAlias || "primary",
          migrationSteps,
          onlyList: this.program.list,
          file: this.program.file || null
        });
        break;
      }

      default:
        break;
    }
  }
||||
|
||||
async _handleInit(args) { |
||||
await this.migrator.init(args); |
||||
} |
||||
|
||||
async _handleSync() { |
||||
await this.migrator.sync(); |
||||
} |
||||
|
||||
async _handleClean(args) { |
||||
await this.migrator.clean(args); |
||||
} |
||||
|
||||
async handle() { |
||||
console.log("> > > > > > >"); |
||||
|
||||
let args = {}; |
||||
|
||||
console.log("Handling args:", this.args, this.args.length); |
||||
|
||||
if (this.args.length === 0) { |
||||
console.log("Show CLI help"); |
||||
return; |
||||
} |
||||
|
||||
switch (this.args[0]) { |
||||
case "init": |
||||
case "i": |
||||
console.log("Handling init"); |
||||
args = {}; |
||||
args.type = this.args[1] || "mysql2"; |
||||
await this._handleInit(args); |
||||
break; |
||||
|
||||
case "sync": |
||||
case "s": |
||||
await this._handleSync(); |
||||
break; |
||||
|
||||
case "clean": |
||||
case "c": |
||||
console.log("Handling clean"); |
||||
args = {}; |
||||
args.dbAlias = this.program.dbAlias || null; |
||||
args.env = this.program.env || null; |
||||
args.json = this.program.json || false; |
||||
await this._handleClean(args); |
||||
break; |
||||
|
||||
case "migration": |
||||
case "m": |
||||
// console.log('Handling migration');
|
||||
await this._handleMigration(); |
||||
break; |
||||
|
||||
case "help": |
||||
case "h": |
||||
console.log("Show CLI help"); |
||||
break; |
||||
|
||||
default: |
||||
console.log("Unknown option"); |
||||
console.log("Show CLI help"); |
||||
break; |
||||
} |
||||
|
||||
console.log("< < < < < < < < < <"); |
||||
|
||||
return 0; |
||||
} |
||||
} |
||||
|
||||
module.exports = Cli; |
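The `handle()` dispatch above maps positional args to migrator calls. A stripped-down stand-in of that routing (the `FakeMigrator` here is a hypothetical stub, not the real `KnexMigrator`) shows how command aliases and full names resolve to the same call:

```javascript
// Hypothetical stub standing in for KnexMigrator, so the routing can run
// without a database connection.
class FakeMigrator {
  async init(args) { return `init:${args.type}`; }
  async sync() { return "sync"; }
}

// Mirrors the first level of Cli.handle()'s switch: short and long command
// names route to the same migrator method.
function dispatch(args, migrator) {
  switch (args[0]) {
    case "init":
    case "i":
      return migrator.init({ type: args[1] || "mysql2" });
    case "sync":
    case "s":
      return migrator.sync();
    default:
      return Promise.resolve("help");
  }
}

dispatch(["i", "sqlite3"], new FakeMigrator()).then(r => console.log(r)); // → init:sqlite3
```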
||||
/** |
||||
* @copyright Copyright (c) 2021, Xgene Cloud Ltd |
||||
* |
||||
* @author Naveen MR <oof1lab@gmail.com> |
||||
* @author Pranav C Balan <pranavxc@gmail.com> |
||||
* |
||||
* @license GNU AGPL version 3 or any later version |
||||
* |
||||
* This program is free software: you can redistribute it and/or modify |
||||
* it under the terms of the GNU Affero General Public License as |
||||
* published by the Free Software Foundation, either version 3 of the |
||||
* License, or (at your option) any later version. |
||||
* |
||||
* This program is distributed in the hope that it will be useful, |
||||
* but WITHOUT ANY WARRANTY; without even the implied warranty of |
||||
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
||||
* GNU Affero General Public License for more details. |
||||
* |
||||
* You should have received a copy of the GNU Affero General Public License |
||||
* along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
* |
||||
*/ |
@ -1,396 +0,0 @@
|
||||
const should = require("should"); |
||||
const path = require("path"); |
||||
const { promisify } = require("util"); |
||||
const fs = require("fs"); |
||||
const jsonfile = require("jsonfile"); |
||||
const glob = require("glob"); |
||||
const SqlMigratorCli = require("../lib/SqlMigratorCli"); |
||||
const SqlClientFactory = require("../../SqlClient/lib/SqlClientFactory"); |
||||
|
||||
class TestUtil { |
||||
  static async checkFileExists(file, isExists, msg) { |
||||
    const exist = await promisify(fs.exists)(file); |
||||
    should.equal(exist, isExists, msg); |
||||
  } |
||||
|
||||
static async cmdInit(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
static async cmdSync(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
static async cmdMigrationCreate(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
/** |
||||
* |
||||
* @param args |
||||
* @param args.tn |
||||
* @param args.env |
||||
* @param args.envIndex |
||||
* @param args.recordsLength |
||||
* @returns {Promise<void>} |
||||
*/ |
||||
static async cmdMigrationUpVerify(args) { |
||||
args.env = args.env || "dev"; |
||||
args.envIndex = args.envIndex || 0; |
||||
|
||||
const project = await promisify(jsonfile.readFile)("./config.xc.json"); |
||||
const sqlClient = SqlClientFactory.create( |
||||
project.envs[args.env][args.envIndex] |
||||
); |
||||
|
||||
const exists = await sqlClient.hasTable({ tn: args.tn }); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
      `${args.tn} should have been created by the migration` |
||||
); |
||||
|
||||
const rows = await sqlClient.selectAll( |
||||
project.envs[args.env][args.envIndex].meta.tn |
||||
); |
||||
should.equal( |
||||
rows.length, |
||||
args.recordsLength, |
||||
      `expected ${args.recordsLength} rows in the migrations meta table` |
||||
); |
||||
} |
||||
|
||||
/** |
||||
* |
||||
* @param args |
||||
* @param args.tn |
||||
* @param args.env |
||||
* @param args.envIndex |
||||
* @param args.recordsLength |
||||
* @returns {Promise<void>} |
||||
*/ |
||||
static async cmdMigrationDownVerify(args) { |
||||
args.env = args.env || "dev"; |
||||
args.envIndex = args.envIndex || 0; |
||||
|
||||
const project = await promisify(jsonfile.readFile)("./config.xc.json"); |
||||
const sqlClient = SqlClientFactory.create( |
||||
project.envs[args.env][args.envIndex] |
||||
); |
||||
|
||||
const exists = await sqlClient.hasTable({ tn: args.tn }); |
||||
should.equal( |
||||
exists.data.value, |
||||
false, |
||||
      `${args.tn} table should have been dropped by the down migration` |
||||
); |
||||
|
||||
const rows = await sqlClient.selectAll( |
||||
project.envs[args.env][args.envIndex].meta.tn |
||||
); |
||||
should.equal( |
||||
rows.length, |
||||
args.recordsLength, |
||||
      `expected ${args.recordsLength} rows in the migrations meta table after down` |
||||
); |
||||
} |
||||
|
||||
static async cmdMigrationUp(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
static async cmdMigrationCreateVerify(args) { |
||||
const { upStatement } = args; |
||||
const { downStatement } = args; |
||||
const { recordsLength } = args; |
||||
const { dbAlias } = args; |
||||
|
||||
let files = []; |
||||
|
||||
files = await promisify(glob)(`./server/tool/${dbAlias}/migrations/*.up.sql`); |
||||
console.log(files); |
||||
should.equal( |
||||
files.length, |
||||
recordsLength, |
||||
      `/server/tool/${dbAlias}/migrations up file was not created` |
||||
); |
||||
|
||||
await promisify(fs.writeFile)( |
||||
files[files.length - 1], |
||||
upStatement, |
||||
"utf-8" |
||||
); |
||||
|
||||
files = await promisify(glob)( |
||||
`./server/tool/${dbAlias}/migrations/*.down.sql` |
||||
); |
||||
should.equal( |
||||
files.length, |
||||
recordsLength, |
||||
      `./server/tool/${dbAlias}/migrations down file was not created` |
||||
); |
||||
await promisify(fs.writeFile)( |
||||
files[files.length - 1], |
||||
downStatement, |
||||
"utf-8" |
||||
); |
||||
} |
||||
|
||||
static async cmdMigrationDown(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
/** |
||||
* |
||||
* @param args |
||||
* @param args.env |
||||
* @param args.dbAlias |
||||
* @returns {Promise<void>} |
||||
*/ |
||||
static async cmdMigrationCleanVerify(args) { |
||||
let exists = await promisify(fs.exists)("./server/tool/primary"); |
||||
should.equal( |
||||
exists, |
||||
false, |
||||
"./server/tool/primary is still left after clean" |
||||
); |
||||
|
||||
exists = await promisify(fs.exists)("./server/tool/primary/migrations"); |
||||
should.equal( |
||||
exists, |
||||
false, |
||||
"./server/tool/primary/migrations is still left after clean" |
||||
); |
||||
|
||||
exists = await promisify(fs.exists)("./server/tool/secondary"); |
||||
should.equal( |
||||
exists, |
||||
false, |
||||
"./server/tool/secondary is still left after clean" |
||||
); |
||||
|
||||
exists = await promisify(fs.exists)("./server/tool/secondary/migrations"); |
||||
should.equal( |
||||
exists, |
||||
false, |
||||
"./server/tool/secondary/migrations is still left after clean" |
||||
); |
||||
|
||||
// database exists in all environments
|
||||
|
||||
const project = await promisify(jsonfile.readFile)("./config.xc.json"); |
||||
|
||||
for (const key in project.envs) { |
||||
      for (let i = 0; i < project.envs[key].length; ++i) { |
||||
const connection = project.envs[key][i]; |
||||
|
||||
        if (connection.client === "sqlite3") { |
||||
          // distinct index so the outer loop variable is not shadowed |
||||
          for (let j = 0; j < project.envs[key].length; ++j) { |
||||
const sqlClient = SqlClientFactory.create(connection); |
||||
exists = await sqlClient.hasDatabase({ |
||||
databaseName: connection.connection.connection.filename |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
false, |
||||
              `${key}/${ |
||||
                connection.connection.connection.filename |
||||
              } still exists after clean` |
||||
); |
||||
} |
||||
} else { |
||||
try { |
||||
exists = { data: { value: false } }; |
||||
const sqlClient = SqlClientFactory.create(connection); |
||||
exists = await sqlClient.hasDatabase({ |
||||
databaseName: connection.connection.database |
||||
}); |
||||
} catch (e) { |
||||
// exists.data = {false};
|
||||
} |
||||
|
||||
should.equal( |
||||
exists.data.value, |
||||
false, |
||||
            `${key}/${connection.connection.database} still exists after clean` |
||||
); |
||||
} |
||||
|
||||
// exists = await sqlClient.hasTable({tn:connection.meta.tn});
|
||||
// should.equal(
|
||||
// exists,
|
||||
// false,
|
||||
// `${key}/${$connection.connection.database}/${
|
||||
// connection.meta.tn
|
||||
// } do not exists`
|
||||
// );
|
||||
} |
||||
} |
||||
} |
||||
|
||||
static async cmdMigrationClean(args) { |
||||
const cli = new SqlMigratorCli(args); |
||||
await cli.handle(); |
||||
} |
||||
|
||||
/** |
||||
* |
||||
   * @param {object} args - reserved for future use |
||||
* @returns {Promise<void>} |
||||
*/ |
||||
static async cmdInitVerify(args = {}) { |
||||
/** ************** START : init verify *************** */ |
||||
await this.checkFileExists( |
||||
"./config.xc.json", |
||||
true, |
||||
"config.xc.json is not created on init" |
||||
); |
||||
await this.checkFileExists( |
||||
"./xmigrator", |
||||
true, |
||||
"./xmigrator is not created on init" |
||||
); |
||||
await this.checkFileExists( |
||||
"./server/tool/primary", |
||||
true, |
||||
"./server/tool/primary is not created on init" |
||||
); |
||||
await this.checkFileExists( |
||||
"./server/tool/primary/migrations", |
||||
true, |
||||
"./server/tool/primary/migrations is not created on init" |
||||
); |
||||
await this.checkFileExists( |
||||
"./server/tool/secondary", |
||||
true, |
||||
"./server/tool/secondary is not created on init" |
||||
); |
||||
await this.checkFileExists( |
||||
"./server/tool/secondary/migrations", |
||||
true, |
||||
"./server/tool/secondary/migrations is not created on init" |
||||
); |
||||
/** ************** END : init verify *************** */ |
||||
} |
||||
|
||||
/** |
||||
* |
||||
   * @param {object} args |
||||
* @returns {Promise<void>} |
||||
*/ |
||||
static async cmdSyncVerify() { |
||||
const project = await promisify(jsonfile.readFile)("./config.xc.json"); |
||||
|
||||
try { |
||||
for (const key in project.envs) { |
||||
for (let i = 0; i < project.envs[key].length; ++i) { |
||||
const connection = project.envs[key][i]; |
||||
const sqlClient = SqlClientFactory.create(connection); |
||||
|
||||
if (connection.client === "sqlite3") { |
||||
let exists = await sqlClient.hasDatabase({ |
||||
databaseName: connection.connection.connection.filename |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
`${key}: /${ |
||||
connection.connection.connection.filename |
||||
            } does not exist` |
||||
); |
||||
|
||||
exists = await sqlClient.hasTable({ |
||||
tn: connection.meta.tn |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
`/${connection.connection.connection.filename}/${ |
||||
connection.meta.tn |
||||
            } does not exist` |
||||
); |
||||
} else if (connection.client === "oracledb") { |
||||
let exists = await sqlClient.hasDatabase({ |
||||
databaseName: connection.connection.user |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
            `${key}/${connection.connection.user} does not exist` |
||||
); |
||||
|
||||
exists = await sqlClient.hasTable({ |
||||
tn: connection.meta.tn |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
`${key}/${connection.connection.database}/${ |
||||
connection.meta.tn |
||||
            } does not exist` |
||||
); |
||||
} else { |
||||
let exists = await sqlClient.hasDatabase({ |
||||
databaseName: connection.connection.database |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
            `${key}/${connection.connection.database} does not exist` |
||||
); |
||||
|
||||
exists = await sqlClient.hasTable({ |
||||
tn: connection.meta.tn |
||||
}); |
||||
should.equal( |
||||
exists.data.value, |
||||
true, |
||||
`${key}/${connection.connection.database}/${ |
||||
connection.meta.tn |
||||
            } does not exist` |
||||
); |
||||
} |
||||
} |
||||
} |
||||
} catch (e) { |
||||
console.log(e); |
||||
throw e; |
||||
} |
||||
} |
||||
|
||||
  // Blocking busy-wait: keeps consecutive migration filenames distinct, |
||||
  // since their prefixes only resolve to whole seconds |
||||
  static sleep(milliSeconds = 1100) { |
||||
const until = new Date().getTime() + milliSeconds; |
||||
while (new Date().getTime() < until) {} |
||||
} |
||||
} |
||||
|
||||
module.exports = TestUtil; |
||||
/** |
||||
* @copyright Copyright (c) 2021, Xgene Cloud Ltd |
||||
* |
||||
* @author Naveen MR <oof1lab@gmail.com> |
||||
* @author Pranav C Balan <pranavxc@gmail.com> |
||||
* |
||||
* @license GNU AGPL version 3 or any later version |
||||
* |
||||
* This program is free software: you can redistribute it and/or modify |
||||
* it under the terms of the GNU Affero General Public License as |
||||
* published by the Free Software Foundation, either version 3 of the |
||||
* License, or (at your option) any later version. |
||||
* |
||||
* This program is distributed in the hope that it will be useful, |
||||
* but WITHOUT ANY WARRANTY; without even the implied warranty of |
||||
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
||||
* GNU Affero General Public License for more details. |
||||
* |
||||
* You should have received a copy of the GNU Affero General Public License |
||||
* along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
* |
||||
*/ |
@ -1,425 +0,0 @@
|
||||
/* eslint-disable func-names */ |
||||
// 'use strict';
|
||||
const { promisify } = require("util"); |
||||
const fs = require("fs"); |
||||
const jsonfile = require("jsonfile"); |
||||
const should = require("should"); |
||||
const SqlClientFactory = require("../../SqlClient/lib/SqlClientFactory"); |
||||
|
||||
const TestUtil = require("./TestUtil"); |
||||
|
||||
const sqlStatements = require("./sql.statements"); |
||||
|
||||
const { DB_TYPE } = process.env; |
||||
|
||||
let sqlType = "mysql"; |
||||
let upStatement = null; |
||||
let downStatement = null; |
||||
let blogUp = null; |
||||
let blogDown = null; |
||||
|
||||
let testNum = 0; |
||||
|
||||
const sqlDbs = { |
||||
sqlite: { |
||||
sqlType: "sqlite", |
||||
upStatement: sqlStatements.sqlite3.user.up, |
||||
downStatement: sqlStatements.sqlite3.user.down, |
||||
blogUp: sqlStatements.sqlite3.blog.up, |
||||
blogDown: sqlStatements.sqlite3.blog.down |
||||
} |
||||
// mysql: {
|
||||
// sqlType: "mysql",
|
||||
// upStatement: sqlStatements.mysql.user.up,
|
||||
// downStatement: sqlStatements.mysql.user.down,
|
||||
// blogUp: sqlStatements.mysql.blog.up,
|
||||
// blogDown: sqlStatements.mysql.blog.down
|
||||
// },
|
||||
// pg: {
|
||||
// sqlType: "pg",
|
||||
// upStatement: sqlStatements.pg.user.up,
|
||||
// downStatement: sqlStatements.pg.user.down,
|
||||
// blogUp: sqlStatements.pg.blog.up,
|
||||
// blogDown: sqlStatements.pg.blog.down
|
||||
// },
|
||||
// mssql: {
|
||||
// sqlType: "mssql",
|
||||
// upStatement: sqlStatements.mssql.user.up,
|
||||
// downStatement: sqlStatements.mssql.user.down,
|
||||
// blogUp: sqlStatements.mssql.blog.up,
|
||||
// blogDown: sqlStatements.mssql.blog.down
|
||||
// },
|
||||
// oracle: {
|
||||
// sqlType: "oracle",
|
||||
// upStatement: sqlStatements.oracledb.user.up,
|
||||
// downStatement: sqlStatements.oracledb.user.down,
|
||||
// blogUp: sqlStatements.oracledb.blog.up,
|
||||
// blogDown: sqlStatements.oracledb.blog.down
|
||||
// }
|
||||
}; |
||||
|
||||
let db = sqlDbs[DB_TYPE]; |
||||
|
||||
if (!db) { |
||||
  console.error(`Invalid DB_TYPE "${DB_TYPE}", falling back to sqlite tests`); |
||||
db = sqlDbs.sqlite; |
||||
} |
||||
|
||||
// sqlDbs.forEach(function(db) {
|
||||
describe("SqlMigratorCli : Tests", function() { |
||||
before(async function() { |
||||
try { |
||||
await promisify(fs.unlink)("./config.xc.json"); |
||||
} catch (e) { |
||||
console.log(".."); |
||||
} |
||||
|
||||
sqlType = db.sqlType; |
||||
upStatement = db.upStatement; |
||||
downStatement = db.downStatement; |
||||
blogUp = db.blogUp; |
||||
blogDown = db.blogDown; |
||||
testNum = 0; |
||||
}); |
||||
|
||||
beforeEach(function(done) { |
||||
console.log("\n", `${sqlType}:${testNum}`); |
||||
testNum++; |
||||
done(); |
||||
}); |
||||
|
||||
it(`xmigrator init should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdInit({ |
||||
args: ["i", sqlType] |
||||
}); |
||||
await TestUtil.cmdInitVerify(); |
||||
}); |
||||
|
||||
it(`xmigrator sync should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdSync({ |
||||
args: ["s"] |
||||
}); |
||||
|
||||
await TestUtil.cmdSyncVerify(); |
||||
}); |
||||
|
||||
it(`xmigrator migration create (first migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationCreate({ |
||||
args: ["m", "c"] |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationCreateVerify({ |
||||
upStatement, |
||||
downStatement, |
||||
recordsLength: 1, |
||||
dbAlias: "primary" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration create (second migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
TestUtil.sleep(); |
||||
|
||||
await TestUtil.cmdMigrationCreate({ |
||||
args: ["m", "c"] |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationCreateVerify({ |
||||
upStatement: blogUp, |
||||
downStatement: blogDown, |
||||
recordsLength: 2, |
||||
dbAlias: "primary" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration up --step=1 (first migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationUp({ |
||||
args: ["m", "u"], |
||||
steps: 1 |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 1, |
||||
tn: "user" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration up --step=1 (second migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationUp({ |
||||
args: ["m", "u"], |
||||
steps: 1 |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "blog" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration down should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationDown({ |
||||
args: ["m", "d"] |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
tn: "blog" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
tn: "user" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration up should pass`, async function() { |
||||
await TestUtil.cmdMigrationUp({ |
||||
args: ["m", "u"] |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "blog" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "user" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration down --steps=1 (first migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationDown({ |
||||
args: ["m", "d"], |
||||
steps: 1 |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 1, |
||||
tn: "blog" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration down --steps=1 (second migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationDown({ |
||||
args: ["m", "d"], |
||||
steps: 1 |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
tn: "user" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration create (secondary - 1st migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
TestUtil.sleep(); |
||||
|
||||
await TestUtil.cmdMigrationCreate({ |
||||
args: ["m", "c"], |
||||
dbAlias: "db2" |
||||
}); |
||||
//
|
||||
await TestUtil.cmdMigrationCreateVerify({ |
||||
upStatement, |
||||
downStatement, |
||||
recordsLength: 1, |
||||
dbAlias: "db2" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration create (secondary - 2nd migration) should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
TestUtil.sleep(); |
||||
|
||||
await TestUtil.cmdMigrationCreate({ |
||||
args: ["m", "c"], |
||||
dbAlias: "db2" |
||||
}); |
||||
//
|
||||
await TestUtil.cmdMigrationCreateVerify({ |
||||
upStatement: blogUp, |
||||
downStatement: blogDown, |
||||
recordsLength: 2, |
||||
dbAlias: "db2" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration up should pass db2`, async function() { |
||||
await TestUtil.cmdMigrationUp({ |
||||
args: ["m", "u"], |
||||
dbAlias: "db2" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "user", |
||||
envIndex: 1 |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "blog", |
||||
envIndex: 1 |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration down should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationDown({ |
||||
args: ["m", "d"], |
||||
dbAlias: "db2" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
envIndex: 1, |
||||
tn: "user" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
envIndex: 1, |
||||
tn: "blog" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration up --env test should pass`, async function() { |
||||
await TestUtil.cmdMigrationUp({ |
||||
args: ["m", "u"], |
||||
dbAlias: "db2", |
||||
env: "test" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "user", |
||||
envIndex: 1, |
||||
env: "test" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationUpVerify({ |
||||
recordsLength: 2, |
||||
tn: "blog", |
||||
envIndex: 1, |
||||
env: "test" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator migration down --env test should pass`, async function() { |
||||
this.timeout(20000); |
||||
await TestUtil.cmdMigrationDown({ |
||||
args: ["m", "d"], |
||||
dbAlias: "db2", |
||||
env: "test" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
envIndex: 1, |
||||
tn: "user", |
||||
env: "test" |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationDownVerify({ |
||||
recordsLength: 0, |
||||
envIndex: 1, |
||||
tn: "blog", |
||||
env: "test" |
||||
}); |
||||
}); |
||||
|
||||
it(`xmigrator clean --env=test --dbAlias=db2 should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationClean({ |
||||
args: ["c"], |
||||
env: "test", |
||||
dbAlias: "db2" |
||||
}); |
||||
|
||||
let exists = false; |
||||
|
||||
// database exists in all environments
|
||||
const project = await promisify(jsonfile.readFile)("./config.xc.json"); |
||||
|
||||
const key = "test"; |
||||
for (let i = 0; i < project.envs[key].length; ++i) { |
||||
const connection = project.envs[key][i]; |
||||
if (connection.meta.dbAlias === "db2") { |
||||
try { |
||||
const sqlClient = SqlClientFactory.create(connection); |
||||
} catch (e) { |
||||
exists = false; |
||||
} |
||||
|
||||
should.equal( |
||||
exists, |
||||
false, |
||||
          `${key}/${connection.connection.database} still exists after clean` |
||||
); |
||||
} |
||||
} |
||||
|
||||
// migrations table exists in all environments
|
||||
}); |
||||
|
||||
if (db.sqlType === "oracle") { |
||||
    console.log("\n\nPlease drop all Oracle databases manually\n\n"); |
||||
} else { |
||||
it(`xmigrator clean should pass`, async function() { |
||||
this.timeout(20000); |
||||
|
||||
await TestUtil.cmdMigrationClean({ |
||||
args: ["c"] |
||||
}); |
||||
|
||||
await TestUtil.cmdMigrationCleanVerify(); |
||||
// migrations table exists in all environments
|
||||
}); |
||||
} |
||||
}); |
||||
// });
|
||||
/** |
||||
* @copyright Copyright (c) 2021, Xgene Cloud Ltd |
||||
* |
||||
* @author Naveen MR <oof1lab@gmail.com> |
||||
* @author Pranav C Balan <pranavxc@gmail.com> |
||||
* |
||||
* @license GNU AGPL version 3 or any later version |
||||
* |
||||
* This program is free software: you can redistribute it and/or modify |
||||
* it under the terms of the GNU Affero General Public License as |
||||
* published by the Free Software Foundation, either version 3 of the |
||||
* License, or (at your option) any later version. |
||||
* |
||||
* This program is distributed in the hope that it will be useful, |
||||
* but WITHOUT ANY WARRANTY; without even the implied warranty of |
||||
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
||||
* GNU Affero General Public License for more details. |
||||
* |
||||
* You should have received a copy of the GNU Affero General Public License |
||||
* along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
* |
||||
*/ |
@ -1,103 +0,0 @@
|
||||
module.exports = { |
||||
mysql: { |
||||
user: { |
||||
up: `CREATE TABLE \`user\` (\n \`id\` INT NOT NULL AUTO_INCREMENT,\n \`title\` VARCHAR(45) NULL,\n PRIMARY KEY (\`id\`)\n);`, |
||||
down: "drop table user;" |
||||
}, |
||||
|
||||
blog: { |
||||
up: |
||||
"CREATE TABLE `blog` (\n `id` INT NOT NULL AUTO_INCREMENT,\n `title` VARCHAR(45) NULL,\n PRIMARY KEY (`id`)\n);", |
||||
down: "drop table blog;" |
||||
} |
||||
}, |
||||
|
||||
pg: { |
||||
user: { |
||||
up: `CREATE TABLE "user" (
|
||||
user_id serial PRIMARY KEY, |
||||
username VARCHAR (50) UNIQUE NOT NULL, |
||||
password VARCHAR (50) NOT NULL, |
||||
email VARCHAR (355) UNIQUE NOT NULL, |
||||
created_on TIMESTAMP NOT NULL, |
||||
last_login TIMESTAMP |
||||
);`,
|
||||
down: `DROP TABLE "user";` |
||||
}, |
||||
|
||||
blog: { |
||||
up: `CREATE TABLE "blog" (
|
||||
user_id serial PRIMARY KEY, |
||||
username VARCHAR (50) UNIQUE NOT NULL, |
||||
password VARCHAR (50) NOT NULL, |
||||
email VARCHAR (355) UNIQUE NOT NULL, |
||||
created_on TIMESTAMP NOT NULL, |
||||
last_login TIMESTAMP |
||||
);`,
|
||||
down: `DROP TABLE "blog";` |
||||
} |
||||
}, |
||||
|
||||
mssql: { |
||||
user: { |
||||
up: |
||||
        'CREATE TABLE "user" (\n  user_id INT PRIMARY KEY,\n  last_name VARCHAR(50) NOT NULL,\n  first_name VARCHAR(50)\n);', |
||||
down: 'DROP TABLE "user";' |
||||
}, |
||||
|
||||
blog: { |
||||
up: |
||||
        'CREATE TABLE "blog" (\n  blog_id INT PRIMARY KEY,\n  blog_content VARCHAR(50) NOT NULL\n);', |
||||
down: 'drop table "blog";' |
||||
} |
||||
}, |
||||
  // INFO: a semicolon is required at the end
|
||||
oracledb: { |
||||
user: { |
||||
up: |
||||
'CREATE TABLE "user"\n( user_id number(10) NOT NULL,\n user_name varchar2(50) NOT NULL\n);', |
||||
down: 'DROP TABLE "user";' |
||||
}, |
||||
|
||||
blog: { |
||||
up: |
||||
'CREATE TABLE "blog"\n( blog_id number(10) NOT NULL,\n blog_name varchar2(50) NOT NULL\n);', |
||||
down: 'DROP TABLE "blog";' |
||||
} |
||||
}, |
||||
|
||||
sqlite3: { |
||||
user: { |
||||
up: |
||||
"CREATE TABLE user (\n id INTEGER PRIMARY KEY,\n first_name TEXT NOT NULL\n)", |
||||
down: "drop table user" |
||||
}, |
||||
|
||||
blog: { |
||||
up: "CREATE TABLE blog (\n id INTEGER PRIMARY KEY\n)", |
||||
down: "drop table blog" |
||||
} |
||||
} |
||||
}; |
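Each dialect entry pairs an `up` DDL statement with a reversing `down`. A quick structural check over one entry (a sketch over a copied sqlite3 subset of the map above, so it is self-contained) illustrates the invariant the migrator relies on:

```javascript
// Copied subset of the statements map (sqlite3 only).
const statements = {
  sqlite3: {
    user: {
      up: "CREATE TABLE user (\n  id INTEGER PRIMARY KEY,\n  first_name TEXT NOT NULL\n)",
      down: "drop table user"
    },
    blog: {
      up: "CREATE TABLE blog (\n  id INTEGER PRIMARY KEY\n)",
      down: "drop table blog"
    }
  }
};

// Every up must create the table that its paired down drops.
function pairsAreConsistent(dialect) {
  return Object.entries(dialect).every(([tn, pair]) =>
    new RegExp(`create table\\s+"?${tn}"?`, "i").test(pair.up) &&
    new RegExp(`drop table\\s+"?${tn}"?`, "i").test(pair.down)
  );
}

console.log(pairsAreConsistent(statements.sqlite3)); // → true
```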
||||
/** |
||||
* @copyright Copyright (c) 2021, Xgene Cloud Ltd |
||||
* |
||||
* @author Naveen MR <oof1lab@gmail.com> |
||||
* @author Pranav C Balan <pranavxc@gmail.com> |
||||
* |
||||
* @license GNU AGPL version 3 or any later version |
||||
* |
||||
* This program is free software: you can redistribute it and/or modify |
||||
* it under the terms of the GNU Affero General Public License as |
||||
* published by the Free Software Foundation, either version 3 of the |
||||
* License, or (at your option) any later version. |
||||
* |
||||
* This program is distributed in the hope that it will be useful, |
||||
* but WITHOUT ANY WARRANTY; without even the implied warranty of |
||||
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
||||
* GNU Affero General Public License for more details. |
||||
* |
||||
* You should have received a copy of the GNU Affero General Public License |
||||
* along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||
* |
||||
*/ |
@ -1,22 +0,0 @@
|
||||
/** |
||||
* @copyright Copyright (c) 2021, Xgene Cloud Ltd |
||||
* |
||||
* @author Naveen MR <oof1lab@gmail.com> |
||||
* @author Pranav C Balan <pranavxc@gmail.com> |
||||
* |
||||
* @license GNU AGPL version 3 or any later version |
||||
* |
||||
* This program is free software: you can redistribute it and/or modify |
||||
* it under the terms of the GNU Affero General Public License as |
||||
* published by the Free Software Foundation, either version 3 of the |
||||
* License, or (at your option) any later version. |
||||
* |
||||
* This program is distributed in the hope that it will be useful, |
||||
* but WITHOUT ANY WARRANTY; without even the implied warranty of |
||||
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the |
||||
* GNU Affero General Public License for more details. |
||||
* |
||||
* You should have received a copy of the GNU Affero General Public License |
||||
* along with this program. If not, see <http://www.gnu.org/licenses/>. |
||||
* |
||||
*/ |
File diff suppressed because it is too large
@ -1,73 +0,0 @@

{
  "name": "xc-migrator-ts",
  "version": "0.1.18",
  "description": "SQL based schema migrations or evolutions",
  "main": "index.js",
  "module": "index.js",
  "bin": {
    "xcm": "./index.js"
  },
  "scripts": {
    "test:travis": "echo \"test travis\"",
    "dev": "node index.js -i test",
    "publish": "npm publish .",
    "migration-test": "rm -rf xmigrator.json primary-db secondary-db && DB_TYPE=mysql mocha ./lib/SqlMigratorCli/tests/sql.cli.test.js --exit",
    "migration-tests": "bash ./docker-compose/migrations/run_migration_tests.sh mysql",
    "docs": "jsdoc -c jsdoc.json",
    "doc": "jsdoc2md -t readme.hbs ./lib/SqlMigrator/lib/KnexMigrator.js > README.md",
    "clean": "node index.js c && rm -rf xmigrator xmigrator.json"
  },
  "keywords": [],
  "author": "oof1lab <oof1lab@gmail.com>",
  "homepage": "https://xgene.cloud",
  "license": "AGPL-3.0-or-later",
  "directories": {
    "lib": "lib",
    "test": "__tests__"
  },
  "files": [
    "lib"
  ],
  "repository": {
    "type": "git",
    "url": "git+https://github.com/nocodb/nocodb.git"
  },
  "bugs": {
    "url": "https://github.com/nocodb/nocodb/issues"
  },
  "dependencies": {
    "boxen": "^3.1.0",
    "colors": "^1.3.3",
    "commander": "^2.19.0",
    "dayjs": "^1.8.32",
    "debug": "^4.1.1",
    "ejs": "^3.0.1",
    "emittery": "^0.5.1",
    "glob": "^7.1.3",
    "handlebars": "^4.7.6",
    "jsonfile": "^5.0.0",
    "knex": "^0.20.8",
    "mkdirp": "^0.5.1",
    "rmdir": "^1.2.0"
  },
  "peerDependencies": {
    "sqlite3": "^5.0.0"
  },
  "devDependencies": {
    "chai": "^4.2.0",
    "dotenv": "^8.2.0",
    "eslint": "^5.16.0",
    "eslint-config-airbnb-base": "^13.1.0",
    "eslint-config-prettier": "^4.1.0",
    "eslint-plugin-import": "^2.16.0",
    "eslint-plugin-mocha": "^5.3.0",
    "eslint-plugin-prettier": "^3.0.1",
    "jsdoc": "^3.5.5",
    "minami": "^1.2.3",
    "mocha": "^6.0.2",
    "nyc": "^13.3.0",
    "prettier": "^1.16.4",
    "should": "^13.2.3",
    "jsdoc-to-markdown": "^5.0.3"
  }
}
@ -1,14 +0,0 @@

# Install & setup


# API Reference
{{#class name="KnexMigrator"}}
{{>body~}}
{{>member-index~}}
{{>separator~}}
{{>members~}}
{{/class}}
@ -1,41 +0,0 @@

const dayjs = require('dayjs');

exports.getUniqFilenamePrefix = function () {
  return dayjs().format('YYMMDD_HHmmss');
};

exports.getFilenameForUp = function (prefix) {
  return prefix + '.up.sql';
};

exports.getFilenameForDown = function (prefix) {
  return prefix + '.down.sql';
};

/**
 * @copyright Copyright (c) 2021, Xgene Cloud Ltd
 *
 * @author Naveen MR <oof1lab@gmail.com>
 * @author Pranav C Balan <pranavxc@gmail.com>
 *
 * @license GNU AGPL version 3 or any later version
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Affero General Public License as
 * published by the Free Software Foundation, either version 3 of the
 * License, or (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Affero General Public License for more details.
 *
 * You should have received a copy of the GNU Affero General Public License
 * along with this program. If not, see <http://www.gnu.org/licenses/>.
 *
 */
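The filename helpers in the deleted file above pair one timestamp prefix with `.up.sql` and `.down.sql` suffixes. A dependency-free sketch of the same behavior (using `Date` instead of dayjs, so the `YYMMDD_HHmmss` formatting is reimplemented here for illustration):

```javascript
// Sketch: how the migrator derives paired up/down SQL filenames
// from a single timestamp prefix. Uses the built-in Date instead of
// dayjs so the example has no dependencies; the format mirrors
// dayjs's 'YYMMDD_HHmmss'.
function getUniqFilenamePrefix(d = new Date()) {
  const p = n => String(n).padStart(2, '0');
  return (
    String(d.getFullYear()).slice(-2) + p(d.getMonth() + 1) + p(d.getDate()) +
    '_' + p(d.getHours()) + p(d.getMinutes()) + p(d.getSeconds())
  );
}

const getFilenameForUp = prefix => prefix + '.up.sql';
const getFilenameForDown = prefix => prefix + '.down.sql';

// Fixed date so the output is deterministic.
const prefix = getUniqFilenamePrefix(new Date(2021, 0, 15, 9, 30, 5));
console.log(getFilenameForUp(prefix));   // 210115_093005.up.sql
console.log(getFilenameForDown(prefix)); // 210115_093005.down.sql
```

Because the prefix is shared, sorting the migration directory lexicographically yields the files in creation order, with each up/down pair adjacent.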
@ -1,11 +0,0 @@

node_modules
*.iml
.idea
*.log*
.nuxt
.vscode
.DS_Store
coverage
dist
sw.*
.env
@ -1,21 +0,0 @@

MIT License

Copyright (c) 2021 nocodb

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@ -1,7 +0,0 @@

main.container {
  max-width: 100% !important;
}

footer a[href="https://nuxtjs.org"] {
  display: none;
}
@ -1,33 +0,0 @@

<template>
  <div>
    <iframe
      type="text/html"
      width="100%"
      style="height:100%"
      :src="`https://www.youtube.com/embed/${id}`"
      frameborder="0"
      allowfullscreen
    ></iframe>
  </div>
</template>

<script>
export default {
  name: "youtube",
  props: {
    id: String
  }
}
</script>

<style scoped>
div {
  background-color: red;
  width: 100%;
  padding-top: min(500px, 56%);
  position: relative;
}

iframe {
  position: absolute;
  top: 0;
  left: 0;
  bottom: 0;
  right: 0;
}
</style>
@ -1,17 +0,0 @@

---
title: 'NocoDB'
description: 'NocoDB'
position: 0
category: 'NocoDB'
fullscreen: true
menuTitle: 'NocoDB'
---

# 🎯 Why are we building this?
Most internet businesses equip themselves with either a spreadsheet or a database to meet their business needs. Spreadsheets are used collaboratively by over a billion people every single day. However, we work nowhere near those speeds on databases, which are far more powerful computing tools. Attempts to solve this with SaaS offerings have meant terrible access controls, vendor lock-in, data lock-in, abrupt price changes and, most importantly, a glass ceiling on what is possible in the future.

# ❤ Our Mission:
Our mission is to provide the most powerful no-code interface for databases, open source, to every single internet business in the world. This would not only democratise access to a powerful computing tool but also bring forth a billion+ people with radical tinkering-and-building abilities on the internet.
@ -1,10 +0,0 @@

{
  "title": "NocoDB",
  "url": "https://blog.nocodb.com",
  "logo": {
    "light": "/favicon-128.png",
    "dark": "/favicon-128.png"
  },
  "github": "nocodb/nocodb/packages/noco-blog",
  "twitter": "@nocodb"
}
@ -1,10 +0,0 @@

import theme from '@nuxt/content-theme-docs'

export default theme({
  docs: {
    primaryColor: '#3282ff'
  },
  css: [
    "./assets/main.css"
  ]
})
File diff suppressed because it is too large
@ -1,15 +0,0 @@

{
  "name": "noco-blog",
  "version": "1.0.0",
  "license": "MIT",
  "scripts": {
    "dev": "nuxt",
    "build": "nuxt build",
    "start": "nuxt start",
    "generate": "nuxt generate"
  },
  "dependencies": {
    "@nuxt/content-theme-docs": "0.7.2",
    "nuxt": "^2.15.2"
  }
}
Before Width: | Height: | Size: 6.3 KiB |
Before Width: | Height: | Size: 1.7 KiB |
Before Width: | Height: | Size: 3.2 KiB |
Before Width: | Height: | Size: 982 B |