
docs: engineering section, fix broken links

Signed-off-by: Raju Udava <86527202+dstala@users.noreply.github.com>
pull/6467/head
Raju Udava, 1 year ago
parent commit d979df00b2
1. packages/noco-docs/docs/010.index.md (6 lines changed)
2. packages/noco-docs/docs/020.getting-started/050.self-hosted/020.environment-variables.md (106 lines changed)
3. packages/noco-docs/docs/020.getting-started/050.self-hosted/030.upgrading.md (2 lines changed)
4. packages/noco-docs/docs/130.automation/020.webhook/010.webhook-overview.md (2 lines changed)
5. packages/noco-docs/docs/150.engineering/010.architecture.md (16 lines changed)
6. packages/noco-docs/docs/150.engineering/020.repository-structure.md (23 lines changed)
7. packages/noco-docs/docs/150.engineering/030.development-setup.md (50 lines changed)
8. packages/noco-docs/docs/150.engineering/040.unit-testing.md (189 lines changed)
9. packages/noco-docs/docs/150.engineering/050.playwright.md (218 lines changed)
10. packages/noco-docs/docs/150.engineering/060.builds-and-releases.md (164 lines changed)
11. packages/noco-docs/docs/150.engineering/070.translation.md (78 lines changed)
12. packages/noco-docs/docs/150.engineering/_category_.json (5 lines changed)
13. packages/noco-docs/versioned_docs/version-0.109.7/010.index.md (6 lines changed)
14. packages/noco-docs/versioned_docs/version-0.109.7/020.getting-started/020.environment-variables.md (102 lines changed)
15. packages/noco-docs/versioned_docs/version-0.109.7/020.getting-started/030.upgrading.md (2 lines changed)
16. packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/020.table-operations.md (6 lines changed)
17. packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/040.column-types.md (6 lines changed)
18. packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/240.meta-management.md (2 lines changed)
19. packages/noco-docs/versioned_docs/version-0.109.7/040.developer-resources/020.rest-apis.md (2 lines changed)
20. packages/noco-docs/versioned_docs/version-0.109.7/040.developer-resources/030.sdk.md (4 lines changed)

packages/noco-docs/docs/010.index.md (6 lines changed)

@@ -37,21 +37,21 @@ We provide different integrations in three main categories. See <a href="/setup-
### Programmatic Access
We provide the following ways to let users to invoke actions in a programmatic way. You can use a token (either JWT or Social Auth) to sign your requests for authorization to NocoDB.
We provide the following ways to let users invoke actions in a programmatic way. You can use a token (either JWT or Social Auth) to sign your requests for authorization to NocoDB.
- ⚡ &nbsp;REST APIs
- ⚡ &nbsp;NocoDB SDK
### Sync Schema
We allow you to sync schema changes if you have made changes outside NocoDB GUI. However, it has to be noted then you will have to bring your own schema migrations for moving from environment to others. See <a href="/setup-and-usages/sync-schema" target="_blank">Sync Schema</a> for details.
We allow you to sync schema changes if you have made changes outside NocoDB GUI. However, it has to be noted then you will have to bring your own schema migrations for moving from environment to others. See <a href="/data-source/data-source-overview#sync-metadata" target="_blank">Sync Schema</a> for details.
### Audit
We are keeping all the user operation logs under one place. See <a href="/setup-and-usages/audit" target="_blank">Audit</a> for details.
## Why are we building this?
Most internet businesses equip themselves with either spreadsheet or a database to solve their business needs. Spreadsheets are used by a Billion+ humans collaboratively every single day. However, we are way off working at similar speeds on databases which are way more powerful tools when it comes to computing. Attempts to solve this with SaaS offerings has meant horrible access controls, vendor lockin, data lockin, abrupt price changes & most importantly a glass ceiling on what's possible in future.
Most internet businesses equip themselves with either spreadsheet or a database to solve their business needs. Spreadsheets are used by a Billion+ humans collaboratively every single day. However, we are way off working at similar speeds on databases which are way more powerful tools when it comes to computing. Attempts to solve this with SaaS offerings has meant horrible access controls, vendor lockin, data lockin, abrupt price changes & most importantly a glass ceiling on what's possible in the future.
## Our Mission
Our mission is to provide the most powerful no-code interface for databases which is open source to every single internet business in the world. This would not only democratise access to a powerful computing tool but also bring forth a billion+ people who will have radical tinkering-and-building abilities on internet.

packages/noco-docs/docs/020.getting-started/050.self-hosted/020.environment-variables.md (106 lines changed)

@@ -5,61 +5,61 @@ hide_table_of_contents: true
keywords : ['NocoDB environment variables', 'NocoDB env variables', 'NocoDB envs', 'NocoDB env']
---
For production usecases, it is **recommended** to configure
For production use-cases, it is **recommended** to configure
- `NC_DB`,
- `NC_AUTH_JWT_SECRET`,
- `NC_PUBLIC_URL`,
- `NC_REDIS_URL`
| Variable | Comments | If absent |
|---|---|---|
| NC_DB | See our example database URLs [here](https://github.com/nocodb/nocodb#docker). | A local SQLite will be created in root folder if `NC_DB` is not provided |
| NC_DB_JSON | Can be used instead of `NC_DB` and value should be valid knex connection JSON | |
| NC_DB_JSON_FILE | Can be used instead of `NC_DB` and value should be a valid path to knex connection JSON | |
| DATABASE_URL | JDBC URL Format. Can be used instead of NC_DB. | |
| DATABASE_URL_FILE | Can be used instead of DATABASE_URL: path to file containing JDBC URL Format. | |
| NC_AUTH_JWT_SECRET | JWT secret used for auth and storing other secrets | A random secret will be generated |
| PORT | For setting app running port | `8080` |
| DB_QUERY_LIMIT_DEFAULT | Default pagination limit | 25 |
| DB_QUERY_LIMIT_MAX | Maximum allowed pagination limit | 1000 |
| DB_QUERY_LIMIT_MIN | Minimum allowed pagination limit | 1 |
| NC_TOOL_DIR | App directory to keep metadata and app related files | Defaults to current working directory. In docker maps to `/usr/app/data/` for mounting volume. |
| NC_PUBLIC_URL | Used for sending Email invitations | Best guess from http request params |
| NC_JWT_EXPIRES_IN | JWT token expiry time | `10h` |
| NC_CONNECT_TO_EXTERNAL_DB_DISABLED | Disable Project creation with external database | |
| NC_INVITE_ONLY_SIGNUP | Removed since version 0.99.0 and now it's recommended to use [super admin settings menu](/setup-and-usages/account-settings#enable--disable-signup). Allow users to signup only via invite url, value should be any non-empty string. | |
| NUXT_PUBLIC_NC_BACKEND_URL | Custom Backend URL | ``http://localhost:8080`` will be used |
| NC_REQUEST_BODY_SIZE | Request body size [limit](https://expressjs.com/en/resources/middleware/body-parser.html#limit) | `1048576` |
| NC_EXPORT_MAX_TIMEOUT | After NC_EXPORT_MAX_TIMEOUT csv gets downloaded in batches | Default value 5000(in millisecond) will be used |
| NC_DISABLE_TELE | Disable telemetry | |
| NC_DASHBOARD_URL | Custom dashboard url path | `/dashboard` |
| NC_GOOGLE_CLIENT_ID | Google client id to enable google authentication | |
| NC_GOOGLE_CLIENT_SECRET | Google client secret to enable google authentication | |
| NC_MIGRATIONS_DISABLED | Disable NocoDB migration | |
| NC_MIN | If set to any non-empty string the default splash screen(initial welcome animation) and matrix screensaver will disable | |
| NC_SENTRY_DSN | For Sentry monitoring | |
| NC_REDIS_URL | Custom Redis URL. Example: `redis://:authpassword@127.0.0.1:6380/4` | Meta data will be stored in memory |
| NC_DISABLE_ERR_REPORT | Disable error reporting | |
| NC_DISABLE_CACHE | To be used only while debugging. On setting this to `true` - meta data be fetched from db instead of redis/cache. | `false` |
| AWS_ACCESS_KEY_ID | For Litestream - S3 access key id | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_SECRET_ACCESS_KEY | For Litestream - S3 secret access key | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET | For Litestream - S3 bucket | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET_PATH | For Litestream - S3 bucket path (like folder within S3 bucket) | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| NC_SMTP_FROM | For SMTP plugin - Email sender address | |
| NC_SMTP_HOST | For SMTP plugin - SMTP host value | |
| NC_SMTP_PORT | For SMTP plugin - SMTP port value | |
| NC_SMTP_USERNAME | For SMTP plugin (Optional) - SMTP username value for authentication | |
| NC_SMTP_PASSWORD | For SMTP plugin (Optional) - SMTP password value for authentication | |
| NC_SMTP_SECURE | For SMTP plugin (Optional) - To enable secure set value as `true` any other value treated as false | |
| NC_SMTP_IGNORE_TLS | For SMTP plugin (Optional) - To ignore tls set value as `true` any other value treated as false. For more info visit https://nodemailer.com/smtp/ | |
| NC_S3_BUCKET_NAME | For S3 storage plugin - AWS S3 bucket name | |
| NC_S3_REGION | For S3 storage plugin - AWS S3 region | |
| NC_S3_ACCESS_KEY | For S3 storage plugin - AWS access key credential for accessing resource | |
| NC_S3_ACCESS_SECRET | For S3 storage plugin - AWS access secret credential for accessing resource | |
| NC_ADMIN_EMAIL | For updating/creating super admin with provided email and password | |
| NC_ATTACHMENT_FIELD_SIZE | For setting the attachment field size(in Bytes) | Defaults to 20MB |
| NC_ADMIN_PASSWORD | For updating/creating super admin with provided email and password. Your password should have at least 8 letters with one uppercase, one number and one special letter(Allowed special chars $&+,:;=?@#\|'.^*()%!_-" ) | |
| NODE_OPTIONS | For passing Node.js [options](https://nodejs.org/api/cli.html#node_optionsoptions) to instance | |
| NC_MINIMAL_DBS | Create a new SQLite file for each project. All the db files are stored in `nc_minimal_dbs` folder in current working directory. (This option restricts project creation on external sources) | |
| NC_DISABLE_AUDIT | Disable Audit Log | `false` |
| NC_AUTOMATION_LOG_LEVEL | Possible Values: `OFF`, `ERROR`, `ALL`. See [Webhooks](/developer-resources/webhooks#call-log) for details. | `OFF` |
| Variable | Comments | If absent |
|------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|
| NC_DB | See our example database URLs [here](https://github.com/nocodb/nocodb#docker). | A local SQLite will be created in root folder if `NC_DB` is not provided |
| NC_DB_JSON | Can be used instead of `NC_DB` and value should be valid knex connection JSON | |
| NC_DB_JSON_FILE | Can be used instead of `NC_DB` and value should be a valid path to knex connection JSON | |
| DATABASE_URL | JDBC URL Format. Can be used instead of NC_DB. | |
| DATABASE_URL_FILE | Can be used instead of DATABASE_URL: path to file containing JDBC URL Format. | |
| NC_AUTH_JWT_SECRET | JWT secret used for auth and storing other secrets | A random secret will be generated |
| PORT | For setting app running port | `8080` |
| DB_QUERY_LIMIT_DEFAULT | Default pagination limit | 25 |
| DB_QUERY_LIMIT_MAX | Maximum allowed pagination limit | 1000 |
| DB_QUERY_LIMIT_MIN | Minimum allowed pagination limit | 1 |
| NC_TOOL_DIR | App directory to keep metadata and app related files | Defaults to current working directory. In docker maps to `/usr/app/data/` for mounting volume. |
| NC_PUBLIC_URL | Used for sending Email invitations | Best guess from http request params |
| NC_JWT_EXPIRES_IN | JWT token expiry time | `10h` |
| NC_CONNECT_TO_EXTERNAL_DB_DISABLED | Disable Project creation with external database | |
| NC_INVITE_ONLY_SIGNUP | Removed since version 0.99.0 and now it's recommended to use [super admin settings menu](/setup-and-usages/account-settings#enable--disable-signup). Allow users to signup only via invite url, value should be any non-empty string. | |
| NUXT_PUBLIC_NC_BACKEND_URL | Custom Backend URL | ``http://localhost:8080`` will be used |
| NC_REQUEST_BODY_SIZE | Request body size [limit](https://expressjs.com/en/resources/middleware/body-parser.html#limit) | `1048576` |
| NC_EXPORT_MAX_TIMEOUT | After NC_EXPORT_MAX_TIMEOUT csv gets downloaded in batches | Default value 5000(in millisecond) will be used |
| NC_DISABLE_TELE | Disable telemetry | |
| NC_DASHBOARD_URL | Custom dashboard url path | `/dashboard` |
| NC_GOOGLE_CLIENT_ID | Google client id to enable google authentication | |
| NC_GOOGLE_CLIENT_SECRET | Google client secret to enable google authentication | |
| NC_MIGRATIONS_DISABLED | Disable NocoDB migration | |
| NC_MIN | If set to any non-empty string the default splash screen(initial welcome animation) and matrix screensaver will disable | |
| NC_SENTRY_DSN | For Sentry monitoring | |
| NC_REDIS_URL | Custom Redis URL. Example: `redis://:authpassword@127.0.0.1:6380/4` | Meta data will be stored in memory |
| NC_DISABLE_ERR_REPORT | Disable error reporting | |
| NC_DISABLE_CACHE | To be used only while debugging. On setting this to `true` - meta data be fetched from db instead of redis/cache. | `false` |
| AWS_ACCESS_KEY_ID | For Litestream - S3 access key id | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_SECRET_ACCESS_KEY | For Litestream - S3 secret access key | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET | For Litestream - S3 bucket | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET_PATH | For Litestream - S3 bucket path (like folder within S3 bucket) | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| NC_SMTP_FROM | For SMTP plugin - Email sender address | |
| NC_SMTP_HOST | For SMTP plugin - SMTP host value | |
| NC_SMTP_PORT | For SMTP plugin - SMTP port value | |
| NC_SMTP_USERNAME | For SMTP plugin (Optional) - SMTP username value for authentication | |
| NC_SMTP_PASSWORD | For SMTP plugin (Optional) - SMTP password value for authentication | |
| NC_SMTP_SECURE | For SMTP plugin (Optional) - To enable secure set value as `true` any other value treated as false | |
| NC_SMTP_IGNORE_TLS | For SMTP plugin (Optional) - To ignore tls set value as `true` any other value treated as false. For more info visit https://nodemailer.com/smtp/ | |
| NC_S3_BUCKET_NAME | For S3 storage plugin - AWS S3 bucket name | |
| NC_S3_REGION | For S3 storage plugin - AWS S3 region | |
| NC_S3_ACCESS_KEY | For S3 storage plugin - AWS access key credential for accessing resource | |
| NC_S3_ACCESS_SECRET | For S3 storage plugin - AWS access secret credential for accessing resource | |
| NC_ADMIN_EMAIL | For updating/creating super admin with provided email and password | |
| NC_ATTACHMENT_FIELD_SIZE | For setting the attachment field size(in Bytes) | Defaults to 20MB |
| NC_ADMIN_PASSWORD | For updating/creating super admin with provided email and password. Your password should have at least 8 letters with one uppercase, one number and one special letter(Allowed special chars $&+,:;=?@#\|'.^*()%!_-" ) | |
| NODE_OPTIONS | For passing Node.js [options](https://nodejs.org/api/cli.html#node_optionsoptions) to instance | |
| NC_MINIMAL_DBS | Create a new SQLite file for each project. All the db files are stored in `nc_minimal_dbs` folder in current working directory. (This option restricts project creation on external sources) | |
| NC_DISABLE_AUDIT | Disable Audit Log | `false` |
| NC_AUTOMATION_LOG_LEVEL | Possible Values: `OFF`, `ERROR`, `ALL`. See [Webhooks](/automation/webhook/create-webhook#call-log) for details. | `OFF` |
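For reference, a minimal sketch of how these recommended variables might be passed to the Docker image; all values below are placeholders:
```bash
docker run -d --name nocodb \
  -v "$(pwd)"/nocodb:/usr/app/data/ \
  -p 8080:8080 \
  -e NC_DB="pg://host.docker.internal:5432?u=postgres&p=password&d=nocodb_meta" \
  -e NC_AUTH_JWT_SECRET="569a1821-0a93-45e8-87ab-eb857f20a010" \
  -e NC_PUBLIC_URL="https://nocodb.example.com" \
  -e NC_REDIS_URL="redis://:authpassword@127.0.0.1:6380/4" \
  nocodb/nocodb:latest
```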

packages/noco-docs/docs/020.getting-started/050.self-hosted/030.upgrading.md (2 lines changed)

@@ -4,7 +4,7 @@ description: 'Upgrading NocoDB : Docker, Node and Homebrew!'
keywords: ['NocoDB upgrade', 'upgrade NocoDB', 'upgrade nocodb']
---
By default, if `NC_DB` is not specified upon [installation](/getting-started/installation), then SQLite will be used to store metadata. We suggest users to separate the metadata and user data in different databases as pictured in our [architecture](/engineering/architecture).
By default, if `NC_DB` is not specified upon [installation](/getting-started/self-hosted/installation), then SQLite will be used to store metadata. We suggest users to separate the metadata and user data in different databases as pictured in our [architecture](/engineering/architecture).
## Docker

packages/noco-docs/docs/130.automation/020.webhook/010.webhook-overview.md (2 lines changed)

@@ -8,5 +8,5 @@ Note that, Webhooks currently are specific for associated table.
- [Create Webhook](create-webhook)
- [Modify Webhook](actions-on-webhook)
- [Delete Webhook](delete-webhook)
- [Delete Webhook](actions-on-webhook#delete-webhook)

packages/noco-docs/docs/150.engineering/010.architecture.md (16 lines changed)

@@ -0,0 +1,16 @@
---
title: "Architecture Overview"
description: "Simple overview of NocoDB architecture"
hide_table_of_contents: true
---
By default, if `NC_DB` is not specified, SQLite will be used to store your metadata. We suggest that users keep the metadata and user data in separate databases.
![image](/img/architecture.png)
| Project Type | Metadata stored in | Data stored in |
|---------|-----------|--------|
| Create new project | NC_DB | NC_DB |
| Create new project with External Database | NC_DB | External Database |
| Create new project from Excel | NC_DB | NC_DB |

packages/noco-docs/docs/150.engineering/020.repository-structure.md (23 lines changed)

@@ -0,0 +1,23 @@
---
title: "Repository structure"
description: "Repository Structure"
hide_table_of_contents: true
---
We use ``Lerna`` to manage our multi-package repository. We have the following [packages](https://github.com/nocodb/nocodb/tree/master/packages).
- ``packages/nc-cli`` : A CLI to create NocoDB app.
- ``packages/nocodb-sdk``: API client sdk of nocodb.
- ``packages/nc-gui``: NocoDB Frontend.
- ``packages/nc-lib-gui``: The build version of ``nc-gui`` which will be used in ``packages/nocodb``.
- ``packages/nc-plugin``: Plugin template.
- ``packages/noco-blog``: NocoDB Blog which will be auto-released to [nocodb/noco-blog](https://github.com/nocodb/noco-blog).
- ``packages/noco-docs``: NocoDB Documentation which will be auto-released to [nocodb/noco-docs](https://github.com/nocodb/noco-docs).
- ``packages/nocodb``: NocoDB Backend, hosted in [NPM](https://www.npmjs.com/package/nocodb).
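Since these are managed as workspace packages, a single package can typically be targeted with pnpm's `--filter` flag; the package name below is an assumption based on the folder name:
```bash
# run a script in a single package from the repository root
pnpm --filter=nocodb-sdk run build
```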

packages/noco-docs/docs/150.engineering/030.development-setup.md (50 lines changed)

@@ -0,0 +1,50 @@
---
title: "Development Setup"
description: "How to set-up your development environment"
---
## Clone the repo
```bash
git clone https://github.com/nocodb/nocodb
# change directory to the project root
cd nocodb
```
## Install dependencies
```bash
# run from the project root
pnpm i
```
## Start Frontend
```bash
# run from the project root
pnpm start:frontend
# runs on port 3000
```
## Start Backend
```bash
# run from the project root
pnpm start:backend
# runs on port 8080
```
Any changes made to frontend and backend will be automatically reflected in the browser.
## Enabling CI-CD for Draft PR
CI-CD will be triggered when a PR is moved from draft state to the `Ready for review` state & when changes are pushed to `develop`. To verify CI-CD before requesting a review, add the `trigger-CI` label to the draft PR.
## Accessing CI-CD Failure Screenshots
For Playwright tests, screenshots are captured during the test runs. These provide vital clues for debugging issues observed in CI-CD. To access the screenshots, open the link associated with the CI-CD run & click on `Artifacts`.
![Screenshot 2022-09-29 at 12 43 37 PM](https://user-images.githubusercontent.com/86527202/192965070-dc04b952-70fb-4197-b4bd-ca7eda066e60.png)

packages/noco-docs/docs/150.engineering/040.unit-testing.md (189 lines changed)

@@ -0,0 +1,189 @@
---
title: "Writing Unit Tests"
description: "Overview to Unit Testing"
---
## Unit Tests
- All individual unit tests are independent of each other. We don't use any shared state between tests.
- The test environment includes the `sakila` sample database, and any change made to it by a test is reverted before other tests run.
- While running unit tests, the suite tries to connect to a MySQL server on `localhost:3306` with username `root` and password `password` (both configurable); if none is found, it falls back to `sqlite`, so no SQL server is strictly required to run the tests.
### Pre-requisites
- MySQL is preferred; however, tests can fall back to SQLite too
### Setup
```bash
pnpm --filter=nocodb install
# add a .env file
cp tests/unit/.env.sample tests/unit/.env
# open .env file
open tests/unit/.env
```
Configure the following variables in `tests/unit/.env`:
> DB_HOST : database host
> DB_PORT : database port
> DB_USER : database username
> DB_PASSWORD : database password
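For example, a `tests/unit/.env` that matches the MySQL defaults mentioned above might look like this (values are illustrative):
```bash
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=password
```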
### Run Tests
``` bash
pnpm run test:unit
```
### Folder Structure
The root folder for unit tests is `packages/nocodb/tests/unit`
- `rest` folder contains all the test suites for rest apis.
- `model` folder contains all the test suites for models.
- `factory` folder contains all the helper functions to create test data.
- `init` folder contains helper functions to configure test environment.
- `index.test.ts` is the root test suite file which imports all the test suites.
- `TestDbMngr.ts` is a helper class to manage test databases (i.e. creating, dropping, etc.).
### Factory Pattern
- Use factories to create/update/delete data. No data should be created/updated/deleted directly in a test.
- While writing a factory, make sure it can be used with as few parameters as possible and uses default values for the rest (see the sketch after this list).
- Use named parameters for factories.
```ts
createUser({ email, password})
```
- Use one file per factory.
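A minimal sketch of such a factory, assuming hypothetical names and defaults (the real factories in `tests/unit/factory` persist data via the REST API):
```typescript
// Hypothetical factory sketch: named parameters with sensible defaults.
interface CreateUserArgs {
  email?: string;
  password?: string;
}

function createUser({ email = 'user@nocodb.com', password = 'Password123.' }: CreateUserArgs = {}) {
  // Return a plain object here; a real factory would call the signup API and return its response.
  return { email, password };
}

// Usage: a test only overrides what it cares about.
const user = createUser({ email: 'table-test@nocodb.com' });
```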
### Walk through of writing a Unit Test
We will create a `Table` test suite as an example.
#### Configure test
We will configure `beforeEach`, which is called before each test is executed. We will use the `init` function from `nocodb/packages/nocodb/tests/unit/init/index.ts`, a helper that configures the test environment (i.e. resetting state, etc.).
`init` does the following things:
- It initializes a `Noco` instance (reused in all tests).
- Restores the `meta` and `sakila` databases to their initial state.
- Creates the root user.
- Returns `context`, which has the `auth token` for the created user, the node server instance (`app`), and `dbConfig`.
We will use the `createProject` and `createTable` factories to create a project and a table.
```typescript
let context;
beforeEach(async function () {
context = await init();
project = await createProject(context);
table = await createTable(context, project);
});
```
#### Test case
We will use the `it` function to create a test case, `supertest` to make a request to the server, and `expect` (from `chai`) to assert the response.
```typescript
it('Get table list', async function () {
const response = await request(context.app)
.get(`/api/v1/db/meta/projects/${project.id}/tables`)
.set('xc-auth', context.token)
.send({})
.expect(200);
expect(response.body.list).to.be.an('array').not.empty;
});
```
:::info
We can also run an individual test by adding `.only` to the `describe` or `it` function and then running the test command.
:::
```typescript
it.only('Get table list', async () => {
```
#### Integrating the New Test Suite
We create a new file `table.test.ts` in the `packages/nocodb/tests/unit/rest/tests` directory.
```typescript
import 'mocha';
import request from 'supertest';
import init from '../../init';
import { createTable, getAllTables } from '../../factory/table';
import { createProject } from '../../factory/project';
import { defaultColumns } from '../../factory/column';
import Model from '../../../../src/lib/models/Model';
import { expect } from 'chai';
function tableTest() {
let context;
let project;
let table;
beforeEach(async function () {
context = await init();
project = await createProject(context);
table = await createTable(context, project);
});
it('Get table list', async function () {
const response = await request(context.app)
.get(`/api/v1/db/meta/projects/${project.id}/tables`)
.set('xc-auth', context.token)
.send({})
.expect(200);
expect(response.body.list).to.be.an('array').not.empty;
});
}
export default function () {
describe('Table', tableTest);
}
```
We can then import the `Table` test suite into the `Rest` test suite in the `packages/nocodb/tests/unit/rest/index.test.ts` file (the `Rest` test suite is imported in the root test suite file, `packages/nocodb/tests/unit/index.test.ts`).
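A minimal sketch of how that wiring might look (the real `index.test.ts` registers several more suites):
```typescript
// packages/nocodb/tests/unit/rest/index.test.ts (sketch)
import 'mocha';
import tableTests from './tests/table.test';

function restTests() {
  tableTests();
}

export default function () {
  describe('Rest', restTests);
}
```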
### Seeding Sample DB (Sakila)
```typescript
function tableTest() {
let context;
let sakilaProject: Project;
let customerTable: Model;
beforeEach(async function () {
context = await init();
/******* Start : Seeding sample database **********/
sakilaProject = await createSakilaProject(context);
/******* End : Seeding sample database **********/
customerTable = await getTable({project: sakilaProject, name: 'customer'})
});
it('Get table data list', async function () {
const response = await request(context.app)
.get(`/api/v1/db/data/noco/${sakilaProject.id}/${customerTable.id}`)
.set('xc-auth', context.token)
.send({})
.expect(200);
expect(response.body.list[0]['FirstName']).to.equal('MARY');
});
}
```

packages/noco-docs/docs/150.engineering/050.playwright.md (218 lines changed)

@@ -0,0 +1,218 @@
---
title: "Playwright E2E Testing"
description: "Overview to playwright based e2e tests"
---
## How to run tests
All the tests reside in `tests/playwright` folder.
Make sure to install the dependencies (in the playwright folder):
```bash
pnpm --filter=playwright install
pnpm exec playwright install --with-deps chromium
```
### Run Test Server
Start the backend test server (in `packages/nocodb` folder):
```bash
pnpm run watch:run:playwright
```
Start the frontend test server (in `packages/nc-gui` folder):
```bash
NUXT_PAGE_TRANSITION_DISABLE=true pnpm run dev
```
### Running all tests
To select the db type, rename `.env.example` to `.env` and set `E2E_DEV_DB_TYPE` to `sqlite` (default), `mysql` or `pg`.
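For example, assuming the commands are run from the `tests/playwright` folder:
```bash
# copy (or rename) the sample env file
cp .env.example .env
# then edit .env and set the database type, e.g.
# E2E_DEV_DB_TYPE=pg
```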
Headless mode (without opening the browser):
```bash
pnpm run test
```
With the browser:
```bash
pnpm run test:debug
```
For setting up MySQL (sakila):
```bash
docker-compose -f ./tests/playwright/scripts/docker-compose-mysql-playwright.yml up -d
```
For setting up Postgres (sakila):
```bash
docker-compose -f ./tests/playwright/scripts/docker-compose-playwright-pg.yml up -d
```
### Running individual tests
Add `.only` to the test you want to run:
```js
test.only('should login', async ({ page }) => {
// ...
})
```
```bash
pnpm run test
```
## Concepts
### Independent tests
- All tests are independent of each other.
- Each test starts with a fresh project and a fresh sakila database (there is also an option to not use the sakila db).
- Each test creates a new user (with email `user@nocodb.com`) and logs in to the dashboard with that user.
Caveats:
- Some state is shared, i.e. users, plugins, etc. So be cautious while writing tests that touch it. A fix for this is in the works.
- In tests, we prefix the email and project with the test id, and these are deleted after the test is done.
### What to test
- UI verification. This includes verifying the state of a UI element, i.e. whether the element is visible, whether it has a particular text, etc.
- A test should verify the complete user flow. A test has a default timeout of 60 seconds. If a test takes more than 60 seconds, that is a sign it should be broken down into smaller tests.
- A test should also verify all the side effects the feature will have (e.g. on adding a new column type, column deletion should be verified as well), and also error cases.
- Test names should be descriptive. It should be easy to understand what a test does just by reading its name.
### Playwright
- Playwright is a Node.js library for automating Chromium, Firefox and WebKit.
- For each test, a new browser context is created. This means that each test runs in a new incognito window.
- For assertions, always use `expect` from the `@playwright/test` library. It provides a lot of useful assertions, with retry logic built in.
## Page Objects
- Page objects are used to abstract over the components/page. This makes the tests more readable and maintainable.
- All page objects are in `tests/playwright/pages` folder.
- All the test-related code should be in page objects.
- Methods should be as thin as possible, and it's better to have multiple small methods than one big method, which improves reusability.
The methods of a page object can be classified into 2 categories:
- Actions: Perform UI actions like click, type, select, etc. They are also responsible for waiting for the element to be ready and for the action to be performed. This includes waiting for API calls to complete.
- Assertions: Assert the state of a UI element, i.e. whether it is visible, whether it has a particular text, etc. Use `expect` from `@playwright/test` where possible; otherwise use `expect.poll` to wait for the assertion to pass.
## Writing a test
Let's write a test for testing filter functionality.
For simplicity, assume we have `DashboardPage` implemented, which has all the methods related to the dashboard page and its child components like Grid, etc.
### Create a test suite
Create a new file `filter.spec.ts` in the `tests/playwright/tests` folder and use the `setup` method to create a new project and user.
```js
import { test, expect } from '@playwright/test';
import setup, { NcContext } from '../setup';
test.describe('Filter', () => {
let context: NcContext;
test.beforeEach(async ({ page }) => {
context = await setup({ page });
})
test('should filter', async ({ page }) => {
// ...
});
});
```
### Create a page object
Since the filter is UI-wise scoped to a `Toolbar`, we will add the filter page object to the `ToolbarPage` page object.
```js
export class ToolbarPage extends BasePage {
readonly parent: GridPage | GalleryPage | FormPage | KanbanPage;
readonly filter: ToolbarFilterPage;
constructor(parent: GridPage | GalleryPage | FormPage | KanbanPage) {
super(parent.rootPage);
this.parent = parent;
this.filter = new ToolbarFilterPage(this);
}
}
```
We will create a `ToolbarFilterPage` page object, which will have all the methods related to filters.
```js
export class ToolbarFilterPage extends BasePage {
readonly toolbar: ToolbarPage;
constructor(toolbar: ToolbarPage) {
super(toolbar.rootPage);
this.toolbar = toolbar;
}
}
```
Here `BasePage` is an abstract class used to enforce structure for all page objects. Thus all page objects *should* inherit `BasePage`. It:
- Provides helper methods like `waitForResponse` and `getClipboardText` (these can be accessed on any page object, e.g. `this.waitForResponse`).
- Provides structure for page objects; enforces all page objects to have a `rootPage` property, which is the page object created in the test setup.
- Enforces all pages to have a `get` method which returns the locator of the main container of that page, so we can have focused DOM selection, i.e.
```js
// This will only select the button inside the container of the concerned page
await this.get().locator('button').count();
```
### Writing an action method
This is a method which will reset/clear all the filters. Since it is an action method, it also waits for the filter `DELETE` API to return. Ignoring this API call would cause flakiness in the test down the line.
```js
async resetFilter() {
await this.waitForResponse({
uiAction: async () => await this.get().locator('.nc-filter-item-remove-btn').click(),
httpMethodsToMatch: ['DELETE'],
requestUrlPathToMatch: '/api/v1/db/meta/filters/',
});
}
```
### Writing an assertion/verification method
Here we use `expect` from `@playwright/test` library, which has retry logic built in.
```js
import { expect } from '@playwright/test';
async verifyFilter({ title }: { title: string }) {
await expect(
this.get().locator(`[data-testid="nc-fields-menu-${title}"]`).locator('input[type="checkbox"]')
).toBeChecked();
}
```
## Tips to avoid flakiness
- If a UI action causes an API call or a UI state change, then wait for that API call to complete or for the UI state to change.
- What to wait for can be situation-specific, but in general it is best to wait for the final state to be reached. For example, when creating a filter it may seem enough to wait for the filter API to complete, but after it returns the table rows are reloaded and the UI state changes, so it is better to also wait for the table rows to be reloaded (see the sketch below).
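As an illustration, an action method for adding a filter might wait for both the API call and the reloaded rows. This is only a sketch; the selector names and URL below are assumptions, not the actual page object code.
```js
async addFilter() {
  // wait for the create-filter API call triggered by the UI action
  await this.waitForResponse({
    uiAction: async () => await this.get().locator('.nc-filter-save-btn').click(),
    httpMethodsToMatch: ['POST'],
    requestUrlPathToMatch: '/api/v1/db/meta/views/',
  });
  // then wait for the final UI state: rows are re-rendered after the filter is applied
  await this.rootPage.locator('.nc-grid-row').first().waitFor({ state: 'visible' });
}
```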
## Accessing playwright report in the CI
- Open `Summary` tab in the CI workflow in github actions.
- Scroll down to `Artifacts` section.
- Access the reports, which are suffixed with the db type and shard number (corresponding to the CI workflow name), i.e. `playwright-report-mysql-2` is for the `playwright-mysql-2` workflow.
- Download it and run `pnpm install -D @playwright/test && npx playwright show-report ./` inside the downloaded folder.

packages/noco-docs/docs/150.engineering/060.builds-and-releases.md (164 lines changed)

@@ -0,0 +1,164 @@
---
title: "Releases & Builds"
description: "NocoDB creates Docker and Binaries for each PR"
---
## Builds of NocoDB
There are 3 kinds of Docker builds in NocoDB:
- Release builds [nocodb/nocodb](https://hub.docker.com/r/nocodb/nocodb): built during a NocoDB release.
- Daily builds [nocodb/nocodb-daily](https://hub.docker.com/r/nocodb/nocodb-daily): built every 6 hours from the develop branch.
- Timely builds [nocodb/nocodb-timely](https://hub.docker.com/r/nocodb/nocodb-timely): built for every PR and manually triggered PRs.
Below is an overview of how to make these builds and what happens behind the scenes.
## Release builds
### How to make a release build?
- Click [NocoDB release action](https://github.com/nocodb/nocodb/actions/workflows/release-nocodb.yml)
- You should see the below screen
![image](https://user-images.githubusercontent.com/35857179/167240353-a02f690f-c865-4ade-8645-64382405c9ea.png)
- Change `Use workflow from` to `Branch: master`. If you choose the wrong branch, the workflow will be terminated.
![image](https://user-images.githubusercontent.com/35857179/167240383-dda05f76-8323-4f4a-b3e7-9db886dbd68d.png)
- Then there are two cases: you can either leave Target Tag and Previous Tag blank or manually input some values.
- Target Tag means the target deployment version, while Previous Tag means the latest version as of now. Previous Tag is used for the release note only, showing the file / commit differences between the two tags.
### Tagging
The naming convention is as follows, given the actual release tag is `0.100.0`:
- `0.100.0-beta.0` (first version of pre-release)
- `0.100.0-beta.1` (includes bug fix changes on top of the previous version)
- `0.100.0-beta.2` (includes bug fix changes on top of the previous version)
- and so on ...
- `0.100.0` (actual release)
- `0.100.1` (minor bug fix release)
- `0.100.2` (minor bug fix release)
### Case 1: Leaving inputs blank
- If Previous Tag is blank, then the value will be fetched from [latest](https://github.com/nocodb/nocodb/releases/latest)
- If Target Tag is blank, then the value will be Previous Tag plus one. Example: 0.90.11 (Previous Tag) + 0.0.1 = 0.90.12 (Target Tag)
### Case 2: Manually Input
Why? Sometimes the NPM deployment may go wrong. As NPM doesn't allow us to redeploy to the same tag again, in that case we cannot just use the previous tag + 1. Therefore, we need to use the previous tag + 2 instead and configure it manually.
- After that, click `Run workflow` to start
- You can see Summary for the overall job status.
- Once `release-draft-note` and `release-executables` are finished, go to [releases](https://github.com/nocodb/nocodb/releases), edit the draft note and save it as a draft for the time being.
- Example: add a header, update content if necessary, and click `Auto-generate release notes` to include more info.
- Once `release-docker` is finished, test it locally first. You should see the `Upgrade Available` notification in the UI, since the release note hasn't been published yet (the version is retrieved from there).
- Once everything is finished, publish the release note and the deployment is considered DONE.
### How does the release action work?
#### validate-branch
Checks if `github.ref` is master. Other branches will not be accepted.
#### process-input
Enriches the target tag or previous tag if necessary.
#### pr-to-master
Automates a PR from the develop branch to the master branch so that we know the actual differences between the previous tag and the current tag. We use the master branch as the deployment base.
#### release-npm
Builds the frontend and backend and releases them to NPM. Changes made during the build, such as version bumping, are committed and pushed to a temporary branch, and an automated PR is created and merged to the master branch.
Note that once you publish with a certain tag, you cannot publish with the same tag again.
#### release-draft-note
Generates a draft release note. Some actions need to be done after this step.
#### release-docker
Builds the Docker image and publishes it to Docker Hub. It may take around 15-30 mins.
#### close-issues
Open issues marked with the labels `Status: Fixed` and `Status: Resolved` will be closed by the bot automatically, with a comment mentioning which version includes the fix.
Example:
![image](https://user-images.githubusercontent.com/35857179/167241574-f8f7061f-c689-444a-b761-0a727974c53f.png)
#### publish-docs
Publishes the documentation.
#### update-sdk-path
`nocodb-sdk` is used in the frontend and backend. However, in the develop branch, the value is `file:../nocodb-sdk` for development purposes (so that changes made to nocodb-sdk in develop are picked up by the frontend and backend). During the deployment, the value is changed to the target tag. This job updates it back.
#### sync-to-develop
Once the deployment is finished, some new changes will have been pushed to the master branch. This job syncs those changes back to develop so that the two branches don't diverge.
## Daily builds
### What are daily builds?
- NocoDB builds the develop branch every 6 hours and publishes it as nocodb/nocodb-daily.
- This is so that we can easily try what is in the develop branch.
### Docker images
- The docker images will be built and pushed to Docker Hub (See [nocodb/nocodb-daily](https://hub.docker.com/r/nocodb/nocodb-daily/tags) for the full list).
## Timely builds
### What are timely builds?
NocoDB has GitHub Actions which create Docker images and binaries for each PR! These can be found as a **comment on the last commit** of the PR.
An example is shown below.
- Go to a PR and click on the comment.
<img width="1111" alt="Screenshot 2023-01-23 at 15 46 36" src="https://user-images.githubusercontent.com/5435402/214083736-80062398-3712-430f-9865-86b110090c91.png" />
- Click on the link to copy the docker image and run it locally.
<img width="1231" alt="Screenshot 2023-01-23 at 15 46 55" src="https://user-images.githubusercontent.com/5435402/214083755-945d9485-2b9e-4739-8408-068bdf4a84b7.png" />
This is to
- reduce pull request cycle time
- allow issue reporters / reviewers to verify the fix without setting up their local machines
### Docker images
When a non-draft Pull Request is created, reopened or synchronized, a timely Docker build is triggered, but only for changes in the following paths:
- `packages/nocodb-sdk/**`
- `packages/nc-gui/**`
- `packages/nc-plugin/**`
- `packages/nocodb/**`
The Docker images will be built and pushed to Docker Hub (see [nocodb/nocodb-timely](https://hub.docker.com/r/nocodb/nocodb-timely/tags) for the full list). Once the image is ready, the GitHub bot will add a comment with the command in the pull request. The tag is `<NOCODB_CURRENT_VERSION>-pr-<PR_NUMBER>-<YYYYMMDD>-<HHMM>`.
![image](https://user-images.githubusercontent.com/35857179/175012097-240dab05-da93-4c4e-87c1-1c36fb1350bd.png)
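For example, pulling and running such an image locally looks like this (the tag below is hypothetical; copy the real one from the bot comment on your PR):
```bash
docker pull nocodb/nocodb-timely:0.111.0-pr-1234-20230123-1546
docker run -d -p 8080:8080 nocodb/nocodb-timely:0.111.0-pr-1234-20230123-1546
```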
## Executables or Binaries
Similarly, we provide timely builds of executables for non-Docker users. The source code will be built, packaged as binary files, and pushed to GitHub (see [nocodb/nocodb-timely](https://github.com/nocodb/nocodb-timely/releases) for the full list).
Currently, we only support the following targets:
- `node16-linux-arm64`
- `node16-macos-arm64`
- `node16-win-arm64`
- `node16-linux-x64`
- `node16-macos-x64`
- `node16-win-x64`
Once the executables are ready, Github bot will add a comment with the commands in the pull request.
![image](https://user-images.githubusercontent.com/35857179/175012070-f5f3e7b8-6dc5-4d1c-9f7e-654bc5039521.png)

packages/noco-docs/docs/150.engineering/070.translation.md (78 lines changed)

@@ -0,0 +1,78 @@
---
title: "i18n translation"
description: "Contribute to NocoDB's i18n translation"
---
- NocoDB supports 30+ languages & community contributions are now simplified via [Crowdin](https://crowdin.com/).
## How to add / edit translations?
### Using Github
- For English, make changes directly to [en.json](https://github.com/nocodb/nocodb/blob/develop/packages/nc-gui/lang/en.json) & commit to `develop`
- For any other language, use `crowdin` option.
### Using Crowdin
- Setup [Crowdin](https://crowdin.com) account
- Join [NocoDB](https://crowdin.com/project/nocodb) project
![Screenshot 2022-09-08 at 10 26 23 PM](https://user-images.githubusercontent.com/86527202/189181511-51b8671e-bee8-45d5-8216-a4a031bc6309.png)
- Click the language that you wish to contribute to
![Screenshot 2022-09-08 at 10 29 56 PM](https://user-images.githubusercontent.com/86527202/189182132-0eed7d5a-eaa1-43e1-929d-688f375763c1.png)
- Click the `Translate` button; this opens up `Crowdin Online Editor`
![Screenshot 2022-09-08 at 10 32 17 PM](https://user-images.githubusercontent.com/86527202/189182450-999124e8-566c-40af-9d3c-731a11c1b6aa.png)
- Select a string in `English` on the left-hand menu bar [1]
- Propose changes [2]
- Save [3]
Note: Crowdin provides translation recommendations as in [4]. Click one directly if it's apt.
![Screenshot 2022-09-08 at 10 37 38 PM](https://user-images.githubusercontent.com/86527202/189184278-69d688ed-4e5a-4d5a-b629-9f6d10d79346.png)
A GitHub Pull Request will be automatically triggered (every 6 hours). We will follow up on the remaining integration work items.
#### Reference
Refer to the following articles for additional details about Crowdin portal usage:
- [Translator Introduction](https://support.crowdin.com/crowdin-intro/)
- [Volunteer Translation Introduction](https://support.crowdin.com/for-volunteer-translators/)
- [Online Editor](https://support.crowdin.com/online-editor/)
## How to add a new language?
#### GitHub changes
- Update the enumeration in `enums.ts` [packages/nc-gui/lib/enums.ts] (a sketch of these code changes is shown below)
- Map JSON path in `a.i18n.ts` [packages/nc-gui/plugins/a.i18n.ts]
#### Crowdin changes [admin only]
- Open `NocoDB` project
- Click on `Language` on the home tab
- Select target language, `Update`
- Update array in `tests/playwright/tests/language.spec.ts`
![Screenshot 2022-09-08 at 10 52 59 PM](https://user-images.githubusercontent.com/86527202/189186570-5c1c7cad-6d3f-4937-ab4d-fa7ebe022cb1.png)
![Screenshot 2022-09-08 at 10 54 04 PM](https://user-images.githubusercontent.com/86527202/189186632-0b9f5f55-0550-4d8f-a8ae-7e9b9076774e.png)
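A rough sketch of the code-side changes for a new language; the enum member, display name and array contents below are hypothetical, so check the actual files before editing:
```typescript
// packages/nc-gui/lib/enums.ts (sketch)
export enum Language {
  // ...existing entries
  da = 'Dansk', // add the new language code and its display name
}

// tests/playwright/tests/language.spec.ts (sketch)
const langMenu = [
  // ...existing entries
  'da.json', // add the new language file to the array the test iterates over
];
```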
## String Categories
- **General**: simple & common tokens (save, cancel, submit, open, close, home, and such)
- **Objects**: objects from NocoDB POV (project, table, field, column, view, page, and such)
- **Title**: screen headers (compact) (menu headers, modal headers)
- **Labels**: text box / radio / field headers (a few words) (labels over textboxes, radio buttons, and such)
- **Activity**/ actions: work items (few words) (Create Project, Delete Table, Add Row, and such)
- **Tooltip**: additional information associated with work items (usually lengthy) (Additional information provided for activity)
- **Placeholder**: placeholders associated with various textboxes (Text placeholders)
- **Msg**
- Info: general/success category for everything
- Error: warnings & errors
- Toast: pop-up toast messages
> Note: string names should be in camelCase. Use the above list as the priority order in case of ambiguity.

packages/noco-docs/docs/150.engineering/_category_.json (5 lines changed)

@@ -0,0 +1,5 @@
{
"label": "Engineering",
"collapsible": true,
"collapsed": true
}

packages/noco-docs/versioned_docs/version-0.109.7/010.index.md (vendored, 6 lines changed)

@@ -29,7 +29,7 @@ Also NocoDB's app store allows you to build business workflows on views with com
### App Store for Workflow Automations
We provide different integrations in three main categories. See <a href="/setup-and-usages/account-settings#app-store" target="_blank">App Store</a> for details.
We provide different integrations in three main categories. See <a href="/0.109.7/setup-and-usages/account-settings#app-store" target="_blank">App Store</a> for details.
- ⚡ &nbsp;Chat : Slack, Discord, Mattermost, and etc
- ⚡ &nbsp;Email : AWS SES, SMTP, MailerSend, and etc
@@ -44,11 +44,11 @@ We provide the following ways to let users to invoke actions in a programmatic w
### Sync Schema
We allow you to sync schema changes if you have made changes outside NocoDB GUI. However, it has to be noted then you will have to bring your own schema migrations for moving from environment to others. See <a href="/setup-and-usages/sync-schema" target="_blank">Sync Schema</a> for details.
We allow you to sync schema changes if you have made changes outside NocoDB GUI. However, it has to be noted then you will have to bring your own schema migrations for moving from environment to others. See <a href="/0.109.7/setup-and-usages/sync-schema" target="_blank">Sync Schema</a> for details.
### Audit
We are keeping all the user operation logs under one place. See <a href="/setup-and-usages/audit" target="_blank">Audit</a> for details.
We are keeping all the user operation logs under one place. See <a href="/0.109.7/setup-and-usages/audit" target="_blank">Audit</a> for details.
## Why are we building this?
Most internet businesses equip themselves with either spreadsheet or a database to solve their business needs. Spreadsheets are used by a Billion+ humans collaboratively every single day. However, we are way off working at similar speeds on databases which are way more powerful tools when it comes to computing. Attempts to solve this with SaaS offerings has meant horrible access controls, vendor lockin, data lockin, abrupt price changes & most importantly a glass ceiling on what's possible in future.

packages/noco-docs/versioned_docs/version-0.109.7/020.getting-started/020.environment-variables.md (vendored, 102 lines changed)

@@ -10,55 +10,55 @@ For production usecases, it is **recommended** to configure
- `NC_PUBLIC_URL`,
- `NC_REDIS_URL`
| Variable | Comments | If absent |
|---|---|---|
| NC_DB | See our database URLs | A local SQLite will be created in root folder if `NC_DB` is not provided |
| NC_DB_JSON | Can be used instead of `NC_DB` and value should be valid knex connection JSON | |
| NC_DB_JSON_FILE | Can be used instead of `NC_DB` and value should be a valid path to knex connection JSON | |
| DATABASE_URL | JDBC URL Format. Can be used instead of NC_DB. | |
| DATABASE_URL_FILE | Can be used instead of DATABASE_URL: path to file containing JDBC URL Format. | |
| NC_AUTH_JWT_SECRET | JWT secret used for auth and storing other secrets | A random secret will be generated |
| PORT | For setting app running port | `8080` |
| DB_QUERY_LIMIT_DEFAULT | Default pagination limit | 25 |
| DB_QUERY_LIMIT_MAX | Maximum allowed pagination limit | 1000 |
| DB_QUERY_LIMIT_MIN | Minimum allowed pagination limit | 1 |
| NC_TOOL_DIR | App directory to keep metadata and app related files | Defaults to current working directory. In docker maps to `/usr/app/data/` for mounting volume. |
| NC_PUBLIC_URL | Used for sending Email invitations | Best guess from http request params |
| NC_JWT_EXPIRES_IN | JWT token expiry time | `10h` |
| NC_CONNECT_TO_EXTERNAL_DB_DISABLED | Disable Project creation with external database | |
| NC_INVITE_ONLY_SIGNUP | Removed since version 0.99.0 and now it's recommended to use [super admin settings menu](/setup-and-usages/account-settings#enable--disable-signup). Allow users to signup only via invite url, value should be any non-empty string. | |
| NUXT_PUBLIC_NC_BACKEND_URL | Custom Backend URL | ``http://localhost:8080`` will be used |
| NC_REQUEST_BODY_SIZE | Request body size [limit](https://expressjs.com/en/resources/middleware/body-parser.html#limit) | `1048576` |
| NC_EXPORT_MAX_TIMEOUT | After NC_EXPORT_MAX_TIMEOUT csv gets downloaded in batches | Default value 5000(in millisecond) will be used |
| NC_DISABLE_TELE | Disable telemetry | |
| NC_DASHBOARD_URL | Custom dashboard url path | `/dashboard` |
| NC_GOOGLE_CLIENT_ID | Google client id to enable google authentication | |
| NC_GOOGLE_CLIENT_SECRET | Google client secret to enable google authentication | |
| NC_MIGRATIONS_DISABLED | Disable NocoDB migration | |
| NC_MIN | If set to any non-empty string the default splash screen(initial welcome animation) and matrix screensaver will disable | |
| NC_SENTRY_DSN | For Sentry monitoring | |
| NC_REDIS_URL | Custom Redis URL. Example: `redis://:authpassword@127.0.0.1:6380/4` | Meta data will be stored in memory |
| NC_DISABLE_ERR_REPORT | Disable error reporting | |
| NC_DISABLE_CACHE | To be used only while debugging. On setting this to `true` - meta data be fetched from db instead of redis/cache. | `false` |
| AWS_ACCESS_KEY_ID | For Litestream - S3 access key id | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_SECRET_ACCESS_KEY | For Litestream - S3 secret access key | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET | For Litestream - S3 bucket | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET_PATH | For Litestream - S3 bucket path (like folder within S3 bucket) | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| NC_SMTP_FROM | For SMTP plugin - Email sender address | |
| NC_SMTP_HOST | For SMTP plugin - SMTP host value | |
| NC_SMTP_PORT | For SMTP plugin - SMTP port value | |
| NC_SMTP_USERNAME | For SMTP plugin (Optional) - SMTP username value for authentication | |
| NC_SMTP_PASSWORD | For SMTP plugin (Optional) - SMTP password value for authentication | |
| NC_SMTP_SECURE | For SMTP plugin (Optional) - To enable secure set value as `true` any other value treated as false | |
| NC_SMTP_IGNORE_TLS | For SMTP plugin (Optional) - To ignore tls set value as `true` any other value treated as false. For more info visit https://nodemailer.com/smtp/ | |
| NC_S3_BUCKET_NAME | For S3 storage plugin - AWS S3 bucket name | |
| NC_S3_REGION | For S3 storage plugin - AWS S3 region | |
| NC_S3_ACCESS_KEY | For S3 storage plugin - AWS access key credential for accessing resource | |
| NC_S3_ACCESS_SECRET | For S3 storage plugin - AWS access secret credential for accessing resource | |
| NC_ADMIN_EMAIL | For updating/creating super admin with provided email and password | |
| NC_ATTACHMENT_FIELD_SIZE | For setting the attachment field size(in Bytes) | Defaults to 20MB |
| Variable | Comments | If absent |
|---|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---|
| NC_DB | See our database URLs | A local SQLite will be created in root folder if `NC_DB` is not provided |
| NC_DB_JSON | Can be used instead of `NC_DB` and value should be valid knex connection JSON | |
| NC_DB_JSON_FILE | Can be used instead of `NC_DB` and value should be a valid path to knex connection JSON | |
| DATABASE_URL | JDBC URL Format. Can be used instead of NC_DB. | |
| DATABASE_URL_FILE | Can be used instead of DATABASE_URL: path to file containing JDBC URL Format. | |
| NC_AUTH_JWT_SECRET | JWT secret used for auth and storing other secrets | A random secret will be generated |
| PORT | For setting app running port | `8080` |
| DB_QUERY_LIMIT_DEFAULT | Default pagination limit | 25 |
| DB_QUERY_LIMIT_MAX | Maximum allowed pagination limit | 1000 |
| DB_QUERY_LIMIT_MIN | Minimum allowed pagination limit | 1 |
| NC_TOOL_DIR | App directory to keep metadata and app related files | Defaults to current working directory. In docker maps to `/usr/app/data/` for mounting volume. |
| NC_PUBLIC_URL | Used for sending Email invitations | Best guess from http request params |
| NC_JWT_EXPIRES_IN | JWT token expiry time | `10h` |
| NC_CONNECT_TO_EXTERNAL_DB_DISABLED | Disable Project creation with external database | |
| NC_INVITE_ONLY_SIGNUP | Removed since version 0.99.0 and now it's recommended to use [super admin settings menu](/0.109.7/setup-and-usages/account-settings#enable--disable-signup). Allow users to signup only via invite url, value should be any non-empty string. | |
| NUXT_PUBLIC_NC_BACKEND_URL | Custom Backend URL | ``http://localhost:8080`` will be used |
| NC_REQUEST_BODY_SIZE | Request body size [limit](https://expressjs.com/en/resources/middleware/body-parser.html#limit) | `1048576` |
| NC_EXPORT_MAX_TIMEOUT | After NC_EXPORT_MAX_TIMEOUT csv gets downloaded in batches | Default value 5000(in millisecond) will be used |
| NC_DISABLE_TELE | Disable telemetry | |
| NC_DASHBOARD_URL | Custom dashboard url path | `/dashboard` |
| NC_GOOGLE_CLIENT_ID | Google client id to enable google authentication | |
| NC_GOOGLE_CLIENT_SECRET | Google client secret to enable google authentication | |
| NC_MIGRATIONS_DISABLED | Disable NocoDB migration | |
| NC_MIN | If set to any non-empty string the default splash screen(initial welcome animation) and matrix screensaver will disable | |
| NC_SENTRY_DSN | For Sentry monitoring | |
| NC_REDIS_URL | Custom Redis URL. Example: `redis://:authpassword@127.0.0.1:6380/4` | Meta data will be stored in memory |
| NC_DISABLE_ERR_REPORT | Disable error reporting | |
| NC_DISABLE_CACHE | To be used only while debugging. On setting this to `true` - meta data be fetched from db instead of redis/cache. | `false` |
| AWS_ACCESS_KEY_ID | For Litestream - S3 access key id | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_SECRET_ACCESS_KEY | For Litestream - S3 secret access key | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET | For Litestream - S3 bucket | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| AWS_BUCKET_PATH | For Litestream - S3 bucket path (like folder within S3 bucket) | If Litestream is configured and `NC_DB` is not present. SQLite gets backed up to S3 |
| NC_SMTP_FROM | For SMTP plugin - Email sender address | |
| NC_SMTP_HOST | For SMTP plugin - SMTP host value | |
| NC_SMTP_PORT | For SMTP plugin - SMTP port value | |
| NC_SMTP_USERNAME | For SMTP plugin (Optional) - SMTP username value for authentication | |
| NC_SMTP_PASSWORD | For SMTP plugin (Optional) - SMTP password value for authentication | |
| NC_SMTP_SECURE | For SMTP plugin (Optional) - To enable secure set value as `true` any other value treated as false | |
| NC_SMTP_IGNORE_TLS | For SMTP plugin (Optional) - To ignore tls set value as `true` any other value treated as false. For more info visit https://nodemailer.com/smtp/ | |
| NC_S3_BUCKET_NAME | For S3 storage plugin - AWS S3 bucket name | |
| NC_S3_REGION | For S3 storage plugin - AWS S3 region | |
| NC_S3_ACCESS_KEY | For S3 storage plugin - AWS access key credential for accessing resource | |
| NC_S3_ACCESS_SECRET | For S3 storage plugin - AWS access secret credential for accessing resource | |
| NC_ADMIN_EMAIL | For updating/creating super admin with provided email and password | |
| NC_ATTACHMENT_FIELD_SIZE | For setting the attachment field size(in Bytes) | Defaults to 20MB |
| NC_ADMIN_PASSWORD | For updating/creating super admin with provided email and password. Your password should have at least 8 letters with one uppercase, one number and one special letter(Allowed special chars $&+,:;=?@#\|'.^*()%!_-" ) | |
| NODE_OPTIONS | For passing Node.js [options](https://nodejs.org/api/cli.html#node_optionsoptions) to instance | |
| NC_MINIMAL_DBS | Create a new SQLite file for each project. All the db files are stored in `nc_minimal_dbs` folder in current working directory. (This option restricts project creation on external sources) | |
| NC_DISABLE_AUDIT | Disable Audit Log | `false` |
| NC_AUTOMATION_LOG_LEVEL | Possible Values: `OFF`, `ERROR`, `ALL`. See [Webhooks](/developer-resources/webhooks#call-log) for details. | `OFF` |
| NODE_OPTIONS | For passing Node.js [options](https://nodejs.org/api/cli.html#node_optionsoptions) to instance | |
| NC_MINIMAL_DBS | Create a new SQLite file for each project. All the DB files are stored in the `nc_minimal_dbs` folder in the current working directory. (This option restricts project creation on external sources) | |
| NC_DISABLE_AUDIT | Disable Audit Log | `false` |
| NC_AUTOMATION_LOG_LEVEL | Possible Values: `OFF`, `ERROR`, `ALL`. See [Webhooks](/0.109.7/developer-resources/webhooks#call-log) for details. | `OFF` |
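
Most of these variables are simply passed to the NocoDB process or container at startup. Below is a minimal, hedged sketch of a Docker-based setup; every value (connection strings, admin credentials, image tag, port mapping) is a placeholder to adapt to your own environment.

```bash
# Minimal sketch; all values below are placeholders
docker run -d --name nocodb \
  -p 8080:8080 \
  -e NC_DB="pg://host.docker.internal:5432?u=postgres&p=password&d=nocodb_meta" \
  -e NC_REDIS_URL="redis://:authpassword@host.docker.internal:6380/4" \
  -e NC_ADMIN_EMAIL="admin@example.com" \
  -e NC_ADMIN_PASSWORD="Admin@1234" \
  -e NC_DISABLE_TELE="true" \
  nocodb/nocodb:latest
```

The same variables can be set in the `environment:` section of a `docker-compose.yml` file, or exported in the shell before starting a bare-metal Node instance.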

2
packages/noco-docs/versioned_docs/version-0.109.7/020.getting-started/030.upgrading.md vendored

@ -3,7 +3,7 @@ title: 'Upgrading'
description: 'Upgrading NocoDB : Docker, Node and Homebrew!'
---
By default, if `NC_DB` is not specified upon [installation](/getting-started/installation), then SQLite will be used to store metadata. We suggest users separate the metadata and user data into different databases, as pictured in our [architecture](/engineering/architecture).
By default, if `NC_DB` is not specified upon [installation](/0.109.7/getting-started/installation), then SQLite will be used to store metadata. We suggest users separate the metadata and user data into different databases, as pictured in our [architecture](/engineering/architecture).
## Docker
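
As a rough sketch of the Docker upgrade path (assuming the container was started with a volume mounted at `/usr/app/data/`; the container, volume and image names below are illustrative, not prescriptive):

```bash
# Pull the latest image, then recreate the container on the same data volume
docker pull nocodb/nocodb:latest
docker stop nocodb && docker rm nocodb
docker run -d --name nocodb \
  -v nocodb_data:/usr/app/data/ \
  -p 8080:8080 \
  nocodb/nocodb:latest
```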

6
packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/020.table-operations.md vendored

@ -141,7 +141,7 @@ You can use Quick Import when you have data from external sources such as Airtab
### Import Airtable into an Existing Project
- See [here](/setup-and-usages/import-airtable-to-sql-database-within-a-minute-for-free)
- See [here](/0.109.7/setup-and-usages/import-airtable-to-sql-database-within-a-minute-for-free)
### Import CSV data into an Existing Project
@ -151,7 +151,7 @@ You can use Quick Import when you have data from external sources such as Airtab
- **Use First Row as Headers**: If it is checked, the first row will be treated as the header row.
- **Import Data**: If it is checked, all data will be imported. Otherwise, only the table will be created.
![image](https://user-images.githubusercontent.com/35857179/197454479-1ed18dce-1d0b-4ee3-88b3-9b6a132dea2a.png)
- You can revise the table name (by double-clicking it), the column name and the column type. By default, the first column will be chosen as <a href="/setup-and-usages/display-value" target="_blank">Display Value</a> and cannot be deleted.
- You can revise the table name (by double-clicking it), the column name and the column type. By default, the first column will be chosen as <a href="/0.109.7/setup-and-usages/display-value" target="_blank">Display Value</a> and cannot be deleted.
![image](https://user-images.githubusercontent.com/35857179/197454633-5b30323e-2b13-4c55-843a-948c093d373e.png)
- Click `Import` to start the import process. The table will be created and the data will be imported.
![image](https://user-images.githubusercontent.com/35857179/197455547-2d93df5e-a7f0-4c88-af53-990067625967.png)
@ -164,7 +164,7 @@ You can use Quick Import when you have data from external sources such as Airtab
- **Use First Row as Headers**: If it is checked, the first row will be treated as the header row.
- **Import Data**: If it is checked, all data will be imported. Otherwise, only the table will be created.
![image](https://user-images.githubusercontent.com/35857179/197455788-8dd8a7d1-38f3-48c3-a05e-6ab0cf25045c.png)
- You can revise the table name, column name and column type. By default, the first column will be chosen as <a href="/setup-and-usages/display-value" target="_blank">Display Value</a> and cannot be deleted.
- You can revise the table name, column name and column type. By default, the first column will be chosen as <a href="/0.109.7/setup-and-usages/display-value" target="_blank">Display Value</a> and cannot be deleted.
:::note

6
packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/040.column-types.md vendored

@ -51,7 +51,7 @@ description: 'NocoDB Column Types Overview'
### LinkToAnotherRecord
For more about Link To Another Record, please visit <a href="/setup-and-usages/link-to-another-record" target="_blank">here</a>.
For more about Link To Another Record, please visit <a href="/0.109.7/setup-and-usages/link-to-another-record" target="_blank">here</a>.
<!-- ### ForeignKey
#### Available Database Types
@ -258,7 +258,7 @@ For more about Link To Another Record, please visit <a href="/setup-and-usages/l
### Formula
For more about Formulas, please visit <a href="/setup-and-usages/formulas" target="_blank">here</a>.
For more about Formulas, please visit <a href="/0.109.7/setup-and-usages/formulas" target="_blank">here</a>.
### QR-Code
@ -286,7 +286,7 @@ Since it's a virtual column, the cell content (Barcode) cannot be changed direct
### Rollup
For more about Rollup, please visit <a href="/setup-and-usages/rollup" target="_blank">here</a>.
For more about Rollup, please visit <a href="/0.109.7/setup-and-usages/rollup" target="_blank">here</a>.
### DateTime

2
packages/noco-docs/versioned_docs/version-0.109.7/030.setup-and-usages/240.meta-management.md vendored

@ -25,7 +25,7 @@ To access it, click the down arrow button next to Project Name on the top left s
## Sync Metadata
Go to `Data Sources` and click ``Sync Metadata`` to see your metadata sync status. If it is out of sync, you can sync the schema. See [Sync Schema](/setup-and-usages/sync-schema) for more.
Go to `Data Sources` and click ``Sync Metadata`` to see your metadata sync status. If it is out of sync, you can sync the schema. See [Sync Schema](/0.109.7/setup-and-usages/sync-schema) for more.
![image](https://user-images.githubusercontent.com/35857179/219833485-3bcaa6ec-88bc-47cc-b938-5abb4835dc31.png)

2
packages/noco-docs/versioned_docs/version-0.109.7/040.developer-resources/020.rest-apis.md vendored

@ -9,7 +9,7 @@ Once you've created the schemas, you can manipulate the data or invoke actions u
Here's the overview of all APIs. For the details, please check out <a href="https://all-apis.nocodb.com/" target="_blank">NocoDB API Documentation</a>.
You may also interact with the API's resources via <a href="/developer-resources/accessing-apis#swagger-ui" target="_blank">Swagger UI</a>.
You may also interact with the API's resources via <a href="/0.109.7/developer-resources/accessing-apis#swagger-ui" target="_blank">Swagger UI</a>.
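
As a quick, hedged illustration of calling the REST data API with an API token (the base URL, project name, table name and token below are placeholders):

```bash
# List the first 25 rows of a table (replace the placeholders with your own values)
curl -H "xc-token: YOUR_API_TOKEN" \
  "http://localhost:8080/api/v1/db/data/noco/PROJECT_NAME/TABLE_NAME?limit=25"
```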
:::note

4
packages/noco-docs/versioned_docs/version-0.109.7/040.developer-resources/030.sdk.md vendored

@ -7,7 +7,7 @@ We provide SDK for users to integrate with their applications. Currently only SD
:::note
The NocoDB SDK requires an authorization token. If you haven't created one, please check out <a href="/developer-resources/accessing-apis" target="_blank">Accessing APIs</a> for details.
The NocoDB SDK requires an authorization token. If you haven't created one, please check out <a href="/0.109.7/developer-resources/accessing-apis" target="_blank">Accessing APIs</a> for details.
:::
@ -57,7 +57,7 @@ Once you have configured `api`, you can call different types of APIs by `api.<Ta
:::note
For Tag and FunctionName, please check out the API table <a href="/developer-resources/rest-apis" target="_blank">here</a>.
For Tag and FunctionName, please check out the API table <a href="/0.109.7/developer-resources/rest-apis" target="_blank">here</a>.
:::
