
refactor: increase csv batch download timeout

Signed-off-by: Pranav C <pranavxc@gmail.com>
pull/743/head
Pranav C, 3 years ago
commit 61c3966cbb
2 changed files:
1. packages/noco-docs/content/en/getting-started/installation.md (+1)
2. packages/nocodb/src/lib/dataMapper/lib/sql/BaseModelSql.ts (+1 -1)

packages/noco-docs/content/en/getting-started/installation.md (+1)

@@ -151,6 +151,7 @@ And connection params for this database can be specified in `NC_DB` environment
| AWS_SECRET_ACCESS_KEY | No | For Litestream - S3 secret access key | If Litestream is configured and NC_DB is not present. SQLite gets backed up to S3 |
| AWS_BUCKET | No | For Litestream - S3 bucket | If Litestream is configured and NC_DB is not present. SQLite gets backed up to S3 |
| AWS_BUCKET_PATH | No | For Litestream - S3 bucket path (like folder within S3 bucket) | If Litestream is configured and NC_DB is not present. SQLite gets backed up to S3 |
+| NC_EXPORT_MAX_TIMEOUT | No | After NC_EXPORT_MAX_TIMEOUT, CSV gets downloaded in batches | Default value 5000 will be used |
### Docker Compose
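
For context, the documented default mirrors how the export code (second file below) consumes the variable: the value is parsed as a number and falls back to 5000 when unset or not numeric. A minimal sketch of that pattern, assuming the value is a time budget in milliseconds:

```ts
// Sketch only: same fallback pattern as the change in BaseModelSql.ts below.
// Number(undefined) and Number("abc") are NaN, so 5000 is used when
// NC_EXPORT_MAX_TIMEOUT is unset or not a valid number.
const timeout = Number(process.env.NC_EXPORT_MAX_TIMEOUT) || 5000;
```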

packages/nocodb/src/lib/dataMapper/lib/sql/BaseModelSql.ts (+1 -1)

@@ -2428,7 +2428,7 @@ class BaseModelSql extends BaseModel {
let offset = +args.offset || 0;
const limit = 100;
// const size = +process.env.NC_EXPORT_MAX_SIZE || 1024;
-const timeout = +process.env.NC_EXPORT_MAX_TIMEOUT || 500;
+const timeout = +process.env.NC_EXPORT_MAX_TIMEOUT || 5000;
const csvRows = [];
const startTime = process.hrtime();
let elapsed, temp;
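
The hunk above raises the per-request time budget from 500 to 5000 so that fewer, larger CSV batches are produced. Below is an illustrative sketch, not the actual BaseModelSql implementation, of how such a budget can drive batched export: rows are fetched in pages of `limit`, elapsed time is measured with `process.hrtime()`, and once the budget (assumed to be milliseconds) is exceeded the accumulated rows are returned together with the offset to resume from. `fetchRows` is a hypothetical stand-in for the model's paginated query.

```ts
type Row = Record<string, unknown>;

// Illustrative sketch only; `fetchRows` is a hypothetical placeholder,
// not a NocoDB API.
async function exportCsvBatch(
  fetchRows: (limit: number, offset: number) => Promise<Row[]>,
  startOffset = 0
): Promise<{ csvRows: Row[]; nextOffset: number | null }> {
  let offset = startOffset;
  const limit = 100;
  // Same fallback as the diff: default to 5000 (assumed milliseconds).
  const timeout = Number(process.env.NC_EXPORT_MAX_TIMEOUT) || 5000;
  const csvRows: Row[] = [];
  const startTime = process.hrtime();

  for (;;) {
    const rows = await fetchRows(limit, offset);
    if (rows.length === 0) {
      return { csvRows, nextOffset: null }; // nothing left: export complete
    }
    csvRows.push(...rows);
    offset += limit;

    // process.hrtime(startTime) returns [seconds, nanoseconds] elapsed.
    const [sec, nano] = process.hrtime(startTime);
    const elapsedMs = sec * 1000 + nano / 1e6;
    if (elapsedMs > timeout) {
      // Budget exhausted: return this batch plus the offset to resume at.
      return { csvRows, nextOffset: offset };
    }
  }
}
```

With the previous 500 default, even modest tables would be split across many small downloads; 5000 gives a single request roughly ten times the time budget before batching kicks in.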
