/utils/csvUpload
This module exports the default csvUpload method for the mapp utils module.
This utility supplies a way to insert data into a table from a CSV file.
The function takes a File object and reads its contents.
The data is sanitised (see splitRowIntoFields) and then passed to the supplied query param.
```json
{
  "query": "sql_table_insert",
  "queryparams": {
    "table": "scratch"
  },
  "layer": {},
  "fields": {
    "numeric_field": "numeric",
    "char_field": "text"
  }
}
```
- layer: the layer on which the CSV upload is being done.
- queryparams.table: the table being inserted into.
- fields: describes the fields being inserted; each key is a field name and its value is the field type.
The function transforms the read data into an object as below:
```json
{
  "numeric_field::numeric": [1, 2, 3],
  "char_field::text": ["a", "b", "c"]
}
```
This data is passed to the relevant insert query (defaults to sql_table_insert).
Additionally, files exceeding the Lambda upload limit (6MB) will be chunked into parts and inserted separately.
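The column-oriented transform can be sketched in plain JavaScript. Note that `rowsToColumns` is a hypothetical name for illustration, and the real module also applies the schemaMethods conversions documented further down:

```javascript
// Sketch: build the column-oriented insert object from parsed CSV rows.
// `fields` maps column name -> database type, as in the params example above.
function rowsToColumns(rows, fields) {
  const data = {};
  const names = Object.keys(fields);

  // One "name::type" key per column, each holding an array of values.
  names.forEach(name => {
    data[`${name}::${fields[name]}`] = [];
  });

  for (const row of rows) {
    names.forEach((name, i) => data[`${name}::${fields[name]}`].push(row[i]));
  }

  return data;
}

const fields = { numeric_field: 'numeric', char_field: 'text' };
const rows = [[1, 'a'], [2, 'b'], [3, 'c']];
const data = rowsToColumns(rows, fields);
// data['numeric_field::numeric'] → [1, 2, 3]
// data['char_field::text'] → ['a', 'b', 'c']
```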
Methods
(inner) chunkRows(rows, params) → {Array}
The method iterates over each row in the rows param and returns an array of chunks sized to the maximum payload for the POST request.
Name | Type | Description |
---|---|---|
rows | array | An array of rows extracted from the CSV file. |
params | object | Parameters for chunking the rows array. |
Name | Type | Attributes | Description |
---|---|---|---|
params.fields | Object | | The definition of the columns being inserted, e.g. {field: field_type}. |
params.chunkSize | Number | <optional> | The chunk size in bytes to upload the data in. Defaults to 4MB. |
params.schema | Array | <optional> | The schema array of column types. |
An array of field arrays chunked to size.
- Type:
- Array
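Size-based chunking can be sketched as below; `chunkBySize` and the JSON-serialisation byte estimate are illustrative assumptions, not the module's actual implementation:

```javascript
// Sketch: split rows into chunks whose serialised size stays under a byte
// limit. The 4MB default mirrors the documented chunkSize default, leaving
// headroom below the 6MB Lambda payload limit.
function chunkBySize(rows, chunkSize = 4 * 1024 * 1024) {
  const chunks = [[]];
  let size = 0;

  for (const row of rows) {
    const rowSize = JSON.stringify(row).length; // rough byte estimate

    // Start a new chunk when this row would push the current one over the limit.
    if (size + rowSize > chunkSize && chunks[chunks.length - 1].length) {
      chunks.push([]);
      size = 0;
    }

    chunks[chunks.length - 1].push(row);
    size += rowSize;
  }

  return chunks;
}
```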
(async, inner) csvUpload(file, params) → {Promise.<(Object|Error)>}
This function uploads a CSV file to a database store.
Name | Type | Description |
---|---|---|
file | File | The CSV file to upload. |
params | Object | The parameters object. |
Name | Type | Attributes | Description |
---|---|---|---|
params.query | String | | The query to upload the data. |
params.fields | Object | | The definition of the columns being inserted, e.g. {field: field_type}. |
params.layer | layer | | The layer being inserted on. |
params.queryparams.table | String | | The table being inserted into. |
params.async | Boolean | <optional> | If set to true, the data is uploaded asynchronously. Default is false. |
params.header | Boolean | <optional> | If set to true, the first row is treated as a header row and not uploaded. |
The outcome object and any messages returned by the database.
- Type:
- Promise.<(Object|Error)>
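A typical params object for csvUpload might look as follows; the values are illustrative, matching the config example at the top of this page:

```javascript
// Sketch: a typical params object for csvUpload (values are illustrative).
const params = {
  query: 'sql_table_insert',          // defaults to sql_table_insert if omitted
  queryparams: { table: 'scratch' },  // the table being inserted into
  layer: {},                          // the layer the upload belongs to
  fields: {
    numeric_field: 'numeric',
    char_field: 'text'
  },
  header: true,                       // first row is a header, not data
  async: false                        // wait for the insert to complete
};

// const outcome = await csvUpload(file, params);
```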
(inner) postRequestData(data, params) → {Promise.<(Object|Error)>}
The method posts the data as a parameterised query via the XHR utility method and returns the resulting promise.
Name | Type | Description |
---|---|---|
data | Array | Data for the post request body. |
params | Object | Parameters for the post query. |
The promise returned by the XHR query utility.
- Type:
- Promise.<(Object|Error)>
(inner) splitRowIntoFields(row) → {Array}
The method splits a CSV row string into fields.
Name | Type | Description |
---|---|---|
row | String | The row to split. |
The fields array.
- Type:
- Array
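A minimal sketch of splitting a row while keeping commas inside double-quoted fields intact; this regex approach is an assumption and the module's actual implementation may differ (e.g. in how it handles escaped quotes):

```javascript
// Sketch: split one CSV row into fields, preserving commas that sit
// inside double-quoted values. Does not handle escaped "" quotes.
function splitRowIntoFields(row) {
  return row
    // Split on commas followed by an even number of quotes to end-of-row,
    // i.e. commas that are outside any quoted field.
    .split(/,(?=(?:[^"]*"[^"]*")*[^"]*$)/)
    // Trim whitespace and strip the surrounding quotes.
    .map(field => field.trim().replace(/^"|"$/g, ''));
}

splitRowIntoFields('1,"a, quoted",plain');
// → ['1', 'a, quoted', 'plain']
```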
Type Definitions
schemaMethods
schemaMethods convert field values to the correct type for the database. They are applied in the chunkRows method.
- Object
Name | Type | Description |
---|---|---|
text | function | Replace quotes in string values. |
int | function | Parse value as integer. |
numeric | function | Parse value as float. |
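The three methods can be sketched as below; the exact conversions (in particular how `text` replaces quotes) are assumptions based on the descriptions above:

```javascript
// Sketch: per-type conversion methods mirroring the documented
// text/int/numeric behaviour. The single-quote escaping in `text`
// is an assumption (SQL-style doubling of quotes).
const schemaMethods = {
  text: value => String(value).replace(/'/g, "''"), // replace quotes in strings
  int: value => parseInt(value, 10),                // parse value as integer
  numeric: value => parseFloat(value)               // parse value as float
};
```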